I have the 4060TI 16GB and it's OK

13 Comments

  • marble Posts: 7,500
    edited August 2023

    We have a thing called PayWave. I don't have a credit card but I have a debit card which pays directly from my bank account. I just wave it over the card reader and it pays. I visited the UK recently and the same method is used there. I could also use my phone but I find it less trouble to wave the card. I can't remember the last time I paid cash and I have not had a cheque book for years.

  • We have that here too. It's called tap to pay. It can be connected to a debit/credit card or Apple/Samsung/Google pay. But not all stores accept it here. Most people just swipe their card or insert the chip end.

  • oddbob Posts: 396

    marble said:

    At the risk of veering off into a whole new discussion (my fault, sorry)... perhaps one last comment: I do get that - I see lots of beautiful people in starring roles on American shows whereas British (and other) shows tend to use people who look like you might expect if you walked into a police station or law office.

    This is because British people tend to be quite ugly, I know this because I'm one of them. I'm stunning but that's not the norm.

    Back on topic, I'm rendering and gaming with a 4090 and using an 850w psu, lots of folk are. I also have a CPU that can pull 250w and a water pump in there.

    According to the natty little power readout on the side of the PSU, rendering using just the GPU uses about 400w, with the GPU contributing about 280w and the CPU about 50w. It uses more power if the CPU is being worked hard, but that's rare in DS unless you have CPU rendering on.

    In a heavy gaming load with the GPU and CPU slightly overclocked I've hit 700w. On stock settings 600w or lower total system power draw is more common. PSU quality is at least as important as the number on the side. There's a handy PSU tier list somewhere on the net.

  • Artini Posts: 9,462

    How is the noise level from the water pump on the CPU?

    I had a water pump on the CPU in my previous computer, but it was so noisy that I moved the case to the kitchen and used long cables to the screen, keyboard and mouse in the room.

  • oddbob Posts: 396

    Artini said:

    How is the noise level from the water pump on the CPU?

    I had a water pump on the CPU in my previous computer, but it was so noisy that I moved the case to the kitchen and used long cables to the screen, keyboard and mouse in the room.

    The water pump is running a loop with a 420mm and 280mm radiator and waterblocks on the cpu and gpu.

    It's the loudest part of the system when it's running at higher speeds with the computer under heavy load. But then the fans are running at <1100 rpm, so loud is subjective.

    Part of the problem is that I bought a DDC pump because I liked the housing/reservoir it came in more than the ones typically seen on the quieter D5 type pumps. I'll swap it next time I do a rebuild.

    As you've found being in a different room to the PC is the ultimate in quiet computing but it isn't for everyone.

  • Kitsumo Posts: 1,216
    edited August 2023

    @marble, if you want to see how the rest of us live here in the US, there's a decent YouTube channel called Joe & Nic's Road Trip. OK, maybe it's not all that bad, but some of it is pretty bad.

    I still use a paper check once in a while, but it's rare. I've gone mostly cashless since the pandemic. I did score a bunch of $2 bills last week at my credit union (they're not rare or out of circulation as some people believe, you just don't see them often). But once I had them I realized "when am I going to spend these? I don't really use cash." I love to spend them just to see the reaction on people's faces. Hopefully they don't call the cops on me, thinking they're counterfeit.

  • marble Posts: 7,500

    Kitsumo said:

    @marble, if you want to see how the rest of us live here in the US, there's a decent YouTube channel called Joe & Nic's Road Trip. OK, maybe it's not all that bad, but some of it is pretty bad.

    I still use a paper check once in a while, but it's rare. I've gone mostly cashless since the pandemic. I did score a bunch of $2 bills last week at my credit union (they're not rare or out of circulation as some people believe, you just don't see them often). But once I had them I realized "when am I going to spend these? I don't really use cash." I love to spend them just to see the reaction on people's faces. Hopefully they don't call the cops on me, thinking they're counterfeit.

     

    So sorry to have started this diversion in a valuable and informative thread about GPUs, all of which is very pertinent to my GPU needs at the moment. The USA thing came about because I often feel like - and it is not just this forum - American contributors to forums seem to think that they are only talking to other Americans. I'm not in a heatwave right now because it is mid-winter and I'm trying to keep my old bones warm. My deceased RTX 3090 at least kept my ankles warm.

    Thank goodness we are not allowed to talk politics!

  • outrider42 Posts: 3,679

    Most Americans have abandoned checks. I almost have; I only write one check a month for my house payment. I am honestly not sure why I do that, as I could pay it online. I pay all my other bills online.

    In retail, check use has been declining for a long time. Even when I did my tour of retail duty 20 years ago checks were already pretty rare. I only see elderly people writing checks at retail, and that is becoming rare, too (the checks...not the elderly). Most people are using credit or debit cards. I use debit for 99% of everything. I can run the debit as a credit to save the debit processing fee. A few places don't like this and charge a credit card fee, so you the customer get hit either way. That is because of how the payment processors charge them. A store with very low margins is hurt by credit card fees. But stores do not pay as much for debit processing, the customer pays. I think credit is a lot more popular than debit, and most places don't charge a credit fee. 

    Credit is so common that even the smallest local vendors have card readers. You can get ones that plug into your smartphone. My barber uses one like that.

    So Americans have to deal with extra fees on nearly everything. Most of it is tax; a potential payment-processing fee is fairly tiny compared to tax on a $1000 item, but can be a pain on food. Sometimes you might get a tax-break holiday. Some states do this for back to school: they have a weekend where certain items are tax free. Virginia has done a tax-free holiday for several years...except this year. That day is a great day to buy certain electronics, as those items are usually eligible. Complete computers were eligible; I do not recall if graphics cards were, or if they had a limit on price. If somebody lives in a state that does tax-free holidays, it may be worth looking into, as many of them are in August.

    The 4080 is roughly the same speed as having both a 4070 and 4060ti combined, perhaps slightly less, but you don't have to worry about going over the 4070's 12gb and losing a large portion of render speed. I'd say that is a win. The most recent 4080 benchmark also shows it pulling ahead of the 3090. There might be some scenes where it loses and some that it wins depending on what is in the scene.

  • kyoto kid Posts: 41,057

    ...one thing that keeps me in Oregon: no sales tax, and being retired on Social Security, no income tax either, state or federal. The downside is much lower income compared to what I was making even at my last job, which didn't pay very well (certainly not for the responsibilities I had there).

    OK, back to GPUs.

    So you can run a 4060 Ti and a 4070 together without the process defaulting to the lower-VRAM card? So what about running a 3060 12 GB and a 4060 Ti together?

  • marble Posts: 7,500

    kyoto kid said:

    ...one thing that keeps me in Oregon: no sales tax, and being retired on Social Security, no income tax either, state or federal. The downside is much lower income compared to what I was making even at my last job, which didn't pay very well (certainly not for the responsibilities I had there).

    OK, back to GPUs.

    So you can run a 4060 Ti and a 4070 together without the process defaulting to the lower-VRAM card? So what about running a 3060 12 GB and a 4060 Ti together?

    I think he (?) meant that the 4080, unlike the combination of the 4070 and 4060 Ti, doesn't have the drawback of defaulting to the 12 GB of the lower-VRAM GPU.

  • kyoto kid Posts: 41,057

    ...thanks so much for that.

    I should be able to at least run my Titan X and 3060 together as both are 12 GB. However, it depends on whether the Iray driver sees the 3060 as the primary card and the Titan X as the auxiliary one, as the latter doesn't have RT cores.

  • Artini Posts: 9,462

    I also think that having 16 GB of VRAM is a clear advantage.

    Now I use a computer with regular cooling and will stay away from water cooling the CPU in the future, even though I liked the low temperature on the CPU under load. The noise coming from the water pump was unbearable for me.

  • Artini said:

    I also think that having 16 GB of VRAM is a clear advantage.

    Now I use a computer with regular cooling and will stay away from water cooling the CPU in the future, even though I liked the low temperature on the CPU under load. The noise coming from the water pump was unbearable for me.

    Water cooling should be quieter than air cooling. I suspect an issue with the specific setup you had. In particular "AIO" systems are not good.

    However it is true that water cooling is very expensive and specialised, and it's much easier and more convenient to use air cooling.

  • marble said:

    kyoto kid said:

    ...one thing that keeps me in Oregon: no sales tax, and being retired on Social Security, no income tax either, state or federal. The downside is much lower income compared to what I was making even at my last job, which didn't pay very well (certainly not for the responsibilities I had there).

    OK, back to GPUs.

    So you can run a 4060 Ti and a 4070 together without the process defaulting to the lower-VRAM card? So what about running a 3060 12 GB and a 4060 Ti together?

    I think he (?) meant that the 4080, unlike the combination of the 4070 and 4060 Ti, doesn't have the drawback of defaulting to the 12 GB of the lower-VRAM GPU.

    That is, if so, not correct - Iray will use (up to) all the RAM available on each card, so if they have differing amounts it is possible for one to drop out while the other carries on. I don't know where the idea that both cards are limited to the capacity of the smaller came from, but it refuses to go away.

  • Elor Posts: 1,494

    Hello,

    Speaking of VRAM, is there a simple way to 'know' how much VRAM someone will 'need'?

    I'm currently using an MBP with an M1 Pro and 32 GB of unified RAM (it is used by both the CPU and the GPU, not that the GPU is really working when Iray is used, of course), but looking at Activity Monitor, I don't see a way to know what part of the memory used by DAZ would be used by an RTX GPU and what part would still be used by the CPU.

  • PerttiA Posts: 10,024

    Elor said:

    Hello,

    Speaking of VRAM, is there a simple way to 'know' how much VRAM someone will 'need'?

    I'm currently using an MBP with an M1 Pro and 32 GB of unified RAM (it is used by both the CPU and the GPU, not that the GPU is really working when Iray is used, of course), but looking at Activity Monitor, I don't see a way to know what part of the memory used by DAZ would be used by an RTX GPU and what part would still be used by the CPU.

    Is that a Mac? If it is, you get no GPU-assisted Iray rendering, as Nvidia and Apple are not friends anymore.

  • Elor Posts: 1,494

     

    Is that a Mac? If it is, you get no GPU-assisted Iray rendering, as Nvidia and Apple are not friends anymore.

    Yes, MBP as in MacBook Pro (sorry, I should have written it in full).

    I know I won't have GPU-rendered Iray with that machine, but my question was whether it's possible to work out how much VRAM I would need if I bought an Nvidia RTX card (and the rest of the hardware required by a Windows computer) from what I can observe on my current Mac (where memory is not really a concern, because I'm always rendering with the CPU in the first place).

  • daveso Posts: 7,014

    Richard Haseltine said:

    marble said:

    kyoto kid said:

    ...one thing that keeps me in Oregon: no sales tax, and being retired on Social Security, no income tax either, state or federal. The downside is much lower income compared to what I was making even at my last job, which didn't pay very well (certainly not for the responsibilities I had there).

    OK, back to GPUs.

    So you can run a 4060 Ti and a 4070 together without the process defaulting to the lower-VRAM card? So what about running a 3060 12 GB and a 4060 Ti together?

    I think he (?) meant that the 4080, unlike the combination of the 4070 and 4060 Ti, doesn't have the drawback of defaulting to the 12 GB of the lower-VRAM GPU.

    That is, if so, not correct - Iray will use (up to) all the RAM available on each card, so if they have differing amounts it is possible for one to drop out while the other carries on. I don't know where the idea that both cards are limited to the capacity of the smaller came from, but it refuses to go away.

    Probably from the way system RAM works - with mixed modules it drops back to the slowest speed.

  • PerttiA Posts: 10,024

    Elor said:

     

    Is that a Mac? If it is, you get no GPU-assisted Iray rendering, as Nvidia and Apple are not friends anymore.

    Yes, MBP as in MacBook Pro (sorry, I should have written it in full).

    I know I won't have GPU-rendered Iray with that machine, but my question was whether it's possible to work out how much VRAM I would need if I bought an Nvidia RTX card (and the rest of the hardware required by a Windows computer) from what I can observe on my current Mac (where memory is not really a concern, because I'm always rendering with the CPU in the first place).

    Pretty much depends on what you put in the scene. I had nine G8 figures with lightweight clothing and lightweight hair, a Horse 2, a Dog 8, a whole western town, Iray Worlds Skydome etc. and some props in the scene (without any optimisation tricks) and it wasn't a problem for my RTX 3060 12GB, but others are having problems getting 2-3 G8 figures rendered on a 12GB GPU.

    8 GB is just too little, as the baseload from Windows, DS, the scene and the needed working space takes about 4 GB of VRAM.

    The amount of VRAM the textures take is about 50% of the RAM they take (with default Iray compression settings).
    One 4096x4096x24-bit image takes 48 MB of RAM when opened in any program (8K takes 192 MB), and one surface/shader/material can have 5-6 images in total.
    In my scenes geometry has taken about 15-20% of what the textures take, but if one uses high SubD levels or very dense geometry, the amount of VRAM the geometry takes can quickly skyrocket beyond what one's GPU (or even RAM) can handle.

    VRAM on dedicated Nvidia cards is not shared, nor can one use system RAM to expand VRAM.

    The amount of RAM should be around three times the VRAM, or a minimum of 32 GB.
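The rules of thumb above can be turned into a rough back-of-the-envelope calculator. This is only a sketch of the arithmetic in this post, not an official Iray formula; the function names and the 50%/20%/4 GB factors are the estimates quoted above, nothing more.

```python
# Rough VRAM estimate following the rules of thumb in this thread:
# 24-bit uncompressed size in RAM, ~50% of that in VRAM with default
# Iray compression, geometry ~20% of texture VRAM, ~4 GB baseload for
# Windows + DS + working space. All numbers are forum estimates.

def texture_ram_mb(width, height, bits_per_pixel=24):
    """Uncompressed size of one texture in MiB when loaded into RAM."""
    return width * height * (bits_per_pixel // 8) / 2**20

def estimate_vram_gb(num_textures, width=4096, height=4096,
                     compression=0.5, geometry_share=0.2, baseload_gb=4.0):
    """Very rough scene VRAM estimate in GB."""
    tex_mb = num_textures * texture_ram_mb(width, height) * compression
    total_mb = tex_mb * (1 + geometry_share)
    return baseload_gb + total_mb / 1024

print(texture_ram_mb(4096, 4096))            # 48.0 MiB, matching the figure above
print(texture_ram_mb(8192, 8192))            # 192.0 MiB
print(round(estimate_vram_gb(num_textures=60), 1))  # a 60-texture 4K scene: ~5.7 GB
```

For example, ten G8 figures at 5-6 4K maps per surface quickly push past a 12 GB card by this estimate, which matches the drop-out experiences described later in the thread.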

  • jd641 Posts: 458

    kyoto kid said:

    So what about running a 3060 12 GB and a 4060 Ti together?

    I currently have a 1070 as my primary display card and a 3060 as a dedicated rendering card. I was gonna replace my 1070 with my 3060 for games and video editing and have the 4060 Ti strictly for Studio, until I saw that the 4060 Ti 16GB was $500. Then my dream was crushed and I wept many internet tears.

  • kyoto kid Posts: 41,057

    daveso said:

    Richard Haseltine said:

    marble said:

    kyoto kid said:

    ...one thing that keeps me in Oregon: no sales tax, and being retired on Social Security, no income tax either, state or federal. The downside is much lower income compared to what I was making even at my last job, which didn't pay very well (certainly not for the responsibilities I had there).

    OK, back to GPUs.

    So you can run a 4060 Ti and a 4070 together without the process defaulting to the lower-VRAM card? So what about running a 3060 12 GB and a 4060 Ti together?

    I think he (?) meant that the 4080, unlike the combination of the 4070 and 4060 Ti, doesn't have the drawback of defaulting to the 12 GB of the lower-VRAM GPU.

    That is, if so, not correct - Iray will use (up to) all the RAM available on each card, so if they have differing amounts it is possible for one to drop out while the other carries on. I don't know where the idea that both cards are limited to the capacity of the smaller came from, but it refuses to go away.

    Probably from the way system RAM works - with mixed modules it drops back to the slowest speed.

    ...so in the case of a 4060Ti and 3060 12 GB together it would fall back to the 3060 and ignore the 4060 altogether (which means no 16 GB of VRAM)?

    Based on that, it sounds like the same would happen when pairing the 3060 and Titan X: even though they both have the same VRAM and base core count, the Titan X does not have RT cores and the two are four generations apart when it comes to GPU processor, cores, and memory.

    Just trying to get the best bang for my meagre bucks.

  • PerttiA Posts: 10,024

    kyoto kid said:

    ...so in the case of a 4060Ti and 3060 12 GB together it would fall back to the 3060 and ignore the 4060 altogether (which means no 16 GB of VRAM)?

     

    No, the 3060 12GB would drop out and the 4060TI 16GB would continue. 

  • kyoto kid Posts: 41,057

    ..so the cores from the 3060 wouldn't add to the render speed then.

  • PerttiA Posts: 10,024
    edited August 2023

    kyoto kid said:

    ..so the cores from the 3060 wouldn't add to the render speed then.

    They will, as long as the scene fits in the VRAM of both cards separately, i.e. the total VRAM usage of the scene and all the baseloads stays under 12 GB.
    If it doesn't, the 3060 12GB drops out (as if it's not there anymore) and only the 4060 Ti 16GB continues rendering.

  • marble Posts: 7,500
    edited August 2023

    Now I'm getting confused. I see what Richard is saying and that makes sense, but it still negates one GPU completely if (in the case of 12 + 16) it drops to a single GPU after going past the 12 GB limit of the 4070? Or am I still misunderstanding?

    Not that it makes any difference to me now as I went ahead and bought a 4080 instead.

  • kyoto kid Posts: 41,057

    PerttiA said:

    kyoto kid said:

    ..so the cores from the 3060 wouldn't add to the render speed then.

    They will, as long as the scene fits in the VRAM of both cards, i.e. the total VRAM usage of the scene and all the baseloads stays under 12 GB.
    If it doesn't, the 3060 12GB drops out (as if it's not there anymore) and only the 4060 Ti 16GB continues rendering.

    ....OK, got it. Thank you. (Things were so much simpler in the 3DL days.)

    So now, in the case of pairing the 3060 with the Titan X: as both have the same VRAM (12 GB), and as long as the scene does not exceed that, the resources (core counts) of both cards will add together, though not quite as fast as two 3060s since the Titan predates RTX.

  • PerttiA Posts: 10,024

    kyoto kid said:

    PerttiA said:

    kyoto kid said:

    ..so the cores from the 3060 wouldn't add to the render speed then.

    They will, as long as the scene fits in the VRAM of both cards, i.e. the total VRAM usage of the scene and all the baseloads stays under 12 GB.
    If it doesn't, the 3060 12GB drops out (as if it's not there anymore) and only the 4060 Ti 16GB continues rendering.

    ....OK, got it. Thank you. (Things were so much simpler in the 3DL days.)

    So now, in the case of pairing the 3060 with the Titan X: as both have the same VRAM (12 GB), and as long as the scene does not exceed that, the resources (core counts) of both cards will add together, though not quite as fast as two 3060s since the Titan predates RTX.

    Although, there has been some talk about a pre-RTX card with its RTX software emulation not playing nice with an RTX card.

  • kyoto kid Posts: 41,057
    edited August 2023

    ...so better to use the Titan just to drive the displays. 

    Sad, as it was once Nvidia's flagship, but it is now five generations old with the release of the Ada Lovelace architecture.

    Sort of like when the airlines put some of their most iconic aircraft, like Lockheed's Constellations and Douglas' DC-6s, on "milk runs" in the 60s after jet travel became the norm.

  • Elor Posts: 1,494

    PerttiA said:

    Pretty much depends on what you put in the scene. I had nine G8 figures with lightweight clothing and lightweight hair, a Horse 2, a Dog 8, a whole western town, Iray Worlds Skydome etc. and some props in the scene (without any optimisation tricks) and it wasn't a problem for my RTX 3060 12GB, but others are having problems getting 2-3 G8 figures rendered on a 12GB GPU.

    [...]

    The amount of RAM should be around three times the VRAM, or a minimum of 32 GB.

    Thank you for your answer.

    I now have a better idea of what I may need if I decide to buy an RTX card.

  • outrider42 Posts: 3,679

    Each GPU has its own capacity, if that capacity is exceeded it will not participate in the render at all. That is the rule.

    So if you have a 3060 12gb and a 4060ti 16gb, if the scene is under 12gb then both GPUs will render.

    If the scene exceeds 12gb then the 4060ti will render by itself.

    If the scene exceeds 16gb then no GPU will render, and you will be stuck rendering on CPU, which you absolutely do not want. If your scene is that large, the CPU could literally take days to render that scene depending on what you have.
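The three cases above can be sketched as a toy function. This is only an illustration of the rule as described in the thread; the function and card names are hypothetical, and Iray does not expose an API like this.

```python
# Toy model of the multi-GPU behaviour described above: each card
# renders only if the whole scene fits in its own VRAM; a card that
# can't fit the scene drops out entirely, and if no card is left the
# render falls back to the CPU. Capacities are in GB.

def rendering_devices(scene_gb, cards):
    """Return the devices that will participate in the render."""
    active = [name for name, vram_gb in cards if scene_gb <= vram_gb]
    return active if active else ["CPU"]

cards = [("RTX 3060 12GB", 12), ("RTX 4060 Ti 16GB", 16)]
print(rendering_devices(10, cards))  # both cards render
print(rendering_devices(14, cards))  # only the 4060 Ti renders
print(rendering_devices(18, cards))  # CPU fallback
```

Note the all-or-nothing shape of the rule: capacities never pool, so two 12 GB cards still cap you at a 12 GB scene, just rendered faster.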

    So you can see the issue with having two GPUs that have different capacities, especially if the larger capacity GPU is the "slower" one, like in the case of a 4070 and a 4060ti. You have to know what you are getting into with this. I have a 3060 and a 3090, so if the 3060 drops out I can still do ok as it isn't a massive boost. But it is nice when the 3060 can lend a hand. This might make some people want to keep their scenes under 12gb so that both GPUs will render.

    As to mixing RTX and older GTX, when I had my 1080tis and my 3090 together, I was having some issues with Daz Studio crashing. I have no idea what the problem was, it might render fine a few times and then crash without any warning. These scenes were not huge, either, they easily fit the 1080ti capacity so there should not have been an issue. When I swapped the 1080ti for my 3060 the crashing stopped. Now this could be just me. But I had two 1080tis. It didn't matter which one I used with the 3090, I could still crash. I have no information as to why this might have happened, or if the newer DS fixes it. I gave one 1080ti to a friend who has had no issues with it, though he only plays games, it has performed great for him. So I do not think it was a hardware problem. Again, this could have been just me. It could have been a driver or that particular Daz Studio and maybe it will not happen to somebody else. At any rate, given how wide the performance gap is to GTX, it may not be worth using it with RTX anyway. Even just going to a 3060 is a huge boost by itself over even the fastest GTX GPUs ever made. My 3060 was twice as fast as my 1080ti, and that is basically the fastest GTX you can get.

    If there is a big difference in GPU power, I would put the better one on the display. While that might mean it uses a touch more memory to drive the display, the GPU running the display can help the Daz viewport run a little better, too. That to me is pretty worthwhile, a less laggy viewport is just as vital as a faster render. Though you don't need a top notch GPU for the viewport, there is not a big difference between using my 3090 or 3060 for my display. At least it didn't seem to be, both being Ampere is possibly enough. I would experiment and try it both ways just to see. Use the display with one for a while and then swap them to see if DS behaves any different. I use my 3090 to drive the display since it has so much memory that it doesn't become any issue. And it gives my 3060 the best chance to have enough VRAM for more of my scenes.
