Quick GPU question

jimmyzpop Posts: 118
edited June 2021 in The Commons

Hello, I have a 2080 Ti with 11 GB of VRAM and have the ability to purchase an RTX 3090 with 24 GB of VRAM or a 3060 with 12 GB of VRAM. Which setup would show better performance / render times in DAZ 3D for scenes in excess of 16 GB when paired with my current 2080 Ti?

I know that when using dual cards, Daz will use only 11 GB of VRAM for each card (22 GB total), since the 3090 will have to match the VRAM of the 2080 Ti.

Will render times be noticeably faster with a 3090 + 2080 Ti versus a 3060 + 2080 Ti?

Thank you all in advance.  

Post edited by jimmyzpop on

Comments

  • barbult Posts: 24,839
    edited June 2021

    The RTX 3090 features 10,496 CUDA cores; the RTX 3060, on the other hand, comes with 3,584, so the 3090 will be much faster. If your scene is in excess of 16 GB, it is too big for the 2080 Ti and too big for the 3060. Neither would be used; the CPU would render. You cannot add the memory of the 2080 Ti and the 3060 together - it doesn't work that way. For the combo of 2080 Ti and 3060, the scene needs to fit completely into BOTH graphics cards.
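
    To put that rule in code terms, here is a minimal illustrative Python sketch - the function is hypothetical, not a real Daz or Iray API; only the VRAM capacities are published specs:

        # Illustrative sketch of Iray's memory rule: each GPU must hold the
        # ENTIRE scene; VRAM is never pooled or summed across cards.
        def render_devices(scene_gb, gpus):
            """Return the GPUs that can render; fall back to CPU if none can."""
            fits = [name for name, vram_gb in gpus if vram_gb >= scene_gb]
            return fits if fits else ["CPU"]

        # A 16 GB scene with a 2080 Ti (11 GB) and a 3060 (12 GB) renders on CPU:
        print(render_devices(16, [("RTX 2080 Ti", 11), ("RTX 3060", 12)]))  # ['CPU']
        # Swap in a 3090 (24 GB) and the same scene renders on the 3090 alone:
        print(render_devices(16, [("RTX 2080 Ti", 11), ("RTX 3090", 24)]))  # ['RTX 3090']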

    Post edited by barbult on
  • jimmyzpop Posts: 118

    Thank you @barbult! Your reply has been very helpful in my decision. I was wrongly interpreting how the memory works with dual cards in Daz. Given the performance of the CUDA cores and the dual-card memory limitations, it makes the most sense to go with the 3090. Thank you again for your help and quick reply!

  • barbult Posts: 24,839

    I tried to write more, but the forum kept failing. Only certain combinations of identical cards have the capability to pool memory. I believe the 3090 has that capability, but only if you combine multiple identical 3090 cards, not if you combine a 3090 with a 20xx-series card. I'm not an expert on that at all, so do some more research if you are interested in that concept.

  • kyoto kid Posts: 41,244
    edited June 2021

    ...I'd use the 2080 Ti to drive the displays and dedicate the 3090 to rendering. No point in wasting the money (particularly at scalper prices) on all that extra VRAM only to hamstring it with a lower-VRAM card.

    The base 3060 has pretty much the same specs as a Titan X/Xp, save that it has faster GDDR6 memory along with RT and Tensor cores, as well as an MSRP of $329 (the Titan X retailed for $999 and the Xp for $1,199); however, I've seen 3060s marked up as high as $800. Though the slightly higher-priced ($399 MSRP) 3060 Ti has 4,864 cores along with more RT and Tensor cores, it only offers 8 GB of VRAM, which would further limit scene size.

    ETA: getting 502 Bad Gateway errors here as well. They really need to fix this.

    Post edited by kyoto kid on
  • barbult Posts: 24,839

    kyoto kid said:

    ...I'd use the 2080 Ti to drive the displays and dedicate the 3090 to rendering. No point in wasting the money (particularly at scalper prices) on all that extra VRAM only to hamstring it with a lower-VRAM card.

    The base 3060 has pretty much the same specs as a Titan X/Xp, save that it has faster GDDR6 memory along with RT and Tensor cores, as well as an MSRP of $329 (the Titan X retailed for $999 and the Xp for $1,199); however, I've seen 3060s marked up as high as $800. Though the slightly higher-priced ($399 MSRP) 3080 Ti has 4,864 cores along with more RT and Tensor cores, it only offers 8 GB of VRAM, which would further limit scene size.

    ETA: getting 502 Bad Gateway errors here as well. They really need to fix this.

    The 3080 Ti has 12 GB, not 8 GB. It also has 10,240 CUDA cores, not 4,864.

  • kyoto kid Posts: 41,244

    ...apologies [corrected], I meant the 3060 Ti. Having to deal with all the forum crashes has been most distracting.

  • SnowSultan Posts: 3,643

    If you have the ability to buy a 3090, can afford it, and aren't being price-gouged, take it - and please tell a few of us where you got it and whether there are any more. ;) They are impossible to find these days, and those that show up are either used (probably mined half to death) and/or twice their normal price.

  • MelissaGT Posts: 2,611
    edited June 2021

    I'd like to know how you managed to get access to a 3090, lol... that is, without sitting in front of your PC refreshing sales pages all day every day or waiting for an email that never comes. The only ones I can find are from scalpers... the price has come down a bit, but it's still way, way, way more than MSRP.

    Post edited by MelissaGT on
  • rrward Posts: 556

    Like someone else said, use the 2080 Ti for your display and the 3090 for rendering. Not only does the 3090 have double the VRAM of the 2080 Ti, it is also twice as powerful in rendering. I wouldn't bother using both for rendering, but that's me. Oh, and where the [deleted] did you get your hands on a 3090?

  • jimmyzpop Posts: 118

    Thanks for the help, all! The forums have been messing up for me as well. With your great advice, I will use the 2080 Ti for display and the 3090 for rendering. As for where I am able to get a new 3090 at retail price: a friend of mine got a 3090 by waiting in line at Micro Center overnight last Wednesday for a Thursday morning shipment, and his wife was also lucky enough to order a 3090 that same morning on Antonline's website when they went online. So since he now has two and only needs one for gaming, he is willing to sell me the other 3090 at retail; otherwise he was just going to list it online. Best of luck to you all, and my buddy found out about the Antonline shipments through the stockdrops Discord channel.

  • kyoto kid Posts: 41,244
    edited June 2021

    ...nice catch, just make sure your case is large enough to accommodate it, as depending on the vendor a 3090 can be as long as 13.2" (a couple of the MSI models). With some squeezing and grunting I could shoehorn a Founders Edition (12.3") into my big old P-193 case, but it would be a very tight fit. Some of the EVGA ones are between 11.2" and 11.8".

    Yeah, I could sort of afford one at MSRP; however, I don't have the system memory (my MB tops out at 24 GB) or the expansion slots (PCIe 2.0) to properly support it, so I'd have to sink just about as much, if not more, into a new MB, CPU, and memory.

    Post edited by kyoto kid on
  • JamesJAB Posts: 1,760

    There are two other potential issues with running the RTX 2080 Ti and the 3090 at the same time. Most RTX 3090 cards take three slots, so both cards together occupy five slots (and your motherboard needs its x16 slots spaced right for the two cards). The other big one is the power supply, as these are both very power-hungry cards (four 8-pin connectors between the two of them).
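
    As a rough back-of-the-envelope check in Python (a sketch with assumed numbers: the GPU figures are approximately Nvidia's published board power, while the CPU, rest-of-system, and headroom values are placeholder estimates):

        # Rough PSU sizing sketch for a 2080 Ti + 3090 build (illustrative only).
        GPU_TDP_W = {"RTX 2080 Ti": 260, "RTX 3090": 350}  # approx. board power
        CPU_W = 105      # assumed high-end desktop CPU under load
        REST_W = 75      # assumed drives, fans, RAM, motherboard
        HEADROOM = 1.25  # ~25% margin for transient spikes and PSU aging

        load_w = sum(GPU_TDP_W.values()) + CPU_W + REST_W
        print(f"Estimated load: {load_w} W; suggested PSU: {load_w * HEADROOM:.0f} W")
        # Estimated load: 790 W; suggested PSU: 988 W -> shop for ~1000 W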

  • fred9803 Posts: 1,564

    The first question I'd be asking myself is whether I really needed a 3090. Am I a professional who needs to render heavy scenes and/or render as quickly as possible in big formats? If the answer is yes, then OK, go and get one; you need one. Otherwise I think the expenditure is hard to justify, at least until prices become far less insane.

  • kyoto kid Posts: 41,244

    ...I would be the type who could use one, as I plan to create fairly epic-level scenes, with elements that add to the memory load, in large format at high quality for gallery-level prints.

    Crikey, if I could afford an A6000 I'd be extremely happy.

  • barbult Posts: 24,839
    If you can afford it and your computer can handle it, go for it! I can't see how you would regret that decision. You don't have to be a professional to deserve the best!
  • kyoto kid Posts: 41,244
    edited June 2021

    ...that's the rub: my system cannot (it still has PCIe 2.0 expansion slots and is at the MB's memory limit of 24 GB), so I'd need to dump a fair amount into a new MB, CPU, and more memory (minimum 64 GB, but preferably 128 for future-proofing purposes).

    As I'd choose to stay with pre-Kaby Lake tech, it means a slightly higher cost, since I would go with server components (new LGA 2011-v3 boards for 6th-generation i7s are difficult to find, most being either off brands I've never heard of or used/refurbished units that all ship from overseas vendors; "YMMV" comes to mind). The price comes to $1,496 (about what it cost to build my current system before recent upgrades) for a single-socket Supermicro workstation/server ATX MB, an 8-core 3.2 GHz Xeon E5-2667, and 128 GB of DDR4-2133 ECC memory.

    With an RTX 3090 at MSRP, that would pretty much drain my account.

    Post edited by kyoto kid on
  • IamPerttiA Posts: 25

    rrward said:

     Oh, and where the [deleted] did you get your hands on a 3090?

    Here in Finland they are available at around 3,000 EUR (+/- 200 EUR) VAT included, as are the new 3080 Tis and 3070 Tis (2,000 EUR and 1,000 EUR respectively).

  • Chumly

    I dunno... there is some debate about the performance hit of using/not using your 2080 to drive the monitors...

    What I will say is that 2080s are still selling well on eBay... you could certainly recoup some of the cost of the 3090 by flogging your old 2080 on eBay.

    Or, for that matter, I am sure there are a lot of folks even here at the forum who would love to upgrade to a 2080...

  • Gator_2236745 Posts: 1,312
    edited June 2021

    Chumly said:

    I dunno... there is some debate about the performance hit of using/not using your 2080 to drive the monitors...

    What I will say is that 2080s are still selling well on eBay... you could certainly recoup some of the cost of the 3090 by flogging your old 2080 on eBay.

    Or, for that matter, I am sure there are a lot of folks even here at the forum who would love to upgrade to a 2080...

    Yeah, I'm one who had a separate card to drive the display and ditched that setup. With earlier versions of the Nvidia drivers or Daz, it made a difference: when I didn't have the dedicated display card, the system got really slow while rendering, making other things like Photoshop slow and painful. Even simple web browsing was painful. So with a new computer I added a dedicated card for the display. With gaming, though, I had to switch the cables to a Titan or the game would use my low-end card.

    Somewhere along the line the drivers or Studio changed, and the system no longer got painfully slow while rendering. I just built a new system with a Ryzen 5950X and a 3090, and I have just the 3090 in it. I can render and the system is still fast for other tasks. Very happy with the setup.

    Based on my experience, in the OP's shoes I'd sell the 2080 Ti while demand is super hot, once I had a working 3090 in my hands.

    Post edited by Gator_2236745 on
  • Drip Posts: 1,206

    Within your render settings, you can select which GPUs (and the CPU) to use for rendering. So, here is what you can do with that 2080 Ti + 3090 combo (see the sketch after this list):

    If your scene takes more than ~9 GB of VRAM, disable the 2080 Ti, so only the 3090 is assigned and the rendering won't drop to CPU.
    If your scene takes less than ~9 GB of VRAM, enable both the 2080 Ti and the 3090 for rendering. It's just a matter of ticking a box, and adding that 2080 Ti will speed up your rendering a lot.

    For good measure, you can also enable your CPU in combination with your GPUs, but the gain is generally marginal, while your computer will become slower and less responsive, so even doing some non-graphically-intensive tasks to pass the time may become troublesome.
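
    The decision boils down to something like this - a hypothetical Python sketch, not a real Daz or Iray API; the usable-VRAM figures (card capacity minus an assumed display/driver overhead) are rough estimates:

        # Hypothetical sketch of the manual device-selection rule above.
        USABLE_GB = {"RTX 2080 Ti": 9, "RTX 3090": 22}  # VRAM minus assumed overhead

        def boxes_to_tick(scene_gb):
            """Enable only the render devices that can hold the whole scene."""
            return {name: cap >= scene_gb for name, cap in USABLE_GB.items()}

        print(boxes_to_tick(14))  # {'RTX 2080 Ti': False, 'RTX 3090': True}
        print(boxes_to_tick(6))   # {'RTX 2080 Ti': True, 'RTX 3090': True}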

  • prixat Posts: 1,590

    Drip said:

    If your scene takes more than ~9 GB of VRAM, disable the 2080 Ti, so only the 3090 is assigned and the rendering won't drop to CPU.
    If your scene takes less than ~9 GB of VRAM, enable both the 2080 Ti and the 3090 for rendering. It's just a matter of ticking a box, and adding that 2080 Ti will speed up your rendering a lot.

    Doesn't Iray do this automatically, without the need to tick and untick boxes? 

  • RL_Media Posts: 339

    Yeah, in my experience there's no need to untick the card with less VRAM. My 1070 will drop out and just my 2080 Super will render if the scene doesn't fit in the 1070 but will fit in the 2080 Super.

  • outrider42 Posts: 3,679
    edited June 2021

    prixat said:

    Drip said:

    If your scene takes more than ~9 GB of VRAM, disable the 2080 Ti, so only the 3090 is assigned and the rendering won't drop to CPU.
    If your scene takes less than ~9 GB of VRAM, enable both the 2080 Ti and the 3090 for rendering. It's just a matter of ticking a box, and adding that 2080 Ti will speed up your rendering a lot.

    Doesn't Iray do this automatically, without the need to tick and untick boxes? 

    It is supposed to. It often does work, but sometimes it does not work correctly. I have seen this numerous times over the years with different hardware configs and versions of Daz. Some versions of Daz were much worse about it than others.

    What would happen is that only the top or bottom portion of the render would be OK; the other half would be transparent. When this issue happened, it would persist even if you changed the scene to use less VRAM, thus requiring a restart of Daz. It might also crash.

    In more recent versions of Daz, I have noticed it seems to handle this better than whatever version I was on when this was a problem.

    Post edited by outrider42 on