How much GPU RAM do you really need?


Comments

  • kyoto kid Posts: 41,244

    ...Nvidia dropped the "Quadro" name with Ampere so all the "RTX A" series cards are their pro grade ones. 

  • charles Posts: 849
    edited June 2021

    If your primary use of the card is for rendering, get a motherboard that supports dual cards: use one for your main display and general GPU software work, and the other for rendering. Get the rendering card with as much memory as possible. Ignore the CUDA counts and all that; VRAM, as much as you can get, is what matters when working with Daz Iray renders.

    What's waiting a couple of extra minutes for a render to finish, as opposed to never being able to start the render at all because you don't have enough VRAM for your scene?

     

    Post edited by charles on
  • charles Posts: 849

    kyoto kid said:

    ...Nvidia dropped the "Quadro" name with Ampere so all the "RTX A" series cards are their pro grade ones. 

     Good to know.

  • kyoto kid Posts: 41,244

    charles said:

    If your primary use of the card is for rendering, get a motherboard that supports dual cards: use one for your main display and general GPU software work, and the other for rendering. Get the rendering card with as much memory as possible. Ignore the CUDA counts and all that; VRAM, as much as you can get, is what matters when working with Daz Iray renders.

    What's waiting a couple of extra minutes for a render to finish, as opposed to never being able to start the render at all because you don't have enough VRAM for your scene?

     

    ...that is why I am very content with my Titan X. Yeah, it's three generations old, has a third of the cores of a 3080 Ti or 3090, and has no Tensor or RT cores, but it still gets the job done admirably and far better than CPU rendering.  12 GB is pretty sufficient for my work (I rarely go over 4 GB of VRAM). The big concern is having enough system memory to have Daz and the scene open.  

    I've always contended that VRAM also affects render speed because once that scene dumps to the CPU, all those GPU cores are useless. 

    The one feature I really liked about Reality/Lux was that after the file was submitted to the render engine, I could close the scene and the Daz programme to save system resources. I could also pause rendering, shut the system down, come back later, turn it back on, and resume rendering without skipping a beat.  Wish we had that option with Iray. 

  • nonesuch00 Posts: 18,316

    The way things are going, the next generation of video cards, or even the one after that, will be out before the crypto stuff crashes, but then I should be able to get a 32 GB or even 64 GB VRAM card for $1,000. laugh

  • y3kman Posts: 802
    edited June 2021

    scullygirl818_02147fecb6 said:

    Is the A5000 a Quadro?

    Yes, it's their professional product. They dropped the Quadro name for the RTX branding.

    Post edited by y3kman on
  • The Quadro RTX 5000 generally costs less than the 3090, but you only get 16 GB of RAM :( I guess the question really is whether I go with just the 3090 video card (assuming I can get it for under $2500), in which case I can't update the rest of the computer. Seems like that may be the better investment: just use the i7 and upgrade the rest along the way. But by the time I can build the rest, the prices of the cards will probably have dropped lol.

  • scullygirl818_02147fecb6

    What cards would you suggest that would end up costing less together? Kind of wondering if that's worth it if you are buying new, as opposed to having an old one to add a second card to.
     

    charles said:

    If your primary use of the card is for rendering, get a motherboard that supports dual cards: use one for your main display and general GPU software work, and the other for rendering. Get the rendering card with as much memory as possible. Ignore the CUDA counts and all that; VRAM, as much as you can get, is what matters when working with Daz Iray renders.

    What's waiting a couple of extra minutes for a render to finish, as opposed to never being able to start the render at all because you don't have enough VRAM for your scene?

     

  • Gator_2236745 Posts: 1,312

    scullygirl818_02147fecb6 said:

    What cards would you suggest that would end up costing less together? Kind of wondering if that's worth it if you are buying new, as opposed to having an old one to add a second card to.
     

    charles said:

    If your primary use of the card is for rendering, have a MB that supports dual cards, use one as your main display and GPU software usage scratch and the other for rendering. Have the rendering card with as much memory as possible, ignnore all the CUDA's and stuff like that, VRAM, as much as you can get, is what you want working with Daz Iray renders.

    What's waiting a couple of minutes on a render to finish in opposed to never being able to start a render because you haven't enough RAM for your scene.

     

    Might be a waste of money.

    The 3090 isn't considered a professional card, so I doubt you'll be able to disable the display ports.  
    My old rig had two Titan X Pascals, plus a little 1060 to run the displays.  I was never able to disable the display ports on the Titans, so I didn't get the display reservation back.  On top of that, to play any games I had to plug a monitor into one of the Titans to utilize them, otherwise I got the little 1060's performance.  After a while, swapping cables around for little benefit got old and I just left the main monitor plugged into one of the Titans.

    The only benefit I saw was that on some older 4.x builds of Studio, rendering made my desktop slow at everything - browsing, Photoshop, etc. - when the monitor was plugged into the Titans.  I didn't have that with the dedicated 1060.  But at some point Studio stopped killing desktop performance when rendering on the card the monitor(s) is plugged into.
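    (As an aside for anyone running a similar dual-card setup: the per-GPU memory readout from nvidia-smi makes it obvious which card is actually carrying the display reservation. Below is a minimal sketch, assuming nvidia-smi is on your PATH and Python is available; the helper name is just for illustration, not part of any Daz or Nvidia tool.)

```python
# Query each GPU's memory usage via nvidia-smi; the card driving the display
# will show memory in use even when nothing is rendering.
import subprocess

def gpu_memory_report():
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=index,name,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.strip().splitlines():
        idx, name, used, total = [field.strip() for field in line.split(",")]
        print(f"GPU {idx} ({name}): {used} MiB used of {total} MiB")

if __name__ == "__main__":
    gpu_memory_report()
```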

     

    On the bright side, that post reminded me I have that 1060 in a box somewhere.  I looked, and people are buying them off eBay for nearly what I paid for it.  Time to hawk it.  laugh

  • JamesJAB Posts: 1,760

    kyoto kid said:

    ...I would imagine that because of the RT cores and GDDR6 memory, the 3060 would be faster.  The P3000 is two generations old (Pascal): it has about one third the CUDA cores of the 3060 (and no RT or Tensor cores), a core clock of 1.08 GHz, the same size memory bus (192-bit), and only 6 GB of GDDR5 VRAM (vs. 12 GB for the 3060).  It is considered similar to the Pascal GTX 1060 desktop GPU in performance. 

    Interesting fun fact: there was no desktop version of the Quadro P3000.  The mobile Quadro P3000 is the same GPU that's in the mobile GeForce GTX 1060, except that the Quadro version is paired with slower VRAM.  On the flip side, mobile workstation laptops generally have better cooling setups, so the GPU itself can run at max boost 24/7.
    The closest desktop Quadro to the GTX 1060 is the Quadro P2200; it shares the same GPU but only has 5 GB of VRAM.

    If you can manage it, get an RTX card because of the RT and Tensor cores.  Every GTX card has to dedicate part of your available VRAM to emulating the extra features of the RTX cards when rendering in Iray.  When it comes to Iray, a GTX 1060 6 GB card will not be able to render a scene that barely fits on an RTX 2060 6 GB.

  • Krzysztofa Posts: 226

    JamesJAB said:

    If you can manage it, get an RTX card because of the RT and Tensor cores.  Every GTX card has to dedicate part of your available VRAM to emulating the extra features of the RTX cards when rendering in Iray.

    I never knew this before, but it explains some of the VRAM usage I've seen and been confused by...

  • UHF Posts: 515

    I'm currently able to run 20 GB renders with 8 GB video cards with no impact on performance, so I'm good.  (I use Octane.)

    I'm planning to pick up a 24 GB Nvidia (3090?) card and give Iray a try.  Iray is so much slower and more expensive to use than Octane. 

  • cridgit Posts: 1,757
    edited May 2022

    Redacted

    Post edited by cridgit on
  • Gordig Posts: 10,187

    cridgit said:

    UHF said:

    I'm currently able to run 20 GB renders with 8 GB video cards with no impact on performance, so I'm good.  (I use Octane.)

    I'm planning to pick up a 24 GB Nvidia (3090?) card and give Iray a try.  Iray is so much slower and more expensive to use than Octane. 

    Do you mind elaborating on how Iray is more expensive than Octane? Iray is bundled with DAZ Studio, while Octane has a monthly fee equal to what I spend on content. Unless you're referring to Iray Server, which has to be licensed separately?

    I looked at Octane because I was hoping to try something a bit lighter/faster, but it's out of my budget.

    Octane is free to use within DS. 

  • UHF Posts: 515

    Gordig said:

    cridgit said:

    UHF said:

    I'm currently able to run 20 GB renders with 8 GB video cards with no impact on performance, so I'm good.  (I use Octane.)

    I'm planning to pick up a 24 GB Nvidia (3090?) card and give Iray a try.  Iray is so much slower and more expensive to use than Octane. 

    Do you mind elaborating on how Iray is more expensive than Octane? Iray is bundled with DAZ Studio, while Octane has a monthly fee equal to what I spend on content. Unless you're referring to Iray Server, which has to be licensed separately?

    I looked at Octane because I was hoping to try something a bit lighter/faster, but it's out of my budget.

    Octane is free to use within DS. 

    I have one of the original permanent licenses. 

    Iray can't use my PC's RAM for rendering. With Octane, my GTX 1660 8 GB is capable of rendering up to 32 GB+ of textures. There's virtually no performance impact.

    To use Iray I clearly need a far, far more expensive GPU to render in the manner I am used to.  That's the only reason I haven't been using Iray.

    The downside to Octane is that you have to do a lot of texture work to get things right.  But it's fast; the attached render took 15 minutes.

    [Attached render: RoboticsLab.jpg, 1920 x 1080]
  • kyoto kid Posts: 41,244
    edited June 2021

    UHF said:

    I'm currently able to run 20 GB renders with 8 GB video cards with no impact on performance, so I'm good.  (I use Octane.)

    I'm planning to pick up a 24 GB Nvidia (3090?) card and give Iray a try.  Iray is so much slower and more expensive to use than Octane. 

    ...the difference with Octane is that it has out-of-core rendering while Iray doesn't, which allows one to render larger scenes with less VRAM. A great idea; however, as with other software, it has gone to a full subscription model, which as I understand it requires being online while working. It also means having to convert Iray materials.  You also still need the Daz plugin. 

    Post edited by kyoto kid on
  • takezo_3001 Posts: 1,997

    If you can snag a 3090 at MSRP, then go for it. The 3080 Ti is such a tease: 12 GB is a lot, but 24 GB is so much more, and you have the added benefit of being able to do other things with your PC while waiting for a render, such as gaming, watching a movie, surfing online, etc...

  • UHF Posts: 515

    kyoto kid said:

    UHF said:

    I'm currently able to run 20 GB renders with 8 GB video cards with no impact on performance, so I'm good.  (I use Octane.)

    I'm planning to pick up a 24 GB Nvidia (3090?) card and give Iray a try.  Iray is so much slower and more expensive to use than Octane. 

    ...the difference with Octane is that it has out-of-core rendering while Iray doesn't, which allows one to render larger scenes with less VRAM. A great idea; however, as with other software, it has gone to a full subscription model, which as I understand it requires being online while working. It also means having to convert Iray materials.  You also still need the Daz plugin. 

    I do out-of-core rendering inside the plugin.  I never use the full Octane even though I could. (It would double my horsepower using my older PC.)  The only downside with Octane (for me) has been texture/shader work, which for some products can be very extensive.  Anything PBR-based is fast unless there are a lot of textures involved, as with the attached render.

    I want to switch to Iray simply because I want to see if there's enough benefit.  I don't know how other folks do it, but I rely on lots and lots of quick renders to adjust textures, posing, and lighting.  That's super fast with Octane, but my experience so far with Iray is less than stellar. (The way I see it, Iray will mean less texture work but slower rendering, even with a more expensive GPU.)

    The newer Octane plugin is free and supported, but it only allows a single GPU and excludes the full Octane environment.

    The one feature I really, really miss from Reality/Lux is the ability to dump all the light layers separately.  That made cleaning up renders in post a real snap.  (At least I'm not running 20-hour CPU renders any more.)

    [Attached render: Night's Watch.jpg, 1920 x 1080]
  • cridgit Posts: 1,757
    edited May 2022

    Redacted

    Post edited by cridgit on
  • Gator_2236745 Posts: 1,312

    UHF said:

    kyoto kid said:

    UHF said:

    I'm currently able to run 20 GB renders with 8 GB video cards with no impact on performance, so I'm good.  (I use Octane.)

    I'm planning to pick up a 24 GB Nvidia (3090?) card and give Iray a try.  Iray is so much slower and more expensive to use than Octane. 

    ...the difference with Octane is that it has out-of-core rendering while Iray doesn't, which allows one to render larger scenes with less VRAM. A great idea; however, as with other software, it has gone to a full subscription model, which as I understand it requires being online while working. It also means having to convert Iray materials.  You also still need the Daz plugin. 

    I do out-of-core rendering inside the plugin.  I never use the full Octane even though I could. (It would double my horsepower using my older PC.)  The only downside with Octane (for me) has been texture/shader work, which for some products can be very extensive.  Anything PBR-based is fast unless there are a lot of textures involved, as with the attached render.

    I want to switch to Iray simply because I want to see if there's enough benefit.  I don't know how other folks do it, but I rely on lots and lots of quick renders to adjust textures, posing, and lighting.  That's super fast with Octane, but my experience so far with Iray is less than stellar. (The way I see it, Iray will mean less texture work but slower rendering, even with a more expensive GPU.)

    The newer Octane plugin is free and supported, but it only allows a single GPU and excludes the full Octane environment.

    The one feature I really, really miss from Reality/Lux is the ability to dump all the light layers separately.  That made cleaning up renders in post a real snap.  (At least I'm not running 20-hour CPU renders any more.)

    @UHF Shout out to you, since I used to use Octane.  I threw more horsepower at it because I was spending lots of time working with the materials.  That was with V4 and Poser, but I'm sure it's pretty similar - the scene and materials would transfer over with the plugin; some things looked OK, most did not.  Most of my effort went into editing the materials to get them to my liking.

    It's a lot faster with Iray, since I'm not goofing around with textures except for my own customization or, once in a while, when a texture is not very good.  Rendering with Iray is very fast with a 3090. 

  • UHF Posts: 515

    cridgit said:

    How would I do an Octane render in Studio? Under my render engines I don't have Octane, so how is it bundled with Studio? I also don't see an Octane forum or much info on getting started. Any helpful links, please?

    Figuring out what is what is always kinda fun with Octane.  It's a bit like Windows a few years back... Ultimate Double Plus with a Cherry on Top!

    This is the Octane forum, warts and all.

    https://render.otoy.com/forum/viewforum.php?f=44

     

    This is the announcement for the free plugin that Google turned up for me, but I have no idea if it's the most current:

    https://render.otoy.com/forum/viewtopic.php?f=9&t=73287

  • kyoto kid Posts: 41,244

    UHF said:

    kyoto kid said:

    UHF said:

    I'm currently able to run 20 GB renders with 8 GB video cards with no impact on performance, so I'm good.  (I use Octane.)

    I'm planning to pick up a 24 GB Nvidia (3090?) card and give Iray a try.  Iray is so much slower and more expensive to use than Octane. 

    ...the difference with Octane is that it has out-of-core rendering while Iray doesn't, which allows one to render larger scenes with less VRAM. A great idea; however, as with other software, it has gone to a full subscription model, which as I understand it requires being online while working. It also means having to convert Iray materials.  You also still need the Daz plugin. 

    I do out-of-core rendering inside the plugin.  I never use the full Octane even though I could. (It would double my horsepower using my older PC.)  The only downside with Octane (for me) has been texture/shader work, which for some products can be very extensive.  Anything PBR-based is fast unless there are a lot of textures involved, as with the attached render.

    I want to switch to Iray simply because I want to see if there's enough benefit.  I don't know how other folks do it, but I rely on lots and lots of quick renders to adjust textures, posing, and lighting.  That's super fast with Octane, but my experience so far with Iray is less than stellar. (The way I see it, Iray will mean less texture work but slower rendering, even with a more expensive GPU.)

    The newer Octane plugin is free and supported, but it only allows a single GPU and excludes the full Octane environment.

    The one feature I really, really miss from Reality/Lux is the ability to dump all the light layers separately.  That made cleaning up renders in post a real snap.  (At least I'm not running 20-hour CPU renders any more.)

    ...yeah, that was another nice feature of Reality/Lux.   Wow, a 20-hour render - that was fast.  I had to let mine cook for a couple of days to get rid of the graininess, and at the time my system was quite a "beast".  I tried the "speed boost" that came with the 4.2 upgrade, but it came at the price of final render quality (particularly anti-aliasing along hard edges).

    I guess I could try the free version of Octane, but yes, it has some limitations.  I'm not so worried about the GPU limit (I only have one), but being restricted from other tools just wouldn't work for me. The fact that it still needs an Nvidia GPU means, to me, that switching would be of little advantage, as I already have a 12 GB card. 

  • cridgit Posts: 1,757
    edited May 2022

    Redacted

    Post edited by cridgit on
  • Kevin Sanderson Posts: 1,643

    https://www.daz3d.com/octane-render-kit for the plugin and the four video tutorials.

  • Spacious Posts: 481

    The answer to the question in the OP's title is MORE.  It's always more.  A new card with more RAM will seem awesome for a while, because all the scenes you used to make will fit on it and things will render fast.  Over time you will start adding more and more items to your scenes, using more and more complicated and beautiful shaders, higher SubD - you get the point.  Sooner than you think, your imagination will have outpaced the new card and you'll be hunting for the next hardware improvement.   It's just the nature of the beast.

  • kyoto kid Posts: 41,244
    edited June 2021

    ...yeah, that's my issue. Right now 12 GB seems to work pretty well, but as I go on I tend to get more ambitious and start creating bigger scenes with more characters and stuff that requires more horsepower. 

    The scene below, which I created a couple of years ago, takes up about 8.9 GB of system memory, and I have more involved scenes in mind.

    The other factor is that newer content is requiring more robust hardware. I have heard of people with slightly older hardware having difficulties with sets like the Cyberpunk Subway Station, Airport Island, and the Post Apocalyptic Zone due to their size, shader resolution, and/or complexity.  I had Realistic Grass Evolution bog my system down badly, while Cloudscape Creator crashed my display driver (I have 24 GB of system memory - the most the MB supports - and a Titan X).

    [Attached render: railway station beta.png, 1600 x 1200]
    Post edited by kyoto kid on
  • outrider42 Posts: 3,679
    edited June 2021

    For questions about what performance uplift you may expect from almost any GPU (assuming the scene fits in VRAM), let me point you to the Daz forum's benchmark thread:

    https://www.daz3d.com/forums/discussion/341041/daz-studio-iray-rendering-hardware-benchmarking/p1

    The first post also contains a link to the benchmark scene save file, so you can download it and directly compare your render speed to other users'. We do not have a P3000 benchmark, so by all means give it a try and post your results to the thread. The P3000 is supposed to be like a 1060 6GB, which by 2021 standards is getting pretty old, and it was only mid-range when it was new. Pretty much ANY card that has RTX features will utterly blow your card away.
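    (If you want to compare numbers, one simple way is iterations per second: total iterations divided by total render time. A minimal sketch of that arithmetic, with made-up placeholder values - read the real iteration count and total rendering time from your own Daz Studio log, since the exact log wording is assumed here.)

```python
# Placeholder values - substitute the numbers your own Daz Studio log reports
# when the benchmark render finishes (iterations reached and total render time).
total_iterations = 1800            # assumed example value, not a real result
minutes, seconds = 4, 50.3         # e.g. a render time of 4 minutes 50.3 seconds

iterations_per_second = total_iterations / (minutes * 60 + seconds)
print(f"{iterations_per_second:.2f} iterations per second")
```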

    CUDA cores scale very predictably; however, the newer ray tracing cores scale a bit differently. The general idea is that the more complex the geometry is, the bigger the performance uplift the RT cores offer over the old GTX cards. So the more complex your scenes get, the better an RTX card will do compared to a GTX card. Thus "real world" performance may vary a bit compared to this bench scene. I would imagine this scene is relatively simple. Of course, more complex scenes will also require more VRAM.

    So buying a GPU becomes a balancing act of VRAM capacity and performance. The 3060 is a great option and will likely be far, far faster than your P3000 while doubling the VRAM. The bad news, as you have discovered, is that right now is the worst time to be buying a GPU. That is no exaggeration. There has never been a worse time to be in this market. All I can say is good luck. I have wanted to upgrade myself and have been unable to. At least I have two 1080 Tis with 11 GB each, so things are not all doom and gloom for me.

    Another thing to be aware of is system RAM. To take full advantage of more VRAM, you may need more RAM as well. I don't believe you said how much you have, but you will want at least twice as much RAM as VRAM. So if you do get a 12 GB card, odds are you will want at least 32 GB of RAM, and that still might not be enough depending on what you do. Iray has a texture compression system; you can adjust some parameters in the Iray Advanced settings. There are other factors as well: if you hide an item in the Scene tab, that item may not go to VRAM, but it will still take RAM. Personally, I have maxed out my 1080 Ti's 11 GB while using 27 GB of RAM in one scene, but in another scene I used 50 GB of RAM. So it is certainly possible to get the most out of a 12 GB card with 32 GB of RAM, but there may be times you run out of RAM. If you run out of RAM, Daz will crash; it might even cause a system crash.
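    (A rough sanity check on that rule of thumb, as a sketch only: psutil is a third-party Python package, and the 2x multiplier is just the guideline above, not a hard requirement.)

```python
# Compare installed system RAM against roughly twice the VRAM of the card
# you're considering. Requires: pip install psutil
import psutil

vram_gb = 12                                     # VRAM of the card you're eyeing
ram_gb = psutil.virtual_memory().total / 2**30   # installed system RAM in GB

recommended_gb = 2 * vram_gb
print(f"Installed RAM: {ram_gb:.1f} GB; suggested for a {vram_gb} GB card: ~{recommended_gb} GB")
if ram_gb < recommended_gb:
    print("Heavy scenes may exhaust RAM, and Daz can crash when that happens.")
```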

    To be fair, my 50GB scene had numerous hidden items in it, which no doubt bumped the RAM up.

    Post edited by outrider42 on
  • kyoto kid Posts: 41,244
    edited June 2021

    ...some sets are designed so that you can delete items which are not seen or needed, so they don't add to the memory load. I wish more environment content were set up like that.

    Stonemason does a pretty good job with this in many of his large sets. For example, in one scene I was able to delete all elements of his After the War set that didn't affect the scene in the viewport by first unparenting the items from the ground plane and then simply deleting them.  Some of his sets like Urban Sprawl 2 & 3 also have the buildings, props, and segments of the ground/streets plane as individual loadable files.  

    Yeah, the load some environments put on the system made me consider MBs that can support 128 GB of system memory (DDR4). Unfortunately, the generation of CPU I was looking at is pre-Kaby Lake (preferably Haswell), and both the i7s of that generation and the MBs are still pretty expensive.

    Post edited by kyoto kid on
  • cridgit Posts: 1,757
    edited May 2022

    Redacted

    Post edited by cridgit on
  • BandoriFan Posts: 364
    edited June 2021

    Here is a list of parts I'm saving for:

    ASUS TUF Gaming B550-PLUS WiFi AMD AM4 Zen 3 Ryzen 5000 & 3rd Gen Ryzen ATX Gaming Motherboard (PCIe 4.0, WiFi 6, 2.5Gb LAN, BIOS Flashback, USB 3.2 Gen 2, Addressable Gen 2 RGB Header and Aura Sync) - $180

    OR:

    ASUS ProArt B550-Creator AMD (Ryzen 5000/3000) ATX content creator motherboard: $300

    I'll probably go with the $180 one, but the $300 one has much better sound quality.

    AMD Ryzen 5 5600X 6-core, 12-Thread Unlocked Desktop Processor - $300

    Corsair Vengeance LPX 64GB (2x 32GB) DDR4 3200 (PC4-25600) C16 1.35V Desktop Memory - $310

    Corsair iCUE H115i RGB Pro XT, 280mm Radiator, Dual 140mm PWM Fans, Software Control, Liquid CPU Cooler - $140

    WD_Black 1TB SN850 NVMe Internal Gaming SSD Solid State Drive with Heatsink - Gen4 PCIe, M.2 2280, 3D NAND, Up to 7,000 MB/s - WDS100T1XHE - $250. This will be my main plus Daz NVMe. On a SATA 3 SSD, Genesis 8 Female takes forever to load; counting the error message, she took 26.43 seconds to load.

    Then after that I'll get a 3060, since it has 12 GB of VRAM and should be okay until the mid-tier 16 GB VRAM cards come out. Ages ago the 1070s had 8 GB of VRAM, but the 3070s and 3070 Tis still have 8 GB too. They probably don't want people gaming at 8K too cheaply. But can a 3060 handle ZBrush and the other apps needed to make content? 

    Instead of getting one piece at a time, I might just get everything at once; that way all the new versions will be out, and maybe an RTX 4070 will actually have more than 8 GB of VRAM. Though I'll probably do one piece at a time. 

     

    Post edited by BandoriFan on