NVIDIA RTX for DAZ Studio


Comments

  • Matt_Castle Posts: 2,579

    I don't have that specific environment to test, but across the range of environments I've used, it's extremely rare to have to optimise at fewer than six figures, unless there's an obvious resource hog in the scene (full body fur, non-instanced high-poly trees, etc).

    The 3060 might be low down in the Ampere series, but it's a real heavy lifter.

  • PerttiA Posts: 10,024

    Matt_Castle said:

    unless there's an obvious resource hog in the scene (full body fur, non-instanced high-poly trees, etc).

    I think the point is also related to knowing how to spot those resource hogs. Many users do not have a clue.
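    If you want a quick way to spot them, a minimal Python sketch along these lines (the folder path is just a placeholder for your own content library, and it assumes Pillow is installed) will list the largest texture maps under a content folder:

    ```python
    # Minimal sketch: list the biggest texture maps under a content folder.
    # The path below is a placeholder - point it at your own Runtime/Textures dir.
    import os
    from PIL import Image  # pip install Pillow

    TEXTURE_DIR = r"C:\Users\Public\Documents\My DAZ 3D Library\Runtime\Textures"  # placeholder
    EXTENSIONS = {".jpg", ".jpeg", ".png", ".tif", ".tiff"}

    found = []
    for root, _dirs, files in os.walk(TEXTURE_DIR):
        for name in files:
            if os.path.splitext(name)[1].lower() in EXTENSIONS:
                path = os.path.join(root, name)
                try:
                    with Image.open(path) as img:
                        w, h = img.size
                except OSError:
                    continue  # skip unreadable files
                found.append((w * h, w, h, os.path.getsize(path), path))

    # Show the ten largest maps by pixel count - prime candidates for optimisation.
    for pixels, w, h, size, path in sorted(found, reverse=True)[:10]:
        print(f"{w}x{h}  {size / 1e6:6.1f} MB  {path}")
    ```

    The biggest offenders are usually 4K or 8K maps on props that never get anywhere near the camera.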

  • Gator Posts: 1,302

    Matt_Castle said:

    Cornelius said:

    A little update, if I can ..
    I would like to combine my current GPU, an M2000, with my next 3060.
    That card has 4 GB of VRAM and 768 CUDA cores. If I could combine them, I would have a multi-GPU setup available, even if I don't know how to use it in Daz Studio, unlike in Blender.
    I believe I would also need a more powerful power supply, maybe 750 W.

    I'm going to disagree with the reply that called this useless.

    I don't know about that specific GPU, but I kept my old GTX 1650 to use alongside my 3060, because it means I can have the monitor plugged into that, and have Windows/web browsing/etc using the 1650's processor/VRAM and letting the 3060 just get on with the render.

     

    Chumly said:

    A 3060 might get you 4 G8s and A Scene before kicking to CPU.

    Does that sound about right or?

    No.

    These two scenes, each with at least ten figures, needed me to optimise assets in the background:

    https://www.daz3d.com/gallery/user/5700486984368128#image=1195434
    https://www.daz3d.com/gallery/user/5700486984368128#image=1163042

    These, all with at least five figures (and a 3D background, some of them extremely non-trivial) did not:
    https://www.daz3d.com/gallery/user/5700486984368128#image=1215805
    https://www.daz3d.com/gallery/user/5700486984368128#image=1208537
    https://www.daz3d.com/gallery/user/5700486984368128#image=1188789

    Obviously, it depends on the assets, but at least with the setup I'm using (where, as stated above, I've got a 1650 freeing the 3060's VRAM from running the monitor), a 3060 can probably handle at least six figures most of the time, and past that, you can probably afford to optimise figures in the background.

    Wish it worked like that for the consumer cards... it doesn't.

    Even without a monitor plugged in, Windows will reserve VRAM for the desktop just in case you plug a monitor in.  Only with Nvidia's professional cards can you disable it.  Even the Titans weren't good enough in Nvidia's eyes; I had a 1060 for that reason but eventually removed it.

  • Matt_Castle Posts: 2,579

    Gator said:

    Wish it worked like that for the consumer cards... it doesn't.

    I've got GPU monitoring running, and (unless it is outright lying to me) it's clear that with the 1650 as the active card, any programs/videos/etc I load are loaded into just the 1650's VRAM, not both.

    Yes, Windows still has a small amount of VRAM reserved for WDDM, but that's not the only thing that could potentially be taking VRAM from the render - at least if you're hoping that the computer can be used for other things without disturbing the render.
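    For anyone who wants to check this on their own machine, here's a minimal Python sketch (assuming the driver's nvidia-smi tool is on the PATH, which the GeForce driver installs by default) that prints per-GPU VRAM use; watch it while you open a browser or a video and you can see which card the allocations actually land on:

    ```python
    # Minimal sketch: print per-GPU VRAM usage via nvidia-smi.
    # Assumes the NVIDIA driver's nvidia-smi tool is on the PATH.
    import subprocess

    query = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=index,name,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )

    for line in query.stdout.strip().splitlines():
        index, name, used, total = [field.strip() for field in line.split(",")]
        print(f"GPU {index} ({name}): {used} MiB / {total} MiB in use")
    ```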

  • Gator Posts: 1,302

    Matt_Castle said:

    Gator said:

    Wish it worked like that for the consumer cards... it doesn't.

    I've got GPU monitoring running, and (unless it is outright lying to me) it's clear that with the 1650 as the active card, any programs/videos/etc I load are loaded into just the 1650's VRAM, not both.

    Yes, Windows still has a small amount of VRAM reserved for WDDM, but that's not the only thing that could potentially be taking VRAM from the render - at least if you're hoping that the computer can be used for other things without disturbing the render.

    The programs aren't loaded into the idle card's VRAM; it just reserves it from use.  It depends on your resolution; I forget the exact amount, but it's pretty substantial.

  • rrward Posts: 556

    kyoto kid said:

    ...my current Daz system:

    ASUS P6T X58 MB with socket LGA 1366 and two PCIe 2.0 x16 expansion slots.

    Intel Xeon X5660 6-core CPU @ 2.9 GHz (upgraded from an i7 930, 4 cores @ 2.8 GHz), with Cooler Master Geminii-S CPU cooler

    24 GB (6 x 4 GB) DDR3 1333 memory. (upgraded from 12 GB [6 x 2 GB] DDR3 1366)

    Nvidia Maxwell Titan-X 12 GB (upgraded from Gigabyte Maxwell GTX 750 Ti 4 GB).

    Yeah, decade-old tech.

    Planned upgrade 

    ASUS Prime H750+ LGA 1200 MB with PCIe 4.0 expansion slots and M2 slot
    Intel i7-11700KF 3.6 GHz 8 core CPU with Noctua NH-D15S CPU cooler
    64 GB DDR4 (2 x 32 GB) DDR4 3200 memory (with 2 open slots for future upgrade).
    Samsung 980 500 GB M.2 SSD.

    Cost: $800

    Already have: EVGA RTX 3060 XC 12GB GPU (with backplate)

    Dang, I should just put my old MB in a box and send it to you. It's an LGA2011-3 X99-E MB with a 12-core Xeon.

  • PerttiA Posts: 10,024

    Gator said:

    Matt_Castle said:

    Gator said:

    Wish it worked like that for the consumer cards... it doesn't.

    I've got GPU monitoring running, and (unless it is outright lying to me) it's clear that with the 1650 as the active card, any programs/videos/etc I load are loaded into just the 1650's VRAM, not both.

    Yes, Windows still has a small amount of VRAM reserved for WDDM, but that's not the only thing that could potentially be taking VRAM from the render - at least if you're hoping that the computer can be used for other things without disturbing the render.

    The programs aren't loaded into the idle card's VRAM; it just reserves it from use.  It depends on your resolution; I forget the exact amount, but it's pretty substantial.

    On W10 it's about a gigabyte, on W7 just 300 MB (with 3 monitors connected)

  • Richard Haseltine Posts: 100,960

    PerttiA said:

    Gator said:

    Matt_Castle said:

    Gator said:

    Wish it worked like that for the consumer cards... it doesn't.

    I've got GPU monitoring running, and (unless it is outright lying to me) it's clear that with the 1650 as the active card, any programs/videos/etc I load are loaded into just the 1650's VRAM, not both.

    Yes, Windows still has a small amount of VRAM reserved for WDDM, but that's not the only thing that could potentially be taking VRAM from the render - at least if you're hoping that the computer can be used for other things without disturbing the render.

    The programs aren't loaded into the idle card's VRAM; it just reserves it from use.  It depends on your resolution; I forget the exact amount, but it's pretty substantial.

    On W10 it's about a gigabyte, on W7 just 300 MB (with 3 monitors connected)

    As I recall, a (relatively) recent update of Windows 10 has reduced the impact.

  • PerttiA Posts: 10,024

    Richard Haseltine said:

    PerttiA said:

    Gator said:

    The programs aren't loaded into the idle card's VRAM; it just reserves it from use.  It depends on your resolution; I forget the exact amount, but it's pretty substantial.

    On W10 it's about a gigabyte, on W7 just 300 MB (with 3 monitors connected)

    As I recall, a (relatively) recent update of Windows 10 has reduced the impact.

    As far as I was able to track it down, it was that update which brought it down to around a gigabyte; previously it was a fixed percentage of the VRAM, so cards with more VRAM had even more than a gigabyte reserved.

  • nicstt Posts: 11,715
    edited April 2022

    Torquinox said:

    nomad-ads_8ecd56922e said:

    I'm looking at the 12gig variants of the 3060 card.  I was of the understanding that one should have double the system ram compared to what the vram was, so 16gigs (of that now-never-to-be-made card) would have been about right... but now I see lots of talk of needing 3 to 4 times the system ram wrt what you have in vram.

    3 or 4 times is a thing that is sometimes said here by some people who generally have well-founded opinions. I was on board with it because their reasoning makes sense, but as you can read in the thread, the idea has been contested by folks running double and doing fine that way. Looking further, I found many articles about system specs with respect to video cards and RAM saying double is fine. In light of all that, I think a 3060 on a 32GB RAM system should be fine. I'd like to tell you, "I have 32GB and that is all you need," but the prices were such that 64GB made sense to me and that's what I bought. So, someone else will have to tell you that.

    I've been using a 3090 for over a year, and have 64GB of system RAM.

    In all the time I've had this build, I've never seen Studio using more than 54GB of system RAM; even when the 3090 has been shown using over 23GB of VRAM, it hasn't gone over that 54GB maximum. When it was using that 23GB, it was actually only using about 30GB of system RAM - at its peak.

    The 54GB usage was actually from before I got the 3090.
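    For anyone curious about their own numbers, here's a minimal Python sketch (assuming psutil is installed; the "DAZStudio.exe" process name is an assumption, so check Task Manager for the real one on your install) that logs Studio's current and peak RAM use while a render runs:

    ```python
    # Minimal sketch: track Daz Studio's current and peak RAM use during a render.
    # Assumes psutil is installed (pip install psutil); the process name
    # "DAZStudio.exe" is an assumption - check Task Manager for the real one.
    import time
    import psutil

    PROCESS_NAME = "DAZStudio.exe"  # assumed default process name
    peak = 0

    while True:
        rss = sum(p.info["memory_info"].rss
                  for p in psutil.process_iter(["name", "memory_info"])
                  if p.info["name"] == PROCESS_NAME)
        if rss == 0:
            print("Daz Studio is not running.")
            break
        peak = max(peak, rss)
        print(f"current: {rss / 2**30:5.1f} GB   peak: {peak / 2**30:5.1f} GB")
        time.sleep(5)
    ```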

    Post edited by nicstt on
  • kyoto kid Posts: 41,057
    edited April 2022

    PerttiA said:

    Gator said:

    Matt_Castle said:

    Gator said:

    Wish it worked like that for the consumer cards... it doesn't.

    I've got GPU monitoring running, and (unless it is outright lying to me) it's clear that with the 1650 as the active card, any programs/videos/etc I load are loaded into just the 1650's VRAM, not both.

    Yes, Windows still has a small amount of VRAM reserved for WDDM, but that's not the only thing that could potentially be taking VRAM from the render - at least if you're hoping that the computer can be used for other things without disturbing the render.

    The programs aren't loaded into the idle card's VRAM; it just reserves it from use.  It depends on your resolution; I forget the exact amount, but it's pretty substantial.

    On W10 it's about a gigabyte, on W7 just 300 MB (with 3 monitors connected)

    ...I'm running dual 24" displays in W7 and the VRAM load is averaging around 210 MB (with Chrome open).

    The larger WDDM footprint of W10 was one more reason I held out.

    True, the Titan series can be run in TCC mode; however, I read somewhere here (tried to track the thread down with no success) that non-RTX cards (Pascal and earlier) take a hit to performance and VRAM in 4.12 or later, as some of the standard cores and VRAM are used to make up for the lack of RT and Tensor cores for handling Iray ray tracing.

    Post edited by kyoto kid on
  • Shoot...

    And EVGA has their RTX 3060 12GB card for $389 now...

    I checked, and they don't ship to my address.  Otherwise, I'd be doing the happy dance right now.

  • Narkon Posts: 12

    I wish that were the case where I live. Here the cheapest I can find the EVGA GeForce RTX 3060 12GB GDDR6 XC is around €500.

    I have wanted to give rendering a go for a few years now, but the GPU pricing held me back. Now I'm debating between getting an RTX 3060 12GB and waiting for the 4000 cards. Recently I was talking with a friend about these two possibilities and he mentioned that the RTX series is not meant to be used for renders and that the life expectancy of the cards drops when used in that way. Obviously, I don't plan to have the card doing renders 24/7, so mostly a casual/beginner approach, but I've been wondering if this is true and what my expectations should be.

     

  • LeatherGryphon Posts: 11,512
    edited August 2022

    Hmmm... "RTX series is not meant to be used for renders and that the life expectancy of the cards drops when used in that way."

      That type of statement needs more than just the word of a friend, and seems highly unlikely, or at best only marginally true and not important in the scheme of things.  There is so much untruth in the world today that we need to start investigating facts and considering the agenda of the source.

    Can anyone report similar comments but from reliable sources?

    Post edited by LeatherGryphon on
  • Matt_Castle Posts: 2,579

    Narkon said:

    he mentioned that the RTX series is not meant to be used for renders and that the life expectancy of the cards drops when used in that way.

    If we assume he means "GeForce" rather than "RTX" (There are definitely workstation cards marketed as RTX), that's maybe technically true, but still definitely wrong.

    The GeForce series is indeed not designed with rendering use as a primary concern but instead for gaming. So one can technically say that it's "not meant to be used for renders", but that's different to saying it's "meant not to be used for renders". Nvidia are fully aware the cards get used in this way, and provide driver suites that make it possible to use GeForce cards for Iray. Heck, their "Studio" driver suites, which definitely support GeForce, are actually tested with Daz Studio - amongst other things - which certainly doesn't count as them telling us not to do it.

    Also, any load shortens the life expectancy of the cards (so, again, technically true), but it's not uncommon for Iray renders to use less power and create less heat than running a demanding video game does. The difference will often be that a render runs for longer than you might game, but still, the likely lifetime of a GPU (unless you're running it 24/7 trying to mine crypto on it) is generally going to be quite a long way past the point where it's functionally obsolete.
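    If you want to see how hard a render is actually pushing the card compared to a game, here's a minimal Python sketch (again assuming nvidia-smi is on the PATH) that samples temperature and power draw once per second; stop it with Ctrl+C:

    ```python
    # Minimal sketch: sample GPU temperature and power draw once per second.
    # Assumes nvidia-smi (installed with the NVIDIA driver) is on the PATH.
    # Stop with Ctrl+C.
    import subprocess
    import time

    while True:
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=temperature.gpu,power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        for gpu_index, line in enumerate(out.splitlines()):
            temp_c, watts = [field.strip() for field in line.split(",")]
            print(f"GPU {gpu_index}: {temp_c} C, {watts} W")
        time.sleep(1)
    ```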

  • oddbob Posts: 396

    Narkon said:

    he mentioned that the RTX series is not meant to be used for renders and that the life expectancy of the cards drops when used in that way.

     

    The card doesn't know whether it's rendering, gaming or mining, and there are plenty of protections built in. If you ran it close to the thermal limit 24/7 there might be some degradation, but I'm guessing it would be old enough to be obsolete before it became an issue. Modern PC parts are pretty robust.

    LTT did a video where he tested ex-mining cards and didn't find any performance or stability issues, and mining stresses the VRAM more than gaming or rendering does.

    I use my GPUs for gaming and rendering. They draw more power and generate more heat while gaming, especially if ray tracing and upscaling are in use.

  • LeatherGryphon Posts: 11,512
    edited August 2022

    If " not suitiable", then why and by how much?  Facts.

    I'm not going to look it up because I don't care.  My RTX 3060s work just fine for me, and yes, I'm 73 and I believe I will die before they do.  But admittedly, I don't run them for day-long renders.  And if it turns out that the factory-shipped "OC" (overclocked) cards run hotter than is healthy, one can easily undervolt the card just a tiny bit and achieve a significant reduction in power draw and temperature while losing little speed.  There are lots of YT videos showing how.
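    A true undervolt needs a vendor tool such as MSI Afterburner, but a simpler, related approach is to cap the card's power limit with nvidia-smi. A minimal sketch (the 140 W value is purely illustrative, not a recommendation, and the command needs an administrator prompt):

    ```python
    # Minimal sketch: read and cap the GPU power limit via nvidia-smi.
    # Power limiting is not the same as undervolting, but has a similar effect:
    # lower power draw and temperature for a small performance cost.
    # Requires an elevated (administrator) prompt; 140 is an illustrative value.
    import subprocess

    # Show the current, default, and min/max enforceable power limits.
    subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

    # Cap GPU 0 at 140 W (pick a value inside the range printed above).
    subprocess.run(["nvidia-smi", "-i", "0", "-pl", "140"], check=True)
    ```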

    Methinks it is much ado about nothing.

    A modern high-end graphics card is like a Ferrari: a wonderful machine as long as it's maintained by competent people.  The biggest threat to graphics cards is improper cooling by negligent owners.

    We always fall into this trap of "more & faster" is better, and "newest is the best". 

    Current generation has the kinks ironed out.  Next generation is still a cloud of virtual worms.  Let somebody with more money manifest them.

    Post edited by LeatherGryphon on
  • fred9803 Posts: 1,564

    PerttiA said:

    Matt_Castle said:

    unless there's an obvious resource hog in the scene (full body fur, non-instanced high-poly trees, etc).

    I think the point is also related to knowing how to spot those resource hogs. Many users do not have a clue.

    Totally agree with that, PerttiA. Given the peculiarities of DS memory handling, optimising scenes and restarting DS (shutting down and killing the process in Task Manager after each render) may well give you the benefit of a higher-tier card at no cost.

    The "scene does not fit in GPU memory" thing is misleading when renders that previously dropped to CPU can be "fixed" by simply shutting down and restarting DS. There's obviously much more to it than just GPU memory thresholds.

  • fred9803 said:

    PerttiA said:

    Matt_Castle said:

    unless there's an obvious resource hog in the scene (full body fur, non-instanced high-poly trees, etc).

    I think the point is also related to knowing how to spot those resource hogs. Many users do not have a clue.

    Totally agree with that, PerttiA. Given the peculiarities of DS memory handling, optimising scenes and restarting DS (shutting down and killing the process in Task Manager after each render) may well give you the benefit of a higher-tier card at no cost.

    The "scene does not fit in GPU memory" thing is misleading when renders that previously dropped to CPU can be "fixed" by simply shutting down and restarting DS. There's obviously much more to it than just GPU memory thresholds.

    How is that misleading? It may well be that restarting frees up resources and so enables a scene that was borderline to work - and of course once memory has run out, a restart is needed to free it up; Iray doesn't seem able to clear the partial upload while running.

  • fred9803 Posts: 1,564

    Misleading because when rendering drops to CPU, some people think "OMG, I'd better run out and get a GPU with more VRAM", when as you say all it is is a DS memory handling issue that can often be fixed with a restart of the program. Or, as PerttiA says, reducing resource hogs (usually extreme texture sizes) would also mitigate the need for throwing more money at a better GPU. Personally I don't see any real need for anything above a 2xxx RTX for rendering... if you know what you're doing... that's the clincher.
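    Reducing those extreme texture sizes doesn't even need Daz-specific tools. Here's a minimal Python sketch (assuming Pillow is installed; run it on copies of your maps, never the originals) that halves anything larger than 4096 px on its longest side:

    ```python
    # Minimal sketch: downscale oversized texture maps in place.
    # Assumes Pillow (pip install Pillow); run it on COPIES of your textures.
    import os
    from PIL import Image

    TEXTURE_DIR = r"D:\texture_copies"  # placeholder - a copy of the maps to shrink
    MAX_SIDE = 4096                     # anything larger than this gets halved

    for root, _dirs, files in os.walk(TEXTURE_DIR):
        for name in files:
            if not name.lower().endswith((".jpg", ".jpeg", ".png")):
                continue
            path = os.path.join(root, name)
            with Image.open(path) as img:
                if max(img.size) <= MAX_SIDE:
                    continue
                old_size = img.size
                new_size = (img.size[0] // 2, img.size[1] // 2)
                resized = img.resize(new_size, Image.LANCZOS)
            resized.save(path)  # overwrite the copy with the smaller map
            print(f"{name}: {old_size} -> {new_size}")
    ```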

  • DonDat Posts: 3
    edited May 2023

    I have a Ryzen system with a 5900X CPU and had an RTX 3060 12GB GPU, until I upgraded the GPU to an RTX 3070 Ti 8GB. Both are great for rendering, but the 3070 Ti renders a fairly high-resolution scene with HD textures and 5 or 6 figures (G3, G8 and G9) in nearly half the time my 3060 did.

     

    Don

    Post edited by DonDat on