Daz Studio only works on Nvidia GPUs????

From what I'm reading, this latest Daz Studio only works properly on Nvidia GPUs.  Is this true?  Will I need to toss out my latest high-end AMD Radeon PC and buy an Nvidia-powered PC at twice the price?


Comments

  • Dartanbeck Posts: 21,551

    Not true. However, Iray will not use your Radeon, so it will rely only on your GPU.

    But instead of tossing your new PC, there's another option for you:

    Free OctaneRender and Octane Render Kit for Daz Studio

  • Dartanbeck Posts: 21,551

    Additionally, Filament PBR is not GPU-specific either.

  • Dartanbeck said:

    Not true. However, Iray will not use your Radeon, so it will rely only on your GPU.

    That should read: "Not true. However, Iray will not use your Radeon, so it will rely only on your CPU."

  • Gordig Posts: 10,049

    Dartanbeck said:

    But instead of tossing your new PC, there's another option for you:

    Free OctaneRender and Octane Render Kit for Daz Studio

    Unlike Iray, which can be GPU-accelerated with NVidia cards but will run on the CPU, Octane REQUIRES an NVidia GPU, so it's not a viable alternative for the OP.

  • crosswind Posts: 6,926
    edited June 2023

    OP: The DS 3Delight (RSL) engine works with AMD graphics cards, while the Iray (MDL) engine only works with Nvidia graphics cards. So, if you wanna go for Iray 'Happy Rendering', having an Nvidia card is a must.

    But if you have no wish to buy a new rig with an Nvidia graphics card and don't wanna go for 3Delight either, you may consider Blender, using Daz assets via the Diffeomorphic Daz Importer (DDI). Quite a lot of people work that way, with Happy Rendering... e.g. @Padone, one of the authors of DDI.

  • Nimos Posts: 39

    Looks like Octane is limited to NVidia GPUs also.  Perhaps that Blender plug-in importer might be the best option as long as 3D assets can be posed within Blender.

  • WendyLuvsCatz Posts: 38,202

    you don't need to toss out your PC, just add an Nvidia card

  • marble Posts: 7,500

    Question: Is it possible to use LuxRender these days (since the Reality product was discontinued)? I used Reality/LuxRender for years (on my iMac) before I switched to a PC with Nvidia.

  • crosswind Posts: 6,926

    Not sure... its lifecycle has ended...

  • Padone Posts: 3,688

    @crosswind As for Diffeomorphic, the code and copyright are entirely Thomas's; I just give him some help here and there in my free time.

  • vwrangler Posts: 4,885

    marble said:

    Question: Is it possible to use LuxRender these days (since the Reality product was discontinued)? I used Reality/LuxRender for years (on my iMac) before I switched to a PC with Nvidia.

    Someone was trying to write a bridge for Luxcorerender (the successor to LuxRender) but it's unclear where the project sits right now.

    https://www.daz3d.com/forums/discussion/comment/7508471/


  • crosswind Posts: 6,926

    Padone said:

    @crosswind As for Diffeomorphic, the code and copyright are entirely Thomas's; I just give him some help here and there in my free time.

    Got it~ No problem, that's even better! Thanks for your contribution and help, as always!

  • Snow Posts: 95

    To answer the OP's question: yes, Daz Studio (Iray) only works properly with Nvidia GPUs!

    DAZ will work with AMD, but expect 4-hour renders instead of 4 minutes, because it will use the CPU instead. It does seem to use the card for dForce simulations, because I can hear it spinning up, though it's still very slow.

    I use DAZ with an AMD RX 580 on a Mac (Ventura) and always let it render for 4 hours at 15,000 samples (doing a render as we speak).

    The only good thing about rendering on CPU is that you can still use the computer for other tasks. 

    With AMD we also lack important features such as noise reduction, so that has to happen in post-processing. If you are going for realism, the noise is a pain to remove because it varies across different areas of the image.
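
    For what it's worth, a rough post-process pass can be scripted. This is just a sketch, assuming the opencv-python package is installed; the filenames are placeholders, and a dedicated AI denoiser such as Intel's Open Image Denoise will usually preserve detail better:

        # Rough post-process denoise of a finished render; assumes the
        # opencv-python package. "render.png" is a placeholder filename.
        import cv2

        img = cv2.imread("render.png")

        # Non-local means denoising; h / hColor control strength. Because
        # render noise varies across the image, a single global pass tends
        # to over-smooth some areas, so masking the denoised result over
        # only the noisy regions in an image editor often works better.
        clean = cv2.fastNlMeansDenoisingColored(
            img, None, h=10, hColor=10,
            templateWindowSize=7, searchWindowSize=21)

        cv2.imwrite("render_denoised.png", clean)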

    So if you want a good experience with DAZ, you will have to switch to an Nvidia card. DAZ promises support for AMD in DAZ 5, but that release is probably due in 2050, so don't get your hopes up.

    Good luck!

  • wsterdan Posts: 2,344

    Snow said:

    To answer the OP's question: yes, Daz Studio (Iray) only works properly with Nvidia GPUs!

    DAZ will work with AMD, but expect 4-hour renders instead of 4 minutes, because it will use the CPU instead. It does seem to use the card for dForce simulations, because I can hear it spinning up, though it's still very slow.

    I use DAZ with an AMD RX 580 on a Mac (Ventura) and always let it render for 4 hours at 15,000 samples (doing a render as we speak).

    The only good thing about rendering on CPU is that you can still use the computer for other tasks.

    With AMD we also lack important features such as noise reduction, so that has to happen in post-processing. If you are going for realism, the noise is a pain to remove because it varies across different areas of the image.

    So if you want a good experience with DAZ, you will have to switch to an Nvidia card. DAZ promises support for AMD in DAZ 5, but that release is probably due in 2050, so don't get your hopes up.

    Good luck!

    Was it officially stated that DAZ promises support for AMD in DAZ Studio 5? I missed that! Did they mention any other "promises" for D|S 5?

    -- Walt Sterdan 

  • Leana Posts: 11,690

    crosswind said:

    OP: The DS 3Delight (RSL) engine works with AMD graphics cards, while the Iray (MDL) engine only works with Nvidia graphics cards. So, if you wanna go for Iray 'Happy Rendering', having an Nvidia card is a must.

    3DL doesn't use the GPU at all; it renders only via the CPU.

    Iray can render using either your CPU or an Nvidia GPU. Iray CPU renders take much, much longer though, so having an Nvidia GPU is recommended.

    If you have an AMD card you can still render with both engines, using the CPU. The AMD card will be used for OpenGL or Filament previews and renders, and for dForce simulations.

  • crosswind Posts: 6,926

    Leana said:

    crosswind said:

    OP: The DS 3Delight (RSL) engine works with AMD graphics cards, while the Iray (MDL) engine only works with Nvidia graphics cards. So, if you wanna go for Iray 'Happy Rendering', having an Nvidia card is a must.

    3DL doesn't use the GPU at all; it renders only via the CPU.

    Iray can render using either your CPU or an Nvidia GPU. Iray CPU renders take much, much longer though, so having an Nvidia GPU is recommended.

    If you have an AMD card you can still render with both engines, using the CPU. The AMD card will be used for OpenGL or Filament previews and renders, and for dForce simulations.

    Thanks a lot for the clarification!

  • outrider42 Posts: 3,679
    edited June 2023

    I don't want to start another GPU war, but the fact is that many 3D rendering applications are built around Nvidia and specifically CUDA, which is owned by Nvidia. So this is not just Daz Studio: if you plan on using other software for 3D content, or video editing, or even local Stable Diffusion or other AI, Nvidia is the way to go. It has been this way for over two decades, and will likely remain this way for a while still. AMD recently created HIP, which tries to emulate CUDA, but software still has to make use of it. Not only that, but some of the software that does take advantage of HIP may not use the ray tracing cores on AMD, which results in laughably slow rendering even with a fast AMD GPU. For example, the CUDA-based Redshift has AMD HIP now, so you can use AMD to render, but it doesn't have any RT support yet. The result is that AMD is over a generation behind in speed (and they are a generation behind in ray tracing anyway even when the RT cores do work, but I digress).

    The RTX 6000 Ada is like a 4090, the A6000 is like a 3090, and the A5000 is like a 3080. The W7900 is like an AMD 7900 XTX; the W7800 doesn't have a comparable part yet. So here we see the best AMD card getting absolutely destroyed by last-generation Nvidia, and not even Nvidia's fastest last-gen card at that.

    Blender has an option for AMD as well, but the results are not pretty, as AMD is a full TWO generations behind there. Blender 3.5 supposedly adds full HIP-RT support, but there is no way AMD makes up this massive gap.
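
    For anyone who wants to try the AMD route in Blender anyway, here is a minimal sketch of enabling it, assuming Blender 3.x and its bundled Python API (run it from the Scripting tab):

        # Point Cycles at an AMD GPU via HIP. On Nvidia you would use
        # "CUDA" or "OPTIX" instead; "METAL" covers Apple GPUs.
        import bpy

        prefs = bpy.context.preferences.addons["cycles"].preferences
        prefs.compute_device_type = "HIP"
        prefs.get_devices()            # refresh the detected device list
        for dev in prefs.devices:
            dev.use = True             # enable every detected device

        bpy.context.scene.cycles.device = "GPU"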

    If you are serious about doing 3D, yes, you will need an Nvidia GPU. Some people get by without one, but they do so with a handicap. They have to find a way to export their scenes into other software (and still render far slower), or suffer through hideous CPU rendering speeds slower than watching paint dry. That may be OK for some people, but it is not for me. AMD GPUs are pretty much only good at gaming; everything else is a work in progress. That is just how it is.

    I had an ATI 5870 GPU when I started using Daz Studio back in the day, as I had built the PC for games before finding this wild world of Daz. When I started using Daz Studio, I quickly found that it was no good, and went to eBay to buy a used GTX 670, which was a bit of a mistake because I didn't know how much VRAM I would need. (I bought everything used for a long time.) That is my advice: sell your AMD and buy a recent Nvidia off eBay, 3000 series or newer, with a decent amount of VRAM. I wouldn't hesitate to buy a used 3090 at $600, but I already have a 3090 and a 3060. It sucks, but I have been there. At least my 5870 was only $50 and I had it for a while (I wasn't kidding when I said I bought everything used).

    The reason for all of this is that Nvidia supported software developers better than AMD did when the industry was taking off. For a long time AMD was on the brink of bankruptcy. This is not gossip or rumor; AMD was a penny stock (boy, anybody who owned AMD stock back when it was pennies is doing well today!). That meant AMD had no resources to support software devs the way Nvidia did. Nvidia used CUDA to entrench itself in professional software. So today, even though AMD "is back" on the hardware front, they are still miles behind on the software front. It is AMD's biggest weakness, and not something they can easily make up, the way they did against Intel. (At the rate Intel is going... Intel might become a penny stock; I would not invest in Intel right now.) CUDA is Nvidia's biggest strength.

    Daz Studio is a great example. Daz added Iray in 2015. Iray was pretty new at the time, and it was one of the earlier GPU-accelerated rendering engines you could use on a desktop. But more than that, Nvidia constantly works on CUDA. You can think of CUDA as being a bit like Daz's Genesis: it makes programs easier to build on, just as Genesis is a shortcut for making a 3D human. AMD just makes the hardware, and you have to do a lot of the work yourself, as you would if you had to create a full 3D human from scratch.

    Of course, since Daz is not building Iray themselves, Daz is not in control of Iray. Maybe they can suggest things to Nvidia, but Daz is not developing the render engine. This is a key difference from Redshift: Redshift was able to add AMD HIP to its engine because Redshift builds its own render engine, which just happens to be based on CUDA. Daz Studio would need to adopt another rendering engine that supports AMD.

  • Nimos Posts: 39

    I think there may be another option: importing DAZ content into the Unreal game engine, which is capable of amazingly nice and fast renders on any modern GPU.  I recently played a few games at the highest quality settings and was astonished at the visual beauty of complex renders pouring out at over 90 frames a second.

  • Snow Posts: 95

    Good advice from Outrider42. In short, 3D = Nvidia. Get the best one you can afford!

  • golem841 Posts: 129

    May I add: the best you can afford, with as much VRAM as possible.

    I'm still on two vanilla 2080 cards, 8 GB each; rendering falls back to the CPU very quickly with not-so-complex scenes. Lights and shadows seem to be memory-intensive.

  • Snow Posts: 95

    @golem841, does it actually use both cards? 16 GB of VRAM? Or just 8 GB on steroids?

    If that's not enough VRAM, then a 3090/4080/4090 would be our only option. How do you define "not so complex" scenes?

    I mostly render 1 figure (HD, SubD 4) at high res (4000x6000) for realism, so with all the bells and whistles, but since I am forced to render on the CPU I obviously have no knowledge of VRAM usage. I average 4-hour renders on the i7 4790K. I do wonder how any RTX card would compare: 40 minutes, or 4?

    If I were to switch I would (normally) go for the 4070 Ti.


  • Gordig Posts: 10,049

    Snow said:

    @golem841, does it actually use both cards? 16 GB of VRAM? Or just 8 GB on steroids?

    If that's not enough VRAM, then a 3090/4080/4090 would be our only option. How do you define "not so complex" scenes?

    I mostly render 1 figure (HD, SubD 4) at high res (4000x6000) for realism, so with all the bells and whistles, but since I am forced to render on the CPU I obviously have no knowledge of VRAM usage. I average 4-hour renders on the i7 4790K. I do wonder how any RTX card would compare: 40 minutes, or 4?

    If I were to switch I would (normally) go for the 4070 Ti.

    Pooling VRAM is only possible with NVLink between two cards of the same type, and aside from 3090 and maybe one or two other cards, only Titan and Quadro series cards (above the A4000) have NVLink.

  • PerttiA Posts: 10,024

    Snow said:

    @golem841, does it actually use both cards? 16 GB of VRAM? Or just 8 GB on steroids?

    If that's not enough VRAM, then a 3090/4080/4090 would be our only option. How do you define "not so complex" scenes?

    I mostly render 1 figure (HD, SubD 4) at high res (4000x6000) for realism, so with all the bells and whistles, but since I am forced to render on the CPU I obviously have no knowledge of VRAM usage. I average 4-hour renders on the i7 4790K. I do wonder how any RTX card would compare: 40 minutes, or 4?

    If I were to switch I would (normally) go for the 4070 Ti.

    If the cards are not connected with NVLink, you get only 8 GB, of which half is taken by the base loads (Windows, DS, the scene, and the necessary working space).
    A single 12 GB card will give double the VRAM available for Iray rendering.

    When I had a GTX 960 (4 GB), everything was rendered on the CPU, taking 2-4 hours (overnight); my next card, an RTX 2070 Super (8 GB), allowed me to render the same scenes on the GPU in less than half an hour.
    Over time I have learned what not to do, and I can render larger and more complicated scenes in 5-15 minutes on my current RTX 3060 12 GB.
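
    If you want to see that base-load effect in numbers, you can query the card before hitting render. A sketch, assuming an Nvidia GPU and the nvidia-ml-py package:

        # Show how much VRAM is actually left for Iray before rendering;
        # assumes an Nvidia GPU and nvidia-ml-py (import name pynvml).
        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # values in bytes
        print(f"total {mem.total / 2**30:.1f} GiB, "
              f"used {mem.used / 2**30:.1f} GiB, "
              f"free {mem.free / 2**30:.1f} GiB")
        pynvml.nvmlShutdown()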

  • PerttiA Posts: 10,024

    Gordig said:

    Snow said:

    @golem841, does it actually use both cards? 16 GB of VRAM? Or just 8 GB on steroids?

    If that's not enough VRAM, then a 3090/4080/4090 would be our only option. How do you define "not so complex" scenes?

    I mostly render 1 figure (HD, SubD 4) at high res (4000x6000) for realism, so with all the bells and whistles, but since I am forced to render on the CPU I obviously have no knowledge of VRAM usage. I average 4-hour renders on the i7 4790K. I do wonder how any RTX card would compare: 40 minutes, or 4?

    If I were to switch I would (normally) go for the 4070 Ti.

    Pooling VRAM is only possible with NVLink between two cards of the same type, and aside from 3090 and maybe one or two other cards, only Titan and Quadro series cards (above the A4000) have NVLink.

    RTX 2080/2080Ti/2070 Super and RTX 3090 have NVLink. 

  • golem841 Posts: 129
    edited July 2023

    PerttiA said:

    Gordig said:

    Snow said:

    @golem841, does it actually use both cards? 16 GB of VRAM? Or just 8 GB on steroids?

    If that's not enough VRAM, then a 3090/4080/4090 would be our only option. How do you define "not so complex" scenes?

    I mostly render 1 figure (HD, SubD 4) at high res (4000x6000) for realism, so with all the bells and whistles, but since I am forced to render on the CPU I obviously have no knowledge of VRAM usage. I average 4-hour renders on the i7 4790K. I do wonder how any RTX card would compare: 40 minutes, or 4?

    If I were to switch I would (normally) go for the 4070 Ti.

    Pooling VRAM is only possible with NVLink between two cards of the same type, and aside from 3090 and maybe one or two other cards, only Titan and Quadro series cards (above the A4000) have NVLink.

    RTX 2080/2080Ti/2070 Super and RTX 3090 have NVLink. 

    My 2 cards are connected with NVLink, but I use two 27-inch monitors.

  • Padone Posts: 3,688
    edited July 2023

    May I remind you that other engines, such as Octane and Cycles, are perfectly able to use shared VRAM for textures, so you don't necessarily need a mega-powerful card for rendering. Plus, the denoiser in Cycles is very good, so you don't need as many iterations as Iray does. I mean, you're not forced into Daz Studio and Iray; you know there are bridges, right?
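
    To give an idea, these are the relevant Cycles settings; a minimal sketch assuming Blender 3.x, where the sample count is just an illustrative value:

        # Cycles with denoising on: a few hundred samples often replace
        # the thousands of iterations Iray needs for a clean image.
        import bpy

        scene = bpy.context.scene
        scene.render.engine = 'CYCLES'
        scene.cycles.samples = 256                  # illustrative value
        scene.cycles.use_denoising = True
        scene.cycles.denoiser = 'OPENIMAGEDENOISE'  # or 'OPTIX' on Nvidia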

  • Snow Posts: 95

    Hi Padone, thanks for the heads-up, but I highly doubt it will be as easy as just transferring the scene to Blender, rendering, and being done with it. I am sure there will be a dozen (at least) tweaks necessary to get the same result we have in DAZ. I am sure Cycles is even more capable, but setting everything up would probably take more time than just rendering in DAZ. I only use DAZ and Blender, btw, but after trying the DAZ to Blender bridge a few times I gave up. I might try again, though, because Blender does have proper support for AMD cards (and on the Mac).

    If realism is the goal and the scene has been set up perfectly in DAZ, it will probably be very difficult to get the same result in Blender, starting with the materials.

  • Padone Posts: 3,688
    edited July 2023

    Try Diffeomorphic instead of the Daz bridge; you will discover a whole new world of Daz-to-Blender conversion. Then yes, not everything that makes sense in Daz Studio makes sense in Blender: for example, I avoid HD and G9 to get better speed in animation and more optimized figures. But with Diffeomorphic it is possible to import HD too if you really need to.

    https://diffeomorphic.blogspot.com/
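
    Once the add-on is enabled, imports can even be scripted. This is an untested sketch; the daz.import_daz operator name and the .duf path below are assumptions, so verify them against the add-on's own documentation:

        # Batch-import a saved Daz scene via the Diffeomorphic importer.
        # The daz.import_daz operator name is an assumption based on the
        # add-on's 'daz' namespace; the filepath is a placeholder.
        import bpy

        bpy.ops.daz.import_daz(filepath="/path/to/scene.duf")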

  • Gordig Posts: 10,049

    Snow said:

    Hi Padone, thanks for the heads-up, but I highly doubt it will be as easy as just transferring the scene to Blender, rendering, and being done with it. I am sure there will be a dozen (at least) tweaks necessary to get the same result we have in DAZ. I am sure Cycles is even more capable, but setting everything up would probably take more time than just rendering in DAZ. I only use DAZ and Blender, btw, but after trying the DAZ to Blender bridge a few times I gave up. I might try again, though, because Blender does have proper support for AMD cards (and on the Mac).

    If realism is the goal and the scene has been set up perfectly in DAZ, it will probably be very difficult to get the same result in Blender, starting with the materials.

    Once you're used to a node-based shader system, the system in DS will start to feel really limiting. It definitely takes more time than just clicking a shader preset, but the flexibility it gives you is very liberating.

  • Padone Posts: 3,688
    edited July 2023

    @Gordig Well, to be fair, in Daz Studio you have the shader bricks, which are the same idea as Blender's nodes; the Iray Uber shader compares more to Blender's Principled shader. The difference is that the bricks are not as well documented, and thus much harder to use, and also somewhat more "basic", so you need lots of bricks even for a simple shader. Also, the bricks are not PBR, in that you follow more of a specular workflow; it is possible to do PBR materials with the bricks, but you have to know what you're doing. This is also reflected in the Iray Uber shader, which allows for non-PBR materials, while Cycles is PBR-only; that's why it is difficult to convert Iray to Cycles when non-PBR attributes get in the way.
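
    For comparison, this is what a basic PBR material looks like when scripted on the Blender side; a minimal sketch with placeholder values, roughly what a metallic/roughness Iray Uber setup maps onto:

        # Build a simple PBR material around the Principled BSDF node.
        # Enabling use_nodes creates a node tree with a Principled BSDF
        # already wired to the output; the values here are placeholders.
        import bpy

        mat = bpy.data.materials.new("PBRSketch")
        mat.use_nodes = True
        bsdf = mat.node_tree.nodes["Principled BSDF"]
        bsdf.inputs["Base Color"].default_value = (0.8, 0.6, 0.5, 1.0)
        bsdf.inputs["Roughness"].default_value = 0.45
        bsdf.inputs["Metallic"].default_value = 0.0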
