Nvidia 5090

So, Nvidia just announced the 50-series last night at CES.  Huge push on its AI capabilities.  They glossed over its performance specs, but here's what I'm guessing:

In Daz, performance-wise, you won't see a significant gain over a 4090 or even a 3090.  Renders may complete quicker, but only by a matter of seconds, not minutes.  One boost I can see is the Iray viewport preview probably being much quicker to work with, and the denoiser may not be as mushy.  The other factor is that Daz, as far as I know, is still using older Iray drivers.  Someone in Dev can correct me if I'm wrong.  So the drivers wouldn't be able to utilize CUDA at its full efficiency.

What I do see a huge gain in, from what I gleaned from the speech, is that any AI workflows will be greatly improved.  If you use a local AI in your process at all, there will probably be a huge bump in generation speed.

I know, AI can be a touchy subject, but I can also see how it can work *with* an artist to help with image creation.

Thoughts?  Comments?  Peanuts?


Comments

  • Richard Haseltine Posts: 101,965

    Iray drivers? What are they? Daz Studio is currently using an older version of Iray, yes - the change log shows they did try a newer version but then rolled it back before a private build; I would be slightly surprised if even that had compatibility with the new cards, based on previous rollouts.

  • Expozures Posts: 232

    Wonder if it's because older cards (16-series, 20-series, and maybe even 30-series) may have issues with the newer version of Iray?  And I could see Daz nixing that, because only a handful of users would have had 40-series cards at the time... and even today, how many people running Daz are on 40-series cards, and how many who aren't are going to hop onto 50-series cards?  For an enterprise-level 3D program like Maya, it's a no-brainer, since corporations would be getting high-end hardware for their needs, but since Daz is a hobbyist program, the developers need to stick with what works for the average user and not hop on the latest and greatest.  Which is fine.  It's a free program, and it's us users who help maintain it by purchasing from the store.  I'm not saying, "BAH!  THEY NEED TO UPGRADE IRAY!"  It would be nice to have that option, but I can see why they won't.  And I don't know how much of an issue it would be to toggle between versions, or to have a separate installer... though, as a developer, that would be a nightmare to maintain.

    It is what it is, really.  I don't blame Daz for using the older version of Iray that works with most hardware out there.  Maybe by the time the 60- or 70-series comes out, there will be enough 40-series-plus market share to warrant it.

    That's also why I don't really jump with excitement at Jensen Huang's line about "how much faster the 50-series is over the 40-series!"  For us Daz-heads, it all comes down to VRAM capacity.  32 GB of VRAM would be nice, but heck, I had a hard enough time fully utilizing 24 GB on my 4090.  I imagine render times are going to come in about on par with the previous gen.  The 4090 wasn't really much faster than a 3090, maybe by a couple of seconds, and that was about it.  And that's fine, I'm really not complaining.  Just being a realist for people out there who have good 4090s and may be salivating over the new releases.  Heck, if you've got a 3090 and you're primarily using Daz, you won't see a significant gain.

    If, though, you're in the market for a new card, it's probably best to hold off for the 50-series to come out (which should be in a couple of weeks).

    I'm anxious to see what Gamers Nexus has to say about the card.

  • dirtrider Posts: 33

    I've been needing a new PC and have been holding out until the 5090s were released, simply because I need the VRAM for a lot of the scenes I like to make. I have a 4060 with 16 gigs and it's not cutting it for my bigger scenes. I really don't keep up with the tech side of stuff, so will the 5090s actually work with Daz?

  • Expozures Posts: 232

    There's no reason it shouldn't work, unless Nvidia drops support for Iray, which I really doubt they would.  Iray is very common, particularly in the automotive industry, and it's also the backbone of the RTX technology.

    https://blogs.nvidia.com/blog/daz-studio-nvidia-rtx/

    Like I mentioned, I think where we're going to see a difference is the Iray preview in the viewport, and possibly some uplift when denoising is used.  Denoising uses the GPU's AI to approximate pixels next to the rendered pixels, so it only needs to render half of the pixels in order to extrapolate enough data to generate the other half.  According to Jensen, the new Blackwell will be able to generate those pixels quicker and with more accuracy.  If you turn off denoising, you're forcing the GPU to calculate the ray tracing for each pixel you're generating.  My guess is that with the 5090, and probably even the 5080, you'll get virtually "real-time rendering" in the viewport, and you can probably even turn on denoising after 200 iterations instead of 500.
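    If it helps to picture that tradeoff, here's a toy model - not Iray's actual algorithm, just the rough 1/sqrt(n) noise falloff that progressive path tracers follow, with the iteration counts from above:

    ```python
    import math

    # Toy model: progressive render noise falls off roughly as 1/sqrt(iterations).
    def noise(iterations: int) -> float:
        return 1.0 / math.sqrt(iterations)

    print(f"noise at 500 iterations: {noise(500):.3f}")  # what the old denoiser waits for
    print(f"noise at 200 iterations: {noise(200):.3f}")  # ~1.6x noisier input

    # If Blackwell's denoiser really can clean up ~1.6x noisier frames without
    # going mushy, you bank those 300 iterations of render time on every frame.
    ```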

  • benniewoodell Posts: 1,979

    I got the 3090 and the 4090 when they came out, and I will be skipping this next one for sure. The 4090 works perfectly fine, and with the 3090 on my other computer, when I render things simultaneously, I see maybe a few seconds' difference. So I can't see the 5090 being that much of an upgrade.

  • Expozures Posts: 232

    I'm going to be getting one...my 4090 suffered a tragic accident.  Luckily, I have insurance on my computer.  Only thing is, insurance only paid me what I paid for it, not what it's worth today, and unlike most computer electronics, the value of the 4090 went up instead of down.  Sigh.  So, I have to pocket away a few more dollars to cover the cost, and I figured, "Well, if I gotta put away $500, why not $800 and get the next gen?"

  • Rauko Posts: 37

    I found my 4090 was 60% faster in Iray than my 3090, which equates to saving some 36 minutes on a render that would take 1 hour on the 3090. I'd certainly take that speed-up again on a 5090 - but time will tell...
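    For anyone checking the math, "60% faster" can be read two ways, and only one of them yields the 36-minute figure. A quick sketch (the 60-minute baseline is Rauko's; the two readings are the only assumption):

    ```python
    base = 60.0  # minutes on the 3090

    # Reading 1: "60% faster" = 1.6x throughput
    t1 = base / 1.6           # 37.5 min, saving 22.5 min

    # Reading 2: "60% faster" = render time cut by 60%
    t2 = base * (1 - 0.60)    # 24.0 min, saving the quoted 36 min

    print(f"1.6x throughput: {t1:.1f} min (saves {base - t1:.1f} min)")
    print(f"60% time cut:    {t2:.1f} min (saves {base - t2:.1f} min)")
    ```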

  • jbdiminnie Posts: 78

    Expozures said:

    In Daz, performance-wise, you won't see a significant gain over a 4090 or even a 3090.  Renders may complete quicker, but only by a matter of seconds, not minutes.

    This comment is complete nonsense. The 5090 has much faster memory than the 3090 (GDDR7 vs. GDDR6X) and a significantly higher number of CUDA cores.  Based on the benchmark thread, which shows the 4090 to be faster than the 3090, I would not be surprised (once Daz updates to make sure it is compatible with the 5090) to see the 5090 render at least 50-60% faster than a 3090.

    Of course, the biggest issue will likely be availability once the cards launch, which will probably make it quite difficult to find one anywhere near the listed MSRP.  Also, a 575 W power draw is a big jump from the 350 W my 3090 uses during rendering, so a new power supply is going to be a must (I'll probably need to look at something around 1500-1600 watts, given that newer CPUs also have a much higher power draw than older ones).  I am currently rebuilding my PC (the new motherboard, CPU, and RAM should be here later this month), and so I am thinking about picking up a 5090 somewhere down the road and running it alongside my 3090 for Daz renders. I would expect the combination of the two to reduce my render times by 70% or so, which would greatly speed up my workflow.
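    A rough sanity check on both numbers - the dual-card speedup and the PSU size. Iray splits iterations across cards, so combined throughput is roughly additive; everything below other than the quoted wattages is an assumption:

    ```python
    def time_saved(speed_5090_vs_3090: float) -> float:
        """Fraction of render time saved vs. a lone 3090 with both cards running."""
        combined = 1.0 + speed_5090_vs_3090   # the 3090 contributes 1x
        return 1.0 - 1.0 / combined

    print(f"{time_saved(1.6):.0%}")    # 5090 at 1.6x a 3090 -> ~62% saved
    print(f"{time_saved(2.33):.0%}")   # the 70% figure needs ~2.3x a 3090

    # PSU ballpark: quoted board powers plus assumed CPU/system draw.
    gpu_5090, gpu_3090, cpu, rest = 575, 350, 250, 100    # watts (cpu/rest assumed)
    load = gpu_5090 + gpu_3090 + cpu + rest               # 1275 W
    print(load, round(load * 1.2))                        # ~1530 W with 20% headroom
    ```

    So the 70% estimate quietly assumes the 5090 lands well over twice a 3090's speed, not just 50-60% faster.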

  • Richard Haseltine Posts: 101,965

    Expozures said:

    There's no reason it shouldn't work, unless Nvidia drops support for Iray, which I really doubt they would.  Iray is very common, particularly in the automotive industry, and it's also the backbone of the RTX technology.

    https://blogs.nvidia.com/blog/daz-studio-nvidia-rtx/

    There is every precedent for thinking Iray will not work on 50x0 cards initially - that has been the case for the 20x0, 30x0, and 40x0 cards too - not because of Daz dilatoriness but because Nvidia has needed to release a new version of Iray, at least a couple of months after the cards came out. Of course, Daz does then need to test the new version, so there will be at least a slight further delay before it reaches Studio users, but Daz doesn't deliberately hold back on Iray updates unless there is cause.


  • nonesuch00 Posts: 18,240
    edited January 8

    For DAZ Studio renders only, an upgrade makes no sense unless you are rolling in dough.

    Post edited by nonesuch00 on
  • Expozures Posts: 232

    I'm very much in need right now.  My 4090 got some liquid damage, and I'm not sure how... I think some dummy (me) put a leaky vape on top of my computer, and some juice dripped down right into the vent on my GeForce card.

    Luckily, my insurance did cover the cost of replacement, since I did put a rider on my computer for dumb things like that (I'm not going to gamble on something dumb happening to my computer and being out a $2,500 video card plus components).

  • I was under the impression that CUDA cores are what make a render faster, and looking at those for the 5090, it is a significant jump, although I doubt the 5070/80 could be faster than a 3090/4090 like the owner was saying; the CUDA core counts are quite low by comparison: 9,728 (5080) vs. 10,496 (RTX 3090) and 16,384 (4090).

    However, the 5090 seems to be stacked with them: 21,760 CUDA cores, and 32 GB of VRAM too.

  • Richard Haseltine Posts: 101,965

    In general CUDA core-count comparisons are useful within a generation but not between generations.
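    A toy illustration of that point, using the core counts quoted upthread and completely made-up per-core throughput factors (the factors are the unknown: clocks and per-core work change every architecture, and nobody outside Nvidia knows them until benchmarks land):

    ```python
    # Core counts as quoted upthread; per-core factors are pure placeholders.
    cards = {
        "3090": {"cores": 10496, "per_core": 1.00},  # baseline
        "4090": {"cores": 16384, "per_core": 1.05},  # assumption
        "5080": {"cores":  9728, "per_core": 1.60},  # assumption
    }
    base = cards["3090"]["cores"] * cards["3090"]["per_core"]
    for name, c in cards.items():
        rel = c["cores"] * c["per_core"] / base
        print(f"{name}: {rel:.2f}x a 3090")
    # With these invented factors the 5080 edges past a 3090 despite having
    # fewer cores - which is exactly why raw counts settle nothing across gens.
    ```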

  • crosswind Posts: 7,651

    Richard Haseltine said:

    In general CUDA core-count comparisons are useful within a generation but not between generations.

    No doubt ~~ My two 30/40-series cards have 27K CUDA cores between them, but I don't think they can really compete with a 5090 ~

  • James Posts: 1,080
    edited January 13

    @crosswind will you buy a 5090?

    Anyway, for Daz renderers, what matters to us is raw performance, right?
    More AI performance doesn't do much for us?

    I'm thinking, should I buy a 4090 or save more for a 5090?
     

    Post edited by James on
  • crosswind Posts: 7,651
    edited January 13

    James said:

    @crosswind will you buy a 5090?

    Anyway, for Daz renderers, what matters to us is raw performance, right?
    More AI performance doesn't do much for us?

    I'm thinking, should I buy a 4090 or save more for a 5090?
     

    It's a safe bet that I won't buy it, at least not this year. Then I'll probably have the chance to go for a 6090. I'm pretty satisfied with my current cards, which can complete rendering most of my scenes (even large ones) within 5 minutes.

    Right, if you purely render with DS rather than use AI-accelerated or AI-assisted apps and games, you'll benefit very little from the new AI features.

    And in your case, I personally suggest you consider the 5090 as first priority... but no rush, please wait for the first and second waves of 5090 tests to come out. See if it can really "double" rendering speed compared with the 4090... (I have my doubts ~~)

    Post edited by crosswind on
  • In any case, let others debug Nvidia's design for you. I doubt I'll buy before next year. And to be honest, a 4090 (especially with Cycles in Blender) is already so fricking fast, and simplifying via material nodes is so effective, that I don't feel any particular need for more CUDA cores, nor another 8 gigs of VRAM.

  • Elor Posts: 1,807

    James said:

    I'm thinking, should I buy a 4090 or save more for a 5090?
     

    If you can wait, maybe you will be able to buy a second-hand 4090 from a reputable source at a cheaper price (to avoid buying from someone who abused the card, like what happened when many were mining crypto-currency).

    I remember seeing, from time to time, 3090s at almost affordable prices on a French forum after the 4090 was released, from people who like to always have the best money can buy to play games and who were selling the previous best as a way to reduce the cost of the new best.

  • crosswind Posts: 7,651

    Elor said:

    James said:

    I'm thinking, should I buy a 4090 or save more for a 5090?
     

    If you can wait, maybe you will be able to buy a second-hand 4090 from a reputable source at a cheaper price (to avoid buying from someone who abused the card, like what happened when many were mining crypto-currency).

    I remember seeing, from time to time, 3090s at almost affordable prices on a French forum after the 4090 was released, from people who like to always have the best money can buy to play games and who were selling the previous best as a way to reduce the cost of the new best.

    Agreed, a discount on the 4090 can be expected.

  • Mattymanx Posts: 6,929

    I bought a 4090 in Dec 2023 along with a whole new PC.  I chose to wait to see what Gamers Nexus and other test groups showed for performance over the 3000 series.  Then the "faulty" power connectors issue popped up, and I waited and watched.  It wasn't until Gamers Nexus released their findings showing it was user error and not a manufacturing defect that I felt safe buying one.

    My previous Nvidia cards were a dual Zotac AMP Extreme 980 Ti setup.  From what tests I could perform to compare old and new, the 4090 was performing about 550% better than my dual 980 Tis in regard to render time.

    I also bought my 4090 and new PC as a long-term investment.  I don't plan on replacing either, and I hope they will serve me well over a 5-7 year period.  If I could have gotten two 4090s for rendering, I would have.

    And while this was a work-related purchase and need, my games really do benefit from the newer hardware.  Only Skyrim SE, heavily modded, has made my 4090 heat up enough for the fan speed to ramp up to 3200 RPM.  Iray has only managed about 2000-2500 RPM max.

  • bluejaunte Posts: 1,907

    I've heard somewhere (maybe Digital Foundry) that upgrading every generation might ultimately make the most financial sense, because top-end GPUs don't lose much value anymore. Something to consider, at least.

  • crosswind Posts: 7,651

    bluejaunte said:

    I've heard somewhere (maybe Digital Foundry) that upgrading every generation might ultimately make the most financial sense, because top-end GPUs don't lose much value anymore. Something to consider, at least.

    I also believe that, to some extent... one just has to find the right timing to upgrade ~~

  • bluejaunte Posts: 1,907

    Yeah, the 4090 is still super expensive, and experts say they're not gonna drop in price much. So if (hypothetically) you could upgrade to a 5090 for 300 bucks or so, that might make sense. You get a fresh warranty period as well. Then, if you could spend 300 bucks every 2 years to get the latest monster, that wouldn't be too bad? It's certainly different from how it used to be. It would've been considered such a waste of money to buy all the latest stuff some years ago.
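    The back-of-the-envelope version of that idea, with made-up prices (the real numbers depend entirely on resale values holding up):

    ```python
    # All prices are placeholders, not market data.
    msrp_5090   = 2000   # what you pay for the new flagship (assumption)
    resale_4090 = 1700   # what the old flagship fetches used (assumption)

    net_cost = msrp_5090 - resale_4090
    print(f"net upgrade cost: ${net_cost} per generation (~${net_cost / 2:.0f}/year)")
    # The scheme only works while flagship resale stays this close to MSRP.
    ```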

  • TheKD Posts: 2,695
    edited January 15

    I hate the way it is now. I used to wait until, say, the 50x dropped, then get a 4090 for a lot cheaper. That hasn't worked out in a long time, lol. I guess my new strategy is going to be upgrading every second or third generation. Maybe the 6090 will have 48 GB of VRAM. Yeah, I can dream; 24 -> 32 GB of VRAM is not a jump I am willing to shell out the money for.

    Post edited by TheKD on
  • Diaspora Posts: 444

    Just speaking for myself, CES 2025 proved that my 4090 was a VERY worthwhile investment.

    Don't get me wrong, I'm very eager to see the benchmarks for the 5090, but I need to see a MINIMUM 40% speed increase in Iray before I seriously consider purchasing one.

    Nvidia Iray is the main reason I buy new top-of-the-line video cards, with gaming/VR just being an added bonus.

  • Expozures Posts: 232

    Richard Haseltine said:

    In general CUDA core-count comparisons are useful within a generation but not between generations.

    Exactly.  CUDA cores handle the ray tracing and rasterization of images, and within a generation, more CUDA cores means faster renders.  So, render-wise, a 5090 will outperform a 5080, which will outperform a 5070.  And while we can be confident a 5090 will outperform a 4090, will a 5080 outperform a 4090?  According to Nvidia, it will... but their marketing materials will make their video cards crush whatever they want, however they want.  We won't really know until independent reviewers (like Gamers Nexus, JayzTwoCents, etc.) get their hands on them and test them in real-world environments.

    From what I have seen, Nvidia is really pushing hard on the Tensor side of things.  CUDA did get an uplift, but it's really hard to tell how much of an uplift over the previous gen.  I do believe that when it comes to Iray preview and denoising, we will see massive gains because of the enhanced AI onboard.  And who knows?  It may be worth using the denoiser now without losing image quality.

  • Diaspora Posts: 444

    Correct me if I'm wrong, but doesn't Iray also use ray tracing cores?

  • Richard Haseltine Posts: 101,965

    Diaspora said:

    Correct me if I'm wrong, but doesn't Iray also use ray tracing cores?

    Yes, it does - one of the reasons it uses a lot of extra memory on older cards is that it loads code to emulate the RTX features.

  • Diaspora Posts: 444
    edited January 16

    Well, that makes me more optimistic for Iray performance.

    Post edited by Diaspora on
  • nonesuch00 Posts: 18,240

    Word is that RTX 5090 laptops must still use DLSS to play ray-traced games, so if you have a 40-series card and were on the fence, maybe you should factor that in.
