RTX 4000 Officially Revealed

1246 Comments

  • algovincian said:

    oddbob said:

    Rakete said:

    The 4090 is now showing up in Blender's benchmark database at roughly twice the performance of a 3090: https://opendata.blender.org/benchmarks/query/?compute_type=OPTIX&compute_type=CUDA&compute_type=HIP&compute_type=METAL&compute_type=ONEAPI&group_by=device_name&blender_version=3.3.0

    Thanks for posting that. I was hoping there would be a bigger gap between the synthetic gaming benchmarks that are floating around and the render performance. Can't personally justify the price at 2x.

    2X the performance seems like a huge jump for a single generation. That will improve as drivers get refined, and at roughly the same MSRP as the 3090 was, it seems like a good deal to me!

    - Greg

    2x the performance is also what I consider a worthwhile upgrade for my personal needs, and I think that's about what I got going from a 1080ti to a 3090.

  • oddbob Posts: 402

    algovincian said:

    2X the performance seems like a huge jump for a single generation. That will improve as drivers get refined, and at roughly the same MSRP as the 3090 was, it seems like a good deal to me!

    - Greg

    In the UK it's a 20% price increase between the 3090 and 4090. I'd want a waterblock, which is looking to be 300ish for the one I want, and I'd need a new PSU so that I could move the 3090 into my other system. I have a toy vehicle that's worth less and a daily driver that's worth not much more. For gaming I can turn some settings down, and for rendering I can wait a while. Hopefully market conditions and competition will drive prices down for the refresh next year.

  • kyoto kid Posts: 41,202
    edited October 2022

    robertswww said:

    Here is an interesting RTX 4090 that I have not seen mentioned yet...
    NOTE: The air-cooled RTX 4090 cards are all triple-slot designs.

    On the other hand, the MSI GeForce RTX 4090 SUPRIM LIQUID X 24G is a water-cooled card that only takes up two slots (but you do need space in your case to mount the radiator).

    Specs: 
    https://www.techpowerup.com/gpu-specs/msi-rtx-4090-suprim-liquid-x.b9757

    See it to believe it...
    One RTX 4090 is HUGE - Here are 4 of them!
    https://www.youtube.com/watch?v=TjBWpdukzv4&t=795s

    ...indeed they are gargantuan, save for the MSI one with the waterblock (although the radiator looks about as thick as a 3-slot card). While they mention W10/11, I imagine that is because they ceased driver support for W7/8.1 last year.

    Yeah, I'm pretty content for now with my 3060 and Titan X. A 3090 would be nice (and would fit in my current case) but would require a slight improvement to the cooling situation. I already have an 850W PSU, so I'm ahead of the curve there.

  • outrider42 Posts: 3,679

    On the Blender bench, take a look at the previous generation jumps. I also added in some numbers from the Daz bench in the forums for comparison.

    Model      Blender score     Daz bench (iterations per second)

    4090       12322.68          ?

    3090ti     6250.93           20-22

    3090       5951.6            18.5-20

    2080ti     3217.05           7.5-8

    1080ti     882.82            4

    980ti      472.05            2.8

    Blender's numbers do not always align with Daz Iray. The jump from 1080ti to 2080ti on Blender is crazy (see the quick ratio check at the end of this post); it shows what the ray tracing cores can truly do, and it is worth mentioning that Daz Iray can show similar jumps in specific scenes. Every user's experience will be different, and we have always tried to inform users here of this fact. The old "Show Me the Power" thread that featured strand-based hair showed the performance gaps could be wildly high, going far higher than 5 or 6 times over non-RTX cards.

    The jump from 2080ti to 3090 is quite interesting as well. Ampere split the CUDA cores up, and it seems Iray really liked that, as it showed a much bigger performance boost there. So how will the 4090 fare? I think it will do quite well, and easily hit the 45-50 iterations per second I predicted.

    Yeah...the 4090 is poised to be 10 times faster than the 1080ti, the king of 2017 and the GPU many gamers call the GOAT. How about that, possibly 10 times faster in 5 years. If you are not on RTX yet, it might be time to seriously consider it. If the 4090 is too rich, the 4080s might be options, but I personally think both 4080s are overpriced. The 4090, as crazy as this sounds, is actually looking like better price-to-performance than the 4080s. This has been backed up by Nvidia's own numbers: the 4090 has higher frames per dollar than the 4080s do! It isn't supposed to be like that. Regardless, make the best decision that suits you.

    The Founders Edition 4090 is actually a little shorter than the Founders Edition 3090, though it is thicker and fatter, going to a full 3 slots.

    So it should fit into most cases as well as the Founders Edition 3090 does. The only question is how much room you may have for other PCIe parts, or other GPUs. It would certainly be a challenge for most people to run any kind of multi-GPU setup with a 4090. And to be honest, now that EVGA has exited the GPU business, I don't see much reason to buy one of the AIB cards that are so ridiculously massive. Maybe the water-cooled one; that might be alright since the card itself is 2 slots, but you still have that big radiator to deal with.
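
    As a quick ratio check on the generational jumps above, here is a minimal sketch in Python using the Blender Open Data medians from the table (the dictionary and loop are just for illustration):

        # Blender Open Data medians quoted in the table above (higher is better).
        scores = {
            "980ti": 472.05,
            "1080ti": 882.82,
            "2080ti": 3217.05,
            "3090": 5951.6,
            "4090": 12322.68,
        }

        # Generation-over-generation speedups.
        for old, new in [("980ti", "1080ti"), ("1080ti", "2080ti"),
                         ("2080ti", "3090"), ("3090", "4090")]:
            print(f"{old} -> {new}: {scores[new] / scores[old]:.2f}x")

        # The five-year jump over the 1080ti, in Blender terms.
        print(f"1080ti -> 4090: {scores['4090'] / scores['1080ti']:.1f}x")

    That works out to roughly 1.87x, 3.64x, 1.85x and 2.07x for the four jumps, and about 14x from the 1080ti to the 4090 in Blender; Iray ratios will differ scene by scene, as noted above.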

  • kyoto kid Posts: 41,202
    edited October 2022

    ...I'd have the room lengthwise, but the thickness of the 4090 would probably conflict with the CPU cooler and throw a lot of heat its way, unless it was the MSI one (though that would mean sacrificing the top exhaust fans for the radiator). Yes, I'd undervolt it to take some stress off the PSU (even though the unit is fairly new and puts out the recommended 850W), as I am not into games, so frame rate is immaterial.

    The other issue: would a PCIe 3.0 slot support it, or does it need 4.0?

  • PrefoX Posts: 252
    edited October 2022

    kyoto kid said:

    ...I'd have the room lengthwise, but the thickness of the 4090 would probably conflict with the CPU cooler and throw a lot of heat its way, unless it was the MSI one (though that would mean sacrificing the top exhaust fans for the radiator). Yes, I'd undervolt it to take some stress off the PSU (even though the unit is fairly new and puts out the recommended 850W), as I am not into games, so frame rate is immaterial.

    The other issue: would a PCIe 3.0 slot support it, or does it need 4.0?

    The CPU cooler is in the opposite direction, so it shouldn't be a problem. If you get the FE, it might blow some heat towards the CPU cooler, but not a lot. PCIe 3 is fine and shouldn't cost a lot of performance; for rendering especially it makes no difference at all.

     

    About the Blender render times: the RT cores accelerate a lot of the BVH work, and OptiX has not yet been updated for the 4090 cards, so those numbers should increase even more. Triple the speed (compared to the 3090) should be possible if they squeeze everything out of the hardware.

    @outrider what do you mean by them splitting the CUDA cores with the Ampere generation? The FP/INT units, or that they got additional RT cores?

    You are totally right that the 4080s are overpriced at the moment. The 4090 is fair, imho.

  • kyoto kid Posts: 41,202

    ...thanks for the information.

    I do know that PCIe 2.0 does not support the 30xx cards: the slot was tricky to get my 3060 to seat properly, and the most recent BIOS for such older MBs (like mine) doesn't support them (all I get is the default VGA signal). The BIOS situation could be a deciding factor, as I am looking at an MB with an Intel C612 chipset that supports Intel Broadwell CPUs.

  • Gator Posts: 1,312

    kyoto kid said:

    ...thanks for the information.

    I do know that PCIe 2.0 does not support the 30xx cards: the slot was tricky to get my 3060 to seat properly, and the most recent BIOS for such older MBs (like mine) doesn't support them (all I get is the default VGA signal). The BIOS situation could be a deciding factor, as I am looking at an MB with an Intel C612 chipset that supports Intel Broadwell CPUs.

    Broadwells are getting really long in the tooth now... wouldn't it be better to get a newer MB now?

  • PrefoX Posts: 252

    Right now is a bad moment to switch to new MBs/CPUs: DDR5 is still expensive, the new boards cost double what they did previously, and I would suggest waiting for an ATX 3.0 PSU.

  • outrider42 Posts: 3,679

    Turing's SMs could run one FP32 operation and one INT32 operation per cycle. With Ampere they made it so that the INT32 section could perform an FP32 operation instead, so it could do either operation. This gave Ampere cores the ability to perform double the FP32 ops over Turing. What Nvidia did was count this as TWO CUDA cores, rather than saying that one CUDA core was capable of double FP32. Semantics.

    If you go back to the Ampere announcement, the CUDA core count caught everybody by surprise. AIBs had even made mockup boxes with half the CUDA cores listed.

    So depending on how you define what a CUDA core is, you can claim either number is correct. For Ampere, Nvidia went strictly by FP32 counts. This isn't really wrong, since most computing is FP32. But it isn't exactly correct, either. And this leads to why individual CUDA cores seem a bit weaker than they were in the past, since they are not truly independent.

    AdoredTV did a great video explaining how this was done. It is a long video, as it covers every generation since Fermi. You can skip to the part where he talks about Turing, which is around the 9:30 mark.

    Iray appears to really allow the Ampere design to stretch out and flex its muscle. Lovelace iterates on the Ampere design, but the biggest change is simply the node shrink that allowed Nvidia to pack in so many more transistors over Ampere.
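
    A back-of-the-envelope illustration of the counting change, using the 3090's GA102 layout as the example (82 SMs, each with 64 dedicated FP32 lanes plus 64 lanes that can issue either FP32 or INT32); a minimal sketch, not Nvidia's own arithmetic:

        # RTX 3090 (GA102): 82 SMs. Each SM has 64 FP32-only lanes plus
        # 64 lanes that can do either FP32 or INT32 on a given clock.
        sms = 82
        fp32_only = 64
        fp32_or_int32 = 64

        counted_the_turing_way = sms * fp32_only                        # 5,248 - half, as on the early mockup boxes
        counted_by_fp32_capability = sms * (fp32_only + fp32_or_int32)  # 10,496 - the advertised CUDA core count

        print(counted_the_turing_way, counted_by_fp32_capability)

    Which of those two numbers you call the "CUDA core count" is exactly the semantics question above: both datapaths can do FP32, but only half of them can do it while INT32 work is also in flight.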

  • kyoto kid Posts: 41,202
    edited October 2022

    Gator said:

    kyoto kid said:

    ...thanks for the information.

    I do know that PCIe 2.0 does not support the 30xx cards: the slot was tricky to get my 3060 to seat properly, and the most recent BIOS for such older MBs (like mine) doesn't support them (all I get is the default VGA signal). The BIOS situation could be a deciding factor, as I am looking at an MB with an Intel C612 chipset that supports Intel Broadwell CPUs.

    Broadwells are getting really long in the tooth now... wouldn't it be better to get a newer MB now?

    ...staying with W7 Pro as long as I can, and Broadwell is the last Intel generation that fully supported it.

  • PrefoX Posts: 252

    outrider42 said:

    Turing's SMs could run one FP32 operation and one INT32 operation per cycle. With Ampere they made it so that the INT32 section could perform an FP32 operation instead, so it could do either operation. This gave Ampere cores the ability to perform double the FP32 ops over Turing. What Nvidia did was count this as TWO CUDA cores, rather than saying that one CUDA core was capable of double FP32. Semantics.

    If you go back to the Ampere announcement, the CUDA core count caught everybody by surprise. AIBs had even made mockup boxes with half the CUDA cores listed.

    So depending on how you define what a CUDA core is, you can claim either number is correct. For Ampere, Nvidia went strictly by FP32 counts. This isn't really wrong, since most computing is FP32. But it isn't exactly correct, either. And this leads to why individual CUDA cores seem a bit weaker than they were in the past, since they are not truly independent.

    AdoredTV did a great video explaining how this was done. It is a long video, as it covers every generation since Fermi. You can skip to the part where he talks about Turing, which is around the 9:30 mark.

    Iray appears to really allow the Ampere design to stretch out and flex its muscle. Lovelace iterates on the Ampere design, but the biggest change is simply the node shrink that allowed Nvidia to pack in so many more transistors over Ampere.

    Before Ampere it was INT OR FP, not both. Now it's that every CUDA core has INT/FP and FP-only shaders. I know that stuff, but thanks for the video :)

    The problem for Ampere was more about utilizing every shader; that is going to be fixed to a certain point with Ada's reordering unit.

  • NylonGirl Posts: 1,916

    Sorel said:

    *3090ti buyer's remorse intensifies*

    I think, in the computer world, anything you buy will feel like a bad purchase a year later. 

  • HamEinar said:

    Hmm, am I the only one noticing that the 4090 "only" has 6000 more CUDA cores? Actual render speed would be 1.5x at best compared to a 3090, with the added power consumption, space (heat in the system) and price... If you already have a 3090, getting a second 3090 would be the best bang for your buck.

    Yep, I've got 2x 3080 10GBs; that's 17,408 CUDA cores. Running them at 50% power lowers the heat output by a lot, and render times increase only by seconds.

    A 4090 with 16,384 CUDA cores isn't worth the upgrade. Buying 2x 3090s, on the other hand, that's 20k CUDA cores.

    For me rendering isn't the primary goal, so I will stick with the 3080s for now, it seems.
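
    For context, here is a quick comparison of raw core counts against the Blender scores posted earlier in the thread; a minimal sketch, with the core counts taken from the published specs and the Blender figures from the table above:

        cuda_cores = {"3080": 8704, "3090": 10496, "4090": 16384}
        blender = {"3090": 5951.6, "4090": 12322.68}  # scores from the table earlier in the thread

        print(f"2x 3080 cores: {2 * cuda_cores['3080']}")                                   # 17,408
        print(f"2x 3090 cores: {2 * cuda_cores['3090']}")                                   # 20,992 (~20k)
        print(f"4090/3090 core ratio:    {cuda_cores['4090'] / cuda_cores['3090']:.2f}x")   # ~1.56x
        print(f"4090/3090 Blender ratio: {blender['4090'] / blender['3090']:.2f}x")         # ~2.07x

    The measured Blender gap is wider than the core-count gap because Ada also raises clocks and cache, so core counts alone tend to undersell the jump; on the other hand, two cards only help renderers that can split work across GPUs, which Iray can.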

  • Expozures Posts: 236

    Reviews for the cards just went live. The 4090s are freaking beasts! For the average gamer, they're really not much of a benefit, since the FPS gets into ludicrous-speed territory that monitors won't be able to handle for a long time... and also given the fact that 600 FPS is just dumb. But LTT did do a review of the productivity breakdown, and holy crap! Looking at 2-3X the performance over a 3090Ti in Blender.

    Someone was laughing at me for thinking that a 1-hour render on my 3070 would be cut down to probably 10-15 min on the 4090. Yeah...I don't think they'd be laughing now. The 3090Ti is about twice as fast as what I currently have, and the 4090 is twice as fast as that in Blender.

    Needless to say, I'm itching to get my grubby hands on one of them.
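
    Taking those "twice, then twice again" figures at face value, the arithmetic behind that estimate looks like this (just a sketch; the speedup factors are the ones quoted above, not measurements):

        hours_on_3070 = 1.0
        speedup_3090ti_vs_3070 = 2.0   # "about twice as fast as what I currently have"
        speedup_4090_vs_3090ti = 2.0   # "twice as fast as that in Blender"

        minutes_on_4090 = hours_on_3070 * 60 / (speedup_3090ti_vs_3070 * speedup_4090_vs_3090ti)
        print(minutes_on_4090)  # 15.0 - right at the top of the 10-15 min guess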

  • Rakete Posts: 91

    Here is an early review which benchmarks several renderers: https://techgage.com/article/nvidia-geforce-rtx-4090-the-new-rendering-champion/

  • bluejaunte Posts: 1,917

    What a beast. der8auer basically confirmed it: limit the power to 60-70% for massively less power consumption (around 300W) and a barely noticeable performance impact. To the point that he doesn't understand why they didn't spec it that way to begin with and spare us the power supply issues and connector problems.

  • bluejaunte said:

    What a beast. der8auer basically confirmed it: limit the power to 60-70% for massively less power consumption (around 300W) and a barely noticeable performance impact. To the point that he doesn't understand why they didn't spec it that way to begin with and spare us the power supply issues and connector problems.

    He is like the third reviewer I've seen who says the performance is so good they had to double check their results to make sure they hadn't messed up the settings, lol.
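
    For anyone wanting to reproduce that at home, the stock driver tooling can already cap board power without touching voltages. Here is a minimal sketch driving nvidia-smi from Python; the 300 W target is just der8auer's ballpark for a 450 W 4090, and setting the limit generally requires admin rights:

        import subprocess

        # Show the current, default and allowed power limits for GPU 0.
        subprocess.run(["nvidia-smi", "-i", "0", "-q", "-d", "POWER"], check=True)

        # Cap the board at roughly 300 W (about 65% of the 4090's 450 W default).
        # The driver rejects values outside the allowed min/max range.
        subprocess.run(["nvidia-smi", "-i", "0", "-pl", "300"], check=True)

    A power cap is not quite the same as the undervolting mentioned earlier in the thread, but it captures much of the same efficiency benefit with far less fiddling.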

  • It seems to be for sale and available (at least from Best Buy)

    Anybody here pick one up?

  • Ghosty12 Posts: 2,065
    edited October 2022

    For those of us in Australia, here is what we are likely to have to pay for an RTX 4090. One could buy a fairly decent secondhand car for the cost of one of these cards... lol

    https://www.centrecom.com.au/nvidia-geforce-rtx-40-series

  • PerttiA Posts: 10,024

    The best price I found here in Finland was 1999 EUR (VAT 24% included) / 1612 EUR (VAT 0%) for the Asus TUF Gaming GeForce RTX 4090 24GB.

  • Gator Posts: 1,312

    Rakete said:

    Here is an early review which benchmarks several renderers: https://techgage.com/article/nvidia-geforce-rtx-4090-the-new-rendering-champion/

    Iray isn't in there, but the 4090's rendering performance looks to be pretty consistent at about 2x that of the 3090.  

    It's impressive for a single generation.  I'll be holding out for a while since I already have 2 3090s for rendering.  Given the size of the thing, I don't think I'll be running a 3090 along with the 4090.  

    Interested to see if there will be a 4090 Ti with 48 GB as rumored (there's debate on this, who knows). I'm also eager to see what happens next month, as AMD might be shaking up the gaming GPU market. Unfortunately, with the 4090 really aimed at content creators and not gamers, I don't think 4090 pricing will change significantly, if at all.

  • nonesuch00 Posts: 18,288

    Chumly said:

    It seems to be for sale and available (at least from Best Buy)

    Anybody here pick one up?

    There are 4 at Best Buy, priced at $1,599, $1,599, $1,699, and $1,749, and all are sold out. I'll be getting one late this winter or early next spring when availability is much more likely.

     

  • Expozures Posts: 236

    Fingers crossed... I'm stopping at my local Best Buy on the way home. The local farmers wouldn't have much use for one... hoping scalpers didn't get a hold of them...

    I see the vultures are already on eBay selling them for 2-3X the MSRP.  Please don't buy these.  Make these guys suffer for their greed.

  • Someone posted photos of lines outside of Microcenter for these; guessing demand from early adopters is going to be pretty high.

  • PerttiA Posts: 10,024

    EnoughButter said:

    Someone posted photos of lines outside of Microcenter for these; guessing demand from early adopters is going to be pretty high.

    All lining up to be beta-testers. ;)

  • PerttiA said:

    EnoughButter said:

    Someone posted photos of lines outside of Microcenter for these; guessing demand from early adopters is going to be pretty high.

    All lining up to be beta-testers. ;)

    Especially if they are DAZ users, at least until we get a beta as well, lol.

  • memcneil70 Posts: 4,276

     

    ASUS GeForce RTX 4090 TUF Gaming Triple-Fan 24GB GDDR6X PCIe 4.0 Graphics Card

    SOLD OUT: We are continually getting more stock of this item. Find out how to get yours here.

    $1,599.99

    Microcenter sent me the email this morning, but I had an appointment at my apartment for some long-overdue maintenance.

    I am also waiting to see how long it takes for the card to be usable with Daz Studio, and what else I will need to buy for it. So my GTX 1080Ti will have to do its thing until then.

  • memcneil70 said:

    I am also waiting to see how long it takes for the card to be usable with Daz Studio, and what else I will need to buy for it. So my GTX 1080Ti will have to do its thing until then.

    Wow that's an upgrade!

    Was thrilled with the performance going from a 2080ti to an Aorus 3090 Extreme. Can hardly imagine how thrilled you will be.

  • Gator Posts: 1,312

    Yeah @Saxa--SD and @memcneil70 that upgrade is gonna be crazy!
