RTX 4000 Officially Revealed

2456 Comments

  • Mattymanx Posts: 6,949

    Expozures said:

    Mattymanx said:

    Those 4090 specs look nice on paper. It will be interesting to see how it really performs.

     

    I'm waiting for Gamers Nexus and Jayztwocents to get their grubby hands on them first before I decide if I'm going to splurge on a 4090 or pick up a cheap-ish 3090Ti.  I'm just rocking a 3070 now, and could use a VRAM boost.  Sucks only being able to get 1 or 2 models and a bit of scenery in a render before it decides to cap-out the memory.

    Yeah, I'm going to wait for GN's review before I decide. While my current system is stable, it is six years old and is running two 980 Tis. An upgrade is in order.

  • Mattymanx said:

    Expozures said:

    Mattymanx said:

    Those 4090 specs look nice on paper. It will be interesting to see how it really performs.

     

    I'm waiting for Gamers Nexus and Jayztwocents to get their grubby hands on them first before I decide if I'm going to splurge on a 4090 or pick up a cheap-ish 3090Ti.  I'm just rocking a 3070 now, and could use a VRAM boost.  Sucks only being able to get 1 or 2 models and a bit of scenery in a render before it decides to cap-out the memory.

    Yeah, I'm going to wait for GN's review before I decide. While my current system is stable, it is six years old and is running two 980 Tis. An upgrade is in order.

    Definitely in order. :)  Yeah, I'd wait to see what the big announcement today from Daz is, and what GN has to say about the 4090s.  If Daz is releasing something that is going to utilize features specifically designed for the 40-series, then that answer is simple: go with a 40-series...but, depending on the reviews which 40-series?  90?  80?

  • Should add... if Daz doesn't have anything specific to the 40-series, and Gamer Jesus isn't overly impressed with the price-to-performance, it might be worth looking into a 3090 or 3090 Ti.

  • outrider42 Posts: 3,679

    The two 4080s IMO are pretty terribly priced and segmented. The gap between each tier is very large. These are both "4080" class cards, yet there's something like a 25% gap between them???

    Where does that leave the actual 4070 when it launches? Are they really going to have a 4070 that is 50% weaker than a 4080? Nvidia has just set themselves up for disaster IMO. The 4090 is OK. It only costs $100 more than the 3090 and potentially offers a massive jump. But the 4080s are way out of line. The 4080 16GB is priced about where the 3080 Ti was, but the 3080 Ti was a beefed-up 3080, hence the 'Ti' part of the name. The 4080 shouldn't cost that much. And the smaller 4080 shouldn't be anywhere near $900. I don't think inflation is that crazy.

    Nvidia segmented the cards too far apart. It might get ugly if AMD undercuts them in a couple months. Will AMD do that though, or will they just follow Nvidia's lead? AMD seems to have a habit of just barely dropping in below Nvidia's prices in select tiers, which doesn't serve to help them considering they are the distant #2 in a 2 horse race (I am not counting Intel).

    One thing to remember, guys, is that the gaming benchmarks are not at all reflective of where Iray performance will fall. Remember, the 2000 series was pretty underwhelming for gamers, but the new ray tracing cores completely changed the game for Daz Studio Iray. Jensen Huang made multiple statements that ray tracing has made big strides with Lovelace.

    I think we got a small hint. They didn't give many numbers, but the one that stands out is that the remote car demo thing ran 4 times faster on the 4090. This demo makes far more use of ray tracing than most video games, and so that performance could be a clue to how well fully path-traced software like Iray may run. The caveat is they used DLSS 3.0, so a chunk of that performance is from DLSS. So I don't think we will see 4x performance in Iray. But I do think it will fall between 2 and 3x. Ray tracing can be weird though, so we might have scenes that don't hit 2x, but then other scenes with a lot more mesh data run much faster.

    Speaking of DLSS, I wonder if Iray will improve its denoiser with the updated Tensor cores? If they could improve the denoiser, that could be a big game changer right there.

    The last thing I want to urge people to remember is that it may take time for Daz Studio to even support the new 4000 series! If you rush out and buy a 4090 and don't keep a backup card you might not be able to render for a while until Daz puts out the update! I could be wrong, and maybe Lovelace works on launch day, but we have a long history of that not being the case. So I don't want people to jump on Lovelace expecting it to just work without a backup plan. In fact I am putting this into the opening post, just in case somebody hasn't considered this.

  • With the prices the way they are now, I will probably only buy like every 4 generations now lol. Unless I hit the lottery, or some rich distant cousin I never heard of croaks and leaves me a fortune.

  • RayDAnt Posts: 1,144

    EnoughButter said:

    HamEinar said:

    Hmm, am I the only one noticing that the 4090 "only" has 6000 more CUDA cores? Actual render speed would be 1.5x at best compared to a 3090 - with the added power consumption, space (heat in the system) and price... If you already have a 3090, getting a second 3090 would be the best bang for your buck.

    CUDA cores are the only measurement for render speed?

    Essentially yes, but with a huge caveat: GPU memory capacity. In order for a scene to be renderable, it has to be able to fit in the memory buffer of whatever GPU you are attempting to render it on.
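
    To make that concrete, here is a minimal sketch of a pre-render check - nothing Iray or Daz Studio actually exposes, just a hedged illustration assuming nvidia-smi is on the PATH and that estimated_scene_mib is your own rough guess at the scene's footprint:

    ```python
    import subprocess

    def free_vram_mib():
        """Query each GPU's name and free/total VRAM (in MiB) via nvidia-smi."""
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,memory.free,memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        gpus = []
        for line in out.strip().splitlines():
            name, free, total = (field.strip() for field in line.split(","))
            gpus.append((name, int(free), int(total)))
        return gpus

    # Hypothetical figure: your own estimate of how much VRAM the scene needs.
    estimated_scene_mib = 9000

    for name, free, total in free_vram_mib():
        verdict = "should fit" if free > estimated_scene_mib else "will likely drop to CPU"
        print(f"{name}: {free} / {total} MiB free -> scene {verdict}")
    ```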

  • rrward Posts: 556

    RayDAnt said:

    @outrider42,

    Don't forget the RTX 6000 (not to be confused with the Quadro RTX 6000 of yore) coming in December...

    With a max TDP of only 300 watts.

     

    ETA: It's also a PCIe Gen 4 card, meaning that the rest of the 40-series line will also be Gen 4 (not 5) based. Food for thought.

     

    The new RTX X000 cards are what I'm looking at. I currently have an A5000 for my render box and I'm very happy with it.

  • Pass, I'm good with my RTX 3080.
  • Gator Posts: 1,312

    Expozures said:

    Mattymanx said:

    Those 4090 specs look nice on paper. It will be interesting to see how it really performs.

     

    I'm waiting for Gamers Nexus and Jayztwocents to get their grubby hands on them first before I decide if I'm going to splurge on a 4090 or pick up a cheap-ish 3090Ti.  I'm just rocking a 3070 now, and could use a VRAM boost.  Sucks only being able to get 1 or 2 models and a bit of scenery in a render before it decides to cap-out the memory.

    Same, I'll wait for Gamers Nexus and Jayztwocents to get their hands on them.  Although I already have 3090s.

    I take the manufacturer's performance claims with a big grain of salt, but it does look promising. There are a lot more cores, clocked higher.

    Another thing is whether we'll actually see Nvidia lowering prices - this is the first time I can recall seeing very steep discounts on new 3090s and big discounts on 3080s. We have a much different environment than in past generations: a recession, inflation, a glut of inventory, AMD catching up, and very impressive performance from the latest-gen consoles. Given the current environment, it's hard to imagine demand on the high end being high enough. I think gamers will be in wait-and-see mode until AMD comes out in November to see what they deliver.

  • Gator Posts: 1,312

    outrider42 said:

    oddbob said:

    outrider42 said:

    There is always a bottleneck. A CPU could be faster. The drives could be faster. The RAM could be faster. All of these have some impact on gaming performance. I imagine we will see lots of people doing tests with PCIe 3.0 and Lovelace given that 3.0 is still popular. But I don't think it will be much of an issue. If it actually is any kind of bottleneck...I am certain it is a very small one.

    Worst case for a 3090 between PCIe 3 and 4 is about 2%. For my uses, rendering, VR and 4k gaming the 3090 is the limiting factor. That's with a 10700K with MCE enabled, 3200 ram, SN750, +100/500 on the GPU and a custom loop with 420 & 280 rads. A 4090 may change the dynamics. Have to wait and see. I find the 2x performance claim for rasterised games believable. I'm looking forward to rendering benchmarks, if it follows the pattern from 20 series to 30 series it should be impressive.

    Sounds like you already have your answer then. The 4090 is still PCIe 4.0, and its VRAM is not much faster than the 3090's. All the PCIe comparisons I have seen have required crazy high frame rates to even begin to see performance drops for 4.0 versus 3.0. These rates are higher than what most 4K monitors can even handle, since most max out at 144 Hz. Samsung only just released the first 240 Hz 4K display this year. But I would rather have an OLED myself, and most of those currently cap at 120 Hz, except for QD-OLED which caps at 165 Hz I think. The 4090 should have no problem hitting that cap in most games for the next couple of years.

    The one that sticks out to me is the Hardware Unboxed video that showed cheap DDR5 running Spiderman with ray tracing 15% faster than DDR4 at 1080p. I wonder if that is a sign of what may be coming. 

    I think you want to look into OLED more. Personally, I'd avoid it on a computer as OLED is prone to burn-in. Permanent taskbar and icon shadows.

  • outrider42 Posts: 3,679

    Gator said:

    outrider42 said:

    oddbob said:

    outrider42 said:

    There is always a bottleneck. A CPU could be faster. The drives could be faster. The RAM could be faster. All of these have some impact on gaming performance. I imagine we will see lots of people doing tests with PCIe 3.0 and Lovelace given that 3.0 is still popular. But I don't think it will be much of an issue. If it actually is any kind of bottleneck...I am certain it is a very small one.

    Worst case for a 3090 between PCIe 3 and 4 is about 2%. For my uses, rendering, VR and 4k gaming the 3090 is the limiting factor. That's with a 10700K with MCE enabled, 3200 ram, SN750, +100/500 on the GPU and a custom loop with 420 & 280 rads. A 4090 may change the dynamics. Have to wait and see. I find the 2x performance claim for rasterised games believable. I'm looking forward to rendering benchmarks, if it follows the pattern from 20 series to 30 series it should be impressive.

    Sounds like you already have your answer then. The 4090 is still PCIe 4.0, and its VRAM is not much faster than the 3090's. All the PCIe comparisons I have seen have required crazy high frame rates to even begin to see performance drops for 4.0 versus 3.0. These rates are higher than what most 4K monitors can even handle, since most max out at 144 Hz. Samsung only just released the first 240 Hz 4K display this year. But I would rather have an OLED myself, and most of those currently cap at 120 Hz, except for QD-OLED which caps at 165 Hz I think. The 4090 should have no problem hitting that cap in most games for the next couple of years.

    The one that sticks out to me is the Hardware Unboxed video that showed cheap DDR5 running Spiderman with ray tracing 15% faster than DDR4 at 1080p. I wonder if that is a sign of what may be coming. 

    I think you want to look into OLED more. Personally, I'd avoid it on a computer as OLED is prone to burn-in. Permanent taskbar and icon shadows.

    OLEDs don't get burn-in as often as people claim they do. LED TVs actually fail a LOT more often. And I mean a LOT more often than OLEDs do. There is a risk of burn-in, yes, but there is also a significantly larger risk that your LED TV will fail than that any OLED will get burn-in.

  • nonesuch00 Posts: 18,288

    In the past I was always on the fence: wait and see, wait for pricing, wait for availability, wait for the end of price gouging. But now that I've seen how the GeForce RTX 3060 12GB I have COULD do with more speed and more memory, I am 100% buying the GeForce RTX 4090. Not a second's hesitation there. It's a must. I actually have enough money saved now and am struggling to decide which will come first: an Apple MacBook Air M1 or a GeForce RTX 4090. I have to have both by the end of 2023. Both are musts though, no vacillating there.

  • nonesuch00 said:

    In the past I was always on the fence: wait and see, wait for pricing, wait for availability, wait for the end of price gouging. But now that I've seen how the GeForce RTX 3060 12GB I have COULD do with more speed and more memory, I am 100% buying the GeForce RTX 4090. Not a second's hesitation there. It's a must. I actually have enough money saved now and am struggling to decide which will come first: an Apple MacBook Air M1 or a GeForce RTX 4090. I have to have both by the end of 2023. Both are musts though, no vacillating there.

    That's my thought exactly. I've got a 3070, and I really could use more VRAM and more CUDA cores to boost up my render game. My fence is: do I hop on the 4090 bandwagon, or do I take advantage of the drastically cut MSRP of the 3090 Ti? How much of a gain would I get with a 4090 over a 3090 Ti?

  • Gator Posts: 1,312

    Expozures said:

    nonesuch00 said:

    In the past I was always on the fence: wait and see, wait for pricing, wait for availability, wait for the end of price gouging. But now that I've seen how the GeForce RTX 3060 12GB I have COULD do with more speed and more memory, I am 100% buying the GeForce RTX 4090. Not a second's hesitation there. It's a must. I actually have enough money saved now and am struggling to decide which will come first: an Apple MacBook Air M1 or a GeForce RTX 4090. I have to have both by the end of 2023. Both are musts though, no vacillating there.

    That's my thought exactly. I've got a 3070, and I really could use more VRAM and more CUDA cores to boost up my render game. My fence is: do I hop on the 4090 bandwagon, or do I take advantage of the drastically cut MSRP of the 3090 Ti? How much of a gain would I get with a 4090 over a 3090 Ti?

    Yeah, how much faster a 4090 will be than a 3090 or 3090 Ti is a question many of us want answered.

    Nvidia quotes some very impressive increases in performance, and just looking at the specs, they do appear to be grounded in reality.

    3090: 10,496 cores, 328 TMUs, 112 ROPs, 1395 MHz base clock, 1695 MHz boost clock

    3090 Ti: 10,752 cores, 336 TMUs, 112 ROPs, 1560 MHz base clock, 1860 MHz boost clock

    4090: 16,384 cores, 512 TMUs, 192 ROPs, 2235 MHz base clock, 2520 MHz boost clock

    Almost 60% more CUDA cores with about a 50% increase in boost clock over the 3090.  Throw in architecture improvements and double performance looks plausible.
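
    As a sanity check on that, here is a quick back-of-the-envelope calculation of theoretical FP32 throughput (counting 2 FLOPs per CUDA core per clock). It only reflects paper specs, not RT cores, memory, or Iray's actual scaling, so treat the ratio as a ceiling rather than a prediction:

    ```python
    # Theoretical peak FP32 throughput: 2 FLOPs (one FMA) per CUDA core per clock.
    def peak_tflops(cores, boost_mhz):
        return 2 * cores * boost_mhz * 1e6 / 1e12

    cards = {
        "RTX 3090":    (10496, 1695),
        "RTX 3090 Ti": (10752, 1860),
        "RTX 4090":    (16384, 2520),
    }

    baseline = peak_tflops(*cards["RTX 3090"])
    for name, (cores, boost) in cards.items():
        tflops = peak_tflops(cores, boost)
        print(f"{name}: {tflops:5.1f} TFLOPS ({tflops / baseline:.2f}x the 3090)")
    # Roughly 35.6, 40.0 and 82.6 TFLOPS -- about a 2.3x paper ratio over the 3090.
    ```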

  • nonesuch00 Posts: 18,288

    Expozures said:

    nonesuch00 said:

    In the past I was always on the fence: wait and see, wait for pricing, wait for availability, wait for the end of price gouging. But now that I've seen how the GeForce RTX 3060 12GB I have COULD do with more speed and more memory, I am 100% buying the GeForce RTX 4090. Not a second's hesitation there. It's a must. I actually have enough money saved now and am struggling to decide which will come first: an Apple MacBook Air M1 or a GeForce RTX 4090. I have to have both by the end of 2023. Both are musts though, no vacillating there.

    That's my thought exactly. I've got a 3070, and I really could use more VRAM and more CUDA cores to boost up my render game. My fence is: do I hop on the 4090 bandwagon, or do I take advantage of the drastically cut MSRP of the 3090 Ti? How much of a gain would I get with a 4090 over a 3090 Ti?

    Well, what piqued my interest and made it a must was the head honcho stating that the cores that do the DLSS / AI stuff (Tensor cores, does that sound right?) have been made around twice as fast.

  • Ghosty12 Posts: 2,065
    edited September 2022

    There looks to be an issue coming that not many folks knew about until now; the video below explains what is going on.

  • Bejaymac Posts: 1,897

    Ghosty12 said:

    There looks to be an issue coming that not many folks knew about until now; the video below explains what is going on.

    Yep, you need a new PSU and adapter cable if you want to get a 40-series card; otherwise you'll end up with a desk fire rather than a PC.

  • oddbob Posts: 402

    outrider42 said:

    OLEDs don't get burn-in as often as people claim they do. LED TVs actually fail a LOT more often. And I mean a LOT more often than OLEDs do. There is a risk of burn-in, yes, but there is also a significantly larger risk that your LED TV will fail than that any OLED will get burn-in.

    I've had an LG C1 on my desk for about 13 months and I've yet to see any image retention, despite many hours using the same few games and apps.

    There are ways to lessen the risk and there are better options if you just want a large monitor for productivity.

    If you want to play games or watch media though they're very good, and considering they're on a similar sale cycle to Daz stuff they're decent value compared to other screens.

  • I wouldn't have a complete panic attack about the PSU just yet. From what I gleaned from Jayz's video, the big risk is in using pigtail connectors. A 4090 will need 3 PCIe connectors to power the sucker.

    According to Nvidia you need either:

    3x PCIe 8-pin cables from power supply to included RTX 4090 Power Connector Adapter. Graphics card supports 3x or 4x PCIe 8-pin cables.

    or

    1x 450W or greater PCIe Gen 5 power cable from power supply

    Again, waiting for Gamer Jesus to do his testing, but I think if you have enough headroom, and you have a good PSU that has lots of VGA connectors, you should be okay.

    Not going to say to do it... but I picked up an EVGA 1200W SuperNova P3 PSU. It has six 8-pin PCIe connectors. I'm hoping this will get Jesus's blessing. I've been doing some searching, and from what I can see, the only PCIe 5.0 PSU on the market right now is from Gigabyte... which... eh... I'd prefer not to burn my house down, thanks.

    [Attached screenshot: Screenshot 2022-09-24 062640.png, 1621 x 636]
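
    To put a rough number on the "enough headroom" point above, here is a hedged back-of-the-envelope sketch using the usual spec limits of 150 W per 8-pin PCIe cable plus 75 W from the slot. Transient spikes can exceed the rated board power, so treat this as a sanity check, not Gamer Jesus's blessing:

    ```python
    # Rule-of-thumb connector budget: 150 W per 8-pin PCIe cable + 75 W from the slot.
    PIN8_W = 150
    SLOT_W = 75
    CARD_TDP_W = 450  # advertised 4090 board power

    def connector_budget(num_8pin_cables):
        return num_8pin_cables * PIN8_W + SLOT_W

    for cables in (3, 4):
        budget = connector_budget(cables)
        print(f"{cables}x 8-pin: {budget} W available vs {CARD_TDP_W} W TDP "
              f"({budget - CARD_TDP_W:+d} W margin)")
    ```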
  • oddbob Posts: 402

    nonesuch00 said:

     I am 100% buying the GeForce RTX 4090. Not a second's hesitation there. It's a must.

    I'd wait for reviews first to see if there are any unexpected problems, either model specific or in general. 

    I do think it's a good thing to buy early in the cycle if you intend on buying.

    I've bought GPUs when they go end of life and there are savings to be had, but you're also missing out on using the faster thing for a couple of years.

    I'm sticking with my 3090 for another couple of years I think.

  • oddbob Posts: 402

    Expozures said:

    Again, waiting for Gamer Jesus to do his testing, but I think if you have enough headroom, and you have a good PSU that has lots of VGA connectors, you should be okay.

    The only thing that's new here are the four sense pins. The other issues were present before but weren't a problem with a well-designed PSU and GPU. My 3090 is doing 375W on a custom 2 x 8-pin to 12-pin cable from moddiy and has been doing so for a long time. It's been in and out several times, fitted with a block and stuck on a riser. Distinct lack of melting.

  • Ghosty12 Posts: 2,065

    oddbob said:

    Expozures said:

    Again, waiting for Gamer Jesus to do his testing, but I think if you have enough headroom, and you have a good PSU that has lots of VGA connectors, you should be okay.

    The only thing that's new here are the four sense pins. The other issues were present before but weren't a problem with a well-designed PSU and GPU. My 3090 is doing 375W on a custom 2 x 8-pin to 12-pin cable from moddiy and has been doing so for a long time. It's been in and out several times, fitted with a block and stuck on a riser. Distinct lack of melting.

    There was that, but there was also the concern about the flimsiness of the connector being used on the cards.

  • oddbob Posts: 402

    Ghosty12 said:

    oddbob said:

    Expozures said:

    Again, waiting for Gamer Jesus to do his testing, but I think if you have enough headroom, and you have a good PSU that has lots of VGA connectors, you should be okay.

    The only thing that's new here are the four sense pins. The other issues were present before but weren't a problem with a well-designed PSU and GPU. My 3090 is doing 375W on a custom 2 x 8-pin to 12-pin cable from moddiy and has been doing so for a long time. It's been in and out several times, fitted with a block and stuck on a riser. Distinct lack of melting.

    There was that, but there was also the concern about the flimsiness of the connector being used on the cards.

    The cable connectors should be fine unless you're buying cheap and nasty cables off Amazon or eBay, or plugging and unplugging them more than normal. The quality of the connectors on the card is down to the design and manufacturing of the board partner. Sometimes they do daft things in a stupid fashion. The 8-pin connectors on the 3080/3090 Gigabyte Eagle cards come to mind. Because it was a short board and they wanted the plugs in the traditional position at the end of the card, the power plugs were fitted to the cooler. Because the back of the plug had inadequate support, cards were arriving with out-of-place or loose pins. People were struggling to get the power leads in, or finding that the pins moved or came loose if the cards were unplugged. Cards were revised and unusable ones replaced, but it's always better to wait for decent reviews and let other people do the beta testing. Another thing I'm waiting to see is how hot and noisy the new cards get after being run hard in a typical PC case for an hour.

  • nonesuch00 Posts: 18,288

    Ghosty12 said:

    There looks to be an issue coming that not many folks knew about until now; the video below explains what is going on.

    Frankenstein says: I'm scared of fire, and it's almost Halloween month. Here is an example of a PCIe 5 PSU needed for the new PCIe 5 graphics cards.

    MEG Ai1300P PCIE5 | Power Supply | Overflow With Power (msi.com)

    It's not available to buy yet, unfortunately, as PSUs get much less hype than graphics cards.

  • outrider42 Posts: 3,679

    I honestly feel like some of that is a bit of fear mongering. It is good to be skeptical, of course. But we already have 450-watt 3090 Tis, and the 4090 is using the same amount of power. It is possible that the 3090 Ti was something of a test for the 4090. So they are not going into this blind.

    Again I reiterate that undervolting is a great option.
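
    For anyone who wants something in that spirit without touching voltage curves, here is a minimal sketch of the closest one-command equivalent: capping board power with nvidia-smi. It is not a true undervolt and nothing Daz-specific; it needs admin rights, the wattage has to sit inside the limits the driver reports, and the setting typically resets on reboot:

    ```python
    import subprocess

    def cap_gpu_power(gpu_index: int, watts: int) -> None:
        """Show current power limits, then apply a lower cap via nvidia-smi."""
        # Print the default / min / max enforced limits for reference.
        subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-q", "-d", "POWER"],
                       check=True)
        # Apply the new cap (requires elevated privileges).
        subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
                       check=True)

    if __name__ == "__main__":
        cap_gpu_power(0, 350)  # e.g. rein a 450 W card in to 350 W
    ```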

    Nvidia showed a slide that stated "current generation games" saw a 60-70% uplift. Not quite the 2x they claim in other games which use RTX features. But this still falls in line with my expectations that Iray performance will double (or more). Gaming doesn't benefit from generational uplifts the way Iray does. It has been that way every generation, even before "RTX" existed. The 3090 was around 45% faster than the 2080 Ti at games, and look how that turned out for Iray. (Very well.)

    So I am not one bit concerned about what performance Lovelace will bring. Prices are more of an issue, mainly with the two 4080s costing what they do. The 4080 12GB has no business costing what it does. That is a 294 mm² die, which is SMALLER than the 1070 from 2016, which had a 314 mm² die. That is just wow. In fact this chip is the smallest "x80" class card released. The smallest. And they can't get it under $900? Huh? The die sizes of these chips really tell a story.

    There are certainly some strange things going on. The door for AMD has never been wider in its history against Nvidia. It will be very interesting to see what AMD reveals when their time comes.

    In other news, Intel finally announced their ARC A770 for $329. This GPU will not work with Iray, sorry. But it packs 16GB of VRAM and could rival a 3060. If the drivers work... that hasn't been smooth sailing. The die size is 406 mm², BTW, which is a lot larger than the 4080 12GB's. It is on a 6nm node rather than the 4nm that Lovelace is on, but still, that is kind of funny. There are reports that Intel may be selling these for little profit, or possibly even at a loss, since they want to get a foot in the door in the GPU market. Still, this card packs more VRAM and uses a bigger chip than the $900 4080 12GB.

  • fred9803 Posts: 1,564

    I'm probably not alone in saying I can't afford a 4000 series, let alone afford to run one given its power requirements and today's power bills, plus the PC upgrade I'd have to do to accommodate it. They've priced themselves out of the market for me and, I suspect, many others.

     

  • Gordig Posts: 10,171

    outrider42 said:

    Nvidia showed a slide that stated "current generation games" saw a 60-70% uplift. Not quite the 2x they claim in other games which use RTX features. But this still falls in line with my expectations that Iray performance will double (or more). Gaming doesn't benefit from generational uplifts the way Iray does. It has been that way every generation, even before "RTX" existed. The 3090 was around 45% faster than the 2080 Ti at games, and look how that turned out for Iray. (Very well.)

    It's logical that that would be the case. Nvidia isn't making the games, but they do make Iray.

  • outrider42 Posts: 3,679

    So Nvidia released CUDA 11.8 today. This is important to us as Iray users, because Iray is based on CUDA. The biggest detail to know is that this update adds Lovelace support to the CUDA platform.

    11.8

    • This release introduces support for both the Hopper and Ada Lovelace GPU families.

    https://developer.nvidia.com/blog/cuda-toolkit-11-8-new-features-revealed/

    This is huge news, because it is an indication that Iray will indeed require an update in order to use Lovelace at all. Which means that we have to wait for Iray to update their SDK, and then Daz Studio to release a new version that uses this new Iray. It always takes Daz at least a month to get a beta out after the SDK drops. So where is the SDK?

    The Iray Dev Blog is still silent with no updates since August. That really is getting on my nerves, and has me a bit worried that it may be a while before Daz Studio gets 4000 series support. How long? We simply do not know.

    Of course if anybody from Daz-Tafi has any information regarding the subject, NOW WOULD BE A SWELL TIME TO TELL US, the customers, what is going on. These cards release in just SEVEN days!

    If you are thinking of buying a 4090 on day 1, you better have a plan in case it doesn't work in Daz Studio! I am trying to warn you. Do not ditch your current GPU. Maybe we get lucky and the 4090 works on launch day, but what *if* it doesn't? Are you prepared for that possibility?
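
    If you do take the day-1 plunge, here is a rough, hedged sketch of how to at least see what your local NVIDIA stack reports. It assumes nvidia-smi (and optionally nvcc) are on the PATH, and it only shows the driver and any system toolkit - Daz Studio bundles its own Iray/CUDA runtime, so this cannot prove Lovelace support inside Iray:

    ```python
    import re
    import subprocess

    def run(args):
        """Return a command's stdout, or an empty string if it isn't installed."""
        try:
            return subprocess.run(args, capture_output=True, text=True).stdout
        except FileNotFoundError:
            return ""

    # Driver side: GPU name and driver version as nvidia-smi reports them.
    gpu = run(["nvidia-smi", "--query-gpu=name,driver_version",
               "--format=csv,noheader"]).strip()
    print("GPU / driver:", gpu or "nvidia-smi not found")

    # Toolkit side: per the release notes quoted above, CUDA 11.8 is the first
    # toolkit release that supports the Ada Lovelace family.
    match = re.search(r"release (\d+)\.(\d+)", run(["nvcc", "--version"]))
    if match:
        version = tuple(map(int, match.groups()))
        print(f"CUDA toolkit {version[0]}.{version[1]}:",
              "includes Ada support" if version >= (11, 8) else "predates Ada support")
    else:
        print("No system CUDA toolkit found (normal for most Daz Studio installs)")
    ```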

  • EnoughButter Posts: 103
    edited October 2022

    outrider42 said:

    I honestly feel like some of that is a bit of fear mongering. It is good to be skeptical, of course. But we already have 450 Watt 3090tis, the 4090 is using the same amount of power. It is possible that the 3090ti was something of a test for the 4090. So they are not going into this blind.

    I think you are right. Both Nvidia and Corsair (and, more importantly, Jonny Guru of Corsair) have said it won't be an issue (and mentioned the fear mongering). And Steve at GN also mentioned today it wasn't a big deal. I put their combined authority a bit higher than that other dude, but people can believe who they want, lol. 

     

    outrider, have you seen the size of the Aorus card? lol

    It completely dwarfs the 4090 FE.

     
