Nvidia Ampere (2080 Ti, etc. replacements) and other rumors...

Comments

  • On the comparison page, only the 3-slot RTX-3090 is shown to have the NVIDIA NVLink™ (SLI-Ready) option. Not the 2-slot 3080 or 3070.
    Source: https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/compare/

  • On the comparison page, only the 3-slot RTX-3090 is shown to have the NVIDIA NVLink™ (SLI-Ready) option. Not the 2-slot 3080 or 3070.
    Source: https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/compare/

    I'm trying to figure out if I can even fit the 3090 in my case with my 1080 Ti; I don't think I can fit two 3090s, lol.

  • I'm trying to figure out if I can even fit the 3090 in my case with my 1080 Ti; I don't think I can fit two 3090s, lol.

    Why would you need to? The 3090 is so far ahead of the 1080 Ti... Here's the thing with dual setups: you have to be very careful with thermals. Your temperatures go up, your clocks go down. Putting two cards in one machine (I'm talking air cooling here, of course) will seriously hurt the airflow around both cards, and that will give you worse performance overall, including from the CPU, which is sitting in that hotter environment too. It may be better overall, but it's never 2x, so you have to ask whether the value proposition of that second card is worth it.

  • Gordig Posts: 10,191

    Has there been any indication of whether the 3090 can be put into TCC mode?
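
    For reference, on Windows the driver model can be queried and requested through nvidia-smi; whether a card actually accepts TCC is up to the driver, and GeForce-class cards have generally refused it. A minimal sketch, assuming nvidia-smi is on the PATH and an elevated shell:

    ```python
    # Query and request the NVIDIA driver model (WDDM vs. TCC) on Windows.
    # Assumes nvidia-smi is on the PATH; setting -dm needs admin rights and
    # a reboot, and GeForce-class cards may refuse TCC entirely.
    import subprocess

    def current_driver_model(gpu: int = 0) -> str:
        """Return the current driver model (e.g. 'WDDM' or 'TCC') for one GPU."""
        out = subprocess.run(
            ["nvidia-smi", "-i", str(gpu),
             "--query-gpu=driver_model.current", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()

    def request_tcc(gpu: int = 0) -> None:
        """Ask the driver to switch the GPU to TCC mode (-dm 1, 0 = WDDM)."""
        subprocess.run(["nvidia-smi", "-i", str(gpu), "-dm", "1"], check=True)

    if __name__ == "__main__":
        print(current_driver_model(0))
    ```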

  • marble Posts: 7,500

    I am disappointed but not surprised that the 3070 is stuck with the same 8GB of VRAM that I have in my GTX 1070. VRAM is *the* limiting factor of IRay and NVidia continue to ignore this or push us towards the expensive end which is beyond the spending capacity of many hobbyists such as myself. I'm guessing that Optix and some of the other fancy features in the new cards will require even more VRAM so my scenes will soon be down to a couple of characters with texture sizes reduced beyond the point where the seams start to show. High hopes dashed for me but I can see the thrill of those with cash to burn.
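
    To put rough numbers on why 8GB disappears so quickly, some back-of-envelope arithmetic; the texture size, the uncompressed-RGBA assumption, and the maps-per-character count are illustrative guesses rather than anything Iray-specific:

    ```python
    # Back-of-envelope VRAM estimate for texture-heavy scenes.
    # Illustrative assumptions: uncompressed 8-bit RGBA textures with a
    # full mip chain (~4/3 overhead); the map count per character is a guess.
    def texture_mb(width: int, height: int, bytes_per_pixel: int = 4,
                   mipmaps: bool = True) -> float:
        base = width * height * bytes_per_pixel
        return base * (4 / 3 if mipmaps else 1) / 2**20

    per_map = texture_mb(4096, 4096)   # ~85 MB per 4K map
    maps_per_character = 12            # diffuse/normal/spec/etc. (guess)
    per_character_gb = per_map * maps_per_character / 1024

    print(f"{per_map:.0f} MB per 4K map, ~{per_character_gb:.1f} GB per character")
    # ~85 MB per map, ~1.0 GB per character -- before geometry, the frame
    # buffer, and whatever the OS and other apps are already holding.
    ```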

  • Aala Posts: 140
    edited September 2020
    marble said:

    VRAM is *the* limiting factor of IRay and NVidia continue to ignore this or push us towards the expensive end which is beyond the spending capacity of many hobbyists such as myself.

    I don't think Nvidia had Iray or equivalent renderers in mind when designing the 3070. The 3070 is a perfectly balanced card for gaming, able to run 4K at 60 fps easily, and it will likely end up being the best-selling card in the lineup. 8GB is good enough for 4K right now, though it certainly could be better (and probably will be once AMD launches its cards in turn).

  • marble said:

    I am disappointed but not surprised that the 3070 is stuck with the same 8GB of VRAM that I have in my GTX 1070. VRAM is *the* limiting factor of IRay and NVidia continue to ignore this or push us towards the expensive end which is beyond the spending capacity of many hobbyists such as myself. I'm guessing that Optix and some of the other fancy features in the new cards will require even more VRAM so my scenes will soon be down to a couple of characters with texture sizes reduced beyond the point where the seams start to show. High hopes dashed for me but I can see the thrill of those with cash to burn.

    I would still wait. I'm pretty sure they'll release another set of cards with more RAM, and they'll probably wait until AMD releases RDNA2 to announce them officially. So I'm going to hold off on buying for a while, I think.

  • marble Posts: 7,500
    Robinson said:
    marble said:

    I am disappointed but not surprised that the 3070 is stuck with the same 8GB of VRAM that I have in my GTX 1070. VRAM is *the* limiting factor of IRay and NVidia continue to ignore this or push us towards the expensive end which is beyond the spending capacity of many hobbyists such as myself. I'm guessing that Optix and some of the other fancy features in the new cards will require even more VRAM so my scenes will soon be down to a couple of characters with texture sizes reduced beyond the point where the seams start to show. High hopes dashed for me but I can see the thrill of those with cash to burn.

    I would still wait. I'm pretty sure they'll release another set of cards with more RAM, and they'll probably wait until AMD releases RDNA2 to announce them officially. So I'm going to hold off on buying for a while, I think.

    Some of the rumours mention that the double-sized VRAM variants are expected in spring next year (another six months to wait), and then only if Nvidia is pushed into producing them by whatever ATI releases in the meantime. Again, my hopes have a habit of biting the dust, so I'm not raising them yet. My son was looking for a GPU upgrade and offered to buy my 1070 if I upgrade to a 3070, but that looks unlikely now too.

  • marble said:

    Some of the rumours mention that the double-sized VRAM variants are expected in spring next year (another six months to wait), and then only if Nvidia is pushed into producing them by whatever ATI releases in the meantime. Again, my hopes have a habit of biting the dust, so I'm not raising them yet. My son was looking for a GPU upgrade and offered to buy my 1070 if I upgrade to a 3070, but that looks unlikely now too.

    Six months isn't long to wait though, in the grand scheme of things. I'm not buying two cards this release cycle, that's for sure, so waiting it is.

  • billyben_0077a25354 Posts: 771
    edited September 2020
    Aala said:

    Man, I'm in a dilemma now. I have two 2080 Ti's, but don't think it's possible for me to just switch to two 3090's. There's no room in my PC, and I'm not sure if my 1000W PSU will be able to handle it. I really hope that for Iray, the 3090 is at least 2x faster, because a straight swap would probably make sense, but on the other hand I might have to sell my two 2080 Ti's for just $500 each, or less!

    And then there's the fact that it's on Samsung 8nm, which I'm pretty sure means the series is going to have a refresh next year with either TSMC 7nm or another smaller node from Samsung, bringing down power consumption by up to 50% for the same performance. What do I do?!?!? Dilemmas... aghhh...

    With a pair of 2080 Ti's, you are sitting pretty and have some time and options. I am stuck with a single GTX 1070 and have to do something, because my 1070 is now two generations old and I need more memory and CUDA cores to do the scenes I want to do.

  • Drip Posts: 1,206

    I first want to see some Iray performance comparisons. For now, I'm actually pretty satisfied with my 2070 anyway, though some additional memory would be nice. So I'll just see what the Iray benchmarks are, and hope for a 3070 variant with somewhere between 10 and 16 gigs in six months. A 3070 Ti or 3070 Super could be an interesting improvement, both in performance and memory.

  • kenshaw011267 Posts: 3,805
    edited September 2020
    Gordig said:

    Now Nvidia could be pulling NVLink but that would be a pretty major screw job.

    I could be misremembering, but didn't you say something to the effect that VRAM could be shared over PCIE 4.0?

    It isn't fast enough. PCIE gen 5 is, but that is a couple of years off. The motherboards that had NVLink built in were custom ones for the DGX computers that Nvidia sold.

    A pair of 2080's or 2070 Super's, the slowest NVLink commonly available, can transfer 25 GB/s each way. A 16x PCIE gen 4 connection would max out at 31.5 GB/s one way. So it's over half the speed of the slowest connection.

    And yes, there is a slide that does say the 3080 and 3070 do not have NVLink.

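    The 31.5 GB/s figure above falls straight out of the PCIe lane arithmetic. A quick sketch (the per-lane rates and 128b/130b encoding are published PCIe parameters; the NVLink value is the per-direction figure quoted above):

    ```python
    # Per-direction bandwidth arithmetic: PCIe vs. a low-end NVLink bridge.
    # PCIe 4.0 runs 16 GT/s per lane with 128b/130b encoding; PCIe 5.0
    # doubles the per-lane rate. The 25 GB/s NVLink number is the commonly
    # quoted per-direction figure for a 2070 Super / 2080 bridge.
    def pcie_gbps(lanes: int, gt_per_s: float, encoding: float = 128 / 130) -> float:
        """Usable GB/s in one direction for a PCIe link."""
        return lanes * gt_per_s * encoding / 8  # 8 bits per byte

    print(f"PCIe 4.0 x16: {pcie_gbps(16, 16.0):.1f} GB/s per direction")  # ~31.5
    print(f"PCIe 5.0 x16: {pcie_gbps(16, 32.0):.1f} GB/s per direction")  # ~63.0
    print("NVLink (2070 Super / 2080 bridge): 25.0 GB/s per direction")
    ```
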
  • I'm rather curious, why the rush to get these newer cards? Unless I missed something, these cards probably won't work with Daz iray rendering until they update the software, which could take a few months or more, no?

  • Diaspora Posts: 459
    edited September 2020

    Speaking for myself, my bar for buying an RTX 3090 was "will it cut my render times in iRay to under 33% of where I am now with my EVGA RTX 2080 black (non-oc)?"

    Glancing at the specs, it looks like it clears the threshold...

    RTX 2080 has 2944 CUDA cores
    RTX 3090 has 10496 CUDA cores

    I'm going to wait to purchase until I see some actual real-world benchmarks, but right now this is looking VERY promising, and the productivity and quality-of-life improvements this will give me are more than worth the price of admission.

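    As a rough sanity check on that 33% bar, a naive estimate that assumes render time scales inversely with the advertised CUDA core count; clocks, memory, and how Ampere actually counts its cores all cut into this, so treat it as a best case:

    ```python
    # Naive render-time scaling estimate: assumes render time is inversely
    # proportional to advertised CUDA core count. This ignores clock speed,
    # memory bandwidth, and Ampere's doubled-FP32 counting, so treat the
    # result as an optimistic bound, not a prediction.
    cores_2080 = 2944
    cores_3090 = 10496

    relative_time = cores_2080 / cores_3090
    print(f"Estimated 3090 render time: {relative_time:.0%} of the 2080's")
    # ~28% -- under the 33% bar, but only if performance really scaled
    # linearly with the advertised core count.
    ```
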
  • Chumly said:

    So with the 3090 at 24GB VRAM....

    Is that like... 7 G8Fs + an environment in one go?

    I'll use my current TITAN RTX and let ya know.
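
    For what it's worth, a toy budget for that question; every number here is an assumption (per-character VRAM varies enormously with texture sizes), so this is a plausibility check, not a measurement:

    ```python
    # Rough feasibility check for "7 G8Fs + an environment" in 24 GB.
    # All numbers are assumptions: per-character VRAM varies wildly with
    # texture sizes, and the renderer and frame buffer take their own cut.
    vram_gb = 24
    per_character_gb = 2.0   # assumed average for a fully textured G8F
    environment_gb = 4.0     # assumed environment + HDRI budget
    overhead_gb = 3.0        # assumed working space / frame buffer / OS

    budget = vram_gb - environment_gb - overhead_gb
    print(f"Characters that fit: {budget / per_character_gb:.0f}")  # ~8
    ```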

  • Diaspora said:

    Glancing at the specs, it looks like it clears the threshold...

    RTX 2080 has 2944 CUDA cores
    RTX 3090 has 10496 CUDA cores

    I'm going to wait to purchase until I see some actual real-world benchmarks, but right now this is looking VERY promising, and the productivity and quality-of-life improvements this will give me are more than worth the price of admission.

    There's some confusion over the number of "cores" here. I think it's actually half that on the 3090, but each one does two instructions per clock. That was one part of the presentation that wasn't clear. Anyway, Digital Foundry actually got a hands-on look (one of the rare few). In games at least, RT looks about twice as fast. Rasterising around that too. I don't know how it scales with resolution. We'll have to wait for actual numbers from anyone who can get their hands on one once they hit shelves.

    Another point here, I think AMD's new generation of cards is going to drop around November time.  We can probably expect another round of NVIDIA cards first quarter next year then, or maybe earlier, with more memory.  Or maybe that's wishful thinking on my part.  The competition is good though.

  • nicstt Posts: 11,715
    Gordig said:

    Guess there's a 3090 in my future. I'd been saving up for a Titan RTX, but not no more.

    I'm still saving for one, but might grab a 3090 to drive my monitors and help with rendering as well; I wonder what AMD will do.

    Whatever happens, my 970 I use for the monitors has a faulty port and does seem to be less capable than it once was.

  • nicstt Posts: 11,715
    Aala said:

    Man, I'm in a dilemma now. I have two 2080 Ti's, but don't think it's possible for me to just switch to two 3090's. There's no room in my PC, and I'm not sure if my 1000W PSU will be able to handle it. I really hope that for Iray, the 3090 is at least 2x faster, because a straight swap would probably make sense, but on the other hand I might have to sell my two 2080 Ti's for just $500 each, or less!

    And then there's the fact that it's on Samsung 8nm, which I'm pretty sure means the series is going to have a refresh next year with either TSMC 7nm or another smaller node from Samsung, bringing down power consumption by up to 50% for the same performance. What do I do?!?!? Dilemmas... aghhh...

    Wait... Save cash.

    Not wait... Spend cash.

    ... The choice is yours.

  • nicstt Posts: 11,715
    Robinson said:

    I'm trying to figure out if I can even fit the 3090 in my case with my 1080 Ti; I don't think I can fit two 3090s, lol.

    Why would you need to? The 3090 is so far ahead of the 1080 Ti... Here's the thing with dual setups: you have to be very careful with thermals. Your temperatures go up, your clocks go down. Putting two cards in one machine (I'm talking air cooling here, of course) will seriously hurt the airflow around both cards, and that will give you worse performance overall, including from the CPU, which is sitting in that hotter environment too. It may be better overall, but it's never 2x, so you have to ask whether the value proposition of that second card is worth it.

    Not if one is only driving monitors, or is only used very occasionally. I've been using a 970/980 Ti for about five years now and have used the 970 with the 980 about five times - if that.

  • nicstt Posts: 11,715
    Aala said:

    Man, I'm in a dilemma now. I have two 2080 Ti's, but don't think it's possible for me to just switch to two 3090's. There's no room in my PC, and I'm not sure if my 1000W PSU will be able to handle it. I really hope that for Iray, the 3090 is at least 2x faster, because a straight swap would probably make sense, but on the other hand I might have to sell my two 2080 Ti's for just $500 each, or less!

    And then there's the fact that it's on Samsung 8nm, which I'm pretty sure means the series is going to have a refresh next year with either TSMC 7nm or another smaller node from Samsung, bringing down power consumption by up to 50% for the same performance. What do I do?!?!? Dilemmas... aghhh...

    With a pair of 2080 Ti's, you are sitting pretty and have some time and options. I am stuck with a single GTX 1070 and have to do something, because my 1070 is now two generations old and I need more memory and CUDA cores to do the scenes I want to do.

    Need?

    Nah, lots of us manage with less. Sure it's nice to have - or do you use it for work?

  • EnoughButter Posts: 103
    edited September 2020

    EVGA just sent me an email; looks like they will have *five* 3090 models coming out, three 3080 models, and two 3070 models, at least to start. Scared to find out what the Kingpin version will cost.

    https://www.evga.com/articles/01434/evga-geforce-rtx-30-series/

    Looks like at least a couple of the models will still use 8-pin power connectors.

    MSI will have four 3090 models

    https://videocardz.com/press-release/msi-announces-geforce-rtx-3090-rtx-3080-and-rtx-3070-graphics-cards

    ASUS has two 3090 models so far

    https://www.tweaktown.com/news/74882/asus-intros-geforce-rtx-30-series-rog-strix-tuf-gaming-graphics-cards/index.html

  • Robinson said:
    Diaspora said:

    Glancing at the specs, it looks like it clears the threshold...

    RTX 2080 has 2944 CUDA cores
    RTX 3090 has 10496 CUDA cores

    I'm going to wait to purchase until I see some actual real-world benchmarks, but right now this is looking VERY promising, and the productivity and quality-of-life improvements this will give me are more than worth the price of admission.

    There's some confusion over the number of "cores" here. I think it's actually half that on the 3090, but each one does two instructions per clock. That was one part of the presentation that wasn't clear. Anyway, Digital Foundry actually got a hands-on look (one of the rare few). In games at least, RT looks about twice as fast. Rasterising around that too. I don't know how it scales with resolution. We'll have to wait for actual numbers from anyone who can get their hands on one once they hit shelves.

    Another point here, I think AMD's new generation of cards is going to drop around November time.  We can probably expect another round of NVIDIA cards first quarter next year then, or maybe earlier, with more memory.  Or maybe that's wishful thinking on my part.  The competition is good though.

    Yeah. There is some real weirdness about this CUDA stuff. I sent an email to my Nvidia rep and got back nothing but a timestamp into the presentation and a note that more will come next week. So don't just wait for game data; wait for rendering performance data. While we likely won't see Iray data till people around here get the cards, the early benchmarks on Blender and V-Ray should give us some idea of how this new CUDA performs on rendering tasks.

    Although I'd be really shocked if it did badly. Nvidia makes a lot of money off the enterprise and CUDA.

  • Aala said:

    Man, I'm in a dilemma now. I have two 2080 Ti's, but don't think it's possible for me to just switch to two 3090's. There's no room in my PC, and I'm not sure if my 1000W PSU will be able to handle it. I really hope that for Iray, the 3090 is at least 2x faster, because a straight swap would probably make sense, but on the other hand I might have to sell my two 2080 Ti's for just $500 each, or less!

    And then there's the fact that it's on Samsung 8nm, which I'm pretty sure means the series is going to have a refresh next year with either TSMC 7nm or another smaller node from Samsung, bringing down power consumption by up to 50% for the same performance. What do I do?!?!? Dilemmas... aghhh...

    With a pair of 2080 Ti's, you are sitting pretty and have some time and options. I am stuck with a single GTX 1070 and have to do something, because my 1070 is now two generations old and I need more memory and CUDA cores to do the scenes I want to do.

    Looking at the options out there, your best bang for the buck might be to see if you can snag a matched pair of 2070 Super's and an NVLink bridge. At anything less than full price, that is going to be pretty cheap compared to the 3090. Even at full price you're saving $400+, but you're down 8GB.
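
    A rough cost sketch at launch MSRPs ($499 per 2070 Super, $1,499 for the 3090; the bridge price is an assumption):

    ```python
    # Rough cost comparison at launch MSRPs. The NVLink bridge price is an
    # assumption (~$80); street prices for used 2070 Supers would be lower.
    pair_2070s = 2 * 499     # two 2070 Supers at launch MSRP
    nvlink_bridge = 80       # assumed bridge price
    rtx_3090 = 1499          # 3090 launch MSRP

    saving = rtx_3090 - (pair_2070s + nvlink_bridge)
    print(f"Savings vs. a 3090: ${saving}")  # ~$421 at full price
    # Trade-off: up to ~16 GB pooled over NVLink (where Iray can use it)
    # vs. the 3090's 24 GB on one card.
    ```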

    Wishful thinking maybe, but is it possible that the RTX 3090 has 10,496 CUDA cores AND does two calculations per clock?

    I would think that if they're describing 10,496 CUDA cores without an asterisk, they would be describing a tangible feature of the PCB.

  • Diaspora Posts: 459
    edited September 2020
    Diaspora said:
    Looking at the options out there, your best bang for the buck might be to see if you can snag a matched pair of 2070 Super's and an NVLink bridge. At anything less than full price, that is going to be pretty cheap compared to the 3090. Even at full price you're saving $400+, but you're down 8GB.

    Problem with that plan is that it looks like only the 3090 is going to have NVLink. I guess Nvidia is making the multi-card setup a premium feature from now on?

  • Diaspora said:

    Wishful thinking maybe, but is it possible that the RTX 3090 has 10,496 CUDA cores AND does two calculations per clock?

    I would think that if they're describing 10,496 CUDA cores without an asterisk, they would be describing a tangible feature of the PCB.

    I find it strange that it has more than twice the CUDA cores of a Titan RTX, the same amount of VRAM, and yet it is around half the price. Any plausible explanation for this?

  • Diaspora Posts: 459
    edited September 2020

    It's my understanding that the Titan cards were always case studies of immense diminishing returns.

    Also, Titan RTX is Turing 12nm, while Ampere is 8nm, so maybe it's just cheaper for them to pack on the cores because they're physically smaller now?

  • Gordig Posts: 10,191
    Diaspora said:
    Diaspora said:
    Looking at the options out there, your best bang for the buck might be to see if you can snag a matched pair of 2070 Super's and an NVLink bridge. At anything less than full price, that is going to be pretty cheap compared to the 3090. Even at full price you're saving $400+, but you're down 8GB.

    Problem with that plan is that it looks like only the 3090 is going to have NVLink. I guess Nvidia is making the multi-card setup a premium feature from now on?

    That has no impact on the 2070 Super.

  • i53570k Posts: 212

    Big Navi can't launch soon enough. Nvidia is holding back on the $700-$1500 gap to counter AMD. It's either going to be a 12GB 3090 (3080 Ti?) or a 20GB 3080; both can launch right away if Nvidia chooses. If Big Navi is competitive with the 3080 but comes with 16GB, then it's probably a 20GB 3080 displacing the 10GB version at a reasonable premium. In that case I am happy to get that. If Big Navi lands between the 3080 and 3070, then we'll likely see the $1000 12GB 3080 Ti instead, and then it's the eye-watering 3090 or bust. There is also hope that Big Navi can challenge the 3090....

  • Diaspora said:

    Wishful thinking maybe, but is it possible that the RTX 3090 has 10,496 CUDA cores AND does two calculations per clock?

    I would think that if they're describing 10,496 CUDA cores without an asterisk, they would be describing a tangible feature of the PCB.

    No. They've had to own up and admit the physical number of CUDA cores is half the number on the slides; otherwise those wouldn't just be generational increases but leaps. Plus, they have stated how many SMs are on each card and how many CUDA cores per SM, and it works out to half.

    A lot of this presentation was, if not outright misleading, shaded that way. They left a lot out and glossed over a lot else. As a 1080 Ti owner, I felt they spent a lot of time hammering that 1080 Ti owners should upgrade without really making a compelling case.
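
    For anyone wanting to check the arithmetic: Nvidia's published Ampere SM layout gives 128 FP32-capable lanes per SM, of which only 64 are dedicated FP32 units; the other 64 switch between FP32 and INT32 each clock.

    ```python
    # RTX 3090 (GA102) "CUDA core" arithmetic, per the published SM layout:
    # each SM has 64 dedicated FP32 units plus 64 units that run either
    # FP32 or INT32 on a given clock, and Nvidia counts all 128 as cores.
    sms = 82
    dedicated_fp32 = 64      # FP32-only units per SM
    shared_fp32_int32 = 64   # FP32-or-INT32 units per SM

    advertised = sms * (dedicated_fp32 + shared_fp32_int32)
    fp32_only = sms * dedicated_fp32

    print(f"Advertised CUDA cores: {advertised}")  # 10496
    print(f"Dedicated FP32 units:  {fp32_only}")   # 5248 -- the "half" above
    ```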
