AMPERE RENDERING BENCHMARKS... OMG


Comments

  • stephenschoon Posts: 360
    edited September 2020

    If it's true, the 3080 does look amazing, we need to wait for some Iray benchmarks...
    Steve.

    Post edited by stephenschoon on
  • If it's true, the 3080 does look amazing, we need to wait for some Iray benchmarks...
    Steve.

    True, true. I think it will do very well with Iray, although I still don't like the small amount of VRAM. I had two 2070s, and getting double the CUDA cores in one GPU compared to what I had with those two is amazing.

  • Hurdy3D Posts: 1,058

    If it's true, the 3080 does look amazing, we need to wait for some Iray benchmarks...
    Steve.

    No, you don't have to wait: https://blog.irayrender.com/post/628125542083854336/after-yesterdays-announcement-of-the-new-geforce

  • tj_1ca9500b Posts: 2,057
    edited September 2020

    https://www.guru3d.com/articles_pages/geforce_rtx_3080_founder_review,30.html

    This shows the 3080 performing about twice as fast as the 2080 Ti in Blender.

    Post edited by tj_1ca9500b on
  • towdow3 said:

    If it's true, the 3080 does look amazing, we need to wait for some Iray benchmarks...
    Steve.

    True, true. I think it will do very well with Iray, although I still don't like the small amount of VRAM. I had two 2070s, and getting double the CUDA cores in one GPU compared to what I had with those two is amazing.

    What you have to remember is that this card is a replacement for the 2080, not the 2080 Ti. So an extra 2 GB of VRAM. If/when a Ti replacement is brought out, it should have more VRAM and even better performance.
  • I'm watching reviews... I'm gonna be broke AF... OHHHH BOY

  • nicstt Posts: 11,715
    edited September 2020

    Please don't use all caps in your title.

     

    droidy001 said:
    towdow3 said:

    If it's true, the 3080 does look amazing, we need to wait for some Iray benchmarks...
    Steve.

    True, true. I think it will do very well with Iray, although I still don't like the small amount of VRAM. I had two 2070s, and getting double the CUDA cores in one GPU compared to what I had with those two is amazing.

     

    What you have to remember is that this card is a replacement for the 2080, not the 2080 Ti. So an extra 2 GB of VRAM. If/when a Ti replacement is brought out, it should have more VRAM and even better performance.

    According to Nvidia, this was a replacement for the 2080 Ti, presuming I'm remembering their presentation correctly.

    Post edited by nicstt on
  • droidy001 said:
    towdow3 said:

    If it's true, the 3080 does look amazing, we need to wait for some Iray benchmarks...
    Steve.

    True, true. I think it will do very well with Iray, although I still don't like the small amount of VRAM. I had two 2070s, and getting double the CUDA cores in one GPU compared to what I had with those two is amazing.

     

    What you have to remember is that this card is a replacement for the 2080, not the 2080 Ti. So an extra 2 GB of VRAM. If/when a Ti replacement is brought out, it should have more VRAM and even better performance.

    Riiiiiight, I hear that. A 3080 Ti, IF they call it that, would be a good move; it would probably have 16 GB of VRAM and such.

  • nonesuch00 Posts: 18,320

    Wow! Wow! Wow! I know what I'm saving for this winter's pastime.

  • nicstt said:

    Please don't use all caps in your title.

     

    droidy001 said:
    towdow3 said:

    If it's true, the 3080 does look amazing, we need to wait for some Iray benchmarks...
    Steve.

    True, true. I think it will do very well with Iray, although I still don't like the small amount of VRAM. I had two 2070s, and getting double the CUDA cores in one GPU compared to what I had with those two is amazing.

     

    What you have to remember is that this card is a replacement for the 2080, not the 2080 Ti. So an extra 2 GB of VRAM. If/when a Ti replacement is brought out, it should have more VRAM and even better performance.

    According to Nvidia, this was a replacement for the 2080 Ti, presuming I'm remembering their presentation correctly.

    They were definitely comparing it to a Ti. It's all a blur with the speculation we've had. But there's definitely room to fit another card between the 3080 and 3090.
  • nicstt Posts: 11,715
    edited September 2020

    I agree there is room for more cards, and there will certainly be some.

    I think that review is the most balanced and objective.

    If you're in the market for a new card and your current card is, say, a 980 Ti or older/lower spec, then I would say it's well worth the upgrade.

    If you game at 1080p, it's not worth an upgrade if you have a 2080 Ti.

    Looking at the Blender comparison in the Linus Tech Tips review, it looks like an amazing upgrade.

    But from what one of the videos showed, it may not fit in your case because of depth, not just length, possibly due to the cooler and the motherboard you have.

    It was also interesting to note that AMD's Ryzen 9(?) was mostly not particularly affected in comparison to Intel's best, although there was a difference, and it varied by title. EDIT: At 4K there was no difference in at least one of the reviews, with any differences showing up mainly at 1440p.

    If you're going to use it for 1080p gaming (and are a gamer), then I'm inclined to call you an idiot (sorry).

     

    Post edited by nicstt on
  • Wait till tomorrow for the AIB cards. The FE card runs at the limit of the card's power delivery. That is not a good sign for basically any 2x8-pin card. When Nvidia said this thing was 320 W TDP, that was actually what it ran at under benchmark loads.

    CUDA rendering has traditionally been a lower load, but with the CUDA cores now firing twice per clock, which is effectively an overclock, people should keep an eye on power consumption on these cards until we get solid power and thermal benchmarks during long CUDA runs.
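    As a back-of-the-envelope sketch of that headroom concern (using the nominal 75 W slot and 150 W-per-8-pin limits from the PCIe/ATX power specs; the 320 W figure is the stated TDP, and `power_headroom` is just an illustrative name, not anything official):

```python
# Nominal power-delivery limits per the PCIe/ATX specs: the slot itself
# supplies up to 75 W, and each 8-pin auxiliary connector up to 150 W.
SLOT_W = 75
EIGHT_PIN_W = 150

def power_headroom(tdp_w: float, n_eight_pin: int = 2) -> float:
    """Spare capacity in watts between the spec budget and the card's TDP."""
    budget = SLOT_W + n_eight_pin * EIGHT_PIN_W
    return budget - tdp_w

# A 320 W TDP card on 2x8-pin has only ~55 W of spec headroom (~15%),
# which is why sustained compute loads are worth watching.
print(power_headroom(320))  # -> 55
```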

  • nicstt Posts: 11,715

    I agree.

    I don't like how much power it needs. I pay the electric bill.
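    To put a rough number on that electric bill (hypothetical figures: a card drawing its full 320 W and a $0.15/kWh tariff; adjust for your own rate, and `render_cost` is just an illustrative name):

```python
# Back-of-the-envelope electricity cost of a render: watts -> kWh -> cost.
def render_cost(watts: float, hours: float, price_per_kwh: float = 0.15) -> float:
    kwh = watts / 1000 * hours          # energy used over the render
    return kwh * price_per_kwh          # cost at the assumed tariff

# A 4-hour render on a 320 W card is ~1.28 kWh:
print(round(render_cost(320, 4), 2))  # -> 0.19
```

    Small per render, but it adds up over a winter of long overnight renders.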

  • Robinson Posts: 751
    edited September 2020

    Thanks for that. First time I've seen actual RT (Cycles, though, but at least something that uses OptiX). Looks like a 3080 would be around 3 times faster than my vanilla 2070! The one I really want to see is the 3070, though. It's more in my price range.

     

    Anyway, very encouraging.

     

    By the way, a lot of the reviewers are making extremely dumb arguments. What you're getting is something better than a 2080 Ti for $400 less than a 2080 Ti. As JayzTwoCents pointed out in his review, it's been a very long time since that kind of price/performance bump happened in consumer hardware.

    Post edited by Robinson on
  • bluejaunte Posts: 1,923
    Robinson said:

    Thanks for that. First time I've seen actual RT (Cycles, though, but at least something that uses OptiX). Looks like a 3080 would be around 3 times faster than my vanilla 2070! The one I really want to see is the 3070, though. It's more in my price range.

     

    Anyway, very encouraging.

     

    By the way, a lot of the reviewers are making extremely dumb arguments. What you're getting is something better than a 2080 Ti for $400 less than a 2080 Ti. As JayzTwoCents pointed out in his review, it's been a very long time since that kind of price/performance bump happened in consumer hardware.

    That was also preceded by a massive price bump with the 2080 Ti, though, let's not forget. I mean, really, we're now back to slightly more reasonable prices? Still not exactly cheap, just that they have somehow managed to make us think we're getting a bargain this time around. Kudos to their marketing, I guess.

  • evacyn Posts: 975

    How much RAM is everyone waiting for? Can we expect a 3080 Ti with 16 GB in the new year? I have twin 1080 Tis and I'd love to upgrade, but 10 GB isn't enough.

  • Robinson said:

    Thanks for that. First time I've seen actual RT (Cycles, though, but at least something that uses OptiX). Looks like a 3080 would be around 3 times faster than my vanilla 2070! The one I really want to see is the 3070, though. It's more in my price range.

     

    Anyway, very encouraging.

     

    By the way, a lot of the reviewers are making extremely dumb arguments. What you're getting is something better than a 2080 Ti for $400 less than a 2080 Ti. As JayzTwoCents pointed out in his review, it's been a very long time since that kind of price/performance bump happened in consumer hardware.

    The 20 series was new tech; this is now building on that. I think the reason everyone is so fixated on the 2080 Ti price is that it didn't drop as it got older. There was no reason to drop the price because it had no competitors. I think there will still be a $1000-1200 card in the range at some point. It seems Nvidia have given us a huge leap to blow AMD out of the water before they even launch. I know Moore's law seems to have slowed down in recent years, but it's hanging on in there.
  • nicstt Posts: 11,715

    I wouldn't consider less than 24 GB, which is what I was going for in an RTX Titan, but Nvidia have kindly made that (in the form of a 3090) cheaper.

    I'm tempted to wait for AMD's offerings, though (I use Blender for my renders), but that would preclude me from using Iray on occasion. Interesting times.

  • You need to upgrade your whole PC to PCIe 4.0, sadly. So it's not only buying an expensive card, but a new motherboard too.

  • nicstt Posts: 11,715

    They are backward compatible. Once the render data is on the card, the PCIe version is of limited importance, as I understand it?

  • You need to upgrade your whole PC to PCIe 4.0, sadly. So it's not only buying an expensive card, but a new motherboard too.

    No, they're compatible with PCIe 3.0.  You'll be fine with that.  The two things you need to check are power supply and case size, especially in cases that have drive bays.

  • That was also preceded by a massive price bump with the 2080 Ti, though, let's not forget. I mean, really, we're now back to slightly more reasonable prices? Still not exactly cheap, just that they have somehow managed to make us think we're getting a bargain this time around. Kudos to their marketing, I guess.

    I'm not sure the arbitrage really matters here.  Yes the 2080 Ti was ridiculously expensive, but then again NVIDIA has beancounters and knows more or less how many they'll sell at that price.  They bin chips for them as well.  The only thing that matters for most consumers is what you get for the amount you usually pay.  I'm usually in the xx60 to xx70 range and have been pretty consistent in that for many years now, so that's how I look at it.

  • Robinson said:

    That was also preceded by a massive price bump with the 2080 Ti, though, let's not forget. I mean, really, we're now back to slightly more reasonable prices? Still not exactly cheap, just that they have somehow managed to make us think we're getting a bargain this time around. Kudos to their marketing, I guess.

    I'm not sure the arbitrage really matters here.  Yes the 2080 Ti was ridiculously expensive, but then again NVIDIA has beancounters and knows more or less how many they'll sell at that price.  They bin chips for them as well.  The only thing that matters for most consumers is what you get for the amount you usually pay.  I'm usually in the xx60 to xx70 range and have been pretty consistent in that for many years now, so that's how I look at it.

    Exactly how I'm looking at it: how much better for the same money? It doesn't matter what they call them. Pound for pound, the difference over the 20 series is huge. They are not giving us anything; they either know or believe AMD is sitting on something big in either value or performance.
  • Robinson said:

    You need to upgrade your whole PC to PCIe 4.0, sadly. So it's not only buying an expensive card, but a new motherboard too.

    No, they're compatible with PCIe 3.0.  You'll be fine with that.  The two things you need to check are power supply and case size, especially in cases that have drive bays.

    Compatible, yes, but you won't get the full potential the card could offer.

  • Leonides02 Posts: 1,379

  • I'm incredibly excited. I have four 1080 Ti's... I actually have no idea if I'm going to keep some and upgrade to a 3090, or what.

    I probably won't be able to fit more than two 3090's in my case.

  • nicstt said:
    If you're going to use it for 1080p gaming (and are a gamer), then I'm inclined to call you an idiot (sorry).

    Why, though? Wouldn't it work really well for 1080p gaming? Would you only buy hardware that works with what you have right now, or wouldn't it make sense to pick something up now that can drive 4K later in case you want to go that route?

  • joseft Posts: 310
    duckbomb said:
    nicstt said:
    If you're going to use it for 1080p gaming (and are a gamer), then I'm inclined to call you an idiot (sorry).

    Why, though? Wouldn't it work really well for 1080p gaming? Would you only buy hardware that works with what you have right now, or wouldn't it make sense to pick something up now that can drive 4K later in case you want to go that route?

    Looking into future potential resolution upgrades for your system is a good point, but I think it's more aimed at people who currently have a 2080 Ti or a 2080 and are looking at upgrading to a 3080 just for 1080p gaming, without considering going up to 1440p or 4K. That's where it would be a bit silly at the present time. The difference in performance in that space is minimal, and both those cards can run every current game easily at 1080p, with the one possible exception being Flight Simulator 2020, where you may need to turn some settings below ultra to maintain 30+ fps.

    If you were one of those people with a 2080 or 2080 Ti and the only use case you have for it is 1080p gaming, a much smarter decision would be to, at the very least, pass on the initial rush and see what else comes out of the woodwork in the next six months or so, i.e. wait to see if Ti versions appear.

  • rrward Posts: 556
    Robinson said:

    You need to upgrade your whole PC to PCIe 4.0, sadly. So it's not only buying an expensive card, but a new motherboard too.

    No, they're compatible with PCIe 3.0.  You'll be fine with that.  The two things you need to check are power supply and case size, especially in cases that have drive bays.

    Compatible, yes, but you won't get the full potential the card could offer.

    Unless you game at 4K, it doesn't matter. The difference in data throughput for rendering is minuscule compared to the time the card spends performing the render. It's not like gaming, where new data is constantly being fed to the card.
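    A quick sketch of that proportion (the bandwidth figures are approximate theoretical maxima for x16 links, and the scene size and render time are made-up examples):

```python
# Why PCIe generation barely matters for offline rendering: the scene is
# uploaded once, then the card crunches for minutes with little traffic.
# Approximate x16 link rates: ~15.75 GB/s (PCIe 3.0), ~31.5 GB/s (PCIe 4.0).
def transfer_seconds(scene_gb: float, gb_per_s: float) -> float:
    """Time to push the scene data across the bus."""
    return scene_gb / gb_per_s

scene_gb = 8.0    # hypothetical scene upload
render_s = 600.0  # a 10-minute render

t3 = transfer_seconds(scene_gb, 15.75)  # PCIe 3.0 upload time
t4 = transfer_seconds(scene_gb, 31.5)   # PCIe 4.0 upload time
saved = t3 - t4
print(f"PCIe 4.0 saves ~{saved:.2f} s of a {render_s:.0f} s render "
      f"({100 * saved / render_s:.3f}%)")
```

    Even doubling the bus speed only trims a fraction of a second off a render measured in minutes.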

  • Yeeeep. Already have my 3090 $ set aside.

  • rrward Posts: 556
    edited September 2020

    I'm incredibly excited. I have four 1080 Ti's... I actually have no idea if I'm going to keep some and upgrade to a 3090, or what.

    I probably won't be able to fit more than two 3090's in my case.

    I upgraded from three 1080 Tis to two 2080 Tis and still saw a marked improvement in render performance. I would not be surprised if the 3090 outperformed my two 2080 Tis. I think I could fit two 3090s in my case, and I know my PSU could handle it...

    [Edited for spelling]

    Post edited by rrward on