Is an RTX 2060 Super a good card to get or...


Comments

  • StratDragon Posts: 3,253

    I'm trying to decide if I should get an RTX 2060 Super card. Is it a decent card, at 6GB of VRAM, for higher resolution scenes or should I wait for something better?

    About to start creating graphics for my first OGN and just looking for some insight.

    My 1660 Ti 6GB taps out with larger scenes, but for previews it's indispensable. If you can go with a 20xx, they appear to be excellent. My concern with the 3000 series is that, with COVID-19, the release date is going to be pushed back, and the cost of parts due to scarcity might change the price point dramatically. My 16-series card has actually gone up in price since last August.

     

  • outrider42 Posts: 3,679
    The 2070 Super is a very interesting card. With Ampere on the way, I wouldn't recommend buying any Turing right now. But the 2070 Super using NVLink may be an exception. The 2070 was never intended to have NVLink; that was exclusive to the 2080 and up. But Nvidia was forced to respond to the upcoming AMD 5700 XT launch. The entire Super refresh is a direct response to AMD. Unlike the original 2070, the 2070 Super is a cut-down 2080, and uses the same board. That board has NVLink. Honestly I am surprised Nvidia didn't disable it in the 2070 Super, but since it works, the 2070 Super is the cheapest way to access NVLink and increase your VRAM capacity.

    Nvlink does work with Iray now. It has been tested with the 2080ti.
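
    For anyone who wants to verify that a bridged pair of cards is actually exposing NVLink to the driver, here is a minimal Python sketch (it assumes the NVIDIA driver's nvidia-smi tool is on the PATH and that the driver is recent enough to support the nvlink subcommand; the exact output wording varies by driver version):

    import subprocess

    def nvlink_status() -> str:
        """Return whatever nvidia-smi reports about NVLink links."""
        result = subprocess.run(
            ["nvidia-smi", "nvlink", "--status"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    if __name__ == "__main__":
        # Cards without a bridge (or without NVLink at all) typically report
        # the links as inactive or print nothing useful here.
        print(nvlink_status() or "No NVLink information reported.")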

    Since the 2070 was never intended to get this, you can bet that next gen Ampere will restrict Nvlink from the 3070. They might even restrict higher cards from it for all we know. So that makes the 2070 Super the most interesting Turing card, and Nvlink in general really changes the usual upgrade path when Ampere launches.

    But make no mistake. Ampere is releasing this year. The 3060 and 3070 might release later, but the top cards are releasing this year. With AMD set to launch this year, Nvidia will respond. Remember how the Super lineup is a response to AMD? Nvidia even teased their Super line within days of AMD's scheduled announcement event. Do any of you think this is mere coincidence?

    Nvidia has always been aggressive when AMD presses them. Today AMD is in a different place. For several years AMD was in a downward spiral, and consumer confidence soured. But Ryzen has turned AMD around. Consumers are now much more interested in what AMD is doing. Their next launch is going to be big, and they have confirmed their products will launch this year. In fact, they have stated that their GPUs will launch before the new consoles do.

    Do you think that Nvidia will just sit still during AMD's big launch? They have this down to a science. Each of these companies has eyes on the other. They have their own sources, they are not developing in a vacuum.

    There are obviously fake leaks out there. But just because a leak doesn't come true doesn't mean the leak was wrong or fake at the time. Specs are constantly in flux, and many can be changed right up to the last minute. So sometimes a spec is changed for whatever reason. Sometimes they have multiple designs going at once, and choose one.

    The leaked cooler is a great example. It appears this cooler is real, since Nvidia is investigating how it leaked, which pretty much confirms it is legit. But it is still possible Ampere doesn't use these! They can have several different cooler designs going, and these could just be tests. So even though the leak is real, it still may not be what we get.

    So it is important to consider things like this when you hear about various leaks and rumors.

    But there are things we do know. We know AMD is gunning for the top now, something they haven't done for nearly 10 years. We know consumers are hyped for AMD GPUs, which also hasn't been the case for years. We also know the exact specs of both consoles, and that they will be quite powerful devices themselves. The consoles present a whole new level of competition to Nvidia. So Nvidia cannot screw around. They have to hit back hard, or they risk losing market share. Every time there has been such a threat, Nvidia historically responds.

    Nvidia also historically has followed up blunders with more solid releases. Turing might not be technically a blunder, but it has garnered a lot of criticism for its pricing. While I don't think we will see Pascal prices, I don't think prices will increase over Turing.

    Also, I expect the 3060 to at least ray trace faster than a 2080 Ti. I think that is a realistic expectation given the focus that ray tracing is getting. As for normal performance, maybe not, but do remember that the 2060 was on par with the 1080, while also offering ray tracing. So Nvidia has made some big leaps before. It is totally within reason to expect 2080-level performance from a 3060 along with 2080 Ti-level ray tracing. And again, just look at the competition. The new Xbox is going to match up with a 2080, so Nvidia had to respond to this.

    But you will not be able to Nvlink any 3060 or 3070.
  • DMax Posts: 637
    DustRider said:

    If you're looking for a top-quality budget laptop, Clevo/Sager laptops are really good. I have one that was custom assembled by Prostar that I used heavily for 3+ years, and it still runs really well (I had to move up to a machine with 128GB RAM, so I don't use it much now). You can order a custom build from their website, or get a prebuilt from Amazon. This would be a great machine for the price:

    https://www.amazon.com/Prostar-PB51DF2-i7-10875H-3200Mhz-Gaming/dp/B087Z9M1Q2/ref=sr_1_1?dchild=1&keywords=prostar+clevo&qid=1593320086&s=pc&sr=1-1

    I had my new laptop (MSI GT76 Titan) custom built at Xotic PC, and I'm really happy with it, and with the quality of service and build from Xotic. I would have gone with Prostar again, but they didn't offer a machine with 128GB RAM. My next laptop (3 years or so) will probably be from either Prostar or Xotic PC (note that Prostar has been in business for about 20 years, which is impressive for a small custom shop).

    Thank you very much @DustRider for those insights! I took a look and Prostar does seem like a good option although it appears to be limited to 16GB RAM. I will check out Xotic PC! Thank you kind sir!

  • j cade Posts: 2,310
    edited June 2020
    I feel like every time someone asks "is this a good card?" the response is *always* "no, now is the worst time to buy; thingamajig X is coming out within the year and it will completely change everything. We don't know all that much about it, but it's a definite game changer."

    At some point, the potential time you save rendering on a faster card seems like it's counterbalanced by the time spent waiting to buy the perfect card.

    Post edited by j cade on
  • nonesuch00 Posts: 18,333
    edited June 2020

    kenshaw011267 said:

    Some actual facts that won't stop the hype train but might give some people actual information.

    Ampere has been announced, but no commercial or datacenter cards have been announced. NONE.

    There have been no specs released or even release dates announced.

    All this stuff from the clickbait end of the IT journalism world is worthless. Some of it might be legit, but if you go back and look at these same sources' reporting before every other major launch you will find them full of completely untrue stuff. Most recently, when Navi was approaching release, there were claims that it would include HW ray tracing: not true. That the top-end card would rival, or flat outperform, the 2080 Ti: not true. And so on.

    What we do know about Ampere entering the consumer market is that Nvidia is still testing cooler designs for their reference cards. There was a recent, likely legit, leak of images of a very strange cooler. Cooler Master, which makes Nvidia's reference coolers, has not stopped or slowed production of other products, which it would likely need to do to free up lines for these coolers. Samsung, which will be the fab for the Ampere chips, has not slowed delivery of other 7nm wafers, so the chips do not seem to be in mass production either. The Nvidia partners also have not reduced production of Turing cards or started trimming product lines, another indication that the Ampere cards are not yet in production.

    Taken as a whole, along with the utter lack of Nvidia hyping an announcement, it is unlikely Ampere will be released in 2020. It is possible they will try and get something to market by Black Friday but that window is closing rapidly.

    The articles I read never claimed the Navi generation you are referring to would have HW ray tracing and outperform the 2080 Ti (they said similar or maybe even better), but that the Navi generation due out in the second half of 2020 "might" do both of those things, and that the "Big Navi" coming out would have real-time ray tracing, which is actually confirmed.

    So I went and read more of their claims since this came up. Remember, it's Big Navi that is being used in both the next-gen Xbox and PlayStation consoles, so I think it's a given that the performance and capabilities will be a major improvement over the expensive Radeon 5700 XT GPUs. They claimed the performance of Big Navi will be about on par with the Nvidia 2080 Ti but lacking compared with the upcoming Nvidia 30 series, and it appears from the AMD and game console tech reveals that they are right. I'm not sure why they call that information 'leaks' when it was the tech reveals by the manufacturers of that hardware that revealed those things.

    Post edited by nonesuch00 on
  • nicstt Posts: 11,715
    edited June 2020
    Visuimag said:

     

     

    nicstt said:
    Noah LGP said:

    RTX 3060 (8GB GDDR6) should be 5% slower than the 2080 Ti, and it will be released in Q1 2021.

    Facts should be backed up with evidence.

    https://www.tweaktown.com/news/72225/nvidias-mid-range-geforce-rtx-3060-could-beat-flagship-2080-ti/index.html

    In any event, to the OP: yes, a 2060 is a decent card for rendering. Spend your money how you want, though (I just built a mostly new rig last week, for example). You can get it now (even though I'd advise the next step up, a 2070), or grab one of the 3000 series cards in a few months. That's the great thing about options. :)

    That's not evidence, but speculation at best. The article states them to be leaked specs. Leaks can be accurate, or not. Then we have "up to" appearing in the CUDA core count.

    ... And then again, we have, at the start: "The last few days have been pretty crazy in the GPU world, with purported specs on NVIDIA's upcoming Ampere-based GeForce RTX 3080 Ti blowing us away". You'll notice the word "purported", or at least you should have, because that is the most important word in that sentence.

    I like reading the rumours, but that is just what they are.

    Some will turn out to be accurate, and as has happened previously, some will not.

    @everyone

    Please don't make factual claims when someone is looking for help with a purchase.

    Post edited by nicstt on
  • takezo_3001 Posts: 1,998
    edited June 2020

     

    That makes no sense. I render 3 or 4 G8s with environments on an 8GB card.

    Tell that to my GPU... wink

    But yeah, it does make sense, as I'm trying for high realism and my characters are pretty heavy as far as mats/shaders go; plus, I use at least 4-8K HDRIs. I could go for a heavily optimized scene, but that would compromise the look I'm trying to achieve!

    Rather than get into a big discussion over it, let's just keep it as a subjective thing between artists and leave it at that. wink 

    But no matter, once I get my 3k GPU, the issue will take care of itself!smiley

    Post edited by takezo_3001 on
  • kenshaw011267 Posts: 3,805
    edited June 2020

     

    nonesuch00 said:

    The articles I read never claimed the Navi generation you are referring to would have HW ray tracing and outperform the 2080 Ti (they said similar or maybe even better), but that the Navi generation due out in the second half of 2020 "might" do both of those things, and that the "Big Navi" coming out would have real-time ray tracing, which is actually confirmed.

    So I went and read more of their claims since this came up. Remember, it's Big Navi that is being used in both the next-gen Xbox and PlayStation consoles, so I think it's a given that the performance and capabilities will be a major improvement over the expensive Radeon 5700 XT GPUs. They claimed the performance of Big Navi will be about on par with the Nvidia 2080 Ti but lacking compared with the upcoming Nvidia 30 series, and it appears from the AMD and game console tech reveals that they are right. I'm not sure why they call that information 'leaks' when it was the tech reveals by the manufacturers of that hardware that revealed those things.

    Big Navi only became a thing once AMD released only mid-range cards, in comparison to Nvidia. They aren't even, strictly speaking, Navi cards: Navi is RDNA 1, and the new cards will be RDNA 2.

    Anyone who doubts that such claims were definitely made about Navi before 7/7/2019 can easily enough use Google.

    The sites that spend almost all of their time pushing these rumors are wrong more often than not. They make money not by being reliable but by screaming headlines that get gullible people to click through. Just examine the absurd, and contradictory, "leaks" about Ampere. There was a made-up set of specs circulating, which got posted here, claiming the cards were just a generational improvement in performance but also consumed 10 to 25% more power while being on a smaller process node. Anyone, and I do mean anyone, who knows computer hardware would have known that was garbage. But there it was.

     

     

    Post edited by kenshaw011267 on
  • kenshaw011267 Posts: 3,805

     

    takezo_3001 said:

    But yeah, it does make sense, as I'm trying for high realism and my characters are pretty heavy as far as mats/shaders go; plus, I use at least 4-8K HDRIs. I could go for a heavily optimized scene, but that would compromise the look I'm trying to achieve!

    ? I only optimize when scenes exceed my 11GB card, which is pretty rare, as I know what it can handle after more than 3 years with it.

    If by high realism you mean excessive SubD (what else could you possibly mean?), then you're hamstringing yourself for no gain. I've checked: a SubD 3 and a SubD 4 figure are indistinguishable to the eye, and nearly so using image comparison software on an extreme closeup. But heavy mats and shaders? If you're using 4K maps for lots of stuff, you are again pointlessly hamstringing yourself. Unless you are doing closeups so tight that you should see the character's pores, you won't see a difference.

    But please test it for yourself. There are plenty of good image comparison programs out there.
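
    If you would rather script the comparison than use a standalone program, a minimal Python sketch along these lines does the job (the file names are placeholders for two renders of the same frame at different SubD levels; it assumes the Pillow library is installed):

    from PIL import Image, ImageChops

    # Two renders of the same frame, differing only in SubD level.
    a = Image.open("render_subd3.png").convert("RGB")
    b = Image.open("render_subd4.png").convert("RGB")

    diff = ImageChops.difference(a, b)
    bbox = diff.getbbox()  # None means the images are pixel-identical

    # Average per-channel difference on a 0-255 scale; a tiny value means the
    # extra SubD level bought essentially nothing visible.
    total = sum(sum(px) for px in diff.getdata())
    mean_diff = total / (diff.width * diff.height * 3)

    print("Differing region:", bbox)
    print(f"Mean per-channel difference: {mean_diff:.3f} / 255")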

    I never use a 4K HDRI unless I just need light. If it appears in the render at all I use 8K, as anything else looks awful.

  • takezo_3001 Posts: 1,998
    edited June 2020

     

    kenshaw011267 said:

    If by high realism you mean excessive SubD (what else could you possibly mean?), then you're hamstringing yourself for no gain. I've checked: a SubD 3 and a SubD 4 figure are indistinguishable to the eye, and nearly so using image comparison software on an extreme closeup. But heavy mats and shaders? If you're using 4K maps for lots of stuff, you are again pointlessly hamstringing yourself. Unless you are doing closeups so tight that you should see the character's pores, you won't see a difference.

    I never use a 4K HDRI unless I just need light. If it appears in the render at all I use 8K, as anything else looks awful.

    Yes, I do use 2-3 SubD levels, as the low-poly artifacts are pretty ugly. I also use the HDRI pics as a background, hence the 4-8K resolution, not to mention my 4K multi-texture maps. I do multiple renders that have close-ups and far shots (about 2-3 feet for the far shots), as I post in a "photo-shoot styled" series of pics; however, for my action/fantasy compositions/animations I can get away with lighter texture/SubD requirements.

    Thanks though, for the suggestions!smiley

    Again, this will be moot once I can get an alleged 12GB Ti, though I won't know the true specs for sure until August/September, assuming the announcements are around that time. Who knows at this point? I hate the wait!

    SUB-D 1.png
    1404 x 810 - 728K
    SUB-D 2.png
    1415 x 778 - 1014K
    Post edited by takezo_3001 on
  • nonesuch00 Posts: 18,333

     

    kenshaw011267 said:

    Anyone who doubts that such claims were definitely made about Navi before 7/7/2019 can easily enough use Google.

    The sites that spend almost all of their time pushing these rumors are wrong more often than not. They make money not by being reliable but by screaming headlines that get gullible people to click through. Just examine the absurd, and contradictory, "leaks" about Ampere. There was a made-up set of specs circulating, which got posted here, claiming the cards were just a generational improvement in performance but also consumed 10 to 25% more power while being on a smaller process node. Anyone, and I do mean anyone, who knows computer hardware would have known that was garbage. But there it was.

    I'm not going to Google that. Everything I read was from November 2019 and later, as I was researching a desktop build for myself. Anyway, I learned long ago that the news media implies a lot to escape culpability for their innuendo, and I treat them that way. I had never read that TweakTown guy before, but he was very concise and clear.

  • Sevrin Posts: 6,310

     

    nonesuch00 said:

    I'm not going to Google that. Everything I read was from November 2019 and later, as I was researching a desktop build for myself. Anyway, I learned long ago that the news media implies a lot to escape culpability for their innuendo, and I treat them that way. I had never read that TweakTown guy before, but he was very concise and clear.

    Being concise and clear is not a substitute for being accurate, though.  Right now, all these people are doing is selling clicks.

  • nicstt Posts: 11,715
    Sevrin said:

     


    Being concise and clear is not a substitute for being accurate, though.  Right now, all these people are doing is selling clicks.

    Indeed, it can be very clear rubbish; we just don't know, which is what those stating what is going to happen as gospel don't get.

    Some of it will turn out to be correct, some less so. Some not at all.

  • nonesuch00 Posts: 18,333
    Sevrin said:

     


    Being concise and clear is not a substitute for being accurate, though.  Right now, all these people are doing is selling clicks.

    When he states a rumour he points it out, and when he doesn't he says it's "confirmed". And his articles are short too. I like him as a tech journalist. Since it's clearly tech fluff journalism I don't care so much; had it been supposed to be 'real news' then I wouldn't read it, as I haven't been reading or watching any 'real news' from anyone anymore.

    I do wonder how all these tech fluff sites stay in business, though. I know 'real news' is a loss-laden business subsidized by billionaires, but I don't think these tech fluff sites are. indecision

  • ScarletX1969 Posts: 587
    edited July 2020

    So, I've gotten the 2070 Super with 8GB of VRAM. First takeaways...

    Firstly, my rig has three video cards: a GTX 1650, a GTX 1050 Ti, and my new addition. A recent image that I created (it's in my gallery smiley) was used for aggressive testing alongside the 1650. The resolution is 3000 x 1525.

    1. With just the 2070 Super - rendering time was 4 hours
    2. With just the 1650 and 1050TI (originally I had a GTX 960 as my main card) - rendering time was 6 hours
    3. With the 2070 and the 1650 - rendering time was 3 hours and 18 minutes.

    The 2070 Super is definitely faster than my 1650, but they also seem to render quite nicely together. There is another graphic on the Internet that is used for benchmark testing with various cards, so I used it to test mine. The 2070 rendered it in 7 minutes, the 1650 in 18 minutes, but together it took 4 1/2 minutes.
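
    A quick bit of arithmetic on the times quoted above puts the relative speeds in perspective (a minimal Python sketch; the numbers are just the figures from this post, not an official benchmark):

    # Render times for the benchmark image, in minutes, as quoted above.
    times_min = {
        "2070 Super alone": 7.0,
        "GTX 1650 alone": 18.0,
        "2070 Super + 1650": 4.5,
    }

    baseline = times_min["2070 Super alone"]
    for setup, minutes in times_min.items():
        print(f"{setup}: {minutes:g} min ({baseline / minutes:.2f}x the 2070 Super alone)")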

    So I will be getting another 2070 Super to replace the 1650 later this year and I will be good for producing renders for my graphic novel.

    Post edited by ScarletX1969 on
  • kenshaw011267 Posts: 3,805

    ScarletX1969 said:

    So I will be getting another 2070 Super to replace the 1650 later this year and I will be good for producing renders for my graphic novel.

    The 2070 Super has an NVLink connector. If you have two, you should consider getting the bridge to let you pool VRAM for bigger images.

  • ScarletX1969 Posts: 587
     

    kenshaw011267 said:

    The 2070 Super has an NVLink connector. If you have two, you should consider getting the bridge to let you pool VRAM for bigger images.

    Is that sold separately?  I will definitely get it.

  • kenshaw011267 Posts: 3,805
     

    ScarletX1969 said:

    Is that sold separately? I will definitely get it.

    Yes, and it's not cheap, about $100 US. I would do your best to get the same brand of 2070 Super if you intend to try it. SLI only works on exactly matched cards, and there is no information on whether NVLink requires that as well.
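
    If you want to double-check what the driver sees before spending on a bridge, a small Python sketch like this lists the installed cards (it assumes nvidia-smi is on the PATH; whether NVLink pooling really requires identical boards is, as noted above, not confirmed):

    import subprocess

    def gpu_models() -> list[str]:
        """Return one 'name, memory' line per detected GPU."""
        result = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        return [line.strip() for line in result.stdout.splitlines() if line.strip()]

    models = gpu_models()
    print(*models, sep="\n")
    if len(models) > 1 and len(set(models)) == 1:
        print("Cards report the same model and VRAM - a good sign for bridging.")
    else:
        print("Cards differ (or only one was found) - check before buying a bridge.")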

  • ScarletX1969 Posts: 587
     

    kenshaw011267 said:

    Yes, and it's not cheap, about $100 US. I would do your best to get the same brand of 2070 Super if you intend to try it. SLI only works on exactly matched cards, and there is no information on whether NVLink requires that as well.

    I just spent another arm and leg on another 2070 Super and an EVGA NVlink bridge.  So I will post some results on how NVlink works in Daz Studio...if it's supported at all.

  • kenshaw011267 Posts: 3,805

    NVLink is supported. People have gotten 2080ti's working and Daz says they tested it during the beta.

  • ScarletX1969 Posts: 587

    NVLink is supported. People have gotten 2080ti's working and Daz says they tested it during the beta.

    2080TIs are expensive...oh lord!  I would love to see the benchmarks.

  • Ghosty12 Posts: 2,068

    NVLink is supported. People have gotten 2080ti's working and Daz says they tested it during the beta.

    2080TIs are expensive...oh lord!  I would love to see the benchmarks.

    Yes, 2080 Tis are expensive, but they will never match the "ooh boy, that's expensive" cost of these: https://www.centrecom.com.au/leadtek-quadro-rtx5000-pcie-16gb-gddr6-work-station-graphics-card . And they are not even the most expensive of the Quadro line; the RTX 8000 is the cost of a fairly decent car. :)

  • ScarletX1969 Posts: 587

    Yeah, I think I'll pass on that for now...lol

     
