(Rumormill) The New Nvidia 3000 Series Cards..

Ghosty12 Posts: 2,068
edited June 2020 in The Commons

Just came across this video from Gamer Meld showing the supposed information on the new Nvidia Ampere cards.. Also it seems that the leak of the Founders card design is supposedly true..

But on to the leaked information for the first three cards:

Ampere Titan (Maybe but not 100% on that..)

3090 supposedly will take the place of a 3080 Ti..

3080 as per usual..

So they do look good. What will be interesting is the cost of the cards, as it is stated in the video that the heatsink/cooler alone is supposed to cost $150..

Okay, some changes to the leaks/rumors: it seems that Nvidia could be doing away with the Ti models..

The video does ramble on a bit but the "leaked" information is in there..


Comments

  • kenshaw011267 Posts: 3,805

    Those are absurd and clearly wrong.

    Ampere is a die shrink which should mean less power required. The Titan equivalent is supposedly 70W higher with all the other reported stats the same as the RTX Titan. That's just absurd.

     

  • Ghosty12 Posts: 2,068
    edited June 2020

    That is why I put rumor in the title and supposed in the OP, at least until we get more reliable information.. Also, if a mod is around, could they either delete this thread or merge it with my other thread please, as I forgot I had made a thread about this last month, thank you.. :)

  • kyoto kid Posts: 41,260

    ...end of the Titan line? We'll see. If so, the Turing Titan RTX will remain on my list as it offers the TCC mode option, which the gaming enthusiast cards do not. Not really buying the hype and speculation.

  • Ghosty12 Posts: 2,068
    kyoto kid said:

    ...end of the Titan line? We'll see. If so, the Turing Titan RTX will remain on my list as it offers the TCC mode option, which the gaming enthusiast cards do not. Not really buying the hype and speculation.

    Yeah, it will be a case of wait and see what the actual specs are when Nvidia decides to release them. My main interest is the supposed amount of VRAM on the 3080, which if true is a fairly good step up from the 2080.. What will be interesting is how much goes on the 3080 Ti, as what is "stated" there is not much of an upgrade over the 2080 Ti..

  • kenshaw011267 said:

    Those are absurd and clearly wrong.

    Ampere is a die shrink which should mean less power required. The Titan equivalent is supposedly 70W higher with all the other reported stats the same as the RTX Titan. That's just absurd.

    A die shrink only means less power at the same frequency. Performance-per-watt is everything, and they could have used the headroom afforded by the 7nm process to push clock speeds for more performance. There's nothing absurd or clearly wrong here; they didn't violate any law of physics because they didn't quote the clock speed. We can only wait and see.
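    To put rough numbers on that (made up, purely to illustrate the scaling, not real Turing or Ampere figures): dynamic power goes roughly as capacitance x voltage^2 x frequency, so a shrink can either cut power at the same clock or be spent on higher clocks at a similar or even higher TDP.

        # Rough dynamic-power sketch with made-up numbers; P ~ C * V^2 * f.
        def dynamic_power(cap, volts, freq_ghz):
            """Relative dynamic power in arbitrary units."""
            return cap * volts**2 * freq_ghz

        baseline      = dynamic_power(cap=1.00, volts=1.00, freq_ghz=1.8)  # old process
        shrink_same_f = dynamic_power(cap=0.75, volts=0.95, freq_ghz=1.8)  # shrink, same clock
        shrink_pushed = dynamic_power(cap=0.75, volts=1.05, freq_ghz=2.2)  # shrink, clocks pushed

        print(f"baseline             : {baseline:.2f}")       # 1.80
        print(f"shrink, same clock   : {shrink_same_f:.2f}")  # ~1.22, i.e. less power
        print(f"shrink, higher clock : {shrink_pushed:.2f}")  # ~1.82, back at or above baseline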

  • This is terrible news, if even remotely true. Now you probably can't run 4 GPUs and a CPU off of a normal 1600W wall socket.
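    Back-of-the-envelope on that, with every number an assumption (the rumored ~350W per card plus guesses for the rest):

        # Rough wall-draw estimate for a hypothetical 4-GPU box; all figures are assumptions.
        gpu_tdp_w      = 350    # assumed per-card board power (rumored ballpark, not confirmed)
        cpu_w          = 150    # assumed CPU package power under load
        rest_w         = 100    # fans, drives, motherboard, misc (rough guess)
        psu_efficiency = 0.90   # roughly 80 PLUS Gold territory at this load

        dc_load   = 4 * gpu_tdp_w + cpu_w + rest_w
        wall_draw = dc_load / psu_efficiency

        print(f"DC load : {dc_load} W")        # 1650 W
        print(f"At wall : {wall_draw:.0f} W")  # ~1833 W, more than the ~1600 W figure above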

  • Ghosty12 Posts: 2,068

    Have posted some new leaked information and video in the OP..

  • LeatherGryphon Posts: 11,682
    edited June 2020

    This is terrible news, if even remotely true. Now you probably can't run 4 GPUs and a CPU off of a normal 1600W wall socket.

    3-phase anybody? Move your computer to your kitchen and plug it into the weird socket for your electric stove/range/cooker.

    For those people with an old IBM-370 in the basement, you're all set.

    While working as an HP consultant for government & industry back in the early '90s, we advised on high-level h/w and s/w big-iron HP systems in racks. One job in St. Louis for the power company had us installing and configuring a dual redundant system. I did the failover software configuration and my partner was uncrating and assembling the racks, cables, disks, printers, etc. Lots of work. I was busily planning my configuration while the other guy was crawling around inside the racks connecting wires. All of a sudden I hear this "POW" and smell blue smoke. It seems that the racks came wired for 3-phase 220V, but the CPU cage and its various controller boards came configured for single-phase 110V. In a normal situation the shape of the plugs would prevent disaster, but inside the rack, all the equipment had the same shape plugs that plugged into the power distributor box in the rack, regardless of whether they used 110V or 220V. But you had to make sure the right switches were switched! That day I was so glad I didn't have anything to do with the h/w end of things. The installation was set back a couple of weeks while the managers and lawyers and insurance companies figured things out. The very embarrassed technician never lived down the event and was thereafter known as "Sparky". But the silver lining to that black cloud is that a couple weeks later I got to spend another 3 days in St. Louis. The ride up the arch is fun the first time, but not worth repeating.

  • richardandtracy Posts: 5,951

    Flip. I hadn't realised single phase sockets in the US were limited to 1600W. Here in Europe it's very unusual to have 3 phase to a house, but the voltage is 220V +/- 10%. Sockets on the continent can deliver 2.4 kW and in the UK it's 3kW.

    I am surprised.
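    Those socket figures are basically volts times amps, with continuous loads usually derated; the amp ratings below are assumptions picked to roughly match the numbers quoted, and they vary by country and installation:

        # Socket capacity ~ volts * amps; amp ratings here are assumptions and vary by country.
        sockets = {
            "US 120 V / 15 A": (120, 15),
            "EU 230 V / 10 A": (230, 10),
            "UK 230 V / 13 A": (230, 13),
        }
        for name, (volts, amps) in sockets.items():
            peak = volts * amps
            continuous = peak * 0.8  # common continuous-load derating (e.g. the US NEC 80% rule)
            print(f"{name}: {peak} W peak, ~{continuous:.0f} W continuous")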

  • Gusf1 Posts: 257

    This is terrible news, if even remotely true. Now you probably can't run 4 GPUs and a CPU off of a normal 1600W wall socket.

    Just get an electrician to upgrade the wiring AND Breaker to 30 amps.
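    Rough headroom math on that suggestion, assuming a 120 V circuit and the usual 80% continuous-load rule:

        # Capacity of a 30 A circuit vs. the rough 4-GPU wall draw guessed earlier in the thread.
        volts, amps = 120, 30           # assumed 120 V; a 240 V circuit would double this
        peak        = volts * amps      # 3600 W
        continuous  = peak * 0.8        # 2880 W continuous
        build_draw  = 1833              # rough 4-GPU wall-draw estimate from earlier (assumption)
        print(f"30 A circuit: {peak} W peak, {continuous:.0f} W continuous")
        print(f"Headroom over the estimated build: {continuous - build_draw:.0f} W")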

                                  Gus

  • outrider42 Posts: 3,679

    Everything is still in a state of flux. What people need to understand is that the specs constantly change throughout development. They may have performance targets, but how they get there can change.

    I believe Nvidia is toying with the idea of a 3090 or 3080ti. If they plan on releasing upgraded "Super" versions down the road, a "3080ti Super" would be a stupid name. By calling the top card a 3090, that leaves the door open for several things. Changing the name can give them a reason to change the price, you see. And they can still offer a Titan, too.

    I also believe that both companies are playing a game of chicken with each other (similar to Sony and MS with their consoles.) I would even wager that some leaks are being released by AMD and Nvidia on purpose to gauge reaction, and some may be real, and some may be fake.

    I do think the top card, whatever it is called, will sport 24GB of VRAM, like the Titan RTX. They've already set that precedent, so it would be silly to do less. Then you have a 3080ti/3090 that may have 12GB, and a 3080 that may have 10GB. These may be true, but I think that is a bad idea. Both consoles will sport more VRAM than these last two. Some people think I am crazy here, pointing out that consoles have no dedicated DDR RAM and that this would mean they have less RAM/VRAM overall. But ah, they don't pay attention. The PS5 is actually using its SSD as a replacement for DDR RAM. Yes, it is that fast. So the 13.5GB of memory that the PS5 has (the OS takes up 2.5 of the 16) can be used entirely as VRAM; it is not splitting VRAM and RAM. It doesn't need to. The VRAM in the PS5 will also only hold about 1 second worth of data. The PS4, by contrast, stored 30 seconds worth of data in its VRAM. This is fascinating, and it is all due to the SSD. The PS4 had to duplicate data in order to keep up, which led to both inflated install sizes and VRAM usage. The PS5 will keep it very trim, meaning its 13.5GB of VRAM can go a very long way. PC cannot compete with this currently. The only way PC can compete is by offering a boatload of VRAM, and yet if these rumors are true, the top cards from Nvidia will not do that.
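    Just laying those numbers side by side, using only the rumored and quoted figures above (nothing confirmed):

        # Memory budgets using only the figures mentioned in this post; purely illustrative.
        ps5_usable_gb = 16.0 - 2.5   # 13.5 GB of unified memory left after the OS reservation
        rumored_cards = {
            "3080 (rumored)": 10,
            "3080ti/3090 (rumored)": 12,
            "Titan-class (rumored)": 24,
        }
        for card, vram_gb in rumored_cards.items():
            print(f"{card}: {vram_gb} GB vs ~{ps5_usable_gb:.1f} GB usable console memory")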

    A die shrink is no guarantee that Wattages will be lower at the top. If Nvidia is afraid of AMD, and rumors suggest that maybe they should be, then they have a reason to push as hard as they can if they want to keep the performance crown.

    There is all sorts of speculation that AMD could beat Nvidia this time around, with Nvidia's saving grace being faster ray tracing. Just look at the new Xbox and PlayStation: these machines are hitting 10 and 12 TFlops, and they are consoles that might cost $500 or $600. The consoles are just the tip of the spear for AMD this year. Nvidia has to compete not only with AMD GPUs, but also AMD consoles. A good many people might just buy a console instead of a GPU upgrade. To entice people to buy GPUs, Nvidia is going to have to smash console performance AND do it at not-so-ridiculous prices. If Nvidia releases Ampere at higher prices than Turing, they will get destroyed by AMD, no matter how fast they are.
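    For reference, those 10 and 12 TFlop figures fall straight out of shader count times clock times two ops per cycle, using the console specs that had been announced at the time:

        # FP32 TFLOPS = shaders * clock (GHz) * 2 (FMA) / 1000, per the announced console specs.
        def tflops(shaders, clock_ghz):
            return shaders * clock_ghz * 2 / 1000

        print(f"Xbox Series X: {tflops(3328, 1.825):.2f} TFLOPS")  # 52 CUs x 64 shaders
        print(f"PlayStation 5: {tflops(2304, 2.23):.2f} TFLOPS")   # 36 CUs x 64 shaders, boost clock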

  • Gusf1 said:

    This is terrible news, if even remotely true. Now you probably can't run 4 GPUs and a CPU off of a normal 1600W wall socket.

    Just get an electrician to upgrade the wiring AND Breaker to 30 amps.

                                  Gus

    Sure, thanks for the suggestion. But that's additional cost on top of everything, and then there's the cost of those types of rigs; they're not regular PCs anymore but specialized rigs with a specialized cost. Not to mention how nervous I would be flipping the switch for the first time, or the further damage to my electric bill :) I'm going to have to seriously look at EEVEE.
  • outrider42 Posts: 3,679

    I would think two of these will suffice for most things. I expect traditional performance will be around 40-50% higher than a 2080ti, if not more in some situations. But the ray tracing will be much faster, like 2 or 3 times faster, and Iray benefits more from ray tracing than most. So actual performance for Iray should be much higher than just 40-50%. Just look at the jump from a 1080ti to a 2080ti: in gaming it was a typical 30% or so, but for Iray the 2080ti is nearly TWICE as fast as a 1080ti. I have two 1080tis, and my current benchmark tests are slower than a single 2080ti. That is a pretty serious jump.
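    Putting rough numbers on those claims (illustrative guesses pulled from the text above, not actual benchmark results):

        # Relative Iray throughput sketch; every figure here is a guess, not a measurement.
        p_1080ti    = 1.0                    # baseline, arbitrary units
        p_2080ti    = 1.95                   # "nearly twice as fast" for Iray
        two_1080tis = 2 * p_1080ti * 0.95    # assuming ~95% multi-GPU scaling in Iray

        ampere_low  = p_2080ti * 1.4         # if the top card is ~40% faster than a 2080ti
        ampere_high = p_2080ti * 2.0         # if Iray roughly doubles thanks to faster ray tracing

        print(f"Two 1080tis: {two_1080tis:.2f}x   One 2080ti: {p_2080ti:.2f}x")
        print(f"Top Ampere (speculation): {ampere_low:.2f}x to {ampere_high:.2f}x a single 1080ti")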

    Now I don't know if the top Ampere will double the 2080ti at Iray, but I think it might get close. Especially at more geometrically complex scenes where ray tracing really benefits.

    Regardless, I think Ampere is going to be a big leap. Will it be as big of a leap as Turing? Maybe. Turing's big problem was the price. We shall see what Ampere will cost. I personally don't think it will go up; AMD is going to be competing this time around and might even release its new lineup before Nvidia does. If AMD beats the 2080ti easily, and beats the 2080ti on price, then Nvidia will be forced to compete on price.

    So ultimately, all eyes are on AMD. If you want good GPUs at good prices, you should root for AMD to be extremely competitive.

  • PJWhoopie

    outrider...

    But will the new AMD cards be a player for use in Daz?  Even if they have some sort of Ray Tracing?  I thought/heard that they wouldn't be.  
    If so, does that mean we are rooting for AMD because it will hopefully make the new Nvidia cards cheaper (but still our only daz/iray option?)

  • mclaugh Posts: 221
    PJWhoopie said:

    But will the new AMD cards be a player for use in Daz? 

    Doesn't matter. A rising tide lifts all boats. NVIDIA has to compete with AMD on price in its primary market (gaming). NVIDIA's price and performance have to be at least on par with AMD's. If iRay performance (both speed and render quality) lags noticeably behind ProRender, NVIDIA's going to lose market share to AMD, regardless of whether or not AMD cards are a meaningful option in DS.

  • nicstt Posts: 11,715
    edited June 2020

    Specs as in speculation or specification?

    The OP should be clear about what "specs" means.

     

    PJWhoopie said:

    outrider...

    But will the new AMD cards be a player for use in Daz?  Even if they have some sort of Ray Tracing?  I thought/heard that they wouldn't be.  
    If so, does that mean we are rooting for AMD because it will hopefully make the new Nvidia cards cheaper (but still our only daz/iray option?)

    Render elsewhere, I do.

    It is Iray's consistently poor performance, IMO, that has forced me to move, as opposed to doing what Nvidia hoped and buying a card.

  • Artini Posts: 9,724

    I think I will start saving for a PS5, and hopefully they will release Dreams for it when the PS5 becomes available for purchase.

     
