Nvidia Ampere (2080 Ti, etc. replacements) and other rumors...

Comments

  • marble Posts: 7,500

    That was of course a guess based on no knowledge of antitrust.

    Ok! Progress of a sort! But, unfortunately, your error based on your admitted "no knowledge" of the most important aspect of your own argument completely contradicts what you've been doubling and quadrupling down on for a while now.

    The makers cannot make any such price fixing deals any more. That's what they keep getting busted for doing.

    I think I am only one of three people to point out that a purchaser negotiating with a single vendor independent of others is not price fixing. It is the kind of negotiating that must exist for the free markets the DoJ needs to enforce to in turn exist.

     

    Exactly. I'm not one of the three, but I did mention negotiation, and it seems to me self-evident that such negotiation happens as a matter of course in all business transactions. It is the nature of business, from street traders to huge corporate deals. Negotiation is not price fixing; in some ways it is the opposite. At least that's how I see it. Anyone who has bought from sites like Alibaba.com will have seen different prices for different quantities (and those prices are still negotiable). I really don't see the point in arguing against this, as it is so obvious.

  • kenshaw011267 Posts: 3,805
    Sevrin said:

    Great, I give; believe anything you want. This is just my field.

  • mrposer Posts: 1,131

    Is there a dumbed-down thread for a DAZ user with an old computer that only renders in CPU mode who is looking to buy a new gaming computer that will work with DAZ and use GPU rendering? What is a decent Nvidia card to look for that will work well with DAZ and dForce clothes and hair?

     

  • tj_1ca9500b Posts: 2,057

    10 days 17+ hours to go!  I'd embed the timer, but 1) not sure how to do that and 2) might piss the moderators off...

    Anyways, mountainous grain of salt but...

    https://wccftech.com/nvidia-geforce-rtx-3090-flagship-ampere-gaming-graphics-card-pictured-1400-us-price/

    Price subject to change at a whim of course!  24 GB for around that price would be really cool, but if that Founders Edition card is indeed a triple-slot card, consider my enthusiasm muted!  I'd probably hold out for a 2-slot version... a liquid-cooled single-slot would be impressive, but the odds of that happening are probably less than 1%...

    Why would I want a single slot? 'Cuz I want to build to that quad GPU setup, with free space left over for other non-GPU cards of course!  Sure, there are PCIe ribbon cables, but those get janky...

    WCCFTech also has pricing for other card ranges in another 'huge mountain of salt' rumor, but since that can change at Jensen's whim on a dime, no point getting people's hopes up.  Plus it's WCCFTech, so some people won't care in any case.

    So yeah, 10 Days, 17 hours+ (as of this post)...

  • If the power rumors are true, that'll make running 4 GPUs more of a challenge than it already is, and I'm not even sure Blender will grok two sets of two NVLinked GPUs, even if the Linux Nvidia driver does. Pretty sure I'm going to use two systems of two GPUs each, on separate 20A circuits.

  • tj_1ca9500b Posts: 2,057
    edited August 2020

    There are also rumors out there that these GPUs are being built at Samsung's fabs.

    The video above is by Jim at AdoredTV, for those who might be curious.

    Post edited by tj_1ca9500b on
  • TheMysteryIsThePoint Posts: 3,027
    edited August 2020

    Wow, that was a good analysis that I was actually able to follow. It supports the 400W rumor, and that means no 15A circuit could support even the GPUs alone. And the guy who wrote E-Cycles told me that he sees even an RTX 2080 Ti intermittently pull 400W. I think 4-GPU setups are going to move into the professional-only category...
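
    A quick sanity check on that circuit claim, as a minimal sketch (assuming standard North American 120V/15A circuits and the usual 80% continuous-load derating; the 400W-per-GPU figure is the rumor, not a spec):

    ```python
    # Can a 15A/120V household circuit feed four rumored 400W GPUs?
    VOLTS = 120           # standard North American circuit voltage
    AMPS = 15             # breaker rating
    CONTINUOUS = 0.80     # typical derating for continuous loads

    circuit_watts = VOLTS * AMPS                   # 1800 W absolute max
    continuous_watts = circuit_watts * CONTINUOUS  # 1440 W sustained

    gpu_watts = 4 * 400   # four GPUs at the rumored 400 W each

    print(f"Circuit (continuous): {continuous_watts:.0f} W")
    print(f"GPUs alone:           {gpu_watts} W")   # 1600 W > 1440 W
    ```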

    Post edited by TheMysteryIsThePoint on
  • billyben_0077a25354 Posts: 771
    edited August 2020

    I know everyone is picking at each other about the memory and list price vs volume, and Kenshaw talks about math.  If Nvidia puts 24GB on the 3090 and decides to build 100,000 units, that is a lot of memory.  Based on a leaked PCB photo, the card will have memory on the back of the board, so IF it is 24GB as some rumors have stated, it will have 24 1GB modules.  For 100,000 cards that is 2.4 million memory modules at $13.00 a chip.  That is $31.2 million of GDDR6X memory.  I would think that if Nvidia is buying that much memory, they would get a better price than $13.00 a gig.
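
    That arithmetic checks out; here it is as a minimal sketch (the $13-per-chip figure and the 100,000-unit run are the thread's assumptions, not confirmed numbers):

    ```python
    # Rumored GDDR6X bill-of-materials estimate for a 100,000-unit run
    cards = 100_000
    modules_per_card = 24     # 24 x 1 GB chips, per the dual-sided PCB rumor
    price_per_module = 13.00  # USD per 1 GB chip, as assumed above

    total_modules = cards * modules_per_card       # 2,400,000 chips
    total_cost = total_modules * price_per_module  # $31,200,000

    print(f"{total_modules:,} modules -> ${total_cost:,.2f}")
    ```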

    Post edited by billyben_0077a25354 on
  • tj_1ca9500b Posts: 2,057
    edited August 2020

    I'll be very curious to see how well these new cards react to undervolting.  Even so, I could live with, say, 2 double-slot cards, with my remaining PCIe slots (except the 2nd & 4th, sitting under the GPUs) available for other things like capture cards, adapter cards, etc...

    It's not like I'm going to be able to afford 4 GPUs anytime soon anyways, unless they are really cheap...

    Part of me wants to rebel and embrace 64 core Threadripper with a token GPU for desktop duties.  Sure, the renders will be a bit slower as they would be CPU only, but running out of memory shouldn't be an issue at that point, since I'd have at least 64 GB of memory installed in such a setup.

    Dual 64 core EPYC Romes could also be interesting, but that's out of my price range.

    Anyways, I'm very curious as to how Nvidia's new product stack will shake out.

    Post edited by tj_1ca9500b on
  • outrider42 Posts: 3,679
    I'll just mention this: one poster of this information is saying that it comes from a source who has always been accurate in the past. That source says the 3090 is 24gb for $1400. This matters because his own credibility is on the line if they are wrong. That same source also says the 3080 will have both 10 and 20gb versions, with one being $800. He actually made it sound like the $800 3080 was the 20gb one, but I have difficulty believing that. I would assume the $800 one is 10gb, and the 20gb may be $900. But hey, who knows, maybe the 10gb is actually $700, which would be rad. That would place it near the classic 1080ti, and it would certainly be better received by gamers.

    I will also point out that not one person discussing the tech is saying that $1400 is too low for a card with 24gb. In fact, it is nearly the opposite: many are complaining that it is too high, LOL. Many of these people also know how this industry works and how much stuff costs. So I find it quite odd that none of them are talking about VRAM prices making these price rumors impossible.

    We also have more pictures now, which again point to 24gb and a 3-slot design. Why is it 3 slots? Because it has components on the other side of the board, just like I described earlier. Basically, having components on the flip side adds another slot to the width of the card. It is all adding up, and many of the age-old rumors are starting to look correct. The rumors talked about 24gb months ago. We have also had core counts for a while, too. The rumors about stuff on the other side are at least a month old. Many of these rumors are not new, but now they are coming together. Frankly, the only thing we don't know is when they are launching. The price can still be up in the air; Nvidia is probably monitoring social media for reactions right now. But so far, most of the rumored lineup seems to have prices close to their Turing predecessors. The 3090 is an exception, and as I already said earlier, that is because there was no 2090 before this. It is like how Intel created the i9 series, which is more expensive than the i7 used to be, and AMD did the exact same with their R9 series. A change of name usually means a new price tier.

    Nvidia's design is pretty wild. I would imagine that board partners will have more normal designs. So yeah...maybe wait for those. The AIBs are usually better anyway.

    I personally am looking at EVGA, because I want to get their extended warranty, which is one of the best. My big fear with these GPUs is that if they run hot and use an untested Samsung fab, they may have some reliability issues. The 2080ti had some issues at launch: a German retail site published its sales and RMA data, and the 2080ti had the highest RMA rate among the models sold. I believe it was 5%. That's 1 out of every 20. EVGA offers up to 10-year warranties. 10 years! I'm not even going to keep it that long, LOL.
  • Ghosty12 Posts: 2,068
    edited August 2020

    And the latest rumor, with an image, is that the 3090 is supposed to be a triple-slot card, and that the special 12-pin power connector will supposedly allow up to 600 watts of power as well. On top of that, the type of RAM being used, GDDR6X or whatever it is called, supposedly has heat issues. Which means, if true, this generation of Nvidia cards is, well, yikes in a lot of departments.

    Post edited by Ghosty12 on
  • kyoto kid Posts: 41,256
    edited August 2020

    ...well if that's true, I definitely will keep saving up for that Turing Titan RTX.  I'd have to build a whole new system, as well as have the flat rewired, for a monster card like that.

    Post edited by kyoto kid on
  • I have used EVGA stuff for years.  My last three video cards were all EVGA, including my current GTX 1070.  I also now have a SuperNova 1200 PSU, so I am ready for RTX 3000.  I just wish that EVGA made AMD motherboards.

    Ghosty12 said:

    And the latest rumor, with an image, is that the 3090 is supposed to be a triple-slot card, and that the special 12-pin power connector will supposedly allow up to 600 watts of power as well. On top of that, the type of RAM being used, GDDR6X or whatever it is called, supposedly has heat issues. Which means, if true, this generation of Nvidia cards is, well, yikes in a lot of departments.

    I hope not.  If the triple width is all on the normal fan side then okay, but if they sandwich the PCB with coolers on both sides, a lot of boards will not fit the card because the memory slots will interfere with the back-side cooler.  At least, that is how it looks on my Gigabyte board.

  • outrider42 Posts: 3,679
    edited August 2020

    I have a feeling the dual sided thing is just Nvidia's Founder's Edition. There was a supposed pic of a 3rd party board with a more traditional 3 fan design. If there are no fans on the back, then the back should be clean. 

    The curious thing is why Nvidia would do this on their own card. It sounds a bit crazy. Why didn't they use 2GB chips of VRAM instead of placing 1GB chips on both sides like that? Doing it just for cooling seems odd to me; unless that VRAM really does get crazy hot, that would be the only explanation I can see. A 2GB chip may produce more heat, so they used 1GB chips and split them between the two sides of the board. But this raises yet more questions: if the AIBs are not doing this dual-sided design, then surely the AIBs are using 2GB chips. So how well does that work?
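
    For context on why 24 chips would mean parts on both sides at all: GDDR6/GDDR6X chips use a 32-bit interface, so the minimum chip count is fixed by the bus width. A minimal sketch, assuming the rumored 384-bit bus:

    ```python
    # How many GDDR6X chips does 24 GB on a 384-bit bus imply?
    bus_width = 384   # rumored memory bus width, in bits
    chip_width = 32   # GDDR6X chips expose a 32-bit interface
    channels = bus_width // chip_width  # 12 chip positions per side

    target_gb = 24
    for density_gb in (1, 2):  # 1 GB (8 Gb) vs 2 GB (16 Gb) chips
        chips = target_gb // density_gb
        layout = "dual-sided (clamshell)" if chips > channels else "single-sided"
        print(f"{chips} x {density_gb} GB chips -> {layout}")
    ```

    So 24 GB built from 1GB chips forces the clamshell layout, while 2GB chips would fit on one side, which is exactly the trade-off being speculated about here.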

    I also cannot imagine this using quite that much power. Oh, I am sure it will eat up some electricity, don't get me wrong. But 400 Watts seems too extreme; they would be pretty desperate to ship with that. They would have to be pretty concerned about AMD's competition to do that. But hey, I did tell you guys that Nvidia is hell-bent on staying on top of AMD. They do not want to lose that performance crown. That is why they are launching now, and why they might just launch a crazy bonkers hot GPU. But the numbers they are talking about for this 12-pin connector are its max. I think if it were to use that much, it might melt the connectors, LOL.

    The hottest GPU in recent memory is in fact the Radeon VII. It hit right at 300 Watts in spite of being on 7nm. Nvidia has generally been under the 250 Watt level with their top GPUs for years. You have to go all the way back to Fermi, like Tom mentioned in his video, to find a hotter Nvidia GPU.

    So I suppose the 3090 could break 300 Watts, but I don't think it will break 400. Maybe one of the AIBs will do something absurd, or people will when overclocking, but at stock I don't see it. There is also a part of me that wonders if the 12-pin is a bit of a smoke screen to throw people off.

     


    I wrote about my EVGA experience somewhere here before, but my EVGA 1080ti was killed by lightning last year. They replaced it super quick considering we are on opposite sides of the country. I bought the 1080ti used off eBay, so I am not the original owner, but they treated me great. The tech I spoke to answered every question I had clearly. They will replace the product as many times as it breaks over the warranty. This stands out to me, because I do repair work, and in my field the extended warranty ends if a unit is replaced. So that is quite comforting. I actually bought that 1080ti knowing it had some warranty left, but I did not expect to use it. So I was already high on them, and that experience got me on board. And yeah, it would be great if they did some AMD stuff. At least motherboards; that shouldn't anger Nvidia.

    Post edited by outrider42 on
  • Ghosty12 Posts: 2,068
    edited August 2020

    I do not know how genuine the images in these articles are, but if real, the 3090 is a massive card. They compare it to a 2080 and, well, yes, if real, one is going to need a huge case to put it in. Then there is the question of what sort of strain there would be on the PCIe slot and motherboard.

    https://linustechtips.com/main/topic/1237143-the-rtx-3090-is-a-colossal-triple-slot-graphics-card/

    https://videocardz.com/newz/nvidia-geforce-rtx-3090-graphics-card-pictured

     

    Post edited by Ghosty12 on
  • nicstt Posts: 11,715

    One of the reasons I use a case that can be converted into a test rig: the motherboard lies flat.

  • nonesuch00 Posts: 18,320

    Right now I have a dual-slot MSI Radeon RX 570 8GB and a dual-slot PNY GeForce GTX 1650 Super 4GB, but only a mini-ATX motherboard in a mini-ATX case, and it's very disappointing that I can't use both GPU cards in the case at the same time, even though one of them would be on a PCIe x16 2.0 slot and not a PCIe x16 3.0 slot. There is not enough space. I have only 3 PCI slots, and one of those is the very short PCIe x4 slot.

    So the 3-slot design doesn't bother me. Once an expansion card is over 1 slot on my computer (2, 3, or even 4 slots), it's all the same result: I can use only one expansion card. I could, however, run a 2-slot card in the top main PCIe x16 slot and a single-slot expansion card in the bottom PCIe x16 2.0 slot.

    As far as power usage being 300W, I guess it's not a problem; after all, they moved the release date from mid-September or later to the end of August, and that's not something you do if your design is overheating and burning up computers. I don't believe the rumours of 400W, as that's likely the math of overzealous GPU overclockers and not empirical data measured directly from the new Ampere GPUs.

  • outrider42 Posts: 3,679
    Don't worry about PCIe versions. Iray does not make much use of the bus: you can run Iray at full speed on PCIe 2.0, or on only x8, and this has been proven in testing. So I don't think anybody who focuses on Iray will have any concerns. PCIe speed mostly affects software like gaming, so this is only a concern for gamers. Right now few games can really saturate PCIe 3.0, however that may change in the next year or so. Horizon Zero Dawn's PC port has shown numbers in benchmarks that seem to suggest it is a game that is more sensitive to PCIe speed. I would imagine that more console ports in the future will be as well.
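
    For reference, the bandwidth gaps under discussion are easy to put numbers on. A minimal sketch using approximate per-lane PCIe rates (after encoding overhead):

    ```python
    # Approximate usable PCIe bandwidth per lane, in GB/s
    PER_LANE_GBS = {"2.0": 0.500, "3.0": 0.985, "4.0": 1.969}

    for gen, rate in PER_LANE_GBS.items():
        for lanes in (8, 16):
            print(f"PCIe {gen} x{lanes}: ~{rate * lanes:.1f} GB/s")
    # Note that PCIe 2.0 x16 and PCIe 3.0 x8 both land around ~8 GB/s,
    # which matches the testing mentioned above: once the scene is in
    # VRAM, Iray barely touches the bus.
    ```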

    The bigger concern is the sheer size and weight of the card. We may need a support bracket to hold this thing in place so it doesn't bend the slot too much. But even this has been tested: GamersNexus actually did a test that tried to simulate GPU "sag", where the GPU was sagging in the case. It didn't affect anything. Maybe it might slightly affect airflow if it sags so much that it is too close to an obstruction, like the bottom of the case or another PCIe card. Otherwise it's not a big deal, other than appealing to your vanity.
  • nicstt Posts: 11,715

    It's the long-term effect on the motherboard; of course, many might upgrade before that happens, but I prefer not to have it as even an unlikely problem.

  • outrider42 Posts: 3,679

    So that 12-pin connector everybody is talking about is indeed real, but it is not quite what most expected. The leaked photo compares its size to a standard 8-pin.

    It's so cute! The 12 pins don't even take as much space as a single 8-pin does. According to the leak, this 12-pin is only on the Founder's Edition. They do claim that 3rd-party boards may have three 8-pin plugs.

  • outrider42 Posts: 3,679

    Not me, I got a 1000 Watts baby! I got that a couple years ago just to have extra juice for Daz multiple GPU setups.

    I believe the GPUs below the 3090 are going to be more in line with their Turing counterparts in terms of power use. So the 3080 should be in range of the 2080's wattage, and still be faster than a 2080ti. People looking at the 3070 will probably be quite happy, too. Much of this talk is about the top card, the 3090. With the 3090, Nvidia has gone all out and cranked everything to the max. This is not something they have done for many years. IMO this shows just how much Nvidia wants to keep the performance crown from AMD, like I have said several times now. I'm sure they could have shipped a GPU with less power and less performance as their top card. But they are concerned about AMD taking that crown away from them. So they increased performance as much as they could, and this performance uses more power. The 3090 is the only one that has this 12-pin connector.

    The article above does say that there may be 3rd party boards with 2 8 pin connectors, so if true that would be closer to past top cards.

  • nicstt Posts: 11,715

    But a twelve pin connector is less than 2 x 8 pin, no?

    If this is the case, it sounds more like a marketing ploy.

  • outrider42 Posts: 3,679
    nicstt said:

    But a twelve pin connector is less than 2 x 8 pin, no?

    If this is the case, it sounds more like a marketing ploy.

    I believe they are using one 8-pin connector and this 12-pin connector, so it is still drawing from 3 x 8-pin connectors off the power supply; at least that is my understanding, and it could be wrong. But I did say this could be a smoke screen. Plus, power supply makers would support this new format since Nvidia is the market leader. That would mean new power supplies will probably have some kind of big sticker on them promoting that they use the new Nvidia 12-pin, which is basically extra marketing.

    It is going to be quite interesting how this goes. I have a hard time believing that Nvidia would really put a 400 Watt GPU on the consumer market. People are going to laugh at that, as some already are. I do think it will use more power than we are used to, but not so drastically.

    I actually bought a Kill-A-Watt device recently and I have been playing around with it. My PC has two 1080tis, several hard drives, and an older i5. The 1080ti uses 250 Watts according to testing by other sites; it could push beyond that if overclocked really hard. With Iray my system will hit 450 Watts, and it never breaks past 500, and this is the entire system with two 1080tis installed. So I feel confident that a 3090 at 50 Watts more is not going to be a real problem. That would push my total system power beyond 500 Watts. Even in the worst-case situation where a 3090 hits 400 Watts, that would still only raise my system to around 600 Watts or so. And that is assuming I keep one 1080ti installed. I haven't decided if I am going to jump on a 3090, and if I do, whether I'd keep a 1080ti installed. If it really is $1400 and performs like the rumors say, it would be a huge upgrade.
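
    The same estimate as a minimal sketch (the wattages are the thread's measured and rumored figures, not official specs):

    ```python
    # Estimate wall draw if one 1080 Ti is swapped for a 3090
    measured_total = 450  # W at the wall under Iray, two 1080tis (Kill-A-Watt)
    old_gpu = 250         # rated board power of the 1080 Ti being replaced

    for new_gpu in (300, 400):  # rumored 3090 power range
        estimate = measured_total - old_gpu + new_gpu
        print(f"3090 @ {new_gpu} W -> ~{estimate} W at the wall")
    # ~500 W and ~600 W, the same ballpark as the figures above
    ```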

  • nicstt Posts: 11,715
    edited August 2020

    I have a good 1200W PSU that is under-utilised, even with the Threadripper and two GPUs.

    If it is a genuine card, I believe it's more about marketing (and giving themselves headroom in case they need a 3090ti) than anything else, apart from the said 3090ti.

    I have zero interest in a triple slot card; it's going to have to be cheap and powerful to get me to consider it.

    I really like how my first gen Threadripper handles renders in Blender.

    Post edited by nicstt on
  • outrider42 Posts: 3,679

    According to a new rumor, the exact size is 310 mm in length and 2.75 slots for the Founders. 3rd parties could be almost anything. There very well could be a 2-slot 3rd party card with 2 x 8-pins that is not clocked as high. I wouldn't mind that myself if it's close.

    I do not think there will be a 3090ti, because there is no way they can go higher. Kind of like there was no Super version of the 2080ti. It is only missing like 100 or so cores of the full die, and the 3090 may already be pushing past 300 Watts. So there is nowhere to go to get more performance. The only room above the 3090 is the Titan itself, which they are saying may have 48gb of VRAM. But the Titan would probably cost $3000+.

    The only possible refresh would be switching nodes. If the 3090 is indeed on Samsung, a switch to TSMC's better node should provide improved performance. But these nodes are not completely compatible; to switch nodes like that would require a new design. It is possible that Nvidia has already designed these cards but TSMC did not have enough capacity for them; that has been a big rumor for a while as to why they could be using Samsung. TSMC is supposedly going to have more capacity in 2021.

    TSMC's role in this cannot be overstated. They are the most important chip maker in the world now, and everybody wants chips made by them. So companies are battling it out for capacity, since TSMC can only produce so much. Nvidia took a gamble with Samsung, but may have been burned by that. AMD bought a lot of capacity from TSMC, which left Nvidia out. All this corporate musical chairs has led to the GPUs we will be seeing released soon, for better and worse.

  • outrider42 Posts: 3,679

    Well, now this site is saying that the Founder's card is just using the 12-pin connector: there are no other connectors on the board. That changes things drastically for power. The 12-pin can handle up to 300 Watts, and the PCIe slot can deliver another 75, so if the card indeed has just one 12-pin, then the absolute max power is 375 Watts. But nobody runs at the max, so this card is probably just around 300W if this is true. So that would totally be a big smoke screen, LOL. After all, why not use the old 2 x 8? I suppose the answer might be space, as this 12-pin is so much smaller that it only takes the space of one 8-pin. That would leave more room on the board for other things. But really, it seems silly, especially if the AIBs stick to the old 2 x 8-pin designs, which is happening.

    https://wccftech.com/nvidia-geforce-rtx-30-series-ampere-gaming-graphics-cards-12-pin-power-connector/
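
    The 375 Watt ceiling is just the sum of the per-connector limits. A minimal sketch using the standard PCIe allowances plus the 300W 12-pin figure quoted in the article:

    ```python
    # Max board power = PCIe slot allowance + auxiliary power connectors
    LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150, "12-pin": 300}

    def max_power(*connectors: str) -> int:
        """Slot allowance plus each auxiliary connector's rated limit."""
        return LIMITS_W["slot"] + sum(LIMITS_W[c] for c in connectors)

    print(max_power("12-pin"))                   # 375 W: rumored FE layout
    print(max_power("8-pin", "8-pin"))           # 375 W: typical AIB layout
    print(max_power("8-pin", "8-pin", "8-pin"))  # 525 W: the 3x8-pin rumor
    ```

    Notably, one 12-pin and the classic 2 x 8-pin come out at the same 375 W ceiling, which fits the suspicion above that the new connector is more about packaging than power.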

     

  • nicstt Posts: 11,715

    Marketing hype; folks are talking about it and not about AMD.
