Nvidia Ampere (2080 Ti, etc. replacements) and other rumors...


Comments

  • RayDAnt Posts: 1,147
    joseft said:
    fred9803 said:
    joseft said:

    Have been keeping an eye out for reviews on the gigabyte 3090 blower version and this popped up

    4x 3090 blowers in a single machine

    https://www.pugetsystems.com/labs/articles/Quad-GeForce-RTX-3090-in-a-desktop—-Does-it-work-1935/

     

    Those temps look good but I wouldn't want to pay the power bill. Then again, if you're rendering so much quicker you wouldn't be running them for so long.

    Yes, you would think that, but since rendering is the bottleneck, rendering quicker has meant to me that I just render twice as much :) This hobby has made a $100 difference in my electric bill going from a single 960 to 4 2080 Tis.
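
    (Rough sanity check on that figure - a back-of-the-envelope sketch where the per-card wattage, rendering hours, and electricity rate are all assumptions, not measurements:)

    ```python
    # Back-of-the-envelope check of the "$100 difference on the bill" figure above.
    # All inputs are illustrative assumptions, not measured values.
    cards = 4
    watts_per_card = 250          # assumed Iray draw per 2080 Ti
    hours_per_day = 20            # assumed near-constant rendering
    rate_usd_per_kwh = 0.13       # assumed electricity rate

    kwh_per_month = cards * watts_per_card / 1000 * hours_per_day * 30
    print(f"{kwh_per_month:.0f} kWh/month -> ${kwh_per_month * rate_usd_per_kwh:.0f}/month")
    # 600 kWh/month -> $78/month, so a ~$100 swing on the bill is in the right ballpark
    ```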

    joseft said:
    fred9803 said:
    joseft said:

    Have been keeping an eye out for reviews on the gigabyte 3090 blower version and this popped up

    4x 3090 blowers in a single machine

    https://www.pugetsystems.com/labs/articles/Quad-GeForce-RTX-3090-in-a-desktop—-Does-it-work-1935/

     

    Those temps look good but I wouldn't want to pay the power bill. Then again, if you're rendering so much quicker you wouldn't be running them for so long.

    Yeah i think the kind of people in the market for a rig like that, the power bill is the least of their worries

    I wanted to know what the thermals would be like, given these are 350W (and more) cards. I read a LOT of 'expert' opinions on the internet saying that this blower card would be thermal-throttling e-waste, many of which had some unkind things to say about Gigabyte for even manufacturing it, because there was absolutely no way, in these people's infinite experience, that a blower could possibly cool these cards.

    I was sure myself that Gigabyte would not release a product that simply didn't work due to thermals. As it turns out, one of these blower cards by itself actually runs cooler than many of the traditional-cooler versions out there right now. My bigger concerns were how well it would hold up in multi-GPU configs, and to a lesser extent how noisy it is. It holds up well beyond expectations in a multi-GPU config. And I think the video demo they did about the noise made it sound worse than it is, due to the amount of ambient noise in the environment they were testing in. But even so, I would prefer it being loud with thermals under control to being quiet and having major thermal issues from all the heat that a traditional cooler setup on these cards would blow into the case in a multi-GPU config. My machine is already pretty loud when I have all my case fans turned up.

    I've run 4 Gigabyte 2080 Ti blowers for the better part of two years and have had zero problems. I plan to buy 3090 blowers from Gigabyte again. The only question is how many.

    It's their motherboards that I am never going to touch again... My Gaming 7 was a nightmare and I got no help at all from them. Certain problems were never resolved and I just worked around them.

    I'm surprised you have had issues with their motherboards. In every machine I have built, I have only used 2 motherboard manufacturers - Gigabyte and Asus. Probably equally, 4-5 each. I believe Asus are pretty much regarded as making the best motherboards, and out of all the motherboards I have used in my personal machines, I have had issues with 2 of them, both of them Asus boards. Maybe they were not even issues as such; maybe a better description would be a strange quirk that two of them had. Sometimes, seemingly randomly, they would not POST on the first boot attempt. It would ALWAYS POST on the second attempt though.

    So nothing deal-breaking, but something I never understood. I have never had any quirks or issues with any of the Gigabyte boards I have used. My current one is a Gigabyte TRX40 Master, and zero issues so far.

    Gigabyte and Asus are the only two GPU manufacturers I buy too, and funnily enough, I have never had an issue with a Gigabyte GPU, but I have an Asus 2060 blower at the moment that sometimes generates a ticking noise in the fan, which I can stop by getting in there and wiggling the fan a bit (while it's off, of course).

    I guess I should add that I didn't have any problems until I crammed 4 GPUs into my Gaming 7. My hypothesis is that those types of configurations are just not tested as well because they're kind of fringe. Weird things started to happen, BIOS updates would solve one problem and cause another, and there was no one with a similar configuration to provide the wisdom of prior experience.

    If I do another home build, I'm going to go with the precise configuration of the reviewer.

    I also use the Gaming 7 in my Titan RTX system. Apart from issues with inconsistent temperature monitoring and fan control at high system utilization (remedied by over-the-top watercooling and external fan controllers...) it's suited my purposes fine. I would never even consider putting more than two (maybe three) high-performance GPUs in it though, because it doesn't have additional board power beyond what the standard 24-pin provides. Remember that GPUs pull a sizable portion of their power budget (iirc up to 75W each) direct from the PCI-E slot. With 4 2080 Tis in a Gaming 7, I'm not surprised at all to hear that you've had weird issues. There's a very good reason why virtually all 3+ PCI-E x16 slot boards have additional power hookups on them.
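
    (To put rough numbers on the slot-power point - the per-slot split and per-pin current below are the commonly quoted PCIe CEM / Mini-Fit Jr figures, from memory rather than from the specs themselves, so treat them as assumptions:)

    ```python
    # Rough +12V budget for four GPUs pulling full slot power, vs. what a plain
    # 24-pin ATX connector can feed the board. All figures approximate.
    slot_12v_w = 66.0       # PCIe x16 slot: ~5.5A on +12V (75W total with +3.3V)
    gpus = 4

    atx24_12v_pins = 2      # the 24-pin carries two +12V contacts
    amps_per_pin = 6.0      # typical Mini-Fit Jr rating (assumed; HCS terminals are higher)
    board_12v_w = atx24_12v_pins * amps_per_pin * 12.0

    demand_12v_w = gpus * slot_12v_w
    print(f"Slots can demand ~{demand_12v_w:.0f}W of +12V; the 24-pin supplies ~{board_12v_w:.0f}W")
    # ~264W demanded vs ~144W available -> why 3+ slot boards add extra PCIe power plugs
    ```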

  • nicstt Posts: 11,715
    fred9803 said:

    Point taken. GPU fan noise isn't an issue that can be ignored... as seen in the video. My MSI 2080 is barely audible even under stress, and that's the way I like it. I keep my case open, which means I need to clean the components of dust regularly. Listening to loud fans spinning insanely would stress me out no end.

    Agreed, I won't buy noisy cards; I have two 900 Strix and they are completely silent normally, but under load they do make a noise, although not a lot - but enough for me to notice.

  • RayDAnt said:

    I also use the Gaming 7 in my Titan RTX system. Apart from issues with inconsistent temperature monitoring and fan control at high system utilization (remedied by over-the-top watercooling and external fan controllers...) it's suited my purposes fine. I would never even consider putting more than two (maybe three) high-performance GPUs in it though, because it doesn't have additional board power beyond what the standard 24-pin provides. Remember that GPUs pull a sizable portion of their power budget (iirc up to 75W each) direct from the PCI-E slot. With 4 2080 Tis in a Gaming 7, I'm not surprised at all to hear that you've had weird issues. There's a very good reason why virtually all 3+ PCI-E x16 slot boards have additional power hookups on them.

    Any PCIe slot can draw 75W, even a 1x. Every board should be designed to deliver 75W to each slot at the same time. I'm not saying they're all tested for it, but the ATX standard says they should be.

    The whole point of going from the 20-pin to the 24-pin motherboard power connector was to provide that power.

  • RayDAnt said:

    I also use the Gaming 7 in my Titan RTX system. Apart from issues with inconsistent temperature monitoring and fan control at high system utilization (remedied by over-the-top watercooling and external fan controllers...) it's suited my purposes fine. I would never even consider putting more than two (maybe three) high-performance GPUs in it though, because it doesn't have additional board power beyond what the standard 24-pin provides. Remember that GPUs pull a sizable portion of their power budget (iirc up to 75W each) direct from the PCI-E slot. With 4 2080 Tis in a Gaming 7, I'm not surprised at all to hear that you've had weird issues. There's a very good reason why virtually all 3+ PCI-E x16 slot boards have additional power hookups on them.

    I admit that my attitude was way too cavalier, but I even had issues with the two Titan RTXs I had before the 2080s. They worked fine and were stupid fast, but they never generated any video. If I SSH'd into the system and ran Blender on it, everything worked fine; the monitors just never synced, even before POST. There were no clues for me to even begin to debug what was going on. Having to return two Titans is not a feeling I would wish on my worst enemy :)

  • tj_1ca9500b Posts: 2,057

    Well the latest rumors have Big Navi pumping out performance on par with the 3080, depending on the benchmark.  Of course, we should wait for the actual independent reviews, but for you Blender folks that can utilize AMD GPUs, it's worth mentioning at least.

    Of course, since Daz Studio is pretty much stuck in CUDA land ATM, the best we can hope for is that the flood of new AMD GPUs helps offset the demand for NVidia GPUs a bit, and maybe the prices normalize a bit sooner... I'm still thinking Q1 2021 before I even try to grab a 3090 at this point.

    The leaks on the 5000 series Ryzens look very promising, but I'm more interested in the Ryzen 4750G ATM.  That or Threadripper... at this point I probably won't even try to build a system before 2021, but it's still nice to keep up on the latest developments on the hardware front.  I have seen the 'unboxed' 4750Gs over at Newegg, but I'm not in a buying mood ATM.

  • No way would I pay $500 for a 3700X with an iGPU. If you're just going to add a GPU anyway, why even consider it? You can get a 3700X for $325 right now.

    Prices on the 3000 series will drop even further after the 5000 series release, not that the 3700X was ever a chip anyone cared about.

  • outrider42 Posts: 3,679
    joseft said:

    I wanted to know what the thermals would be like, given these are 350W (and more) cards. I read a LOT of 'expert' opinions on the internet saying that this blower card would be thermal-throttling e-waste, many of which had some unkind things to say about Gigabyte for even manufacturing it, because there was absolutely no way, in these people's infinite experience, that a blower could possibly cool these cards.

    I was sure myself that Gigabyte would not release a product that simply didn't work due to thermals. As it turns out, one of these blower cards by itself actually runs cooler than many of the traditional-cooler versions out there right now. My bigger concerns were how well it would hold up in multi-GPU configs, and to a lesser extent how noisy it is. It holds up well beyond expectations in a multi-GPU config. And I think the video demo they did about the noise made it sound worse than it is, due to the amount of ambient noise in the environment they were testing in. But even so, I would prefer it being loud with thermals under control to being quiet and having major thermal issues from all the heat that a traditional cooler setup on these cards would blow into the case in a multi-GPU config. My machine is already pretty loud when I have all my case fans turned up.

    A couple of things: of course a blower is capable of cooling such a card, but to do so requires an insane amount of fan speed, hence the turbo-jet sound from running more than one of these things. So blower cards fell out of fashion with most gamers as they are just too freakin' loud. This led to the card makers dialing back on blower-style cards; the best-performing chips they got would be paired with different coolers instead. That meant that the lesser-performing chips often got relegated to use in blower coolers. These coolers were often cheap, which led to hotter GPUs compared to the other models in their lineup. In turn that led to blower cards being seen this way. But everybody knows that blower cards are better for multi-GPU setups, no question. For a single-GPU setup, a dual or triple fan can work better and without sounding a siren in the process.

    The other thing, and this one is the most important, is that Puget only tested rendering software with these cards. Guys... rendering software does not run the GPU as hot as most games. How many times does this need to be stated? My own GPU will be at least 10 full degrees hotter running a video game than with Iray. Sometimes it can be 15C hotter. I typically render right at 70C, and game at up to 84C. That is a big difference, and there is no doubt that this fact helps the Puget system stay under control. Now if Puget were gaming (and I know that is impossible with 4 GPUs), then things would be quite different for the temps. I can tell you this, they would not be able to claim they run at under 80C, LOL.

    Also, Puget is selling these systems, they aren't going to talk about how hot they might get under specific loads. They know that people are not buying these to play video games, so that is not a concern.
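
    (If you want to see that render-vs-game gap on your own card, something like the sketch below will log it while you work; nvidia-smi ships with the driver, and the one-second interval and the log file name are just placeholder choices:)

    ```python
    # Log GPU temperature/power/utilization once a second via nvidia-smi,
    # so an Iray render session can be compared against a gaming session.
    import csv, subprocess, time

    QUERY = ["nvidia-smi",
             "--query-gpu=timestamp,temperature.gpu,power.draw,utilization.gpu",
             "--format=csv,noheader"]

    with open("gpu_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "temp_c", "power_w", "util_pct"])
        try:
            while True:
                out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
                for line in out.stdout.strip().splitlines():  # one line per GPU
                    writer.writerow([field.strip() for field in line.split(",")])
                f.flush()
                time.sleep(1)
        except KeyboardInterrupt:
            pass  # Ctrl+C to stop logging
    ```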

  • joseft Posts: 310

    I am aware that most games make GPUs run hotter than renders do. 

    As I'm sure you agree, Puget's customers are not gamers, and I'm sure the kind of people in the market for a rig like that know enough about this sort of thing without needing to be told, too.

    With that said though, running just one of these blower 3090s would, I think, still be OK for gaming. In Puget's tests, the results with just one of them in the system showed that the temp under load was better than some of the traditional-cooler variants. So running a game with just one of them would, I think, give similar results to a traditional cooler. Hotter, but still under control.

  • joseft said:

    With that said though, running just one of these blower 3090s would, I think, still be OK for gaming. In Puget's tests, the results with just one of them in the system showed that the temp under load was better than some of the traditional-cooler variants. So running a game with just one of them would, I think, give similar results to a traditional cooler. Hotter, but still under control.

    The only temp they reported was 67C, for Octane on the single blower. Add 10C to that, which is not an unreasonable bump, and you're right up against the card's 80C thermal throttle during gaming, and that means no boost behavior at all. That is definitely not what standard coolers were getting. I'm not saying standard cooling was great for the 3090s, but it was better than that.

  • billyben_0077a25354 Posts: 771
    edited October 2020

    With all of this talk about blower-style coolers, does anyone remember the infamous GeForce FX "Dust Buster" coolers?  You could hear those things 30 feet away at a LAN party.

  • outrider42 Posts: 3,679
    joseft said:

    I am aware that most games make GPUs run hotter than renders do. 

    As I'm sure you agree, Puget's customers are not gamers, and I'm sure the kind of people in the market for a rig like that know enough about this sort of thing without needing to be told, too.

    With that said though, running just one of these blower 3090s would, I think, still be OK for gaming. In Puget's tests, the results with just one of them in the system showed that the temp under load was better than some of the traditional-cooler variants. So running a game with just one of them would, I think, give similar results to a traditional cooler. Hotter, but still under control.

    Around here... many people assume Daz Iray is the worst thing you can do to a GPU; just look around the forums a bit and you will see comments like this. So no, I never assume that people looking to buy GPUs for this purpose fully understand this, because the fact is many of them do not. We have hardware thread after hardware thread here asking for information. We have lots of people who just buy prebuilts. We've got people using freakin' laptops. The user base runs the entire spectrum, and many of them are not that tech savvy. That is a big part of Daz's draw in the first place: you do not need to be an expert at 3D to be able to use Daz Studio, so it is only natural that you will find many users who never really get into the tech side of it. Some people just want to make art and not worry about those details. And being tech savvy does not mean they know specific things like this. While building a computer is easier than ever, it is still something that is surprisingly niche.

    Blowers can certainly be fine for gaming, if you can stand the noise they make. Gaming blower cards are still sold today, like these 3090s are, given that 3090s are NOT Titans or Quadros. Nvidia's own Founders Edition cards were all blowers up until Turing came along, but you could still find blower models from AIBs for every class of card, including the 2080 Ti. The exception would be the Titan RTX, which only Nvidia sold, and which only came with dual fans.

  • JamesJAB Posts: 1,760

    And all of the Quadro cards are still using blower style coolers.

    Also, my Dell Precision tower is not very dual-fan-GPU friendly.  The tower is designed to bring air in the front, straight through, and out the back.  There is also no room inside for all of these non-standard-size cards.  Even a dual-fan card that is the correct width has hardly any room at the side of the card for exhaust air.  On the flip side, when you install a blower-style GPU, the front case fans push cool air into the GPU intake and it then gets blown out the back of the tower.

    Personally, I would take a blower style card any day. (Blower cards are also better for multi GPU systems because all of the heat is blown out the back of the case instead of being re-inserted back into the intake air pool.)

  • tj_1ca9500b Posts: 2,057
    edited October 2020

    This is a bit off topic, but AMD beat their earnings estimates apparently, and has announced the Xilinx deal...

    https://www.cnbc.com/2020/10/27/amd-to-buy-chip-peer-xilinx-for-35-billion-in-data-center-push.html?__source=google|editorspicks|&par=google

    The AMD earnings call is scheduled for later today, but the earnings beat mention is in the article.  What I'm trying to wrap my head around is how a relatively tiny company like AMD is able to pull off such a large deal.  For comparison, AMD's annual revenue for 2019 was $6.7ish billion, while Xilinx was $3.6 billion, but the articles I've seen keep throwing around $35 billion for the cost of the deal, or over 5x AMD's annual revenue... So I'm not quite understanding where the $35 billion comes from.  Stock market maybe?

    Mind you, I'm all about AMD getting on equal footing with Intel eventually (Intel is still huge by comparison), and the Xilinx deal should help them in the datacenter market, but this reminds me of the ATI deal.  The ATI deal eventually panned out handsomely IMHO, but it took AMD quite a while to dig themselves out of that hole, not to mention the CPU division falling behind Intel until Ryzen showed up; that was quite painful too...

    The continued benchmark leaks show the new AMD chips due next month trouncing the 10xxx series Intel chips by 10% or more, but of course Intel should have a counter-punch at some point.  AMD had managed to pay their debt down significantly in recent years, but this new Xilinx deal, yeah I'm just not wrapping my head around it.  That $35 billion has to come from somewhere...

    Anyways, back on topic, looking forward to the independent 6000 series Radeon reviews this week, and the 5000 series Ryzen reviews in early November...

     

     

  • tj_1ca9500b Posts: 2,057
    edited October 2020

    Here's Anandtech's take on the Xilinx deal:

    https://www.anandtech.com/show/16196/amd-in-35-billion-allstock-acquisition-of-xilinx

    As noted in the article above, AMD's earnings conference call is scheduled for 8am EST today, or in just a few minutes.  So we should know by noon how Q3 went for AMD, which apparently was pretty good.

  • tj_1ca9500b Posts: 2,057
    edited October 2020

    Looks like AMD paid down their debt by nearly half from the Q2 amount:

    https://www.globenewswire.com/news-release/2020/10/27/2114937/0/en/AMD-Reports-Third-Quarter-2020-Financial-Results.html

    $373 million is almost chump change as compared to their total yearly revenues, but it's still nice to see that number continue to go down.  For reference, in 2016 it was over $2 billion, and it looks like the all time high for AMD was $5.3 Billion in 2007.

    No doubt the Enterprise and Semi-Custom division was boosted heavily by game console chip sales for the new Microsoft and Sony consoles, but it's nice to see an 'across the board' increase, except for maybe discrete graphics card GPU sales.  The linked sheet above doesn't break out graphics card revenue, but that's the 'prevailing opinion'.  Zen chips have been selling like hotcakes of course!

    It also looks like they increased their R&D budget a bit, which is expected but nonetheless nice to see.

    Looking over the Xilinx deal, as mentioned in the articles, it'll be an all-stock deal, with AMD capitalizing on its share surge this year from $30 to over $80... still sounds like funny money to me, but then the stock market is weird that way...

    Anyways, nice to see AMD still resurging, and hopefully they get their discrete GPU game together in the coming year or two.  I'm getting a bit tired of having to pay the Nvidia premiums for Nvidia cards with high VRAM amounts... I'll buy a 3090 eventually, but yeah I'd rather have had a Radeon VII in my system for half the cost.  But of course Radeon and CUDA don't mix, so here we are...

     

  • tj_1ca9500b Posts: 2,057

    Also, RTX 3070 reviews today!

    Here's some Octane and Blender benches:

    https://techgage.com/article/nvidia-geforce-rtx-3070-rendering-performance/

  • Ghosty12 Posts: 2,068

    Yup, here is LTT's review/preview of the 3070. While he found the card good in the tests they did, he was not so kind on Nvidia's practices of late..

  • Nvidia had it coming.

  • Here's Anandtech's take on the Xilinx deal:

    https://www.anandtech.com/show/16196/amd-in-35-billion-allstock-acquisition-of-xilinx

    As noted in the article above, AMD's earnings conference call is scheduled for 8am EST today, or in just a few minutes.  So we should know by noon how Q3 went for AMD, which apparently was pretty good.

    So the answer to where the money came from is that it didn't - it's exchanging shares in AMD for those in Xilinx, so that existing AMD shareholders end up with a smaller (three-quarters) share of, hopefully, a larger company, and Xilinx shareholders have a much smaller share (25%) of a much larger company. Most deals are a mix of cash and shares; all-cash and, as here, all-share deals are more unusual from what I've seen, but not unknown.
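
    (Rough numbers on how an all-stock price gets to a ~$35B headline figure with no cash changing hands - the share price and share count below are ballpark values from around the announcement, assumptions rather than exact figures:)

    ```python
    # Ballpark illustration of the all-stock math (approximate inputs).
    amd_price = 80.0      # roughly AMD's share price around the announcement
    amd_shares = 1.2e9    # roughly AMD's shares outstanding (approximate)
    deal_value = 35e9     # headline value of the Xilinx acquisition

    new_shares = deal_value / amd_price               # shares issued to Xilinx holders
    xilinx_stake = new_shares / (amd_shares + new_shares)
    print(f"~{new_shares/1e6:.0f}M new shares -> Xilinx holders own ~{xilinx_stake:.0%}")
    # ~438M new shares -> ~27%, in line with the rough three-quarters / one-quarter split above
    ```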

  • tj_1ca9500b Posts: 2,057


    That's what I gleaned as well from a few articles.  AMD's share price jump from just over $30 at this time last year to $80 or so as I type this no doubt put them in a stronger position for the stock deal. Xilinx was at just over $90 a year ago and, not counting today's sudden jump, was at around $115.

    Still, I think of the stock market as funny money, due to how volatile it is.  Plus, I still have memories of the last few stock market crashes, and of course have read about 1929...

    Your perspective is appreciated!

     

  • outrider42 Posts: 3,679

    Linus seriously needs to do something with his hair. And I don't think I can ever get used to that beard.

    Anyway... the 3070 numbers are pretty great. Matching the previous flagship 2080 Ti in games is cool and all, but we are interested in the rendering. And boy does the 3070 shine at rendering, beating the 2080 Ti by a wide margin. It is too bad it only has 8GB, but if 8GB is enough, this thing is a little beast. This is what I have been expecting out of Ampere. And really, Ampere is great for Iray users, it just needs more VRAM.

    Maybe if AMD can lay the smack down, Nvidia will change its mind on those extra VRAM cards. It is very impressive just how AMD has dug itself out of its own grave. This is no joke. AMD was right on the brink of going under just before Ryzen. But I do think Intel has greatly assisted AMD with how it has been stuck on 14nm. I truly believe that if Intel had made progress to 10nm on time history would be very different. But hey, here we are, competition is good. And do not forget people, Intel is looking to make a GPU as well. Maybe. It seems like Intel has again run into issues, so their GPUs may end up getting pushed back over and over, too. If Intel does not get their act together, they may be forced to move more towards TSMC as well.

    I don't know what is going on with Nvidia. Their seemingly erratic behavior I think proves that Jensen really does call all the shots, because I don't think they would do all of this otherwise. Some of their decisions just do not make any sense at all. Some people are claiming that the stock shortage is on purpose... if this is actually true (and I am not saying it is), it would be one of the dumbest business moves ever. Maybe not as bad as advertising "Lose weight with the aid of AYDS" bad, but really, really close. Forcing a shortage when your big adversary is right about to release their most hyped GPU in a decade is not a smart idea. Many of the agitated people who couldn't get a 3080 will just buy AMD instead! All AMD has to do is... well... deliver. Creating a stock shortage on purpose only works if you have total control of the market in question, or something so unique that nobody else offers anything like it. Then, and only then, does this idea have the slightest merit. But if you do this in a market that has strong competition, such a move will likely drive a large portion of your once loyal customers over to your competitor who has their product in stock. This is not some space logic here. Nvidia knew many months in advance that AMD would be launching around this time, and that they would probably be competitive. That is why I find this forced-scarcity theory difficult to believe. If anything, Nvidia should have been building as many cards as possible to outsell AMD. In fact, in years past, some Nvidia victories were sometimes due to them simply having more stock to sell than AMD, as AMD struggled to get cards built, so they should be well aware of this.

  • Ghosty12 Posts: 2,068

    LTT is wacky at times, but he does a beard well.. lol. I am hoping that AMD's next series of cards do scare Nvidia to the point of browning their pants.. And true, I have no idea what Nvidia are doing in how they are acting, and why they seemed to think the amount of VRAM they put on the 3080 and 3070 was enough.. To me, giving the 3080 16GB (which would not encroach as much on the 3090) and the 3070 10 to 12GB would have made sense, but that is just me, unless of course Nvidia have something planned next year.. Will be interesting to see how good these new AMD cards are, and what Nvidia do..

    Was looking at a computer hardware site here in Australia and they were selling a Gigabyte 3090 card for over $3000 AUD; I saw the price and nearly fell out of my chair..

  • Ghosty12 said:

    LTT is whacky at times but he dose a beard well..lol  I am hoping that AMD their next series of cards, do scare Nvidia enough to the point of browning their pants.. And true have no idea what Nvidia are doing in how they are acting, and why they seemed to think that the amount of vram they put on the 3080 and 3070 was enough.. To me for the 3080 and 3070 having 16GB for the 3080 and would not encroach as much on the 3090, and on the 3070 10 to 12GB, would of made sense but that is just me, unless of course Nvidia have something planned next year.. Will be interesting to see how good these new AMD cards are, and will be interesting to see what Nvidia do..

    Was looking at a computer hardware site here in Australia and they were selling a Gigabyte 3090 card for over $3000 AUD; saw the price and nearly fell out of my chair..

    It's enough VRAM because they're gaming cards and no game calls for even 8GB. Even for most casual creator uses, 10GB is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 

  • algovincian Posts: 2,636

    It's enough VRAM because they're gaming cards and no game calls for even 8GB. Even for most casual creator uses, 10GB is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

     

    Who do you expect is going to make games that require more than 8GB if there isn't hardware to run them on? 

     

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 

     

    The software/hardware both drive/support each other. The only reason to make games that require more than 8GB is if there's hardware to run them.

    - Greg 

  • OMG, News Flash, AMD RX 6900 XT beats the RTX 3090 and is only $999.00.  Nvidia spanking to commence soon.  BOOM, mic drop.

  • Ghosty12 Posts: 2,068
    edited October 2020

    Well AMD just launched their new cards, and it was interesting: all of their cards come with 16GB of VRAM.. And it looks like AMD's top-of-the-range card is taking on the 3090 at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8GB. Even for most casual creator uses, 10GB is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking, 4 to 8 GB of system RAM is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system RAM, as it adds an amount of future-proofing to people's computers..

    Post edited by Ghosty12 on
  • tj_1ca9500b Posts: 2,057
    edited October 2020

    So, I just watched the AMD Big Navi/RDNA 2 presentation.

    Short form, 6900XT trades blows with 3090, 6800XT trades blows with the 3080, slightly lower power consumption, at least in the AMD benches.  We won't see the cards for almost another month though, and December for the 6900XT.

    There's also a 6800 which is a bit cheaper.  If I'm remembering correctly, the 6900XT is $999, $649 for the 6800XT, and I missed the price for the 6800.  All three cards had 16GB of VRAM as I remember, and some other features.

    So short form, if AMD's benchmarks are honest, AMD is finally able to trade blows with the top end Nvidia cards.

    In other news, I've seen a couple of rumors in the last week about a 3080 Ti and 3070 Ti in the works, but no hard specs as of yet.

    Nvidia's 'out of the gate' pricing makes a bit more sense if the AMD numbers are accurate.  Of course, we get to wait on the independent reviewers to see how that shakes out in the various upcoming reviews.

    Post edited by tj_1ca9500b on
  • bluejaunte Posts: 1,923
    edited October 2020

    Great news, competition is always good even if we can't use AMD cards ourselves for Iray.

    Post edited by bluejaunte on
  • Ghosty12 Posts: 2,068

    So, I just watched the AMD Big Navi/RDNA 2 presentation.

    Short form, 6900XT trades blows with 3090, 6800XT trades blows with the 3080, slightly lower power consumption, at least in the AMD benches.  We won't see the cards for almost another month though, and December for the 6900XT.

    There's also a 6800 which is a bit cheaper.  If I'm remembering correctly, the 6900XT is $999, $649 for the 6800XT, and I missed the price for the 6800.  All three cards had 16GB of VRAM as I remember, and some other features.

    So short form, if AMD's benchmarks are honest, AMD is finally able to trade blows with the top end Nvidia cards.

    In other news, I've seen a couple of rumors in the last week about a 3080 Ti and 3070 Ti in the works, but no hard specs as of yet.

    Nvidia's 'out of the gate' pricing makes a bit more sense if the AMD numbers are accurate.  Of course, we get to wait on the independent reviewers to see how that shakes out in the various upcoming reviews.

    Rumor was that Nvidia canned those rumored cards, though seeing what AMD have just revealed, Nvidia may want to reconsider..

  • Ghosty12 said:

    Well AMD just launched their new cards, and it was interesting: all of their cards come with 16GB of VRAM.. And it looks like AMD's top-of-the-range card is taking on the 3090 at $500 less.. I think that Nvidia may want to be concerned with what AMD have shown.. Will have to wait and see how Nvidia respond to this, only time will tell..

    It's enough VRAM because they're gaming cards and no game calls for even 8GB. Even for most casual creator uses, 10GB is plenty.  You guys have got to understand that DS is a very very niche use case. For 99.999999999999999999999999999+% of users the 3080 will never run out of VRAM.

    The only reason to increase the amount of VRAM is for advertising and epeen purposes and considering the cost and performance issues Nvidia clearly has decided against that. Maybe they'll eventually release the prosumer oriented Ampere cards once they figure out the GDDR6X production issues, but that isn't likely to happen for a while. 


    By that thinking, 4 to 8 GB of system RAM is more than enough for most people.. But we all know most people will go for 16 to 32 GB of system RAM, as it adds an amount of future-proofing to people's computers..

    No. There are very good reasons for more system RAM. There is not any current reason for more VRAM. And the argument that games don't push VRAM because no cards with more exist? WTF?

    Even in the Nvidia lineup there have been 11GB cards for a long time. Nvidia made a very valid point though: even with two generations of 11GB flagships, there is not a single game that pushes past 8GB, even with every texture setting maxed out at 4K. So why make consumers keep paying for VRAM they aren't using?
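
    Whether a given game or render actually stays under 8GB is something you can measure rather than argue about: poll the card while the workload runs and note the peak. A rough sketch in the same spirit as the earlier one, again assuming the nvidia-ml-py (pynvml) bindings; the one-second interval and device index 0 are arbitrary choices of mine:

    ```python
    # Track peak VRAM use on GPU 0 while a game or render runs; stop with Ctrl+C.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    peak = 0
    try:
        while True:
            used = pynvml.nvmlDeviceGetMemoryInfo(handle).used  # bytes, all processes
            peak = max(peak, used)
            print(f"current: {used / 1024**3:5.1f} GiB   peak: {peak / 1024**3:5.1f} GiB")
            time.sleep(1)
    except KeyboardInterrupt:
        pass
    finally:
        pynvml.nvmlShutdown()
        print(f"peak VRAM observed: {peak / 1024**3:.1f} GiB")
    ```

    Note that the figure includes every process using the card (the desktop compositor included), so it slightly overstates what the game or scene itself needs.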
