Comments
I'm currently running a 980 Ti and a 1080 Ti in a well-ventilated case with custom fan curves and not seeing any issues - the CPU and the GPUs all max out around 65 C. I'm thinking a 3080 looks pretty good (about 0% of my scenes fit the 980's 6GB, so 10GB is OK). My concern is whether I'll need to bite the bullet and upgrade to Win 10 Pro to get drivers for the 3080.
I was going to just type a paragraph of HAHAHA but I figured I should not do that. But just know that I have been laughing very hard today and am quite amused.
Focus on the performance, not the CUDA cores. When Digital Foundry did its testing, they were getting anywhere from 60 to 90% more performance with the 3080 over the 2080 in various games. If Ampere truly doubled CUDA cores then I would expect much larger gains than that. What we are seeing is something in between, so this "doubling" of CUDA cores appears to be a bit misleading. My guess is that these new CUDA cores are more like hyperthreads in a CPU. They boost performance but should not be confused with a true doubling of cores. But they could impact Iray quite nicely because of how Iray works.
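To put some rough numbers on that idea (a back-of-the-envelope sketch in Python; the core counts are from Nvidia's public spec sheets and the 60-90% range is Digital Foundry's, so treat it as illustrative, not a measurement):

```python
# Back-of-the-envelope: does the advertised CUDA core count explain the
# observed gaming uplift? Core counts from Nvidia's public spec sheets.
cores_2080 = 2944   # RTX 2080
cores_3080 = 8704   # RTX 3080 (as advertised)

core_ratio = cores_3080 / cores_2080          # ~2.96x on paper
observed_uplift = (1.60, 1.90)                # Digital Foundry: 60-90% faster

for perf in observed_uplift:
    # Effective throughput per advertised "core", relative to Turing
    print(f"perf {perf:.2f}x -> per-core efficiency {perf / core_ratio:.2f}x")
# Roughly 0.54x-0.64x: each advertised Ampere core does about 55-65% of the
# work of a Turing core in these games -- consistent with the
# "hyperthread-like" reading above rather than a true doubling.
```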
We don't have Iray data, and nobody is going to know until we test it ourselves, because Iray is so freakin niche in the world. Hell, Nvidia has mentioned some renderers in their recent marketing, but even they didn't bother to speak of Iray. You won't like to hear this, but Iray is way down the totem pole of things they care about, and it's always been that way... hence the excruciatingly long waits for Daz Iray to even support a new GPU AT ALL. Lest we forget the months it took for Pascal to get supported. That was absolutely embarrassing, there is no other word for it. There was no excuse; it demonstrated vividly just how low a priority Iray was. Because if Iray had been a priority... well, they would have had the drivers for it sooner. Simple as that.
But as I've said, I believe Iray will support Ampere on day 1 thanks to OptiX 6.0, not Nvidia's love or support. OptiX 6 does not need to be recompiled, and thus we should not have to wait. And I also said that among all software, Iray usually benefits more from RT than most. Thus I expect a really big uplift in Iray performance from Ampere, for both the 3080 and 3090. Again, just look at the benchmark thread in my sig. It's right there for you to see, and then compare GPU performance in top games.
You will find that Iray has a different tier from gaming. Most gamers will tell you that the 2080 is just slightly faster than the 1080ti in most games, and in fact there are some games where the 1080ti beats the 2080! But this is not true in Iray. The 2080 destroys the 1080ti at Iray, and for that matter, EVERY SINGLE RTX GPU does as well. Even the freakin' 2060 beats the 1080ti. That is how much the ray tracing cores affect our friendly Iray. I don't know how much more I can spell that out. So if the 3080 has ray tracing cores that are twice as fast as the 2080, and the CUDA cores are also quite fast, then it is very easy to take a wild guess that the 3080 will be even better than the 2X increase seen in the best case scenarios from gaming.
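To see why a ray-heavy workload could scale better than games do, here is a toy model; the workload splits and the 2x RT / 1.6x CUDA speedups are made-up illustrative assumptions, not measurements:

```python
# Toy model: split frame/render time into RT-core work and CUDA shading
# work, and speed each portion up independently (Amdahl-style).
def combined_speedup(rt_fraction, rt_speedup, cuda_speedup):
    """Each portion of the old runtime shrinks by its own speedup factor."""
    new_time = rt_fraction / rt_speedup + (1 - rt_fraction) / cuda_speedup
    return 1 / new_time

# A game that is ~20% ray tracing vs a path tracer that is ~70% ray tracing,
# assuming 2x faster RT cores and 1.6x faster CUDA throughput:
for rt_frac in (0.2, 0.7):
    print(f"RT fraction {rt_frac:.0%}: {combined_speedup(rt_frac, 2.0, 1.6):.2f}x")
# ~1.67x for the game, ~1.86x for the path tracer -- the more of the frame
# that lives on the RT cores, the closer you get to the 2x RT-core claim.
```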
The fab names are just marketing. Once they started using FinFET, everything stopped having a real meaning. What does 8nm even mean??? The way Samsung "measures" their fabs is different from TSMC, and Intel in turn is different from them both. But Ampere is able to place 28 BILLION transistors onto its package. That is a number with some actual value. That is a lot. And it is likely very true that Samsung is giving Nvidia a really good deal on their chips. I am sure that makes Nvidia happy. But I have heard that Samsung's chips are not really that great, and it is probably a big reason why the 3080 and 3090 have TDPs above 300 watts.
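For what it's worth, that transistor count can be sanity-checked against the reported die size (the ~628 mm² figure comes from public reporting, so this is just a sketch):

```python
# Rough density check on the "28 billion transistors" figure. The GA102 die
# size is taken from public reporting, so treat this as an estimate.
transistors = 28.3e9   # GA102 transistor count, per Nvidia
die_area_mm2 = 628.4   # reported die size

density = transistors / die_area_mm2 / 1e6    # millions of transistors/mm^2
print(f"{density:.1f} MTr/mm^2")               # ~45 MTr/mm^2
# For comparison, TSMC 7nm parts like Navi 10 land around 41 MTr/mm^2, so
# whatever "8nm" means as a label, the achieved density is in that ballpark.
```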
Because they can, and they have many times in the past. The Titan has often been the halo product for Nvidia, and a frequent tactic they have used is to compare new hardware to their old hardware and tell you what an amazing deal it is. They did exactly that with the 2080ti today, stating directly how the 3080 is such a great deal because it is faster than the $1200 2080ti and it only costs $700. And hey, the 3070 is even a little faster than the $1200 2080ti and it costs only $500! What a wild and crazy deal that is! The Titan is always used as a way to sell their hardware, as they can compare X to the Titan and tell you what an awesome deal it is compared to the Titan.
So this is nothing new. Also the 3090 is a lot like a Titan, but it is not actually a Titan. The Titan offers a few additional features that only show up in the Quadro line, like TCC mode. People sometimes call the Titan a "Prosumer" product as it straddles the line between professional and consumer markets. So far the 3090 does not appear to have these features. But at any rate, this is how Nvidia sells things. You can find instances of this throughout their history. They launched Pascal and basically said the same exact thing. They talked about how the new 1070 matched the previous generation Titan, and here they are 4 years later doing it all over again. This is classic Nvidia.
Actually, Jensen Huang is a brilliant salesman. He knows exactly what he is doing and how to say just the right things.
One possible reason is because computing power for 3D animation is still a scarce resource; we haven't yet gotten to the point where more is no longer better. If I can render two shots overnight instead of one, that's still a massive difference, and it will continue to be until I can render scenes as fast as I can have them ready to be rendered.
I appreciate what you're saying about thermals, that's wise advice any way you look at it. An engineer at System76 told me that 2 should be manageable, but for 3 or 4, you'll need blowers. I hope that's still true with the higher wattage, and it's the last thing I want to know about the new cards. I love my blowers and want them again... the way they burn my shin gives me a satisfying feeling that heat is efficiently leaving the system. :)
Thank you for your insightful comment, Outrider. I guess before I make any purchase, it'll be interesting to note how the RTX 3080 compares to the RTX 3090 in real terms.
(In Iray, I mean. I know Iray is low in the pecking order overall, but I would never consider buying a 1,500 dollar card just for games; Iray performance is what seals the deal for me one way or another.)
If I were NVIDIA, I would hold the 20GB 3080 and 16GB 3070 until after AMD releases Big Navi. If Big Navi is really competitive, release the 20GB/16GB versions at cost or cheaper; if Big Navi can't compete, then +$300 on the 3080.
Doubling the CUDA core count is not that unbelievable. The number of CUDA cores per SM changes from generation to generation. All the previous-generation estimates were based on the assumption that NVIDIA would use 64 CUDA cores per SM. They have simply doubled it. It's not clear at this stage whether they just replaced GA100's FP64 cores with FP32 cores, or whether they made the INT32 cores able to do FP32 math.
Also, 2x the cores doesn't mean 2x the performance when there could be other bottlenecks, like memory bandwidth.
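To put numbers on both points (spec-sheet figures, so treat them as assumptions rather than measurements):

```python
# Sketch of the SM math and one obvious bottleneck, using public
# spec-sheet numbers.
sm_2080, fp32_per_sm_2080 = 46, 64     # Turing: 64 FP32 CUDA cores per SM
sm_3080, fp32_per_sm_3080 = 68, 128    # Ampere: doubled FP32 per SM

print(sm_2080 * fp32_per_sm_2080)      # 2944  (RTX 2080)
print(sm_3080 * fp32_per_sm_3080)      # 8704  (RTX 3080)

# Memory bandwidth did not scale anywhere near as fast:
bw_2080 = 448    # GB/s (14 Gbps GDDR6, 256-bit)
bw_3080 = 760    # GB/s (19 Gbps GDDR6X, 320-bit)
print(f"cores: {8704/2944:.2f}x, bandwidth: {bw_3080/bw_2080:.2f}x")
# ~2.96x the FP32 cores but only ~1.70x the bandwidth -- plenty of room
# for memory to become the limiter before the extra cores are fed.
```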
BTW folks, just because the Nvidia 3090 is 3 slots, that is no indication that they all will be. I was looking at EVGA's site and one of their 3090s will be 2.2 slots, which is not bad at all; my MSI 1080ti is 2.5 slots, so that card would actually be a bit skinnier. Oh, and that EVGA was not water cooled, either. So it just comes down to design as to how big the card will be. It does look like many of these will be very long cards, though. Nearly all are three-fan configs and in the 12-inch range.
Oh yeah.
But it does use 3x8 pin connectors, so it is using some serious juice. Interestingly, EVGA claims the FTW3 3090 still runs cooler than their FTW3 2080ti, so there is that.
Well, Iray performance certainly will not be worse than the games. I haven't seen any game that offers performance differences quite like Iray. Though Quake RTX might be close. With Quake RTX, the 1080ti was beaten by all the RTX GPUs in frame rates. So it's worth noting that Digital Foundry did show Quake RTX, and on the 3080 it ran around 90 to 98% faster than the 2080. This one game is probably the most comparable to Iray because it is entirely path traced, thus this game shows off the ray tracing performance very well.
So keep an eye on that number. Of course, this is comparing the 3080 to the 2080, not the 3090 to the 2080ti. But Quake RTX is still not the perfect indicator, because the 1660ti is also able to match the 1080ti in that game, and that is just not right. While all RTX cards beat the 1080ti at Iray, the 1660ti is not RTX, and it is not even close to the 1080ti for Iray. Maybe things have changed since then; I think it's a driver issue for Pascal with this game.
I am wanting to do some larger scenes and am planning on finally getting my comics started, which means I need a much better card to render multiple images a day and do other production things. A 3090 is flat out of my price range as I am still a hobbyist, but I could swing for a 3080 with 20 GB of memory, or the preferred card, which would be a 3070 with 16 GB of memory.
Hoping they release a 20 GB 3080 and/or 16 GB 3070 soon. I am a firm believer in the quality of EVGA products. I have an EVGA GTX 1070 and a SuperNOVA 1200 PSU. Recently bought the PSU to get ready for Ampere, as my old Corsair 1000 watt PSU was at least 12 years old and maybe more. Had it so long that I forgot when I bought it.
Based on the language used to describe the 3090 in the product stack, I am fairly certain that it is the Titan card of the generation. Meaning that it will support things like togglable WDDM/TCC driver mode support. The only thing that gives me pause is the lack of "Titan" in the name. I can't see Nvidia foregoing that moniker this generation. At the same time, I can't see Nvidia revealing a card as their flagship model at the same time as revealing another, significantly more powerful model (which is what happened today/yesterday). And having an XX90 card in the stack is already a blast from the past. So who knows at this point. Nvidia really seems to be stirring the pot of its own product stack structuring with this generation.
Until Nvidia says these high-VRAM cards can exist, the AIBs cannot make them. So they won't be coming out until Nvidia announces them. I wouldn't expect them, if they ever do come out, before 2021. I think the 3090, 3080 and 3070 are all they intend to put on shelves for Xmas.
All the AIBs that sell in the US have said they will not be producing any cards with the 12-pin connector. I would be surprised to see any AIB anywhere that does. To say that the HW industry is not happy with Nvidia over this is an understatement. They tried some song and dance about needing a different connector, but the adapter they ship just runs off a pair of 8-pin PCIe cables. The only other adapter that even appears to be in the works is one going from two Molex connectors to it.
Damn, I just looked at my PSU, an EVGA 650 G5, and it looks like I'm going to need a new PSU if I go with an RTX 3090, because it probably doesn't have enough wattage for both my CPU and hypothetical GPU, and it only has two 8-pin connectors for GPUs anyway. Tsk.
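For anyone doing the same math on their own PSU, here is the rough budget I mean; only the GPU figure is Nvidia's announced number, the CPU and rest-of-system draws are assumptions:

```python
# Quick power-budget sanity check for a 650 W PSU.
gpu_tdp = 350          # RTX 3090 board power (Nvidia's announced figure)
cpu_draw = 150         # assumed: high-end desktop CPU under load
rest_of_system = 75    # assumed: drives, fans, RAM, motherboard
headroom = 0.80        # rule of thumb: keep sustained load <= ~80% of rating

total = gpu_tdp + cpu_draw + rest_of_system
print(f"estimated load: {total} W, comfortable budget: {650 * headroom:.0f} W")
# 575 W vs 520 W -- over budget even before transient spikes, so the
# connector count isn't the only problem with a 650 W unit here.
```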
What I have heard is that Nvidia wants to keep the naming schemes of the cards simpler this time. Even they are embarrassed by the whole "Super" refresh naming. They had numerous cards with "ti" in the name, it was goofy and confusing.
So I think that might have played a role in the absence of a 3080ti. They may still make one later on, they certainly have room for one.
As for the 3090 being a Titan in all but name, Huang did pretty much say that. But that raises a question... why not just call this a Titan then? Why change the name at all? Having a regular number still strongly implies this is a gaming card. Practically every gaming site is talking about the 3090, not the 3080, and that is all because of the name. Few gaming sites talked about the Titan RTX as much as they are talking about the 3090. I don't think it would make any sense for a 3000 card to have TCC features. Some of what I have heard is that developers wanted a larger-VRAM card to work with, but it did not need to be a Titan because they did not need those specific features. Another thing I heard is that the 3090 is going to be extremely rare. One big problem is that Samsung 8nm has terrible yields, so getting chips that have enough working cores to pass as a 3090 is hard. That is why the 3080, the "flagship" GPU, has about 20% fewer cores than the 3090. It is kind of strange that the flagship would be such a cut-down chip.
The way he casually announced the 3090 is also a bit odd. It was almost like he didn't want to sell that particular card too much. Most of the video was about the 3080, and indeed, the GPU that Digital Foundry has is a 3080. It is clear the 3080 is what they are pushing. But they still need to beat whatever AMD has coming, too. So the 3090 has to exist in order to keep that lead, even if it is rare and hard to get. So again...that competition thing.
So keep that in mind, too. The 3090 may be really tough to get, and that might result in prices going up on them. However, to me this seems kind of strange too. Because here is another thing: Nvidia made the Titans themselves. No 3rd party ever sold Titans. However, the 3090 is going to be sold by a lot of 3rd parties. If everybody is selling 3090s, then it would make sense to assume that the supply must not be that bad. After all, if supply was that bad, then it would make sense for Nvidia to just keep the 3090 to themselves like they do all Titans. And consider this: EVGA is offering FIVE different models of the 3090, but only three models of the 3080. That sure seems strange if the 3090 is so hard to get... why would they have so many different models to offer?
There are a lot of weird things going on right now, that is for sure. We still have some questions.
I think the 12 pin is brilliant. The AIBs need to get over themselves; they just don't want to change. Just look at the 12 pin on the 3080, it takes up so little space on the board. The wire in the cable is also supposed to be a heavier gauge.
I mean, we've had 8 pins for so long now. Maybe it is time for a new standard. Just think if the power supply came with a 12 pin. One 12 pin uses a lot less cabling, so if the 12 pin started from the power supply, it would result in a little less clutter in the build. And two of these would certainly be better than 3x8 pins, which is madness. I am not looking forward to running so many cables.
Having said that, I will be looking at EVGA. I want that extended warranty, and the customer service is top notch if it is ever needed. These GPUs are on Samsung 8nm, which is an unproven fab. Word is that yields are bad, which I spoke about above, and that the chips leak power, which is why Ampere needs so much of it. I would bet money that if Ampere were on TSMC, it would not need 3x 8-pin. There is a reason that the A100 is built on TSMC; those are serious parts that need to be reliable and be the best. But gamers are getting the 2nd string backup here. So I do have a bad feeling that the failure rate of Ampere is going to be a little high.
Thus I want peace of mind, and EVGA has those awesome 5 and 10 year warranties. For the 3090, the 5 year would be $30, and the 10 year would be $60. On a $1500 GPU, that's nothing. Other places offer extended warranties as well, so anybody in this thread, I highly recommend you look at buying an extended warranty if you plan on keeping your cards longer than 3 years, the standard for manufacturer warranties. I am not even sure if I will keep mine or not. I have two 1080tis that are just 3.5 years old now, barely past their original warranties, and here I am looking to upgrade, LOL.
A German website released information that showed the 2080ti as having a 5% RMA rate. Now this may not mean they failed at 5%, they could have been RMA'd for any number of reasons. But that 5% was still higher than any other Turing GPU. I personally expect the 3090 to have a higher rate than that, possibly as high as 10%.
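To put the warranty prices above together with those failure-rate guesses (the rates are speculation, not data), the expected-value math is simple:

```python
# Rough expected-value check on the extended warranty, using the guessed
# failure rates above (5-10% are assumptions, not measured RMA data).
price = 1500                     # RTX 3090
warranty_10yr = 60               # EVGA 10-year extension, price quoted above

for fail_rate in (0.05, 0.10):
    expected_loss = fail_rate * price
    print(f"{fail_rate:.0%} failure -> expected loss ${expected_loss:.0f} "
          f"vs ${warranty_10yr} warranty")
# Even at 5%, the expected loss ($75) already exceeds the $60 warranty, so
# if those rates are anywhere near right, the extension pays for itself.
```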
And one more thing, for those bummed about the 3070 having 8GB, just know that Turing card prices are going to drop like stones. Why keep that 2080ti when a freakin 3070 beats it so easily? The used market is going to be flooded, and there are already reports of 2080tis being sold for as little as $500. I don't know if that is true, but I certainly find it hilarious. I have been telling people to sell their freakin cards before Ampere hit. They just lost a big opportunity. At any rate, their loss could be your gain. If Turing prices do drop like stones, you have the option of snatching them up and using Nvlink on the ones that support it. Of course, then Nvidia might release that 16GB 3070 after you buy two 2070 Supers, LOL.
RTX 3000 series will have SLI support only on the expensive 3090 model...
:(
Well here's the thing. The larger VRAM cards were "leaked" some time ago, which means quite a few people, including me, will be waiting for them. That's not good for NVIDIA if they wait too long, as it'll overlap with the AMD launch in November. So I think perhaps they'll be Q4 or Q1 2021. I literally want them to take my money, and I'd be getting one when they hit shelves if they hadn't teased a 16GB 3070 a month ago (not that I'll buy an AMD card, as it won't do Iray, obviously; I mean the point in general about competition).
"Having said that, I will be looking at EVGA. I want that extended warranty, and the customer service is top notch if it is ever needed."
Speaking for myself, I plan on only considering EVGA 3090s or 3080s. Is there any reason I should consider any other brand?
As far as temperatures and noise levels go, the MSI trio-fan cards have been very good. I'll probably be looking at them as a first choice.
..SLI? I thought they moved on to NVLink. So much for pooling memory.
I think only the 3090s will be NVLink compatible. Personally I'd never need 48GB of VRAM for any of the stuff I do.
3090 whoops 3080 performance
https://blog.irayrender.com/post/628125542083854336/after-yesterdays-announcement-of-the-new-geforce
That's what AMD's StoreMI is, I'm pretty certain. Intel has something similar, but I forget what they call it. The important thing is they've finally started adding enough VRAM to consumer-level GPUs; more is still needed, but it's a great start.
So I will most likely buy the 3070 16GB when and if it comes out, or an AMD Big Navi with 16GB or more RAM if Big Navi has a close or better number of compute units and all that specialized mumbo jumbo compared to the 3070. It looks very good now for quickly using all those models I've accumulated from DAZ 3D. Days of rendering for one 4K image (in DAZ Studio or Blender) are gone!
Looks like the 30xx series has about double the Iray speed of the 20xx series:
https://blog.irayrender.com/post/628125542083854336/after-yesterdays-announcement-of-the-new-geforce
The last presentation (for the 20 series) was similarly slanted; in my opinion it's misleading.
Surprisingly cheap, but... if there are no affordable 12-16GB models coming, I'd rather get another 2070 Super + NVLink (= 16GB VRAM).
Speed is no longer the issue; the bloated models are, since they lead to a lack of VRAM.
Here's an article about the listings of 'Founders' and custom models that went up over at Overclockers.co.uk:
https://wccftech.com/nvidia-geforce-rtx-3090-rtx-3080-rtx-3070-custom-model-prices-unveiled/
They are listing AORUS separately from Gigabyte, but of course AORUS is just a sub-brand name for Gigabyte. Asus and MSI are also featured, as is Inno3D. No word on how long we'll have to wait on the custom models, at least in the UK per the article...
If I wasn't struggling a bit with 11GB of VRAM, the 3080 would almost be a no-brainer at those prices, but yeah, I hit the CPU-only wall often enough in the renders that I do that it becomes a significant factor.
This all looks promising, but at the end of the day it's all pretty meaningless (outside of the increased VRAM) until we see numbers in Daz. The majority of these features always end up being used in video games or high-end animation, and they never make their way into our toys.
I could be horribly mistaken, but I haven't seen any of the goodies we got with the 20X0 series made available for us in Daz, and I'm a bit doubtful we'll see it with the 30X0 series. I hope to be proven horribly wrong, for my own benefit.
I just hope people learned from the mistakes of the 20X0 launch and don't go running out to buy a card on day 1 and then spend the next few months bitching about not being able to use it in Daz.
Is it just that the listings are using stock photos, or are the AIB cards very similar in size for the 3080 and 3090? The Nvidia 3090 card is monstrous in size compared to the 3080.
Also, all the 3080s are 10GB. We might have to wait for Micron to launch 16Gbit GDDR6X before we can buy the 20GB/16GB cards.
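On that point, the memory math works out neatly; the figures below are from public GDDR6X specs, so take this as a sketch of why the capacities land where they do:

```python
# Why 10 GB today, and why 20 GB likely waits on denser chips: GDDR6X is
# one package per 32-bit channel, and these figures are from public specs.
bus_width = 320                     # RTX 3080 memory bus, bits
bits_per_chip = 32                  # one GDDR6X package per 32-bit channel
chip_density_gbit = 8               # current Micron GDDR6X parts: 8 Gbit

chips = bus_width // bits_per_chip              # 10 packages
capacity_gb = chips * chip_density_gbit / 8     # Gbit -> GB
print(chips, capacity_gb)                       # 10 chips, 10.0 GB

# With 16 Gbit parts, the same bus doubles to 20 GB (and a 256-bit 3070
# would go from 8 GB to 16 GB) without clamshell mounting.
print(chips * 16 / 8)                           # 20.0 GB
```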
Bloated: in what way?
Do you mean they are providing high resolution textures you have no need for, but others (myself included) want?
Or geometry?
Nvidia officially stated that geometry doesn't have much effect - and personal experience is that it doesn't take up much room until one starts to add shedloads of SubD or equivalent.
Apparently they needed to move a lot of air across the GPU and VRAM on the 3090s, hence their extra bulk.
The theory is that we won't see the rumored 20GB 3080 models until next year, after AMD launches 'Big Navi'. I'd expect that the 'Ti' branding will go along with the 20GB cards, and also the rumored 16GB 3070 models, but of course until they actually show up it's just wishful thinking/rumors.
To be honest, I'll be seriously looking at the 3090s, as more VRAM is important to me, but I won't pull the trigger on those until we get non-beta Daz Studio support for those cards. That's why I went with the 1080 Ti in my system; at the time of purchase, RTX support for the 20xx series cards was still in the beta stages in DS.
The Titans have always been very big cards. The 3090 is not much larger than the RTX Titan.
There are no 20GB/16GB cards until Nvidia says there are. They have to announce cards like that and make reference designs available to the AIBs before they can make them. None are announced, so none will be made.
I think those listings are just using the same stock 3090 photos for both the 3080 and 3090. The photos show an NVLink connector on the 3080 cards, which the 3080 does not actually have.