Comments
No affiliation with Daz whatsoever - just a freelance audio-visual artist with a deep-seated interest in the engineering side of things (and a background in data collection/analysis).
TCC is short for "Tesla Compute Cluster." One of the major value adds of both the Quadro and Titan lineups is that - unlike GeForce or Tesla cards, which have fixed driver mode support - they allow the user to select the driver mode (TCC for dedicated compute workloads with no OS overhead, WDDM for generic system and gaming use) on the fly to fit their current needs. Buying an Nvidia GPU with switchable WDDM/TCC driver mode support is effectively the same as buying both a GeForce and a Tesla GPU (since the only thing that differentiates them is the lack of physical display outputs on Tesla models - all PCI-E based GPU models from a given Nvidia generation are built around the same GPU die hardware family regardless of Tesla/Quadro/Titan/GeForce packaging.) Hence why spending money on a "Titan class" card without key "Titan class" software features is a fraught concept.
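For anyone who wants to check or flip the driver mode themselves, here is a minimal Python sketch that just shells out to nvidia-smi. The query field names and the -dm flag are taken from nvidia-smi's own help output, so treat them as assumptions and verify against `nvidia-smi --help-query-gpu` on your install; switching to TCC is Windows-only, needs admin rights, and only succeeds if the card's device ID is on the driver's TCC-allowed list:

```python
# Hedged sketch (not from the original post): inspect and request the driver
# model (WDDM vs TCC) by shelling out to nvidia-smi.
import subprocess

def driver_model(gpu_index: int = 0) -> str:
    """Return the current/pending driver model, e.g. 'WDDM, WDDM' or 'TCC, TCC'."""
    result = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=driver_model.current,driver_model.pending",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

print(driver_model(0))

# Requesting a switch (Windows only, needs admin, takes effect after reboot).
# Fails on cards whose device ID is not in the driver's TCC-allowed list:
#   nvidia-smi -i 0 -dm TCC
```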
Having a 20GB+ framebuffer doesn't make a GPU a "Titan class" card. If that were the case, then the Titan X/Xp/etc would no longer qualify as "Titan class" cards. What makes a GPU "Titan class" is that it can do Titan things - i.e. it has Titan-class driver support. Sans Titan series driver features, the RTX 3090 is effectively nothing more than a 3080 Ti with an almost uselessly large amount of additional VRAM. Not cost effective for gaming, and not cost effective for creative workloads either. Effectively a useless luxury expense (relatively speaking.)
Although I think you are a little harsh, I'm inclined to agree. My understanding was that the 3090 was a Titan replacement; I got that from the presentation. The fact that it has Game Ready drivers only makes it wrong to compare it with the Titan, which was done. As you say, it's a game card with loads of RAM; useful for us folks, but when one is losing some RAM because Windows reserves it, it's annoying.
It is huge, and a disgusting power hog. I was lined up to get one, but now, I'm going to wait and see. There is lots to like about it; there are also aspects to dislike.
Fwiw the Game Ready drivers thing is a non-issue. Much like how all Tesla/Quadro/Titan/GeForce cards are really just a single set of GPU dies with different support hardware configurations built around them, all Nvidia driver releases are really just a single driver codebase with selected software optimizations (depending on the release channel) added into the overall payload. And since driver modes (WDDM/TCC) are part of the core functionality of that driver codebase, it makes no difference which driver release channel you are on when it comes to accessing things like TCC functionality. As long as your card's model ID matches the current list of TCC-allowed devices stored in the driver itself, it will work - and work equally well regardless of whether you have the Game Ready/Studio/etc driver package installed.
TCC isn't the only thing turned off on the 3090, if it is turned off. The benchmarks had the 3090 performing worse than the RTX Titan in a number of professional-level tests. Those may not matter to iRay, but if you do anything else compute-oriented with the card, they likely will.
(Relatively) major NEWSFLASH for any current/prospective non-Founders Edition RTX 3080 owners out there. It seems that many of Nvidia's AIB partners have taken it upon themselves to skimp on the power filtration circuitry on the back of the PCB behind the GPU die in such a way that standard boost behavior can lead to crashes under particularly intensive workloads.
Prior experience indicates that Iray rendering will almost certainly be one of these workloads. So be advised.
Makes me glad I wasn't considering any but the FE edition, and have decided to wait now anyway.
iRay is a pretty light workload. It uses a relatively small part of the GPU.
Yeah so 3080 20GB really seems to be happening at some point.
https://www.techspot.com/news/86869-another-nvidia-aib-partner-reveals-rtx-3080-20gb.html
More test results
I think this will also be the case for AIB 3090s, more so for those AIBs that cheap out on that circuitry. But watching this now, I am also on the glad-that-I-am-waiting bandwagon, and more than likely will get an FE card too, because at least the likelihood of Nvidia skimping out is low.
This isn't as big a deal as you guys are making it out to be. If you're planning on overclocking the card this would be a concern, and JayzTwoCents is an overclocking channel. If you're terribly concerned, wait for the 2nd wave of AIB cards. The FE cards have their own serious design flaws and really shouldn't be bought; those 12-pin connectors have horrific power delivery, particularly on the 3090, where they completely fail under boost.
Guys, when it comes to streaming video games, things are very different than streaming YouTube. YouTube only needs decent enough download speed to be fine. But with gaming, the latency of the connection is absolutely vital to having a decent experience.
When you play a game, you need to not only receive the data, but also SEND your controller commands to the server, and that server then sends data back to you. This has to happen as instantly as possible, or the entire thing falls apart. You can have a 100 Mbps connection, or 300, or even gigabit speed... it doesn't matter if the latency is high or inconsistent. And even if you do have super low latency and ping, you could still be screwed if you are physically far from the server where your game is located, because the laws of physics still apply here. Data can only travel so fast over distance, so any games that rely on lightning-fast reflexes will suffer.
Consistency is the other factor. If your internet has brief spikes of latency, even if they last less than a second, that is enough to break the experience. If YouTube buffers for a second, no big deal, but if your game buffers or lags for a second... you could die (in the game, peeps, not a cheap horror movie or anime). Your internet has to be fast, and consistently fast, all the time. And you have to be close enough to the server on top of all this.
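If you want to see where your own connection stands, here is a rough Python sketch that approximates round-trip latency and jitter by timing repeated TCP connections. The host and port are placeholders, not a real streaming endpoint, and a TCP connect is only an approximation of one round trip:

```python
# Minimal sketch: measure round-trip latency, jitter, and worst-case spike.
# "stream.example.com" / 443 are placeholder values, not a real endpoint.
import socket
import statistics
import time

HOST = "stream.example.com"   # placeholder endpoint
PORT = 443                    # placeholder port
SAMPLES = 20

rtts = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=2):
        pass                                        # connect roughly equals one round trip
    rtts.append((time.perf_counter() - start) * 1000.0)   # milliseconds
    time.sleep(0.5)

print(f"mean RTT: {statistics.mean(rtts):.1f} ms")
print(f"jitter (stdev): {statistics.stdev(rtts):.1f} ms")
print(f"worst spike: {max(rtts):.1f} ms")
```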
We are not done. You also need well-performing networking gear in your home, and devices that can support this, too. If you have an older router, you may need to upgrade.
So you have to have all of these elements in place before streaming games can work. Many people do, but a lot of people do not. It really depends on where you live. You might be in a big city and still have issues.
However, there is no doubt that streaming games is going to be a big business. This is all still pretty new, in spite of Nvidia doing it for years. Geforce Now has been in beta all that time; it only officially launched this past year. Stadia flopped because Google really screwed up. There are many reasons why it flopped; there really is not a singular one.
Stadia's pricing was awful. Buying a game that you can only stream was an instant turn-off. If you can buy a game, then it should be available on a local device, period - or at least a lot of people believe so. Imagine if you had to buy movies to watch them on Netflix! They would never have survived. Other people balked at Stadia being a Google product, citing how Google has killed so many services over the years (there is a site dedicated to the graves of all these services, and there are hundreds of dead ones). Thus these questions linger over Stadia: "How long will Google support it?" "What if they stop Stadia?" "Do you lose all the games you bought if they stop Stadia?" Google brought this on themselves, no matter how much they reassure people that they will support Stadia.
So then you already have a small group of people who even wanted to try Stadia. Well, you'd better convince them how awesome it is, right? Except Google failed spectacularly here. First they botched the launch. Some Stadia controllers (oh, did I mention that Stadia required its own controller? You could not use your own) did not come with the code needed to use Stadia. Um... what? Then some people had performance issues, like the lag I mentioned above. Then the game selection was tiny - it still is. Some games did not perform at expected levels; Destiny 2, for example, was not playing at 4K in spite of Stadia promising everything would be 4K. I could go on and on about Stadia's issues, but it should be clear that Stadia has not done well, for many different reasons.
Stadia certainly soured people.
The business plan behind Geforce Now is much better. You can stream the games you already own from Steam and other stores. So while you do buy the games, unlike Stadia, you actually own them and can play them on your local PC; you pay for the ability to stream them anywhere. Sadly GN has its own problem: not all publishers want to play ball. Some publishers do not like the idea that a player can have their game installed on a server besides their own. They want a cut of that action. So some publishers have not allowed GN to run their games, and thus you cannot play all your games. And the selection has changed from time to time. This uncertainty is GN's biggest problem.
Microsoft's cloud service and Game Pass sit somewhere in the middle of all of these, and are very interesting. Game Pass itself is NOT streaming. You download the game and play it locally. So Game Pass is more of a rental service like Netflix, and you do not have to worry about your internet latency ruining performance, just about downloading the game. It also offers frequent discounts and a huge library, which just got a whole lot bigger. xCloud is a different service that can be included with Game Pass, but the two should not be confused with each other. xCloud, as you might guess, is the streaming service. Right now many games are on it, though not all Game Pass games just yet. But the idea here is that between Game Pass and xCloud, you have all options covered for a pretty low sub fee.
The big advantage to Game Pass is that all MS first-party titles are included with it the day they release. So you don't have to buy Gears of War 11 or Halo Super Infinite; those will be there ready for you with your sub. They will also be there on xCloud for other devices.
At any rate, that covers streaming games in a nutshell. But there is room for both streaming games and playing them locally on powerful hardware. After all, Microsoft sells physical Xboxes at the same time they are offering a streaming service. You do not need both, but you can get both. As for Microsoft saying this may be the last hardware generation... well, we shall see about that.
I do not think PC gaming or gaming on local devices will go away so quickly. Particularly the PC, as many people enjoy tweaking games to their liking. This is not just modding. You have people who create and host their own servers for multiplayer games, where they can set their own rules or run their own mods. You can't do that with a streaming service. So I think the share of gamers who want to keep their games locally will be far bigger than the share of movie and TV watchers who care about keeping local copies. The demographic itself is different. There will certainly be those who are fine with streaming - as I said, there is room for both. Gaming is a massive industry that dwarfs even Hollywood.
So while MS talks about this being the last console generation, Nvidia already has plans for next generation Hopper, and AMD has their own thing, too. Obviously this is super early, but I believe that Hopper will be truly stunning, as Nvidia should be moving away from this rather rough Samsung 8nm node back to the comfort of TSMC 5nm. That alone will offer incredible performance gains, plus according to rumors Hopper is planned to be a chiplet design like Ryzen. If they get that working it could lead to wild performance gains as they will no longer be constrained by the difficulty of building huge monolithic chips. So just wait for Hopper!
(That last sentence was a joke.)
Ahh k, that is interesting about the 12-pin power plug. I did see in one video about this that one person who has an Asus TUF 3080 card - which is supposed to have all the better-quality components for the power filtering on the back of the card - had issues as well. But the thing is, they do not say what they were doing to their system or the card.
Well, that's disappointing, but I guess that means I will now be getting an FE card.
Or specifically choose a card with at least one of the more expensive components in it. I would.
Because the workaround is (probably) going to be a software update to throttle the cards down slightly.
There are others that utilise the more expensive component, but my own opinion is that the FE cards are reasonable value with what appears to be excellent build quality. Opinion is that Nvidia isn't making much from them.
FYI I've had zero issues with my 3080 (MSI Ventus OC). It has a 5 *SP-CAP/1 MLCC array configuration. Also boost clock has never exceeded 1995MHz when I was monitoring.
* SP-CAP (aluminum polymer), not POSCAP (which is a Panasonic trademark for their tantalum polymer chip capacitors). These cards do not use POSCAPs.
1 MLCC is why you have no issues so far. They should be present, but it seems that those trying to cut costs skimped on that more expensive set of components.
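As an aside, for anyone wanting to reproduce that kind of boost clock monitoring while a render runs, here is a rough Python sketch that polls nvidia-smi once a second. The clocks.gr query field is taken from nvidia-smi's help output, so treat it as an assumption and check it against your own driver version:

```python
# Rough sketch (assumes nvidia-smi is on PATH and supports the clocks.gr
# query field): poll the graphics clock once a second and report the peak.
import subprocess
import time

def graphics_clock_mhz(gpu_index: int = 0) -> int:
    result = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(result.stdout.strip())

peak = 0
for _ in range(60):                 # watch for one minute while a render runs
    peak = max(peak, graphics_clock_mhz(0))
    time.sleep(1)
print(f"peak graphics clock seen: {peak} MHz")
```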
I've been reading lots of websites recently, and of course most of what they talk about is usually just clickbait and rumours, so I try to take everything with a grain of salt, but I still have to admit that this recent launch seems to be really weird. Not a peep from Nvidia themselves of course, but numerous websites still talk about 20GB versions of the 3080 to be launched after Big Navi, and apparently at least a couple of card manufacturer leaks (GALAX and Gigabyte) suggest they are preparing for those too. For customers like us, there's a huge difference between a card with 10GB and one with 20GB, so it is a big blow if we buy a 10GB card now and then a few months later a second card with double the VRAM arrives (for example https://videocardz.com/newz/confirmed-nvidia-geforce-rtx-3060-ti-launches-after-rtx-3070 ).
Well, and that is not all. Numerous sites also talk about how Nvidia has already prebooked TSMC's 5nm production capacity for next year for their Hopper architecture cards. If Hopper launches in 2021 ( https://wccftech.com/nvidias-hopper-architecture-will-be-made-on-tsmcs-5nm-process-launching-2021/ ), where does that leave Ampere? A one-year lifespan before the next gen arrives does not sound good. Of course nobody really knows anything about Hopper, and it could be that it's not even meant for normal customers - more like a new Quadro line or something - but if Nvidia really prebooked TSMC's capacity for next year, it must not be cheap, so surely they have something cooking. It's just rumours and a big blur of course, but it still makes me think even more about buying a new card.
With all these rumours I'm really starting to think that Ampere is just an emergency solution from nVidia, so that AMD won't get the performance crown even for a few months, while Hopper is the true next-generation card (maybe with an MCM philosophy). Of course all these new 30xx cards are out of stock, so I can't buy one even if I wanted to, but this rumour mill sure makes me think about it twice. Of course if Nvidia releases a 20GB 3080 at a reasonable price, I'm very interested, but if Hopper is already around the corner, then... huh, too many rumours this time.
Was watching a JayzTwoCents video on thermal testing of a 3080 FE and an AIB 3080. With all the hubbub about the way Nvidia arranged the fans and so on, the FE card and the CPU actually ran cooler than when he had the AIB card in there.
Weird?
It's crap; folks are claiming it's a paper launch, but we've no idea really, due to bots (reportedly) being used to purchase 3000-series cards.
A 1-year lifespan? Why is that not good? The card you buy now won't suddenly be slower when the next new 'thing' comes along. There will always be a next new thing, be it this generation or next.
Take the 2080/Ti: it actually performed faster during the current 3000-series tests than it did on release, so it's always possible that what you buy will get faster too. Some of the reviews specifically highlighted that increase.
TSMC's 5nm is booked for 2021 by Apple - that's confirmed by TSMC - so Nvidia cannot be booking it.
Also there are always people talking about the next cards as soon as the new cards launch. Just ignore them. Nvidia is not that stupid.
I thought Apple booked 5nm capacity for 2020, since they are launching tons of new stuff this year (like at least the iPhone and iPad). I think next year is going to be a wild one for TSMC, at least if the rumours are true: https://hexus.net/tech/news/industry/142480-list-tsmc-5nm-customers-orders-published/
No they booked it for all of 2021.
https://www.extremetech.com/computing/315186-apple-books-tsmcs-entire-5nm-production-capability
They've booked it for next year at 5nm, sure, because they have to test-run the chips & cards in significant enough quantities to test, re-engineer, and see yields and whatever other problems might come up. They try to avoid being too wasteful, but still they have to test. I wouldn't expect the next-arch GPU until 2022, unless they know AMD is forcing their hand in 2021.
What I found interesting was that Nvidia's acquisition target ARM just announced 5nm 192-core ARM CPUs that some are speculating will be used to challenge Intel & AMD CPUs. Newsflash: not without the Intel/AMD instruction set it won't. The software won't be there to enable the competition. I'd definitely love Nvidia offering 192-core CPUs that run Windows 10 and its programs & apps natively! AMD & Intel, both to follow!
Just got a 3080, any idea when support will be added, or are there any test builds out there that support it?
None. No.
The most recent build supports it, but it's not public.
My original plan was that I was going to sell my RTX 2080 and just run an RTX 3090.
I think a better value for me now is that I KEEP my RTX 2080, add an RTX 3080 and a high wattage power supply that can run both those cards and everything else, and I will end up coming out far ahead of just a solitary RTX 3090.
I say this as someone who avoids making scenes that require a lot of memory, so the 24 gigabytes isn't enough reason to pay the best of the best premium.
Gonna wait though for independently conducted iRAY benchmarks, maybe the 3090 will really excel in that.
3080 compute performance:
https://www.thefpsreview.com/2020/09/21/geforce-rtx-3080-fe-gpgpu-compute-workstation-performance/
Thinking about it some more: even if someone had $1,500 to burn and 10GB was enough VRAM, considering the lackluster improvement in performance between the 3080 and the 3090 against the doubling of the price, is there any reason someone shouldn't just get two RTX 3080s?
It would cost the same amount as a 3090 but deliver way more Iray iterations in total, and more iterations per dollar.
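A quick back-of-the-envelope sketch of that iterations-per-dollar argument, for what it's worth; the per-card iteration rates below are made-up placeholders rather than benchmark results, so swap in real numbers once independent Iray benchmarks land:

```python
# Illustrative only - the per-card iteration rates are placeholder guesses,
# not measured Iray results; only the launch MSRPs are real figures.
PRICE_3080 = 700           # USD launch MSRP
PRICE_3090 = 1500          # USD launch MSRP
ITERS_3080 = 100           # placeholder iterations/sec for one 3080
ITERS_3090 = 120           # placeholder iterations/sec for one 3090

configs = {
    "2x RTX 3080": (2 * PRICE_3080, 2 * ITERS_3080),  # Iray scales across GPUs
    "1x RTX 3090": (PRICE_3090, ITERS_3090),
}

for name, (cost, iters) in configs.items():
    print(f"{name}: {iters} it/s, {iters / cost * 1000:.0f} it/s per $1000")
```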
I'm suspecting that Nvidia got rid of the 3080's NVLink feature because otherwise they'd probably really struggle to move 3090s.