Comments
You cannot play much of anything at 8K right now; the GPU itself is the bottleneck there. The 3090 can only play a few games at 8K, and mostly with DLSS upscaling enabled. We are still quite a ways from real 8K, and you don't even see many 8K screens yet, for that matter.
The real reason they don't want to put more VRAM into their gaming cards is to maintain market segmentation from their (formerly known as Quadro) workstation series. The A5000 only has 24GB of VRAM and the A4000 only has 16GB, so you can be quite sure they wouldn't want to put 16 in a mid-range gaming card. That the 3060 wound up with 12 is a small miracle. We will only see more VRAM from Nvidia if the market changes. Right now Nvidia can sell every single card they produce within seconds, so they have absolutely no reason to do more than necessary. If the market were different, then yes, I bet they would likely have increased VRAM. After all, there were plans for 20GB 3080s and 16GB 3070s; these do exist in laptops, and we saw photographs of Nvidia presentations showing the higher-capacity models. AMD has three tiers of GPUs that all offer 16GB, so the competition is there. These were in the plans...until Nvidia realized they did not need them.
I don't see this changing anytime soon. Now that the 3080ti is out and only has 12GB, there really is no going back and adjusting the product line. There is no chance Nvidia releases lower-tier cards with 16 and 20GB when their new 3080ti only has 12. Unless the market suddenly bombs, this is what we are stuck with. The 4000 series is a long way off, and it is hard to say how things will even be by then. Will this scalping situation still be in effect in late 2022? I sure hope not, but I just don't see what can change it. While Nvidia has nerfed mining on new releases for now, they are STILL profitable to mine on, and there is always the chance that the big mining firms find a way to break the limiter. Crypto has dropped recently, but it is still way up compared to just last year, and there are predictions that Bitcoin will break $100K in value. Nothing has really changed the fact that GPUs can still make miners money, so they will keep hoarding them and depriving us small-time consumers of inventory.
As long as GPUs are highly profitable for mining, this market will not change. GPU mining has to crash for anything to change.
What comes after terabytes?
Petabytes.
Me, twenty-five years ago: "What comes after gigabyte? Pfft! Like I'd ever need to know."
3080ti with 12GB. Huh. Huge improvement over the 1080ti which had a measly 11GB. (Tongue cramps from jamming into cheek.)
As someone no doubt pointed out, putting foreground and background in two separate renders works if you can get the lighting right in the fg image with all the walls missing, and if you aren't afraid of postwork. For scenes with multiple characters, I also experimented with creating billboards for background characters/objects. It's a little complicated and time-consuming enough to make a CPU-only render tempting if you have enough CPU cores and RAM. Regardless, you just need to make sure you get the perspective and lighting right in the billboard renders so they don't look like life-sized cardboard cut-outs.
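For anyone curious what the compositing step of that two-pass workflow looks like in practice, here is a minimal sketch using Pillow. The file names are placeholders of mine, and it assumes the foreground pass was rendered on a transparent canvas (a PNG with an alpha channel) at the same resolution as the background pass:

```python
# Minimal two-pass compositing sketch using Pillow (pip install Pillow).
# Assumes the foreground render was saved with a transparent canvas
# and both images share the same resolution.
from PIL import Image

background = Image.open("bg_render.png").convert("RGBA")
foreground = Image.open("fg_render.png").convert("RGBA")

# alpha_composite respects the foreground's alpha channel, so
# anti-aliased edges blend smoothly instead of leaving hard fringes.
composite = Image.alpha_composite(background, foreground)
composite.save("final_composite.png")
```

The same idea extends to billboards: each billboard render is just another alpha layer composited over the background, back to front.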
Tera rabies?
You need more than you can afford, so just get what you can afford.
I wouldn't consider less than 10-12GB personally, and I use a 3090 for rendering in Studio and Blender.
You can manage with less, but...
Redacted
A related question: does subD require VRAM or system RAM? I never render HD characters at high subD because of the usual bottlenecks, and I'm wondering whether more VRAM is the answer.
As you predicted, VRAM.
One of the major reasons I got a 3090 is the VRAM, and yes, SubD uses VRAM. My measly 8GB couldn't cut it; now I can have my characters at SubD 3-4 with no crappy low-poly artifacts!
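For a rough sense of why SubD eats VRAM so quickly: each Catmull-Clark subdivision level roughly quadruples the face count, so memory grows geometrically with the SubD level. A quick back-of-the-envelope sketch (the base mesh size below is an illustrative number I picked, not an actual figure spec):

```python
# Each subdivision level multiplies the face count by roughly 4,
# so geometry memory grows geometrically with SubD level.
base_quads = 16_000  # illustrative base mesh size, not a real figure spec

for level in range(5):
    quads = base_quads * 4 ** level
    print(f"SubD {level}: ~{quads:,} quads")
```

Going from SubD 2 to SubD 4 in this toy example jumps from ~256,000 to ~4,096,000 quads, which is why an 8GB card chokes long before a 24GB one does.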