2022 Nvidia 3070ti vs 2080ti

The 2022 Nvidia 3070ti is rumored to have 16 GB of VRAM. I had a 2080ti with 11 GB of VRAM until it died recently. Can anyone explain, in Daz terms, what that extra 5 GB translates into - like, three more Genesis 8.1 characters? I wish I could picture 5 GB of VRAM in some understandable measure like that.
Here is the rumor:
https://thinkcomputers.org/nvidia-to-launch-rtx-3090-super-rtx-3070-ti-16gb-and-rtx-2060-12gb-soon/
Comments
A 16GB 3070 has been rumoured to be arriving with every new wave, and the Twitter account leaking it this time really does not strike me as reliable. They've got almost no tweets, basically no followers, and if you filter a Google Search for their name to exclude the last couple of months, they're basically unknown.
As great as a 16GB 3070 would be for Iray, Nvidia are still selling 8GB 3070s almost as fast as they can make them, so there's really not much incentive for them to do it. I want it to happen, I really do, but I'd wait for a much more credible source before getting excited.
~~~~~
As far as how much further that will go? It's very hard to say - it will depend on how the vendor responsible for the character has set up the materials, what clothes and hair are being used (which can easily be more of a resource load than the character itself), and indeed how much the end user makes use of optimisation to reduce the load of objects further from the camera.
Functionally, it's roughly 45-50% more memory (16 GB vs. 11 GB), and while things like the render canvas are somewhat fixed overheads, "an extra ~50%" gives a rough idea of how much more you could fit into a scene. (Assuming the card actually happens.)
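To put a very rough number on the original question: if you guess at how much a dressed Genesis 8.1 figure costs in VRAM, you can turn the extra 5 GB into "how many more characters". The per-asset figures in this sketch are assumptions, not measured Iray numbers - real usage depends heavily on texture sizes, hair, clothing, and optimisation.

```python
# Back-of-the-envelope VRAM headroom estimate.
# All per-asset costs below are rough assumptions, not measured Iray numbers --
# actual usage varies a lot with textures, hair, clothing, and scene optimisation.

OLD_VRAM_GB = 11.0    # 2080 Ti
NEW_VRAM_GB = 16.0    # rumoured 16 GB 3070 Ti
BASELOAD_GB = 3.0     # assumed fixed overhead: OS/driver, render canvas, environment
CHAR_COST_GB = 1.5    # assumed cost of one dressed G8.1 figure with hair (a guess)

def characters_that_fit(vram_gb: float) -> int:
    """How many 'assumed-cost' characters fit after the fixed baseload."""
    usable = max(vram_gb - BASELOAD_GB, 0.0)
    return int(usable // CHAR_COST_GB)

old_fit = characters_that_fit(OLD_VRAM_GB)
new_fit = characters_that_fit(NEW_VRAM_GB)

print(f"Extra memory: {NEW_VRAM_GB - OLD_VRAM_GB:.0f} GB "
      f"({(NEW_VRAM_GB / OLD_VRAM_GB - 1) * 100:.0f}% more)")
print(f"Rough character budget: {old_fit} -> {new_fit} "
      f"(+{new_fit - old_fit} under these assumptions)")
```

With those made-up numbers it happens to work out to about three extra figures, but change CHAR_COST_GB to match what your own scenes actually use and the answer moves a lot.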
Even if released, good luck getting one. With chip shortages and the crypto mining stupidity, there is little hope at all for normal people to get these high end cards.
I'd always take the card with more RAM; it gives more options for usage.
Simple things, like leaving a render open and running others. Not having to jump through hoops to get a scene to fit is another.
... And then the 3000 series has benefits over the 2000 series - or at least it has features; you decide if they are a benefit.
I suppose if you're determined enough you could camp out on one of these restock alert websites 24/7 just for the chance that the card could become sold-out while in your cart! :P
I really despise miners more than scalpers and hope cryptocurrency crashes, and they lose their shirts!
Also, NVidia/retailers are to blame for selling damn-near pallets filled with GPUs to a single person! <=Not literally of course, but more than 4 at-a-time is too much during an extreme shortage!
This is the very reason I got the 3090: sure, I can load up heavy scenes, but I also have the ability to actually use my PC while waiting for a render, such as having another instance of DS open!
You guys need to move to this side of the pond.
The 3060 12 GB cards (among other models) have been ready for immediate delivery since early June from a legitimate dealer. The models vary from one week to another, but there is no need to go hunting for one or camping overnight in front of the store.
Speaking of overnight... Past midnight already, so...
I hope that Nvidia includes the "low hash rate" feature to make the cards less attractive to crypto miners. Then maybe we'll just have to worry about the supply chain issues. ::cross-fingers::
Based on what's been said here, I may just get a 3090 now. I'm no expert, but I've been reading a lot about RAM and power supplies, and that initially scared me off the 3090. If what I read is true, most prebuilts are underpowered to the point that they cannot use the 3090 to its fullest extent. For example, I read you need at least twice as much RAM as VRAM, yet most of the 3090 prebuilts I see have 32 or 16 GB of RAM. Same with wattage... I was close to buying one the other day that had a 750-watt supply before I learned that was barely enough. So an uneducated sucker like me could drop $4500 on a prebuilt with the best GPU and end up with disappointing performance. Good points about the 3070ti rumors, and I detest these crypto miners too. What a waste of resources.
Well, depending on where you live, you might get an open-box 3090 at Best Buy for $1,759.99
GIGABYTE NVIDIA GeForce RTX 3090 GAMING OC 24GB GDDR6X PCI Express 4.0 Graphics Card GV-N3090GAMING OC-24GD - Best Buy
I agree with you completely... I don't need a faster card, I need a bigger one.
I have seen a lot of people say that the 3090s get their VRAM chips overly hot (I believe the chips are on both sides of the board in order to fit that much on, so cooling is a challenge), to the point that a lot of people undervolt/underclock them, and I know at least one person who's elected to build a custom watercooling loop for the job. It's the one reason I'd actually give some thought to the 3080 Ti, which otherwise seems almost redundant at its price point.
I think it varies with specific board manufacturer, but I'd look it up first.
(As is, I really would love that 16GB 3070 Ti to come out, because it'd have more VRAM than the 3060 or 3080 Ti, hopefully without the cooling issues of the 3090, and hopefully with pretty respectable speed... but as I say, it's not a particularly credible rumour at this point.)
IMO, it's often better to build it yourself; unfortunately that might be difficult right now, although if you can secure the GPU you should be able to. Or, with a prebuilt (same as with anything, really), you should do your homework. Stick with a quality PSU like EVGA, Corsair, or Seasonic among others and you're GTG. I've been maxing out the VRAM on the 3090 and haven't used up all of my 64 GB of RAM, so IMO 64 GB is enough for a 3090.
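If you'd rather measure than guess how close you're getting, a quick sketch like this will report both VRAM and system RAM while a render is running. It assumes the Nvidia driver's nvidia-smi tool is on the PATH and that the third-party psutil package is installed; adjust to taste.

```python
# Quick check of GPU VRAM and system RAM usage while Daz Studio / Iray is running.
# Assumes nvidia-smi (ships with the Nvidia driver) is on the PATH and that the
# third-party 'psutil' package is installed (pip install psutil).
import subprocess
import psutil

def gpu_memory():
    """Return a list of (used_MiB, total_MiB) tuples, one per GPU, via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [tuple(int(v) for v in line.split(",")) for line in out.strip().splitlines()]

for i, (used, total) in enumerate(gpu_memory()):
    print(f"GPU {i}: {used} / {total} MiB VRAM used")

ram = psutil.virtual_memory()
print(f"System RAM: {ram.used / 2**30:.1f} / {ram.total / 2**30:.1f} GiB used")
```

Run it mid-render and you'll see whether you're anywhere near spilling out of VRAM or eating all your system RAM.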
It's easy to point fingers, but many months after Nvidia made all the cards LHR, it still hasn't made a noticeable difference in supply for me.
Also, it's not really a waste of resources - that's the propaganda. It's a smaller chunk of resources compared to government-backed fiat currency - look at the resources that go into making that possible. Governments spend trillions on military equipment and an enormous amount of bloodshed enforcing it, and the banking system consumes even more electricity than Bitcoin. That's the high-level TL;DR without getting political.
Or if you don't want to deal with trying to get a consumer card..... (Scalpers / wait in line all night in front of Best Buy)
You can get a Nvidia RTX A4000 16GB for $1390 (Retail Box) right now. This card gets you RTX 3070 level performance in a single slot form factor with a single 6 pin power connector.
The only potential drawback would be that all of the video outputs are DisplayPort. This means that you will need one of the more expensive DisplayPort to HDMI adapters if you want 4K HDR over HDMI for your TV.
Because I'm a nice guy, here are a few links. This is the retailer that I purchased my A4000 from.
https://www.dihuni.com/product/nvidia-rtx-a4000-pny-vcnrtxa4000-pb-16gb-gddr6-pci-e-4-0-gpu/ They have 46 in stock as of posting this.
ps... You can also get the RTX A5000 24GB card from the same retailer for $2675 (RTX 3080 performance, 24GB VRAM, Dual Slot, Single 8 pin power connector and NVLINK)
https://www.dihuni.com/product/nvidia-rtx-a5000-pny-vcnrtxa5000-pb-24gb-gddr6-pci-e-4-0-gpu/ They have 11 in stock as of posting this.
And here is the newegg link to the DisplayPort to HDMI adapter that I am using on my RTX A5000 to connect to my Sony OLED TV @ 4K (Full range RGB) with HDR
https://www.newegg.com/club3d-cac-1085-displayport-to-hdmi/p/N82E16812443059
That's because the LHR version of the RTX 3080, even at its limited hash rate, still mines faster and uses less power than the GTX 1080ti. And then, to top it off, the LHR limiter has been effectively defeated via a partial unlock combined with mining two cryptos at the same time.
Same.
RAM is king; I hate expensive paperweights sitting in my PC.
I should probably add an addendum to the above: when considering card specs (RAM or anything else), I'm comparing within the same generation.
I always thought the 2000 series was over-priced; the 3090 is (IMO) the only almost reasonably priced card Nvidia has produced in 10 or more years.
So I searched it online - it's reported that about 25% of cards sold went to crypto. It's a significant number, but not enough that you can't get one. Also, it's hard to get a PS5 or Xbox Series X, and you can't mine with those. It's largely due to a massive increase in demand plus the supply chain/chip shortages.
On the other hand, I've damn near maxed out 64 GB of RAM when trying to use the 12 GB of VRAM on my 3060, with DS alone peaking at 58 GB and leaving the rest of the system having to make do with leftovers - so I'm really glad I didn't opt to build this as a 32 GB system.
I've also definitely heard reports of people crashing 64GB systems before they maxed out a 3090. It's going to depend a lot on the exact make-up of what's in your scene, but personally I don't think it's that ridiculous to spec the system RAM at four times your VRAM, rather than the common suggestion of twice, particularly if you want the system to be able to do anything else at the same time.
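If you want to turn those rules of thumb into a number before buying, here's a small sketch. The multiplier and headroom values are assumptions pulled from the anecdotes in this thread (DS peaking at roughly four times the card's VRAM), not an official recommendation.

```python
# Rough system-RAM sizing check for an Iray/Daz Studio box.
# The multiplier is an assumption based on anecdotes in this thread
# (DS host-side usage peaking at ~4x the card's VRAM), not an official figure.

VRAM_GB = 12.0             # card's VRAM, e.g. a 3060 12 GB
SCENE_PEAK_MULTIPLIER = 4  # assumed DS peak as a multiple of VRAM (anecdotal)
OS_AND_OTHER_GB = 6.0      # assumed headroom for the OS and other applications

estimated_peak = VRAM_GB * SCENE_PEAK_MULTIPLIER + OS_AND_OTHER_GB

# Round up to the next common memory configuration.
common_sizes = [16, 32, 64, 128, 256]
recommended = next((s for s in common_sizes if s >= estimated_peak), common_sizes[-1])

print(f"Estimated peak usage: ~{estimated_peak:.0f} GB")
print(f"Suggested system RAM: {recommended} GB")
```

For a 12 GB card those assumptions land on 64 GB, which matches the experience above; a 2x-VRAM build would already be tight.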
Most of the testing I've seen shows that it's underclocked enough to be closer to the 3060 Ti than the 3070.
Note also that resale value may not be as dependable as the more widely in demand gaming cards, should you be one of the types who likes to offset the cost of your next upgrade.
Agreed, I think the 2000 series was not a good deal; price/performance was terrible compared to the 1000 series, and no more VRAM than the 1000 series either. The 3090 finally delivers on VRAM!!!
I have a 2070 Super (8 GB) with 64 GB of RAM, and it isn't that uncommon to go over 32 GB of RAM while the Iray rendering is still being done by the GPU. That's more than four times as much RAM as VRAM, since only about 5.5 GB of VRAM can be used for rendering due to the VRAM baseload from the OS, DS, and the loaded scene.
How much RAM is needed could depend on the ratio of geometry to textures; my scenes usually require some 5-10 times more VRAM for textures than for geometry.
Ironically, while nearly the entire GPU-buying public and the tech reviewers keep harping on about the 3080 being the best deal going, I also think the 3090 is a great upgrade/price over the 24 GB Titan, as it's $1k less than that card. I was actually considering the 3080, until I found out it wasn't going to be 12 GB. Thank F*** I got the 3090 instead!
...the A4000 is the only 16 GB card available from Nvidia. It's single slot and has a TDP of only 160W (less than the old 1 GB GTX 460 I had). Being single slot, it is not NVLink compatible. Given the current prices I've seen on the 10 GB 3080, 12 GB 3080Ti, and even 12 GB 3060, the A4000 appears to be a pretty good buy.
That is actually a great price on the A5000, which has an MSRP of $2,250.
The Pro cards do not seem to experience the insane markups of the consumer ones, likely in part because of their already high initial cost and more specialised use.
Have you experienced burn-in with your OLED? I've had my eye on one for a year but am scared by all the burn-in rumors.
I would definitely buy the 24GB RTX 3090 were it not for the coming RTX 4090 (with a still-mystery amount of VRAM) that will almost certainly be priced similarly, and I'm betting I can show up at Best Buy at 5:00 or 6:00 in the morning on release day, be given a ticket, and get it at MSRP. That means I'll buy one of the 3060 Ti, 3070, or 3080 this fall or winter until that card gets released. Barring some really stunning technological advancement after that, it may even be the last video card I ever need, as opposed to want, to buy.
I've had mine for close to a year, and I have left the computer desktop up for extended periods of time. No burn in yet. (And the picture on it is amazing!)
He needs acting lessons, but the blue suit suits him.