Thoughts on the 3090
I've been on the fence about upgrading my 2080ti to a 3090 for the additional VRAM. Then my Corsair One died. Amazingly, Best Buy refunded the full initial purchase price including tax, because I'd purchased the extended warranty. Financially, it was essentially like I leased that computer for 20 months for $300, since all I lost was the price of the extended warranty. So I was able to put the proceeds toward a new machine, which arrived this week: CLX RTX 3090 + Intel i9 10900 + 64 GB.

If anyone else has been agonizing like me about whether to upgrade from a 2080ti to a 3090, here are my thoughts after a few days.

The VRAM is great. I loaded up scenes with what I thought would push the 3090 to its limit and it wasn't even half used. The tower is still cool to the touch. That is impressive.

The speed was a bit of a letdown. I'm not going to give objective speed test metrics; plenty of others have done that. My subjective observation is it seems slightly faster than the old 2080ti + i9 9900 + 32 GB rig, but in the real world sitting at my desk it is barely noticeable.

So the bottom line is: if you're on the fence, get the 3090 if you want VRAM, but if you want speed, wait for the 40XX series.
Comments
I wouldn't think that for rendering you'd see much of a speed increase. Gaming, maybe, but rendering? Hmmm... I really don't know.
My husband and I have been going around for a week because he doesn't understand VRAM, CUDA or any of it. I told him I'm not after a speed increase so much as being able to fit an entire scene on my GPU. LOL, THAT would be great: never having to worry about falling back to CPU on a render matters more to me than a speed increase while the GPU is rendering, as I find it renders fast enough for me ;).
Laurie
I upgraded to a 3090 when I built my new rig and don't regret it one bit (previously had a 1080TI). Being able to fit complex scenes onto the card, and not have my card choke at trying to render dForce hair, is totally a boon.
Can someone clarify the two RAMs for me? I know VRAM is the RAM you get on a graphics card, and the CPU is the processor, so the RAM you have on your computer is what the CPU uses instead of the VRAM. I have an Nvidia 650 that only has 1 gig of VRAM; however, I do have 32 gigs of computer RAM. I'm not that interested in Iray, except in certain circumstances, so I should be fine if I'm understanding this right.
Just checked the current prices for a 24GB 3090 at the local computer emporium - $2899 Cdn. But for $3124 Cdn you can get this interesting bit of kit: the AORUS RTX 3090 GAMING BOX, an external 3090 via Thunderbolt 3. Sadly, not in my price range... ever.
https://www.gigabyte.com/Graphics-Card/GV-N3090IXEB-24GD#kf
...would love that 24 GB but I'm just happy I was able to get my hands on a 3060 XC at a very reasonable price direct from EVGA the other day. Couldn't really afford a 3090 even at MSRP anyway (and would probably need a heftier PSU). 12 GB is still a pretty decent amount of VRAM and it has about 50% better performance than my Titan X.
I also don't regret upgrading my Titan X Pascal to a 3090 FE. The increase in VRAM allows me to run more complex scenes with no issues (already having 64 GB of RAM meant I rarely had a problem with scene size, just whether it would stay on the GPU or not). The render speed increase has been the biggest benefit for me--GTX cards are much slower since they have to rely on special IRAY code to emulate the ray tracing features of an RTX card, and that code also took up 2-3 GB of my Titan X VRAM, which further limited the scene sizes I could run and stay on the GPU. I have seen an average 4-5x reduction in render times on my scenes with the new card, and the card runs around 60 C during rendering, which is no problem for my system to handle. The only downside is the price of the card, but fortunately I had been saving some cash towards the upgrade for over a year and had the money when a card became available to buy.
I thought the rendering benchmark threads indicated the 3090 was around 2x faster than a 2080 Ti, so even upgrading from last gen RTX to current gen RTX should see speed improvements, although not as dramatic as it is going from earlier generation NVidia cards.
My experience in going from a 2080ti to a 3090 was that it generally halved the render times. I agree that not having to be concerned with running out of vram is great.
...exactly why the 3060 XC interested me, as it pretty much has the same specs as the Titan X, but with the addition of RT/Tensor compute cores and a faster VRAM and processor. This way that VRAM drain is no longer an issue (I was rendering my scenes in Daz 4.10, which predated the release of the RTX series, so I didn't have to deal with the ray tracing emulation taking up extra VRAM).
Still looking at a system upgrade (new CPU, MB, and system memory to make it "W11 ready").
The new Intel 12th generation CPUs (Alder Lake) are going to launch tomorrow, 11/4/2021, along with some newer generation hardware: PCIe 4 compliant PSUs to handle newer gen GPUs, and DDR5 memory at 5200 MHz and higher. That combination is supposed to give you an overall performance boost of 20% or more.
If you're in the market to build a new system, I would suggest waiting till all the data, availability, and pricing are confirmed or stabilized. You could future-proof your rig with the newest hardware, or you could take advantage of lowered sale prices on older hardware.
I'm not sure what kind of a speed increase you were expecting, but the 3090 has more than twice the number of CUDA cores (10,496) and faster memory (GDDR6X) compared to the 2080 Ti, so I would expect there to be a very noticeable difference in rendering times. If you're expecting blazing speed where everything renders in a few minutes, it's pretty much all about CUDA cores, in which case you would need a workstation/server class system that supports beyond 256 GB RAM and has a sufficient number of PCIe slots for several RTX graphics cards. If I were going this route, I would be filling the system with RTX A5000 cards instead, due to the 3000-series cards requiring higher wattages and generating a lot more heat. Of course, you could use 3000-series cards, but expect to spend a lot more for custom water-cooling and extra maintenance.

I should also point out that with 64 GB RAM and an RTX 3090, there will be times when you will run out of system RAM before you use up all of the VRAM. You should be fine, but if you ever intend to push VRAM usage, you should be aware of that.
As for the 4000-series cards, you're going to be waiting for quite some time, at least another year, and then waiting on compatibility updates for iray/Daz. Given the pattern, I don't want to think about the power requirements of these cards & the possible need to start running these systems plugged into a range or dryer outlet...
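For a rough sense of what that CUDA-core comparison means on paper, here's a back-of-envelope sketch. The core counts and boost clocks are the usual published specs, but treating "cores times clock" as a throughput proxy is purely my assumption; the roughly 2x real-world gains reported elsewhere in this thread come in lower because Turing and Ampere cores aren't directly comparable.

```python
# Back-of-envelope comparison of 2080 Ti vs 3090 raw compute.
# Core counts and boost clocks are published specs; "cores x clock" as a
# throughput proxy is a simplifying assumption, not how Iray actually scales.

cards = {
    "RTX 2080 Ti": {"cuda_cores": 4352, "boost_ghz": 1.545},
    "RTX 3090":    {"cuda_cores": 10496, "boost_ghz": 1.695},
}

def naive_throughput(card):
    """Crude proxy: CUDA cores times boost clock (GHz)."""
    return card["cuda_cores"] * card["boost_ghz"]

ratio = naive_throughput(cards["RTX 3090"]) / naive_throughput(cards["RTX 2080 Ti"])
print(f"Naive on-paper speedup: {ratio:.1f}x")  # ~2.6x on paper, ~2x in real renders
```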
Those are still for the 'consumer' platform, nothing to get excited about. HEDT versions may be coming in a year or so.
...as I understand it, DDR5 will be a tad on the expensive side. I'm still on DDR3, so moving to 4 is already an upgrade. From what was published in PC Magazine, 32 GB will cost about as much as 64 GB of DDR4. Not sure the boost in memory speed is really necessary to support GPU rendering, as the card does all the "heavy lifting".
Those Alder Lake CPUs are configured rather differently. Instead of the usual "n" cores = "n" x 2 threads, they involve something called "Performance" and "Efficient" cores, and the total thread count isn't exactly twice the core count anymore. For example, the i7-12700KF has 12 total cores but only 20 instead of 24 threads, and only supports two memory channels (even my old DDR3 system supports 3). They also seem as if they'll run on the hot side (the i7 above has a base TDP of 125W and 190W in turbo mode), so more expensive liquid cooling would be required.
Also wondering about that "backwards" move in memory channels to 2 with pretty much all the latest Intel CPUs (including some Xeons) as that would limit rather than enhance bandwidth. The forthcoming Zen 4 Threadrippers will be increasing bandwidth from 4 to 8 memory channels while Intel is going the other way.
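For what it's worth, the odd thread count mentioned above falls out of the published 12700K/KF core configuration: only the Performance cores are hyperthreaded, the Efficient cores are not. A quick sanity check:

```python
# i7-12700K/KF: 8 Performance cores (hyperthreaded) + 4 Efficient cores (single-threaded).
p_cores, e_cores = 8, 4
threads = p_cores * 2 + e_cores * 1   # 16 P-threads + 4 E-threads
print(f"{p_cores + e_cores} cores, {threads} threads")  # -> 12 cores, 20 threads
```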
Just wondering, where do you see the lowered prices of older hardware?
I am also excited about the new Intel CPUs, but a bit worried about their power requirements.
DDR5 prices are almost twice as much as DDR4, so I will probably skip it and use DDR4 instead.
RAM optimized for video adapters is called VRAM. These chips have two ports so that video data can be written to the chips at the same time as the video adapter reads the memory to refresh the monitor's current display.
RAM is utilized in the computer as a scratchpad, buffer, and main memory. It offers a fast operating speed, is popular for its compatibility, and has low power dissipation.
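If it helps, here's a small sketch for seeing the two memory pools side by side on your own machine. It assumes Python with the psutil package installed and an NVIDIA driver that provides nvidia-smi; nothing here is Daz-specific.

```python
# Show system RAM (used by the CPU) next to GPU VRAM (used by the graphics card).
import subprocess
import psutil  # pip install psutil

# System RAM, as seen by the operating system
ram = psutil.virtual_memory()
print(f"System RAM: {ram.total / 2**30:.1f} GB total, "
      f"{ram.available / 2**30:.1f} GB available")

# GPU VRAM, as reported by the NVIDIA driver
out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=name,memory.total,memory.used",
     "--format=csv,noheader"],
    text=True,
)
print("GPU VRAM:", out.strip())
```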
Totally agree, I recently got a 3060 and retired my 1070ti, and not having to worry about running out of VRAM is awesome. And I can finally use the dForce hair products I have, although having two in a scene can get interesting as they gobble up the VRAM.. lol And lastly, having scenes render super fast is great.. :)
And here I am still plodding along with my dual 2080ti rig. I'd love a 3090 as it would increase my VRAM and reduce my power consumption just a tad. But I've only once run out of VRAM and that was a scene without any humans in it.
It's funny, but when the term 3090 is mentioned, I always think first - hey, I know those... the IBM 3090 complex...
The funny bit is that the 3090 needed a 3089 Power Unit, which provided 3-phase AC power at 400 Hz (just as hungry as the Nvidia 3090? :) ) and the IBM 3097 power and coolant distribution unit (water cooling!)
Ah the good old days!
...hmm I'd need a bigger flat for that one.
But still, you would probably not have any problems with the place being too cold in the winter ;)
That was the first computer I was made to program on. Or maybe it was the IBM 3060 instead, I think. Punch cards, assembly language, JCL, paper printouts with the results, and then coming back in a day or two, after the job was scheduled and run, to see where we messed up. Good thing the problems were ones of tedium rather than anything actually difficult.
According to the rumours, Nvidia should be announcing the RTX 30 Super lineup in December and they should hit the stores in January... The 16GB 3070 sounds tempting.
Yeah, I'm strongly tempted to get one of those. I'd be upgrading from a 1060 card, which has 6 gigs of VRAM, so the improvement would be dramatic. I am wondering what the speed increase would be from my 1060 to that 3070 16 gig model. I'm surmising that a render that takes 2 hours now, and still comes out the other end a mite grainy, would possibly be completed in... what... 15 minutes? 25 minutes? Even if it reduced the render time down to half (60 minutes), it would be worth the money, but I'm expecting a more dramatic change than that.
I don't think the 16GB 3070 actually exists - A modder made one and it works, but NVidia is not going to make one. They apparently planned to make one, but they killed it off along with a 20GB 3080. The actual 3080Ti has only 12GB VRAM. I suspect the only way to get more VRAM on the card is to buy the A-series (Quadro replacement) cards.
The 3090 has 24GB, doesn't it?
Correct.
That's true, Richard, but that's part of my point. The 3090 is already out. It has 24GB. The 12GB 3060 is sort of a weirdo card. It has as much or more VRAM than all existing 30xx cards except the 3090, but it also has a reduced 192-bit memory bus compared to 256 bits for other 30xx series cards. I seriously doubt we're getting 16GB 3070 or 20GB 3080 UNLESS an updated/upgraded 3090 emerges that exceeds 24GB. That would be great, but the 30XX Ti versions we already received have far less than 24GB RAM. Why would the Super versions get more VRAM?
From what I can see, less than half of games can address or use more than 8GB of VRAM. So there isn't really much incentive to put big VRAM on the 3070 or 3080. And that brings us to the A-series. I think those have been discussed many times before. They're not really for gaming, they have 16-48 GB of VRAM, and they're more expensive (MSRP, if not street price) than their 30xx counterparts. If you really want more VRAM for extensive IRAY and PBR rendering, those may be the cards to get.
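On the bus-width point above, the rough math is just pins times per-pin data rate. The per-pin data rates below are the commonly quoted GDDR6/GDDR6X figures and should be treated as approximate:

```python
# Peak memory bandwidth: bus width (bits) x data rate (Gbps per pin) / 8 bits per byte.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(f"RTX 3060 (192-bit, 15 Gbps GDDR6):    {bandwidth_gbs(192, 15):.0f} GB/s")    # ~360
print(f"RTX 3070 (256-bit, 14 Gbps GDDR6):    {bandwidth_gbs(256, 14):.0f} GB/s")    # ~448
print(f"RTX 3090 (384-bit, 19.5 Gbps GDDR6X): {bandwidth_gbs(384, 19.5):.0f} GB/s")  # ~936
```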
Just a note about the physical size of the 3090 - I don't know if anyone else has had my experience.
I have a 3090 from PNY and it is a beast with triple fans. I have had problems when I move my desktop slightly or even just disturb the cables at the back... these movements cause the screen to go black. I have opened the case and there is a lot of weight pulling on the edge connectors for that card. I have now found a little plastic tube which I have placed under one edge of the card, with the bottom of the tube resting on a solid part of the case interior. This gives a little more support to the GPU and, so far, no black screens.
Amazon just sent me an email suggesting I buy a 3090, due to my search history. A steal of a deal at 15 grand...
Do you mean GPU sag? There is this free fix from JayzTwoCents, which seems pretty cool -
https://www.youtube.com/watch?v=liChy76nQJ4
My 3090, the EVGA FTW3 Ultra, is actually a bit smaller in terms of width and depth than my old 1080TI FTW3...it's just taller (3-slot card). (In the realm of things, it's really not a lot smaller, we're talking millimeters.) I have it mounted vertically, however, which alleviates any issues with size. It also takes away a lot of the tension you'd normally see with the cables. I have a Lian Li o11D case and am using the LINKUP vertical mount. I also replaced the rigid cables that come with the card for the softer and more flexible AsiaHorse extenders.
I jumped from the notorious GTX 970 (you know, the 3.5 GB VRAM advertised as 4GB) to the RTX 3090... I had to sell one kidney and one lung but I'm happy with my purchase. Also, I got a Gigabyte, so I had to remove the door from my apartment so it could fit in. :P