Video card (and other) recommendations needed, new desktop coming up
My old computer (2012! When I started using DS and joined this place!) is finally reaching the point where it's not any fun anymore; software requirements have gone up and I don't see a point in trying to squish more performance out of it, so I'm getting a new computer.
Since my primary use will be 3D work (Daz Studio, Marvelous Designer, 3D Coat, Hexagon; possibly Blender too), followed by Photoshop, streaming in a browser and playing a couple of browser games, I need a tiny bit of help.
Current specs planned:
- AMD, some higher end Ryzen + AMD MB (these details I will squeeze out of my regular computer crack dealer, but if there are any known issues with these things, let me know.)
- Couple of RAID1'd HDs (possibly like... 4 beefy bois); not SSDs, tho.
- 64 GB RAM (will this be enough these days? I've usually managed OK with 32 GB, but recently DS has started to slurp me dry, down to 200 MB or so free while rendering.)
- Various usual stuff like a Blu-ray drive, plus transferring the old DVD-RW over from the older computer; it still works great.
- Since I'm not some kind of hifi sound maniac, sound card embedded on MB is enough.
- The estimated lifespan of the system is 6 years, but seeing how well mine has performed over the last 11, 10 years is not out of the question.
- Windows 10 or 11, of the most "user is allowed to change everything as much as they want" variety.
- New monitor, single.
And finally, some sort of NVidia 30- or 40-series RTX card.
- Start with this assumption when you recommend stuff to me:
- I intend to use this card for the next five years or so before I swap to a new one.
- I also tend to render scenes that have a lot of non-instanced stuff (i.e. loads of props packed to a point of view), with intense texturing and/or fur or similar effects.
- My preferred light is from HDRis.
- My budget for the card is roughly ~1500-2000 €/$. (The rest of the computer has its own budget.)
- Obviously it needs a lot of VRAM. But how much? More than 16 GB?
- Recommendations for CUDA cores?
- Recommendations for *handwave* anything?
Note that I don't do much gaming on my desktop, and whatever gaming I do isn't heavy (DOSBox games at this point, LOL), so I don't need to worry about game compatibility or such. Daz Studio + Firefox + the other listed things are my primary uses.
My most awesome thanks, my fellow nerds! Sorry that I haven't been around much! (Edited for further details.)
Comments
I think your budget is a bit low for what you want; I'm assuming that you are amongst us on the European side of the pond.
If I were to start collecting components for a new rig, 1500-2000 eur would be gone after getting just the motherboard, processor and RAM.
Oh no no no, 1500-2000 is for the GPU alone. The rest has its own budget. And yes, this would be in Finland, mooching off my fave computer crack dealer, who gives me a bit of a discount every time I show up since I've referred so many clients their way.
Ok, for what you describe the RTX 3090 would be perfect, but they have become quite hard to find... Verkkokauppa (dot) com still lists one in their outlet at 1395 eur, but they don't tell the location and it can't be ordered online. The RTX 4090 is priced at 2360-2550 eur. The only manufacturer I would accept at these prices is Asus.
Yep, the problem is just that I know I'd like an RTX 4090, but the price is ridic and availability is bad; my contact can order hardware from big importers, so I have some room to ask around. I'm mostly interested in user experiences re: VRAM and the need for CUDA cores, since the more you need, the higher the price goes. I'm trying to find some sort of balance in order to have a non-frustrating rendering experience. (I have an ASUS 980 Ti right now, 6 GB VRAM; these days it runs out of it a whole lot of the time.)
I upgraded my RTX 2070 Super 8GB to an RTX 3060 12GB last summer because the 8 GB of VRAM started getting too little; haven't run into problems so far with 12 GB. The attached pic was rendered with Iray in 3 minutes 20 seconds and took about 4 GB of VRAM, which would be pushing the limits of an 8 GB card on W10.
The next step above the RTX 3060 12GB would be the RTX 3080, which would give faster speed, but when selecting the GPU, the amount of VRAM is by far the most important thing: if the scene doesn't fit in VRAM, the GPU is just as useful as any generic display adapter that you can buy with pocket lint.
There is just one benchmark for Iray rendering and it's here;
https://www.daz3d.com/forums/discussion/341041/daz-studio-iray-rendering-hardware-benchmarking/p1
The newer GPUs can be found towards the end.
Hey, AWESOME! Thanks!
Actually I myself am looking at building a new machine this year. I want it to primarily be for gaming but also use it for rendering.
Given I am currently using Intel and have no issues, are there any hiccups to expect when switching to AMD processors with DAZ?
Any noticeable DAZ difference between X3D and standard X processors (the 5800X3D vs 5800X is the only comparison available now, but I am waiting for the new X3D release for the Ryzen 7000 series)?
Well I can tell you having 24 GB of VRAM is awesome. If you want something to last you 6+ years given your budget, I'd say go for that.
I recently picked up the MSI Suprim 4090, water cooled. In Daz, from the few scenes I tested with, it's almost as fast as two 3090s. If you can swing a 4090, go for it. Otherwise look around more for a 3090; they seem to still be pretty common here in the US. Maybe someone will ship to you?
Too bad you're in Europe, I still haven't sold the 3090 I pulled from my system.
All right. Been to couple of places in my city, asked for some build drafts and price checks; one (my regular) place might have RTX 3090 in their claws.
Definitely prioritize VRAM. I went from a 2070 super to a 3060 and the extra 4gb makes a huge difference
In practice, going from 8 GB to 12 GB doubles the available VRAM for Iray rendering, because of the baseload from W10, DS, the scene setup, and the 'working space' needed by the render process.
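The arithmetic behind that "doubling" claim can be sketched in a few lines. The ~4 GB baseload figure below is just an illustrative assumption (roughly matching the W10 + DS + working-space overheads mentioned in this thread), not a measured constant:

```python
# Why 8 GB -> 12 GB roughly doubles *usable* VRAM for Iray.
# BASELOAD_GB is an assumed, illustrative figure: the VRAM eaten by the
# Windows desktop, Daz Studio itself, and the renderer's working space
# before any of your scene fits in.

BASELOAD_GB = 4.0  # assumption for illustration only


def usable_vram(card_gb: float, baseload_gb: float = BASELOAD_GB) -> float:
    """VRAM left over for scene geometry and textures."""
    return max(card_gb - baseload_gb, 0.0)


for card in (8, 12, 24):
    print(f"{card} GB card -> ~{usable_vram(card):.0f} GB usable for the scene")
# 8 GB leaves ~4 GB, 12 GB leaves ~8 GB: twice the room for the same baseload.
```

The exact baseload varies per system, but the shape of the argument holds: a fixed overhead means each extra GB of card memory goes entirely to the scene.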
...from what I understand, Windows 11 Professional does give the user more ability to dump features that are not wanted or needed, but only the Professional edition allows this.
W10 also just had its last full update before it goes EOL two years from now.
I currently have an upgrade planned to move to W11, as I'm still on 7, which has pretty much been abandoned by most software developers (including Nvidia; I can't even use the last two beta releases of 4.21, as they require a driver that doesn't support W7), as well as Google.
I am looking at a 5900X, 64 GB of memory (currently I only have 24 - yeah, that's how old my rig is) and a new MB to support it. I plan to use the same drives, PSU, and case. I already have a 3060, but discovered my MB's old BIOS isn't compatible with it, so it's just sitting in the box.
The fun part will be trying to come up with the $939 (for all the parts, including a beefier CPU cooler and a W11 Pro OEM license) as I'm on a meagre fixed pension.
Yeah, it isn't cheap in any way, but like I said, I'm planning this as next 6 to 10 year investment, with added caution of "Microsoft's OS shenanigans" (followed by "OK, I'll use Linux and WINE then") and other potential developments. In my case, there was a really lucky happening in online casino (clearly my grandmother gave her uncanny luck to my Significant Otter as her heritage to him), and so I'm allowed to splurge a bit, seeing how the old system is kiiiinda coming apart.
...nice.
I occasionally play our state's Megabucks and actually would be happy with just 5 out of 6, as that would be sufficient (around $2,500) and doesn't exceed the $5,000 limit that would require paying federal taxes (only 8% state withholding would be taken, which is about $200).
How restricted is your VRAM now? How large do you want to go? It's a difficult question to answer because everybody uses this stuff differently. While it is true assets are using more data, the main thing comes down to how many characters you want in any scene, and how large your renders are. Rendering the same scene in 4K uses a lot more VRAM than, say, 1080p.
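The 4K vs 1080p point is easy to see with back-of-envelope arithmetic on the framebuffers alone. The bytes-per-pixel and buffer-count values below are assumptions for illustration (RGBA float32 plus a handful of auxiliary buffers), not actual Iray internals:

```python
# Rough framebuffer cost at different render sizes.
# Both constants are illustrative assumptions, NOT Iray's real numbers:
BYTES_PER_PIXEL = 16  # assumed RGBA at float32 precision
NUM_BUFFERS = 4       # assumed beauty + depth + denoiser aux buffers, etc.


def framebuffer_mb(width: int, height: int) -> float:
    """Approximate framebuffer memory in MB for a given render size."""
    return width * height * BYTES_PER_PIXEL * NUM_BUFFERS / 2**20


print(f"1080p: {framebuffer_mb(1920, 1080):7.1f} MB")
print(f"4K:    {framebuffer_mb(3840, 2160):7.1f} MB")
# 4K has 4x the pixels of 1080p, so every per-pixel buffer costs 4x the memory.
```

Framebuffers are only part of a scene's footprint (textures and geometry usually dominate), but they scale with render resolution, which is why the same scene can fit at 1080p and spill out of VRAM at 4K.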
Obviously, having more VRAM is almost always better, as it gives you options. You can be lazy with scene optimization; I admit I have become less inclined to optimize stuff now that I have a 3090. I have never run out, but I am not trying to recreate battles from Lord of the Rings, either.
The problem is that Nvidia has been real stingy with VRAM. There are only a few consumer choices with more than 12GB: the 3090, 4080, and 4090. That's it. At 12GB you have the 3060, a 3080 12GB variant, the 3080ti, and the new 4070ti.
That is a short list. I would not go for any 8GB card at this time. The 3060 is easily the cheapest option to get 12GB. It is the slowest. But it is still MUCH faster than your 980ti. The 3060 is going to be roughly 3 to 4 times faster than that. So still a huge upgrade if money is tight. A 3080ti would be a fair bit faster. The 3090 is faster still, but not drastically faster than the 3080ti. The main reason to stretch for a 3090 over a 3080 is obvious, it has 24GB of VRAM. If you don't need that VRAM, then you don't need the 3090.
The 4070ti seems to be about as fast as the 3080ti, so the price of each could be a factor. The 4080 is again, faster and this time has 16GB of VRAM. While the 3090 is not that much faster than a 3080/ti, the 4090 does have a larger gap between it and the 4080, plus it has 24GB.
Another thing to consider is physical size. The 4080 and 4090 are some of the largest GPUs ever built. You WILL need to verify they will fit in your case! The 3090 is also pretty darn big, too, but not quite as big as these 2. This is not a joke. The 4080 and 4090 can be 4 whole slots wide. They are ridiculous. Since you are talking about having bluray drives and stuff, this could be a real issue.
If your PC allows it (power and space), you can also render with multiple cards. As long as they each have enough VRAM. Right now I have a 3060 and a 3090, so as long as the scene is less than 12GB then both cards will run. This is an option, too. You could buy a 3060 on the cheap (if we can consider them cheap) and buy something else later to add to it. Like I said, it is still huge upgrade compared to what you have now. There have been times when I ran out of VRAM on the 3060 and the 3090 ran alone. But not a lot, and I might have been able to optimize it better. But since the 3090 was still going, I didn't bother.
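The "each card needs enough VRAM" rule from the paragraph above can be sketched as a few lines of logic: with multiple cards, VRAM is not pooled (outside of NVLink setups), so each card either holds the whole scene or drops out of the render. The card names and sizes are just example values:

```python
# Sketch of Iray's multi-GPU behavior as described above: each card must
# fit the ENTIRE scene on its own; cards that can't simply stop rendering
# while the bigger ones carry on. VRAM is not pooled across cards here.

def rendering_cards(cards: dict[str, int], scene_gb: int) -> list[str]:
    """Return the cards that can hold the full scene and thus join the render."""
    return [name for name, vram in cards.items() if vram >= scene_gb]


# Example rig matching the post above: a 3060 (12 GB) plus a 3090 (24 GB).
my_rig = {"RTX 3060": 12, "RTX 3090": 24}

print(rendering_cards(my_rig, 10))  # scene fits both -> both cards render
print(rendering_cards(my_rig, 16))  # too big for the 3060 -> 3090 renders alone
```

This is why a cheap second card still pays off for most scenes, and why the render doesn't fail outright when a scene outgrows the smaller card.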
The GPU market is kind of crashing right now. So prices might improve over the spring and summer. But that also could mean the 3000 series is totally sold out. I don't know how the European market is.
...the only 16 GB GPU available from Nvidia is the RTX A4000, which is about $950 to $1,200 depending on where it's purchased.
European market is really squished with 3090s, slim pickings. But the cases I've been offered by those two stores are measured specifically to house biiiig GPUs of these days. As I once quipped...
...love it.
No, the RTX 4080 does as well.
...that's news. Didn't think they'd ever do a 16 GB card outside of their Pro (formerly Quadro) line (the Turing-generation Quadro RTX 5000 had 16 GB).
Still a triple-slot card with a high TDP (450 W). If I had the funds I'd probably opt for an RTX A4000 (a single-width card), as I don't bother with games. (Its TDP is 140 W, which is 20 W lower than my old 1 GB GTX 460.)
Example configuration:
something like the below or similar/better (UK pricing from a few weeks ago):
Total: ~£1450 (I have taken middle prices from idealo, not the cheapest ones - a few weeks ago)
CPU fan: ~£50
(a high-end model like the Noctua NH-D15 would be ~£100)
plus case, case fans etc.
Notes:
CPU:
The i7-13700KF is extremely fast, has 16 cores and offers max. 24 threads. I use it in my new build and can say it is a lot faster in daily operation than the AMD Threadripper 3960X I used before. Note that the KF variant is the one without the built-in GPU and is a bit cheaper; if you want the integrated graphics, go with the i7-13700K (no F). Alternatively, an Intel i7-13600K or KF is a bit slower and has a few cores less, but is cheaper.
Regarding the built-in GPU: if you attach the monitor to it and use a current Windows 10 build, you save a few hundred MB of VRAM. The latest Windows 10 iterations don't reserve as much VRAM in general anymore, and on top of that they reserve even less if no display is attached to the GPU.
Mainboard:
It is a DDR4 board, not DDR5, and has one 2.5 Gbit LAN port. If you need two LAN ports or 10 Gbit speed, expect higher board prices (as I wanted a 10 Gbit connection to my NAS, I simply bought a 10 Gbit LAN card for 87 € and put it in a free PCIe slot - this was way cheaper than going for a board with 10 Gbit built in).
DDR4 RAM is expected to be phased out over the next few years. This might become relevant for your 6-year time frame. If you go with DDR4 (Ryzen 5000 is also DDR4), buy the needed RAM in a timely manner so you don't get hit by limited supply in the future. Alternatively go with the Ryzen 7000 series, or buy an Intel board with DDR5 support (e.g. MSI PRO Z790-A WIFI DDR5). But board and RAM will be more expensive.
Regarding the Intel LGA-1700 socket: rumors say the 13th gen of Intel CPUs is the last one to use LGA-1700. It is expected that the next gen of Intel CPUs will require a new mainboard with a new socket.
AMD: same situation. The Ryzen 5000 series uses socket AM4; the new Ryzen 7000 series uses socket AM5. So if you buy a Ryzen 5000 CPU now, you can't just swap in a Ryzen 7000 in the future - you need to replace the mainboard as well.
If you prefer a simple upgrade path (just replace the CPU), it might be better to go with Ryzen 7000 rather than Intel 13th gen or Ryzen 5000. Ryzen 7000 uses the AM5 socket, and AMD wrote that they will support AM5 with new CPUs at least until 2025. The disadvantage is that Ryzen 7000 needs the more expensive DDR5 RAM, and AM5 mainboards are also more expensive than the older DDR4 boards with socket AM4.
RAM:
min. 32 GB, better 64 GB (and see the mainboard notes regarding DDR4 vs DDR5).
Storage:
If you want to use it for many years, you might want to read about long-term-use experiences before buying. I personally have good experience with Samsung and Crucial SSDs and M.2 drives, but maybe that was just luck.
GPU:
The RTX 3060 12 GB is considered the sweet spot of processing speed, VRAM and price these days. Alternatively there is the RTX 3060 Ti; it is faster but has less VRAM and costs more. When in doubt, I suggest always going for more VRAM instead of more speed. Or better: try to get a used 3090 with 24 GB of VRAM (I use this one; it is hellishly fast and has plenty of VRAM - good for DS and Stable Diffusion).
Or wait for the lower-end RTX 4xxx GPUs. The 4090 would be nice, but in my opinion the price is just too high. For that money one can get two used 3090s and combine them via NVLink. This way you get similar speed but a combined 48 GB for textures. (For two 3090s, get a good PSU.)
PSU:
Don't go for the cheapest models. The simplest models tend to fail earlier than higher-end products, and they can struggle with power demand peaks from the PC (the CPU and GPU can request way higher wattages from the PSU than their official TDP values suggest - those peaks only last a few milliseconds, but very simple PSUs might cause a PC shutdown or blue screen because of them - it is rare, but it happens). Some vendors offer extended warranty on their higher-end models; e.g. I got a 10-year warranty on a Corsair AX1600i PSU (it offers 1600 watt, which is fine for multi-GPU configurations - but it is expensive - I got it during a sale).
Power consumption and cooling:
The 13700KF at default settings in my board and at high load (Cinebench R23 benchmark) consumed nearly 250 watt. That is a lot of heat to be handled by the CPU cooler. But those CPUs (Intel and AMD) and mainboards are set to very high voltage values at stock. In my case I just went into the MSI BIOS and lowered the "CPU Lite Load" setting. As a result the CPU is now at 194 watt under Cinebench R23 with no reduction in speed or stability. Temps went down from around 90 °C to around 75 °C (watercooled system).
The RTX 3090 is similar in this regard. Default was 330 watt in Iray rendering at around 1950 MHz clock speed. I lowered the clock curve to a max of 1800 MHz and reduced the allowed voltage quite a bit (this can simply be done with the MSI Afterburner tool). Result: around 330 watt before, 245 watt after. Speed-wise it does not make a relevant difference in DS.
I went with watercooling to not run into cooling / throttling issues. The system can render for endless hours without any heat issue on GPU, CPU or other components. Temps don't reach critical values which hopefully helps for components to last a few years.
If you go the Intel route with the LGA-1700 socket and build the system yourself, google "LGA 1700 bending issue" before doing so. A lot of overheating, throttling and bluescreen issues have to do with it (it's about a bend of the CPU when attached to the board, so that the cooling plate does not have good contact with the CPU die - this causes bad heat transfer to the cooler and results in overheating of those CPU areas that are not well covered).
Awesome! Thank you for the tips; I will compare this to the setup offered by one of the comp stores (it does have DDR5, on account of higher-end Ryzen needing it). The other store hasn't sent me their offered setup yet, so good to know! I knew I could trust the fellow nerds in this place!
There has been some talk about stability problems with DDR5 when getting into amounts of RAM that DS would be comfortable with (64+ GB).
Well that's a yikes.
There's always a chance of something going wrong. That is just how it is. My first attempt at a Ryzen build actually failed; the thing would not boot, it was dead. I had to exchange the 5800X for another, and that one worked. But most people would say the 5800X is fantastic. Sometimes you just get unlucky. Of course you want to minimize the risks that you can control, and some things might break more than others, but even the most rock-solid products still have a failure rate.
You are doing your research, and that's a good thing.
I'm Ye Olde Tyme Nerd.
Been putting PCs together since 1994; using computers since 1985. Married to someone who has done it even longer. Building things to last and recycling stuff for other uses is a big thing here; my old desktop is most likely to see some six years+ of use as a Linux server at home.
Same here, but without the PC-enthusiast companion; never found one, so didn't bother.
With that knowledge, did you check Intel X299 and what it can offer? It's an older chipset, but in my opinion, it's still a serious contender.
My PC enthusiast companion is my son... we started together when he was nine and now he's 40 :) We spent this afternoon installing SSDs into my case after the dock they lived in threw a wobbly.
...the one caveat with the new Intel i7 and i9 CPUs is that they have a "split core" setup. The i7-13700KF has 8 "Performance" and 8 "Efficiency" cores, the latter of which are single-threaded instead of hyperthreaded (these cores also have a lower base clock speed). So instead of 32 threads for rendering, you have 24 (the same as the Ryzen 5900X has), both at slower speeds (base clock for the E cores is 2.5 GHz and for the P cores 3.4 GHz; the 5900X has a base clock of 3.7 GHz on all 12 cores/24 threads). This architecture seems more suitable for portable devices like notebooks and tablets where battery charge is an issue.
Granted, for now Daz is only single-threaded when loading and setting up scenes, but the thinking is that 5.x may use all CPU cores, which should translate to faster load times and better non-rendering performance (like dForce sims). Xeon CPUs are straight hyperthreaded processors, but those are above the budgets of most of us here (the "Ice Lake" Xeon W-3375 workstation processor, launched in Q3 of 2021, has 38 cores and 76 threads but costs a whopping $5,000). The highest core/thread count in the most recent Xeon series (Rocket Lake) tops out at 8c/16t.
I have to skip Intel for various reasons, biggest one being "if one of the Linux servers at home goes kaboom, I can pull out the HD and plug it into a desktop on temporary basis without too much screaming", since all systems are AMD here. Plus, I can carry a grudge. Like 25+ years, easy.