Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2024 Daz Productions Inc. All Rights Reserved.
Comments
dup
Personally, I have a Ryzen 9 3900 / 32GB RAM / Asus TUF Gaming X570 motherboard / 750W Platinum PSU. I am still using a GTX 1060 6GB card, as I had hoped to upgrade to a 30x0, but prices haven't been reasonable until now...
But egad, the 40x0 is so close, and is expected to be plentiful and reasonable in cost... sigh.
I don't game much (and when I do, it's probably World of Tanks for half an hour or so)...
My plan was to go all in on a Ryzen 7000... but I am on the fence about it now. Inflation is real, and since I am living in Europe at the behest of the USG, I am hesitant to pull the trigger on a new system this fall as Europe is bracing for a very expensive winter as far as heating goes....
I am seeing 12GB 3080 cards in the $700 range, and while that might be a good deal, I am betting that the 4070 cards will be priced lower, with more performance... sigh, it's just that the 4070s probably won't come out until Jan/Feb...
The prudent thing might be to stick with my current computer, upgrade to 64GB of RAM, and wait for the 30x0 cards to drop a bit more, which should happen as soon as the 40x0 cards are announced, then just get the best 30x0 card with the most VRAM for the buck.
Is the forum acting up once more, such that an attempt to post a message acts like it failed, so you click Post on that message again? 0o
Anyway, I'm also likely waiting for the 4080, or maybe the 4070. Originally I was going for the 4070, but it sounds like that might not show up until January or so, while the 4080 will get here a month or two earlier. It also sounds like the 4080 will only be a bit over a couple hundred bucks above the 4070, and will render faster than the 3090... and I'm tired of 2-hour renders on my 1060... so... faster is better...
My only concern is... my current motherboard tops out at 32GB of RAM, and that's what I have in it. Is that fine with 16GB of VRAM? I'd originally been under the impression that you needed twice as much system RAM as you have VRAM, but more recent posts I've seen have recommended three times your VRAM in system RAM.
I would say one needs at least 3 times the VRAM in system RAM.
Attached is a test I made some time ago to see how much RAM and VRAM were used while rendering in Iray:
Case a) just one lightweight G8 figure with lightweight clothing and hair
Case b) four similar G8 characters with architecture
Cases c) and d) started increasing SubD on the characters to see at which point the rendering would drop to CPU
"RAM/GB" and "VRAM/MB" taken from GPU-Z, "DS Log/GiB" taken from DS Log, no other programs were running but DS and GPU-Z
The "DS Log/GiB" is the sum of Geometry usage, Texture usage and Working Space - After Geometry and Textures, there should still be at least a Gigabyte of VRAM available for the Working space => In my case, Geometry + Textures should not exceed 4.7GiB
Note: Case c) was already using 38GB of RAM even though the rendering was done on the GPU; in Case d), rendering on the CPU, the RAM usage almost went over my 64GB
Tests made using an RTX 2070 Super (8GB), an i7-5820K, and 64GB of RAM on W7 Ultimate and DS 4.15
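To make the budgeting above concrete, here's a minimal sketch in Python. The function name and the 2.3 GiB "overhead" allowance are my own hypothetical numbers, chosen so the budget works out to the ~4.7 GiB figure from this test on an 8GB card; they aren't anything DS or Iray reports, so tune them for your own hardware:

```python
# Rough sketch of the VRAM budgeting described above: keep ~1 GiB free
# for Iray's working space, plus an allowance for driver/display
# overhead, and whatever is left is the geometry + texture budget.

def fits_on_gpu(geometry_gib, textures_gib, vram_total_gib=8.0,
                working_space_gib=1.0, overhead_gib=2.3):
    """Return True if the scene should stay on the GPU.

    overhead_gib is a hypothetical allowance for OS/driver and display
    buffers; on an 8 GB card these defaults leave a 4.7 GiB budget.
    """
    budget = vram_total_gib - working_space_gib - overhead_gib
    return geometry_gib + textures_gib <= budget

print(fits_on_gpu(2.0, 2.5))  # prints True  (4.5 GiB fits the 4.7 budget)
print(fits_on_gpu(3.0, 3.0))  # prints False (6.0 GiB would drop to CPU)
```

On a 24GB card the same arithmetic leaves a far larger budget, which is why big scenes rarely fall back to CPU there.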
Very much agree here; that, and larger SSDs are still expensive compared to an equivalently sized HDD.
Hi All,
I am finally getting around to getting a new computer. I really appreciated all the input you gave to me previously. Life threw me some curve balls that set me back but I am hoping to order this week. Here are the specs and thank you in advance.
Get the Pro version of the operating system
...indeed, Pro gives you more options than the Home edition for a small increase in price, particularly if it is an OEM license.
The downside with a prebuilt, particularly from a "big name" company like HP, is that you may find your Windows 11 set to "S mode", which limits you to purchasing software through the Windows Store. The other issue is that many prebuilts come bundled with bloatware. I'd definitely avoid activating the McAfee demo and get better antivirus software instead. I had McAfee years ago and it was junk.
My setup has a small boot drive (240 GB) and a 2 TB content library drive (both SSDs) as well as a secondary 2 TB HDD for storage and a 4 TB HDD for backup and archive purposes.
It looks like a good system but it could be tweaked.
I'd go 4090 instead of 3090, faster and quieter. Unless you're getting a really good deal on the 3090 as they're a couple of years old now. Where I am the few 3090s that you can find retail cost as much as a 4090.
A 13900k is a lot of CPU; you could drop down to a 13700k or even a 13600k and put the rest towards a 4090. It depends what other programs or games you're going to run; look at some benchmarks.
2 x 32GB of RAM would be better than 4 x 16GB as it leaves room to expand; that said, I do fine with 64GB. 5200MHz is on the slower end now; faster RAM doesn't cost much more and can make a difference, though probably not for Studio. About 6000MHz is reasonable for price/speed, but DDR5 prices are still dropping and you can always swap it at some point in the future.
The power supply is probably a typo; you need a decent 850W for a high-end system as a minimum.
I've got two systems: a 12400 with 64GB of 3200 DDR4 and a 3090, and a 13700k with 64GB of 6600 DDR5 and a 4090. Both have good quality 850W PSUs, are boringly stable, and are great to use for more or less anything.
I used to have a spare computer for emergencies. In the '70s I kept it on top of the "big-iron" machines where I worked. If one gets desperate and can live without graphic output, it might be possible to find one of these https://en.wikipedia.org/wiki/Digi-Comp_I
for unexpected pesky 3-bit problems.
...for my upgrade I'm starting with 2 x 32GB DDR4 3200, to eventually expand to 128GB for when I can get my hands on a 24GB GPU.
Why do you like the Pro version? Please and thank you.
I don't, when it comes to W10/W11, but with the Home version you are at the mercy of Microsoft.
If you have the Pro version, at least you have some means to say "This is MY computer"
Why do you like the Pro version, and what do you use from it that makes it worthwhile? I have been buying HPs for many years and have had no real problems. There isn't much bloatware anymore, not like in the old days and definitely not like on most cellphones. I always uninstall McAfee right away; I don't like it either. I do buy the additional warranty just in case, and nowadays you get it refunded if you don't use it, though I am not exactly sure how that works. Thank you for your comments; I really appreciate the input.
I really wanted a 10 or 12TB secondary drive and am still trying to get that accomplished, but if 2TB is all they will give me, then I already use many externals and will stick with those.
Sounds like a great plan!
For Iray the GPU has to be able to run CUDA, which no AMD GPU can. The iGPUs are still AMD graphics, so they can only run renderers that don't use CUDA. Tensor cores are also an Nvidia-only thing. AMD has its own AI cores in its latest RDNA3 chips; however, they don't do anything yet, lol. Seriously, the hardware is there, but they have no software for it.
If Nvidia ever wanted to: the 4060 Ti has a 187 mm² die, while the PS5 has a 308 mm² die, so there is easily enough room to stash a 4060 Ti into an APU if they wanted to. There are some pretty sweet APUs coming in 2024, but none will have CUDA, as they are AMD-based. Intel might even have some nice APUs coming, but again, no CUDA.
This, as too many people buy PSUs that sit on the borderline of the CPU's/GPU's power draw without considering the accumulated draw of the entire system (HDDs, USB, motherboard, etc.), not just the CPU/GPU, not to mention power spikes and the like. It is much better to get a 1kW or even a 1.2kW unit with future upgrades in mind; most reputable companies offer 10-12 year warranties, so these PSUs are made to last through several upgrades!
I always put an ample power supply in my machines so I always have power for upgrades. I got a 1kW unit. With my components, I needed 800-899 watts just for my system hardware; I'll have no choice but to upgrade my PSU for the newer cards from Nvidia.
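As a back-of-the-envelope sketch of that sizing logic (the wattages and the 30% spike margin below are illustrative guesses, not measurements from any specific build):

```python
# Sum every component's steady draw, then add headroom for the
# transient spikes mentioned above before picking a PSU wattage.

def recommended_psu(component_watts, spike_margin=0.3):
    """Return (steady draw, draw plus a margin for transient spikes)."""
    total = sum(component_watts.values())
    return total, total * (1 + spike_margin)

parts = {                 # hypothetical high-end build
    "cpu": 250,           # high-end desktop chip under load
    "gpu": 450,           # 4090-class card
    "motherboard": 60,
    "drives_fans_usb": 60,
}
total, with_margin = recommended_psu(parts)
print(f"steady draw ~{total} W, buy at least {with_margin:.0f} W")
```

With those guesses the steady draw lands around 820 W and the margin pushes the recommendation past 1kW, which matches the advice above.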
Right now I render with CPUs, so I am used to using a lot of CPU power. I usually buy a computer and stick with it for as long as I think I can, so I try to buy up front and not do much to it later. Have you been having problems with the 4090? It's $250 to go from the 3090 24GB to the 4090 24GB. I was told that the 4090s have been problematic and not ready for prime time, so your input would be appreciated.
As far as the RAM goes, you don't get to pick anything but the amount. And if I upgrade to the 4090, I believe they put in a 1250W PSU. Options are very limited.
Thank you.
You're welcome.
If you get a system with a 3090 or 4090 you won't need to render on the CPU unless you run out of VRAM. I've only managed to do that once, with a large scene with lots of glass when I tried enabling caustics. The GPU is many times faster than even the best CPU for Iray renders. If I render on the GPU only, the render speed is about the same as GPU plus CPU, but there are enough CPU resources left to browse the web, watch videos, or even run other programs. From my brief testing, the 4090 renders between 30 and 40 times quicker than my 13700k.
I've not had any significant issues with the 4090 in daz studio, or in other programs or games. Using the ds beta which has better 40 series support it renders about 70% quicker than the 3090 while using a similar amount of power. 4090s have larger coolers and all the vram is on the cooler side of the board rather than split between the front and back like the 3090 so they run cooler and quieter.
If you're not going to render on the CPU, then dropping from a 13900k to a 13700k will save about $150, which could go towards the $250 extra for a 4090. For Studio, single-threaded performance is important, and the 13700k is pretty close in the real world to the 13900k. It's also an easier chip to cool and therefore less likely to throttle down due to temps than the 13900k. If you run programs which will max out all 24 cores and 32 threads then the 13900k will be better, but the 13700k is still a modern 16-core/24-thread CPU, and by the time it begins to struggle with anything, the 13900k won't be far behind.
Whichever way you configure the new machine I think you'll be impressed with it.
Once you render with GPU you will never, ever, ever want to go back to CPU rendering.
One of the fastest CPUs tested in our benchmark scene is the AMD 3970X, a 32-core 64-thread beast that costs $2000. That CPU ran the benchmark in 11.5 minutes, doing 2.6 iterations per second. It may be a generation older than the 5950x, but it easily beats the 5950x in other rendering engines like Vray: the 3970x scored 40000 versus the 5950x's 25000. That is a pretty wide gap. So I don't know how a 5950x will fare in Daz Iray, but I doubt it will be close to 2.6 iterations per second. I don't think any desktop CPUs break 2 iterations. Maybe a 13900k could. Maybe.
Meanwhile, in the same benchmark test the 4090 finishes in around 60-70 SECONDS. The iteration rates are around 26-28 per second. The 4090 is more than 10 times faster than a $2000 CPU. The difference is just crazy. You will be able to render 10 times as many works compared to a $2000 CPU.
With 24GB of VRAM, you have to be making some pretty big scenes to exceed the VRAM buffer. Considering that your original plan was a 64GB RAM model, I doubt most of your scenes will exceed the 4090's VRAM capacity. I could be wrong, it depends on compression, but I think the 4090 will do you well.
Another perk of GPU rendering is that you can leave your CPU free, so you can easily use your PC for other things. I can be rendering a scene while building a scene in another instance of DS, no problem. This does use memory, so you could hit the VRAM cap this way if the scene is heavy, but in general there is enough VRAM to run other software. With such fast rendering, your render may be finished before you get your scene ready.
...possibly the 96-core Threadripper 7000 "Storm Peak" series could do it, but then you're talking about what will likely be a very pricey CPU; for that money you could get a 48GB RTX A6000 (Ampere), pretty much never have to worry about running out of VRAM, and still have some change in your pocket.
The 64-core 5995wx scored 60000 in Vray, roughly 1.5x the 3970x's 40000. So that would get us to about 3.9 iterations per second if the performance gap is comparable in Iray. That might be around RTX 2080 level. But most of the Ampere cards are faster than that, even the low-end ones, and no Lovelace card is close to that slow. To get that speed you need $5700, and that is just for the chip. Threadripper motherboards also cost a lot more, as does the memory.
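A sketch of that extrapolation, using the 3970X's numbers quoted earlier (40000 in Vray, 2.6 iterations per second) as the reference point. The helper is hypothetical, and it assumes Vray scaling carries over to Iray, which is a rough approximation at best:

```python
# Hypothetical helper: scale a CPU's Vray score against the 3970X's
# measured Iray rate to guess its Iray iterations per second.

def estimate_iray_rate(vray_score, ref_vray=40000, ref_iray_rate=2.6):
    """ref_vray / ref_iray_rate are the 3970X figures from the thread."""
    return ref_iray_rate * vray_score / ref_vray

# 5995WX scored 60000 in Vray, i.e. 1.5x the 3970X's 40000
print(round(estimate_iray_rate(60000), 2))  # prints 3.9
```

Compare that to a 4090's ~26+ iterations per second and the cost argument below follows directly.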
There is a 96-core Epyc, and since they can be run in pairs you could have 192 cores. But they clock under 3GHz, so Threadripper with its higher clocks can outperform Epyc at some tasks. In Vray the 64-core part is slower, but the gap is not that large considering all the hardware (I believe it was 80000). BTW, the 96-core Epycs use about 400 watts each; you might also need to rewire the home to power one of these servers. You need cooling, too.
I hate to imagine what AMD will ask for the Threadripper 7000 series. They have raised prices multiple times on new generations, and a new 96-core part would absolutely be something they want to charge an even bigger premium for. At 96 cores I don't see how that many cores can possibly clock like previous Threadrippers.
It might sound nuts, but Jensen Huang's catchphrase of "the more you buy, the more you save" is actually true in this field. GPU acceleration has done wonders for rendering. As much flak as the 4060 Ti has received, it should still be faster than a 64-core CPU that costs $5700. It's tough, but I think I would take the $500 4060 Ti 16GB instead.
But that's just me. Even though I've got these kidneys just lying around that I don't even need. Don't ask.
Just for fun I ran the benchmark scene on the 13700k. Latest DS beta, all stock CPU settings. 261w max power draw. 1.46 iterations per second.
The 13900k benches 20-25% better in multi core rendering workloads which would put it at about 1.8 if it scales the same in DS.
My 4090 does 26.3 stock at about 285w. That's about 18x the performance; I thought it would be higher.
The 4090 has more headroom; I've hit 29.5 with a pretty silly overclock. The 13700k was hitting 97°C stock with a custom loop, so not much room there.
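Sanity-checking the ratios in these runs (the 13900k figure is the scaled estimate from the earlier post, not a measurement):

```python
# Iterations per second from the benchmark runs quoted above.
gpu_4090 = 26.3                       # measured: 4090, stock
cpu_13700k = 1.46                     # measured: 13700k, stock
cpu_13900k_est = cpu_13700k * 1.225   # assume ~22.5% better multi-core

print(round(gpu_4090 / cpu_13700k, 1))  # prints 18.0 -- about 18x
print(round(cpu_13900k_est, 2))         # prints 1.79 -- close to the ~1.8 guess
```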