Could I use an AMD CPU instead of an NVIDIA card?
kyoreona
Posts: 176
My dream was to buy a computer with an NVIDIA graphics card and enough memory to use Iray happily.
But as new models get bigger and bigger, just a few characters in fancy clothes standing next to each other can take more than 6GB or even 12GB of video memory.
On the other hand, the growth in video memory on cards I can actually afford has almost stalled, and a 20GB card is out of my reach right now.
If I'm going to end up falling back to CPU rendering because I'm out of video memory anyway, then instead of waiting for an affordable sweet-spot graphics card with lots of memory, could I just put my money into a high-performance AMD CPU, which will end up carrying the rendering load anyway?
As for the graphics card, could I also choose AMD, or just pick any 6GB NVIDIA card?
Comments
I'm not entirely certain I understand your question, but you can use an AMD CPU with an Nvidia card. A powerful CPU will render Iray faster than a weaker CPU, but neither will render nearly as fast as an adequate GPU.
The 12GB 3060 can be bought for 400 EUR (including 24% VAT); it's by far the most cost-effective way to upgrade.
Time or money - that is the question. An RTX 3060 12GB will save time and would probably be more energy-efficient. CPU rendering will save $$, but will take significantly more time and will tie your system up, making it harder to multi-task. You can get $$ back, but not time. Take the RTX 3060.
OP, no, an AMD GPU will NOT render IRAY. It's not a solution.
Best course is the aforementioned RTX 3060 12GB; save up for it if you can.
CPU is not a viable alternative for rendering, the speed decrease is a factor of 100x at least.
The only alternative would be to look into Blender, which I think just got AMD GPU rendering. You would need to export DAZ products into Blender and get comfortable with that software.
Also, you can render 5 characters and a background IF you optimize the scene (texture memory mostly) with Scene Optimizer. You don't need a 20GB card for that.
So, which GPU do you have now with 12GB VRAM?
Before getting my 3090, I rendered all the time in Blender as my Threadripper 1950x (first gen) was faster than my 980ti in Blender, although it was slower than the 980ti in Studio.
Blender was also faster at getting a render I was happy with.
Now I have the 3090, I render less often in Blender, but still do so. I dislike the restrictive rendering solution that is nvidia, but it is what it is.
Perhaps you too could render in Blender, using whatever card you have?
This could be a solution when you hit the 8, 12, or 24GB VRAM limit - render in layers: https://www.daz3d.com/forums/discussion/70269/rendering-layers
Or use the scene optimizer.
https://www.daz3d.com/scene-optimizer
Maybe DAZ should drop a broad hint, saying:
"If you consider DAZ Studio as your main application, we recommend using Nvidia GPU devices for being able to make full use of the IRAY render engine."
So, you have about 3GB of VRAM available for Iray rendering. A 12GB GPU would have three times as much.
Windows 10 steals about 1GB to start with, DS needs a few hundred megabytes, the scene itself takes some (depending on its size), and about 1GB is needed for working space; the rest is available for textures, geometry, lights, etc.
Running other applications while rendering may reduce the amount of available VRAM even more.
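Treating those reserves as a simple subtraction gives a rough budget. This is only a back-of-the-envelope sketch; the overhead figures are the approximate ones quoted above, not exact measurements:

```python
def free_vram_gb(total_gb, os_gb=1.0, daz_gb=0.3, workspace_gb=1.0):
    """Rough VRAM left for textures, geometry, and lights.
    Overheads are the ballpark figures quoted above: ~1GB for
    Windows 10, a few hundred MB for Daz Studio, ~1GB working space."""
    return total_gb - os_gb - daz_gb - workspace_gb

print(free_vram_gb(6))   # roughly 3-4GB usable on a 6GB card
print(free_vram_gb(12))  # roughly 9-10GB usable on a 12GB card
```

So the jump from a 6GB to a 12GB card roughly triples the memory actually available for the scene, which matches the "three times as much" estimate above.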
This is why you OPTIMIZE =>>> https://www.daz3d.com/scene-optimizer
Simplified:
Say you have a scene with 4 characters + clothing at 1GB each to render = 4GB
And you have a background = 2GB
That's 6GB, so it doesn't fit.
You run Scene Optimizer: you get rid of normal + bump maps and reduce the texture sizes.
Now the scene and the characters only take 3GB and it fits your card. You are now rendering with Iray + GPU (which will be A LOT faster than CPU).
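The arithmetic in that walkthrough can be written out directly. All the sizes here are the illustrative figures from the example, not real measurements:

```python
# Illustrative sizes from the simplified example above.
characters_gb = 4 * 1.0        # four characters + clothing, ~1GB each
background_gb = 2.0
scene_gb = characters_gb + background_gb   # 6GB total

budget_gb = 4.0                # assumed free VRAM on the card

print(scene_gb <= budget_gb)   # False -> Iray falls back to CPU

# After Scene Optimizer drops normal/bump maps and shrinks textures,
# the example scene ends up around 3GB.
optimized_gb = 3.0
print(optimized_gb <= budget_gb)  # True -> render stays on the GPU
```

The point is simply that the whole scene has to fit in the free VRAM at once; shrinking textures is usually the cheapest way to get under the line.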
I often render big scenes, so this is an issue that I also have.
To get around it I have used a number of solutions:
1. I have an RTX3060 12GB VRAM. This does great for small and medium scenes.
2. I use scene optimiser frequently. This helps medium/large scenes stay on the GPU.
3. Just prior to a large render I restart Daz and check that nothing is hogging memory unnecessarily.
4. For some scenes I put the characters into groups. I then render the scene several times with various groups turned off before stitching the final scene together in Photoshop. You have to be very careful with shadows, and this technique takes a lot of work. It's probably rarely worth it.
5. On the really big scenes I simply let my AMD Ryzen 9 take the load and do something else while the render runs.
There is a learning curve for Blender, but the Diffeomorphic plugin makes it easier. Check out the Blender thread.
Good grief, prices have dropped, and I think I just missed my last finance offer from PayPal... $450 for a 3060? Shopping we go.
Well, never mind... the 3060 has the same number of CUDA cores, and on benchmarks the Titan X scores 13k while the 3060 scores 17k...
so I'd have to go up to a 3060 Ti to gain cores.
---
argh
Oh well, the 3060 has no more cores, but it does have 12GB.
Then you'd have to jump to a 3080 12GB or higher to match my 12GB... so back to the waiting list.
This is under the system requirements:
Additional Details:
NVIDIA Iray Render Engine: 64-bit only. NVIDIA video card with 4GB+ VRAM recommended. CUDA Compute Capability 2.0 or greater required.
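If you're unsure whether a card meets those requirements, recent NVIDIA drivers let you query name, total VRAM, and compute capability via `nvidia-smi`. This sketch just parses a hypothetical line of that CSV output (the sample values are illustrative, and the `compute_cap` query field assumes a reasonably recent driver):

```python
import csv
import io

# Hypothetical output from:
#   nvidia-smi --query-gpu=name,memory.total,compute_cap --format=csv,noheader
sample = "NVIDIA GeForce RTX 3060, 12288 MiB, 8.6"

name, mem, cap = next(csv.reader(io.StringIO(sample), skipinitialspace=True))
vram_gb = int(mem.split()[0]) / 1024  # "12288 MiB" -> 12.0

print(f"{name}: {vram_gb:.0f}GB VRAM, CUDA 2.0+: {float(cap) >= 2.0}")
```

Every card from the GTX 900 series onward clears the Compute Capability 2.0 bar easily; VRAM is the spec that actually decides whether your scenes fit.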
Wait for the RTX 4090
That is a joke
Minimum 6GB VRAM and preferably an RTX card.