When using Iray, is VRAM important? Is it worth buying a 3090 instead of a 3080 for this?

in The Commons
The 3090 has 24GB of VRAM, but its actual performance is not much better than the 3080's. I'm hesitant to buy it just for the extra VRAM.
Comments
The way I look at it, the more VRAM I have, the less time I have to spend fiddling with optimizing the scene until it will render. Oh, and the more VRAM I have, the bigger dimensions I am able to render at.
The difference in VRAM and price between the cards is quite significant... Considering that DS doesn't support the RTX 30 series yet, I would just wait and see what the future brings.
To me, vram is the whole point. I can fill scenes without thinking about it. So, 24gig is a must.
why is it that the video card cannot use system ram? bang that baby up to 64gig or something and let it fly.
Because Nvidia. If your business is making graphics cards, making it so the consumer doesn't need expensive VRAM is bad for business, so you make the software dependent on hardware you sell.
And ordinary RAM is much, much slower when it sits elsewhere in the machine behind a comparatively slow bus.
Back in the '90s the SoundBlaster 32 already had sockets for adding memory yourself, but that wouldn't force people to buy more expensive cards...
One of the reasons I am working on switching to Cycles and Eevee for my "pipeline". Those render engines aren't motivated by selling high-end Nvidia GPUs :P
VRAM is important inasmuch as, if your scene data doesn't fit in VRAM, the GPU won't be used at all. Whether or not you need tons of VRAM depends on what kinds of scenes you create and how much you optimize them.
If a scene won't fit on your card, then the card was a waste of money.
You can usually make scenes fit; it might take less than a minute, or it can be complicated. It is also nice not to have to worry about it.
VRAM determines your maximum scene size; other factors on the card, such as the number of CUDA cores, determine rendering speed. From all the Iray and V-Ray tests I've gathered, the 3090 is merely ~15% faster than the 3080 on average, and both 30xx cards are miles faster than their 20xx peers. The key difference is that a 3090 may be able to fit a very complicated scene with 10 characters without any optimization effort, which is just impossible on a 3080. So it's really a time vs. money thing; which is more important to you at this moment?
Because Iray in Daz works like azz: the debugging tools won't tell you valuable information such as "you are approaching your VRAM limit" or "your scene is too large to render", or how far over budget you are. Most of the time it just fails to render when you do things your card can't handle. So speaking for myself, I'd grab a 3090. Well, it's not like it will be available until 2021, so just wait. Iray aside, it's too bad Nvidia has a near-monopoly on rendering tech in nearly all industry 3D tools, which only support CUDA cores; it's not like AMD cards will be an option.
Since you're going to get a lot of people telling you "MOAR VRAM": before you decide, what types of scenes do you generally render, and what are you planning to render?
For instance, if you typically render scenes with one character, the odds you'll need the extra VRAM are incredibly marginal. On the other hand, if you plan to consistently render five characters in a shot and don't want to optimize, then yes, you are going to need the 24GB.
(For the record, I have a 1060 with 6GB, but since I rarely render more than 2 characters plus a background in a scene, and am comfortable optimizing, I almost never hit the limit; and again, that's just at 6GB of VRAM.)
Well, here's my personal experience with Iray renders.
I render Iray scenes that often use only one or two DAZ Store characters with DAZ Store clothing, often DAZ Store fibre-mesh hair, and one or more DAZ Store environmental sets. I CPU render. When I had a laptop with 8GB system RAM and an Intel integrated HD Graphics 4000 GPU, I ran out of system RAM frequently, say about 10% of the time. So I upgraded to 16GB system RAM, rendered with that for 2 1/2 years, and only ran out of system RAM twice. Now I have built a new PC with 32GB system RAM, still with no Nvidia GPU, and have yet to run out of system RAM while Iray rendering. I don't optimize textures, but I did start rendering a smaller canvas, consistently 1920x1080 or 1920x1920 most of the time.
So I think the 'sweet spot', if you do scenes with resources similar to my renders, will be an Nvidia GeForce RTX 3070 16GB, whenever they release (supposedly in December). If you do more than 3-5 characters (a lot will depend on how much water, glass, metal, fibre hair, dForce or SBH, and other things are in the environment), then you can composite the characters batch by batch, with their shadows, onto the clean rendered background. That makes problems with reflections and such, but oh well.
Most of us aren't doing military invasion scenes, sports stadium scenes, or Times Square-type scenes, though. We do like dawn scenes, before most folk are out of bed, and the dawn lighting looks better than noon lighting too. Sort of the opposite of vampires: DAZ folk only come out at dawn. LOL.
I currently have a 2080 Ti, which has 11GB, and I run out of VRAM only because I have really complex scenes (with a lot of poorly optimised, high-poly crap in them).
If you are only planning to render scenes with out-of-the-box Daz environments and one or two figures, then there is no value in having a 3090. A 3080 would be fine.
You can still work within the limitations of low VRAM by splitting your scene into chunks. I would love to have 24GB, but the cost of doing that is high. Cost-benefit analysis tells me it's better to just keep rendering in chunks.
That assumes it is possible/practical to add RAM. If nothing else, the memory controllers are part of the GPU chip, so making the lower-end chips able to address more RAM would increase their complexity, and so cost, for a feature that might have a limited market (and I would imagine that moving the memory controller off the GPU would have an impact on performance).
Please Daz give us a time frame for the Ampere update.
GPU compute can't efficiently access data in main memory (those DDR-whatever modules), so usually all the data for GPU-based raytracing has to be in GPU memory (on the graphics board).
Don't forget that ideally you really need at least 2GB of RAM for every 1GB of VRAM in your box, so if you do want a 3090 then you should be looking at putting it in a PC with at least 48GB of RAM.
...⬆ this ⬆
It comes down to a point of diminishing returns, where more time is spent manually optimising scenes (particularly very involved ones) than actually rendering them.
VRAM is very important, BUT I'd recommend waiting a bit longer. There are rumours of custom designs with more RAM, which means there will probably be a 3080 with 16GB available in a few months.
+1
Even though my scenes have up to 4 characters, with whatever architecture and such, I have not yet run out of VRAM with the RTX 2070 Super 8GB.
The biggest scene I have monitored in GPU-Z took around 6GB of VRAM.
I don't optimize, but I don't go crazy with the lighting either.
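If you'd rather watch VRAM from a script than from GPU-Z, nvidia-smi can report the same numbers. Here is a minimal Python sketch that parses its CSV output; to keep the example runnable without a GPU, it uses a hardcoded sample line rather than actually invoking nvidia-smi:

```python
# Sketch: parse nvidia-smi's CSV memory report to check VRAM headroom.
# On a real machine you would run:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
# and feed each output line to this function. The sample line below is
# hardcoded so the example runs anywhere.

def vram_headroom(csv_line: str):
    """Return (used MiB, total MiB, percent used) from one CSV line."""
    used, total = (int(x.strip()) for x in csv_line.split(","))
    return used, total, 100.0 * used / total

sample = "6144, 8192"   # e.g. an 8GB card with a ~6GB scene loaded
used, total, pct = vram_headroom(sample)
print(f"{used} MiB of {total} MiB used ({pct:.0f}%)")   # 6144 MiB of 8192 MiB used (75%)
```

Polling this in a loop while a render runs gives you the same peak-usage picture GPU-Z does.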
I can't afford a 3090 and I am not going to swap one 8GB card for another when my main problem is not being able to fit my scenes into 8GB. So I am really hoping that the 16GB version of the 3070 is coming or perhaps the 20GB 3080 if I can beg my family for some Christmas contributions.
I'd hold off purchasing a 3090 for now. A 3080 20GB and a 3070 16GB are rumoured to appear in December; I'd at least wait for confirmation one way or the other.
VRAM, regarding iRay at least, is like income. Spending will always expand to fill available resources.
Learn to optimise your scenes: remove unnecessary maps, reduce texture sizes on distant objects, lower mesh resolution, and hide out-of-shot objects that don't impact the rest of the scene (e.g. through reflections, shadows, or light sources). Use the same bump/spec/normal maps on characters that share the same UV layout (if it isn't noticeable; look out for moles, freckles, scars etc.). Hell, even use the same diffuse maps, adjust diffuse strength and/or translucency weight to change skin tone, and move character-specific details, i.e. makeup or skin blemishes, to geometry shells. You'd be surprised how much can fit in 8GB (I have 8GB in my main render rig at the moment) given a good thrashing with the optimisation stick. Remember to save assets as scene subsets, and all materials as presets, so you can quickly re-use them.
When you've optimised all you can get away with, then start throwing VRAM at it.
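To put rough numbers on the texture advice above: an uncompressed square RGBA map costs width x height x 4 bytes, so a single 4K map is 64 MiB before any compression. This back-of-the-envelope calculator is only arithmetic, not a measurement of what Iray actually allocates (its real footprint depends on its own compression and format choices):

```python
# Rough VRAM cost of uncompressed square RGBA textures.
# This is back-of-the-envelope arithmetic only; Iray's real footprint
# also depends on its internal compression and texture formats.

def texture_mib(size_px: int, count: int = 1) -> float:
    """Approximate MiB for `count` square RGBA maps of side size_px."""
    return count * size_px * size_px * 4 / (1024 ** 2)

# A figure carrying 10 maps at 4K vs the same maps downscaled to 2K:
full = texture_mib(4096, count=10)   # 640 MiB
half = texture_mib(2048, count=10)   # 160 MiB
print(f"4K: {full:.0f} MiB, 2K: {half:.0f} MiB, saved: {full - half:.0f} MiB")
```

Halving texture resolution quarters the memory cost, which is why running a few characters through a texture-reduction pass frees so much VRAM.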
First, why don't GPUs use system RAM?
Speed. GPUs are designed to do lots and lots of calculations very quickly. GDDR is much faster than DDR, at least for the sorts of operations GPUs do. Periodically Nvidia and AMD put out low-end cards with DDR on the card, and the performance is awful, an order of magnitude worse than the same card with GDDR. Going to DDR over the PCIe bus and through the CPU is even slower. I'm sure that if Nvidia cared they could design some way to shuffle data in and out of VRAM, but they don't; Iray is not a major profit source for them.
Second optimization and 8gb cards
I make VNs. I do still fiddle around with DS as a hobby, but 95% or more of my renders are for my VNs, which make me money. I have an 11GB card and an 8GB card. I check my render logs for when I exceed 8GB; I don't want to get into bad habits, nor do I want fewer renders getting done overnight. Maybe 1 render in 50 exceeds 8GB. When that happens and I catch it, I run Scene Optimizer and that fixes the problem, which is almost always a bunch of 4K maps. I know people get ambitious, but 8GB remains plenty. I could easily enough have bought a pair of 2070 Supers, an RTX Titan, or a 3090, but I do not see the need. If you're really exceeding 11GB routinely, then hopefully your budget matches your ambitions.
Daz almost never gives ETA on new features.
That said, a new beta just got released today, and according to the change log it included a new version of Iray.
...on the other hand, I tend to approach 3D as I did when I used to paint in oils and watercolours, so I tend to create scenes that can even make a Titan-X break a sweat. Optimising a very busy scene even with the Scene Optimiser (which does have some shortcomings) can be a very intensive process which has an impact on workflow. As I mentioned in a post above there can be a point of diminishing returns when dealing with a large scene. Being able to have enough overhead to avoid the process dumping to the CPU becomes a speed factor in and of itself.
Yeah, maybe 16GB might work for my purposes, but I'll wait and see how Nvidia approaches and prices it. For now, the only existing Nvidia 16GB GPUs available are the Quadro P5000 and RTX 5000, which still cost more than a new 3090 (ranging from $1,700 to $2,300, even used). Having the same VRAM and more cores for less than 1/3 the price of a Turing RTX 6000 is still a good deal.
Looking at another thread, it appears the new beta does support Ampere cards. The first few benches were posted with a 3080 and 3090, and holy crap they blow everything else out of the water in terms of speed (around 12-14 iterations/second on the benchmark scene compared to 6-7 iterations/second on a 2080Ti if I am remembering correctly).
It's 20GB.
And a rumoured 40gb Titan model coming soon (ish).
https://www.techpowerup.com/272508/galax-confirms-geforce-rtx-3080-20gb-and-rtx-3060-rtx-3060-matches-rtx-2080
I just put my PNY GeForce GTX 1650 Super 4GB in my computer to test all the scenes I've made in the last 2 weeks, to see if they'd fit in 4GB of video RAM. Even though I used only presets from products bought in the DAZ Store, none of the scenes will GPU render.
Given that, I am going to buy a 3070 with 16GB, or ideally a 3070 with 20GB. I think the 3080 with 20GB will be out of my price range, though, maybe even the 3070 with 16GB, and I'll just have to save more, as there is no sense buying a new expensive video card that isn't going to be doing any ray tracing! Maybe I'll even buy a 3090 with 24GB once I've saved that much for the others.
They might release a 3060 Super with 16GB for $500 or less (which is something I just completely made up), but that's more in tune with what I can reasonably afford, saving every 2 to 3 years while waiting on a video card that does the job.
++++++
I take part of that back: I could render one scene, because it used Favreal's French Village Bundle, which is an old 3DL product converted to Iray. The textures are much lower resolution, but they work at sufficient distance in any case.
Interesting thread, I was curious about this topic.....