Comments
I did have the luminance down to under 10 but still got the noise. The spotlights were much better, but even they were set way below the 100 default.
I have tried to find out how to use HDRIs. I've played with turning "Draw" on and off and with other options, but I can't get a sphere to show in my scene. I have lots of HDRI images downloaded that I used with Reality, but I can't figure out how to use them with IRay.
I don't understand the GPU/CPU relationship at all. I can't see how to determine which (or both) is being used. I am told that the CPU kicks in when the GPU's VRAM is used up, but how do I know when a scene is too big for the VRAM? And even then, will the GPU and CPU work together, or does the CPU override the GPU and switch it off?
The scene MUST fit in your video card's memory...that means if you have a 1 GB video card, the scene has to be smaller than that...including all textures and subdivisions. In other words, probably NOT a clothed human figure with hair...and maybe not even a naked one with no hair. If it doesn't fit, then the CPU will be used.
Now, if the scene fits...you have two choices...CPU + GPU or GPU alone. That's a before-render choice.
It should fall back to CPU only, if you run out of memory on the video card.
Mine is a 2GB GTX-680M, but 2GB is still minimal for IRay, I think.
Yes, 2 GB is a minimum. But a naked or simply clad character with hair should fit. Don't forget that other programs you have running may take some VRAM too.
As for the relation between the CPU and GPU, just know:
CPU = Only the CPU will be used
GPU = Only the GPU will be used. If the scene doesn't fit in the available VRAM it shouldn't render
CPU + GPU = GPU will be used if scene fits in the available VRAM otherwise the CPU is used and the GPU is ignored
I am not aware of any means of telling beforehand how much VRAM will be needed. So it is run and guess.
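Restated as pseudo-logic, here's a minimal sketch of those three rules (the function and the numbers are purely illustrative, not any actual Iray API):

# A minimal sketch restating the three render-device rules above.
def pick_render_device(mode, scene_vram_mb, free_vram_mb):
    fits = scene_vram_mb <= free_vram_mb
    if mode == "CPU":
        return "CPU"                     # GPU never considered
    if mode == "GPU":
        return "GPU" if fits else None   # no fallback; the render shouldn't start
    if mode == "CPU + GPU":
        return "GPU" if fits else "CPU"  # falls back; the GPU is then ignored
    raise ValueError(f"unknown mode: {mode}")

# Example: a 2.5 GB scene on a 2 GB card in CPU + GPU mode falls back to CPU.
print(pick_render_device("CPU + GPU", 2500, 2048))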
Nope...I've rendered plenty of simple scenes in GPU mode on a 1 GB GT440...
It's a mobile chipset...does it have actual dedicated memory, or is it shared memory?
Removed by me. The calculation was wrong, as mjc1610 shows in a post in this thread.
Anyway, DAZ_Vince confirmed in a video that 2 GB of VRAM is sufficient for a scene with one character.
See DAZ_Vince post here:
http://www.daz3d.com/forums/viewreply/819082/
Thanks for explaining that. Long story short: my measly 2 GB is not enough, so GPU is a non-starter. I know that there are a lot of seriously well-equipped enthusiasts here, but that kind of hardware investment puts it out of the "hobby" category, IMHO.
I'll carry on with a few test renders in CPU mode because I have read in several places that it should still be faster than Luxrender for a similar image quality.
IRay is indeed faster than Luxrender, and it is mostly (not always) faster than 3Delight, even when run on the CPU only.
Again, rendering simple scenes in GPU mode on a 1 GB card seems to be at odds with what Erdehel was calculating. And I'm still in the dark about how to know, during the render, whether the GPU or the CPU is being used. I have an app that tells me the temperatures close to the CPU and GPU, and all I can say is that the CPU's goes up but the GPU's doesn't, which suggests that, even with my simple scene, it is working on the CPU only.
This is my card: http://www.geforce.co.uk/hardware/notebook-gpus/geforce-gtx-680m/specifications
That graphics card is rather old, but it is CUDA compliant, so it should work with IRay if the available VRAM allows it. Set your settings to CPU + GPU and don't worry about which is used. My system has 4 GB of VRAM and it often switches to CPU only.
Which is precisely how I have set it. :)
I did a test with a few simple props, and it was easy to tell from the render times whether the GPU was being used (it was much faster than when I switched the GPU off). So I guess I could do the same kind of test with G2F characters, clothes and hair. I just thought there might be some indication somewhere saying "rendering using GPU", but clearly not.
According to this (found on the Autodesk Iray FAQ)...
"For estimating memory usage, budget about 1 GB per 8 million triangles, to which you must also add 3 bytes/pixel for any referenced bitmaps. "
That should mean about 4 million quads...with no textures.
That's about 50 MB per 4K x 4K texture...and that's EACH map.
4096 pixels x 4096 pixels = 16,777,216 pixels; x 3 bytes = 50,331,648 bytes, or 49,152 KB (50,331,648 / 1024), or 48 MB (49,152 / 1024).
There's also somewhere between a few dozen and a few hundred MB of other 'overhead'.
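That rule of thumb is easy to turn into a back-of-the-envelope calculator (a rough sketch only; the 200 MB overhead figure is just a guess, per the 'few dozen to a few hundred MB' above):

# Rough VRAM estimate from the Autodesk Iray FAQ rule of thumb quoted above:
# ~1 GB per 8 million triangles, plus 3 bytes per pixel of every bitmap,
# plus an overhead allowance that is only a guess.
GB = 1024 ** 3

def estimate_vram_bytes(triangles, texture_sizes, overhead_mb=200):
    geometry = triangles / 8_000_000 * GB
    textures = sum(w * h * 3 for (w, h) in texture_sizes)
    return geometry + textures + overhead_mb * 1024 ** 2

# Example: 1.5 million triangles and six 4096 x 4096 maps.
est = estimate_vram_bytes(1_500_000, [(4096, 4096)] * 6)
print(f"{est / GB:.2f} GB")  # about 0.66 GB before anything else in the scene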
And this can show the load and memory usage on the video card...while in use.
http://www.techpowerup.com/gpuz/
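On NVIDIA cards with the command-line tools installed, the same load and memory numbers can also be polled from a script (this assumes nvidia-smi is on the PATH; GPU-Z, linked above, is the GUI way on Windows):

# Polls GPU load and memory via NVIDIA's nvidia-smi command-line tool.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "97 %, 1843 MiB, 2048 MiB"
    time.sleep(2)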
You are going the wrong way...you don't multiply bytes by 1024 to get to KB, MB or GB...you divide.
The first division gets you KB...the next gets you MB.
The texture memory size is roughly what the image would be if it were NOT compressed as a JPEG/PNG file. An uncompressed 4K image file is in the 30 to 50 MB range...and that's what it will take up in system memory or video memory (probably more on a hard drive due to block size...but that's a different story).
Another way of looking at it...you can fit between 150 and 200 ten-megapixel (3664 x 2748) images in 1 GB...that's what my camera can hold on a 1 GB SD card (I think 187 is the 'exact' number).
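Worth noting that the SD-card figure involves compressed JPEGs; in VRAM the same images sit uncompressed at roughly 3 bytes per pixel, so far fewer fit (the ~5.5 MB JPEG size below is only inferred from '187 per GB'):

# Compressed-on-card versus uncompressed-in-memory, for a 10 MP frame.
pixels = 3664 * 2748                  # one 10-megapixel image
uncompressed = pixels * 3             # ~3 bytes per pixel once decoded
print((1024 ** 3) // uncompressed)    # ~35 uncompressed images per GB
print((1024 ** 3) // 5_500_000)       # ~195 compressed JPEGs per GB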
Here's an online bit calculator...
http://www.matisse.net/bitcalc/?input_amount=50331648&input_units=bytes&notation=legacy
Unless you're a gamer... then most of the systems around here are starter level.
Yeah...I'm happy with my son's hand-me-downs...if he doesn't burn them out first.
In Render Settings, go to Environment (1).
Select either "Dome and Scene" or "Dome only" (2).
If the HDR is supposed to be visible in your render, set "Draw Dome" to On (3).
You can exchange the HDR image in "Environment Map" (4), and increase the light output with the "Environment Intensity" slider above "Environment Map".
Not sure what I'm doing wrong then because I had all 4 of those settings as you show them but I still can't see the HDRI image in either the viewport or the render.
Perhaps I have the light intensity wrong or something similar.
You have to change setting (3) to On if the HDRI is to show in the image.
It will not show in the viewport, though.
See this post and the links inside it; many questions are answered there.
Watch the video to the end, too; there's a neat trick on how to use the viewport with interactive rendering enabled.
http://www.daz3d.com/forums/viewreply/819082/
Thanks - I was actually looking through that thread this morning.
However, the problem I was having with those artefacts all over my screen (see above) has returned with a vengeance. I have been trawling through Apple support forums but can't find any answers other than "probably a faulty graphics card". Yet I don't get them with anything other than IRay.
It remains to be seen whether I will get them if I switch off the GPU and just rely on the CPU. When they start, they appear all over my screen, even outside of DAZ Studio. A reboot usually gets things back to normal; yesterday it was fine for a few hours before it happened again. But today I've started another project and will be sticking with 3Delight until I have more time to do more testing with 4.8.
Sounds like yes, it is a card problem...and quite probably a heat-related issue. When was the last time the machine was given a good clean-out?
I got it back from the Apple service centre about a month ago. That was for a failed hard drive (which they replaced for free even though it was more than a year out of warranty) but they had it all opened up and I would assume they cleaned it. Apple make pretty damn sure that customers don't attempt to open up these wafer-thin iMacs.
However, as I said earlier, my app that displays temperatures shows the GPU hardly moving from its ambient temperature during the render. The CPU temp does go up but not to a level I would need to worry about.