Some Clarification About Iray

Comments

  • marble Posts: 7,500
    edited December 1969

    lee_lhs said:
    Try rendering with just the environment HDRI that comes as default setting, no extra light, camera headlight turned off. :-)

    As for the distant light, that one doesn't decay with distance, and you have to turn it waaaaay down to not fry the scene. You have to set the luminance for it to 10 or lower (default is 1500).

    Depending on your graphics card, you might get faster renders if you just tick CPU in OptimizeX. I actually got slower renders with GPU ticked there as well. :-)

    I did have luminance down to under 10 but still got the noise. The spot lights were much better but even they were set way below the 100 default.

    I have tried to find out how to use the HDRI. I've played with turning "Draw" and other options on and off but can't get a sphere to show in my scene. I have lots of HDRI images downloaded that I used with Reality but can't figure out how to use them with IRay.

    I don't understand the GPU/CPU relationship at all. I can't see how to determine which (or both) is being used. I am told that the CPU kicks in when the GPU VRAM is used up, but how do I know when a scene has too much for the VRAM? And even then, will the GPU and CPU work together or does the CPU override the GPU and switch it off?

  • mjc1016 Posts: 15,001
    edited December 1969

    marble said:

    I don't understand the GPU/CPU relationship at all. I can't see how to determine which (or both) is being used. I am told that the CPU kicks in when the GPU VRAM is used up, but how do I know when a scene has too much for the VRAM? And even then, will the GPU and CPU work together or does the CPU override the GPU and switch it off?

    The scene MUST fit your video card's memory...that means if you have a 1 GB video card, the scene has to be smaller than that...including all textures, and subdivisions. In other words, probably NOT a clothed human figure, with hair...and maybe not even a naked one, with no hair. If it doesn't, then the CPU will be used.

    Now, if the scene fits...you have two choices...CPU + GPU OR GPU alone. That's a before render choice.

    It should fall back to CPU only, if you run out of memory on the video card.

  • marble Posts: 7,500
    edited December 1969

    Mine is a 2GB GTX-680M, but 2GB is still minimal for IRay, I think.

  • Erdehel Posts: 386
    edited May 2015

    Yes 2 GB is a minimum. But a naked or simply clad character with hair should fit in. Don't forget that other programs you have running may take some VRAM too.

    As for the relation between CPU and GPU, just know:

    CPU = Only the CPU will be used
    GPU = Only the GPU will be used. If the scene doesn't fit in the available VRAM it shouldn't render
    CPU + GPU = GPU will be used if scene fits in the available VRAM otherwise the CPU is used and the GPU is ignored

    I am not aware of any means that will tell beforehand what size of VRAM will be needed. So it is run and guess.

    Post edited by Erdehel on
  • mjc1016 Posts: 15,001
    edited December 1969

    marble said:

    Mine is a 2GB GTX-680M, but 2GB is still minimal for IRay, I think.

    Nope...I've rendered plenty of simple scenes in GPU mode on a 1 GB GT440...

    It's a mobile chipset...does it have actual dedicated memory or is it shared memory?

  • Erdehel Posts: 386
    edited May 2015

    Removed by me. The calculation was wrong, as mjc1016 shows in a post in this thread.

    Anyway DAZ_Vince confirmed in a video that 2 GB VRAM is sufficient for a scene with one character.

    See DAZ_Vince post here:

    http://www.daz3d.com/forums/viewreply/819082/

    Post edited by Erdehel on
  • marble Posts: 7,500
    edited December 1969

    Erdehel said:
    It is very simple: Count 2 Bytes per pixel. You can't fit many textures of 4000 x 4000 in 1 GB lol. 4000 x 4000 = 16 000 000. To have that in GB you should divide that number twice by 1024: 16 000 000 / 1024 = is the result in MB and that result divided by 1024 is the theoretical size in GB (15.26 GB). Of course there is no GC with that amount of RAM. From what I know the renderer will reduce the picture's resolution. I think it does that while 'preparing the scene'. I doubt IRay reduces the resolution under 800 x 800. Do the math. That's 0.6 GB. So far I know a character has at least 6 main textures : Head - Body -Limbs and for each Diffuse and Bump. to fit that in 2 GB and less there is another optimization going on. Didn't delve in IRay Internals yet to know how this exactly works.

    Anyway these numbers show how rapidly the VRAM gets used. Simple scenes maybe but then with poor textures.

    Thanks for explaining that. Long story short: my measly 2GB is not enough so GPU is a non-starter. I know that there are a lot of seriously well equipped enthusiasts here but that kind of hardware investment puts it out of the "hobby" category IMHO.

    I'll carry on with a few test renders in CPU mode because I have read in several places that it should still be faster than Luxrender for a similar image quality.

  • Erdehel Posts: 386
    edited December 1969

    IRay is indeed faster than Luxrender and it is mostly (not always) faster than 3Delight, even when run on CPU only.

  • marble Posts: 7,500
    edited May 2015

    mjc1016 said:
    marble said:

    Mine is a 2GB GTX-680M, but 2GB is still minimal for IRay, I think.

    Nope...I've rendered plenty of simple scenes in GPU mode on a 1 GB GT440...

    It's a mobile chipset...does it have actual dedicated memory or is it shared memory?

    Again, this seems to be at odds with what Erdehel was calculating. And I'm still in the dark about how to know, during the render, whether GPU or CPU is being used. I have an app that tells me the temperatures close to the CPU and GPU, and all I can say is that the CPU goes up but the GPU doesn't, which suggests that, even on my simple scene, it is working on CPU only.

    This is my card: http://www.geforce.co.uk/hardware/notebook-gpus/geforce-gtx-680m/specifications

    Post edited by marble on
  • Erdehel Posts: 386
    edited May 2015

    That GC is rather old but it is CUDA compliant. So it should work with IRay if the available VRAM will allow it. Set your settings on CPU + GPU and don't worry which is used. My system has 4 GB VRAM and often it switches to CPU only.

    Post edited by Erdehel on
  • marble Posts: 7,500
    edited December 1969

    Erdehel said:
    ... So it should work with IRay if the available VRAM will allow it. Set your settings on CPU + GPU and don't worry which is used.

    Which is precisely how I have set it. :)

    I did a test with a few simple props and it was easy to tell by the render times whether GPU was being used (it was much faster than when I switched off GPU). So I guess I could do the same kind of test with G2F characters, clothes and hair. I just thought there might be some indication somewhere which might say "rendering using GPU" but clearly not.

  • mjc1016 Posts: 15,001
    edited May 2015

    Erdehel said:
    I am not aware of any means that will tell beforehand what size of VRAM will be needed. So it is run and guess.

    According to this (found on the Autodesk Iray FAQ)...

    "For estimating memory usage, budget about 1 GB per 8 million triangles, to which you must also add 3 bytes/pixel for any referenced bitmaps. "

    That should mean about 4 million quads...with no textures.

    That's about 50 MB per 4K x 4K texture...and that's EACH map.

    4096 pixels x 4096 pixels = 16,777,216 pixels x 3 bytes = 50,331,648 bytes (or 49,152 KB (50,331,648/1024) or 48 MB (49,152/1024) )

    There's also somewhere between a few dozen and a few hundred MB of other 'overhead'.
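
    To turn that rule of thumb into a quick number, here's a rough Python sketch (not anything built into Iray or DAZ Studio; the flat overhead figure is just a guess):

        # Rough VRAM estimate using the rule of thumb from the Autodesk Iray FAQ:
        # ~1 GB per 8 million triangles, plus 3 bytes per pixel for every referenced bitmap.
        # The overhead allowance below is an assumption, not a published number.
        GIB = 1024 ** 3

        def estimate_vram_bytes(triangles, texture_sizes, overhead_mb=300):
            """texture_sizes is a list of (width, height) tuples, one per map."""
            geometry = triangles / 8_000_000 * GIB               # ~1 GB per 8M triangles
            textures = sum(w * h * 3 for w, h in texture_sizes)  # 3 bytes per pixel
            overhead = overhead_mb * 1024 ** 2                   # driver/scene overhead (assumed)
            return geometry + textures + overhead

        # Example: ~1 million triangles and six 4096 x 4096 maps
        # (diffuse + bump for head, torso and limbs)
        maps = [(4096, 4096)] * 6
        print(f"{estimate_vram_bytes(1_000_000, maps) / GIB:.2f} GiB")  # about 0.70 GiB

    That ignores instancing, render buffers and whatever other applications are holding, so treat it as a floor rather than a guarantee.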

    And this can show the load and memory usage on the video card...while in use.

    http://www.techpowerup.com/gpuz/
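
    GPU-Z is Windows-only, though. Assuming NVIDIA's command-line tools are installed (nvidia-smi ships with the drivers on Windows and Linux; I'm not sure the Mac drivers include it), a little Python sketch like this can watch the card while a render runs:

        # Poll GPU load and memory via nvidia-smi while a render is running.
        # Assumes nvidia-smi is on the PATH and a single NVIDIA GPU is present.
        import subprocess
        import time

        QUERY = ["nvidia-smi",
                 "--query-gpu=utilization.gpu,memory.used,memory.total",
                 "--format=csv,noheader,nounits"]

        while True:
            out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
            util, used, total = [int(x) for x in out.split(", ")]
            print(f"GPU {util}% busy, {used} / {total} MiB VRAM in use")
            time.sleep(5)

    If utilization stays near zero and memory doesn't climb during the render, Iray has almost certainly dropped back to the CPU.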

    Post edited by mjc1016 on
  • mjc1016 Posts: 15,001
    edited May 2015

    Erdehel said:
    It is very simple: Count 2 Bytes per pixel. You can't fit many textures of 4000 x 4000 in 1 GB lol. 4000 x 4000 * 2 = 32 000 000. To have that in GB you should divide that number twice by 1024: 32 000 000 / 1024 = is the result in MB and that result divided by 1024 is the theoretical size in GB (over 30 GB). Of course there is no GC with that amount of RAM. From what I know the renderer will reduce the picture's resolution. I think it does that while 'preparing the scene'. I doubt IRay reduces the resolution under 800 x 800. Do the math. That's 1.2 GB. So far I know a character has at least 6 main textures : Head - Body -Limbs and for each Diffuse and Bump. to fit that in 2 GB or less there is another optimization going on. Didn't delve in IRay Internals yet to know how this exactly works.

    Anyway these numbers show how rapidly the VRAM gets used. Simple scenes maybe but then with poor textures.

    Of course I may be wrong but that is my understanding of what is going on.

    Edit: Forgot to multiply by 2

    You are going the wrong way...you don't multiply bytes by 1024 to get to KB, MB or GB...you divide.

    The first division gets you KB...then next MB.
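
    To put numbers on it, here's the same calculation from the quoted post, divided the right way (just a quick sketch):

        # 4000 x 4000 texture at 2 bytes per pixel -- the figures from the quoted post
        size_bytes = 4000 * 4000 * 2    # 32,000,000 bytes
        size_kb = size_bytes / 1024     # ~31,250 KB
        size_mb = size_kb / 1024        # ~30.5 MB -- megabytes, not gigabytes
        print(size_bytes, round(size_kb), round(size_mb, 1))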

    The texture memory size is roughly the same as it would be if it were NOT a compressed jpeg/png file. An uncompressed 4K image file is in the 30 to 50 MB range....and that's what it will take up in system memory or video memory (but probably more on a hard drive due to block size....but that's a different story).

    Another way of looking at it....you can fit between 150 and 200 10 megapixel (3664 x 2748) images in 1 GB....that's what my camera can hold on a 1 GB SD card...(I think 187 is the 'exact' number)

    Here's an online bit calculator...

    http://www.matisse.net/bitcalc/?input_amount=50331648&input_units=bytes&notation=legacy

    Post edited by mjc1016 on
  • Khory Posts: 3,854
    edited December 1969

    I know that there are a lot of seriously well equipped enthusiasts here but that kind of hardware investment puts it out of the "hobby" category IMHO.

    Unless you're a gamer... Then most of the systems around here are starter level.

  • mjc1016 Posts: 15,001
    edited December 1969

    Khory said:
    I know that there are a lot of seriously well equipped enthusiasts here but that kind of hardware investment puts it out of the "hobby" category IMHO.

    Unless you're a gamer... Then most of the systems around here are starter level.

    Yeah...I'm happy with my son's hand-me-downs....if he doesn't burn them out, first.

  • BeeMKay Posts: 7,019
    edited December 1969

    marble said:

    I have tried to find out how to use the HDRI. I've played with turning "Draw" and other options on and off but can't get a sphere to show in my scene. I have lots of HDRI images downloaded that I used with Reality but can't figure out how to use them with IRay.

    In Render Settings, go to Environment (1).
    Select either "Dome and Scene" or "Dome only" (2).
    If the HDR is supposed to be visible in your render, set "Draw Dome" to On (3).
    You can exchange the HDR image in "Environment Map" (4), and increase the light output with the "Environment Intensity" slider above "Environment Map".

    [Attachment: dome.JPG, 390 x 698]
  • marble Posts: 7,500
    edited December 1969

    Not sure what I'm doing wrong then because I had all 4 of those settings as you show them but I still can't see the HDRI image in either the viewport or the render.

    Perhaps I have the light intensity wrong or something similar.

  • BeeMKay Posts: 7,019
    edited December 1969

    You have to change the setting in (3) to ON if the HDRI is to show in the image.
    It will not show in the viewport, though.

  • Erdehel Posts: 386
    edited December 1969

    See this post and the links inside. Many questions answered.
    Watch the video to the end; there's a neat trick there on how to use the viewport with interactive rendering enabled.

    http://www.daz3d.com/forums/viewreply/819082/

  • marble Posts: 7,500
    edited December 1969

    Thanks - I was actually looking through that thread this morning.

    However, the problem I was having with those artefacts all over my screen (see above) has returned with a vengeance. I have been trawling through Apple support forums but can't find any answers other than "probably a faulty graphics card". I don't get them with anything other than IRay, however.

    It remains to be seen whether I will get them if I switch off GPU and just rely on CPU. When they start, they appear all over my screen, even outside of DAZ Studio. A reboot usually gets things back to normal and yesterday it was fine for a few hours before it happened again. But today I've started another project and will be sticking with 3Delight until I have more time to do more testing with 4.8.

  • mjc1016 Posts: 15,001
    edited December 1969

    marble said:
    When they start, they appear all over my screen, even outside of DAZ Studio. A reboot usually gets things back to normal and yesterday it was fine for a few hours before it happened again.

    Sounds like yes, it is a card problem...and quite probably a heat-related issue. When was the last time the machine was given a good clean out?

  • marble Posts: 7,500
    edited December 1969

    mjc1016 said:
    marble said:
    When they start, they appear all over my screen, even outside of DAZ Studio. A reboot usually gets things back to normal and yesterday it was fine for a few hours before it happened again.

    Sounds like yes, it is a card problem...and quite probably a heat-related issue. When was the last time the machine was given a good clean out?

    I got it back from the Apple service centre about a month ago. That was for a failed hard drive (which they replaced for free even though it was more than a year out of warranty) but they had it all opened up and I would assume they cleaned it. Apple make pretty damn sure that customers don't attempt to open up these wafer-thin iMacs.

    However, as I said earlier, my app that displays temperatures shows the GPU hardly moving from its ambient temperature during the render. The CPU temp does go up but not to a level I would need to worry about.
