Iray Canvases, Light Path Expressions and Z-Buffer
I know it's in there, but I can't quite get there. I've tried everything I can think of to get a Z-Buffer out of Iray.
Turning on Canvases / Depth, with or without a node list gives me a blank white screen (it renders fast though :-) ). I've also tried Distance instead of Depth (not sure what the difference is in this context), but it does the same.
Has anybody out there gotten a depth mask out of Iray? Along the same line, has anybody done any general experimentation with LPEs?
Any help will be greatly appreciated.
Edit: Tried rendering direct to file as well and turning off tone mapping. No change.
Comments
I'm seeing the same - and normals come out black. Even saving as .exr and adjusting the levels doesn't seem to show any detail. Please report this, unless anyone else can point out what we are doing wrong.
Thanks for the confirmation, Richard. I'll give others some time to speak up, then I'll put in a ticket if all else fails.
And the secret is... HDR Toning...
There's depth detail, but all the values are above 1.0.
Direct from the DAZ3D Help Center:
Isn't equalizing the histogram going to distort the depth information?
- Greg
Yep.
Local Adaptation:
Equalized:
But since we don't have control of maximum distance in the Z-Buffer (presumably part of the "to do"), adjusting the histogram is necessary to get decent control in post-processing.
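Since the raw depth canvas stores actual scene distances (hence all those values above 1.0), one alternative to equalizing the histogram is to remap a near/far range of your own choosing in a post script. A minimal sketch in Python with NumPy; the near/far values here are made up for illustration:

```python
import numpy as np

def remap_depth(depth, near, far):
    """Linearly remap raw depth values (scene units) to 0..1,
    clamping anything outside the chosen near/far range."""
    d = (depth - near) / (far - near)
    return np.clip(d, 0.0, 1.0)

# Toy stand-in for a depth canvas: raw distances in scene units
raw = np.array([50.0, 100.0, 225.0, 400.0, 1000.0])
mask = remap_depth(raw, near=50.0, far=400.0)
```

Picking near/far yourself keeps the depth linear (unlike equalization), which matters if you feed the mask into a depth-of-field or fog filter later.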
Hi Esemwy,
how do you export the image as EXR? From the DS menu I only get the default low-range formats (tif, jpg, bmp and png). Perhaps I'm missing something obvious
Thanks in advance for any reply
Rendering direct to file is the only way to get the EXR. DAZ will create a directory with the same name as your TIF (png, jpg, whatever) with "_canvases" appended. In that directory will be files named "&lt;base&gt;.&lt;canvas&gt;.exr".
Example:
Render with Canvas1.LPE direct to file: "Sphere.tif"
You get: "Sphere.tif" and "Sphere_canvases/Sphere.Canvas1.LPE.exr"
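If it helps, the naming pattern above can be turned into a small lookup script. This is just a sketch based on the example in this thread; the exact folder and file names are assumptions from that pattern:

```python
from pathlib import Path

def find_canvas_exrs(render_path):
    """Given the main render file (e.g. Sphere.tif), return any canvas
    EXRs written alongside it in a <stem>_canvases/ folder."""
    render = Path(render_path)
    canvas_dir = render.parent / f"{render.stem}_canvases"
    if not canvas_dir.is_dir():
        return []
    return sorted(canvas_dir.glob(f"{render.stem}.*.exr"))

# e.g. find_canvas_exrs("renders/Sphere.tif")
```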
The names may differ on Windows, but that should get you there.
Saving from the render window will also create the _canvases folder; you don't have to render to file.
Confirmed. Guess I just never checked once I read that advice elsewhere.
so, yeah... I'm waaaaay late to the party on all this stuff with Iray, but I have a ton of experience with it in Maya and other places that I'm trying to translate across. My dumb question of the day is: where would I find that "canvases" folder, and where does that .exr save to? Because all I'm seeing are png, jpg, bmp and tif.
The Canvas feature is in the Advanced tab under Render.
When you render out with one or more canvases, save as usual (png, tif, jpg, it doesn't matter), and a folder with the EXR canvas files for that pass is also created in the same directory.
Relighting with Iray Canvases (shameless self promotion) has a fair bit of general information on the canvases pane as well.
Thanks Esemwy, bookmarked.
Figured I'd pop back into this thread since it was helpful previously... thanks, btw... :)
I am wondering if anyone has a solution for getting a depth pass in Iray that accounts for transparency? For instance, if you're rendering a scene with a bunch of transmapped foliage, your depth pass will be based on the geometry and not what would actually render, so you end up with grey for that geometry instead of the shape of what is actually visible... if that makes sense. The same issue exists in Poser, interestingly enough, but the answer to it there, for me anyway, has long been to use Semidieu's Advanced Render Settings (formerly at RDNA... wow, how times have changed...). That set of rendering scripts takes a different approach to creating the Z-depth channel, and it's the best I've seen outside of high-end software. Dude really outdid himself with that package... but I digress... Is there something along those lines for Iray? Or perhaps something that would switch rendering from Iray to a 3Delight-based solution just for that pass? If there is, I want it. And if there isn't, I want to work with someone to help make it happen. :) -Les
I've had a ticket in on Iray for five months. I just pinged them again. Anyway, to your question: there's a depth camera in Age of Armour's Atmospheric Effects Cameras for 3Delight, but I haven't gone there yet. I saw one forum post (which I now can't find) that at least implied it was broken as well. You can select faces and make them invisible in the geometry editor, and that will work with the depth canvas, but that's kind of a blunt instrument. If you happen to have or purchase the above-mentioned camera, let me know. I may try to post a question on the Nvidia forum about it and see if there's a more general solution.
Any news on this one? The depth mask is unusable without transparency. Also, it would be good to be able to set the start and end points in the scene.
It's things like this that I miss from poser, someone always working for a solution. I wish Semidieu would jump over and help out :(
Use 3DL. I wrote shaders/scripts that use surface based shaders (that support transparency). What is considered near/far is controlled by the placement of 2 nodes within the scene (a near node and a far node).
- Greg
Could you go into a bit more detail on this? I haven't done much with 3Delight shaders, but having a functional depth camera would be nice.
In my solution, there are no special cameras involved - it's all surface based. Basically, a script grabs all the parameters for every material on every object and stores them in arrays. The script also grabs the location of the near/far nodes and camera in world space.
Then, a new shader is applied to each object with the color for the shader set according to the location in world space of the point being shaded as compared to the known locations of the near/far points and camera. Other parameters like transparency and displacement are applied according to the stored values from the initial materials (yes, the conversion can be an issue for non-standard shaders).
The script then renders the scene, and finally restores the initial materials (and continues rendering out other render passes).
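For anyone following along, the core of that shading math is simple. This is just a Python sketch of the near/far remap described above (not Greg's actual 3DL shader code), with hypothetical parameter names:

```python
import numpy as np

def depth_value(point, camera, near_dist, far_dist):
    """Shade value for a surface point: 0 at the near node's distance,
    1 at the far node's distance, measured from the camera in world space."""
    dist = np.linalg.norm(np.asarray(point, dtype=float) - np.asarray(camera, dtype=float))
    return float(np.clip((dist - near_dist) / (far_dist - near_dist), 0.0, 1.0))

# A point 10 units from the camera, with near at 5 and far at 15,
# lands halfway into the depth range.
v = depth_value(point=(0, 0, 10), camera=(0, 0, 0), near_dist=5.0, far_dist=15.0)
```

Because the value is written by a surface shader, anything the material makes transparent simply doesn't contribute, which is what makes this approach work for transmapped foliage.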
Now that I'm thinking about it, this may not be a good solution given the fact that transparency and displacement are handled differently in Iray. I'm just now starting to get into MDL a bit, so admittedly I'm no expert!
Hope this helps.
- Greg
That would be great! Can it do only the depth pass or would it have to do the whole render as well?
It's no trouble popping over to 3DL for a depth map. Any chance you could share that script? You'd be a life saver!
I had originally planned on making this a product here at DAZ (along with many other analysis passes as part of my NPR work). Not long after the time I started developing, DAZ announced Iray. To be honest, it's not worth the effort to get it into a form for public consumption as developing for 3DL in DAZ is a dead end.
I do believe there are already shaders for 3DL in the store that will crank out depth maps that support transparency? Perhaps somebody else that knows for sure could chime in. Keep in mind that the issues with displacement not being handled the same will exist.
- Greg
I think we've been looking at these passes as Photoshop friendly layers, and that may not always be the intended use-case. Not everything is made to go through Photoshop, or at least, not without additional filtration.
A .NET or other script could read the depth map, looking at the pixel values that denote distance from camera. Tone mapping isn't necessary (nor is an alpha channel, since you'd be applying the depth information to only visible objects from a color pass); these are raw 32-bit values being calculated. I suppose if you wanted to keep it solely within Photoshop, you could write a custom filter for it. The thing would be dog slow I imagine for any image of notable size.
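As a rough illustration of that idea (assuming you've already loaded the EXR's float values into an array with a library such as OpenEXR or imageio, which isn't shown here), a script can use the raw distances directly, e.g. to build a fog weight per pixel, with no tone mapping involved:

```python
import numpy as np

# Toy 2x2 stand-in for a depth canvas of raw 32-bit distances.
depth = np.array([[10.0, 50.0],
                  [200.0, 1000.0]], dtype=np.float32)

def fog_factor(depth, density=0.005):
    """Exponential fog weight per pixel, computed straight from the raw
    distance values; farther pixels get weights closer to 1."""
    return 1.0 - np.exp(-density * depth)

weights = fog_factor(depth)
```

The density value here is arbitrary; the point is that the float values are meaningful as distances, so this kind of mapping is a one-liner in a script, where Photoshop would need a custom filter.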