Comments
Don't know as of yet. I'm a total newbie when it comes to this process. I was just gathering all the downloads and then reading the posts here to understand which is better.
Thank you anyway :D
For my money, the Intel denoiser is superior. :)
The meat of it right now is the Iray to 3Delight script I'm working on (which is unfortunately not yet completely refined).
The main gist is that it takes the Iray materials' base map/colour properties and copies them to both the 3DL materials' base and ambient properties, creating a self-illuminated material, as well as preserving displacement and opacity maps, but any specular, bump, normal, etc. maps are thrown out. After that, set the object's parameters to have "Cast Shadows" off and turn off all scene lights (which I assume I can do via the script, but I'm focusing on material parameters for now).
I then render in 3DL with the sample quality set suitably high, and it seems to work as an acceptable albedo.
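Stripped right down, the per-material copy step is something like this - a simplified sketch rather than the actual script, and I'm assuming the default 3DL shader's "Diffuse Color" / "Ambient Color" / "Ambient Strength" labels, so check those against your own converted materials:
-----------------
// Sketch only: after the materials have been converted to 3Delight,
// copy the surviving base colour/map into the ambient slots so the
// surface renders as a flat, self-lit (albedo-like) colour.
var node = Scene.getPrimarySelection();
if (node && node.getObject()) {
    var shape = node.getObject().getCurrentShape();
    for (var i = 0; i < shape.getNumMaterials(); i++) {
        var mat = shape.getMaterial(i);
        var diff = mat.findProperty("Diffuse Color");
        var amb = mat.findProperty("Ambient Color");
        var ambStr = mat.findProperty("Ambient Strength");
        if (diff && amb) {
            amb.setColorValue(diff.getColorValue());
            amb.setMap(diff.getMapValue()); // carry the texture map across too
        }
        if (ambStr) {
            ambStr.setValue(1.0); // fully self-lit
        }
    }
}
-----------------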
If you can find a working LPE that provides an albedo, then I'll leap for joy, but having gone over the documentation, I believe any suggestion that they could provide one was based on some sort of misunderstanding of their purpose.
The Nvidia documentation I found seems fairly complete on the matter of LPEs and is up to date. My reading is that LPEs can manage custom properties, but those properties are all about reflection type, not material type. If you want to get just light rays from one specific source that have bounced twice and are now a specular reflection off a specific object in the scene, you can do that, but the output is strictly about the light that is coming off the surface, not the surface itself.
Normal buffers are an available canvas option in Daz, but are not, as far as I recall, available through LPEs.
I personally agree - which means I've thus far managed to overlook the Nvidia denoiser in my albedo tests. I guess I should try.
That sounds intensive and very cool. Thanks for sharing. Hopefully one day I'll be able to give it a shot, too. :)
The trick is to start the Nvidia denoiser very early (around 100 iterations, for example) and stop the render much earlier.
This is the way other render engines use the Nvidia denoiser. It doesn't work in every situation, but it works very well on nature or architectural scenes.
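In Daz Studio, if I remember the labels right, that's the Filtering group in Render Settings: turn on "Post Denoiser Available" and "Post Denoiser Enable", set "Post Denoiser Start Iteration" to around 100, and then cut "Max Samples" (or "Max Time") right down under Progressive Rendering.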
Hollow Creek, only a 3 min render (RTX 2070).
The second face image is only 5 min (the first is 16 min). Look how much noisier the 16 min one is.
Thanks for the example, Thomas.
Nature looks good, but there is a good amount of detail loss from the denoiser. I think this is because, as Matt said, it doesn't use albedo / normals.
Matt, when you complete your script would you be amenable to posting it for others to use, as well?
I hope to, although as I butchered someone's freebie Iray to 3Delight script as a starting point (it saved a lot of time over trying to bungle through from scratch in a scripting language I'm not very familiar with), I should really seek permission from them before redistributing it. (I doubt anyone of a mindset to release free scripts is particularly likely to object if they can be used to help other people in a different way, but still).
Thanks, Matt. I actually remembered I have the Octane plug-in (I don't use it because converting MDL to Octane materials is such a pain), but it can give me an albedo pass.
Are you using the "drag and drop" input from Taoz? Because it doesn't seem to be using my normals / albedo. I've done a bunch of tests and they all look the same with and without normals / albedo.
Sure you named them correctly (see online help)? The log file will also tell if they have been picked up by the denoiser:
-----------------
Launching Nvidia AI Denoiser command line app v2.2
Created by Declan Russell (25/12/2017 ~ Merry Christmas!)
Input image: H:\denoiser_test_images\tranq_gd_2iterp.png
Loaded successfully
Albedo image: H:\denoiser_test_images\tranq_gd_2iterp_alb.png
Loaded successfully
Normal image: H:\denoiser_test_images\tranq_gd_2iterp_nor.png
Loaded successfully
Output image: H:\denoiser_test_images\tranq_gd_2iterp_nvidia.png
Denoising...
Denoising complete
Saving to: H:\denoiser_test_images\tranq_gd_2iterp_nvidia.png
Done!
--------------
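(So the naming pattern in that example is: render tranq_gd_2iterp.png, albedo tranq_gd_2iterp_alb.png, normal tranq_gd_2iterp_nor.png, all in the same folder.)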
I've tested myself using fake AOVs and they did indeed change the output though not for the better. I have only been able to test the NVidia denoiser though (CPU too old for Intel).
Here are the Nvidia tests: the first one is the original render, the 2nd is the denoised one (no AOVs), and the 3rd is denoised with fake AOVs.
Hi Taoz! Yes, as far as I can tell they are named correctly. However, in the log they're not being processed. I'm using the Intel denoiser. I'll try Nvidia.
For the record, I'm not. I've got an old habit of using command line tools via .BAT files, so I've just got a premade set of parameters I can copy the correct filepaths into.
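Something along these lines - the flag names are from memory of Declan Russell's command-line build, so double-check them against its readme, and the paths are just placeholders:
-----------------
@echo off
rem Premade denoiser call - just swap in the correct filepaths.
set IMGDIR=H:\denoiser_test_images
Denoiser.exe -i "%IMGDIR%\render.png" -a "%IMGDIR%\render_alb.png" -n "%IMGDIR%\render_nor.png" -o "%IMGDIR%\render_denoised.png"
pause
-----------------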
I have found that the Normal AOV provided by Iray does tend to provide more noisy results (given it is itself a noisy output), particularly on very detailed and irregular surfaces (e.g. hair), so in isolation, just providing the Albedo AOV does seem to provide the best results.
However, a Normal AOV does improve the detail in some areas, so in my zombie test, I actually processed the file three times - with both AOVs, with just the albedo, and with no AOVs - then masked different areas together in Photoshop; mostly using the Albedo-only layer, but masking in the "All" layer where more detail was required, or the no-AOV layer, depending on which I felt provided the best results.
It does mean some extra work, but it is still the best trade-off on my mediocre system as far as getting decent results without needing days of render time.
I got it to work! Although I still don't know what I was doing wrong before.
Here's the sad part: The normal pass in Iray is strange and not as useful, and the albedo pass is nonexistent.
I have a license for Octane which I rarely use because Iray's MDL is a pain to port over, and I prefer the speed of using ready-made materials.
However, Octane has great support for render passes. I was able to export the albedo (which they call Diffuse Filter (Beauty)) and a more useful normal pass (Shading Normal).
This resulted in good denoising.
I tested both Nvidia and Intel's denoiser and my conclusion is that Nvidia's is... well... not as good. Intel is faster and better at keeping details. See for yourself:
Interesting. It looks like Nvidia uses a smoothing filter on the normal buffer... But doesn't Octane have its own denoiser?
Octane does, yes. But I don't use Octane in general, I use Iray.
Octane is a powerful tool, but I don't want to spend hours converting Iray's MDL shaders to Octane shaders.
I did a simulated test, and the Intel denoiser does pick up the AOVs (the logfile output is from the denoiser itself) before terminating with an error message:
---------------
Launching OIDN AI Denoiser command line app v1.0
Created by Declan Russell (01/03/2019)
Input image: H:\!intel test\img_orig.jpg
Loaded successfully
Albedo image: H:\!intel test\img_orig_alb.jpg
Loaded successfully
Normal image: H:\!intel test\img_orig_nor.jpg
Loaded successfully
Output image: H:\!intel test\img_orig_intel.jpg
Initializing OIDN
[OIDN]: SSE4.1 support is required at minimum
------------------
Not sure what you mean by nonexistent?
As for the Normal it requires the Albedo, if you use the Normal alone it will have no effect (both denoisers will generate an error message saying that). The Albedo can be used alone however.
There is no option for an Albedo canvas (and the "Diffuse", while sometimes used interchangeably as a term, relates to the Diffuse component of lighting, not surface information).
Hence why I've been using a bodged 3DL shader to render a fake albedo pass separately. (Which, while it takes time and effort, improves the performance of the denoiser so dramatically that I can massively reduce how long I need to render in Iray for a decent result).
Ah, OK, I thought he was talking about the denoiser scripts.
Hmm. My TrendMicro anti-virus seems to think Denoiser.exe is ransomware.
Try using this LPE to see if this is an Albedo pass:
L<TS>*<RD>E
I am not aware enough of the full meaning of Albedo, but I have been trying to find something based on other render engines' LPEs, and this may be the Iray version. I verified that it works, though the window that opens for the preview will be blank; the EXR will have the info. You will have to add an exposure layer in PS to see the image, if you have tonemapping on of course.
That translates as "A light path that starts at a light, and which is permitted to be transmitted specularly, but where the final reflection before the eye must be diffuse". It's unfortunately not an albedo, as it involves a light.
Any LPE that was genuinely an albedo pass should not involve any light, because an albedo pass should be entirely unshaded base colour information for the surfaces. However, all LPEs have to be delimited by starting or ending at a light, with the only exception being using an Lm term to create a matte of the object, so...
... hang on a minute. If... if... Iray is handling Lm as a fake light source with universal and non-directional intensity as a work-around in order to allow the eye to create a matte of specific objects, then that may mean that Lm could be used as the source to generate an albedo pass.
Hmm. Trying it out, it seems that Iray may actually be handling Lm as such a fake light. In which case Lm.?DE (direct diffuse events from the "matte light") is in theory a valid albedo pass.
Unfortunately, it appears to become increasingly prone to fireflies the more samples are rendered, as well as being noisier than the actual render, so it really doesn't seem it's optimised to be used this way (unless I've messed up the LPE in some major way), so it's not actually very useful compared to my fake 3DL albedos.
The original from the site was:
C<TS>*<RD>A
I could not get that to work, so I tried the one I put.
Here is the link (bottom of page): http://julius-ihle.de/?p=2619
That page isn't specific to Iray, but to the concept of LPEs in general, which are used in several different renderers. Unfortunately, Iray's implementation of LPEs does not accept A as a parameter to get an Albedo.
On a related note, if you do need to modify LPEs from other programs, you need to be careful to maintain polarity. LPEs can be expressed in either direction from the camera, but the order of events is still important. Your modified path has swapped the light source and eye without reversing the order of the other events, so it's looking for the diffuse event at the wrong end of the light path.
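For example, reversing L<TS>*<RD>E in full gives E<RD><TS>*L, which keeps the diffuse event next to the eye; E<TS>*<RD>L instead puts the diffuse event next to the light, which describes a different path entirely.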
It is fairly solidly documented: https://raytracing-docs.nvidia.com/iray/manual/manual.190223.A4.pdf although it is fairly dense reading.
What about:
E<TS>*<RD>L
It looks like the one you mentioned (Lm.?DE) except without the fireflies.
No, because the path is dependent on light sources (you're searching for paths that end at any light), it can't be a valid albedo pass. If you're getting something that's genuinely a close match to Lm.?DE, that just means my (very large) assumptions about the way that Iray's Matte passes work are either outright wrong or flawed enough to be useless.