© 2024 Daz Productions Inc. All Rights Reserved.
Comments
You can't create an albedo from a diffuse. Different materials need different "colors" in albedo. Glass and metals are always grey. To create an albedo you need diffuse textures and material information. This is work for LPEs, but the LPE has to be aware of the specific materials, so materials must also have tags that the LPE can understand.
EDIT. That is, the tutorials for Photoshop are about creating albedo textures. That is a different story from getting an albedo buffer for the denoiser.
EDIT. One way to get an albedo buffer, as previously discussed, may be to use OpenGL with flat shaders. Glass surfaces could then be changed to pure white and metallic surfaces to a grey value equal to their reflectivity. This could work well enough as a first approximation.
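The per-material rule above can be sketched in a few lines of Python. This is a minimal illustration of the idea only; the `Material` class and its fields are hypothetical stand-ins for whatever surface information the scene actually exposes.

```python
# Sketch of the flat-shader albedo approximation described above.
# The Material class and its fields are hypothetical; a real tool would
# read this information from the scene's surface settings.
from dataclasses import dataclass
from typing import Tuple

Color = Tuple[float, float, float]

@dataclass
class Material:
    kind: str                  # "matte", "metal" or "glass"
    diffuse: Color = (0.5, 0.5, 0.5)
    reflectivity: float = 0.5  # only meaningful for metals

def albedo_color(mat: Material) -> Color:
    """First-approximation albedo colour for a flat-shaded albedo pass."""
    if mat.kind == "glass":
        return (1.0, 1.0, 1.0)   # glass: pure white
    if mat.kind == "metal":
        r = mat.reflectivity
        return (r, r, r)         # metal: grey equal to reflectivity
    return mat.diffuse           # matte: use the diffuse colour
```

A script doing the surface conversion would then just assign `albedo_color(mat)` as the flat shader colour for each surface before rendering the albedo pass.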
https://openimagedenoise.github.io/documentation.html
Well one of the tutorials actually uses a chart with colors for different materials:
https://www.youtube.com/watch?v=KKQZN3eoKUo
But how well it works, or whether it works at all, I don't know; I found it a bit too complex to bother with.
Both CrazyBump and AwesomeBump are supposed to be able to do it, but given the total lack of documentation for either, I haven't been able to figure out how.
Again, nope... those are for making albedo textures, which is a completely different story from making an albedo buffer. An albedo texture is essentially a diffuse texture without reflections and ambient occlusion. An albedo buffer will also use albedo textures for matte materials, but will treat metals and glass differently, as explained above.
So the only way to get the albedo is if the app you create the render with can produce one? How about the normal, is that a special type too?
Both albedo and normal maps can be "faked" from photographic textures, because a photo always contains reflection and ambient-occlusion information "baked" in with the albedo. So there are algorithms to analyze the image and extract, to some extent and approximation, what the albedo and normal map for that image would probably be under common lighting conditions. This applies to textures.
Creating an albedo buffer for the denoiser requires exact information for the image, which is generally not a single surface as a texture would be, but a complex environment with different subjects and materials all interacting together, possibly under complex lighting conditions. In this situation it is not possible to use the texture algorithms.
So yes, I fear the only way to get the albedo and normal buffers is if the rendering engine can produce them.
OK, thanks. Then one can only hope that DAZ will make that option available in DS.
Ditto!
Now that I understand how to use this, I think it's great! Thanks, Taoz, the DnD works great.
You're welcome, glad you got it working!
Taoz, thanks for the front end. I use it on Win7 Pro with the Intel denoiser, but I have to kill the program in Task Manager to get it to output the file. The green progress bar does not move, and the program does not crash. Weird, huh?
That's weird, yes. What does the log say? What happens if you process multiple files? Have you tried outputting to a different folder than the one the source file is in? And where is it installed (path)?
Coming back to this rather late: I'm actually now experimenting with using 3Delight with flat shaders, as it is a more advanced renderer and thus has the option to calculate things like depth of field (giving the denoiser the information it needs to know that things in the distance aren't supposed to be razor-sharp, as it would assume with a straight OpenGL render).
Given that it's supposed to be an albedo render, almost all of the shader options can be turned off (no lights, shadows, occlusion, specularity, subsurface scattering, ray-tracing, etc), meaning it still renders fairly briskly.
I can't experiment very properly at the moment (as I'm away from my regular computer and currently using a not very powerful laptop), but it's showing a lot of promise. The intention is at some point to create a script that can do all the necessary set-up and surface conversion.
Okay, some more comparisons. This is a cropped version of a scene I'm working on at the moment, and part of the reason I'm using it as a test is because it's seriously slow to render on my machine (about 200 samples an hour).
No Denoising
Intel Denoising, No Albedo
Intel Denoising, OpenGL Albedo
Intel Denoising, 3Delight Albedo
As you can see, at 256 Iray samples (I like to work in binary for my test renders), it's really very noisy and could take days to render on its own.
We've then got a sample fed through the IntelOI Denoiser (which I personally preferred), which is really struggling to tell the difference between detail and noise.
I've then given the denoiser an OpenGL render with flat shaders to emulate an albedo pass. Detail retention is considerably improved, but because OpenGL won't do depth of field and doesn't handle transparency brilliantly, we're getting some really weird ghosts in the background where the albedo is telling the denoiser that things should be sharper than they actually are.
Final attempt is with a script I've bodged together to convert Iray shaders to flat 3Delight shaders. This is the most time consuming method, as rendering the 3Delight scene to a high enough quality (2400x2400 pixels @ 16x16 samples on my middling GTX 1050 Ti) took about 16 minutes, but due to advantages like 3Delight being able to match Iray's DoF settings, it does avoid inserting the weird artefacts of the OpenGL albedo.
There's still some noise the denoiser hasn't dealt with, particularly in the water droplets, but it's pulled back a lot of detail even a human might struggle to see in the original image.
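Some back-of-the-envelope arithmetic on the render times quoted above. The 200 samples/hour figure and the 256-sample test render come from the posts; the ~10,000-sample "clean" target is purely an assumed illustration, not a measured number.

```python
# Rough render-time arithmetic for the scene above. The 10,000-sample
# "clean" target is an assumption for illustration only.
SAMPLES_PER_HOUR = 200   # stated render speed for this scene
rendered = 256           # samples in the noisy test render
clean_target = 10_000    # assumed samples for a clean un-denoised result

hours_so_far = rendered / SAMPLES_PER_HOUR        # time spent so far
hours_to_clean = clean_target / SAMPLES_PER_HOUR  # roughly two days
```

Under that assumption, the 16-minute 3Delight albedo pass trades a rounding error of extra work for avoiding days of Iray sampling.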
~~~~~
So far, this is all without a Normal AOV, because I never rendered one in the first place, but it does still demonstrate that it is possible to get the advantages of the Albedo AOV, even if Iray refuses to provide one itself.
I can't supply the script yet, as it's still a bit rough around the edges (and I should really seek permission from the person whose freebie Iray -> 3DL shader I cannibalised), but hopefully I'll be able to provide it sometime soon.
This is very interesting, Matt_Castle. Thank you for your hard work. But, man, it's irritating you need to do it at all. Iray should be able to provide both an albedo pass and a Normal AOV. That's typically a very simple function, and it's necessary for a good denoiser implementation.
I'd agree that it's a function that really should exist, but having actually gone through what documentation I can find on Iray's LPEs, it seems that the suggestion there was an LPE for the purpose is false.
While Light Path Expressions allow you to filter out just the light from one specific source that has bounced twice to become a diffuse reflection off a certain object in the scene, the output of LPEs only allows you to filter for rays in the scene that match specific parameters. As the name implies, they're entirely about light paths - they do not provide information about surfaces.
If a feature for getting an albedo AOV from Iray exists, it's not via LPEs, and it's something Daz would have to enable at a deeper level.
Been lucky enough to play with @Strangefate and @Roguey's new Medieval Bedroom set the past few days, and in the process started to understand the denoiser a bit better. The fur on the wolf, along with the depth of field and an extremely detailed environment were challenging. It's great that it works on 32-bit canvases, and in the end, I was pleased with the results:
- Greg
Here's a before/after denoiser comparison:
Definitely saved a bunch of time.
- Greg
I'm giving the denoiser a try-out and I have to admit it's very good: all the fireflies on the glass are gone now, and in no time too.
Her skin looks great, too Wayne - that's the hardest part!
- Greg
It's a bit scary at the start, but the longer you wait the better it gets!
Well done, Daz!!!
I've built a simple batch (.bat) file for Declan Russell's command-line Intel Denoiser. Since I always dump the current render into the same work folder, I built the batch file around that workflow.
The batch file is here: http://www.trilobite.org/poserfiles/files/RRWDenoiseBat.zip
The command line denoiser is here: https://github.com/DeclanRussell/IntelOIDenoiser/releases
Usage is simple: place the denoise.bat file in the folder you dump your renders in, and click on it when you want to denoise an image. It will pop up a command prompt and ask for the name of the file you want to denoise, then the name you want to save the denoised file to (these must be different). If the source file does not exist, it will tell you and ask for the name again. If the destination file exists, it will ask whether you want to overwrite it. If source and destination are the same, it will ask for a new destination name until they no longer match. When it has denoised your image, it will ask whether you want to process another image or quit. Every prompt has a quit option, so you can quit at any time.

There is only one setting to take care of before you can use the batch file: the 11th line of the file sets the location of the DENOISER.EXE file. It assumes you installed it in "C:\Program Files\Denoiser". If you installed it anywhere else, you will need to change the value on line 11: SET _progdir="C:\Program Files\Denoiser". Change the text inside the quotes to the path of your denoiser executable. !ONLY EDIT THE TEXT INSIDE THE QUOTATION MARKS! Anything else will break the batch file.
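For anyone scripting the same workflow outside a batch file, the validation logic described above can be sketched in Python. The executable path mirrors the batch file's default; the `-i`/`-o`/`-a`/`-n` flags are my assumption about the CLI's interface based on this thread, so check the IntelOIDenoiser README for the actual options.

```python
# Sketch of the batch file's validation logic in Python. The
# -i/-o/-a/-n flags are assumptions; check the IntelOIDenoiser
# README for the actual command-line options.
import os

DENOISER_EXE = r"C:\Program Files\Denoiser\Denoiser.exe"  # same default as the .bat

def build_command(src, dst, albedo=None, normal=None):
    """Validate paths and return the denoiser command line as a list."""
    if not os.path.isfile(src):
        raise FileNotFoundError(f"source file does not exist: {src}")
    if os.path.abspath(src) == os.path.abspath(dst):
        raise ValueError("source and destination must be different")
    cmd = [DENOISER_EXE, "-i", src, "-o", dst]
    if albedo:
        cmd += ["-a", albedo]   # optional albedo buffer
    if normal:
        cmd += ["-n", normal]   # optional normal buffer
    return cmd
```

The returned list could then be handed to `subprocess.run()`, with overwrite confirmation and the "process another file?" loop wrapped around it, just as the batch file does.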
Cool - thanks for sharing! The nVidia denoiser built into DS handles beauty canvases - will the Intel process 32-bit EXR files?
- Greg
The Intel denoiser will process 32-bit EXR files, whether as a Beauty, Albedo or Normal input.
On the note of the built-in denoiser, I'm personally ignoring it entirely in favour of these standalone options; processing the render afterwards is a non-destructive process that makes it possible to layer and mask the results in Photoshop. This is particularly powerful if you feed the denoiser different combinations of input canvases, giving you multiple layers to play with.
This was a test of that theory - and while I'll admit the subject matter is rather strange, I deliberately wanted to use 1) a demanding interior set lit via a single window and 2) a subject which had challenging contours. Given my normal style is "paranormality", you get a zombie getting dressed.
This was rendered for only 2000 samples, which would normally be woefully inadequate for an internal scene lit through one partially-shuttered window, even without the main subject having challenging detail; it was about at the point where you could mostly make out fine detail under the noise, but still only a small fraction of the samples that would be needed to get a clean result.
However, after applying the Denoiser (in three combinations - Beauty only, Beauty & Albedo, and all three of Beauty, Albedo and Normal) and masking the results to get the best possible compromises between noise and detail, the final results are actually surprisingly acceptable.
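The masking step described above amounts to a per-pixel blend between differently-denoised versions of the same render. A minimal sketch of that blend (pure Python, single-channel, with pixel values as flat lists; a real workflow would do this on RGB images in Photoshop or with an image library):

```python
def blend(layer_a, layer_b, mask):
    """Per-pixel linear blend of two denoised layers under a mask.

    layer_a, layer_b and mask are flat lists of floats in [0, 1].
    A mask value of 1.0 takes layer_a entirely, 0.0 takes layer_b,
    and in-between values mix the two, like a Photoshop layer mask.
    """
    return [m * a + (1.0 - m) * b
            for a, b, m in zip(layer_a, layer_b, mask)]
```

In practice you might paint the mask so that smooth areas come from the aggressive Beauty-only pass while detailed areas come from the Beauty + Albedo + Normal pass.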
Would it have still benefitted from more render time in the first place? Of course, and that probably would have sharpened a few of the details up more, but for something I did on middling hardware in only a couple of hours, it's still pretty good.
I'm seriously bewildered by the reflections at her feet (with her reflection apparently standing on tip-toes). Has the image been composited in some way?
If so, be aware that because AI denoisers look for very specific patterns of noise, any alterations made to the render before denoising severely impact the denoiser's effectiveness; you'll get the best results by denoising before any other editing.
Hi Matt - Did you figure out how to get the beauty / albedo / normal working in Iray? I thought you were still experimenting.
No, I still can't get an albedo from Iray. As I said before, I don't believe it's available via LPEs (which just focus on light paths, not surfaces), so if the feature is in Iray at all, it's something that Daz's programmers will need to unlock.
The albedo was again rendered in 3Delight, which is obviously less convenient than if Iray would supply one, but in the great scheme of things it's still a net gain to spend a few minutes rendering a separate albedo pass that allows me to save many thousands of render iterations; my best estimate is that, for this scene, I would have had to render about four or five times as long for the denoiser to recover the same quality from the beauty canvas alone.
(And yes, I know I could have brought the iteration count down a lot by using ghost lights, but the point was to test the denoiser, so it was a deliberate choice to light it inefficiently by just relying on an HDRI outside a partially-covered window).
Also, while somewhat annoying, if I'm looking at the "silver lining" side of things, rendering the albedo pass manually makes it possible to tweak it for best results. I found in one test I did, the doors and handles on some kitchen cabinets weren't very distinct by their default albedo settings, despite quite a strong contrast in the rendered scene; as such, I darkened the handles' albedo down to make them stand out more and to my eye, the denoiser's results got significantly better, providing a result much closer to the details and contrast my eyes could see in the original beauty render.
~~~~~
EDIT: Gah! I've just looked at the image I've linked above more closely, and although it's pretty good if you get the high-quality JPG version, if the site decides to supply the version recompressed to WEBP, it's shocking, particularly as I'm trying to illustrate detail retention. (Seriously, Daz, scrimping on bandwidth at the cost of image quality just isn't on.)
I've swapped it out with a link to a PNG version hosted to Imgur that will definitely show the full detail.
Interesting! Would you mind a brief tutorial of how you get this all to work together? I’d love to give your method a try.
I know very little about LPEs, but from what I've read they should be able to handle custom properties. So if Daz can pass the material type to the LPE, e.g. matte, metallic, dielectric, then it should be possible to get the albedo. The normal buffer is important as well, to preserve the bump maps.
But I agree this should be work for the Daz team. The "denoiser" currently provided in 4.11 is more of an advanced blur filter at this time. Of course it's better than nothing, but without albedo and normal buffers it has little in common with a real denoiser such as Blender's.
Yes, it's a mirror image of the top without the shadows, then blurred a bit. It denoises the alpha too, so it looks even better.
There was a minor upgrade from NVIDIA, so it's 2.3 now!
Thank you for the news :D
How is it? Is it better and less aggressive now? Is it integrated into Daz Studio, or is the old version still there?