Another HDRI Question

in The Commons
I have been trying to figure out how to create my own HDRIs from scenes (architecture interiors), and the technique I got from Sevrin that uses the canvas beauty map and GIMP works very nicely. The only problem I'm having now is that no matter how big I make the HDRI map (8000 x 4000 or even 10000 x 5000), the details in the map come out looking blurry or not clear when I bring it back into DAZ as an environment map.
If I try changing the dimensions on the camera in my new scene with the new HDRI, I can change the focal length or frame width, but the HDRI still appears just a bit fuzzy. Do I maybe need to zoom in more when I create the HDRI?
Comments
Check the settings for environment lighting resolution and perhaps the lighting blur. Also check what compression levels Daz Studio is using under the Advanced tab of Render Settings. I'm not sure if those will have an effect on the environment map textures, but it's still worth checking.
Thanks skinklizzard. I have tried adjusting lighting resolution a few times, but I don't really notice any change, and lighting blur is a toggle, right?
I tried the compression and it didn't budge.
It will always be slightly blurred, as even 10,000 pixels (the maximum width as far as I know) is still going to be spread out over a full circle, so that's about 28 pixels per degree of the 360 degree full rotation.
There are some things you can do to make it less noticeable:
-Make sure there's enough light
-Render your HDRI with a slight blur yourself, so you at least have some control over how blurry it will be.
-When using the HDRI, don't move the camera too far away from your subject; keep it relatively close and make it a slightly more wide-angled shot. Say your render is 1024 pixels wide. If you cram only 20 degrees of your HDRI into that render (long-distance zoom shot), that means you render only 28*20=560 pixels of your HDRI, so those pixels get stretched out in your final render. If you take a closer camera with a more wide-angled shot and capture about 50 degrees of the HDRI, you'll be cramming 50*28=1400 pixels of background HDRI into your 1024-pixel-wide final render, which will appear much sharper. Pixels will get merged instead of stretched, and the viewer's mind will subconsciously fill in even more details.
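To put rough numbers on that last point, here's a quick Python sketch of the same arithmetic (nothing Daz-specific, just the width-of-the-map divided over 360 degrees; the exact figures come out slightly different from the rounded 28 px/degree used above):

```python
def hdri_pixels_in_frame(hdri_width_px, camera_fov_deg):
    """Horizontal HDRI pixels an equirectangular map contributes
    to a render that captures camera_fov_deg of the 360-degree panorama."""
    pixels_per_degree = hdri_width_px / 360.0  # e.g. 10000 / 360 ~ 27.8
    return pixels_per_degree * camera_fov_deg

render_width = 1024
for fov in (20, 50):
    src = hdri_pixels_in_frame(10000, fov)
    ratio = src / render_width
    print(f"{fov:>2} deg FOV: {src:.0f} HDRI px -> {ratio:.2f}x render width")
```

Anything under 1.0x render width means the background pixels are being stretched, which reads as blur.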
You cannot make an HDRI from any standard image format with a single exposure and have it look right, and it will not be HDR.
To create an actual HDRI requires taking many images and combining them. This article describes the technique for doing so with a camera:
https://blog.hdrihaven.com/how-to-create-high-quality-hdri/
The Beauty Canvas isn't a standard image format. It's a .exr file, which is capable of the full HDR range, and actually uses it when it's made in Daz Studio. One could actually use an HDRI, render it, and get a new HDRI out of it. It won't be as detailed as the original HDRI, as data will be lost during the rendering process, but it's technically possible.
Yeah, it's pretty straightforward, really. Unless it's an indoor scene, you don't have a lot of other good choices if your scene's geometry assets don't fill the whole frame.
Wow, thanks Drip, that is a lot of great information. I guess I need to make a better HDRI. I have the program Bracket; would that be a good way to go to make an HDRI from a DAZ scene?
You cannot make an HDRI from any standard image format with a single exposure and have it look right, and it will not be HDR.
Not true. This figure was rendered in Iray and lit solely by an HDRI that was created from a series of in-game screencaps that were stitched together into a panorama, which was subsequently turned into an HDRI by an AI-driven algorithm:
- Greg
Sevrin!!! Always love your input. Well, like I was saying above, I like Sevrin's technique and using the EXR, but the only issue is that I noticed in my last HDRI attempt that I could not get the HDR map to look in focus;
it looks just a tad bit blurry. So I guess using the Beauty canvas technique won't prevent that. Like I said, I went all the way up to a 10000 x 5000 render. It's funny, I made another classroom render before with a slightly different interior scene and it did not come out looking blurry.
Do you think the Bracket program (below) would yield a sharper-looking HDRI?
http://user.ceng.metu.edu.tr/~akyuz/bracket/bracket.html
It has no sharpening filters, so I doubt it would help, but might be good for previewing and organizing images. Do you have depth of field turned on? That would cause blurriness. You can export it to JPG and host it somewhere and post a link. Since we can't see what you're doing, we're stuck guessing at what went wrong.
And that, as you state, is not a single EXPOSURE! What is so hard to understand?
That, as the OP stated and as the reply I quoted made clear, the image will not be clear! Am I on a non-English board?
It is a single exposure - there was no bracketing. The stitching of screencaps was just to get a higher resolution panorama. What don't you understand?
- Greg
series of in-game screencaps. Plural
I still love this character! She's awesome! I would love to see you use her in a comic! :D
Yes, the series of 1920x1080 screen caps were used to create a single 8192x4096 panorama, which is still lower resolution than the OP was talking about. After stitching, the image was still standard dynamic range.
The article you linked does a good job explaining the whole process of bracketing multiple exposures in order to create higher dynamic range:
https://blog.hdrihaven.com/how-to-create-high-quality-hdri/
Exposure bracket – a set of photos from an identical point of view with increasing or decreasing brightness. When merged together, taking the best-exposed parts of each one, they create a single image with a much higher dynamic range. Our monitors can’t display this higher dynamic range image, they don’t show anything brighter than “white” (RGB=255). Stitching a panorama when we can’t see all the parts of our images is hard, so to make things easier to see we can do some tonemapping.
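Just to illustrate what that merge step buys you, here's a toy Python/numpy sketch. The radiance values, exposure factors, and hat-shaped weighting are all made up for the demo (real tools use calibrated response curves), but it shows the core idea: highlights that clip in the bright exposures get recovered from the darker ones.

```python
import numpy as np

# Hypothetical scene radiance, including values far above what a
# single 0-1 exposure can hold (e.g. a bright window at 8.0).
radiance = np.array([0.02, 0.4, 1.5, 8.0])

# Simulated bracket: each exposure scales the radiance, then clips
# to the sensor's 0-1 range (this is where highlights blow out).
exposures = [4.0, 1.0, 0.25, 0.0625]
shots = [np.clip(radiance * e, 0.0, 1.0) for e in exposures]

def merge(shots, exposures):
    """Per-pixel weighted average of radiance estimates (shot / exposure),
    trusting well-exposed pixels and nearly ignoring clipped ones."""
    num = np.zeros_like(shots[0])
    den = np.zeros_like(shots[0])
    for shot, e in zip(shots, exposures):
        w = 1.0 - np.abs(shot - 0.5) * 2.0  # hat weight: 0 at black/white
        w = np.maximum(w, 1e-4)
        num += w * shot / e
        den += w
    return num / den

print(merge(shots, exposures))  # close to the original radiance, incl. the 8.0
```

No single one of those four exposures holds all four values at once; only the merge does, which is the whole point of bracketing.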
Like I said, none of the screencaps I took were bracketed. They weren't from an identical point of view, and there was no way for me to change the exposure in the game for each anyway. This is why saying "You cannot make an HDRI from any standard image format with a single exposure and have it look right, and it will not be HDR." is not true.
- Greg
Something has got lost in translation, the OP queried:
>The only problem I'm having now is that no matter how big I make the HDRI map (8000 x 4000 or even 10000 x 5000) the details in the map come out looking blurry or not clear when I bring it back into DAZ as an environment map.
Blurry indicates a bug or fault in the process; "not clear" indicates the limit of the pixel resolution, as @Drip pointed out. The hdrihaven HDRIs that I typically use are the "16k" ones; that means they are 16384 (16x1024) pixels wide [and half that high, for obscure mathematical reasons.]
The traditional photog camera lens for 35mm is 50mm. In that system the film is 36mm wide by 24mm high, so the lens has a width of 40 degrees, 11% of the whole scene so 1800 pixels of the HDRI. The standard Daz camera is 65mm on a 36mm width; halfway between the standard 35mm size and the pseudo-standard for a 35mm portrait lens: 80mm. So the standard Daz camera has a field of view of just 31 degrees (horizontal) and only 1400 pixels of the 16k HDRI.
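For anyone who wants to check or rerun those figures, here's the standard pinhole field-of-view formula in Python (same film width and focal lengths as above; the results round to the same 40 degrees / ~1800 px and 31 degrees / ~1400 px):

```python
import math

def horizontal_fov_deg(focal_mm, film_width_mm=36.0):
    """Horizontal field of view for a given focal length and film back."""
    return math.degrees(2 * math.atan(film_width_mm / (2 * focal_mm)))

def hdri_pixels(fov_deg, hdri_width_px=16384):
    """Horizontal pixels of an equirectangular HDRI covered by that FOV."""
    return hdri_width_px * fov_deg / 360.0

for focal in (50, 65):
    fov = horizontal_fov_deg(focal)
    print(f"{focal}mm lens: {fov:.0f} deg FOV, ~{hdri_pixels(fov):.0f} px of a 16k HDRI")
```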
Nevertheless those should be fine for a backdrop if the backdrop is at a distance. In my case I have an interior HDRI (of the room in which I am writing this) which is actually 17918 pixels wide. That HDRI shows what I would call low resolution but might be termed "not clear" when I pose characters close to the actual walls; the characters are very sharp in a UHD image, the background which is pretty much in the same plane is not sharp. It is not "blurry"; blurry to me means bad re-sampling in the manner of circa 2000 image processing, which used an algorithm called "bilinear" which creates an error of up to about 20% in each pixel value when done *correctly*.
It is not difficult to create an HDRI above 16k, just very tedious. The typical HDRI production uses a lens which is around 12mm when expressed as a 35mm film equivalent. For the HDRI I produced that was around 32 separate positions with seven exposures in each position to get the HDR. Rendering an HDRI with Daz requires either that you use a suitable Daz lens to capture the whole thing in one shot, limiting you to a 10k resolution, or that you render multiple HDR images, rotating the camera before each, and attempt to sew them back together with a third party program. I haven't mastered either yet. Indeed my attempts do not even come close to "novice".
She is one of the main characters in the story, so you'll get your wish if I can ever make it happen ;)
To get back on topic, as has been suggested, it would help to have some more information about whether or not DOF was being used on the camera, etc. Even if that isn't the issue, it's important to keep in mind that the more zoomed in the camera is, the more resolution you'll need on the HDRI to avoid blurriness. Same holds true for rendering larger images.
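One way to quantify that rule of thumb (this is just the inverse of the pixels-per-degree arithmetic earlier in the thread, not anything Daz reports itself): the tighter the FOV or the wider the render, the bigger the map has to be before the background stops being stretched.

```python
def required_hdri_width(render_width_px, camera_fov_deg):
    """Minimum equirectangular HDRI width so the background isn't
    stretched: at least one HDRI pixel per rendered pixel across the FOV."""
    return round(render_width_px * 360.0 / camera_fov_deg)

# e.g. a 1920px-wide render with a ~31 degree (default-ish Daz camera) FOV
print(required_hdri_width(1920, 31))
```

Which is why even a 10000 x 5000 map can look soft behind a zoomed-in full-HD render.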
- Greg
FYI - if you click on the gear for Pixel Size (Global) -> Parameter Settings and turn limits off, DS can render more than 10K, so you can render out a more than 10K HDRI in one go using the spherical lens in DS. VRAM is likely to become an issue when the output is such high resolution, though.
- Greg
Since we're having fun with HDRIs and what you can and can't use as an HDRI, I dug up the PNG file from when I made an HDRI with Stonemason's Urban Sprawl 3 and Orestes' Vanilla Skydome, and made a couple of renders with one using the HDR file I got from converting the EXR file as HDRI and one using that PNG as the HDRI. The exposure in the HDR isn't identical to what I got from the original render, because I was going for evening lighting, but it's sort of similar.
In both cases, the IBL is the only light in the scene. You can use a standard image for lighting, but, by comparison, it will look kind of crap. But kwannie's issue is the blurriness, not light quality. I use some DOF in almost all my renders, so I'm usually fine with 4k. In this case, the lack of crispness is due to the lowish resolution, not to "blurriness".
The HDR (4k)
The PNG (also 4k, obv)
Okay, I attached the HDR map I was working with. In the first pic I have the characters that will be used. This is supposed to be a DAZ-style remake of an MMD animation for the song Kimagure Mercy. The setting is based on the MMD video. The second pic is just the classroom. Obviously there still needs to be some finesse added to the lighting and placement of the characters. Of course the HDR map is somewhat out of scale and blurry.
Greg, where does this character come from? Is it a DAZ character? ...... and if so which one?
Thanks.
Well, I don't mind that the background is a little out of focus, since that makes the characters stand out more. To my mind, it would look worse if the busy background were in focus, since it fights with your characters for attention.
You didn't answer about depth of field with the background render, though.
Oh sorry, no, depth of field is not turned on. I have had trouble with the scaling on this one too. The characters look way too big, but I just need to keep playing with the settings.
BTW Sevrin, have you ever used Movie Maker from Dreamlight? It does some kind of crazy stuff with backgrounds so you can actually move the camera around in the scene.
It sounds like you're planning to render an animation, yes? Is your camera stationary? If so, you could render out a backplate in addition to the HDR and stick that in as a background.
Yes, I pretty much try to work only with animations. MMD has an unlimited source of free motions that can be harvested for DAZ. Generally I try to recreate the MMD scene in DAZ. Sometimes I can export the actual scene out of MMD and use it as an OBJ in DAZ because it is low-res enough to render faster.
What exactly is the backplate, j cade? Do you mean use a plane primitive and add a texture map? How would it work in conjunction with an HDR map?
You'd need to render out the backdrop with the camera in the exact same location as in the main scene - but since you can set it to the exact same dimensions, it won't have any of the texture stretching.
There's a tab called "Environment" wherein you can add a background image, so you'd use the HDR you made for lighting, but the background would be supplied by the image - obviously this only works if the camera doesn't move around, though.
If you're rendering in Iray you could just plug a render of the background into as a Backdrop in the Environment Tab. @barbult has a tutorial on how to use renders and photos as a backdrop: https://www.daz3d.com/forums/discussion/comment/4358891/#Comment_4358891