Daz Studio Pro BETA - version 4.11.0.335! (*UPDATED*)
This discussion has been closed.
Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2025 Daz Productions Inc. All Rights Reserved.
Comments
I did, that's why I was a bit confused, seeing essentially some bug fixes but no real new (major) functions/features.
If you were looking for a major new feature like dForce in 4.10 you won't find any.
The main new features in 4.11 are the denoiser and support for Nvidia 20x cards in Iray. The rest is bug fixes and enhancements to existing tools or features.
Have you read the bits scattered through all the beta threads about the new minimum version requirement for the NVIDIA graphics card driver? That sounds like what happens when you have a driver that's no longer supported (I think the minimum for the new beta is version 418 or thereabouts). What driver version do you have installed?
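Since "418 or thereabouts" keeps coming up in this thread, here is a small sketch of sanity-checking your installed driver version against that reported minimum. The minimum value is just what this thread mentions, not an official figure; on a real system you could get the installed string from `nvidia-smi --query-gpu=driver_version --format=csv,noheader`.

```python
# Hypothetical helper: compare an installed NVIDIA driver version string
# against the minimum the new beta reportedly requires (418.x per this thread).
def meets_minimum(installed: str, minimum: str = "418") -> bool:
    """Return True if `installed` (e.g. "419.67") is at least `minimum`."""
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    a, b = to_tuple(installed), to_tuple(minimum)
    # Pad the shorter tuple with zeros so "418" compares like "418.0".
    width = max(len(a), len(b))
    return a + (0,) * (width - len(a)) >= b + (0,) * (width - len(b))

print(meets_minimum("419.67"))  # True: new enough for the beta
print(meets_minimum("417.35"))  # False: below the reported 418 minimum
```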
Sorry if someone already asked; I couldn't find it with the search.
When downloading drivers from Nvidia for an RTX 2080 Ti, we now have the option of a "Creator Ready Driver" (CRD) or a "Game Ready Driver" (GRD). Right now the versions are 419.67 and 430.39 respectively.
Does this make any difference to Daz/Iray? Which one would be best?
..no it doesn't show up anywhere in the Content pane. The Presets folder there only seems to have material, camera, and lighting presets for a few different installed items, but no Poses folder in it.
So I downloaded the Pro BETA from the store, followed the instructions on the product page, and installed via the DIM, but I don't think anything changed. Is there more I have to do to access 4.11?
EDIT: Figured it out
What happens if you try a different folder? Are you getting the options dialogue after the file dialogue? I just tested, using File>Save As>Poser preset, and it worked correctly (though not using the default folder).
..I didn't think it would have a different default path than before. When I saved material, character, and pose presets, they always defaulted to the D:/Daz3D/Studio4/Library/Presets folder.
I'm currently in the process of testing this. Preliminary results indicate that the 425.31 generally performs barely measurably better. Check the SickleYield benchmarking thread later in the week for my complete findings.
Is the public demo still available and where?
Here is my experience so far with the denoiser: longer render time and much, much less detail. This was done without OptiX acceleration, but I don't know if that would make a difference between the two. Is the point that it is supposed to get more accurate quicker, so that you can render to less convergence?
That about sums up my experience with the denoiser. It's just bad. There is nothing intelligent about it; it just wipes out all the finer details. Time-wise, yes, you should be able to stop rendering much earlier if the thing were any good. Actually you could anyway, because the result doesn't get any better if you let it render longer. The details are gone either way.
+1
I don't know if it's the same, since it seems to me that the external post-denoiser only works on JPGs and not on HDR canvases. At least this is the way most people seem to use it. I'd expect the integrated denoiser to be better and faster. That said, I really don't know; it's just a guess.
Then again, if the post-denoiser is just as good and just as fast, then yes, the main reason for getting 4.11 gets weaker.
I have not tested it in the current beta that just released (.335), but in the two prior beta releases, having denoise on while canvases were enabled wrecked my renders, causing black artifacts everywhere and an almost melted look across the whole image. I never used the denoiser unless I had canvases turned off. It may have been resolved with this beta, but it was not producing the results I preferred, so I just do not use it.
I think the denoising works well, but you have to be smart and selective about what you use it for (like any tool). For example, in the image I posted on the first page of the post-denoising thread, I liked how it worked on the environment... which was literally a 2-3 minute render. Then again, if you go for realism on a human character, like the above example, it will wash away detail, as fine detail and fine noise are similar in some cases. (Then again, if you render for almost an hour and there is no visible noise, why denoise?) If the denoiser in DS allowed you to denoise some layers and not others, that would be a benefit... or you could render to layers and then post-denoise what you like.
As far as the differences go, I think the nVidia post and integrated tech are the same, since in both cases external software is calling nVidia's denoising. The Intel denoising is different, of course.
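The "render to layers and post-denoise what you like" workflow mentioned above can be sketched with plain arrays: denoise one layer, leave the character layer untouched, and composite with a mask. The arrays and the crude stand-in "denoised" layer here are made up for illustration; this is not a Daz/Iray API.

```python
import numpy as np

# Sketch of selective denoising: keep raw pixels where the character mask is 1,
# use the denoised layer everywhere else. Data below is synthetic.
def composite(raw: np.ndarray, denoised: np.ndarray, char_mask: np.ndarray) -> np.ndarray:
    """Blend two HxWx3 layers: raw where char_mask==1, denoised where 0."""
    mask = char_mask[..., None]  # HxW -> HxWx1, broadcasts over RGB channels
    return raw * mask + denoised * (1 - mask)

raw = np.random.rand(4, 4, 3)                      # noisy render, full detail
denoised = np.full_like(raw, raw.mean())           # crudely "smoothed" stand-in layer
char_mask = np.zeros((4, 4))
char_mask[1:3, 1:3] = 1                            # character occupies the center

out = composite(raw, denoised, char_mask)
assert np.allclose(out[1, 1], raw[1, 1])           # character pixels survive untouched
assert np.allclose(out[0, 0], denoised[0, 0])      # background takes the smoothed layer
```

In practice the mask would come from a canvas/render pass rather than being hand-built, but the compositing step is the same idea.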
I always thought AI means it would actually analyze geometry and textures. If not that then at least recognize what are patterns and what is noise. This denoiser to me does none of that. Look at the arm texture in the above image. Shouldn't it be absolutely obvious to any intelligent algorithm that this is not noise? In my own tests it blurs out even the eyelashes.
In this case, I believe, the AI bit refers to it having an algorithm trained on a stack of existing images to remove noise while leaving detail. It isn't an explicit, human-readable algorithm, so where it works and where it fails is hard to predict - and I have had both good and bad results. Still, it's simple enough to use - turn it on (with a higher start value than the default, for my taste), then toggle it on/off to see if it is actually helping. If you find it consistently does not help your images, then don't use it, until it gets updated at least.
It's not bad, it's just not applicable to your use case.
I very rarely render to completion. A 12-minute render with the denoiser looks much, much better than a 12-minute render without it. Someone already said that the denoiser helps get the noise out much sooner than it otherwise would, and that the differences essentially go away as the render approaches completion. It looks like that's exactly what you're experiencing. As intended.
For me, the value of the denoiser comes not in the final render, but in letting me see MUCH SOONER after hitting render if my lighting checks out, if the shaders are behaving as I imagined they would in this light, and quite often if the hair is good, or needs a tweak. The denoiser lets me get through all that tweaking much sooner. That ALONE is of immense value to me. But I ran a scene to completion with and without denoiser and the final renders were indistinguishable from each other. I'm running an RTX 2080Ti on the latest DAZ Studio Beta 4.11.
I'll have to look when I get home - I wanna say "4.18"? I haven't had the card very long, so the driver update worked, and shortly there was an update that didn't work and I rolled back and haven't updated since, even though I think there is a more recent driver that works again. I may need to check that out. I'm ACHING to get the most possible bang for my buck on this card!
Of note: Initially, the denoiser produces a weird image that seems 'oil painted' or something, but it looks good enough soon enough, that I can see if I need to cancel and tweak some more. I'll try to remember to post my driver version after I get home later tonight.
Later that night:
Update: Okay, I am running:
NVIDIA Driver: 419.17
4352 CUDA Cores
12 GB VRAM
on an aging Alienware R4 Desktop
Core i7-3820 CPU @ 3.60 GHz
32 GB
Windows 10 Pro 64 Bit v1809
In DAZ 4.11.0.335 Pro Edition 64 Bit
I have OptiX Prime Acceleration checked, and in both Photoreal Devices and Interactive Devices I have only the GeForce RTX 2080 Ti checked.
Deep-learning based applications are only useful for a given workload if the assets used to initially train them are representative of that workload. In the case of the Nvidia OptiX denoiser currently included with DS 4.11+, the initial training was done using a large cache of actual project renders provided to Nvidia by the developers of the 3D space planning program 3DVIA (e.g. what powers the website HomeByMe). Here's a representative sampling (original source: Nvidia) of what these scenes typically looked like:
Notice what's almost completely lacking here: people, much less closeups of people sufficient to show skin or hair textures. One of the simplest ways to describe what a deep-learning neural network actually does when you run it is to say that it takes an input, compares it to a bunch of pre-existing outputs, modifies it to be consistent with those pre-existing outputs, and then outputs the result. Hence why the denoiser (right now) tends to treat people like chairs or hubcaps - because all it knows about is chairs and hubcaps. In order for the AI denoiser currently included with DS to be really useful for typical (i.e. people included) Daz Studio renders, it would need to be trained with LOTS of renders of actual Daz Studio content. All this is to say that Daz Studio's use of Nvidia's proof-of-concept AI denoiser (because that's all it really is without an application-specific dataset to work from) is very much a beta feature - not so much in terms of whether the code itself crashes or not (which would really be an alpha-level issue anyway) but as regards whether what it does is really the right thing for the situation.
Assuming that Daz Studio's developers do intend to fully capitalize on AI denoising as a final render enhancement solution (rather than just a viewport enhancement technique - which is also a valid use case, btw) in the future (which would indeed seem to be the case), my expectation is that once Turing RTCore support gets worked out, there will be some sort of effort (perhaps crowd-sourced, even) to put together a big enough collection of Daz Studio content renders for DS developers to get the denoiser properly trained by Nvidia for them. For that matter, it may even be possible right now (hadn't thought to even look it up until now) for anyone with a big enough dataset and a powerful enough GPU (one with Tensor cores) to train the denoiser on their own personal render library, for incredibly specific, effective results.
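The "compares it to a bunch of pre-existing outputs" idea above can be caricatured with a nearest-neighbor toy: a model that can only pull its input toward something it saw in training. All values here are made up; this is an analogy for the training-set argument, not how the OptiX denoiser actually works internally.

```python
# Toy illustration: with no "people" in the training set, a skin-like input
# patch gets snapped to the nearest furniture-like pattern it was trained on.
def denoise_toward_training(pixel_patch, training_patches):
    """Return the training patch closest (squared distance) to the input."""
    return min(training_patches,
               key=lambda t: sum((a - b) ** 2 for a, b in zip(pixel_patch, t)))

training = [(0.8, 0.8, 0.8), (0.2, 0.2, 0.2)]  # "chairs and hubcaps" only
skin_patch = (0.9, 0.6, 0.5)                   # person-like input, never seen in training
print(denoise_toward_training(skin_patch, training))  # prints (0.8, 0.8, 0.8)
```

A real denoiser interpolates rather than snapping to one exemplar, but the failure mode is the same: outputs are biased toward whatever the training data contained.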
Check the log file; in my case my Titan XP simply isn't found and my motherboard graphics (Intel 630) is disabled in Daz, therefore the Iray interactive (preview) render gets stuck at "preparing the scene" and all renders complete instantly with a transparent result. I had failed to update my NVidia drivers:
Not absolutely sure that is the problem (still running a long render with 4.10) but it seems likely.
John Bowler
Agree.
Where characters form a significant or important part of the render, the denoiser is useless. So 4.10 gives me faster renders.
When I've experimented with 4.11, I've switched off denoiser due to above. Non-organic objects, seem to benefit from the denoiser.
I run a 960M, but I run Windows 10. The newest GeForce Experience drivers caused a black-screen render in the Beta for me. 417 drivers do the same. What is "the issue" stated above?
...or is this the issue -> "rendering with a GPU based on the Kepler microarchitecture is likely to fail and fall back to CPU"
Which family is the 700M? It may no longer be supported.
Whoops sorry, I have a 960M, which is Maxwell...but it's still called out above.
The bit you quote refers to Kepler chips under Windows 8, or there's the driver requirement (which 417.x fails to meet).
Just updated to the most recent driver, 430.39. I think the Beta renders faster than 4.10 now. I'd have to do side by side renders to know for sure, but it seems to.
But there is an annoying thing I noticed about the Beta. I noticed it in the last one and I notice it with 4.11.0.335. Sometimes it will crash for no apparent reason. I can never tell when it's going to do it, but it's when I'm working on scenes, not during batch rendering. That's pretty infrequent, an every-other-day thing. Far more frequent is this issue with panning the camera. Sometimes, quite often actually, it starts to pan really, really slowly, and I have to control the camera directly in the Parameters tab. After I do that it pans fine. This is the kind of thing that really needs to be worked out before the final release. It's a pain in the ass.