
Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2025 Daz Productions Inc. All Rights Reserved.
Comments
I have those ponies as well. Outoftouch makes great hair pieces, but they were too sleek for animation use. I needed something that looked more windblown, which is why I used Sherry Hair: it has long hair morphs that will reach the floor, plus tons of movement and windy dials that help me keyframe the dynamics in animations.

I tried dForce, but it keeps crashing Daz, so I gave up on it. Maybe my system is too old for dForce, but it wasn't worth the trouble I was having, with how unstable it was and having to save every time I took a step in creating an animation with it. It didn't save me any workflow time, nor did it give me the results I was looking for, so I'd rather hand-keyframe .obj movements than use dForce, at least until I get a new system with more power or Daz improves dForce to work on older computers. Either way, that's going to be a long time. The system I have now cost me about $6000 five years ago, so buying a new PC equivalent to what I'm running now is a hell of a lot more investment than I can make at this time. So I work with what I have.
...yeah, I haven't even considered dForce yet after all the stories like yours I've heard and read.
Same here on the hardware. I have two systems with DDR3 memory, one with a 6-core Xeon (my older one) and the other a 4-core i7. The one bright spot: I can run newer GPU cards (recently acquired a Maxwell Titan X), even though the card slots are PCIe 2.0. It just means a little more time to load the scene or sim. Once I get everything updated and in sync (networking the two systems together), I'll give dForce a try.
One feature I like about OOT's hairs is their great translucency and thickness adjustment. I did a test of my namesake (who is an albino) with the sun behind her, which produced a nice effect on the tail.
Testing out porting 3delight-supplied materials to match something in iray.
Original.
aweSurface
23 minutes 47.51 seconds at 16x16 pixel samples.
Didn't quite get the angle right, and the lighting is different. I just used the overhead light panels at 7300 K. Overall, it looks quite similar, but I need to adjust the bump mapping code. It looks much stronger than in the iray shot; the walls and the chair look way too bumpy.
Very interesting! Yes it's bumpier, and the legs of the chair are a bit jagged. Also the Iray version apparently has stronger indirect light.
Nice!
Yep. I was using way too high of a multiplier on the bump shader. Using half the original amount seems to produce similar results.
The aliasing on the chair is probably due to me using 100% reflectivity. Honestly, it should be lower (which should also help minimize the chance of fireflies), but I was more keen on doing the porting with default values on aweSurface.
The additional indirect light probably comes from outside, which should be handled with an additional area light and/or ambient surface. To control the amount of global illumination, I've implemented exposure controls on the shader. By default, it's set to -1.5 exposure, which roughly translates to 35%. You could always raise that, I suppose. For this scene, I just used exposure on the light; I think I used 4, which translates to 16x the intensity. The scheme should be an effective way to control exposure levels without actually having a physical camera.
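The stops-to-multiplier relationship mentioned above is just powers of two (each stop doubles or halves intensity). A quick sketch in plain Python, not shader code:

```python
def exposure_to_multiplier(stops):
    # photographic exposure in stops: multiplier = 2^stops
    return 2.0 ** stops

print(exposure_to_multiplier(-1.5))  # ~0.354, i.e. roughly the "35%" default
print(exposure_to_multiplier(4))     # 16.0, the 16x light intensity
```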
Thanks for the sanity check, Sven.
Not about to set the default to some reasonable value?
Nope. Not needed anymore because aweSurface now has tonemapping and effective exposure controls.
No more overblown diffuse or specular and no fireflies.
Also, a temperature offset to alter white balance. Notice how the white stays white on the render with an 8000 K temperature but gets tinted at a 4000 K temperature.
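For reference, a common way to get the tint of a given color temperature is a curve fit to the blackbody locus. This sketch uses Tanner Helland's well-known approximation; it's an assumption on my part, not necessarily the mapping the shader actually uses:

```python
import math

def kelvin_to_rgb(kelvin):
    # Tanner Helland's curve fit to the blackbody locus (~1000-40000 K);
    # returns the normalized 0..1 RGB white point at that temperature
    t = kelvin / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda x: max(0.0, min(255.0, x)) / 255.0
    return (clamp(r), clamp(g), clamp(b))
```

Multiplying a color by this white point applies the tint (warm below ~6600 K, cool above); dividing by it neutralizes the tint, which is essentially what a white balance offset does.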
The exposure/temperature controls are available for the full scene, a subset of a scene, or individual materials if you prefer. In the render with the temperature offsets, the blue glass statue and the clear glass sphere actually have temperature controls turned off, but tone mapping is still enabled. I'm in the process of writing saturation controls. Just because I can, and just in case somebody is interested in doing a tribute to Pleasantville. No need to go through postwork to achieve that anymore.
As much as I'd like to implement auto exposure and white balance, I have to learn a bit more about imager shaders to implement them.
You're doing amazing things with 3Delight, Wowie!
Thanks.
It's been a long journey, with many setbacks, a few mishaps and outright fails. But I think I've fixed all of the bugs I've managed to find.
Some things I've not managed to solve:
Render times are surprisingly good, given all the stuff I've put into one shader. I did have an idea on minimizing noise with GI, but haven't fleshed out the approach just yet.
I've decided to expose some of the optimization parameters for filtering opacity values. Kinda neat actually, since playing with them allows for pretty varied outputs. I'm thinking about adding an option to invert masks to get very different looks. Would that be useful?
Inverting masks would be very useful! :-)
OK. I'll add the option to invert masks like opacity, metal and transmission.
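The inversion itself is just the complement of a 0-1 mask value. A hypothetical helper to illustrate (the names are mine, not the shader's):

```python
def apply_mask(base, masked, mask, invert=False):
    # mask is a 0..1 grayscale value (opacity, metalness, transmission, ...);
    # inverting flips which regions the masked value applies to
    m = 1.0 - mask if invert else mask
    # linear blend between the base value and the masked value
    return base * (1.0 - m) + masked * m
```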
Awesome!
Finally got somewhat believable blonde hair.
The hair has two layered specular/reflections, with a little bit of thinfilm and colored coat, plus translucency AND subsurface scattering. It will have to do until I understand how to implement 3delight's hair material.
Also solved the GI noise.
With that solved, there's less of a need to go higher than 8x8 pixel samples. Render time is 8 minutes, give or take, with an HDRI-lit scene. I still would like to do a bit more optimizing though.
Wow, that's amazing news! Looks very nice indeed! I love the way the shadows on those objects behave. And refraction looks fantastic
Man, I can't take it no longer

Fantastic image there! Love how the hair turned out. April would love this render!
Thanks Sven and RAMWolff.
Silly mistake though. I forgot to reset the max diffuse bounce depth for the metal and glass materials in that render. There should've been some color bleed on the ground plane from the gold Buddha statue and the red metallic paint sphere.
Max diffuse bounce for metal and glass set to 1.
Max diffuse bounce for metal and glass set to 4 (default).
Thin glass mode - non refracting glass. Notice you still get proper absorption either way.
I understand now that for physically plausible rendering, or PBR, you absolutely need proper white balance in addition to a linear workflow (the whole gamma correction thing) and proper albedo values (diffuse strength). White balance is what's been missing all these years in DS and 3delight. So, aweSurface defaults to using tone mapping, with the white balance temperature set to 6500 K.
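The "linear workflow" part means sRGB-encoded inputs (like albedo textures) must be decoded to linear before shading, and the result re-encoded for display. A sketch of the standard piecewise sRGB transfer functions (often approximated as a plain gamma of 2.2):

```python
def srgb_to_linear(c):
    # decode one sRGB channel (0..1) to linear light
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # encode one linear channel (0..1) back to sRGB for display
    return 12.92 * c if c <= 0.0031308 else 1.055 * (c ** (1.0 / 2.4)) - 0.055
```

Shading math (lighting, GI, blending) happens between these two conversions; skipping the decode is what the "enable gamma correction" setting is about.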
I've integrated the invert mask options, though I haven't tested them out. Also did some cleanup on the necessary scene overrides for exposure, temperature and color saturation. The control is a dummy light, which makes it convenient since it's visible in the viewport but does absolutely nothing in renders.
Well, except for letting the shaders know what temperature/saturation values you want to use. Thanks to light linking via light categories, you can actually link different materials to different controls. Or just override the scene controls per each material.
The temperature and saturation controls only work on diffuse materials though, mainly because when I enable them for glass and metal, fireflies can occur when those are very close to each other. I didn't find a way to solve that, so for now I just disabled them.
Thin glass might come in handy for ghosts or something!
Without the render script, with progressive refinement - 19 minutes 12.38 seconds
With the render script, without progressive refinement - 3 minutes 13.85 seconds
With the render script, with progressive refinement - 2 minutes 35.14 seconds
So yeah, 6x faster at least. Same shader, same settings (max trace depth 12, 8x8 pixel samples).
Very nice, @wowie
I really can't see the difference between the two modes, so it must be working as expected=)
Excellent!
I'm a little late to the party on this one. Love the renders but I'm not sure what we're discussing. Is this a potential product? A script? Where is it available?
Did another round of low-res tests with my HDRI test scene (the Stanford Lily, Buddha and Dragon statues) plus two nude Genesis 2 figures, mostly because I wanted to see what it would be like with full SSS. Included are different test shots of the scene.
The low-res tests were done with 4x4 pixel samples, not my usual 8x8. GI/irradiance samples were set to 128, since that works well enough for this scene and is still very fast. For final renders, 512 looks good, but obviously 1024 is a tad better.
Total Rendering Time: 1 minutes 24.71 seconds // with Mustakettu's render script set to bucket
Total Rendering Time: 1 minutes 7.7 seconds // with render script set to progressive
Total Rendering Time: 3 minutes 52.50 seconds // DAZ default with progressive
Total Rendering Time: 30 minutes 55.24 seconds // DAZ default non progressive
I even did a test with UE2 bounceGI mode and disabled the GI on my shader. Not really comparable since the resulting render is quite different.
Total Rendering Time: 2 minutes 34.6 seconds // UE2, with Mustakettu's render script set to progressive
Total Rendering Time: 8 minutes 10.13 seconds // with render script set to bucket
Didn't test without the script, since that will likely be over an hour. A 1280x720 version will likely take about 3 to 4 times longer.
It's basically one uber mega shader I made. I'm calling it aweSurface.
It does physically plausible shading for 3delight, built to work with the metalness/roughness PBR workflow. I wrote something about it here and in another thread. Some of the more recent stuff I've managed to put into the shader: luminance-based Reinhard tone mapping, white balance offset, and saturation controls. Mainly because I hate doing postwork on renders.
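Luminance-based Reinhard compresses luminance with L / (1 + L) while preserving chroma, which is what keeps highlights from clipping to flat white. A minimal sketch, assuming Rec. 709 luminance weights:

```python
def reinhard_tonemap(rgb):
    # compute scene luminance, compress it with L/(1+L), rescale the color
    lum = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    if lum <= 0.0:
        return rgb
    scale = (lum / (1.0 + lum)) / lum
    return tuple(c * scale for c in rgb)

# bright values approach 1.0 asymptotically instead of clipping,
# so diffuse and specular never blow out to pure white
```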
Hopefully, there will be a commercial pack with other goodies: shader presets, more detailed tutorials and some extra shaders - an area light shader with barn door controls and gel light support (no IES or gobo support yet), and an ambient shader mainly to be used on an environment sphere, with controls so you can manipulate the HDRI tiling/offset, exposure and saturation on the fly. The tiling/offset works in the viewport, so you can orient the HDRI any way you want. Oh yes, and an ambient light shader to make life easier by applying scene-wide adjustments for exposure, white balance and saturation. It could also be used to selectively apply different values to certain objects.
I'll even throw in an environment sphere with the normals pointing inward. Maybe also point/spot/distant lights as well, since Mustakettu hasn't released hers.
Combined with Mustakettu's script, or my custom version of it, it should be able to give you renders as shown above, ridiculously fast. Plus it will be offered for free, hopefully here.
...OK, heart be still, need to breathe slowly...
Looks awesome. Anxiously awaiting either the commercial or the free version. I tend to do 3delight renders most often, as doing one in iRay is extremely slow on my laptop.
The African male render... thought it was an iRay render. Very very nice work there!
How does this compare with render times in Iray, using roughly similar realism?
Good question
My three biggest issues with 3DL have been: the lack of some SSS stuff, like cloudy water, and generally consistent behavior (which this apparently fixes); how much slower it is than Iray with GPU (for similar levels of realism); and how scattered the different shader and light bases are, with inconsistent or confusing incompatibilities.
Any chance of a built-in dynamic lens flare camera shader?
Honestly, I don't know. I'm still using DAZ Studio 4.7, so no iray. I won't install 4.8, 4.9 and maybe 4.10, since there's minimal difference in terms of the 3delight build used. If I want to use the new builds, I just use the standalone render client.
In short, comparison with iray will have to wait until someone with either of those builds gets their hands on my shader and customized Mustakettu's render script.
Cloudy/murky water should be done with rough refraction, not SSS. You can technically have both of them enabled in my shader. The other day I tested using translucency and transmission to get something like sheer fabric.
Here's what I did with my shader using rough transmission. 1280x720, 8x8 pixel samples.
Model is the Stanford Dragon
http://www.mrbluesummers.com/3572/downloads/stanford-dragon-model
with this IBL
https://www.hdri-hub.com/hdrishop/freesamples/freehdri/item/117-hdr-041-path-free
Total Rendering Time: 4 minutes 15.65 seconds with a desktop Core i7 4770K.
This is a mix of the transmission roughness settings from before and an SSS material using the Meat1 profile from Jensen's reference values. BTW, is there even such a thing?
a greenish semi transmissive meat. Eeeew.
Total Rendering Time: 9 minutes 54.41 seconds
Disabled the reflection since with that enabled, it would take longer. Obviously the refracted rays couldn't penetrate the material.
With just the SSS and reflection.
Total Rendering Time: 3 minutes 9.20 seconds
edit. Sorry, I used the wrong scale for SSS before, so the effect is very weak. SSS should be for solid, but translucent materials. Cream or milk rather than murky/cloudy water or even frosted glass.
Anyway, both are (going to be) free to use so it's up to you.
As for comparing a 3delight CPU render and an iray GPU render, that's apples and oranges.
I understand the argument but I'd say you should compare a 3delight CPU render with an iray CPU render, since iray can also run on the CPU. Unless you're going to give me something like AMD's new 2990WX Threadripper, then I'll be more than happy to compare 3delight render CPU times with say, an iray render with a Geforce 1080Ti. Not joking. If anyone is serious, mail me - I definitely won't object to using a 32 core/64 thread behemoth like that.
3delight allows for freedom, so you can stick with physical values or break them with old-school approaches. That is, by design, inconsistent. Something like light linking or trace sets actually breaks PBRT-style renderers. In short, it's as consistent as you choose to make it. I can't force everyone to follow PBR conventions if they won't even enable gamma correction in the render settings.
I thought Mustakettu already did something like that. I saw a lens flare shader on her blog a while back.
Honestly though, I would prefer to have a proper physical camera first - with settings like ISO/aperture to have something akin to camera exposure control, plus a way to control the shutter with custom blades, roundness or crazy bokeh filters and, of course, on-the-fly DOF adjustment, auto exposure and auto white balance. I already have a rough idea of how to implement them (at least some), but those require outputting AOVs. Mustakettu did respond that AOVs should be doable using the scripted renderer workaround, but I haven't seen one example of an imager shader for DS that works (outside of the example shader in Shader Builder). There's like zero documentation about that.