3Delight Laboratory Thread: tips, questions, experiments


Comments

  • Mustakettu85 Posts: 2,933
    mjc1016 said:
    PS I am not responsible if my surface shaders start a revolution and defect to Qo'noS when paired up with UE2 =D

    Better add that as part of the 'official' documentation...

    You're right, I am thinking about it.

    Either way, I just don't see myself working on it up until maybe April. The current real-world situation here is that you need to run like hell just to stay in place - literally. I don't really have much time or spare effort for anything outside work right now, especially stuff involving writing. Too much of it at work.

    OSL support is eating at me as well. If only there were a way to know if Rob (or some padawan of his, supposing they exist at all) is going to add built-in NSI support to DS or not. Is it even potentially possible? Because if not, I think it's best to scrap the public release of the RSL kit (the alpha testers will get the current state of affairs and any other major updates, but not much extra polish) and focus on learning the plugin SDK so as to add OSL eventually.

  • wowie Posts: 2,029
    edited October 2016

    Some tips and tricks with UE2.

    1. Make the environment ball viewable permanently.

    Go to the 'Parameters' tab. Under the display options, there's a 'Visible in Render' switch. By default, this is off every time you load UE2 or load a scene subset/light preset. Simply enable it, then lock the option and save the light as a scene subset or light preset. Now every time you load that saved preset, 'Visible in Render' is enabled by default.

    2. Alternate way of using HDRI with IDL and Bounce(GI) mode.

    Instead of loading the HDRI into the 'Color' field of the UE2 parameters, load it into the environment ball's 'Ambient Color' field and make sure the color is white at 100% strength. In both UE2's IDL and Bounce modes, surfaces with ambient enabled turn into diffuse-only light emitters. Resize the environment ball so its diameter is roughly the same as the max trace distance used. I usually just use the floor grid as a rough scale: a 1x1 tile is 100 cm, so for a distance of 5 tiles, you can use a max trace distance of 1000.

    This is actually roughly the same as the 3delight for Maya guide's approach to working with IBL/HDRI. It's my preferred method of using HDRI images, since it avoids the hassle of correcting UE2's coordinate system by sidestepping it completely.
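A quick sanity check of the tile arithmetic above, as a toy sketch (assumes 1 grid tile = 100 cm, as stated in the post, with "distance" read as the number of tiles from the origin out to the ball; the function name is mine, not anything in DS):

```python
TILE_CM = 100  # one 1x1 floor-grid tile in DAZ Studio units (cm)

def max_trace_distance(tiles_from_origin):
    """Max trace distance matching an environment ball whose radius
    spans tiles_from_origin grid tiles (diameter = 2 * radius)."""
    return 2 * tiles_from_origin * TILE_CM

print(max_trace_distance(5))  # -> 1000, as in the example above
```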

    3. Correcting the HDRI direction/alignment.

    The UV layout of the environment ball is made for the image to be viewed from outside the ball, not from the inside. To correct this, you need to 'flip' the ball. The easiest way is to reverse the X or Z scale (using a negative 100% scale) on either axis. With some HDRIs, you may need to use flipped values on both the X and Z axes.
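To see why a negative scale fixes the mirroring, here's a toy sketch of one common lat-long parameterization (an assumption on my part; the ball's actual UV convention may differ): negating X on the lookup direction mirrors the U coordinate around the seam.

```python
import math

def latlong_uv(direction):
    """Map a unit direction (x, y, z) to lat-long UV in [0, 1)."""
    x, y, z = direction
    u = 0.5 + math.atan2(x, z) / (2 * math.pi)
    v = 0.5 - math.asin(y) / math.pi
    return u, v

d = (0.6, 0.0, 0.8)
u_orig, _ = latlong_uv(d)
u_flip, _ = latlong_uv((-d[0], d[1], d[2]))  # same effect as a -100% X scale
print(u_orig, u_flip)  # the two U values mirror around u = 0.5
```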

    4. Easier positioning of the environment light.

    Depending on your light settings, the actual surface of the environment ball may not be visible in your viewport, which makes positioning difficult. To get around this, simply make a node instance of the environment ball and parent it as a child of UE2, placed at the world/scene origin. Make the instance smaller, say something like 1%, and make sure the 'Visible in Render' switch is disabled for the instance.

    Edit: The instanced ball also needs to be flipped.

    Rotate the UE2 node and both the environment ball and its instance will follow the rotation.

    The instance ball makes positioning point/spot lights to mimic those in the HDRI pretty easy too. Just match the highlights on a sphere probe to the HDRI image on the instance ball. This is doable in the viewport, so you don't need to start an IPR session. You do need to turn off the headlamp, of course.

    The positioning workflow is generally oriented around the scene's origin. So rather than making a point/spot light and placing it by eye, do it this way:

    • Make a dummy node and position it at the scene's origin.
    • Create your light and make it a child of the dummy node.
    • Put some distance between the light and the dummy node by translating along the Z axis.
    • Rotate the dummy node so the specular highlights on the sphere probe match the HDRI.
    • Unparent the light in place so it retains the position you just set.
    • Repeat for each light.

    You can always double-check the placement by translating the light until it intersects the environment ball. This works with area lights too, though you need to correct for perspective.
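The rotate-a-dummy steps above boil down to rotating an offset vector about the origin. A minimal sketch of the math (Y-axis rotation only, for simplicity; the function names are mine for illustration, not DS API calls):

```python
import math

def rotate_y(point, degrees):
    """Rotate a point (x, y, z) about the world Y axis through the origin."""
    a = math.radians(degrees)
    x, y, z = point
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))

# Light parented to a dummy at the origin, pushed 500 cm out along +Z;
# rotating the dummy swings the light around the scene origin.
light_local = (0.0, 0.0, 500.0)
light_world = rotate_y(light_local, 90.0)
print([round(c, 3) for c in light_world])  # -> [500.0, 0.0, 0.0]
```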

    5. Exclude the environment ball from GI.

    Disable Diffuse.

    Some additional thoughts.

    Since IDL and Bounce mode turn any ambient-enabled surfaces into light emitters, there's really no use for separate area light shaders. Just use the ball, or use meshes with ambient enabled. Intensities will still vary with size and ambient strength. With this method, there's only one dial to control samples for all lights. No specular, of course, but you can use reflection instead, though the shader needs to support blur.

    The biggest problem with this approach is render time and the number of samples. GI needs a lot of samples to clear up, something like 2048 for a noiseless (or very minimally noisy) result, which is in line with what the 3delight devs suggested. Add that you need at least 4 bounces (only controllable with Kettu's script), and render times can go sky high. A more digestible alternative is simply to use direct lighting plus GI with a much smaller max trace distance. Something like 200, since that's roughly the threshold where physical light falloff changes slope.

    Post edited by wowie on
  • Mustakettu85 Posts: 2,933

    Lovely tips!

    But when you do want to use the ball for the HDRI (p.2), it should remain visible to GI.

  • wowie Posts: 2,029

    Lovely tips!

    But when you do want to use the ball for the HDRI (p.2), it should remain visible to GI.

    By visible, if you mean it contributes to GI, yes. But disabling diffuse on the ball means it won't receive GI. I'm pretty sure UE2 skips calculating GI for a surface if diffuse is disabled, since I actually noticed some speedup in the early stages after disabling diffuse on the ball. The only drawback I see is that you can't control contrast or saturation directly. Of course, you can always edit the image in an HDR image editor.

    There was a trick with Houdini COPs you could use, but unfortunately DS lacks the bit depth (range) necessary. In Houdini, it's possible to do compositing on the image used for the environment light: basically, the stuff you do with HDR Light Studio. Theoretically, it's possible in DS via LIE, but it's not really usable currently.

  • Mustakettu85 Posts: 2,933

    Yes, I meant the "occlusion switch" type of visibility.

    Funny, I've been thinking about whether it may be possible to add extra functionality to LIE via scripting or if it would need a plugin =)

     

  • Mustakettu85 Posts: 2,933

    And BTW: http://www.daz3d.com/forums/discussion/116791/linerender9000-commercial/p1

    As 3Delight as it gets! NPR is a thing to this day, and maybe even getting bigger day by day... For example: http://maneki.sh/

  • Oso3D Posts: 15,045

    The thing is, PBR is fun but it's EXTREMELY hard to be anywhere competitive with it without a lot of work and gear.

    NPR can be really satisfying, and is more easily leveraged into paying work in many areas.

     

  • wowie Posts: 2,029

    Yes, I meant the "occlusion switch" type of visibility.

    Funny, I've been thinking about whether it may be possible to add extra functionality to LIE via scripting or if it would need a plugin =)

    I'm just using the default shader that came with the environment ball. You can use another shader to get an occlusion switch/override, but it isn't needed if you just need a simple backdrop environment, or nothing at all if you're going to composite the render later on.

    And BTW: http://www.daz3d.com/forums/discussion/116791/linerender9000-commercial/p1

    As 3Delight as it gets! NPR is a thing to this day, and maybe even getting bigger day by day... For example: http://maneki.sh/

    Yes, I've seen the teaser on 3delight's website a while ago.

    The thing is, PBR is fun but it's EXTREMELY hard to be anywhere competitive with it without a lot of work and gear.

    NPR can be really satisfying, and is more easily leveraged into paying work in many areas.

    PBR isn't hard. On the contrary, it's easier: no ad hoc materials/settings, explicit energy conservation, more accurate BRDFs. Plus you have IES profiles and photometric lights. Heck, you even have a physical camera. Weren't those the touted 'advantages' of going with a physically based rendering solution? I remember a lot of people complaining about the 'complexity' of 3delight/Renderman shaders compared to what they got with 'unbiased' solutions a while back (in the old thread). On that note, I find this info from Chaos Group very enlightening - https://labs.chaosgroup.com/index.php/rendering-rd/the-truth-about-unbiased-rendering/

    Of course, those things only work if you feed them the correct values; bad inputs give bad results. But then again, that's true for any renderer.
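For what "explicit energy conservation" means in practice, here's a toy per-channel check (real shaders enforce this inside the BRDF normalization rather than as a simple sum; this is only an illustration):

```python
def energy_conserving(diffuse, specular):
    """True if the material never reflects more light than it receives,
    checked naively per RGB channel (diffuse + specular <= 1)."""
    return all(d + s <= 1.0 for d, s in zip(diffuse, specular))

print(energy_conserving((0.7, 0.6, 0.5), (0.2, 0.2, 0.2)))  # True
print(energy_conserving((0.9, 0.9, 0.9), (0.5, 0.5, 0.5)))  # False: 1.4 > 1
```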

    In case anybody is wondering who Vlado is, it's Vladimir Koylazov - https://www.fxguide.com/featured/v-ray-vladimir-koylazov-reflects/

  • Oso3D Posts: 15,045

    I didn't say PBR is hard. At all.

    Look, see? It's in the sentence you quoted.

     

  • wowie Posts: 2,029

    I didn't say PBR is hard. At all.

    Look, see? It's in the sentence you quoted.

    I think you misunderstood me. What I meant is that while PBR isn't hard, I don't see a lot of DS Iray renders I would call convincingly physical, particularly when compared to renders from V-Ray or Arnold. I'd go as far as to say I've seen much more convincing renders with Cycles. The easiest example would be arch viz style renders like you see in Evermotion galleries (or any other arch viz oriented site). If you're doing a PBR render, you want to get as close as possible to physically correct.

    One example:

    http://www.evermotion.org/tutorials/show/10384/making-of-a-bright-living-room-tip-of-the-week

    With that level of quality, it shouldn't be a problem finding work or commissions.

  • Oso3DOso3D Posts: 15,045
    edited October 2016

    Oooooooooooooooooooooooooooooooooooooh.

    You were agreeing with me; sorry, it's so unexpected. ;)

    Or, rather, qualified, sorta agreeing. But that goes toward my comment about cost... Iray is conveniently easy and free, but most of the actually good stuff requires either a lot of money or a LOAD of training time (i.e. Blender).

     

    Post edited by Oso3D on
  • Mustakettu85 Posts: 2,933

    Maybe it's just me, but Blender isn't as complex as it seems if you just use it for rendering pre-made stuff and running sims. A dozen hours max, and you know how to do everything.

    And Iray, yeah, it's "easy" and free, but I think half the point Wowie's driving at is that too many DS users still don't really use Iray in a physically based way. Like... white point, anyone?

  • Mustakettu85 Posts: 2,933
    wowie said:
    On that note, I find this info from Chaosgroup very enlightening - https://labs.chaosgroup.com/index.php/rendering-rd/the-truth-about-unbiased-rendering/
     

    oooh great link thanks!!

    Remind me please to compile a list of similar useful outside links and useful posts!

  • mjc1016 Posts: 15,001
    wowie said:
    On that note, I find this info from Chaosgroup very enlightening - https://labs.chaosgroup.com/index.php/rendering-rd/the-truth-about-unbiased-rendering/
     

    oooh great link thanks!!

    Remind me please to compile a list of similar useful outside links and useful posts!

    You've been reminded... oh, wait, you want the reminder at some future date, after we've all forgotten about it... *wink*

  • Mustakettu85 Posts: 2,933

    =P Realistically, either I will remember about it this weekend or you will have to remind me in a couple of weeks.

  • mjc1016 Posts: 15,001

    =P Realistically, either I will remember about it this weekend or you will have to remind me in a couple of weeks.

    Yep...by which time I will have forgotten, too.

  • wowie Posts: 2,029
    edited October 2016

    The vray devs are doing some interesting things.

    https://www.fxguide.com/quicktakes/v-rays-practical-stochastic-rendering-of-spec-y-things/

    I haven't read the paper, but it does seem similar to the SIGGRAPH papers on glints. Only for metallic paints for now, but it's progress nonetheless.

    They've also ported Anders Langlands' alSurface to V-Ray. Most renderers seem to be adopting the alSurface arrangement now that Autodesk has dropped mental ray from Maya and Max.

    https://labs.chaosgroup.com/index.php/rendering-rd/v-rays-implementation-of-the-anders-langlands-alsurface-shader/

    They're not afraid to say the alSurface SSS implementation was better than Vray's own method.

    http://www.wikihuman.org/index.php/off-topic/the-alsurface-shader-on-the-wikihuman-data/

    Oh yeah, from this Renderman 21 reel, I'd say that's confirmation they're using OSL.

    So OSL is now pretty much the shading language across renderers: Renderman (and 3delight), V-Ray, Arnold, Cycles.

    Post edited by wowie on
  • Haslor Posts: 408
    edited October 2016

    I am glad to see that no teapots were harmed in that video. *laugh*

    But it is amazing stuff.

    wowie said:

    The vray devs are doing some interesting things.

    https://www.fxguide.com/quicktakes/v-rays-practical-stochastic-rendering-of-spec-y-things/

    I haven't read the paper, but that does seem like something similar to the SIGGRAPH papers on glints. Only for metal paints for now. It's progress nonetheless.

    They've also ported Anders Langlands alSurface to vray. Most renderers seems to be adopting the alSurface arrangement now that Autodesk have dropped MentalRay from Maya and Max.

    Oh yeah, from this Renderman 21 reel, i'd say that's confirmation they're using OSL.

     

    Post edited by Haslor on
  • Haslor Posts: 408
    I found this video. I'm in love. https://vimeo.com/groups/3delight/videos/57552513

     

    Linwelly said:

    Looks tasty!

    And less Filling!

  • RAMWolff Posts: 10,249

    Need some assistance. I'm trying to do some promos using the 3Delight engine, using the Uber Environment base with some other lights. My tester is telling me that on her end the scene is rendering the hair as white. On my end it's fine. So perhaps there is a SIMPLE but thorough light setup that is Uber-friendly that someone can show me how to set up, to see if I can recreate her issue?

    This is Dawn with my Tina morph and skin and the Hivewire Fashionista Hair with my upcoming hair Strands.  You can see to the right the lights I have in the scene. 

    Uber Environment 2 is set to 50% - Pure White

    Specular 01 is set to 100% - Pale Blue

    Specular 02 is set to 50% - Pale Blue

    Infinite Light is set to 76% - Pale Blue

    RIM Light is set to 100% - Pure White

     

    ScreenHunter_288 Oct. 23 11.05.jpg
    1297 x 830 - 408K
  • sriesch Posts: 4,241
    RAMWolff said:

    Need some assistance.  I'm trying to do some promos using the 3Delight engine using the Uber Environment base with some other lights.  My tester is telling me on her end the scene is rendering out the hair as white.  On my end it's fine. 

    Have your tester try exiting/restarting DS if they haven't already.  There's a known issue where that happens occasionally.  If this is that issue, it should work fine if you exit/restart and then rerender.

  • I also had brown hair go white or beige back when I started resetting gamma. The hair texture apparently had been designed with the gamma at 1.0.

  • RAMWolff Posts: 10,249

    Interesting.  OK.. I'll mention these bits to Suze and see what she says. 

  • RAMWolff Posts: 10,249

    So, pretty happy with the results.  I decided to ditch the Uber stuff; I've never gotten along with it at all, and my beta tester kept getting whiteouts, so I just redid all the mats with the DAZ base and got some very nice results.  It's for my upcoming strands pack.  Decided to try to get Tina's skin looking pretty good.  It's not as pretty as the Iray skin, but it's nice enough.

     

    FashionistaPromo-01.jpg
    1000 x 563 - 455K
  • wowie Posts: 2,029
    edited October 2016

    Still needs work on glass and raising the reflection samples, but the general dielectrics seem good. I think I only use something like 3 presets for all dielectric surfaces.

    test3.jpg
    1343 x 1007 - 671K
    Post edited by wowie on
  • Haslor Posts: 408
    RAMWolff said:
    So, pretty happy with the results.  I decided to do ditch the Uber stuff, I've never gotten along with it at all and my beta tester kept getting whiteouts so I just redid all the mats with DAZ base and got some very nice results.  It's for my upcoming strands pack.  Decided to try to get Tina's skin looking pretty good.  It's not as pretty as the iRay skin but it's nice enough. 

    Is it possible you are not using the Uber shaders that came with Studio, but the purchased ones from omnifreaker?

    Could it be you have bits of code which they do not, and therefore it breaks your shaders?

    Just a thought.

  • RAMWolff Posts: 10,249

    Not sure.  I don't think I've ever bought directly from him, only from the store here.  I've tried reinstalling DAZ Studio but still get severe issues, and now the feedback from my beta tester ... just put me off, so I'm trying to make the 3DL basic shader setup I've created work.  For hair, very well; for skin... meh!

  • Mustakettu85 Posts: 2,933
    RAMWolff said:

    Not sure.  I don't think I've ever bought directly from him but only from the store here.  I've tried reinstalling DAZ Studio but still get severity issues and now the feedback from my beta tester ... just put me off so trying to make the 3DL basic shader set up I've created work.  For hair, very well, for skin... meh!

    If you were using UberSurface2 - that is a store shader. Your tester had to have bought it as well.

    If you were using the first UberSurface, without "2" (which only has one layer), then your tester most likely has it because it comes with DS. If you get different lighting levels between your machine and the tester's, you should make sure that you have the same render settings, especially the gamma section.

    If you want to get acceptable skin using the DS default shader, I would advise you to use the "skin" lighting model and adjust the sheen/scatter colours. If you then manage to find the right balance for your specular settings (spec strength maps will most likely be a must), it will look nice. But it won't be actual SSS - no light penetration.

    If you want to get acceptable skin, with actual SSS, using any of the UberSurface shaders, I have written up everything I know about it in my SSS "treatise". Or you can try to terrorise Wowie and ask him for his secrets =)

    And here I posted starting values for getting working anisotropic highlights on hair using the original UberSurface: http://www.daz3d.com/forums/discussion/comment/815439/#Comment_815439

     

  • Mustakettu85 Posts: 2,933
    wowie said:

     

    Still needs to work on glass and raise the reflection samples, but general dieletric seems good. I think i only use something like 3 presets for all dieletric surfaces.

    Lovely!!

    I also like your choice of colours. It's sort of relaxing, despite the reds.

  • wowie Posts: 2,029
    edited October 2016
    Mustakettu85 said:
    If you want to get acceptable skin, with actual SSS, using any of the UberSurface shaders, I have written up everything I know about it in my SSS "treatise". Or you can try to terrorise Wowie and ask him for his secrets =)

    What secrets? I think they're in the thread somewhere...

    Generally, something like this:

    Most textures, even with gamma correction enabled, come out way too saturated, so what you want to do is subtract a bit of red and let some of the redness from the SSS 'bleed through'. This works quite nicely whether you use the color/diffuse map in the SSS color slot or not.
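The "subtract a bit of red" idea, as a toy sketch (the function name and the amount are made-up example values, not a recommendation for any particular shader):

```python
def pull_red(rgb, amount=0.08):
    """Reduce the red channel of a (linear) diffuse colour so that some
    of the red contribution can come from SSS instead of the texture."""
    r, g, b = rgb
    return (max(r - amount, 0.0), g, b)

print(pull_red((0.55, 0.38, 0.30)))  # red channel drops, green/blue untouched
```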

    To be honest, though, there's only so much you can achieve with the classic Jensen dipole model. I'm not entirely sure, but I think Arnold's cubic mode is somewhat similar to the Jensen dipole, while its directional mode is similar to d'Eon's better dipole. Renderman 21 has a third option: Burley normalized diffusion. Plus the mean free path.

    I couldn't find a paper describing/explaining Burley's method. Edit: It's probably this one: http://graphics.pixar.com/library/ApproxBSSRDF/paper.pdf
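For the curious, the "normalized diffusion" profile from that Pixar paper is simple enough to check numerically: the reflectance profile R(r) = (e^(-r/d) + e^(-r/(3d))) / (8*pi*d*r) integrates to exactly 1 over the plane, which is what "normalized" means here (a sketch only; d is a shaping parameter tied to the mean free path mentioned above, and 0.5 is an arbitrary test value):

```python
import math

def burley_R(r, d):
    """Burley normalized-diffusion reflectance profile R(r)."""
    return (math.exp(-r / d) + math.exp(-r / (3.0 * d))) / (8.0 * math.pi * d * r)

# Integrate R(r) over the plane: the integral of R(r) * 2*pi*r dr should be ~1.
d = 0.5
dr = 1e-4
total = sum(burley_R(i * dr, d) * 2.0 * math.pi * (i * dr) * dr
            for i in range(1, 200_000))
print(round(total, 3))  # ~1.0, and the same holds for any d > 0
```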

    That thread in CG Feedback was quite informative. I think they used a Fresnel term on the SSS/diffuse for additional control.

    Lovely!!

    I also like your choice of colours. It's sort of relaxing, despite the reds.

    Thanks. But the red wasn't my choice, since the reference render shot used red car paint.

    process.jpg
    3415 x 726 - 829K
    Post edited by wowie on