Stereoscopic Camera Rig for IRAY Using Offaxis Projection

zaz777 Posts: 115

The set of cameras in the scene subset file in the attached zip uses ERCs to control the Lens->Lens Shift X (mm) parameter, allowing the user to render stereo image pairs using the off-axis projection method.  This is different from the more commonly seen toe-in method, which can produce vertical parallax issues.

The following text is from the README included in the attached zip file.

INTRODUCTION

The "Stereo Camera Set IRAY.duf" file contains a scene subset with 3 cameras
(a center camera that can be used to frame the shot and the left and right cameras
from which to render the left and right images) and a null object.  This
set of cameras provides a convenient way to render stereoscopic images or videos.

The cameras are set up to use an off-axis projection, rather than the toe-in method
most stereo camera rigs use.  The toe-in setup causes problems with vertical parallax
in the rendered images, which can lead to eye strain and/or motion sickness.
For more information about the differences, see
http://paulbourke.net/exhibition/vpac/theory.html or use your favorite search
engine to find pages which reference "stereoscopic" and "frustum".

The underlying mechanism used is the new "Lens->Lens Shift X (mm)" camera
parameter available in 4.9.3.166 and later.  I assume this parameter only works
in Iray, so these cameras will likely only function properly in Iray renders.

Everything is controlled from the BASE camera via ERCs.  One only needs to set up
the shot using the BASE camera and then render separate images with the LEFT
and RIGHT cameras.

CAVEATS

Since I don't have an HMD, I have no idea how well this works for spherical
projection.  I expect it doesn't, but it might.

This was my first attempt at using ERCs to control parameters on one object
via parameters on another object.  Everything might break horribly when in
the hands of a normal/real user.

The Lens->Lens Shift X (mm) parameter on the cameras is only active while
rendering.  If you are looking through the left or right camera and the
viewport DrawStyle isn't "NVIDIA Iray", you aren't actually seeing the
scene as it will be rendered.  So, if you need precise positioning of
the cameras, make sure you use the "NVIDIA Iray" DrawStyle in the
viewport.  The Base Camera's Lens Shift parameter isn't modified by
the ERCs and in most cases will give you a good understanding of
what the cameras will see.  

Don't change the locked values on the LEFT and RIGHT cameras nor on the
PROJECTION PLANE null.  They are locked and/or hidden for a reason.
Modifying them directly will cause the computation of the Lens Shift
X parameter to be incorrect.

If the intent is to view the stereo image pairs as anaglyphs, you may
want to oversaturate your images a bit as they'll appear washed out
most of the time.
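
If you'd rather script that saturation bump than do it in an image editor,
here is a minimal sketch using Pillow (the 1.3 factor and the file names are
just placeholders to tune by eye, not anything the rig produces):

  # Boost saturation a bit before anaglyph conversion (Pillow).
  from PIL import Image, ImageEnhance

  for name in ("left", "right"):                      # hypothetical file names
      img = Image.open(name + ".png").convert("RGB")
      ImageEnhance.Color(img).enhance(1.3).save(name + "_sat.png")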

Ugly script icons are still my specialty, :-).

INSTALLATION

Unzip the file into one of your Daz Studio content directories.  The zip file
contains:

  ReadMe's/Zaz/Stereo Camera Set IRAY - README.txt
  Presets/Cameras/Stereo Camera Set IRAY.duf
  Presets/Cameras/Stereo Camera Set IRAY.duf.png
 
You are reading the "Stereo Camera Set IRAY - README.txt" file.  The other two
files can be found in "Daz Studio Formats->CONTENTDIRNAME->Presets->Cameras"
in your Content Library pane while running Daz Studio.  Replace CONTENTDIRNAME
with the name of the directory used to unzip the file.

DEFINITIONS

Projection Plane Depth: Objects at this distance from the cameras will
appear to the viewer as if they are directly on the viewing screen.  Objects
further away will be behind the screen and objects closer will appear in front
of the screen.

Camera Offset: This is the distance the left and right cameras are offset from
the base camera.  The higher the offset, the more pronounced the stereoscopic
effect will be.

See the paulbourke.net link above for some detailed information on how
this actually works.

USAGE

Merge the "Stereo Camera Set IRAY.duf" file into your scene.  Use "BASE
Stereo Camera" to:
   - frame your shot
   - set the Camera->Focal Length (mm) parameter on the 3 stereo cameras.
     ERCs on "BASE Stereo Camera" will control the parameter on the left
     and right cameras.
   - set the projection plane depth via the Stereo->Projection Plane Depth parameter;
     the "PROJECTION PLANE Depth" null will be moved in the scene to give
     visual feedback, and this value will be used to compute the correct
     Lens->Lens Shift X (mm) parameter for the LEFT and RIGHT cameras.
   - set the left and right camera offsets with Stereo->Offset;
     the "X Translate" parameter of the LEFT and RIGHT cameras will be changed
     via ERCs, and this value will be used to compute the correct
     Lens->Lens Shift X (mm) parameter for the LEFT and RIGHT cameras.
   - Render images from both the "LEFT Stereo Camera" and the "RIGHT Stereo
     Camera" and save them.

VIEWING STEREOSCOPIC IMAGE PAIRS

What you do after you render is a bit beyond the scope of this project.
Meaning, you chose to go down this rabbit hole and I, or others, may or may
not have time to answer questions beyond the whys and hows of the cameras
included in this scene subset.  However, below are some pointers to let you
dig the hole as deep as you want, :-).

I normally use one of several methods to view stereoscopic image pairs as
red/cyan anaglyphs.
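
If you just want a quick red/cyan anaglyph without a dedicated viewer, here is a
minimal sketch using NumPy and Pillow (file names are placeholders; dedicated
tools like StereoPhoto Maker do this plus fancier mixes such as Dubois):

  # Simple color anaglyph: red channel from the left eye, green/blue from the right.
  import numpy as np
  from PIL import Image

  left = np.asarray(Image.open("left.png").convert("RGB"))
  right = np.asarray(Image.open("right.png").convert("RGB"))

  anaglyph = right.copy()          # green and blue stay from the right eye
  anaglyph[..., 0] = left[..., 0]  # red channel taken from the left eye

  Image.fromarray(anaglyph).save("anaglyph.png")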

Under Linux, a viewer I wrote years ago is my preference.

Under Windows I usually use the ancient, but still very useful, functional,
and still maintained "StereoPhoto Maker" ( http://stereo.jpn.org/eng/stphmkr/ ).
I also sometimes use NVIDIA's "3D Vision Photo Viewer", which can be installed with
their drivers.  StereoPhoto Maker has more features than NVIDIA's program, last
I looked anyway.

I used to use my shutter glasses for viewing, but that was long ago as the
ones I have require CRTs.  When the dust settles some, I'll probably grab
a Vive or other non-Facebook (read their EULA if you want to know why) HMD
for an upgrade.  I bought a few pairs of anaglyph glasses from some web site
too long ago to remember who it was.

If you want to do movies, you can hack up some scripts to drive ffmpeg, as I
did :-), or grab "StereoMovie Maker" from the same place you can download
StereoPhoto Maker.  There are likely many other options, but I haven't
explored them for quite a while.
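
For reference, here is a sketch of the kind of ffmpeg-driving script mentioned
above, assuming two rendered frame sequences named left_0001.png... and
right_0001.png... (names and frame rate are placeholders); it stacks the two
eyes side by side:

  # Build a side-by-side stereo movie from left/right frame sequences with ffmpeg.
  import subprocess

  subprocess.run([
      "ffmpeg",
      "-framerate", "30", "-i", "left_%04d.png",   # left-eye frames
      "-framerate", "30", "-i", "right_%04d.png",  # right-eye frames
      "-filter_complex", "[0:v][1:v]hstack=inputs=2[v]",
      "-map", "[v]",
      "-c:v", "libx264", "-pix_fmt", "yuv420p",
      "stereo_sbs.mp4",
  ], check=True)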

 

Attachment: Stereo Camera set IRAY.zip (13K)

Comments

  • Thank you!  :)

  • mindsong Posts: 1,701

    appreciated, thanks a bunch!

    --ms

  • Thank you! This is amazing! Trying it out now.

  • AndyS Posts: 1,438
    edited April 2017

    Several years ago I did something similar (back in the good old 3Delight days).
    It's available as a freebie on ShareCG.

    But it doesn't matter whether it's 3Delight or Iray. It is still usable.

    Does the StereoPhoto Maker software support any hardware/interfaces for shutter glasses? Or does it only composite the two images? Because that could be done with any primitive graphics program (e.g. a very old version of Paint Shop Pro).

  • zaz777 Posts: 115

    You're welcome.  I'm glad some are finding it useful.

    @AndyS I think I actually looked at your camera setup on ShareCG.  IIRC, the setup looked similar to the toe-in based camera rigs I'd already created for myself.  To implement an off-axis projection method for 3Delight, I think you'd need to make a camera shader, or render wider than wanted and crop to your final resolution while adjusting the offset.

    As for Stereo Photo Maker and Stereo Movie Maker, they support multiple types of output: more types of anaglyph than one would think exist, some forms of shutter glasses, some specific 3D panels, interlaced, and possibly a few others.

    I may have used Stereo Photo/Movie Maker with shutter glasses when I still had a CRT that worked with my ancient shutter glasses.  My play with stereoscopic images today is limited to anaglyph while I wait for prices, quality and standards to settle in the HMD market.

  • AndyS Posts: 1,438

    Yes Zaz,

    I work with anaglyph, too.
    And the "toe in" method (as you call it) is a good possibility to determine, where later the projection plane level sits. So you can have things in front, in and behind your computer screen.
    It was often used in the 3D cinema films to get the impression, things swallow the audience in front of you, while appearing out of the depth of the scene.

    With your setup, everything in the scene appears well in front of the screen. There isn't any parallax between the cameras, as there normally is for our eyes viewing an object.
    What is the influence of your "Projection Plane Depth"? I couldn't notice any effect.
    Your cameras only record a picture from two simply shifted positions, so you have your "Offset" as an unused overlap on both sides of the composite.
    Or did something not work as designed?

  • zaz777 Posts: 115
    AndyS said:

    And the "toe in" method (as you call it) is a good possibility to determine, where later the projection plane level sits. So you can have things in front, in and behind your computer screen.
    It was often used in the 3D cinema films to get the impression, things swallow the audience in front of you, while appearing out of the depth of the scene.

    The link to the Portable Stereoscopic Projection page that I provided in the original post and readme goes into greater detail on the weaknesses of the toe-in method, but I'll recap a bit of the information here.

    The toe-in camera setup seems correct at first glance as you're replicating how the eyes work.  However, since the devices, for example a monitor, used to view images rendered this way aren't normally internal to the eye, unexpected distortion is created in the images.  The distortion lessens the fidelity of the illusion and can cause eye strain.

    I created my first stereo renders in Blender long ago with a toe-in camera setup.  I thought it was the right way until I saw something posted by Ton describing why it was wrong.  In many renderers (like Blender at the time), it isn't possible to resolve the problem, and one is left with either the toe-in method or rendering wider and setting the final projection by aligning and cropping the rendered images.

    You can get good results with a toe-in camera rig, but how good the results are is highly dependent on the scene.  You can get better results by rendering wider and then aligning and cropping in post production, but it's a fair amount of extra manual work and requires rendering parts of the image you will never use.
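
    As a rough sketch of the geometry behind that render-wider-and-crop approach (illustrative numbers only, not the exact computation the rig's ERCs perform), the amount you would render wide and then crop works out to:

      # Horizontal disparity, in pixels, of a point at the chosen
      # projection-plane depth when the two cameras are parallel.  Cropping
      # this many columns from the left edge of the LEFT image and the right
      # edge of the RIGHT image puts that depth at zero parallax.
      import math

      def crop_pixels(separation_cm, plane_depth_cm, h_fov_deg, image_width_px):
          half_fov = math.radians(h_fov_deg) / 2.0
          return separation_cm * image_width_px / (2.0 * plane_depth_cm * math.tan(half_fov))

      # Example: 12 cm total camera separation, plane at 400 cm, 60 degree
      # horizontal FOV, 1920 px wide render -> about 50 px.
      print(round(crop_pixels(12.0, 400.0, 60.0, 1920)))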

    The benefits of the off-axis projection method are that it doesn't suffer from the distortion issues of a toe-in camera rig, it eliminates the need for the aligning-and-cropping post-production step, and you only render pixels that will appear in the final image pairs.

    AndyS said:

    With your setup, everything in the scene appears well in front of the screen. There isn't any parallax between the cameras, as there normally is for our eyes viewing an object.

    The illusion of things appearing in front of the screen is available in both toe-in and off-axis projection camera setups.

    AndyS said:

    What is the influence of your "Projection Plane Depth"? I couldn't notice any effect.
    Your cameras only record a picture from two simply shifted positions, so you have your "Offset" as an unused overlap on both sides of the composite.
    Or did something not work as designed?

    The stereoscopic effect in this camera setup only works in NVIDIA Iray renders.  The Camera->Lens group is not functional for any other renderer in DAZ Studio and is only available in the production DAZ Studio 4.9.3.166 or later (first seen in the 4.9.3.128 RC3 beta).  If the current renderer in DAZ Studio isn't set to NVIDIA Iray or you're running an older version, you won't see any changes.

    As mentioned in the original post and readme, you also won't see changes in the viewport unless the viewport is in NVIDIA Iray mode.  This is a bit of a shortcoming of DAZ Studio's viewport.  OpenGL has the ability to change the camera's frustum, but that isn't used in DAZ Studio.

    In viewing the rendered images, the projection plane is the surface of the monitor or other device on which the rendered images are viewed.  Parts of the image pair that are coincident (share the same pixels in the images/are at the same location in the images) will appear to exist in space on the surface of the viewing device.

    NOTE: I chose the term "surface" here explicitly.  If your viewing surface isn't a flat plane (i.e. a normal monitor), a distortion will be introduced.  I believe the current HMDs are effectively, or actually, non-planar viewing devices.  How well this rig works, even if the Lens->Lens Distortion Type parameter is set to spherical, is unknown to me as I currently don't have access to one.

    In rendering the images, the projection plane is the plane in the scene which, in the render and when viewed, will appear to exist in space on the surface of the viewing device.  This plane doesn't actually exist in the scene; I'm talking about a virtual/mathematical plane.  If one wants to better visualize it, one could parent a plane to the PROJECTION PLANE Depth null object on my camera rig.  I didn't do that as I felt it would confuse too many people.

    The Portable Stereoscopic Projection page has pictures to better visualize what I describe above.

    With that said, the BASE Stereo Camera->Stereo->Projection Plane Depth (cm) parameter sets the projection plane distance from the cameras for the render.  The parameter is used to move the PROJECTION PLANE Depth null object to give a visual reference of where the projection plane is, and is used, along with BASE Stereo Camera->Stereo->Offset (cm), in the computation of the LEFT/RIGHT Stereo Camera->Lens->Lens Shift X (mm) parameter.

    The Lens Shift X (mm) parameter (again, this is only available when NVIDIA Iray is the active renderer and in the latest release of DAZ Studio) is what provides the parallax in this camera rig.  It effectively performs the aligning-and-cropping post-production step automatically.

    Objects in the scene that are the same distance from the cameras as the PROJECTION PLANE Depth null object will appear to exist on the surface of the viewing device.  Objects further from the cameras will be behind that surface and objects closer will be in front of the surface.

    It's possible there is a bug.  A relatively simple test that doesn't involve a render is to check/record the Lens->Lens Shift X (mm) on the LEFT and RIGHT stereo cameras.  By default, the LEFT camera should have a Lens Shift X (mm) of -0.15 and the RIGHT 0.15.  The BASE Stereo Camera->Stereo->Projection Plane Depth (cm) parameter defaults to 400.0 and the BASE Stereo Camera->Stereo->Offset (cm) defaults to 6.00.

    If you change either or both the BASE Stereo Camera->Stereo->Projection Plane Depth (cm) or the BASE Stereo Camera->Stereo->Offset (cm), the Lens->Lens Shift X (mm) parameters on both the LEFT and RIGHT cameras should change.
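
    For a quick sanity check of those values, the proportionality implied by the defaults above (6 cm offset and 400 cm depth giving +/-0.15 mm) can be written as a tiny script.  The 10 mm constant below is inferred from those two defaults alone; it is not a published formula, so treat this as a rough check rather than the rig's exact computation:

      # Expected Lens Shift X, inferred only from the quoted defaults
      # (offset 6 cm, plane depth 400 cm -> +/-0.15 mm); not the ERC formula.
      K_MM = 10.0  # 0.15 mm * 400 / 6

      def expected_lens_shift_mm(offset_cm, plane_depth_cm):
          # Value expected on the RIGHT camera; the LEFT camera gets the negated value.
          return K_MM * offset_cm / plane_depth_cm

      print(expected_lens_shift_mm(6.0, 400.0))  # 0.15, the default
      print(expected_lens_shift_mm(6.0, 200.0))  # 0.30, with a closer projection plane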

    If the Lens->Lens Shift X (mm) parameter isn't changing, then it's possible the ERCs aren't working.

    If you can't find the LEFT/RIGHT Stereo Camera->Lens->Lens Shift X (mm) parameters, then you're either running an older version of DAZ Studio or you don't have the NVIDIA Iray render engine selected on the Render Settings tab/pane.

    The rig works fine here when loading it into new scenes across new runs of DAZ Studio, so I think it should work for others.  However, this is the first time I've released something with ERCs and the first project/item where I've worked with ERCs controlled from other objects.  It is entirely possible that there is something I missed, but I think they should be working.

    @AndyS I apologize if what I've written above comes across as trivial or basic to you.  I expect you know much, if not all, of what I've described.  My intent was to ensure that we are talking with the same vocabulary and to provide information for users who have less knowledge and/or experience in the subject matter.

  • AndyS Posts: 1,438
    edited April 2017

    Hi zaz,

    zaz777 said:
    AndyS said:

    And the "toe in" method (as you call it) is a good possibility to determine, where later the projection plane level sits. So you can have things in front, in and behind your computer screen.
    It was often used in the 3D cinema films to get the impression, things swallow the audience in front of you, while appearing out of the depth of the scene.

    The link to the Portable Stereoscopic Projection page that I provided in the original post and readme goes into greater detail on the weaknesses of the toe-in method, but I'll recap a bit of the information here.

    Yes, of course, I studied this link.

    zaz777 said:

    The toe-in camera setup seems correct at first glance as you're replicating how the eyes work.  However, since the devices, for example a monitor, used to view images rendered this way aren't normally internal to the eye, unexpected distortion is created in the images.  The distortion lessens the fidelity of the illusion and can cause eye strain.

    I never experienced unexpected distortions. But it depends on how well people are able to develop a 3-dimensional imagination, and sometimes on some training in decoupling focus and parallax.
    Cropping is not useful, because you always have objects that don't cover each other 100%.

    zaz777 said:

    I created my first stereo renders in Blender long ago with a toe-in camera setup.  I thought it was the right way until I saw something posted by Ton describing why it was wrong.  In many renderers (like Blender at the time), it isn't possible to resolve the problem, and one is left with either the toe-in method or rendering wider and setting the final projection by aligning and cropping the rendered images.

     

    zaz777 said:
    AndyS said:

    With your setup, everything in the scene appears well in front of the screen. There isn't any parallax between the cameras, as there normally is for our eyes viewing an object.

    The illusion of things appearing in front of the screen is available in both toe-in and off-axis projection camera setups.

    Sure. But with toe-in you have the chance to see objects in front of, on, and behind the projection plane. This is a really big advantage.

    zaz777 said:
    AndyS said:

    What is the influence of your "Projection Plane Depth"? I couldn't notice any effect.
    Your cameras only record a picture from two simply shifted positions, so you have your "Offset" as an unused overlap on both sides of the composite.
    Or did something not work as designed?

    The stereoscopic effect in this camera setup only works in NVIDIA Iray renders.  The Camera->Lens group is not functional for any other renderer in DAZ Studio and is only available in the production DAZ Studio 4.9.3.166 or later (first seen in the 4.9.3.128 RC3 beta).  If the current renderer in DAZ Studio isn't set to NVIDIA Iray or you're running an older version, you won't see any changes.

    Yes, of course, I'm using the 4.9.3.166 Beta. And yes, the renderer is Iray. Did I state something different?

    zaz777 said:

    The Portable Stereoscopic Projection page has pictures to better visualize what I describe above.

    If you change either or both the BASE Stereo Camera->Stereo->Projection Plane Depth (cm) or the BASE Stereo Camera->Stereo->Offset (cm), the Lens->Lens Shift X (mm) parameters on both the LEFT and RIGHT cameras should change.

    When I changed the Projection Plane Depth parameter, nothing happened to the lens shift. Everything was still with the Iray render setting selected for the scene.
    I noticed those parameters are locked. They can't change. Was that your intention?

    If you have a close look at the illustrations at that link, you'll notice that the left and right corners of the two camera frustums (left eye and right eye) are asymmetrical in opposite ways. Those asymmetrical cameras are not available in DAZ.

    In real photography you need so-called "shift" lenses to create the necessary distortion. There are different kinds and ways of using those shift adapters. You can correct "falling" lines in architectural photography as well as apply the asymmetrical shift of the optical axis (necessary for the off-axis projection you wanted to use).

    OK - at the moment I have a long render running. But later I would like to follow your description step by step again.

    Attached is a comparison of your method and mine (done with Iray and DAZ 4.9.3.166).

    Do you notice the disadvantage of the off-axis projection?
    The character appears at half the distance between your eyes and the screen, whereas the focus is on the screen.  --> Good luck for your eyes.

    Stereo Test mine 3D.jpg
    1000 x 900 - 94K
    Stereo Test zaz 3D.jpg
    1000 x 900 - 107K
    stereo projection methods.jpg
    612 x 789 - 81K
  • zaz777 Posts: 115
    AndyS said:
    Sure. But with toe-in you have the chance to see objects in front of, on, and behind the projection plane. This is a really big advantage.
    AndyS said:
    zaz777 said:
    AndyS said:

    What is the influence of your "Projection Plane Depth"? I couldn't notice any effect.
    Your cameras only record a picture from two simply shifted positions, so you have your "Offset" as an unused overlap on both sides of the composite.
    Or did something not work as designed?

    The stereoscopic effect in this camera setup only works in NVIDIA Iray renders.  The Camera->Lens group is not functional for any other renderer in DAZ Studio and is only available in the production DAZ Studio 4.9.3.166 or later (first seen in the 4.9.3.128 RC3 beta).  If the current renderer in DAZ Studio isn't set to NVIDIA Iray or you're running an older version, you won't see any changes.

    Yes, of course, I'm using the 4.9.3.166 Beta. And yes, the renderer is Iray. Did I state something different?

    No, you weren't specific, but had mentioned both 3Delight and Iray.  Also, many people have stayed with 4.8 or earlier versions of 4.9 for various reasons.

    AndyS said:
    zaz777 said:

    The Portable Stereoscopic Projection page has pictures to better visualize what I describe above.

    If you change either or both the BASE Stereo Camera->Stereo->Projection Plane Depth (cm) or the BASE Stereo Camera->Stereo->Offset (cm), the Lens->Lens Shift X (mm) parameters on both the LEFT and RIGHT cameras should change.

    When I changed the Projection Plane Depth parameter, nothing happened to the lens shift. Everything was still with the Iray render setting selected for the scene.
    I noticed those parameters are locked. They can't change. Was that your intention?

    Yes, I locked those intentionally so they would be controlled only by the ERCs.  The same with the X Translate parameters on the LEFT and RIGHT cameras.  I also locked all of the other Transform parameters so users didn't modify them via zooming the camera while viewing through it.

    AndyS said:

    If you have a close look at the illustrations at that link, you'll notice that the left and right corners of the two camera frustums (left eye and right eye) are asymmetrical in opposite ways. Those asymmetrical cameras are not available in DAZ.

    In real photography you need so-called "shift" lenses to create the necessary distortion. There are different kinds and ways of using those shift adapters. You can correct "falling" lines in architectural photography as well as apply the asymmetrical shift of the optical axis (necessary for the off-axis projection you wanted to use).

    That lens shift is exactly what the whole rig is supposed to be automating for the user.

    AndyS said:

    OK - at the moment I have a long render running. But later I would like to follow your description step by step again.

    Attached is a comparison of your method and mine (done with Iray and DAZ 4.9.3.166).

    Do you notice the disadvantage of the off-axis projection?
    The character appears at half the distance between your eyes and the screen, whereas the focus is on the screen.  --> Good luck for your eyes.

    Yeah, the results you're getting with my rig are worse than anything I want to look at for more than a few seconds, :-).  Given that you're not seeing the PROJECTION PLANE Depth null parented to the BASE Stereo Camera, I expect those results are because the ERCs aren't working.

    I've attached an example of what my rig can do when it's working.  The left box is behind the projection plane, the middle box is on the projection plane and the right box is in front.  The images were rendered in DAZ Studio with the rig and converted to an anaglyph in StereoPhoto Maker.

    Are you seeing any errors in the log file when you merge the Stereo Camera Set Iray.duf file into your scene?

    I specifically used a null object to avoid having to grab items from the data/auto_adapted directory.  I've also reviewed the duf file and it appears to be completely self-contained, except for the cameras, which should be available from your base 4.9.3.166 install.

    s002.png
    1024 x 1024 - 611K
  • AndyS Posts: 1,438
    edited April 2017

    OK, I see.

    Is your projection plane represented by the little red/green cross?
    That's what I thought, and I positioned it exactly level with the character. But you still see the outcome.

    And I found the shift parameter I think you're talking about. But it only shows very small values (0.006).
    I'm working with German metric settings. But that shouldn't matter, I hope. ;-)

    So - if the ERCs don't work, the question is why. As I told you: DAZ 4.9.3.166 BETA with Iray.

  • zaz777 Posts: 115
    AndyS said:

    OK, I see.

    Is your projection plane represented by the little red/green cross?

    I use red/cyan parallel.

    AndyS said:


    That's what I thought, and I positioned it exactly level with the character. But you still see the outcome.

    Yep, you're definitely not getting the expected results for some reason.

    AndyS said:

    And I found the shift parameter I think you're talking about. But it only shows very small values (0.006).
    I'm working with German metric settings. But that shouldn't matter, I hope. ;-)

    It's all metric anyway, as much as that matters for virtual stuff.  With the cameras offset from the center camera by 6.0 cm and the projection plane depth at 400.0 cm, the computed Lens Shift X (mm) should be -0.15 and 0.15 for the left and right cameras respectively.  So, if you're seeing 0.006 in those parameters, something strange is going on.

    I actually initially just played around with that parameter manually until I got the desired results.  I then worked back from there to determine the correct computation for the ERC control of the Lens Shift.  So, the rig I made isn't strictly necessary; it just makes it easier to get consistent results.

    AndyS said:

    So - if the ERCs don't work, the question is why. As I told you: DAZ 4.9.3.166 BETA with Iray.

    Yeah, I'm updating my laptop for a 'clean room' install.  It's taking a while as I haven't run DAZ Studio on it since the 4.7 days.  The install of DAZ Studio and tons of other content/updates, as well as upgrading the database, is probably going to take another 30 to 60 minutes.

    I never tested against 4.9.3.166 BETA, but would expect it to work.  I might try that later.

  • zaz777 Posts: 115

    Oh, here are the raw images rendered so you can use whatever method of viewing you desire.

    s002-L.png
    1024 x 1024 - 635K
    s002-R.png
    1024 x 1024 - 635K
  • AndyS Posts: 1,438

    Hi Zaz,

    zaz777 said:
    Yep, you're definitely not getting the expected results for some reason.

    I found the reason:
    On my first try, I saved the complete scene in 4.9.2 with Iray engine settings.
    It is pretty strange, but just rendering with the newer version 4.9.3.166 didn't help. I had to reload it as a scene subset - without cameras, without render settings, etc. - in 4.9.3.166. Only then did I add your stereo camera and set up the render settings again. And only then did it work.

    I think this is an important hint. Scenes built up with older versions of DAZ Studio are not usable with your stereo camera. Even rendering them in the latest version doesn't help.

  • AndyS Posts: 1,438

    A further question:

    The BASE Stereo Camera->Stereo->Offset (cm) is set to 6, which corresponds very well to the normal distance between the eyes. But for the left and right cameras these X offsets are -6 cm and 6 cm, which together add up to 12 cm. That way the 3D images would look like photos of dolls, table-top model shots, or similar. Or what is the idea behind it?

  • zaz777 Posts: 115
    AndyS said:

    Hi Zaz,

    zaz777 said:
    Yep, you're definitely not getting the expected results for some reason.

    I found the reason:
    On my first try, I saved the complete scene in 4.9.2 with Iray engine settings.
    It is pretty strange, but just rendering with the newer version 4.9.3.166 didn't help. I had to reload it as a scene subset - without cameras, without render settings, etc. - in 4.9.3.166. Only then did I add your stereo camera and set up the render settings again. And only then did it work.

    I think this is an important hint. Scenes built up with older versions of DAZ Studio are not usable with your stereo camera. Even rendering them in the latest version doesn't help.

    Good to see that you figured out the problem.

    Late last night my laptop finally finished updating and upgrading DAZ Studio and the camera rig worked fine on it.  I was running out of options as to why it wasn't working for you.

    Given that the Camera->Lens parameters are only available in 4.9.3.128 and later, saving a scene with the camera rig in it while using an older version of DAZ Studio likely broke all the ERCs.  Loading that scene in a version that supported them isn't likely to have restored the function of the ERCs.

    You are correct that the BASE Stereo Camera->Stereo->Offset (cm) default is too high to represent a pair of human eyes.  I originally had it at 3 cm.  I changed it when I was working out the equations for the ERC.  The larger separation made it easier to detect when things were properly lined up and I never changed it back.

    If you want to change the default, just set the default via the (gear icon)->Parameter Settings... on the BASE Stereo Camera->Stereo->Offset (cm) parameter and save the rig in a new scene subset.  I don't think I'll do another release to only change the default.  If other changes accrue, I'll include the offset change with them in a new release at some point in the future.

  • AndyS Posts: 1,438
    edited April 2017

    Thanks Zaz,

    yes I know ...

    The strange thing is that merely saving the scene (without your camera) with an older DAZ version already prevents your camera from working properly.
    So it seems that some parameter in the render settings of the older version conflicts with the shift mechanism.

  • barbult Posts: 24,248
    edited June 2017

    This is great! I really appreciate the details you put into creating this camera and the instructions. The projection plane positioning works wonderfully for setting the place where the image will be positioned with regard to what is in front of and behind the screen. I appreciate that you maintained parallel cameras and did not make them toe in. Here is my first attempt with your camera. I saved it as a grayscale anaglyph with StereoPhoto Maker.

    G3F Lick Ice Cream_002 stereo gray anaglyph.JPG
    1600 x 2000 - 2M
  • RGcincy Posts: 2,834
    edited September 2017
    barbult said:

    This is great! I really appreciate the details you put into creating this camera and the instructions. 

    Very nice @barbult! Definitely has the hand extending out of the screen.

    Thanks @zaz777 for the camera preset. I was playing around with the stereo offset myself when I found this discussion thread. I downloaded the camera and got better results the first time using it than from the few experiments I tried. Made it very easy. Here's a render using your camera and assembled using StereoPhoto Maker. This used the Dubois red/cyan color anaglyph setting, which gave the best results from the various options.

    Here's another looking down from the coach's seat. Best to click on image for larger version.

    wagon anaglyph.JPG
    850 x 510 - 476K
    riding into town.JPG
    1200 x 720 - 853K
  • I'm very impressed; with a bit of adjustment the effects are very pleasing. Thank you so much.

  • Nice 3D comics. Thank you for this camera.

  • dragon440 Posts: 33

    Using it to make 360 stereo VR images for Oculus Go with Daz Studio Pro. Thanks a lot! I modified the lens distances a little bit, to -3.25 and +3.25, with excellent results! Thanks again for this!

  • MarkH Posts: 79

    I believe mCasual is working on a VR camera rig but have not seen any updates in many months.

  • FirePro9 Posts: 456
    dragon440 said:

    Using it to make 360 stereo VR images for Oculus Go with Daz Studio Pro. Thanks a lot! I modified the lens distances a little bit, to -3.25 and +3.25, with excellent results! Thanks again for this!

    If you find a 360 stereo solution for DS, please let us know.  I believe that, at the moment, DS stereo 360 images look good looking forward, but as one turns around the stereo gets messed up.

  • dragon440 Posts: 33
    FirePro9 said:
    dragon440 said:

    Using it to make 360 stereo VR images for Oculus Go with Daz Studio Pro. Thanks a lot! I modified the lens distances a little bit, to -3.25 and +3.25, with excellent results! Thanks again for this!

    If you find a 360 stereo solution for DS, please let us know.  I believe that, at the moment, DS stereo 360 images look good looking forward, but as one turns around the stereo gets messed up.

    Pardon my late reply. As you said, everything looks good to the front, but if you turn around 180° the view becomes messy because left and right are inverted. If you try the awkward position of looking down and forward, everything looks upside down but OK. I don't have the knowledge to solve this.
