Where is stereoscopic spherical camera?
Hey guys,
I usually try to figure things out by myself as much as I can before asking for help, and that's exactly what I've tried to do, but I'm at a loss here, so here goes: one of the features of the newly released DAZ Studio 4.9.3 was "Stereoscopic spherical camera support has been added". Now, I can see the spherical camera, and I can render a scene with it, but it's not stereoscopic. The resulting image looks like it's rendered from a fixed center point, so if you render two spherical images from two cameras offset by something like 6.5 cm (eye-to-eye distance), you only get a stereo effect when looking straight ahead (talking about a head-mounted display here, naturally). As you look around, the effect degrades, and it becomes completely inverted when you look behind you.
Am I missing something here? There has to be a reason for this feature to be named "stereoscopic spherical camera", so I must be missing some checkbox or setting, but I can't for the life of me find it anywhere on the forums or the docs or in the Studio itself. The "stereo-offset" parameter seems to just be rotating the camera around the aforementioned center point.
I know I can render cubemaps, but that requires rendering multiple images and stitching them together afterwards, whereas a stereoscopic spherical projection would only require combining two images in an "over-under" fashion - way more efficient.
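For anyone unfamiliar with the "over-under" packing mentioned above: the left-eye equirectangular render goes on top and the right-eye render goes below, in one tall image. Here's a toy Python sketch of the idea (my own illustration, not anything from DAZ - a real pipeline would use an image library, but the packing itself is just vertical stacking):

```python
# Toy sketch of the "over-under" stereo packing: left-eye render on
# top, right-eye render below, as one tall frame. Images are modelled
# here as lists of pixel rows for simplicity.

def over_under(left_rows, right_rows):
    """Stack the left-eye image on top of the right-eye image."""
    assert len(left_rows[0]) == len(right_rows[0]), "widths must match"
    return left_rows + right_rows

# Two tiny 2x4 "renders" standing in for full equirectangular images:
left = [["L"] * 4, ["L"] * 4]
right = [["R"] * 4, ["R"] * 4]
frame = over_under(left, right)
print(len(frame), len(frame[0]))  # 4 rows tall, 4 pixels wide
```

The player then crops the top half for the left eye and the bottom half for the right eye.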
Anyway, if anyone's figured that out, I'd really appreciate some tips. Meanwhile I'll continue trying to make it work and post results here if successful.
May something good happen to you today :)
Comments
From what I can tell, a "stereoscopic" function was added, and "spherical" render support, but not a combination of the two. They are two separate features (and logically, the combo isn't even possible...).
I don't understand why the combo wouldn't be possible. It's a fairly well-known method that exists in most render engines nowadays, known as a spherical stereo camera or lat-long camera. You do have to render two images, one for the left eye and one for the right, although some render engines like Octane support the "over-under" output natively (from what I could tell from their demo).
Anyway, I've looked everywhere and tried just about anything, and it looks like you're right, this feature was either misnamed or incorrectly implemented. Gotta wait for an update :)
Duplicate post, please delete
A lat-long (or equirectangular) image is a 360 degree, non-stereoscopic, panoramic type of image (for example, I use the upper half of such images, created with Vue, as skydome textures). They provide a full environment view. Not to be confused with a stereoscopic view.
https://en.wikipedia.org/wiki/Equirectangular_projection
https://en.wikipedia.org/wiki/Stereoscopy
Basically, you'd have to have two cameras, close to each other (roughly 63mm apart), and render each as a spherical camera. I'm curious to try this too, once I can find the software to process those images. Could be a fun update on Viewmaster for Google Cardboard. I'd probably have the two cameras parented to something that you'd want to rotate and translate as a face/set of eyes, then have another object that the cameras point at for their "look at" focus. (sorry, not sure what the technical term is). It is indeed possible with 4.9.
@Outré Limits
Unfortunately, this was the first thing I tried, and it doesn't work, because the camera rotates around its dead center when rendering the spherical view. It's like your eyes rotating inside their sockets, as opposed to you rotating your head. So when you render two images (left and right eye) and then compose them together, you only get a stereo effect when looking directly in front of you. Turn your head 90 degrees left or right, and both of your eyes get almost the same perspective, so you lose depth perception. Look directly behind you, and the left and right eyes effectively swap.
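To put some numbers on that: with two full-sphere cameras at fixed positions, the stereo baseline that matters is the component of the eye separation perpendicular to your view direction, and it collapses as you turn your head. A quick Python sketch (my own numbers, not anything from DAZ):

```python
import math

# Two full-sphere cameras fixed at (-IPD/2, 0, 0) and (+IPD/2, 0, 0).
# For a viewer at yaw angle `yaw_deg` (0 = straight ahead), only the
# component of the eye separation perpendicular to the view direction
# produces disparity.
IPD = 0.065  # ~65 mm eye-to-eye distance, in metres

def effective_baseline(yaw_deg):
    """Perpendicular eye separation seen at a given head yaw."""
    return IPD * math.cos(math.radians(yaw_deg))

for yaw in (0, 45, 90, 180):
    print(yaw, round(effective_baseline(yaw), 4))
# 0   -> 0.065   full stereo looking straight ahead
# 45  -> 0.046   already degraded
# 90  -> 0.0     no depth at all
# 180 -> -0.065  eyes swapped behind you
```

Which matches exactly what you see in the headset: full depth in front, flat at 90 degrees, inverted behind.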
I assumed that "stereoscopic spherical camera" (@glaseye - I am merely quoting the official press release; it's the term they used) means you can offset the center, basically the origin that the camera rotates around, but I haven't found a way to do it yet. So again, you can sort of produce 360 VR, but only when you're looking directly in front of you - that is, if you want to do it "the easy way", by rendering only two images. You can still achieve 360 stereo using cubemaps, but that requires rendering 12 images (6 per eye).
I still have no clue why this feature was described by the devs as "stereoscopic spherical camera"; I assume they either meant "panoramic" or something is not working the way it was intended to.
BTW, Outré Limits, try the "Shotcut" editor. It's free and has the features necessary for producing VR content. It only works with video, so what I've been doing for images is simply creating a 1-second video from a still image (I'm using the over-under method). It's not perfect, but it works. My current customer was very impressed when he could look around his office, which currently only exists as a 3D model. It was rendered in V-Ray, though, not DAZ.
Have you figured this out yet? The closest I got to making this work is rendering two spherical cameras in the same place, with "stereo offset" set to -20 on one and 20 on the other (you can't go beyond those limits), and the result does appear stereoscopic. However, that's only 40mm apart, and our eyes are around 65mm apart.
I also tried rendering 12 separate images, each a 90 degree face, but the edges don't stitch together well.
-boneheaded reply removed-
sorry
Update: it has something to do with "ODS projection", which is the standard way of rendering 360 stereoscopic images. It's a kind of faux 3D, but it's close to the real thing. Otherwise, we would need to render a pair of images for every possible head orientation, which is impossible to pack into just two images. You can read about it in a Google developer document.
In ODS projection, everything looks 3D except for things closer than about 60cm in front of you and things within about 3.5m directly above or below you.
I think the DAZ "stereo offset" setting implements ODS projection, because it looks pretty good in VR. You just need to scale the world down to about 60% of its size (since the maximum offset of 40mm is only about 60% of our ~65mm eye distance).
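For anyone curious why ODS fixes the "lost depth at 90 degrees" problem: instead of two fixed camera positions, each eye's ray origin slides around a small circle as the longitude (image column) changes, so the baseline stays perpendicular to the view direction at every yaw. A rough Python sketch of the geometry (my own reading of the Google ODS description, not DAZ's actual implementation):

```python
import math

# ODS (omni-directional stereo) sketch: each eye's ray origin sits on
# a circle of radius IPD/2, tangent to the view direction for that
# longitude, so the effective baseline never collapses as you turn.
IPD = 0.065  # assumed eye separation in metres

def ods_ray(theta, eye):
    """Ray origin and direction for longitude theta (radians),
    in the horizontal plane. eye = -1 for left, +1 for right."""
    direction = (math.sin(theta), math.cos(theta))
    origin = (eye * IPD / 2 * math.cos(theta),
              -eye * IPD / 2 * math.sin(theta))  # tangent offset
    return origin, direction

# The left/right origins stay exactly IPD apart at every longitude:
for theta in (0.0, math.pi / 2, math.pi):
    (lx, ly), _ = ods_ray(theta, -1)
    (rx, ry), _ = ods_ray(theta, +1)
    print(round(math.hypot(rx - lx, ry - ly), 4))  # 0.065 every time
```

Compare that with two fixed cameras, where the usable baseline shrinks to zero at 90 degrees and inverts behind you. The price is the artifacts mentioned above: very near objects and the poles (straight up/down) don't reconstruct correctly.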
Here's something I made. Tested using Oculus Virtual Desktop (it has an environment creator that lets you create 3D panoramas from two panoramic images, so you don't even need to convert to a cubemap).
One last tip: always render at 2x the resolution (12288 x 6144 x 2), but don't scale it down - let your HMD do the downscaling for you. This makes it sharper in VR. I've scaled the one here down due to space constraints.
I think there was a grand total of one P.A. who had a stereo offering in the store... I last saw this a few years ago, probably before 2015. I can't remember his name but I think it was a guy (Brian something?) and I think it had to do with Bryce or Vue. **I THINK** it processed two renders to make an image for viewing with red-blue glasses.
Of possible interest: I get e-mail from a company in Iceland from time to time. Their current offering is a prismatic gadget that clips onto a smartphone, resulting in two left/right images being recorded, and I suppose you can view the result with a pair of stereo lorgnettes or a stereoscope. I think the woman has a picture of herself (holding the rig) posted on her web page: https://www.kula3d.com