Pose capture from Mocap hardware/software

I have been looking for a good software/hardware way to capture real-life poses and transfer them to a Genesis figure.
I have heard there are some mocap options, but I am not too familiar with that, plus I don't need animation.
Can anyone suggest a suitable solution, preferably full-body, with facial expressions?
Comments
The only way to get human mocap data into DAZ Studio is as BVH, after passing it through some other third-party software like Autodesk MotionBuilder or Reallusion iClone Pro Pipeline.
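For anyone who hasn't looked inside one: BVH is just a text format - a joint hierarchy followed by one row of channel values per frame - so it's easy to inspect before committing to a pipeline. A minimal, untested Python sketch that lists the joints and frame count of an export; the file name is a placeholder:

```python
# Rough sketch (untested): peek inside a BVH file to see what a mocap export
# actually contains -- a joint hierarchy plus per-frame channel data.
# "capture.bvh" is a placeholder for whatever your tool exports.

def summarize_bvh(path):
    joints = []
    frames = 0
    frame_time = 0.0
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if not tokens:
                continue
            if tokens[0] in ("ROOT", "JOINT"):
                joints.append(tokens[1])          # bone name as the exporter wrote it
            elif tokens[0] == "Frames:":
                frames = int(tokens[1])           # number of captured frames
            elif tokens[0] == "Frame" and tokens[1] == "Time:":
                frame_time = float(tokens[2])     # seconds per frame
    return joints, frames, frame_time

joints, frames, frame_time = summarize_bvh("capture.bvh")
print(f"{len(joints)} joints, {frames} frames at {frame_time:.4f}s/frame")
print(joints)
```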
It seems it would be less expensive to buy poses from the DAZ & Renderosity markets and learn to adjust them yourself as needed.
If you are not animating, I cannot think of any mocap solution that would give you better results than just doing it manually from multi-angle reference images. Even more so for facial expressions. You vastly overestimate how good mocap systems costing less than $30,000 are.
Not to mention that even the best mocap hardware and software don't circumvent the need to do manual adjustments.
The OP may be interested in 3D rotoscoping...
Wow, I am really disappointed.
I was sure there would be something half decent at this point.
I saw a Kinect demo and got excited.
Currently, I am using default bought poses and manually changing them.
But I was hoping to create something more natural on my own.
Perception Neuron and iClone 7 (or other compatible software) is probably the only affordable, but still very expensive, hobbyist solution out there; beyond that you are likely looking at tens of thousands of dollars.
and just for a pose?
I would understand mocap, but even then it would need to be pretty good mocap of something not on the market already - special sports, gymnastics, fight moves, choreography, etc. - as there are only so many idle, talking, walking, eating, sitting and standing poses people need.
... and probably the adult erotic industry, but they understandably prefer using real people to 3D.
IMO a good photographer is a better choice: capturing your pose from all angles makes it easier to match it with a rigged 3D figure.
Probably a link is not allowed; google neuronmocap.
start making a list ...
Actually, you do need animations. The motion helps the algorithms converge. It's like dForce clothing, in reverse. So let the system track you through a few moves, terminating on where you need to be.
I've used a setup with multiple Kinect 2 units. Can't remember the name, K2 has been dead for a while, and I haven't gotten any K3 yet for my own work. I really need to finish the new space.
I have to admire the animation people. Personally I don't have the patience to do animation and very rarely even try it. I gave it a bash today and had to look up how to do it because it's been so long.
A 120-frame, 350x400-pixel one took nearly an hour to finish, and that was with a good RTX GPU in a very simple scene. Can't imagine how long it would take at a decent size with multiple scene elements.
I might just stick to still renders.
Brekel mocap with a Kinect might be able to get you some decent usable motion sequences that you can use as a baseline to extract some still-pose files for earlier DAZ/Poser figures (and up-convert to G3/G8), depending on your standards/needs. The soon-to-be-released version looks like the result of lessons well learned over a number of years working in that domain. Probably $250 to play for body motions (vs. face/hands).
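The sequence-to-still step itself is simple: a BVH MOTION block has one line of channel values per frame, so keeping a single line gives you a one-frame file you can import as a static pose. A rough, untested sketch with placeholder file names and frame number:

```python
# Rough sketch (untested): copy the hierarchy of a BVH and keep only one
# frame of the MOTION block, turning a captured sequence into a "pose".
# File names and the chosen frame index are placeholders.

def freeze_bvh_frame(src, dst, frame_index):
    with open(src) as f:
        lines = f.readlines()

    # BVH layout: HIERARCHY ... MOTION / "Frames: N" / "Frame Time: x" / data rows
    motion = next(i for i, l in enumerate(lines) if l.strip() == "MOTION")
    header = lines[:motion + 1]
    frame_time = lines[motion + 2]
    data = lines[motion + 3:]

    with open(dst, "w") as out:
        out.writelines(header)
        out.write("Frames: 1\n")
        out.write(frame_time)              # keep the original frame time
        out.write(data[frame_index])       # the one frame you want to keep as a pose

freeze_bvh_frame("capture.bvh", "still_pose.bvh", frame_index=120)
```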
(I notice that the current iClone Motion LIVE suite doesn't even offer a Kinect plugin solution. Their older mocap was Kinect-based, but the new stuff is devoid of that capture hardware option.)
As others have indicated, doing the full grab, clean, apply, refine process with mocap is best done with a full tool suite that supports the entire workflow - keyframe and graph editors, etc. - and something that gets you some decent data from the start. I personally wouldn't use DS for anything over a minute or so, but I would recommend it with 'canned' animations for moving figures into dynamic still shots as mentioned by the OP. Others will likely argue. Do your own due diligence (starting at the current DS release and beta threads - look for animation comments...).
One other angle: look into Mixamo and some of the other pre-created mocap animations that are available in the DAZ store and elsewhere (Mixamo = Adobe). If you import those, you can freeze any frame to taste and adjust/save it as a standard pose if you like. AniBlocks can be used this way as well.
best,
--ms
If I were going to do this, I'd probably use canned poses and try to vary them with Puppeteer.
A lot of developers stopped working with Kinect when the incredible Kinect 2 came out. K2 was faster than K1, and literally about 10x higher resolution and more accurate. Unfortunately, Microsoft decided to use it as a vehicle to promote Windows 8 back in the day when serious stuff was running on 7. So we went from K1, to K2, to K3 [images of the three sensors in the original post], and that's what happened to the 3D-camera-based motion capture industry.
Are you sure about that? This link has instructions for Ubuntu 18.04. I've had two for a good long while, but I haven't tried it with Linux yet, believe it or not.
https://docs.microsoft.com/en-us/azure/kinect-dk/sensor-sdk-download
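As a hedged sanity check of the Linux/Ubuntu route, the snippet below assumes the Azure Kinect Sensor SDK is installed and uses the third-party pyk4a Python binding (not Microsoft's own C API) just to confirm the K3/Azure Kinect is delivering frames:

```python
# Rough sketch (untested): assumes the Azure Kinect Sensor SDK and the
# third-party pyk4a binding are installed (pip install pyk4a).
from pyk4a import PyK4A

k4a = PyK4A()                  # default configuration
k4a.start()
capture = k4a.get_capture()    # grab one capture from the running device
if capture.color is not None:
    print("color frame:", capture.color.shape)   # BGRA numpy array
if capture.depth is not None:
    print("depth frame:", capture.depth.shape)
k4a.stop()
```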
Very interesting history - one to save in the "tidbits" file.
I knew the USB 3.x drivers were a requirement for keeping up with the K2 data rates, and some of the USB3 h/w controllers didn't work well. That thinned the crowd and Kinect momentum too.
I've also seen older versions of drivers work on W10 when newer versions don't, etc., so it seems somewhat intentional sometimes.
It's amazing how the protectionist silos that vendors try to throw out there predictably fail each time, and yet history rhymes again.
@wiz, knowing this, what direction have you taken for your mocap efforts (body, hands, facial) in light of the mentioned constraints?
To the OP: there are some basic bone-capturing apps for the original Xbox 360 Kinect, and if you can capture a popular BVH skeleton and convert/map it to Poser or DAZ figure bones, you may be able to 'play' for almost free in a Windows environment.
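If you try that route, most of the work is mapping the capture skeleton's joint names onto the figure's bone names. A purely illustrative Python sketch - the BVH names follow common conventions and the DAZ-side names are typical Genesis 8 bones, so check your own figure's bone list before relying on it:

```python
# Illustrative sketch only: BVH joint names differ per capture tool, and the
# DAZ-side names below are typical Genesis 8 bone names -- verify against
# your own figure before mapping a skeleton.
BVH_TO_GENESIS = {
    "Hips":         "hip",
    "Spine":        "abdomenLower",
    "Spine1":       "chestLower",
    "Neck":         "neckLower",     # assumption; G8 splits the neck into lower/upper
    "Head":         "head",
    "LeftArm":      "lShldrBend",
    "LeftForeArm":  "lForearmBend",
    "LeftHand":     "lHand",
    "RightArm":     "rShldrBend",
    "RightForeArm": "rForearmBend",
    "RightHand":    "rHand",
    "LeftUpLeg":    "lThighBend",
    "LeftLeg":      "lShin",
    "LeftFoot":     "lFoot",
    "RightUpLeg":   "rThighBend",
    "RightLeg":     "rShin",
    "RightFoot":    "rFoot",
}

def map_joint(bvh_name):
    # Fall back to the original name so unmapped joints are easy to spot.
    return BVH_TO_GENESIS.get(bvh_name, bvh_name)
```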
YMMV,
--ms