Creating a Genesis 8 character from a custom mesh?

Comments

  • Silent Winter Posts: 3,721

    arstropica said:

    Silent Winter said:

    Cool :)

    But if you export at 1%, shouldn't you reimport at 10,000%? (Or do you mean export from Blender at 100% and then import to Daz at 10,000%?)

    For some reason, Blender's OBJ export plugin doesn't use percentages, so 1.0 is 100%; the 100.0 setting is actually 10,000%.

    Oh, I see what you mean. I just leave Blender as-is and export/import to DS at opposite scales.

  • Kainjy Posts: 821
    edited June 2021

    I'm following this... curious about the different solutions shown.

  • I have been trying to use Wrap by Russian3DScanner, but my issue is the eyelashes. Does anyone know how to deal with this? (There are no eyelashes on my photogrammetry model.)

     

    Thanks!

  • combatwombat781 Posts: 2
    edited August 2023

    arstropica said:

    Richard Haseltine said:

    I'm sure Blender must have some kind of projection/shrink-wrap type function. Hexagon will snap to a background object, I think, though I'm not sure if it works with a push/shrink type of tool.

    Based on my research, Blender's Shrinkwrap modifier uses a combination of vertex positioning and vertex groups to deform meshes. As a tool, it is more of a broadsword than a scalpel for any kind of detailed retopology work. I've checked out HeadShop, FaceGen and Daz's Face Transfer plugin, and none of them really deliver good results. My initial goal was to take 2D information from a photograph of a face and project it onto a Daz figure as accurately as possible. I had to come up with my own modeling workflow, which uses 3 different pieces of software to import photogrammetric objects into Daz.

    1. Face Alignment (free / open source)

     

    URL: https://github.com/1adrianb/face-alignment

    Requirements: Linux (apparently macOS and Windows too), Miniconda (https://docs.conda.io/en/latest/miniconda.html), Python 3.7

    Installation: Clone my repository fork (https://github.com/arstropica/face-alignment/tree/demo) and follow the Github instructions.

    This Python script is part of a 2-step process that generates an accurate 3D facial model from almost any photograph of a human in which the facial features are visible. The purpose of the script is to identify and localize facial landmarks on a 2D photo that can be used to build a 3D model later on. In its original form, the script will generate 68 facial landmarks (the standard dlib convention; dlib is a computer-vision library). But we really only need 5 of these landmarks (2 eye centers, 1 nose tip, and 2 mouth corners).

    I had to write my own implementation of the script to extract the 5 landmarks for the next step. (https://github.com/arstropica/face-alignment/blob/demo/scripts/batch.py).

    Run python scripts/batch.py <image directory> to generate the landmark text files.
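
    If you'd rather roll your own version of that step, here is a minimal sketch of the idea (this is not the batch.py linked above; the paths, the eye-center averaging, and the exact text format are my assumptions, so compare the output against the example detection files in the Deep 3D repo):

    import face_alignment
    import numpy as np
    from skimage import io

    # 2D landmark detector from https://github.com/1adrianb/face-alignment
    # (older releases spell the enum LandmarksType._2D instead of TWO_D)
    fa = face_alignment.FaceAlignment(face_alignment.LandmarksType.TWO_D,
                                      flip_input=False, device='cpu')

    img = io.imread('images/subject.jpg')           # hypothetical input photo
    pts = fa.get_landmarks_from_image(img)[0]       # (68, 2) array; assumes one face was found

    # Reduce the 68 dlib-style points to the 5 that Deep 3D wants:
    # eye centers, nose tip, and the two mouth corners.
    five = np.stack([
        pts[36:42].mean(axis=0),   # left eye center  (points 36-41)
        pts[42:48].mean(axis=0),   # right eye center (points 42-47)
        pts[30],                   # nose tip
        pts[48],                   # left mouth corner
        pts[54],                   # right mouth corner
    ])

    # One "x y" pair per line, same base name as the image.
    np.savetxt('images/subject.txt', five, fmt='%.3f')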

     

    2. Microsoft Deep 3D Face Reconstruction (free / open source)

     

    URL: https://github.com/microsoft/Deep3DFaceReconstruction

    Requirements: Linux, Github Large File Storage (https://git-lfs.github.com/), Miniconda (https://docs.conda.io/en/latest/miniconda.html), Python 3.7, g++ & gcc Linux libraries

    Installation: You can either clone the original repository, set up an Anaconda environment, and download the dependencies separately, or use my branch (https://github.com/arstropica/Deep3DFaceReconstruction/tree/demo), which contains the Anaconda environment file as well as the dependencies; either way, follow the GitHub instructions. If you do use my branch, you'll also need to install and initialize GitHub's LFS extension (https://git-lfs.github.com/) in the cloned repository in order to download the 100 MB+ model files.

    The Deep 3D package uses AI to reconstruct a 3D facial model from an image. The Python script accepts the 5 landmarks from the previous step (similar to FaceGen) and uses a trained AI model to render a textured 3D mesh of the facial characteristics of the subject in a photograph.

    Copy your text and image files from the image directory in the face-alignment repo to an "input" directory in Deep3D, and run the demo.py script to generate the mesh(es).
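
    If you'd rather script that copy-and-run step, here is a rough sketch (the relative paths are placeholders for your own checkout locations, and it should be run from the Deep 3D repo root):

    import shutil
    import subprocess
    from pathlib import Path

    src = Path('../face-alignment/images')   # images plus their matching .txt landmark files
    dst = Path('input')                      # the "input" directory mentioned above
    dst.mkdir(exist_ok=True)

    for f in src.iterdir():
        if f.suffix.lower() in {'.jpg', '.jpeg', '.png', '.txt'}:
            shutil.copy(f, dst / f.name)

    # Run the reconstruction; the textured mesh(es) are written to the repo's output folder.
    subprocess.run(['python', 'demo.py'], check=True)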

    3. Softwrap for Blender (paid plugin)

     

    URL: https://blendermarket.com/products/softwrap

    Requirements: Blender v2.8+, MeshLab

    Installation: Install from the Edit > Preferences > Add-ons panel like any Blender add-on.

    This is finally where the magic happens!

    Softwrap is a Blender plugin that uses a cloth-like physics engine (not unlike dForce) to deform one mesh using another. It does this without changing the vertex order of the source mesh, so it's perfect for Daz. There are a ton of settings, from symmetrical deformation to mesh filtering, smoothing, elasticity, etc. A better way to look at the plugin is that it treats your source mesh as a cloth that wraps around a hard 3D object. Using the plugin to get a Genesis model to conform to a 3D mesh requires a few preliminary steps.

    First, take your 3D face mesh from Deep 3D and use an application like MeshLab to convert it to a DAE model. You need to do this if you want to import the face model into Daz Studio, as it doesn't handle OBJ vertex colors well.
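
    If you prefer to script this conversion, pymeshlab (MeshLab's Python module) can do the same OBJ-to-DAE hop. A small sketch, assuming your pymeshlab build includes the Collada exporter, with a made-up file name for the Deep 3D output:

    import pymeshlab

    ms = pymeshlab.MeshSet()
    ms.load_new_mesh('output/subject_mesh.obj')   # hypothetical Deep 3D output file
    ms.save_current_mesh('subject_face.dae')      # DAE keeps the vertex colors for the bake below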

    Then, in Blender, unwrap the model's UV map and bake the DAE model's vertex colors to a bitmap texture (here is a good tutorial if you don't know how: https://www.youtube.com/watch?v=7cphcAZ5ai8). It is a good idea to normalize the positioning of the model by rotating it, aligning it with the origin, and resetting its pivot.

    Export the baked texture and the unwrapped model back to OBJ and MTL format. Now you can import it into Daz.

    At this point, I usually make shape or pose adjustments to my Genesis character's head in Daz. I import my textured facial model into a scene with a vanilla Genesis character and position and/or scale it so that it covers the character's face. I then use the shape and pose morphs to make gross changes to the Genesis facial characteristics so that the forehead, eyes, nose, mouth, chin and jaw are roughly the same shape, size and dimensions as my facial mask. This doesn't have to be perfect, but it will help with Softwrap's retopology process.

    Export your Genesis character from Daz as an OBJ at base resolution (I use 1% scale to work in Blender).  Also export your positioned 3D facial model the same way.

    Import both OBJ models into Blender (ensuring the Poly Groups option is checked).

    You may want to create a custom vertex group for the parts of your Genesis model that should not be affected by Softwrap. I usually create a group excluding the character's face, lips and eyelids.
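
    The import and the vertex group setup can also be done from Blender's Python console; a sketch for the 2.8x/2.9x legacy OBJ importer (the file paths, object name and polygroup names are assumptions, so match them to what your exports actually contain):

    import bpy

    # Import both OBJs, keeping vertex order and turning the OBJ polygroups into
    # vertex groups (split_mode='OFF' is "Keep Vert Order", use_groups_as_vgroups
    # is the "Poly Groups" checkbox).
    for path in ('/tmp/genesis8_base.obj', '/tmp/deep3d_face.obj'):   # placeholder paths
        bpy.ops.import_scene.obj(filepath=path,
                                 split_mode='OFF',
                                 use_groups_as_vgroups=True)

    gen = bpy.data.objects['genesis8_base']    # whatever name Blender gave the Genesis import

    # Collect the vertices belonging to the face-area polygroups...
    face_groups = [gen.vertex_groups[n].index
                   for n in ('Face', 'Lips', 'Eyelids') if n in gen.vertex_groups]
    face_verts = {v.index for v in gen.data.vertices
                  if any(g.group in face_groups for g in v.groups)}

    # ...and put everything else into a "Frozen" group for Softwrap's Snapping & Frozen slot.
    frozen = gen.vertex_groups.new(name='Frozen')
    frozen.add([v.index for v in gen.data.vertices if v.index not in face_verts],
               1.0, 'ADD')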

    In the Softwrap rollout, select the Genesis character as the source mesh and the Deep 3D mask as the target mesh.  Select your vertex group as the Snapping & Frozen Group to protect the bits of your Genesis character that shouldn't be affected by the plugin. Hit the Start button and slowly increase the Snapping strength control to see the changes.

    Use the pins and other settings to fine-tune the fit, then hit Apply once you are satisfied. You can also leave the deformation as a shape key and continue modifying your mesh.

    Then you just export the mesh back out of Blender (this time at 100%), import it into Daz as a morph on a clean Genesis character, and you have a good starting point for a custom character without ever having to do any manual modelling.
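
    The trip back out of Blender can be scripted the same way with the legacy OBJ exporter (the path, object name and scale factor are assumptions; the important part is keep_vertex_order so Daz will accept the OBJ as a morph, e.g. via Morph Loader Pro):

    import bpy

    gen = bpy.data.objects['genesis8_base']    # the wrapped Genesis mesh (assumed name)
    bpy.ops.object.select_all(action='DESELECT')
    gen.select_set(True)
    bpy.context.view_layer.objects.active = gen

    bpy.ops.export_scene.obj(filepath='/tmp/genesis8_morph.obj',
                             use_selection=True,
                             keep_vertex_order=True,   # required for a clean morph in Daz
                             use_materials=False,
                             global_scale=1.0)         # adjust to undo any scaling used on the way in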

    Hope someone finds this helpful!

    Has anyone gotten this to work, or improved on this workflow?

    I can use Softwrap in Blender, but I have not had much luck with face-alignment or Deep 3D Face Reconstruction on Windows.

    It seems like there is a new Pytorch-based version of Deep 3D Face Reconstruction that doesn't require the 5 facial landmarks: https://github.com/sicxu/Deep3DFaceRecon_pytorch#inference-with-a-pre-trained-model
