Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2024 Daz Productions Inc. All Rights Reserved.
Comments
You can merge meshes by making a geograft. In effect, the new mesh can 'hide' the part of the G8 mesh that it covers. If you want a tutorial and a product [keeping in mind I'm not a professional], I would need the 3D scan, polished up as you would want it to be. If you just want a geograft tutorial, we could use a lantern or something like that for a head.
editing to add image
But a geograft is not the same thing as the mesh itself, right? But if it works, I'll take the geograft!!! :) In any case I am very interested in the tutorial! Thanks a lot!!!
"Geograft" is just a term ... new mesh attached to old mesh ... producing it requires part of the old mesh to be attached to it [hence selecting a ring of normal faces to border it]. When the attachment is loaded onto the figure, it will auto-hide whatever was selected to be hidden. Geografts can only be used on/for the figure they are created for. So which G8 figure exactly are you interested in?
I'm not sure what you mean there,
I'd consider using FaceGen; take a few shots to get face-on and profile views, and as others have said you might not like FaceGen's morph, but using black and white versions of the shots you took would allow you to get your face shape using morphs. It may just be quicker than the time spent looking for an alternative.
That is new for me: how can black and white pics create the morph for me (other than via the standard FaceGen procedure)?
I don't think you want a geograft head. A head mesh with a different vertex count than the Genesis base won't work with any of its morphs. Morphs are based on vertices, as you probably know.
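To illustrate the point about morphs (a minimal sketch with a made-up 3-vertex mesh, not Daz's actual data format): a morph stores one positional delta per vertex, matched purely by index, so any change in vertex count or order breaks it.

```python
import numpy as np

# Toy 3-vertex "mesh" and a morph target stored as per-vertex offsets.
base = np.array([[0.0, 0.0, 0.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0]])
deltas = np.array([[0.0, 0.0, 0.1],
                   [0.0, 0.0, 0.2],
                   [0.0, 0.0, 0.0]])

def apply_morph(base, deltas, weight):
    # Deltas are matched to vertices by index only, so a mesh with a
    # different vertex count cannot use this morph at all.
    if base.shape != deltas.shape:
        raise ValueError("vertex count changed: morph cannot apply")
    return base + weight * deltas

morphed = apply_morph(base, deltas, 0.5)  # dial the morph to 50%
```

This is why a geograft head with its own topology inherits none of the Genesis base's face morphs.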
If it is a morph instead of a geograft, it would indeed simplify things on the texture front...
You might be correct; however, clothing morphs with the figure, and geografts are "fit to" items as well. It's not a 5-min. project, so we'll find out later.
There is no perfect automated way to do what you want,
using the scanned head as a 3D guide is probably your best option to be honest
I used both Blender's shrinkwrap and ZProjection for vertex paint and shape transfer myself, but only after manually adjusting the meshes to be as close as possible, using a combination of morphs and soft select, mostly in Carrara. In my case it was iClone Character Creator figures and Adobe Fuse figures. I then had to attenuate the Morph Loader morph to only include the face, then the head, etc., so I could use the original eye and mouth parts moved to fit. A lot of back-and-forth importing and exporting.
Black & white shots would likely result in less accurate morphs from FaceGen, because the depth information each pixel carries is reduced compared to a full-color picture.
Take full-color pictures from 10-15 feet away, zoomed in, but framed no closer than the shoulders. Take multiple direct frontal shots and profile shots from both sides until you have at least 3 that are sharply and evenly focused. Take them when the ambient lighting in the room is strongest and most evenly cast about the room, so likely sometime between 10:00 and 14:00.
Don't make the lighting too bright when taking the pictures, because it will blow out the whites and flatten the picture. No strong shadows. The face should not be in shadow, but well lit via ambient light bouncing off the walls, floors, and ceiling.
Yes, I figured that it would be a complex process... Hence my question whether there was any documentation/tutorials available to guide me through such a pipeline :) because we already have ZBrush, Hexagon, R3DS Wrap, Blender... and who knows how many other applications that could help me do this... So yes, the plot thickens... :)
I think photogrammetry only looks at the brightness of each pixel; color is ignored. You don't want pure white or pure black pixels, so highlights and shadows should be eliminated. If that's a correct assumption, then running a color photo and a grayscale version of it (the desaturated color) through FaceGen would give you the same results.
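If that assumption holds, the reason a grayscale copy would behave the same is that desaturation preserves per-pixel brightness. A quick sketch using the common Rec. 601 luma formula (one of several desaturation conventions; FaceGen's internals are not documented here):

```python
def luma(r, g, b):
    # Rec. 601 luma: perceptual brightness of an RGB pixel (0-255 channels).
    # Desaturating an image replaces each pixel with this single value.
    return 0.299 * r + 0.587 * g + 0.114 * b

# Two very different colors can share nearly the same luma, so a
# brightness-only algorithm cannot tell them apart.
red_brightness = luma(255, 0, 0)
gray_brightness = luma(76, 76, 76)  # a gray pixel with similar brightness
```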
Maybe that's why FaceGen comes up short, then, because apparently pure photogrammetry can't estimate surface normals or other properties for a pixel. Those might be derived if the entire color spectrum were used, along with knowing whether a surface is skin, hair, eyes, or various cloths, so PBR databases could be applied to refine the result. If metallic and other material properties are not taken into account, the results won't always be correct.
Working with Genesis 3 Male, the concept works for giving it a geograft head. The new mesh does accept some activity from the morphs; results depend on many factors. After the new head is made and definitely ready to be grafted, it does of course have to be UV-mapped, and that is not happening in Hexagon lol ...
I think they build a 3d point cloud first, then construct a surface from that. Vertex normals are generated from the constructed surface, so normals are the very last thing calculated. I guess you could say photogrammetry is basically a 3d point plotter.
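As a toy illustration of that last step (hypothetical points, not tied to any particular photogrammetry package), the normal of one reconstructed triangle is just the normalized cross product of two edge vectors:

```python
import numpy as np

def face_normal(p0, p1, p2):
    # Normal of a triangle built from three reconstructed 3D points:
    # cross product of two edge vectors, scaled to unit length.
    n = np.cross(p1 - p0, p2 - p0)
    return n / np.linalg.norm(n)

n = face_normal(np.array([0.0, 0.0, 0.0]),
                np.array([1.0, 0.0, 0.0]),
                np.array([0.0, 1.0, 0.0]))
# For this counter-clockwise triangle in the XY plane, n points along +Z.
```

Vertex normals are then typically averaged from the normals of the faces sharing each vertex, which is why they only exist once the surface has been constructed.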
That's surprising because a geograft mesh can literally be anything, from a horse's head to a tree stump. I would guess you would have to create your own morphs for a geograft head, everything from opening its eyes and mouth, to smiles and expressions. In other words, a lot of unnecessary work, if you go that route.
If you want a detailed working head, yes. If you want to put your own image on a shaped sphere* and be happy with that, then it takes less than a day to make and doesn't cost quite as much as this delightful looking product sold in the store.
* actually was using the gsuit by Joequick.
A character like this is possible, I think, because they masked out all the vertices that control the face and only added the geograft to the top of the head:
https://www.daz3d.com/lekkulian-2-for-genesis-8-male
https://www.daz3d.com/lekkulian-2-for-genesis-8-female
And to blend the geograft to the rest of the face, like the brows, I think those are morphed vertices, not geograft vertices.
Just a guess. Maybe the eyebrows are part of the geograft too, it's hard to tell.
Experimenting ... mouth and neck, concept works ... eye rings next to do [on another day!]
That's G8M with some Freak 3 dialed in.
Looks promising!
Yeah! It all worked! Tutorial is in the proof-reading basket so may be released in the next day or so. I'm including the mask as a product for any and all to enjoy. It is recommended to avoid making morphs anywhere too near the red borders which are illustrated in the tutorial!
It doesn't matter which modeler you use as long as 'nothing' happens to the red and yellow mesh areas ... you can delete all of the rest of the mask and carefully weld your 3D scan mesh to the yellow border lines. That is a derivative border drawn up from the exact mesh, so there is a little more play allowed as to exactly where the line goes from the yellow to your mesh. When done, save out the .obj file however you usually do, being careful not to change the vertex count on the red mesh. Instructions for how to make a geograft are included in the tutorial. I tend to write my tutorials with the beginner in mind; however, geografting is not exactly a beginner topic.
Some teasers ...
edit to add: for anybody interested there is a package of some 372 face morphs for G8M in the store.
It's ready! Link to my post in Freebies. Link retired for now.
I'll be sure to read it through in the coming days!!!
Is the new FaceTransfer feature any good for this? Let's find out! :)
Based on my research, Blender's shrinkwrap uses a combination of vertex positioning and vertex groups to deform meshes. As a tool, it is more of a broadsword than a scalpel for any kind of detailed retopology work. I've checked out HeadShop, FaceGen, and Daz's Face Transfer plugin, and none of them really deliver good results. My initial goal was to take 2D information from a photograph of a face and project it onto a Daz figure as accurately as possible. I had to come up with my own modeling workflow, which uses 3 different types of software to import photogrammetric objects into Daz.
1. Face Alignment (free / open source)
URL: https://github.com/1adrianb/face-alignment
Requirements: Linux (apparently Mac and Windows too), Miniconda (https://docs.conda.io/en/latest/miniconda.html), Python 3.7
Installation: Clone my repository fork (https://github.com/arstropica/face-alignment/tree/demo) and follow the Github instructions.
This Python script is part of a 2-step process that generates an accurate 3D facial model from almost any photograph of a human in which the facial features are visible. The purpose of the script is to identify and localize facial landmarks in a 2D photo that can be used to build a 3D model later on. In its original form, the script will generate 68 facial landmarks using DLib, a computer vision library. But we really only need 5 of these landmarks (2 eye centers, 1 nose tip, and 2 mouth corners).
I had to write my own implementation of the script to extract the 5 landmarks for the next step. (https://github.com/arstropica/face-alignment/blob/demo/scripts/batch.py).
Run python scripts/batch.py <image directory> to generate the landmark text files.
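For the curious, the reduction from the full landmark set down to the 5 points can be sketched roughly like this (indices follow the common 68-point dlib convention; the actual batch.py in the fork may do it differently):

```python
import numpy as np

def to_five_points(lm68):
    # Reduce dlib's standard 68-point layout to the 5 landmarks
    # Deep3DFaceReconstruction expects: left eye, right eye, nose tip,
    # left mouth corner, right mouth corner.
    lm68 = np.asarray(lm68, dtype=float)   # shape (68, 2)
    left_eye  = lm68[36:42].mean(axis=0)   # average the 6 left-eye points
    right_eye = lm68[42:48].mean(axis=0)   # average the 6 right-eye points
    nose_tip  = lm68[30]
    mouth_l   = lm68[48]
    mouth_r   = lm68[54]
    return np.stack([left_eye, right_eye, nose_tip, mouth_l, mouth_r])

# Synthetic landmarks, just to show the shape of the data.
lm = np.arange(136, dtype=float).reshape(68, 2)
five = to_five_points(lm)   # shape (5, 2)
```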
2. Microsoft Deep 3D Face Reconstruction (free / open source)
URL: https://github.com/microsoft/Deep3DFaceReconstruction
Requirements: Linux, Github Large File Storage (https://git-lfs.github.com/), Miniconda (https://docs.conda.io/en/latest/miniconda.html), Python 3.7, g++ & gcc Linux libraries
Installation: You can either clone the original repository, set up an Anaconda environment, and then download the dependencies separately, or use my branch (https://github.com/arstropica/Deep3DFaceReconstruction/tree/demo), which contains the Anaconda environment file as well as the dependencies, and follow the GitHub instructions. If you use my branch, you'll also need to install and initialize GitHub's LFS extension (https://git-lfs.github.com/) in the cloned repository in order to download the 100M+ model files.
The Deep 3D package uses AI to reconstruct a 3D facial model from an image. The python script accepts the 5 landmarks from the previous step (similar to Facegen) and uses a trained AI model to render a textured 3D mesh of the facial characteristics of the subject in a photograph.
Copy your text and image files from the image directory in the face-alignment repo, to an "input" directory in Deep3D and run the demo.py script to generate the mesh(es).
3. Softwrap for Blender (paid plugin)
URL: https://blendermarket.com/products/softwrap
Requirements: Blender v2.8+, Meshlab
Installation: install from the settings/add-ons panel like any Blender plugin.
This is finally where the magic happens!
Softwrap is a Blender plugin that uses a cloth-like physics engine (not unlike dForce) to deform one mesh using another. It does it without changing the vertex order of the source mesh so it's perfect for Daz. There are a ton of settings from symmetrical deformations to mesh filtering, smoothing, elasticity, etc. A better way to look at the plugin is that it treats your source mesh as a cloth that wraps around a hard 3D object. Using the plugin to get a Genesis model to conform to a 3D mesh requires a few preliminary steps.
First, take your 3D face mesh from Deep 3D and use an application like MeshLab to convert it to a DAE model. You need to do this if you want to import the face model into Daz Studio, as it doesn't handle OBJ vertex colors well.
Then, in Blender, unwrap the model's UV map and bake the DAE model's vertex colors to a bitmap texture. (Here is a good tutorial if you don't know how: https://www.youtube.com/watch?v=7cphcAZ5ai8). It is a good idea to normalize the positioning of the model by rotating and aligning it with the origin and resetting its pivot.
Export the baked texture and the unwrapped model back to an OBJ and MAT format. Now you can import it into Daz.
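As an aside on why the conversion step is needed: per-vertex colors in OBJ are a nonstandard extension where the "v" line carries RGB values after the coordinates, which many importers simply ignore. A small illustrative parser (a sketch, not how MeshLab or Daz actually read the file):

```python
def parse_obj_vertex_colors(text):
    # Parse the nonstandard "v x y z r g b" extension some tools
    # (e.g. MeshLab) use to store per-vertex colors in OBJ files.
    positions, colors = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts or parts[0] != "v":   # skip vt, vn, f, comments, etc.
            continue
        nums = [float(p) for p in parts[1:]]
        positions.append(tuple(nums[:3]))
        # A plain vertex has 3 numbers; a colored one has 6.
        colors.append(tuple(nums[3:6]) if len(nums) >= 6 else None)
    return positions, colors

obj = "v 0 0 0 1.0 0.5 0.25\nv 1 0 0\n"
pos, col = parse_obj_vertex_colors(obj)
```

Baking those colors to a UV-mapped texture, as described above, sidesteps the whole problem.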
At this point, I usually make shape or pose adjustments to my Genesis character's head in Daz. I import my textured facial model into a scene with a vanilla Genesis character and position and/or scale it so that it covers the character's face. I then use the shape and pose morphs to make gross changes to the Genesis facial characteristics so that the forehead, eyes, nose, mouth, chin, and jaw are roughly the same shape, size, and dimensions as my facial mask. This doesn't have to be perfect, but it will help with Softwrap's retopology process.
Export your Genesis character from Daz as an OBJ at base resolution (I use 1% scale to work in Blender). Also export your positioned 3D facial model the same way.
Import both OBJ models into Blender (ensuring the Poly Groups option is checked).
You may want to create a custom vertex group for the part of your Genesis model that should not be affected by Softwrap. I usually create a group excluding the character's face, lips, and eyelids.
In the Softwrap rollout, select the Genesis character as the source mesh and the Deep 3D mask as the target mesh. Select your vertex group as the Snapping & Frozen Group to protect the bits of your Genesis character that shouldn't be affected by the plugin. Hit the Start button and slowly increase the Snapping strength control to see the changes.
Use the pins, and other settings to fine tune the fit and then hit apply once you are satisfied. You can also leave the deformation as a shapekey and continue modifying your mesh.
Then you just export the mesh back to Daz (this time at 100%) and import it as a morph on a clean Genesis character and you have a good beginning for a custom character without ever having to do manual modelling.
Hope someone finds this helpful!
Kick ass, I'll give it a try.
Hi Arstropica, this is Laslo from Abalone LLC, creators of HeadShop. I sent you a PM, love to talk to you about a project.
My email is abalonellc@yahoo.com or laslov@hotmail.com. My skype is lasloves, I am in California.
Hope to talk to you soon.
Laslo
I replied to your message.
Cool :)
But if you export at 1%, shouldn't you reimport at 10,000%? (or do you mean export from blender at 100% and then import to Daz at 10,000%?)
For some reason, Blender's OBJ export plugin doesn't use percentages, so a scale of 1.0 is 100%. Using the 100.0 setting is actually 10,000%.