Sculpting new character morphs and using lip-sync

Hi Folks,

I have been successfully using the Genesis 3 figures, along with store-bought character morphs and clothing, for a number of animation projects completed in C4D over the last few years. I have been exporting to C4D as FBX, and more recently via the Cinema 4D Bridge script.
I have also been using the Lip-Sync plug-in (via the 32-bit version) in all of these projects, with a surprising amount of success in over 26 languages. I have done manual lip-syncing previously, which is fine for short sections of speech, but with almost industrial levels of lip-sync required, an automated process is absolutely necessary! I've subsequently established a solid, functional workflow between both programs, and would like to continue using it.

However, for my next project, I am being asked to create bespoke/stylised, cartoon-like figures, which will still require an enormous amount of lip-sync to be generated. So, rather than trying to reinvent the wheel, I am hoping to use the existing system with my custom characters, which brings me to my questions:

As I understand it, I can export any of the Genesis base meshes in its original form to ZBrush, reshape/sculpt the existing mesh into a new form (keeping the vertex count the same), then reimport it into Daz as a morph, readjust the rigging, and dial it in like any store-bought morph?

If I want to use the lip-sync system, what other steps do I need to take to enable this character to speak to the same standard as the store-bought ones?

Do I need to sculpt all new mouth shapes for the phonemes, or does some 'relative' magic occur between the base mesh/base morphs and a new character morph?

What about the mouth/face rig aspect - are these going to need to be tweaked for every phoneme as well?

For other expression morphs (smile, eyes closed, brows raised, etc.), I'm assuming I similarly create these in ZBrush on my re-sculpted figure, and also import them back into DAZ for later use in Cinema 4D?

I have been using Gen 3 for all of the previous projects. It has a specific .dmc file which the lip-sync system uses to control (as far as I can see) the actual values of each phoneme or expression for that base figure. If I use Gen 3 as the base, will this profile still be good as-is, and relative to my new morph(s)?

Finally, would any of the other Genesis figures be a better option to use, bearing in mind that I would still like the possibility of using the C4D Bridge if possible?

Apologies for all the (probably daft!) questions - I have never really dived into the guts of Daz3D, so I'm unsure of how many of these aspects are automatically retargeted, and how much has to be created fresh and reconnected!

Thanks in advance,
Adam :)

Comments

  • Faux2D Posts: 452


    https://www.daz3d.com/genesis-3--8-face-controls

    Lip-synching is no easy task in animation. Genesis Face Controls already has a whole set of phoneme poses, 52 in total. Automated solutions require a lot of time investment, but you can always go the manual route: load an audio file in the timeline and then apply the phoneme poses one by one as you hear them in the audio.
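
    For example, if you keep rough phoneme timings (jotted down while scrubbing the audio, or exported from a forced aligner), a small script can turn them into the frame numbers where each pose should be keyed. A minimal sketch only - the pose names and the 30 fps rate below are placeholders, so use your own phoneme pose set and project frame rate:

        # minimal sketch: convert (seconds, phoneme) pairs into (frame, phoneme) pairs
        # the pose names and 30 fps rate are placeholders, not Genesis Face Controls names
        FPS = 30

        phoneme_track = [
            (0.00, "M"),
            (0.12, "AA"),
            (0.31, "F"),
            (0.45, "IY"),
        ]

        def to_keyframes(track, fps=FPS):
            """Round each timestamp to the nearest frame."""
            return [(round(t * fps), p) for t, p in track]

        for frame, phoneme in to_keyframes(phoneme_track):
            print(f"frame {frame:4d}: apply pose '{phoneme}'")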

  • wolf359 Posts: 3,834

    Hi Adam,
    Although I have left the Daz/Genesis/C4D ecosystem for Blender/Reallusion, I do remember that the Mimic DMC files for Genesis 1, 2, 3 and 8 are written to describe the basic phonemes that the old 32-bit Mimic basic function (in 32-bit Daz Studio) will look for when parsing an audio file for lip-sync. These basic phonemes are built into Genesis 3.

    Obviously there have been many third-party commercial facial expression morphs and stylised head shapes for those figures that (in my many years of usage) never interfered with the basic Mimic phonemes.

    But understand this, old friend: any NEW custom phonemes you might create for a bespoke character would have to be hand animated, as the ability to generate a new custom DMC file for Mimic disappeared years ago along with the venerable Mimic Pro 3 software.

  • kromekat Posts: 25


    Hey fella,

    Thanks for commenting. Currently I suspect I will be able to do as I mentioned above: simply create a new character shape using a base figure (Gen 3/8), import it as a morph, and still retain all of the existing lip-sync functionality, which would be ideal.

    I have been looking into the Reallusion stuff as well, and am trialling both Character Creator 3 and iClone 7. I don't see myself having much use for the latter, tbh, other than using its own lip-sync functionality. Having tried it yesterday, despite having the intensity controls and the various live input methods (via iPhone), I can't say I am finding the lip-sync as reliable or accurate as the Daz3D 32-bit one, which has surprised me.

    The workflow possibilities between ZBrush and CC3 are superb, though, with the ability to send posed figures back and forth, and reimport custom sculpted clothing that is automatically conformed to the base model - that's fantastically useful! Is there any decent resource for learning about the best use of the lip-sync in iClone?

    Adam ;)

  • wolf359 Posts: 3,834

    Hi Adam, if one just imports an audio file into the iClone software, the basic lip movement generated will not be any better than Daz Mimic Pro.

    But that is where the similarities end!


    Assuming you are not going the iPhone face-capture route, you still have a powerful audio-based lip-sync toolset.

    First, you have a lip-smoothing function that can globally or selectively (upper/lower) eliminate that linear "quick snap/lip flap" effect you get with Daz Mimic.

    Second, you have a powerful phoneme editor and replacement palette more advanced than the one in the old Mimic Pro 3 software.

    32-bit Mimic basic has ZERO editing options beyond Daz's manual timeline keyframe editor (such as it is).

    You can layer on facial expressions with Face Puppet and Face Key. I am sending you a private message with a link to a video of how well Face Puppet can add to a performance (to avoid post deletion for "commercial promotion" in the open forums).

    And the FBX exporter from CC3 Pipeline exports the facial performances as animated shape keys to Blender, so I can store and reuse iClone face and body motions on any CC3 character in any other Blender scene.
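
    To give a rough idea of what that reuse looks like on the Blender side - a sketch only, the object names are placeholders and it assumes both imported meshes share the same shape-key names:

        # rough bpy sketch: replay one character's baked facial (shape-key) animation on another
        import bpy

        src = bpy.data.objects["CC3_Character_A_Body"]   # mesh carrying the imported facial animation
        dst = bpy.data.objects["CC3_Character_B_Body"]   # mesh that should replay it

        src_keys = src.data.shape_keys
        dst_keys = dst.data.shape_keys

        # only works if both meshes actually have shape keys with matching names
        if src_keys and dst_keys and src_keys.animation_data and src_keys.animation_data.action:
            if dst_keys.animation_data is None:
                dst_keys.animation_data_create()
            dst_keys.animation_data.action = src_keys.animation_data.action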

    Check your PM on the site here

  • kromekat Posts: 25

    Thanks Wolf,

    I will definitely look into this further with iClone then. Tbh, as I have said previously, I have been seriously impressed by how well the old 32-bit Lip Sync plug-in works, and how mostly accurate it is, as long as it's a clean audio track accompanied by a phonetically written text file, despite the lack of further control or editing. I have occasionally needed to go in and add an additional or emphasised lip shape in some places, but otherwise it fits the purpose!
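
    (As an aside, most of that text-file prep can be scripted. A minimal sketch - the assumption that the plug-in just wants lower-case words with the punctuation stripped, one sentence per line, is mine, so adjust it to whatever format yours actually expects:)

        # minimal sketch: tidy a transcript into a plain text file for the lip-sync plug-in
        import re

        def clean_transcript(text):
            """Lower-case the words and drop punctuation, one sentence per line."""
            lines = []
            for sentence in re.split(r"[.!?]+", text):
                words = re.findall(r"[A-Za-z']+", sentence)
                if words:
                    lines.append(" ".join(w.lower() for w in words))
            return "\n".join(lines)

        print(clean_transcript("Hello there! It's a clean audio track, isn't it?"))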

    However, being able to add and edit the expressions this way is a definite benefit, and would save some time with the Morph tag sliders in C4D. If the lip-sync can be improved beyond my first look (and from your video, it clearly can), then these additional features, together with the ZBrush editing/posing pipeline, will probably tip the balance (sorry, Daz!).

    Adam :)

  • wolf359 Posts: 3,834

    Sure thing Mate!!

    I still have copies of the classic Mimic Pro 3 installed on two of my old laptops and used it & DAZ Mimic Live in the production of my feature-length film "Galactus Rising".

    It is a shame Mimic was effectively abandoned before the era of Genesis as it was ahead of its time.

    But alas, I had to move on.


    User Tim Vining ("Auroratrek") recently migrated to a CC3/FBX-to-Maxon-C4D based pipeline, after years of being stuck using Vicky 3/4 with Mimic Pro 3.

    He might be able to offer more specific info on the C4D based pipeline, as I no longer use the Maxon software myself.


    Just remember you only need the basic $199 USD version of iClone; however, you must get the standalone version of CC3 Pipeline for full FBX export capability, Daz Genesis character conversion to native CC3, and of course the ZBrush I/O workflow.

    I create all of my own content (clothing and props) in Blender, and some nice fellow just released a free auto-setup add-on for Blender that automatically creates the CC3 skin & hair materials in Blender upon import and has a ZBrush-style I/O transfer between CC3 & Blender for custom character morphing.
     
    This will greatly speed up the production of my current YouTube mini-series "HALO Reclaimer".

    Cheers.  

  • mindsong Posts: 1,712


    2c,

    Hi Adam - I'm with (but 'slightly' behind :) Wolf in his described non-DS workflow, but I'm still captive to the DS framework for a bit longer for practical reasons.

    Although Wolf is correct that you are at the mercy of the existing/available DMC lip-sync control files unless you have access to a copy of the Mimic Pro standalone software, you might want to take a quick look at the innards of one of the DMC files in Notepad(++) - you may find that the syntax is not too daunting to edit manually for some of the figure-specific tweaks you may wish to apply to your custom character(s). The control stanzas are pretty obvious, and the settings/range values can be easily edited and tested if you are comfortable putzing with files like that. Be sure to make backups of your original DMC files, or rename your test DMC files, and select them in the lip-sync tool as per usual. WendyLuvsCats indicates that this is a successful method in some of her use-cases.
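
    If you end up making the same tweak across a lot of entries, that editing can be scripted too. A cautious sketch - the file name is hypothetical and the "value = number" pattern is only a guess at the stanza syntax, so open a DMC file first and adjust the pattern to match what the lines really look like:

        # cautious sketch: scale every numeric strength in a DMC-style text file
        import re, shutil

        DMC_PATH = "Genesis3Female.dmc"   # hypothetical file name - point at your own copy
        SCALE = 0.8                       # e.g. soften every value by 20%

        shutil.copyfile(DMC_PATH, DMC_PATH + ".bak")   # keep a backup, as advised above

        with open(DMC_PATH, "r", encoding="utf-8", errors="ignore") as f:
            text = f.read()

        # only touch lines that look like "value = 0.1234" (assumed syntax - verify first)
        def scale_value(match):
            return match.group(1) + format(float(match.group(2)) * SCALE, ".4f")

        text = re.sub(r"(value\s*=\s*)(-?\d+(?:\.\d+)?)", scale_value, text)

        with open(DMC_PATH, "w", encoding="utf-8") as f:
            f.write(text)

    Diff the result against the .bak copy before loading it in the lip-sync tool.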

    hope this helps,

    --ms

  • kromekat Posts: 25

    Thanks Wolf and Mindsong,

    I appreciate your help on this! ;)

    Adam 
