© 2024 Daz Productions Inc. All Rights Reserved.
Comments
Yes, in the Blender/Houdini column on the scaling tab, check only the current subd box. This is the one you will render in Blender, so you want the actual subdivided resolution.
In the MD column on the scaling tab, check only the subD cage box. MD will simulate fine with the unsubdivided subD cage and the file will be much smaller. Sagan will now export two alembic files, catered to these two specific uses.
I only found out about MD at version 9 :) I didn't know retopo was something new. It's not really necessary, particularly if you're not going to simulate it more in Blender (why would you need to?). You might want to play with a smoothing modifier, though.
Correct across the board. You can delete the Diffeo model if you want to, as long as you immediately import the Alembic scene, but I tend to still use it when the differences in subdivision between Daz and Blender are not so noticeable.
Hi All - Is there documentation on how to install and use this exporter? I don't see anything in the ZIP, or in this thread. Thanks!
It can be, but for dForce-specific items, dForce generally works much better. I've had to protect feet/hands/eyes/mouth (some or all of them) in Blender to prevent weird gathering.
If I'm doing a static pose, with the character in the shape I want, so that all I need to do in Blender is render, then I do everything first and export last.
If the clothes don't transfer quite right, perhaps re-sim in Studio, export as an OBJ, import into Blender, and copy the materials from the Diffeo-imported clothes.
I'll see if I can do an example. No matter, I've not tried Sagan as yet.
I agree, although it's not without its issues.
I bought MD and use that too; I use dForce, Blender and MD. Tools are meant to be used as appropriate, not to become the raison d'être.
@Marble
When it's not an animation, and so far for me it isn't really, although recently I've begun to look at animation just because the tools are there and they work.
Export an OBJ to use as the avatar in the A-pose (or T-pose if G3 or earlier); export the character in the desired pose as an OBJ.
In MD, import the OBJ as a Morph Target (I use 9.5, so no clue if it's available in earlier versions); set the frame count if 30 (the default) is not sufficient. NOTE: save the scene first in case something goes wrong, such as insufficient frames.
This generally works, but depending on how radically different the new pose is the transition can be weird or require one to grab bits of cloth in an attempt to stop disasters.
This way there are no issues with creating animations and exporting/importing them, though it is more time-consuming than the above method.
1) Those skills take time to acquire.
2) You don't acquire them all at once.
3) It is necessary to break them down into parts.
How you adapt it to Sagan will depend on what you need to do, but this was what I did.
A) I started simple with Diffeo. I brought into it a posed naked character without hair. Once I had that imported I tried rendering.
B) I then brought one in in the A-pose and imported a pose from Studio; I applied that pose, then rendered. (I discovered I could also save that pose for later use, and change the pose by importing another.)
C) I decided to create my own materials to get the look I was after. I use Genesis 2, and I customise the textures of every character I buy (as well as creating my own). I have revisited these materials since, adjusted them twice, and after that started completely from scratch; I now use those tweaked materials.
D) I then started bothering about clothes.
E) I then looked at hair (maybe I did this before clothes, but I can't recall).
F) I then started dealing with props, simple items first. I already knew how to bring them all in as OBJ (but for anything other than backgrounds, ideally blurred with DOF, they needed substantial adjustment of materials), but I specifically started using Diffeo to see what steps, if any, were needed to tweak them. Generally they work brilliantly without adjustment. Some products still need the OBJ method.
If you try to do everything at once, chances are you'll get frustrated, overwhelmed or just downright mad.
Nah, it's not an issue or mistake but the look you were after for a Halloween character.
I have not read any further on in this thread to see how others have answered your question, but I will describe my workflow for you, bearing in mind I don't use Blender! And I am only creating static images, not animations.
1. I set up my animation in Daz Studio of my unclothed figure with shoes on. Usually I only do short animations, like Frame 0 to Frame 30. Frame 0 will be the A-pose and Frame 30 will be my final pose. If you are trying to create a full, long animation, then your workflow here will be different, because you will want to set up a proper animation from the A-pose to your start pose and then from your start pose through your animation.
2. Export as Alembic using Sagan. You don't need a high SubD for Marvelous Designer simulation, so usually I just export the cage (i.e., zero SubD). It doesn't really matter if you have high SubD on a short animation, though, because the file size of the Alembic will be low anyway. But it will simulate more slowly in MD if the mesh is too high-res.
3. Import Alembic into Marvelous Designer.
4. Clothe your character. Create a garment on the figure or load in an existing garment (OBJs are imported as 'Obj to Garment'; existing Marvelous Designer projects are imported as 'Add', then 'Garment File' or 'Project File'). Resize the OBJ or MD pattern as needed in order to get a good fit.
Note on fitting OBJs: You will not have much flexibility in fitting OBJs other than changing their scale on import. (Usually I scale any OBJ clothing in Daz before bringing it into Marvelous Designer, so it is already at the correct scale.)
Note on fitting MD garments: With Marvelous Designer garments you have more flexibility in getting a good fit. I am not going to describe how to use Marvelous Designer here, but essentially you have Shrinkage Weft/Warp and pattern editing at your disposal. Do one garment piece at a time if you have a complex garment with various pieces.
5. Once fully clothed, you should at that stage have done a basic simulation of all clothing pieces fitted to your character.
6. Animate in the Animation tab. I animate to my final pose at a high particle density value (PD above 15) because I don't need the animation quality to be good; I just need the final pose to be good. If you want to actually use your animation in a final render etc., then you probably want the animation at a decent particle density (i.e., PD under 10). Animating at PD less than 10 will take a long time if you have a complex layered outfit.
7. For my purposes, I then go back to the Simulation tab and do my final simulation at ~3-5 particle density. (If you already animated at a PD less than 10 and want to keep your animation, then you don't want to do this.)
8. Export the final simulated garment/animation as either OBJ, Alembic, etc. I export as OBJ and use Daz to render! Again, I'm only using static images.
Notes on simulating 3rd party assets:
Photogrammetry stuff is usually welded. There are photogrammetry assets in the Daz Store; you may already own some. So you can export those as OBJ and they will sim in MD.
Polygonal Miniatures (a PA name) has lots of stuff in the Daz store for females. He has some stuff for males on his personal website and another 3rd party store, but they aren't very interesting garments.
Anything photogrammetry/scanned will most likely be simulatable in MD. It is pretty obvious to identify these types of models in 3rd party stores just by sight, so don't think you necessarily need to google 'photogrammetry clothing' or 'scanned clothing' in order to find them.
Here is a free garment you can test with (I haven't used this website before, so I don't know if it works or not!): https://tooto3dscan.com/flared-dress-v-neck-scan.html
In my opinion, you should get familiar with using MD if you are going to refit Marvelous Designer garments to figures.
Ha, yeah, the queen looks like an evil witch in that one.
First try: I forgot to switch to Cycles before the Diffeo import (I do that so much), so I decided to just go with it and do an Eevee render first. I corrected the UVs and got this. Not bad for a 7-second render. And I actually screwed up the UV correction on it; I noticed when I reopened the file and started setting it up for a Cycles render that I had only corrected one material for each mesh lol. Correcting the UVs is a bit tedious, but very easy. I would never have thought to look there when troubleshooting it, so I'm glad someone could point it out to me lol.
That's the Cycles render: 11 minutes, 300 samples. I am seeing some issues on the forehead and temple area; I'm guessing it's the hair cap. The jewel shader in her crown is messed up; that could easily be replaced with a native Cycles shader if I were doing a real production render, I suppose. The peach fuzz looks a bit crazy. I actually got the spotlight pretty damn close to what I was going for this time.
Ha ha, yes, let's go with that...
I forgot a million times, until:
File|Defaults|Save Startup File
I always plan on doing that too, but forget until the next time after I import. I think I broke Blender: I tried the Make Hair button to see what it comes out like, and it's been in not-responding mode for 30 minutes or so. Weird, though: it's staying at only around 4 GB of RAM usage and only 8% CPU. Maybe it's a single-threaded process or something, I dunno. Gonna give it a while more before I kill the process. Hoping to have some time today to switch over to the Linux partition, get E-Cycles and Diffeomorphic installed, and see how much of a performance difference I get compared to winblows 10.
Forgive me for nitpicking, but it caused me a lot of trouble early on before I realized that the SubD cage (no SubD) and SubD level 0 are actually not the same thing... even SubD level 0 has some degree of smoothing, and the level only refers to how many faces have been added (each level of Catmull-Clark subdivision splits every quad into four).
Missing that distinction was what was causing the horrible rounded-teeth problem, and I think it bit Thomas in Diffeo as well.
Sorry for being pedantic, but it's an important difference.
@Leonides02
There is a DLL in the zip. Just copy it to the Daz plugins directory. If you have the old DLL, alembic_exporter.dll, delete it.
Oh - well I'm new!
I barely understand what a UV is let alone a SubD cage.
Yeah, I have my compositor set up, and various other things, including a couple of groups I created for materials.
Look in Studio at a character; they usually have a SubD of 2. It's under Parameters, Resolution Level; High Resolution will have a level value below it, and even at level zero it is still high resolution. You can change High Resolution to Base.
If, for instance, you were creating a morph for use in Studio and exported the character at High Resolution but level zero, it wouldn't work; it needs to be Base.
In Blender, with the default cube when it opens, add a Subdivision Surface modifier (Ctrl+2 works fine as long as the cube is selected). In Edit mode you can see the effect the modifier has on the mesh. Studio and Blender don't work quite the same, but that might give you an idea.
Thanks, that worked.
I also have Diffeomorphic. I understand they can work together to use the Diffeomorphic material conversion? Again, how?
What does the "Choose Base Directory" mean?
I do all my posing in Daz so this is a better solution for me than Diffeo.
Am I missing the "How To" info?
So far, this is the workflow I have come up with
1) Save the scene as a .duf.
2) Export HD with Diffeo.
3) Export with Sagan. The base directory is where it's going to save the files; I save it in the same folder as the Diffeo export and the saved scene for simplicity's sake, though I'm not sure that is required. Set which frames you want to export in the General tab. In the Scaling tab, in the Blender/Houdini section, I checked just the current SubD level. In the Material tab of Sagan, make sure Diffeomorphic Daz Importer is checked.
4) In Blender, set the render engine to whichever you want to render in. Use Diffeo to import the scene, then use Import Alembic for the mesh, then delete the Diffeo mesh. Take a test render; if the materials look crazy, go into the materials and link the normal maps that are showing as bright red. I just had to click on each one, and there was only one choice in the dropdown. It might be different if more than one person is imported, not sure. That's as far as I have got; no idea what any of the other options do yet.
Thanks, TheKD.
Now if only I could get Diffeo to work!
I have to say a big thank you to all who added helpful advice and examples of workflow. My day is just starting here in NZ so I have a lot to experiment with today. No doubt I'll be back with more questions but I can only hope that thrashing out the details will help others get to grips with this. It must seem so obvious to the experienced old hands but for me, Blender has become a real challenge for my ageing brain cells.
@Marble ....
@TheKD's tip #4 about the UV mapping is important for figures like the cyborg DA-15V, or Aurora with her deep grooves and multiple layers of clothing or, in this case, 'shell' armour around her. But it takes maybe 30 seconds to go into the Blender shader editor and just fix it, and then you get 100% correct textures, along with the alembic files, for all those deep grooves and extra detail.
Also tested it on monsters; for example, the Hydra, a 7-headed creature, with a 300-frame animation on all heads, and also subdivision 4 detail for other HD models (even though animation at SubD level 4 is going to give you millions of vertices and a looooooooooooong time to write the Alembic cache file), but it works....
So basically I haven't come across an issue; instead I'm finally happy to work in Daz (animate, morph, pose, layer clothing, etc.), export to Blender, get exactly what I expected, and then further edit my scene around the Daz character and finish. Blows my mind still. Diffeo and Sagan have been a game changer.
I bet the abc file is ginormous too lol. I think I was only doing level 2 and 3 on my import and it was almost 2 GB in size. Not that it matters much for me, though; I've got ~2 TB free on my work project HD, so not really worried about it now. Hopefully by the time it becomes an issue, 8 or 12 TB drives will have come down a bit in price :P
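As a back-of-the-envelope check on why these caches balloon: an uncompressed vertex cache stores three 32-bit floats per vertex per frame, so a rough lower bound on size is easy to compute. This is only a sketch for intuition; it ignores topology data, indexing, and everything format-specific that Alembic adds, so real files will differ:

```python
def cache_size_bytes(num_vertices, num_frames, bytes_per_float=4):
    """Rough lower bound: 3 floats (x, y, z) per vertex per frame."""
    return num_vertices * 3 * bytes_per_float * num_frames

# Example: a 1-million-vertex HD figure over a 300-frame animation.
size = cache_size_bytes(1_000_000, 300)
print(f"{size / 1024**3:.1f} GiB")  # prints 3.4 GiB of raw position data
```

Subdivision makes this explode quickly: every SubD level multiplies the vertex count by roughly four, so level 4 on a base mesh is hundreds of times the data of the cage.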
Just me or is daz forums slow as heck today?
@TheMysteryIsThePoint
This is a request if you get some time for us, please feel very free to ignore this. Since Thomas isn't deep in the daz api and c++, it would be great for diffeo if you could code a c++ version of the HD exporter. This would make things much faster for HD figures. The daz script source is about 300 lines of code and my hope is that it may use the same api available for c++. This would also mean to maintain the c++ version when Thomas changes something in the HD exporter, that should happen rarely.
Please let us know what you think. Again please feel free to ignore this if you don't have time or for any reason. Export script attached.
https://bitbucket.org/Diffeomorphic/import_daz/issues/53/use-the-multiresolution-modifier-for-hd
@Padone I absolutely cannot say "no" to you :)
But I would first want to get the UV merging working, the Blender script to swap the UV maps, and the separation of surfaces out into objects. And the DSON parsing has gotten quite slow now that it has to hunt down all the UV maps. Then I'll be happy to take this on.
I vaguely understand the difference between Daz HD and simply a higher subd, but how bad can 300 lines be? Famous last words, I know, but with your help, I'm sure we'll manage.
@TheMysteryIsThePoint Thank you so much that will be great since the HD exporter via script is so slow right now and sort of a PITA to be honest. Of course I understand sagan takes precedence and hope the merge uvs idea will work fine. Waiting to test the next update.
Another way, though, to export baked animations from Daz Studio would be to export a vertex cache in MDD format, taking the HD exporter as reference. I mean, if the vertex order doesn't change, then you could just apply the animation to the imported HD figure. That could be an added script, "export MDD to Blender", and it would integrate perfectly with Diffeo. This method would get you out of any material or UV map issue. I see that in Blender the Mesh Cache modifier can import MDD files.
Of course Alembic is far superior to MDD but also more complex, and given the limited animation and simulation capabilities of Daz Studio, MDD would just fit. Another advantage is that MDD may suit all the applications that can import MDD files and have a Daz bridge, since Daz Bridge is available for many platforms now. That is provided the vertex order doesn't change among bridges, which may not be the case, though.
This is just an "extra" idea for you not intended to push anything.
Would that require the Animate 2 plugin that has been previously required to export MDD?
I honestly hadn't thought of all the implications that you just pointed out. I was already interested in MDD because of its fixed-length frames, which would make it easy to edit the mesh. I've already written the implementation; it's just another subclass of the Exporter class, just like AlembicExporter already is. It should "just work".
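The subclass design described here can be illustrated with a toy sketch. The class names mirror the ones mentioned in the thread, but the code is hypothetical Python for illustration, not Sagan's actual C++ implementation:

```python
import struct

class Exporter:
    """Hypothetical base class: collects frames, delegates serialization."""

    def __init__(self):
        self.frames = []  # each frame: list of (x, y, z) tuples

    def add_frame(self, points):
        self.frames.append(points)

    def export(self):
        # Subclasses only decide how a single frame is encoded.
        return b"".join(self.encode_frame(f) for f in self.frames)

    def encode_frame(self, points):
        raise NotImplementedError


class MDDExporter(Exporter):
    """Encodes each frame as packed big-endian floats, MDD-style."""

    def encode_frame(self, points):
        return b"".join(struct.pack(">fff", x, y, z) for (x, y, z) in points)
```

With this shape, adding a new output format is one subclass and one method, which is why a second format "should just work" once the frame data is already flowing through the base class.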
I agree that Alembic is made for more ambitious tasks and is overkill here.
Ultimately, I would like an option for Daz to talk directly to Blender without intermediate files, with Daz Studio acting as a slave to Blender, serving up frames and geometry data when Blender requests them.
No, we're going to put MDD export functionality into Sagan.
This would mean keeping both Daz Studio and Blender up and running, though, with the scene doubled in Daz and Blender. Especially for large scenes, that may require a certain amount of extra RAM that's not needed if you just use files.
edit.
As for MDD, if you can already go for it, I believe that's great. I guess you'll need one MDD file per object in the Daz scene. Then in Blender you'll need a script to add a Mesh Cache modifier with its own MDD to each object. Otherwise the user could add them by hand, just for the objects he needs to animate; that's the easy solution.
As for HD figures by Diffeo, the MDD cache has to contain the base mesh vertices, since Diffeo uses a multires modifier, so it's the base mesh that gets animated. The HD exporter exports both the base mesh and the HD mesh; the multires and the base mesh have the same vertex order. You will also have to ignore shells in the Daz scene; luckily, those are implemented as extra material layers in Diffeo, so they get no extra geometry.
It is an old format from NewTek (LightWave 3D) that really is not used much in the industry these days.
The problem with MDD in Blender is that there is not really a decent import dialog interface, which is why I abandoned the idea when trying to find an alternative to my old C4D-based pipeline.
In other Daz Studio animation-to-Blender news, it seems that retargeting body motion has been possible in Blender for many years now.
Hey Wolf, you know a format is old when it requires big-endian encoding, and not because of a new architecture like ARM :) But MDD is light, simple, even simpler than Wavefront OBJ, and completely documented. It does exactly one thing and nothing more. Where it is used and by whom is not really relevant if it meets a specific technical requirement better than any other format. Its fixed-length frames also mean we could edit the mesh in Edit mode and instantly update just that frame of the MDD file on disk, rather than one's edits disappearing as soon as one goes back to Object mode. Try doing that with Alembic, FBX, or anything else.
But I'm not sure I would call linking armature data in Blender "retargeting", as it only works between perfectly similar armatures. To my knowledge, there is still no tool in Blender that does actual retargeting, as in calculating the pose that minimizes the error in the positions of the end effectors while respecting constraints. Personally, I don't want to work with anything less than that, even if it means giving money to Autodesk.