The Alembic Exporter

These are some problems I've encountered with the Alembic Exporter that completely stopped my project. It:

1) Takes over an hour to export 300 frames (or at least what SHOULD have been 300 frames) of a single figure at base resolution.

2) Screws up the UV maps.

3) Stops at arbitrary frames, usually around 140.

So I got the official Alembic sources from GitHub, and wrote a short program to convert a series of OBJs exported from Daz into an Alembic file that can be imported into Blender, Maya, or whatever.
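
For the curious, the core of the program is a small loop over the Alembic reference API. This is a stripped-down sketch, not the program itself; the Mesh struct and parseObj() stand in for my own OBJ parsing and aren't part of Alembic:

```cpp
#include <Alembic/Abc/All.h>
#include <Alembic/AbcCoreOgawa/All.h>
#include <Alembic/AbcGeom/All.h>
#include <string>
#include <vector>

using namespace Alembic::Abc;
using namespace Alembic::AbcGeom;

// Stand-in for my OBJ parser's output: flat vertex positions, plus
// per-face vertex counts and the concatenated per-face vertex indices,
// which is exactly the layout OPolyMeshSchema::Sample wants.
struct Mesh {
    std::vector<V3f>     positions;
    std::vector<int32_t> faceIndices;
    std::vector<int32_t> faceCounts;
};

Mesh parseObj(const std::string &path) { /* real parser omitted */ return {}; }

int main(int argc, char **argv)
{
    // One mesh object in the archive; one sample appended per frame.
    OArchive archive(Alembic::AbcCoreOgawa::WriteArchive(), "out.abc");
    OPolyMesh meshObj(archive.getTop(), "figure");
    OPolyMeshSchema &schema = meshObj.getSchema();

    // Each command-line argument is one frame's .obj file, in order.
    for (int frame = 1; frame < argc; ++frame) {
        Mesh m = parseObj(argv[frame]);
        OPolyMeshSchema::Sample sample(
            P3fArraySample(m.positions.data(), m.positions.size()),
            Int32ArraySample(m.faceIndices.data(), m.faceIndices.size()),
            Int32ArraySample(m.faceCounts.data(), m.faceCounts.size()));
        schema.set(sample);
    }
    return 0;
}
```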

I have an initial version that works, but:

1) There's something I don't understand about how Alembic reckons time, because the model walks in slow motion in Blender.

2) I don't yet support material zones, so I can't re-texture the imported model.

3) I don't even support multiple objects, so the import is just one comprehensive object.

4) Blender complains about a missing transformation at every frame; I have no idea what that means, but it still works.

But it sure is nice not having to worry about JCMs and whatnot. And the Alembic code is AMAZING... I tested it with 40 frames, each one an 80 meg .obj, and the resultant .abc file is only 220 megs. There's some SERIOUS programming going on in there; I would not have thought that there would be much opportunity for any kind of compression at all. And it writes out frames so fast that the bottleneck is my parsing of the .obj file.

So, anyone who has suffered the same frustration with the Alembic exporter that I have, help is on the way...

Donald


Comments

  • Today I was able to write out 300 frames of an 80 megabyte model, for a combined .abc file of 1.7 gigs, so there doesn't seem to be any arbitrary frame limit.

    The time reckoning still has me confused, and the Alembic docs are not very helpful. I've asked for help on the mailing list and hopefully I'll have it solved this weekend. But seeing a G8 move in slow motion really makes one appreciate how beautifully those JCMs work.

    Alembic supports time in basically four modes: Identity, which treats the beginning of the .abc file as frame 0 at 1fps; Uniform, where you specify a beginning time and a frame time of 1/fps (a generalization of Identity); Cyclic, which is there to support motion blur; and Acyclic, the most flexible, where at every frame you basically tell the Alembic library what time it is in the scene. The problem is that the way I am creating the .abc file, it is stuck on Identity, when I really want Uniform, starting at the specified frame and at the specified rate. So when I import 300 frames into Blender, what should be 10 seconds of animation gets treated as 1fps, and I get a super long 300-second animation at 1/30 the speed :/

    I'm making progress, but there are clearly some things to be figured out. I continue being seriously impressed with the official Alembic reference implementation.

  • Frame rate issue resolved.
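
    For anyone who hits the same thing: the fix boils down to creating a uniform TimeSampling up front and handing it to the objects as you create them, instead of taking the default Identity sampling. A minimal sketch, assuming 30fps with the first sample at time zero (the names are mine):

    ```cpp
    // Uniform sampling: one sample every 1/30 s, first sample at t = 0.
    // Without this, the archive defaults to Identity (effectively 1 fps)
    // and Blender plays the animation in slow motion.
    TimeSamplingPtr ts(new TimeSampling(1.0 / 30.0, 0.0));
    OPolyMesh meshObj(archive.getTop(), "figure", ts);
    ```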

    I should really have fixed the materials first, but the systems programmer in me can't resist threading the app... In initial testing, a 32-core first-gen Threadripper does in 33 seconds what takes the original single-threaded version 480 seconds.
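
    A sketch of the threading scheme, reusing the hypothetical Mesh/parseObj helpers from the earlier sketch: the expensive OBJ parsing runs in parallel via std::async, while the Alembic writes stay strictly sequential and in frame order, since the archive is not thread-safe. (In a real run you would bound the number of in-flight parses to keep memory sane.)

    ```cpp
    #include <future>
    #include <string>
    #include <vector>

    void convert(const std::vector<std::string> &objPaths, OPolyMeshSchema &schema)
    {
        // Kick off every frame's parse in parallel...
        std::vector<std::future<Mesh>> jobs;
        for (const std::string &path : objPaths)
            jobs.push_back(std::async(std::launch::async, parseObj, path));

        // ...but append samples in frame order, on this thread only.
        for (std::future<Mesh> &job : jobs) {
            Mesh m = job.get(); // blocks until that frame's parse is done
            schema.set(OPolyMeshSchema::Sample(
                P3fArraySample(m.positions.data(), m.positions.size()),
                Int32ArraySample(m.faceIndices.data(), m.faceIndices.size()),
                Int32ArraySample(m.faceCounts.data(), m.faceCounts.size())));
        }
    }
    ```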

  • WendyLuvsCatz Posts: 38,582

    watching with interest

    while I own the DAZ Alembic exporter I am still interested in something better

  • Hi Wendy,

    I think the official exporter will continue to be more convenient, if it works for you, as it is a one-click operation. My code works from a series of .obj files exported by another script, and it is written in POSIX C++, which means unless some Windows programmer wants to port it, the user will need to run it in Windows Subsystem for Linux. Doing so is quite trivial, but it is another step. But if, like in my case, it makes the difference between being able to proceed with a project and being stuck wondering why it didn't export all the frames or why the UV maps are all mangled, or what "unexpected error" means, or why it's taking so long, then the additional step is worth it.

    The "missing transform" warning in Blender is resolved. There was just an important point missing from the Alembic documentation.

    Multiple objects are supported now, so the imported model is no longer a single, monolithic object. Having done that part, I still have no idea why the Daz Alembic exporter would give that "unexpected error" when two objects have the same name... it is trivial to generate a unique name in those cases.
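
    The uniquifying itself is only a few lines; a sketch of the idea (my code's approach, not the official exporter's):

    ```cpp
    #include <map>
    #include <string>

    // First "Shirt" stays "Shirt"; later ones become "Shirt_1", "Shirt_2", ...
    std::string uniqueName(const std::string &name)
    {
        static std::map<std::string, int> seen;
        int n = seen[name]++;
        return n == 0 ? name : name + "_" + std::to_string(n);
    }
    ```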

    So all that remains is to export normals and vertex coords (there's sample code for that), and the face groups for re-applying the materials. I suppose I should not be lazy, and should also collect all the texture files referenced by the .obj file to assist in that :)
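
    For reference, the normals/UVs part boils down to attaching GeomParam samples to the mesh sample. Roughly this, assuming face-varying values already gathered from the .obj (uvVals, normVals, and the position/index arrays are my names, continuing the earlier sketches):

    ```cpp
    // uvVals:   std::vector<V2f>, one entry per face-vertex
    // normVals: std::vector<N3f>, one entry per face-vertex
    OV2fGeomParam::Sample uvSample(
        V2fArraySample(uvVals.data(), uvVals.size()), kFacevaryingScope);
    ON3fGeomParam::Sample normalSample(
        N3fArraySample(normVals.data(), normVals.size()), kFacevaryingScope);

    OPolyMeshSchema::Sample sample(positions, faceIndices, faceCounts);
    sample.setUVs(uvSample);
    sample.setNormals(normalSample);
    schema.set(sample);
    ```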

    I hope to have all this done by the end of the week and I hope this is helpful to other people; the Daz Alembic Exporter really breaks my heart because, with workarounds, it actually works pretty well... until it doesn't.

    The arbitrary stoppage in the official Alembic exporter turned out not to be so arbitrary; it was user error (mine)... Blender defaults to 24fps while my exporter exports at 30. My test was 300 frames, and 300 * 24 / 30 = 240, exactly where it appeared to stop exporting frames. What can I say, my Blender fu is weak :)

    No "unexpected error" with non-unique object names.

    No mangled UVs.

    Still a strange problem with the normals; everything looks OK, but with any kind of cloth simulation where the imported object is a collision object, the cloth flies up inside the object and jitters around. I figured that Wavefront .obj files and Alembic reckon normals in opposite senses, so I thought the solution was just to flip the normals as I read them, but that didn't work, either. I'll keep digging.

    But I think that is the last bug before I can say this is useful and in some ways "better".

    One unique thing that this code can do is to ignore certain objects, or surfaces within an object. This is incredibly useful to me. I had a character with the Misery outfit from Renderosity. I wanted to simulate the skirt in Blender instead of Daz, but the skirt surface is in the same object as the belt. In Daz I had to use the Geometry Editor tool to make the skirt surface invisible before exporting, which is an extra step and a hassle because those edits are not saved. With my code, I just told it to ignore the skirt surface.
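
    The mechanics are simple, because .obj files tag runs of faces with usemtl statements: the converter just remembers the current surface name and drops faces that belong to an ignored one. A sketch (the ignore set would come from the command line; the names are mine):

    ```cpp
    #include <fstream>
    #include <set>
    #include <string>

    void parseFiltered(const std::string &path, const std::set<std::string> &ignored)
    {
        std::ifstream in(path);
        std::string line, surface;
        while (std::getline(in, line)) {
            if (line.rfind("usemtl ", 0) == 0) {
                surface = line.substr(7);        // current surface/material name
            } else if (line.rfind("f ", 0) == 0) {
                if (ignored.count(surface))
                    continue;                    // skip this face entirely
                // ...parse the face as usual...
            }
            // ...handle v/vt/vn lines as usual...
        }
    }
    ```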

    It isn't fixed yet, but I have a better handle on the problem. I'm not sure it's the normals at all, because those are precomputed and stored in the Wavefront object file. The problem is the vertex order: Wavefront stores them in counter-clockwise order, but Alembic wants them in clockwise order. Visually, the imported object looks the same in Blender's wireframe view because the faces consist of the same vertices.

    Here is where speculation starts. What I *think* is going on is that for visual aspects, Blender must use the precomputed normals from the .obj file, but for cloth simulations, it must compute the normals it uses for collision distances. The vertices are in reversed order, so for quads at least, we're guaranteed that all the vector cross products are the exact negative of what they should be. I'll have to find that spot in the Blender code in order to verify, but that would explain why the imported objects render perfectly, but the cloth sim goes all janky.
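
    If the winding really is the culprit, the fix on my side would just be reversing each face's run of indices as it comes out of the .obj; a sketch over the usual counts/indices layout:

    ```cpp
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    // faceCounts[i] is face i's vertex count; faceIndices is all faces'
    // indices concatenated. Reversing each run flips the winding from
    // the OBJ's counter-clockwise order to the opposite sense.
    void flipWinding(const std::vector<int32_t> &faceCounts,
                     std::vector<int32_t> &faceIndices)
    {
        size_t off = 0;
        for (int32_t n : faceCounts) {
            std::reverse(faceIndices.begin() + off, faceIndices.begin() + off + n);
            off += n;
        }
    }
    ```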

    "This is a simple fix" are famous last words for developers, but this sure seems like a... simple fix.

    I think I'm really close to being done, at which point I'll help anyone who is interested to install Windows Subsystem for Linux, export the objects, and run the program to generate the .abc file. I hope no one is turned off by the Linux aspect; contrary to what people may have heard, it is dead simple, more intuitive than its reputation suggests, and you might actually end up liking its simplicity. 50 years of people using it to get serious work done can't be wrong, and they aren't :)

  • I'm kinda slow at times, but I eventually catch up.

    OF COURSE simulations compute normals themselves instead of using the precomputed ones from the .obj file. They have to: the whole point of a simulation is that vertices move, which invalidates the precomputed normals. Making the sim aware that collision objects imported via .obj are immutable, and that precomputed normals are probably available, would be a very, very specific optimization that was certainly never implemented.

    As a result, the sim computes these normals on the fly. That would explain why inverting them as I read them in did not fix the sim: the sim ignores those values anyway.

    So, that was the problem: vertex order. Now everything works perfectly, if not conveniently. I can export any number of frames, the UVs work correctly, sims work correctly, no "unexpected error" or whatever BS.

    But I still have to export all the .obj files via script first, and this is an inconvenient and slow process. I also don't collect materials yet (although the official exporter can be used for this, exporting a single frame), nor support the "Preserve SubD" option. This last one is kind of a killer, because I am having to export at high resolution with SubD applied: as I've discovered, exporting at base resolution and then running the same SubD algorithm in Blender is not at all the same thing as exporting the SubD information from Daz and then doing the SubD in Blender.

    The other things, along with re-doing the multi-threading, are nice-to-haves, but the SubD is a must-have: with dForce Classic Long Curly Hair, the OBJs can push 700 megs per frame. Yes, a bit much. I've solicited some help in another thread and hopefully I'll be able to implement this soon.

    I'm sooo close but no cigar yet. But I will say that so far, Daz + Blender = Heaven :)

  • An update.

    I did get everything to work via the hack of writing out an OBJ file per frame in Daz, and then processing each of those to create the .abc file. This did work, and I got my animation and cloth sim done in Blender, but 300 frames took a couple of hours and over 200 gigs. It was only acceptable because I had no other choice: I was forced to export the already subdivided mesh because, and I really don't like the way Daz does this, there's no distinction between the base resolution model and the SubD cage suitable for subdivision; they're one and the same. That's why there are products called "HD Lip Contours" that morph the lips to look like a SubD cage, so that it looks weird before subdivision, but perfect after. Trying to export this morph to Blender defeated the entire purpose of using Alembic.

    But while I was wracking my brain trying to figure out this problem, it dawned on me that I was missing the forest for the trees: Daz is closed source, but they provide a decent SDK. Blender, my target app, is open. I should be able to write a "Blender Bridge" as a plugin that runs in Daz and automagically transfers the entire scene to Blender.

    I've never written a Daz plugin before now, and I swore that I would never program for Windows ever again, but I'm extremely excited about the prospect. I've already got a dialog box coming up in Daz that allows external apps to connect and request the vertices, normals, and UVs for all the objects in the scene for a given frame. I think I've figured out how to get all of that information, and a wire protocol to send it. The test will be to write a simple app that connects to the plugin in Daz and writes out an OBJ file that can be loaded in Blender. If it looks right, then I'll hack Blender to connect to Daz directly, with no intermediate files. The Daz scene should just appear in Blender automatically, and you should only have to texture it once for EEVEE or Cycles.
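
    To make the wire-protocol idea concrete, here is the kind of framing I have in mind. This is purely illustrative, not the plugin's actual protocol: little-endian, length-prefixed messages over a local socket.

    ```cpp
    #include <cstdint>

    // Message types the client and the Daz plugin exchange (hypothetical).
    enum MsgType : uint32_t {
        REQ_FRAME_GEOMETRY = 1,  // client: send me everything for frame N
        RSP_MESH           = 2,  // plugin: one of these per object in the scene
        RSP_END_OF_FRAME   = 3,  // plugin: no more objects for this frame
    };

    // Every message starts with a fixed header, then payloadBytes of body.
    struct MsgHeader {
        uint32_t type;          // a MsgType value
        uint32_t payloadBytes;  // size of the body that follows
    };

    // RSP_MESH body, conceptually:
    //   uint32 nameLen;   char  name[nameLen];
    //   uint32 vertCount; float xyz[vertCount * 3];
    //   uint32 normCount; float nrm[normCount * 3];
    //   uint32 uvCount;   float uv[uvCount * 2];
    ```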

    And while I was figuring out how to write the plugin, I realized that, duh, the plugin approach would solve the original problem with Alembic of having to write out the OBJ files. I think I could compile the Alembic reference implementation in Windows, but honestly, I remember how much I really hated coding for Windows and don't want to do it :( If someone were interested, I'd give them the code so that they could implement the last 10%. But then again, as I understand it, the only problem that I solved that still plagues the newest official Alembic exporter is the "unexpected error" with duplicate object names, so it may not be worth it.

    So, unless someone wants the Alembic code, I'm going to stop posting here, and will report back when I've got the Blender Bridge working. It's going too well, so I know that I'm going to hit a critical problem with something undocumented, like the Daz thread model or something, and as usual no one in the SDK forum will be able to help, but I'm keeping my fingers crossed...

    Donald

  • Richard Haseltine Posts: 102,701

    I'm not quite sure what you are meaning here. HD morphs work on the virtual vertices created by SubD, while "SD" morphs work on the cage. I was under the impression this was a standard way to implement multi-resolution morphs, and is in the OpenSubDiv standard (which is used in DS and 3Delight, but not, I think, in Iray - hence the need to bake the mesh to a set "Render Division level" for Iray).


  • Thanks for responding, Richard. I'm trying to make sense of what I'm seeing, and I appreciate your effort to debug my perceptions of things. Please tell me if/when I say something that isn't true:

    If, in Blender, you import a Daz model at base resolution and subdivide it, the lips, for example, pull away from each other, and what looked perfect at base resolution can look grotesque at high. This is because of the way SubD works: without SubD, surfaces GO THROUGH the vertices; with it, surfaces are INFLUENCED by the vertices. I think they are splines, and so the mesh will always recede back into concave curves a bit, and pop out of convex ones. But only one of these can correspond to the artist's original vision.

    I think some PAs realized this, and at least one created a morph called "HD Lips Contour". It doesn't seem to truly be an HD morph. All it does is adjust the vertices around where the lips meet, such that the model may now look grotesque at base resolution, but when it is subdivided, it more closely preserves the artist's original vision. That is, the morph accounts for the fact that you can't take a base resolution model that was never intended to be subdivided, use it as a SubD cage, and expect the two to have a high degree of fidelity to each other without morphing the model a bit. Even the model at SubD level 0 in some areas looks nothing like the base resolution model. And how could it? A mesh and a SubD cage interpret the vertices in completely different ways.

    Or is it the case that users are never supposed to actually render the base resolution models, and the lowest poly model that represents the artist's vision is always SubD level 0?

    Thanks for your attention, Richard.

  • Richard Haseltine Posts: 102,701

    DS uses SubD - it implements the Pixar OpenSubDiv standard I mentioned above, and by default uses the Catmark algorithm. If Blender is producing different results, it sounds as if it is using a different algorithm, and so producing (in this area) a more extreme shrinkage.

    HD morphs act directly on the additional vertices created by the SubD algorithm, allowing for greater detail than simply adjusting the base cage, but with the benefits of morphs (adjustable values, showing up in preview if the division level is high enough) over displacement maps.

    The expectation is that the model will be rendered with SubD applied, though you may (especially in Iray) want to lower its level or remove it for background figures (or perhaps if you want to simulate the look of a statue).

    The expectation is that the model will be rendered with SubD applied, though you may (especially in Iray) want to lower its level or remove it for background figures (or perhaps if you want to simulate the look of a statue).

    OK, now I understand. I shouldn't look for anything called a SubD cage; it's the base resolution model. Thank you.

    Donald

  • lilweep Posts: 2,558

    This is the only active Alembic Exporter thread, so I am just going to dump my question here:

    I can't seem to export Alembic anymore. I don't have the option to export as Alembic under File->Export.

    I have uninstalled and reinstalled the Alembic exporter. It was working fine a couple of months ago.

    Edit: Seems it's disabled in Plug-ins; see the screenshot (Capture.JPG). Is there a way to activate it?

    Edit: Okay, problem solved, I realised you can get the serial number from the DIM by right-clicking on the installed product.