Saving data with the new format

Vasily Levin Posts: 0

What is the correct way to save custom objects that form a hierarchy in the new format?

It cannot be deduced from the documentation and headers, so I have to do some guessing here; please correct me if I am wrong.

Say I have a custom node class Alpha:

class Alpha : public DzNode {};

and a helper:

class AlphaIO : public DzExtraNodeIO {};

and a

class Beta : public Alpha {};

I guess that to implement saving of Beta I have to make BetaIO a child of AlphaIO and explicitly call the parent methods.

Is that how it works, or is there some underlying magic that calls all the IO helpers, so that I just have to implement independent IO classes?

Also, it is unclear how save/load should work if several different objects reference the same one. Previously this worked through readPointer; now I do not see a mechanism for that.


Comments

  • foleypro Posts: 485
    edited November 2012

    Here is a JPEG of the error inside of DS...

    The interesting thing here is that I can save in the ParticleFX file format, aka constructor (which Vasily programmed into the export options), and everything is fine. But when we try to save in .duf format you get the errors shown in the JPEG... I have a zipped log file from DS 4.5 if needed...

    [Attachment: PFX.DSX_2File_.jpg]
  • dtamm Posts: 126
    edited December 1969

    The old way was quite a bit easier: simply inherit from DzBase, implement your save and load, and use writePointer and readPointer for your custom objects.

    The .duf way requires serialization to a text format, or in other words, no pointers. It is significantly more work. You can write all your objects and give each one a unique token of some sort, then write that token wherever you have a pointer to said object. On load you read in all your objects and then resolve your pointers.
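
    As a rough illustration of the token idea (plain C++, nothing DAZ-specific; Thing and resolvePointers are made-up names), the load side boils down to indexing objects by token and then swapping the tokens back for pointers:

    #include <map>
    #include <string>
    #include <vector>

    // Hypothetical serialized object: token stands in for "this" when written,
    // targetToken stands in for the pointer it used to hold.
    struct Thing {
     std::string token;
     std::string targetToken;
     Thing *target;           // resolved after everything has been read
    };

    void resolvePointers( std::vector<Thing> &things )
    {
     // First pass: index every object by its unique token.
     std::map<std::string, Thing*> byToken;
     for ( size_t i = 0; i < things.size(); i++ )
      byToken[things[i].token] = &things[i];

     // Second pass: turn the stored tokens back into live pointers.
     for ( size_t i = 0; i < things.size(); i++ )
     {
      std::map<std::string, Thing*>::const_iterator it = byToken.find( things[i].targetToken );
      things[i].target = ( it != byToken.end() ) ? it->second : 0;
     }
    }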

    Studio itself usually uses the IDs. Here is how to write a URL to a DzNode called dazNode; this would go in your writeExtraInstance:

    io->startObject();
    io->addMember("node", DzAssetIOMgr::getAssetInstanceUri(dazNode).toString());
    io->finishObject();

  • Vasily Levin Posts: 0
    edited December 1969

    And is the save/load initiated automatically for every node, or even for DzBase objects, or do I have to initiate it somehow?

  • dtamm Posts: 126
    edited December 1969

    And is the save/load initiated automatically for every node, or even for DzBase objects, or do I have to initiate it somehow?

    Anything that derives from DzNode and is added to the scene via dzScene->addNode will get saved by the scene. samples\modifiers\DzBlackHole is an example of that.
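
    A rough sketch of that rule (assuming the usual SDK headers and the dzScene global; MyNode and addMyNodeToScene are made-up names):

    #include "dznode.h"
    #include "dzscene.h"

    class MyNode : public DzNode {
     Q_OBJECT
    public:
     MyNode() { setName( "MyNode" ); } // give every node you create a name (see later in this thread)
    };

    void addMyNodeToScene()
    {
     MyNode *node = new MyNode();
     dzScene->addNode( node ); // the scene now owns the node and writes its core data to the .duf file
    }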

    What are you deriving from and who owns it? Scene, another node, etc?

  • Vasily Levin Posts: 0
    edited December 1969

    In addition to nodes that have DzNode as an ancestor, I have objects derived from DzModifier and DzBase. The modifiers are owned by objects that are nodes and are added to them by calling

     getObject()->addModifier(mod);

    The DzBase-based objects are owned by one of the plugin classes and stored in a container.

  • dtamm Posts: 126
    edited December 1969

    In addition to nodes that have DzNode as an ancestor, I have objects derived from DzModifier and DzBase. The modifiers are owned by objects that are nodes and are added to them by calling
     getObject()->addModifier(mod);

    The DzBase-based objects are owned by one of the plugin classes and stored in a container.

    And is the save/load initiated automatically for every node, or even for DzBase objects, or do I have to initiate it somehow?


    Every node added to the scene and every modifier that is a child of that node will get the write initiated on it.
  • dtamm Posts: 126
    edited December 1969

    Do you have custom data on the derived DzModifier?

  • Vasily Levin Posts: 0
    edited December 1969

    dtamm said:
    Do you have custom data on the derived DzModifier?

    Yes, I do.
  • dtamm Posts: 126
    edited December 1969

    dtamm said:
    Do you have custom data on the derived DzModifier?

    Yes, I do.

    Then you will need to do it like MyCustomModifier does. How far are you getting? Or in other words, looking at the .duf file, what is getting written out that looks correct and what does not?

  • Vasily Levin Posts: 0
    edited November 2012

    dtamm said:
    How far are you getting? Or in other words, looking at the .duf file, what is getting written out that looks correct and what does not?

    I do not know yet; it is about halfway to being ready to run. For now I am trying to understand the correct way to handle saving when you have a hierarchy of objects. I will get back to you if I run into difficult problems after getting it running.

  • dtamm Posts: 126
    edited November 2012

    dtamm said:
    How far are you getting? Or in other words, looking at the .duf file, what is getting written out that looks correct and what does not?

    I do not know yet; it is about halfway to being ready to run. For now I am trying to understand the correct way to handle saving when you have a hierarchy of objects.

    For the DzNode and DzModifiers that are known to the system and have their own custom data, you will need to do it like the samples. For your own custom classes, it's more or less up to you. So suppose your DzNode called CustomNode has a pointer to an object of type Happy, where Happy is your own class and not known to Studio.
    - If Happy is only ever known to your CustomNode, then you could have the writer and reader for CustomNode handle Happy in its entirety (see the sketch below).
    - If other objects might have pointers to Happy, then you have to decide who owns it and who is just referencing it.
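
    Here is a rough sketch of the first option, where CustomNode's writer handles Happy in its entirety. The writeExtraInstance signature, the IDzJsonIO type and the getHappy()/getLabel() accessors are assumptions based on the calls shown earlier in this thread, so check your SDK headers for the exact form:

    DzError CustomNodeIO::writeExtraInstance( QObject *object, IDzJsonIO *io, const DzFileIOSettings *opts ) const
    {
     CustomNode *node = qobject_cast<CustomNode*>( object );
     if( !node )
      return DZ_NO_ERROR; // nothing of ours to write

     Happy *happy = node->getHappy(); // hypothetical accessor

     // Write Happy's state inline as part of CustomNode's extra data;
     // no pointer (and no separate asset) is ever stored in the file.
     io->startObject();
     io->addMember( "happy_label", happy->getLabel() ); // hypothetical QString member
     io->finishObject();

     return DZ_NO_ERROR;
    }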

  • Vasily Levin Posts: 0
    edited December 1969

    For me the most interesting case is when you have CustomNode2 derived from CustomNode; that case is not in the examples.

    Should the serialization helper classes such as

    class CustomNode2IO

    and

    class ReadCustomNode2Node

    be derived from the corresponding helpers of CustomNode, or will it work if they are independent?
  • dtamm Posts: 126
    edited November 2012

    For me the most interesting case is when you have CustomNode2 derived from CustomNode; that case is not in the examples.

    Should the serialization helper classes such as

    class CustomNode2IO

    and

    class ReadCustomNode2Node

    be derived from the corresponding helpers of CustomNode, or will it work if they are independent?

    Either way is fine, and each has pros and cons.


    Some confusion can be cleared up by thinking of a DzNode as having the following two parts:
    - The core part
    - The extra part

    The core part gets read and written automatically. CustomNodeIO is an extra-node IO; it only writes the extra part. That is why you don't have to call DzNode's reader or writer from your own derived functions like you used to. Example of chaining up the call, which is no longer required:

    void DzBlackHoleNode::loadSection( DzInFile *file, short sectionID )
    {
     if( sectionID == DZ_BLACK_HOLE_RADIUS_SECTION )
      file->readPointer();
     else
      DzNode::loadSection( file, sectionID );
    }
    

    But if you make your CustomNode2IO derive from CustomNodeIO, then you will want to chain up from CustomNode2IO to CustomNodeIO.
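
    For example, a rough sketch of that chaining (same assumed writeExtraInstance signature as the sketch above; "custom_node2_extra" and getExtraSetting() are made-up names):

    DzError CustomNode2IO::writeExtraInstance( QObject *object, IDzJsonIO *io, const DzFileIOSettings *opts ) const
    {
     // Chain up so the CustomNode-level extra data is written first...
     DzError result = CustomNodeIO::writeExtraInstance( object, io, opts );
     if( result != DZ_NO_ERROR )
      return result;

     // ...then append the data that only CustomNode2 adds.
     CustomNode2 *node = qobject_cast<CustomNode2*>( object );
     if( node )
      io->addMember( "custom_node2_extra", node->getExtraSetting() ); // hypothetical accessor returning a QString

     return DZ_NO_ERROR;
    }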

  • Vasily Levin Posts: 0
    edited December 2012

    I could resolve most of the problems, but I still have several to solve.

    1. How should I implement save/load for objects derived from DzBase? In my case they are owned by another object of DzNode type. I would like their loading to follow the same architecture, but I do not see how to make the DS loader initiate writing/reading of those objects. Maybe I should base them on another class? I first thought about DzSceneData, but the examples have a discouraging comment that a DzSceneData of a given type is a singleton and can have only one instance. It is also not clear whether old scenes would load if I base the class on another type.

    2. What is the difference between instance write/read and definition write/read? When are they called, and what is supposed to be loaded at each stage?

    3. How can I prevent DS from saving some data? I have nodes with auto-generated geometry and do not want to keep it in the file. I am getting a compressed save file of about 10 MB (about 200 MB uncompressed), but a few kilobytes would actually be enough to store my data.

    And 4. is just a complaint: please write documentation for your new serialization system, at least comments for parameters and return values in the declarations; that alone would be a huge help. An architecture overview describing the stages, and which objects are created and initialized at each step, would also help. Without that, implementing serialization to the new format is a very discouraging process.

  • dtamm Posts: 126
    edited December 2012

    I could resolve most of the problems, but I still have several to solve.

    1. How should I implement save/load for objects derived from DzBase? In my case they are owned by another object of DzNode type. I would like their loading to follow the same architecture, but I do not see how to make the DS loader initiate writing/reading of those objects. Maybe I should base them on another class? I first thought about DzSceneData, but the examples have a discouraging comment that a DzSceneData of a given type is a singleton and can have only one instance. It is also not clear whether old scenes would load if I base the class on another type.

    Let's assume you have VasilyThing : public DzBase and VasilyNode : public DzNode.
    - Am I correct in my assumption?

    If you want your VasilyThing to be saved, you can do one of the following:
    1) VasilyNode writes out the data of VasilyThing itself, or
    2) your VasilyThing implements IDzSceneAsset. See MyCustomModifier.h.

    In the case of #2, VasilyNode will still need to tell the system to save VasilyThing using getOwnedAssets.

    
    void MyCustomNodeIO::getOwnedAssets(QObject* object, QList< IDzSceneAsset* >& assets)
    {
     MyCustomNode *node = qobject_cast<MyCustomNode*>( object );
    
     // we don't need to do this since we added our property to the node.
     //  this is shown as an example for when you have assets that are not known to the system.
     //assets.push_back(node->m_property);
    }
    



    2. What is the difference between instance write/read and definition write/read? When are they called, and what is supposed to be loaded at each stage?

    Definition: think library.
    Instance: think an actual thing. An instance can reference a definition (which does not need to be in the same file) and apply extra settings. An instance cannot be referenced outside the file it exists in.
    I would suggest reading the spec while looking at some .duf and .dsf files.
    - http://docs.daz3d.com/doku.php/public/dson_spec/start


    3. How can I prevent DS from saving some data? I have nodes with auto-generated geometry and do not want to keep it in the file. I am getting a compressed save file of about 10 MB (about 200 MB uncompressed), but a few kilobytes would actually be enough to store my data.


    If you create a generic DzFacetMesh and put it on a DzNode, it's going to save with the scene.
    I know for sure you can avoid that if you implement your own Shape/Mesh. You may be able to do it by subclassing DzFacetMesh and overriding some of IDzSceneAsset; I will look into that.

    Or you could watch for the following signals on dzScene and remove your geometry when the save happens:
    void sceneSaveStarting( const QString &filename );
    void sceneSaved( const QString &filename );
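
    A rough sketch of that approach (only the two dzScene signals above are from the SDK; MyGeometryManager and the clear/regenerate helpers are hypothetical stand-ins for your own code):

    #include <QtCore/QObject>
    #include "dzscene.h"

    class MyGeometryManager : public QObject {
     Q_OBJECT
    public:
     MyGeometryManager()
     {
      // Strip the auto-generated geometry just before the scene is written...
      connect( dzScene, SIGNAL(sceneSaveStarting(const QString&)), this, SLOT(onSaveStarting(const QString&)) );
      // ...and rebuild it once the save has finished.
      connect( dzScene, SIGNAL(sceneSaved(const QString&)), this, SLOT(onSaved(const QString&)) );
     }

    public slots:
     void onSaveStarting( const QString &filename ) { clearGeneratedGeometry(); }
     void onSaved( const QString &filename ) { regenerateGeometry(); }

    private:
     void clearGeneratedGeometry() { /* hypothetical: drop the heavy generated mesh */ }
     void regenerateGeometry() { /* hypothetical: rebuild it after the save */ }
    };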



    And 4. is just a complaint: please write documentation for your new serialization system, at least comments for parameters and return values in the declarations; that alone would be a huge help. An architecture overview describing the stages, and which objects are created and initialized at each step, would also help. Without that, implementing serialization to the new format is a very discouraging process.

    The current save and load paradigm is at least a tenfold increase in complexity. The old .DAZ paradigm was great for the programmer and very orthogonal, but for the user it had severe limitations.

    Again, the best place to understand it is to read the spec while looking at some .duf and .dsf files.
    - http://docs.daz3d.com/doku.php/public/dson_spec/start

    Hopefully we will figure out a better way to convey the architecture and best practices.

  • Vasily Levin Posts: 0
    edited December 1969

    Thanks a lot, that helps.

    One more question on 1.

    The most convenient way for me is the MyCustomModifier approach, but what asset type should I set for it? Modifier has a special constant, but my objects are custom and it seems that the existing constants do not fit my model.

    
    IDzSceneAsset::AssetType MyCustomModifier::getAssetType() const
    {
     return ModifierAsset;
    }
    
  • dtamm Posts: 126
    edited December 1969

    Thanks a lot, that helps.

    One more question on 1.

    The most convenient way for me is the MyCustomModifier approach, but what asset type should I set for it? Modifier has a special constant, but my objects are custom and it seems that the existing constants do not fit my model.

    
    IDzSceneAsset::AssetType MyCustomModifier::getAssetType() const
    {
     return ModifierAsset;
    }
    

    return ModifierAsset;

  • Vasily Levin Posts: 0
    edited December 1969

    Still a problem: I return owned assets, but the write functions are never called. I register my custom type with

    
    DZ_PLUGIN_CLASS_FACTORY( ParticleConnector, PARTICLE_CONNECTOR );
    DZ_PLUGIN_CLASS_GUID( ParticleConnectorIO, 50DC77CD-6780-4EA3-BDC6-08D8C0C32684 );
    DZ_PLUGIN_REGISTER_MODIFIER_EXTRA_OBJECT_IO( "Particle_connector", ParticleConnectorIO, ParticleConnector );
    

    What am I missing?

    One thing to note: when it was registered with DZ_PLUGIN_REGISTER_NODE_EXTRA_OBJECT_IO, DS crashed on saving the scene, somewhere while creating the IO object for the extra modifier, and none of my functions were on the stack trace. When I changed it to the DZ_PLUGIN_REGISTER_MODIFIER_EXTRA_OBJECT_IO macro the crashes were gone, but there is still no writing for those objects.

  • Vasily Levin Posts: 0
    edited December 1969

    Can I treat

    virtual DzError DzAssetExtraObjectIO::finalizeInstance( QObject *object, const DzFileIOSettings *opts ) const;

    as PostLoadFile in the previous saving scheme?
  • dtamm Posts: 126
    edited December 1969

    Can I treat
    virtual DzError DzAssetExtraObjectIO::finalizeInstance( QObject *object, const DzFileIOSettings *opts ) const;
    as PostLoadFile in the previous saving scheme?

    It is sort of implied by the header file, but yes, finalizeInstance is the last thing called. The order is like the following:

    For all objects:
    - applyInstanceToObject
    Then for all objects:
    - resolveInstance
    Then for all objects:
    - finalizeInstance
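
    For example, a rough sketch of doing PostLoadFile-style work in finalizeInstance (the signature is the one you quoted; ParticleConnector and rebuildCaches() stand in for your own types):

    DzError ParticleConnectorIO::finalizeInstance( QObject *object, const DzFileIOSettings *opts ) const
    {
     ParticleConnector *connector = qobject_cast<ParticleConnector*>( object );
     if( !connector )
      return DZ_NO_ERROR;

     // By this point applyInstanceToObject and resolveInstance have run for every
     // object, so all cross-references are valid -- old PostLoadFile-style fix-up
     // work is safe to do here.
     connector->rebuildCaches(); // hypothetical post-load fix-up

     return DZ_NO_ERROR;
    }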

  • Vasily Levin Posts: 0
    edited December 1969

    And what about my previous question about the write functions never being called?

    Maybe you need more details?

  • dtamm Posts: 126
    edited December 1969

    And what about my previous question about the write functions never being called?

    Maybe you need more details?

    That I will have to investigate.

  • Vasily Levin Posts: 0
    edited December 1969

    One more strange thing: after loading a saved file the hierarchy structure is lost. Do I have to store it by hand and restore it on load?

  • dtamm Posts: 126
    edited December 1969

    One more strange thing: after loading a saved file the hierarchy structure is lost. Do I have to store it by hand and restore it on load?

    I am not sure what you mean by "saved file hierarchy structure".

  • Vasily Levin Posts: 0
    edited December 1969

    I mean that the scene I save has nodes, child nodes, etc.; when I load it, it flattens into a list of nodes. I do not see how I could have broken it, since no reparenting is done on load.

    So the question is: is the structure kept automatically and I am somehow destroying it on load, or do I need to do something specific to restore the parent-child relations?

  • dtamm Posts: 126
    edited December 1969

    I mean that the scene I save has nodes, child nodes, etc.; when I load it, it flattens into a list of nodes. I do not see how I could have broken it, since no reparenting is done on load.

    So the question is: is the structure kept automatically and I am somehow destroying it on load, or do I need to do something specific to restore the parent-child relations?

    I just double-checked MyCustomNode put into a hierarchy of nodes; it is working.

    Are you setting the names of your nodes to be unique? Use setName on your DzNode.

  • dtamm Posts: 126
    edited December 1969

    It might be helpful to PM me your .duf file so I can take a look at it.

  • Vasily Levin Posts: 0
    edited December 1969

    Even from the smallest particle scene the resulting file is still big and would not fit into a PM, so here is a Dropbox link:

    https://dl.dropbox.com/u/21104304/particles.duf

    It is big, but with most of the auto-generated data folded it is OK to explore.

  • dtamm Posts: 126
    edited December 1969

    Search for the lines in the file that look like:

    
     "scene" : {
      "nodes" : [
       {
    ...
    

    The first node is the camera and looks fine.
    The second node looks fine.
    The third node has no id or name. I am pretty sure that is the problem. Please give all the nodes you create a name; it does not have to be unique.

  • Vasily Levin Posts: 0
    edited December 1969

    I managed to move significantly further with my save/load code.

    Now I have a question: can I control the order of finalize calls for sibling nodes? Or is there some guaranteed order of the applyInstanceToObject/finalizeInstance calls?
