Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2024 Daz Productions Inc. All Rights Reserved.
Comments
I would be very surprised if it isn't possible, but I'm afraid I am not completely clear what it is that you want to do ('rendering alpha channels for individual surfaces'). Could you explain what you are doing in DS - perhaps with (a) screenshot(s)?
You might try this from the ubiquitous Andrew Price:
Disclaimer - I haven't watched it yet, so I'm not exactly sure what it contains; I'm going by the title, 'Where everything is in Blender 2.8'.
(I see Mr Price is sticking steadfastly to '2.8'; I'm leaning towards 2.80 ('two point eighty') because 2.81 ('two point eighty-one') is in alpha already.)
I just watched this one because I'm a bit bewildered about the best way to start learning. The guy has a delivery like an automated railway announcement but he does deliver good advice.
Thanks for the clarification, Wolf... I bought pretty much the whole suite of RL stuff, but was extremely disappointed when their much-touted support for G8s in CC3 did not support JCMs, even after they said it was "fully" supported. I kind of rage-quit RL after that. But 3DXchange as a cheap MB is not something that one can ignore. When WebAnimate didn't support G8s either, I was afraid that I might be stuck with MB.
By this time next year DAZ Studio will have equaled or surpassed most of those features & the pipeline features are already in DAZ Studio. Really only the animation capabilities are lacking.
Andya, sorry for the delay and if I wasn't clear enough before. Below is an example of a mask from a pic I'm currently working on. I only wanted to mask the skin of these wings, not the bones or limbs. I've seen Blender export masks of individual objects, but I don't yet know if it can create these sorts of masks from specific materials or object surfaces.
You can pretty much tell the shader to do anything you want; shaders can get very complicated, and do.
I haven't done much in Cycles for a couple or more years now, but I used to use it often; the car in my profile is one example.
Understand that CC3's Genesis "support" is essentially a shape-projection algorithm similar to the GenX2 plugin for Daz Studio.
You get an iClone base avatar "doppelganger" of whatever G3/G8 character you imported via FBX,
so the Daz JCMs or HD morphs are never part of the equation.
Is there new info in the changelog that confirms this??
One of the main pipeline features of iClone/3DXchange is the ability to import an FBX rig from very nearly any other character program, apply real-time iMotion data to it, and export the data back to your external program as BVH, FBX, or even Alembic.
Daz Studio has no useful FBX import capability that I have seen demonstrated, even in the 4.12 beta.
There is also the Live Face camera-based facial-animation mocap system, support for the full-body motion-capture hardware that is optionally available for iClone, and a real-time live link to Unreal 4.
If Daz Studio "equals or surpasses" these features by this time next year, I would love to know where this roadmap of advanced animation pipeline features has been published by Daz.
No problem. I think you can do what you want, if I have got the idea correctly. You will have to render in Cycles and use the compositor.
For example, to get a mask based on a particular material or materials: I have a cube with two materials, one plain red and one blue with a check texture as a transmap, sitting on a plane with a green material of its own. You can see the blue material set up and the render preview in the first image.
In the View Layer tab, find the Cryptomatte subsection in the Passes section and select the Material button. Hit F12 to render.
Now click on the Compositing workspace tab along the top of the viewport. You will have a Render Layer node and a Composite node connected to it. Add a Viewer node and click on the Backdrop button at the top right of the viewport (see second image). You should see your render behind the nodes as it appeared in the render window.
Next, add a Cryptomatte node between the Render Layer and the Viewer node and connect them up as shown in the third image. This should give you the 'Pick View' behind the nodes, with each material identified by a different color. Click the Add button in the Cryptomatte node, and use the eyedropper to select the material you want to create a mask for, e.g. I selected the greenish color representing the blue material with the transmap. You should see an odd-looking value appear in the field under the Add button.
And now the magic happens. If you connect the Image output from the Cryptomatte node to the Image input of the Viewer node you will see an image with just the selected material, and if you connect the Matte output to the Image input you will see what is essentially an alpha mask for your selected material (image four).
To save this as an image, connect the Matte output from the Cryptomatte node to the image input of the Composite node. Your render window should now show the mask image, and you can save it as normal from the Image menu (picture 5).
You can select multiple materials on multiple objects - for the last image, I picked the red material on the cube and the plane material, after removing the blue material (using the Remove button and selecting with the eyedropper again). Of course a semi-transparent area on a material will give you a corresponding grey color in your alpha mask.
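If it helps to see what the Matte output actually represents, here is a toy sketch: the matte is essentially a per-pixel value between 0 and 1 that is the picked material's coverage of that pixel (which is why semi-transparent areas come out grey). The material names and pixel data below are invented for illustration; real cryptomatte stores hashed IDs and coverage in extra render passes.

```python
# Toy model of the Cryptomatte Matte output: for each pixel, emit the
# material's coverage if that material was picked, otherwise 0.0.

def material_matte(id_map, coverage_map, picked):
    """Build an alpha mask from a per-pixel material ID map and coverage map."""
    return [
        [cov if mat in picked else 0.0 for mat, cov in zip(id_row, cov_row)]
        for id_row, cov_row in zip(id_map, coverage_map)
    ]

# A 2x3 "render": the material covering each pixel, plus how much of the
# pixel it covers (a semi-transparent area gives partial coverage).
ids = [["red", "blue", "blue"],
       ["plane", "plane", "blue"]]
cover = [[1.0, 1.0, 0.5],
         [1.0, 1.0, 1.0]]

mask = material_matte(ids, cover, picked={"blue"})
print(mask)  # [[0.0, 1.0, 0.5], [0.0, 0.0, 1.0]]
```

The 0.5 in the top-right pixel is the grey you'd see in the mask where the transmapped blue material is only half opaque.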
Hope that is clear enough to follow.
No, there is no official DAZ 3D confirmation of that. It's just me talking out of the side of my mouth, but I think it will be the case. I don't know what iMotion data is. I have imported FBX models before, and you are right, they often fail, but I will start filing tickets in the future when they fail, so at least DAZ has a record of who is trying to use FBX in DAZ for what.
The FBX export I've done has worked well in Unity with the observation that they are 'good quality' for games but not DAZ Studio iRay render quality.
Thanks again Andya, you obviously know Blender very well, and I know from your explanation that it is still ridiculously user-unfriendly. :) It apparently took 21 freaking years for the Blender team to add a camera rotation widget, so I'm not going to hold my breath on them making material ID renders any easier either (speaking of which, like Studio, Blender doesn't anti-alias its ID maps either? Are there any actual artists working on these programs?).
No question Blender is powerful, but it's still not for me. I appreciate the help though and thanks for taking the time to answer my questions.
Indeed, that is my iClone retargeting pipeline as well.
My exported Genesis characters never travel beyond the 3DXchange app, where they receive the real-time iMotion data I create in iClone; I then export the BVH back to the "actor" in Daz Studio for final tweaking, lip-sync, and costuming before export as .obj/MDD to Maxon C4D to render.
Neither iClone nor Daz Studio is a suitable final render environment for my types of productions,
because large, complex scene management (hundreds of scene items in nested hierarchies)
is Mission Impossible in iClone and Daz Studio, to say nothing of the rather poor compositing and VFX options.
Do you use Eevee (which is set as the default render engine in 2.8) or Cycles? If you've used Eevee, you need to set MATERIAL-TAB --> OPTIONS --> Blend Mode = Alpha Clip, or whichever alpha option suits you.
Sadly, unless your GPU supports OpenGL 3.3, I don't see another workaround/cheat/trick to make it work.
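For anyone who prefers to set this from a script, here is a minimal sketch of the same setting. It only runs inside Blender (it needs the bundled `bpy` module), and the material name "Wings" is just a placeholder for whatever material you are adjusting.

```python
# Run from Blender's Python console or Text Editor, not a standalone Python.
import bpy

mat = bpy.data.materials["Wings"]   # "Wings" is an example name
mat.blend_method = 'CLIP'           # Material tab --> Blend Mode = Alpha Clip
# Other Eevee options: 'BLEND' (Alpha Blend) or 'HASHED' (Alpha Hashed)
```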
Yes, in this case not a cinch for sure. I'm no expert on UX/UI design, but I see the tension between ease of use and power/flexibility is unresolved in many software applications, so it must be a hard problem. If one is interested in doing thing A at present in some complex application, then it can be hard to do, and the fact one can also do things B through Z is no consolation at that particular time!
Well, I think Blender does, but you have to 'play' with the pixel filter width settings a bit (Render tab->Film->Pixel Filter). The default 1.5 pixels may not always give the best results. (I seem to recall there was a checkbox to toggle anti-aliasing on/off in versions prior to 2.80, but it seems you can't disable it entirely now.)
I think that's partially caused by the differences between how programmers and artists think. A programmer's priority is to get the feature into the software; if it's difficult to actually use, that's not their problem. An artist comes up with the idea for the feature but has no clue how to actually implement it. :)
I won't give up on Blender completely, but I can't think about making it my primary program until it can render these masks without *too* much work and as long as it has trouble handling DAZ content. What frustrates me is that they've had 21 years to work on this software with a ton of community support and yet it's still rather unnecessarily complicated and unintuitive compared to other similar programs. Apparently not too many right-brained members in that community, heh.
Thanks again for your help so far.
Not sure if this got answered yet as I'm still catching up on the thread, but you can easily do this with cryptomatte in Blender.
The result will be a separate render with a different solid color for each material. I actually find this much faster than having to render separate masks for each material - you just take the one image into (in my case) Photoshop, colorpick the color and use that selection to add a mask to whatever you are doing. It's two clicks to get a mask and no extra rendering time, which really speeds things up.
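The colorpick-then-mask step above can be sketched in a few lines. This is a toy stand-in for what the Photoshop magic wand does, assuming the cryptomatte composite really does use one exact flat RGB color per material; the tiny nested list stands in for real pixel data.

```python
# Build a binary mask from a flat-color material ID image by picking a color,
# mimicking a zero-tolerance magic wand selection.

def mask_from_picked_color(image, picked_rgb):
    """Return 255 where the pixel exactly matches the picked color, else 0."""
    return [[255 if px == picked_rgb else 0 for px in row] for row in image]

id_image = [
    [(200, 40, 40), (40, 40, 200)],   # red material | blue material
    [(40, 200, 40), (40, 40, 200)],   # plane        | blue material
]

blue_mask = mask_from_picked_color(id_image, (40, 40, 200))
print(blue_mask)  # [[0, 255], [0, 255]]
```

Note the exact-match comparison: this only works cleanly if the ID image has hard, unblended edges, which is exactly why anti-aliased edges on an ID map cause trouble for this kind of selection.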
EDIT: Whoops - well maybe I should finish reading the thread before replying as I see this was actually answered already. My way is a slight variation on the method above. I think andya is trying to perfectly mimic what you currently do which is a slower workflow than this, although my way would require a slight change in how you work.
Screenshots of the steps below. My example object is a cylinder with three material nodes.
TURNING ON THE CRYPTOMATTE RENDER LAYER
NODE SETUP IN THE COMPOSITING WORKSPACE
COMPOSITED RENDER FOR EXPORT
Anyone know how to put a script on a custom button somewhere in the UI? Or something like that?
ArtOfMark, that seems pretty easy, but can you anti-alias that resulting render? I still don't quite know why anyone would need a jagged Material ID render because it's useless for masking in Photoshop, but both DAZ Studio and Blender seem to want to only offer that option.
The composite render using cryptomatte is exactly as anti-aliased as the regular render. Mine just looks "stair stepped" because I rendered at a really low resolution - if I had uploaded the render too you would see that they match up exactly. I just tried it with a more complicated, higher resolution image and it definitely works.
I'll test it out tonight then, thank you.
EDIT: I tested it and it works - but I still can't figure out why. The Material ID map is clearly not anti-aliased when I look at it, but when selected and altered, it seems - so far - to give a satisfactory result. Is there something more technical going on with the edges appearing more defined than they really are or with the export format? Just seems weird. I need to test it with a much larger image and make sure my selection tool isn't anti-aliased as well, but this might solve one of the bigger problems I had with Blender. Thank you very much.
EDIT: OK it does seem Cryptomatte is the way to go, but I haven't figured out the options yet. Sorry for the multiple posts, will try to wrangle this soon.
Thank you very much for posting these steps. I would never have imagined that it was possible in Blender. What a powerful program it is.
OK, I think I figured it out but it's a bit more difficult than some of these explanations if you want to get these masks over to Photoshop for postwork. You have to set up a viewing node to show the background, then constantly disconnect/switch the image with the matte view to see what masks you're actually putting together with the eyedropper, and then open a new window set to Image Editor to actually save the masks out as individual renders. This tutorial does a pretty good job of explaining it.
Powerful? Absolutely. Intuitive and designed with artists in mind? Hilariously not. Thanks to those who helped put me on the right path to figuring this out though, I appreciate it.
I think it can be easier than that. You can set up as many viewer nodes as you like. Whichever one you have selected is the one that you will see in the backdrop. So, for example, one for the full image, one for the pick output, one for the image output of the cryptomatte node. You could have one for the matte output. No more connecting/disconnecting.
You can select to render to a new window, which saves opening an image editor window manually. Connect the Matte output from the Cryptomatte node to the image input of the Composite node and that is what will appear in your render window.
There are artists employed by the Blender Institute, who have produced several animated 'open movies' (shorts, not features), including 'Spring' which was used as a testing ground for 2.80 while it was in development. (See also 'Big Buck Bunny', 'Cosmos Laundromat', 'Elephants Dream' for previous releases). I have to presume they were, are, and will be consulted as part of the ongoing development process.
https://www.youtube.com/watch?v=WhWc3b3KhnY
https://www.blender.org/press/spring-open-movie/
Thanks, I will experiment with those suggestions later. I also don't literally mean there are no artists involved with Blender, some amazing things have certainly been created with it. Many would agree that it's pretty unintuitive though and has been for decades.
Just curious, there isn't a way to make it recognize the Z axis as forward and back instead of up and down, is there? Yet another thing that makes coming to this program from another a pain in the neck. ;)
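As far as I know there's no preference to change Blender's world up axis (it is Z-up throughout), but the FBX/OBJ exporters let you pick Forward/Up axes on export, and the underlying remap is just a coordinate swap. A minimal sketch of one common Y-up convention (the exact sign convention depends on which "forward" you choose in the exporter):

```python
# Remap between Blender-style Z-up coordinates and a Y-up convention where
# the old "forward" (y) becomes -z. This is one common choice, not the only one.

def z_up_to_y_up(v):
    """Convert a Z-up point (x, y, z) to Y-up: height moves from z to y."""
    x, y, z = v
    return (x, z, -y)

def y_up_to_z_up(v):
    """Inverse remap, back to Z-up."""
    x, y, z = v
    return (x, -z, y)

p = (1.0, 2.0, 3.0)          # Z-up: z is "up", y is "forward"
q = z_up_to_y_up(p)          # (1.0, 3.0, -2.0): height now lives in y
assert y_up_to_z_up(q) == p  # the remap round-trips cleanly
print(q)
```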
I still feel like you are overcomplicating this process. What is the need to export all the different masks from Blender?
When I do this, I save only two images from Blender: the render, and the composited render you get by hooking the Pick output up to the Composite node, showing a separate flat color for each material. Then in Photoshop, when I want to make adjustments to just one material, I use the magic wand tool to select that material on the cryptomatte export and just click the mask button to add a mask using that selection to whatever layer or group of layers I am working on.
I don't create and save out a billion masks in Blender - that's a ton of unnecessary work. I just export the render and the cryptomatte composited render and do all my masking right in Photoshop.
Because the cryptomatte composite is not assigning a different color to each material. It's only using four or five shades of blue and green, and I've had the white areas connect to other white ones, making selections difficult or impossible. If it could make something like the Random viewing option does, where every object is given a unique color to help define them in complex scenes, I'd definitely do what you're saying.
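The "unique color per material" idea asked for above could be approximated by hashing each material name to a hue, so that every material gets a stable, saturated, distinguishable color. This is purely an illustration with invented material names; Blender's Random display coloring and the cryptomatte pick view use their own internal schemes.

```python
# Map material names to well-spread, saturated RGB colors via a stable hash,
# so the same name always yields the same color across renders.
import colorsys
import hashlib

def id_color(name):
    """Hash a material name to a bright RGB color (0-255 per channel)."""
    digest = hashlib.md5(name.encode("utf-8")).digest()
    hue = digest[0] / 255.0                       # stable value in [0, 1]
    r, g, b = colorsys.hsv_to_rgb(hue, 0.9, 0.9)  # keep it bright and saturated
    return (round(r * 255), round(g * 255), round(b * 255))

for mat in ["Skin", "WingBone", "Membrane"]:      # hypothetical material names
    print(mat, id_color(mat))
```

Because the hash is deterministic, the same material always gets the same color, which is what makes a flat-color ID image reliable to colorpick in Photoshop.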
I'm still waiting for someone to explain why at least two different programs (Blender and Studio) offer Material ID renders but not anti-aliased ones. That's all I want, an anti-aliased render with every material assigned a slightly different color.