Comments
Customer reviews are important, and unfortunately there's no such feature here... but no need to give up eating for fear of choking. I'm still willing to leave room for PAs to make improvements, while keeping an eye on this stuff.
ok, checked a few sets, this "issue" seems to only be in older sets, newer ones don't have single-color metallic maps.
How it works is that if you have a map and the metallicity slider set to max, the map defines the metallicity. With a black map and the slider at max, you can remove the map and set the slider to zero, which is the same thing.
For a uniform greyscale map, remove it and set the slider to that grey value remapped from 0-255 (RGB8) to 0-1, so a grey of 132 becomes 132/255 ≈ 0.518.
Might be able to do this scriptwise, but the tricky thing is when you have a 4K black map with 8 white pixels on it for making a single tiny surface part metallic. It takes a while to check every pixel on the map to see if they differ, especially on 4K+ maps.
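If anyone wants to script that check outside Studio, here's a minimal sketch in Python using Pillow (my own choice of tooling, not a Daz Studio script, and the file name is just an example). Pillow can report the whole-image min and max without a pixel-by-pixel loop, so even a 4K map is checked quickly, and if the map turns out to be uniform it prints the 0-1 slider value described above.

from PIL import Image   # Pillow: pip install Pillow

def uniform_map_value(path):
    """Return the 0-1 slider value if the map is a single grey, else None."""
    img = Image.open(path).convert("L")   # collapse to 8-bit greyscale
    lo, hi = img.getextrema()             # whole-image min/max, computed in C
    if lo == hi:                          # every pixel has the same value
        return lo / 255.0                 # e.g. grey 132 -> 0.518
    return None                           # the map actually varies, keep it

value = uniform_map_value("metallic_map.png")   # hypothetical file name
if value is None:
    print("Map is not uniform; leave it alone")
else:
    print(f"Flat map; remove it and set the slider to {value:.3f}")

A black map with 8 white pixels comes back with different min and max values, so it is correctly left alone.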
Same here :)
well the Banquet Hall I used as an example is new and it has a heap of them
possibly the most I have seen
I reduced 100 maps in that set
I'll check if I have that, but I have others with black maps (like Mystic Alehouse); I'm using that as a dev test set right now.
and as for IrfanView, I use its batch processing a lot, and soon into the task I regretted not isolating those maps and using it in this case
Hi guys, you can also use Windows PowerToys to batch process images, for resizing, renaming and other things like this. It works from the context menu and is really fast.
I've wondered that a lot. In general Daz assets are good enough in quality but not optimized at all, in some cases really wasting resources for nothing, such as the large black/white textures reported here. Both DAZ and the PAs don't seem to care, as this practice has been going on for a long time. I've come to the conclusion that unfortunately most Daz assets need to be reworked to be used in production; I do the rework once anyway to fit Blender, so it's not a big deal for me, but it's certainly a waste of time and a defect in the DAZ shop.
I use it because it automatically fixes file extensions that don't match file types. Sometimes the web browser will save a file as .jpg when it's really a webp file, or a file will be saved with .html or no extension at all, and IrfanView figures out what it is and gives it the right file extension. Then the file loads properly in everything else.
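For anyone who'd rather script that check, here's a rough Python equivalent using Pillow (just my own assumption of tooling, and the file name is made up): Pillow identifies the real format from the file contents rather than the extension, so a .jpg that is actually WebP shows up immediately.

from PIL import Image   # Pillow: pip install Pillow

path = "downloaded_texture.jpg"     # hypothetical file name
with Image.open(path) as img:
    real_format = img.format        # e.g. "JPEG", "PNG", "WEBP"
print(f"{path} is really a {real_format} file")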
I feel more than ever like I found my tribe.
Don't get me started on this topic, it's a major pet peeve for me. This is one of the many reasons why I manually install products. I check them over for all-black, all-white or flat greyscale maps and use XnConvert to batch process them down, and I even drop the color depth. I also check for single-color maps, and I double check for oversized maps like repeating patterns that don't need to be 8K or 4K. I also wish other PAs would take the time to learn a bit more about the Iray Uber shader and all its little secrets that can help make sets better with smaller textures, instead of just tossing the model into Substance and taking whatever they are given. Making sure all parts of a model that share the same map are at the same UV scale is also helpful.
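For the curious, the resizing half of that pass is easy to sketch in Python with Pillow too (folder names and the 2048-pixel ceiling below are my own illustrative assumptions, not a recommendation; XnConvert's colour-depth reduction isn't reproduced here):

from pathlib import Path
from PIL import Image   # Pillow: pip install Pillow

SRC = Path("textures")              # hypothetical input folder
DST = Path("textures_small")        # hypothetical output folder
DST.mkdir(exist_ok=True)

for path in SRC.glob("*.png"):
    with Image.open(path) as img:
        # shrink anything larger than 2048 on its longest side, keeping aspect ratio
        img.thumbnail((2048, 2048))
        img.save(DST / path.name)

thumbnail() only ever shrinks, so smaller maps pass through untouched.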
As a PA who is also a customer, I really wish other PAs would take the extra time to optimize their sets, regardless of what it is. Our customers can't afford to upgrade their PCs whenever they want, so optimizing our products is a good way to ensure better compatibility with older hardware and, in the end, improve their experience and help build better brand loyalty.
Just a little VRAM tip: both higher-resolution images and higher image bit depths use more VRAM. And the file size on disk does NOT reflect how much VRAM an image can use. Open the image in a paint program and it can usually tell you how much RAM it takes.
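As a back-of-the-envelope illustration (my own numbers, not anything measured from Iray, and it ignores mipmaps and whatever compression the renderer applies on the GPU), the raw footprint is roughly width x height x channels x bytes per channel:

# Rough uncompressed texture footprint.
width, height = 4096, 4096
channels = 3              # RGB
bytes_per_channel = 1     # 8-bit; use 2 for a 16-bit map

size_mb = width * height * channels * bytes_per_channel / (1024 ** 2)
print(f"about {size_mb:.0f} MB")   # ~48 MB for an 8-bit 4K RGB map, ~96 MB at 16-bit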
Don't get me started on this topic . . . . . . .
Correct, the images use VRAM based on the uncompressed size, not the compressed size on disk; in memory they take up something much closer to the uncompressed size, like the .bmp format.
If VRAM usage is based on a texture's uncompressed size, why are PAs distributing them in losslessly compressed .png and .tiff formats rather than .jpg? And why are almost all character textures distributed in 4K with no low-res options?
Because PNG and TIFF are just that, non-destructive (lossless) compression. JPG will ruin any delicate normal map with its lossy "nearest similar" and "find repetition" style algorithms.
Tiff and PNG are lossless and so preserve full quality - that can make a difference to some maps (some will show jpg compression artefacts).
Thank you, it is nice to see that at least some PAs do care about optimization. As for the bit depth, in my old tests with Iray it doesn't seem to affect the VRAM so much, probably because of the Iray texture compression. Other packages such as Blender are more sensitive to bit depth since Cycles has no texture compression. I agree it is a good practice to care about bit depth anyway.
Some of the assets look decent enough in render, but if you're starved for VRAM... it's often a terrible use of resources to use some of them. I have it in mind to just purge my library of all of FG's environment assets, which is extra annoying because it's not just a matter of uninstalling from the IM. Sure, I can just edit the assets, but that's not my job. This becomes especially attractive when I'm currently running out of space on my primary drive. I'll still keep the DIMs (on my portable archive) so I can re-install when I build a new PC and VRAM becomes less of an issue. Also, I don't want to make it personal, but there are a lot of assets in this PA's catalogue that are similar to other (often earlier) works by artists that went through the process properly, look better in render, and don't gobble 80% of the VRAM in any one scene. On the other hand, the clothes don't always seem to have this issue.
Going through files and suddenly finding gigs, gigs! of space cleared up on the HD; now reconsidering keeping the products in archives on principle. The PA's name is the clue... like the Asylum of DAZ PAs. OP is an angel for posting this and confirming the issue. Thank you!
I finally bought https://www.daz3d.com/code-66-toolbox--volume-5 with a discount code the other day and the flat map script works a charm
it's a shame we need to fix other PAs' choices, but it is what it is, I guess
I'm not disagreeing at all, but from what I've seen stated here in many questions about the QC process, that level of detail is outside the general purview of the DAZ QC process, which is primarily concerned with making sure that everything listed for the product is included and that the product will load correctly on the current version of DS. And there is a certain level of sense to that: if every QC pass were conducted to that level of detail, products would inevitably begin bottlenecking in QC.
The thing is that some 3D engines don't support "number-fed" material params like the Daz Iray Uber shader does, so they need a flat map to work, and that map needs to be the same size as the other maps for the same material.
Now, here's the opposite idea: for exporting out from Studio to other engines, a "flat map generator" for those zones that only have numbers, and then there would be no reason at all to have flat maps in Studio content.
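Just to show how little such a generator would need (a sketch of my own in Python with Pillow, not an actual exporter feature; the value, size and file name are made-up examples): given the numeric material value and the size of the other maps, it writes a matching uniform map.

from PIL import Image   # Pillow: pip install Pillow

def make_flat_map(value, size, out_path):
    """Write a uniform greyscale map for a 0-1 material value."""
    grey = round(value * 255)                       # e.g. 0.518 -> 132
    Image.new("L", size, color=grey).save(out_path)

# hypothetical example: a 4096x4096 flat metallic map for a slider value of 0.518
make_flat_map(0.518, (4096, 4096), "metallic_flat.png")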
This is the beauty of different 3D engines, they all work differently and have different requirements....
There are PNG to JPG converters out there that claim to be lossless but exactly what that means I'm not sure.
I don't believe that is possible, since JPG is a lossy format. On something you directly look at, like the final render, or on a diffuse map with very low or the lowest level of compression, it may be impossible for the human eye to tell the difference.
I remember doing some tests with a normal map when I was using a JPG for a sticker; the edges were getting pixelated-looking even with the least amount of compression and a fairly large image size. It all went away when I used a PNG normal.
I checked Wikipedia, it seems to be true actually, though not with the standard jpg algorithm (as far as I can read):
https://en.wikipedia.org/wiki/Lossless_JPEG