Daz Studio Pro BETA - version 4.21.0.5! (*UPDATED*)


Comments

  • Richard Haseltine Posts: 100,781

    jbowler said:

    johndoe_36eb90b0 said:

    Richard Haseltine said:

    While those could indicate noise they could also indicate more successful rendering of subtle distinctions.

    Both renders at 95% convergence still have somewhat visible fireflies if you go pixel-hunting; they are just in different places on the character's skin. If rendering were more successful, those would be gone in 4.20.

    Not at 95%, because that says "ignore 5% of the pixels, your choice."  I have, in the past, tried 100% convergence with a render quality of around 10 just to see what would happen; this was on a scene which converged rapidly anyway: no emissive surfaces in frame, minimal surface reflection, no HD textures, no complex geometry like strand-based hair, minimal subdivision.  The PNG file size got consistently smaller but I really couldn't see any improvement in the image.  I do always use the firefly filter and the geometrically correct Gaussian pixel filter, sampling radius 1.5 (though no one has explained what that means, to my knowledge).
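    For what it's worth, a Gaussian pixel filter in a typical renderer just weights each sample's contribution to a pixel by a Gaussian of its distance from the pixel centre, and the radius clips how far out a sample can still contribute; whether Iray's "sampling radius 1.5" means exactly that is an assumption on my part. A minimal sketch in Python:

        import math

        def gaussian_pixel_weight(dx, dy, radius=1.5, alpha=2.0):
            # Weight of a sample offset (dx, dy) pixels from the pixel centre.
            # Assumption: a truncated Gaussian, renormalised to reach zero at
            # the cutoff radius (the classic reconstruction-filter form);
            # Iray's actual definition of "sampling radius" is not documented here.
            d2 = dx * dx + dy * dy
            if d2 > radius * radius:
                return 0.0
            return max(math.exp(-alpha * d2) - math.exp(-alpha * radius * radius), 0.0)

        # The pixel colour is then sum(w_i * sample_i) / sum(w_i) over the
        # samples whose offsets fall inside the radius.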

    That is actually why I bothered to calculate PSNR.

    That only works with a Beauty canvas, i.e. with the full original dynamic range and no tone mapping, but the noise on the brightest pixels will swamp that in the darker ones even using the logarithmic scale (i.e. noise in dB).  The challenge for NVidia is that they have the full-precision, floating-point values in their hands and have to make some guess as to what the tone mapping will do.  I think in the past it must have simply ignored values above the tone-mapped peak (i.e. 1.0), because of all the comments about making sure the scene is "fully lit", but for certain it wasn't doing that in recent versions prior to 4.20.1.
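    For reference, this is roughly the calculation I mean, assuming the two beauty canvases are available as linear floating-point arrays (e.g. loaded from EXR); the choice of "peak" for HDR data is exactly the ambiguity described above:

        import numpy as np

        def psnr_db(reference, test, peak=None):
            # PSNR in dB between two linear (un-tonemapped) beauty canvases.
            # peak: the value treated as full scale; for HDR data there is no
            # obvious choice, so default to the reference maximum, but using
            # 1.0 (the tone-mapped white point) gives very different numbers.
            reference = np.asarray(reference, dtype=np.float64)
            test = np.asarray(test, dtype=np.float64)
            if peak is None:
                peak = reference.max()
            mse = np.mean((reference - test) ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak * peak / mse)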

    Of course, NVIDIA can claim (by using some other image quality metric such as SSIM) that the image quality is now much better because there are more distinct "colors".

    However, even if the quality is measurably better by some other image quality metric, sacrificing so much performance to get there while still not reaching better convergence should, in my opinion, not be an acceptable tradeoff.

    Changing "Render Quality" or "Rendering Converged Ratio" (did that name just change?) makes absolutely no difference to the quality of the output; that's why they can be changed on the fly during a render. 

    Render Converged Ratio is how many pixels are counted as converged; Render Quality is how fussy Iray is about what counts as converged.

    They do exactly the same thing as "max time" (which I always set to 0) and max samples (which I always set to -1); they just stop the render at some point.  One approach is to ignore them all; disable "Render Quality Enable", max time:=0, max samples:=-1 (turn off limits) then just stop the render when it looks good enough.  Previously (prior to 4.16) it was possible, with complete reliability, to change any of the four settings mid-render; cancel, change the settings, restart, but 4.16 seemed to break the quality/convergence changes, at least with a canvas.  Cancelling the render saves the output PNG or JPEG and the canvases, so it should always be possible to cancel, do not change anything, examine the saved PNG/JPEG and/or canvases and restart if they are not good enough.  I find this is a good general approach to using DAZ.
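    A rough sketch of how the two stop controls plausibly interact, purely as described above; the per-pixel error estimate and the threshold constant are placeholders, since the actual Iray heuristics are not public:

        def should_stop(pixel_error_estimates, render_quality=1.0,
                        converged_ratio=0.95, base_threshold=0.01):
            # A pixel counts as converged once its estimated remaining error
            # drops below a threshold; Render Quality tightens that threshold,
            # and the Converged Ratio says what fraction of pixels must pass
            # before the render stops.  Neither changes how any pixel is
            # actually computed.
            threshold = base_threshold / render_quality
            converged = sum(1 for e in pixel_error_estimates if e <= threshold)
            return converged / len(pixel_error_estimates) >= converged_ratio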

    DAZ can certainly be dinged for allowing a setting in the render window to be changed mid-render in a way that causes a restart, but the only thing apart from that which I see as an issue is that the behavior of the controls should be both consistent and documented.  It doesn't matter if the behavior depends on NVidia; the UI is DAZ's and therefore the onus is on DAZ not to change the behavior of UI controls without extensive warning and, preferably, workarounds.  It's the same as ghost lights.

  • Richard Haseltine Posts: 100,781

    innes53_4e67625942 said:

    IceCrMn said:

    Hmm, I'm seeing a 10-second shutdown with this new beta.

    I've noticed this for a long time...there is a solution...After closing Daz Studio, open Task Manager (I have a shortcut to Task Manager on the task bar) and you'll see a Daz file still open...just close it in Task manager. Then reopen Daz if you want.

    Don't. If you want to open a new version of DS while the original is closing, use an instance (Daz Studio Pro 4.12 - instances). Force-quitting is asking for trouble, including corruption of the content database.

  • jbowler Posts: 794

    johndoe_36eb90b0 said:

    So you don't have to use any SUB filter to encode that image when you can encode it well enough with any other method including RLE.

    Zlib does RLE and a whole lot more very efficiently; the whole point of the filtering is to improve the performance of the LZ compression.  The gains tend to be small on continuous-tone images, but then PNG was never intended for continuous-tone images.  On common non-sampled images such as screenshots (on a system without anti-aliasing, which was every system when PNG was designed) UP and SUB work really well for things like window edges; indeed, for any low-color image which is not properly sampled (anything with a sharp edge in it) the gains are significant.  This was the rationale for doing the filtering and it has stood the test of both time and experiment.
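    For anyone following along, SUB and UP are nothing more than per-scanline byte differences; a minimal sketch (8-bit grayscale, one byte per pixel, so the "byte to the left" really is the previous pixel):

        def filter_sub(scanline):
            # PNG SUB filter: each byte minus the byte to its left, modulo 256.
            return bytes((scanline[i] - (scanline[i - 1] if i else 0)) & 0xFF
                         for i in range(len(scanline)))

        def filter_up(scanline, previous):
            # PNG UP filter: each byte minus the byte directly above it, modulo 256.
            return bytes((b - p) & 0xFF for b, p in zip(scanline, previous))

        # A flat run or a smooth horizontal gradient becomes a long run of
        # small, repeating values after SUB, which zlib's LZ77 stage compresses
        # very well; random noise stays random and does not.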

    The point I am making is -- the amount of randomness in the lowest 3-4 planes determines how well the image can be compressed. Your contrived example is a bunch of repeating patterns, so despite having more "colors" (or shades of gray) it is a well-known exception to the rule I mentioned.

    This is where we disagree; compressing a planar format does not work if there are patterns in the pixel values, because the planes themselves don't work as an efficient filter for the pixel values.  Your own example refutes your second point; look at the sky: it is a gradient, and it is a smooth gradient in the original scene (i.e. with no introduced noise).  Such gradients occur frequently in real-world scenes (sky, curved walls, or flat walls lit from close up) and contrived scenes (photographs using a backdrop, many DAZ renders I've seen which use a flat background).

    Take a close look at the low-order bitplane in your Lichtenstein image.  The striations from the gradient in the sky on both sides of the building are clearly visible; however, they are blocked by the digitization; sub-sample displacement of the gradient is enough to shift the color values between adjacent complete samples.  This is digitization noise; perhaps there is a low level of actual noise in the samples too, but I can't see any way to distinguish that from the digitization noise.
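    To make the bitplane point concrete, this is the kind of extraction I assume was used (Pillow + NumPy); the low-order plane of a smooth gradient shows exactly those striations:

        import numpy as np
        from PIL import Image

        def bitplane(path, plane=0):
            # Return bit `plane` (0 = least significant) of an 8-bit grayscale
            # image, scaled to 0/255 so it is actually visible.
            gray = np.asarray(Image.open(path).convert("L"))
            bits = (gray >> plane) & 1
            return Image.fromarray((bits * 255).astype(np.uint8))

        # bitplane("lichtenstein.png", 0).save("lichtenstein_bit0.png")
        # A smooth gradient flips the bottom bit in regular bands (the
        # striations); a noisy render makes it look like television static.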

    However this is drifting away from the original point; the hypothesis was that increased image quality resulted in more colors.  I think we both agree that this is not correct in the tests you did; we both consistently see reduced PNG file size as image quality increases, and so the argument is that both the increase in colors and the increase in file size are a result of increased noise, itself arising because of less convergence (using an RQ of 95% means we don't know what the actual convergence was, even if NVidia didn't change the measurement!)

    The misleading idea is that PNG compression automatically gets worse with more colors.  I demonstrated that this is not the case; it may, but not necessarily.  PNG compression gets worse with increased noise and with image patterns that the PNG filtering and LZ compression combined do not recognize.  In this case I assert it is the increased noise in the image, not the increased number of colors, which causes the PNG file to be larger.

  • jbowler Posts: 794
    edited June 2022

    johndoe_36eb90b0 said:

    I mean, slice the data any way you want

    FWIW I eventually worked out how to do a high-pass filter, of a sort, on your posted image.  Photoshop doesn't support this, so I used the GIMP with a "high pass" filter using the defaults except for the deviation, which I changed from 4 to 1:

    I DELETED THE REST OF THIS POST: the stupid, idiotic, interface truncates the image I posted at both ends.  I don't know how to get round this, I GIVE UP.  DIY.
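    For anyone who wants to reproduce it themselves: a high pass of this sort is just the image minus a Gaussian blur of itself, re-centred on mid-grey. A minimal sketch with Pillow, using a radius of 1 to match the deviation mentioned above:

        from PIL import Image, ImageChops, ImageFilter

        def high_pass(path, radius=1.0):
            # Subtract a Gaussian blur from the image and offset by 128 so
            # that negative differences remain visible as darker-than-mid-grey.
            img = Image.open(path).convert("L")
            low = img.filter(ImageFilter.GaussianBlur(radius))
            return ImageChops.subtract(img, low, scale=1.0, offset=128)

        # high_pass("render_4_20.png").save("render_4_20_highpass.png")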

     

    Post edited by jbowler on
  • HoMart Posts: 480
    edited June 2022

    Is there a problem with https://www.daz3d.com/content-wizard in the latest beta?

    Tried now 5 different .zips - always get "No User Facing Files were detected in the product"

    But there are files in the People folder inside the Zips.

    Edit: tested the zips with 4.15 - content wizard worked fine in 4.15

    Post edited by HoMart on
  • jbowler Posts: 794

    HoMart said:

    Is there a problem with https://www.daz3d.com/content-wizard in the latest beta?

    Tried now 5 different .zips - always get "No User Facing Files were detected in the product"

    But there are files in the People folder inside the Zips.

    Edit: tested the zips with 4.15 - content wizard worked fine in 4.15

    If you look at the "temporary" directory I believe you will find that all the files have been extracted as (empty) directories.  It doesn't happen with all .zip files.  I simply unzipped the files using the native Windows (11) support and rezipped them into a single file also using Windows.  So far I've only seen the problem with one product purchased from Renderosity.
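    If anyone wants to repack such a .zip without the Windows shell, the same workaround can be scripted; a sketch with Python's zipfile (the file names here are just examples):

        import os, shutil, zipfile

        src = "product_from_renderosity.zip"   # example name
        tmp = "repack_tmp"
        dst = "product_repacked.zip"

        # Extract with Python's zipfile rather than whatever produced the
        # empty directories, then re-zip the tree into a single plain archive.
        with zipfile.ZipFile(src) as z:
            z.extractall(tmp)

        with zipfile.ZipFile(dst, "w", zipfile.ZIP_DEFLATED) as z:
            for root, _dirs, files in os.walk(tmp):
                for name in files:
                    full = os.path.join(root, name)
                    z.write(full, os.path.relpath(full, tmp))

        shutil.rmtree(tmp)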

  • HoMart Posts: 480
    edited June 2022

    jbowler said:

    HoMart said:

    Is there a problem with https://www.daz3d.com/content-wizard in the latest beta?

    Tried now 5 different .zips - always get "No User Facing Files were detected in the product"

    But there are files in the People folder inside the Zips.

    Edit: tested the zips with 4.15 - content wizard worked fine in 4.15

    If you look at the "temporary" directory I believe you will find that all the files have been extracted as (empty) directories.  It doesn't happen with all .zip files.  I simply unzipped the files using the native Windows (11) support and rezipped them into a single file also using Windows.  So far I've only seen the problem with one product purchased from Renderosity.

    Very interesting behavior.
    After doing the whole thing in 4.15, I simply deleted everything that was in the TMP/SCI.
    Since then it works again in 4.20. So next time I'll check the TEMP/SCI first.

     

    Post edited by HoMart on
  • jbowler Posts: 794

    HoMart said:

    Very interesting behavior.
    After doing the whole thing in 4.15, I simply deleted everything that was in the TMP/SCI.
    Since then it works again in 4.20. So next time I'll check the TEMP/SCI first.

    Content Wizard deletes the whole of SCI every time, in fact multiple times.  I tried a few attempts at a repro but couldn't find a cause (i.e. the things I tested worked ok).  I still have the original .zips; interesting problem :-)

  • Richard Haseltine said:

    innes53_4e67625942 said:

    IceCrMn said:

    Hmm, I'm seeing a 10-second shutdown with this new beta.

    I've noticed this for a long time...there is a solution...After closing Daz Studio, open Task Manager (I have a shortcut to Task Manager on the task bar) and you'll see a Daz file still open...just close it in Task manager. Then reopen Daz if you want.

    Don't. If you want to open a new version of DS while the original is closing, use an instance (Daz Studio Pro 4.12 - instances). Force-quitting is asking for trouble, including corruption of the content database.

    Thanks for correcting me...I won't do that anymore...

  • kyoto kid Posts: 41,040
    edited June 2022

    jbowler said:

    Changing "Render Quality" or "Rendering Converged Ratio" (did that name just change?) makes absolutely no difference to the quality of the output; that's why they can be changed on the fly during a render.  They do exactly the same thing as "max time" (which I always set to 0) and max samples (which I always set to -1); they just stop the render at some point.  One approach is to ignore them all; disable "Render Quality Enable", max time:=0, max samples:=-1 (turn off limits) then just stop the render when it looks good enough.  Previously (prior to 4.16) it was possible, with complete reliability, to change any of the four settings mid-render; cancel, change the settings, restart, but 4.16 seemed to break the quality/convergence changes, at least with a canvas.  Cancelling the render saves the output PNG or JPEG and the canvases, so it should always be possible to cancel, do not change anything, examine the saved PNG/JPEG and/or canvases and restart if they are not good enough.  I find this is a good general approach to using DAZ.

    ...that was pretty much the same drill with Reality/Lux; you just let it run until you thought the convergence looked good and then saved it at that point.  The difference from Iray, however, is that Lux was still a CPU-based engine and therefore took far longer to even get to a "passable" quality.

    For Iray I generally set the convergence ratio to 99% and that seems to work the best, at least where medium to well-lit scenes are concerned. I also use spectral rendering at the default setting, set compression to 1024 medium / 4096 high, and get rather clean images with decent colour depth.

    Just ran a test of a character in a desert scene at 975 x 1,200 pixels which finished in 16m 45s at 2,692 iterations using render quality 2 and an atmospheric effect camera, albeit in the 4.16.1.40 beta.

    Have not installed the latest 4.20 beta yet as it would overwrite the beta I am currently using. I need to check my installer archive to make sure I can roll back after running tests in 4.20 using the same settings to compare render times and quality.

    Image of the first test attached

    Amineh and Camillia atmo cam.jpg
    927 x 1200 - 652K
    Post edited by kyoto kid on
  • jbowler said:

    This is where we disagree; compressing a planar format does not work if there are patterns in the pixel values, because the planes themselves don't work as an efficient filter for the pixel values.

    Separating an image into bitplanes is a rough approximation of separating it into low and high frequency components. You can do that more accurately with the Discrete Cosine Transform (aka DCT), which is the cornerstone of all lossy image compression algorithms. I used bitplanes just to show you where the noise (which makes the image harder to compress unless filtered out) resides.

    jbowler said:

    The misleading idea is that PNG compression automatically gets worse with more colors.

    It is not misleading -- we seem to be splitting hairs on the definition of "more colors".

    You seem to think about it in absolute terms -- more colors in a smooth gradient, or more flat surfaces each with a different color (both easy to compress because PNG compression filters are designed to exploit data uniformity).

    I am, on the other hand, talking about more colors with random distribution as was clearly the case in my render tests which no lossless compression algorithm (and especially not the outdated PNG) could ever compress better.

    So what I am saying is "more noise" == "worse PNG compression", which is an easily verifiable fact. I have attached 4 sample images, each with Gaussian noise at 1%, 5%, 10%, and 50% respectively; the image filename contains the color count. Do check the file sizes and note that all were packed using Photoshop with the smallest file size option, which produces file sizes equivalent to pngcrush.

    Therefore, if "more noise" == "more colors" (which seems to hold as well in my sample images below), then it logically follows that "more colors" == "worse PNG compression" as well.
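    For anyone who wants to reproduce the experiment, a sketch of the same kind of test with Pillow and NumPy rather than Photoshop; the exact byte counts will differ from the attachments below, but the trend (more noise, more colors, bigger PNG) is the same:

        import os
        import numpy as np
        from PIL import Image

        rng = np.random.default_rng(0)
        base = np.full((512, 512), 128, dtype=np.uint8)   # flat mid-grey canvas

        for percent in (1, 5, 10, 50):
            sigma = 255 * percent / 100.0                 # "percent" noise as a std-dev
            noisy = np.clip(base + rng.normal(0.0, sigma, base.shape), 0, 255)
            noisy = noisy.astype(np.uint8)
            colors = len(np.unique(noisy))                # distinct grey levels
            name = f"Noise_{percent:02d}p{colors}c.png"
            Image.fromarray(noisy).save(name, optimize=True)
            print(name, os.path.getsize(name), "bytes")   # size grows with noise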

    Finally, Photoshop sure does have a High Pass filter; it's in Filters -> Other -> High Pass.

    Noise_01p775c.png
    512 x 512 - 202K
    Noise_05p17764c.png
    512 x 512 - 318K
    Noise_10p37438c.png
    512 x 512 - 363K
    Noise_50p101557c.png
    512 x 512 - 476K
  • I discovered this today when trying to debug an issue with a large scene I was working on.

    I can confirm that this happens with 4.16 and the most recent 4.20 beta, but I do not get this warning with the Daz 4.14 beta.

    2022-06-04 13:42:36.920 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(359): Iray [WARNING] - TRIT:GEOMETRY ::   1.0   TRIT   geo  warn : Object DS_shape_137e_6418: the parametric approximation level is set to 1. The original value of 2 would produce too much geometry in a single mesh.
    2022-06-04 13:42:38.705 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(359): Iray [WARNING] - TRIT:GEOMETRY ::   1.0   TRIT   geo  warn : Object DS_shape_145a_7953: the parametric approximation level is set to 3. The original value of 4 would produce too much geometry in a single mesh.
    2022-06-04 13:42:39.371 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(359): Iray [WARNING] - TRIT:GEOMETRY ::   1.0   TRIT   geo  warn : Object DS_shape_145e_7965: the parametric approximation level is set to 3. The original value of 4 would produce too much geometry in a single mesh.

    It would appear that Daz and/or Iray is now capping what we can subdivide. The subdivision 4 instances it mentions are two G8M figures in the scene that were set to render at subdivision 4.

    I find this a bit concerning because there is a noticeable difference between subdivision 3 and 4 on the G8 figures, not to mention I have multiple products with HD morphs and JCMs whose names reference a subdivision level of 4.

  • kyoto kid Posts: 41,040
    edited June 2022

    ...OK, saved the 4.16 beta installer and installed 4.20.1.43.  Start-up went fine, no glitches except that my AV popped up a caution flag as it is new and has never been on this system before. My UI layout was there; I loaded the scene and checked to make sure all the settings were the same, then did one render pass. As this was the first time rendering a scene optimised for an older version of the programme, it went through a fair amount of updating that took about two and a half minutes before the first iteration posted, so I let it run all the way through and then closed the window.  I launched the render again and the updating portion took about 35 seconds before the scene appeared in the render window. After it finished I saved it and then opened the logfile.

    In the 4.16 test the actual total time was 16m 53s at 2,692 iterations posted (the figures I gave in my post a bit above were from just eyeballing the progress monitor).

    The 4.20 render completed in 15m 25s at 2,682 iterations, so it shaved about a minute and a half off the render time.  OK, so that is encouraging, particularly as I am rendering on a legacy GPU (Maxwell Titan X).

    I really don't see any appreciable difference between the two rendered images, save for maybe a very slight increase in definition in the 4.20 test. Both test images are attached below, the first being the one rendered in 4.16 and the second in 4.20.

    Again, no odd glitches, BSODs, or crashes (yet), so I will play with it for a while, but I'm keeping the 4.16 Beta in reserve.

    Oh, as to that message from my AV, it was about the Data Protector blocking a suspicious action by the Daz programme.  Looking at the details report, it has to do with the logfile for LAMH. I restarted the programme, loaded the scene, saved it, closed it from the viewport, then closed the programme, and the message never appeared again. So again, likely because this was the first time I opened and used this version of Daz on the system.

    Amineh and Camillia atmo cam.jpg
    927 x 1200 - 652K
    Amineh test 4-20.jpg
    927 x 1200 - 653K
    Post edited by kyoto kid on
  • kyoto kid Posts: 41,040

    ...a footnote to the above:

    Having read all the issues people have experienced when a new version of Daz is rolled out, it makes me wonder: is it necessarily the beta update or is it something else?  I remember when 4.15 and then 4.16 were said to be buggy (both the beta and general releases), yet I never experienced any negative issues with either. So after holding my breath when starting up 4.20, I again was surprised and pleased at how smoothly it ran (save for that little message from my AV, which turned out to be inconsequential). I didn't lose my custom UI or library/Runtime setup, there were no BSODs, I could open older scenes, and I could access content with no trouble.  Granted, some features like volumetrics I will not be using because my system is older and doesn't have the horsepower to handle them, but so far so good.

    Most people here are on W10, a few on W11, and some on macOS; I'm still on W7, for which I turned off auto-updating and a couple of features I didn't need when I first installed it a decade ago and upgraded to Pro (there weren't many to begin with compared to later versions of Windows). Could it be conflicts with some AVs or other software/processes?  When I start a work session I shut all other programmes and unnecessary processes down and go offline (even putting my AV into quiet mode), primarily to maximise resources.  I also don't use DazConnect, so the Net connection to the Daz site in the programme is always turned off (this is part of my gripe with some subscription software that requires one to be online during a work session).

    I don't know, maybe my system is charmed; I'll see what happens when I upgrade the hardware and move to W11 in a few months.

  • prixat Posts: 1,588
    edited June 2022

    What time and differences do you get with 'Guided Sampling' turned on?

    ...and, even without Guided Sampling, it seems to be when there is transparency/refraction that the Beta renders significantly slower.

    Post edited by prixat on
  • PerttiA Posts: 10,024

    kyoto kid said:

    ...a footnote to the above:

    Having read all the issues people have experienced when a new version of Daz is rolled out, it makes me wonder: is it necessarily the beta update or is it something else?  I remember when 4.15 and then 4.16 were said to be buggy (both the beta and general releases), yet I never experienced any negative issues with either. So after holding my breath when starting up 4.20, I again was surprised and pleased at how smoothly it ran (save for that little message from my AV, which turned out to be inconsequential). I didn't lose my custom UI or library/Runtime setup, there were no BSODs, I could open older scenes, and I could access content with no trouble.  Granted, some features like volumetrics I will not be using because my system is older and doesn't have the horsepower to handle them, but so far so good.

    Most people here are on W10, a few on W11, and some on macOS; I'm still on W7, for which I turned off auto-updating and a couple of features I didn't need when I first installed it a decade ago and upgraded to Pro (there weren't many to begin with compared to later versions of Windows). Could it be conflicts with some AVs or other software/processes?  When I start a work session I shut all other programmes and unnecessary processes down and go offline (even putting my AV into quiet mode), primarily to maximise resources.  I also don't use DazConnect, so the Net connection to the Daz site in the programme is always turned off (this is part of my gripe with some subscription software that requires one to be online during a work session).

    I don't know, maybe my system is charmed; I'll see what happens when I upgrade the hardware and move to W11 in a few months.

    I think your building your computer out of quality parts has a lot to do with not having problems.

    I do the same: I plan my system to suit my purposes and use quality components, often going against the latest trends, and I have very few problems (if any) running whatever.

  • @kyoto kid

    Attached is the most trivial scene I can come up with to prove how much slower 4.20 is for me compared to 4.16.

    It consists of a 2D plane and a cube. Both are using Iray Uber shader ("Car Paint - Pearl White" and "Car Paint - Midnight Blue" respectively).

    Before rendering I center the perspective view to 2D plane and I render it to 1920x1920 pixels.

    Even with this trivial scene which contains no human figures, no textures and simplest possible geometry, the new Iray in 4.20.1.38 Public Beta is 13.29% slower (16.98% with guided sampling on) than 4.16.0.3 General Release and the results seem to have a bit more noise as well. Tested on RTX 3090 with NVIDIA Studio Driver 512.96.

    iray_test.duf
    20K
  • kyoto kid Posts: 41,040
    edited June 2022

    ...considering the scene includes a large semi-transparent cube for the dusty haze effect, that should have resulted in a longer render time and it didn't.  I was actually expecting a jump in render time before I launched the first render process and was rather pleased to see it too was slightly shorter than in 4.16, even with all the preliminary updating before the image appears in the render window.

    The 4.16 version I was using previously was 4.16.1.40 beta which was released after the general 4.16.0.3. 

    My satisfaction is that none of the other "horror stories" I've read about 4.20 that kept me away from adopting it have occurred, and that's fair enough for me.

    To run additional tests I would need to uninstall 4.20 and reinstall 4.16, as both are public betas and they cannot exist concurrently on the same system.  I have other work to complete so that gets put on a back burner for now.

     

     

    Post edited by kyoto kid on
  • marius_if Posts: 48

    Shutting down takes about 5 (five) seconds. 

    Everything works even smoother and faster, including faster rendering. I mean not rendering a ball with default render settings, I mean the real thing, complex volumetrics, huge complex true caustics, complex private shaders, many millions polys and so on. Not throwing a bunch of rays, just feeding the parallel monster properly, as intended. 

    Many thanks to these amazing brilliant gentlemen, amazing brilliant designers and programmers. The only problem is I know what I'm talking about. 

    Not going to repeat Rob and other awesome gentlemen - some use to be often around - should have their names carved in titanium building's foundation, small fonts, 1 meter depth and at least 1 meter width, to be sure they stay there visible for a while. Don't have titanium yet, then you better have it asap, it will help mankind remember why life on Earth was good and the reasons it happened.

    This is how it is, hugs, marius.

  • kyoto kid said:

    ...considering the scene includes a large semi-transparent cube for the dusty haze effect, that should have resulted in a longer render time and it didn't.  I was actually expecting a jump in render time before I launched the first render process and was rather pleased to see it too was slightly shorter than in 4.16, even with all the preliminary updating before the image appears in the render window.

    The 4.16 version I was using previously was 4.16.1.40 beta which was released after the general 4.16.0.3. 

    My satisfaction is that none of the other "horror stories" I've read about 4.20 that kept me away from adopting it have occurred, and that's fair enough for me.

    To run additional tests I would need to uninstall 4.20 and reinstall 4.16, as both are public betas and they cannot exist concurrently on the same system.  I have other work to complete so that gets put on a back burner for now.

    Eh? A cube should by no means be semi-transparent -- it has Iray shader "Car Paint - Midnight Blue" applied, see the attached image.

    All 4.16 Beta versions after the 4.16.0.3 General Release already come with a newer Iray version, so the speed comparison of the 4.16 beta with the latest 4.20 beta is kind of meaningless.

    If you insist on comparing your existing 4.16 beta just toggle Render Quality Enable to off, set Max Time (secs) to 0, set Max Samples to 15000, center viewport on the plane, render at 1920x1920px resolution, and tell us the times and number of iterations from the Daz Studio log, not how much faster or slower it seemed to you.

    cube.png
    888 x 889 - 122K
  • marble Posts: 7,500

    marius_if said:

    Shutting down takes about 5 (five) seconds. 

    Everything works even smoother and faster, including faster rendering. I mean not rendering a ball with default render settings, I mean the real thing, complex volumetrics, huge complex true caustics, complex private shaders, many millions polys and so on. Not throwing a bunch of rays, just feeding the parallel monster properly, as intended. 

    Many thanks to these amazing brilliant gentlemen, amazing brilliant designers and programmers. The only problem is I know what I'm talking about. 

    Not going to repeat Rob and other awesome gentlemen - some use to be often around - should have their names carved in titanium building's foundation, small fonts, 1 meter depth and at least 1 meter width, to be sure they stay there visible for a while. Don't have titanium yet, then you better have it asap, it will help mankind remember why life on Earth was good and the reasons it happened.

    This is how it is, hugs, marius.

    Forgive me, but what version are you talking about? I see this thread is now 4.20.1.43 - maybe that? I haven't loaded that yet (I'm on .34) but I would be far from agreeing that everything is quick. Rendering is no quicker, shutdown times - no change (still very slow with more than one G8F in the scene), dForce simulation: dreadfully slow. I have an RTX 3090 so should see some benefit, but 4.15 was, if anything, quicker.

  • prixat Posts: 1,588

    johndoe_36eb90b0 said:

    kyoto kid said:

    ...considering the scene includes a large semi-transparent cube for the dusty haze effect, that should have resulted in a longer render time and it didn't.

    Eh? A cube should by no means be semi-transparent -- it has Iray shader "Car Paint - Midnight Blue" applied, see the attached image.

    I think Kyoto Kid is talking about the cube around his own desert scene posted earlier. 

    Though I did notice with your scene that it renders faster on my (RTX 3060) system if I have 'Draw Ground' turned off!

  • prixat said:

    I think Kyoto Kid is talking about the cube around his own desert scene posted earlier. 

    Though I did notice with your scene that it renders faster on my (RTX 3060) system if I have 'Draw Ground' turned off!

    My apologies then, it sounded to me like they were talking about testing my scene which also has a cube.

    Anyway, ground or no ground it is slower for me in 4.20.

  • johndoe_36eb90b0 said:

    Separating an image into bitplanes is a rough approximation of separating it into low and high frequency components. You can do that more accurately with the Discrete Cosine Transform (aka DCT), which is the cornerstone of all lossy image compression algorithms. I used bitplanes just to show you where the noise (which makes the image harder to compress unless filtered out) resides.

    It was not your main point, but DCTs are not the basis of all lossy image compression; there are wavelets, too...

  • kyoto kid Posts: 41,040
    edited June 2022

    johndoe_36eb90b0 said:

    kyoto kid said:

    ...considering the scene includes a large semi-transparent cube for the dusty haze effect, that should have resulted in a longer render time and it didn't.  I was actually expecting a jump in render time before I launched the first render process and was rather pleased to see it too was slightly shorter than in 4.16, even with all the preliminary updating before the image appears in the render window.

    The 4.16 version I was using previously was 4.16.1.40 beta which was released after the general 4.16.0.3. 

    My satisfaction is that none of the other "horror stories" I've read about 4.20 that kept me away from adopting it have occurred, and that's fair enough for me.

    To run additional tests I would need to uninstall 4.20 and reinstall 4.16, as both are public betas and they cannot exist concurrently on the same system.  I have other work to complete so that gets put on a back burner for now.

    Eh? A cube should by no means be semi-transparent -- it has Iray shader "Car Paint - Midnight Blue" applied, see the attached image.

    All 4.16 Beta versions after the 4.16.0.3 General Release already come with a newer Iray version, so the speed comparison of the 4.16 beta with the latest 4.20 beta is kind of meaningless.

    If you insist on comparing your existing 4.16 beta just toggle Render Quality Enable to off, set Max Time (secs) to 0, set Max Samples to 15000, center viewport on the plane, render at 1920x1920px resolution, and tell us the times and number of iterations from the Daz Studio log, not how much faster or slower it seemed to you.

    ...I used the Iray Atmo Cam, which employs a semi-transparent cube with different "moisture" values. It is labelled a "volumetric camera" but that's only a title, as we hadn't seen real volumetrics until 4.20.

    I used a similar procedure in this scene to create the damp-looking fog.  In a test I did, turning off the cube noticeably improved render time.

    railway station beta.png
    1600 x 1200 - 3M
    Post edited by kyoto kid on
  • TheMysteryIsThePoint said:

    It was not your main point, but DCTs are not the basis of all lossy image compression; there are wavelets, too...

    Sorry, I should have clarified that a bit better -- all widely used lossy image compression. I am aware of wavelets, but they never really took off when it comes to practical use. There is some limited use of JPEG 2000 in medical imaging, and there is also JPEG XS which uses wavelets, and... well, that's about it. On the other hand, JPEG and all major video compression standards such as H.264, H.265, etc., all use some variant of the DCT, not a 2D DWT.

    Lossy image compression works by eliminating high-frequency noise, and it really doesn't matter much how you slice the data to get at those frequencies (bitplanes being but one very illustrative way of doing it, hence why I picked them). Since PNG compression is lossless, it does not allow for eliminating high-frequency noise, which results in worse compression on noisier images, which was my original point.
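    To make that concrete: a lossy codec transforms each 8x8 block with a DCT and quantises the small high-frequency coefficients to zero, which is exactly where this kind of noise lives; a lossless format has no such step. A minimal sketch with SciPy (an illustration of the idea, not any particular codec):

        import numpy as np
        from scipy.fft import dctn, idctn

        def crush_block(block, keep=16):
            # JPEG-style idea on a single 8x8 block: forward 2D DCT, keep only
            # the `keep` largest coefficients (the rest, mostly high-frequency
            # noise, become zero), then inverse DCT back to pixel values.
            coeffs = dctn(block.astype(np.float64), norm="ortho")
            cutoff = np.sort(np.abs(coeffs).ravel())[-keep]
            coeffs[np.abs(coeffs) < cutoff] = 0.0
            return idctn(coeffs, norm="ortho")

        # A noisy but essentially flat block comes back almost flat, because
        # the noise was spread over many tiny high-frequency coefficients.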

  • Imago Posts: 5,155
    edited June 2022

    Is it just me, or has DAZ Studio 4.20 stopped writing the Customactions.dsx file? Every time I make a change in the custom actions and quit DAZ Studio, in the next session none of the changes are in my interface.

    I have to use 4.12 to make the changes to have them in 4.20.

    Is there some new hidden option that I don't know about?

    Post edited by Imago on
  • Richard Haseltine Posts: 100,781

    Imago said:

    Is it just me, or has DAZ Studio 4.20 stopped writing the Customactions.dsx file? Every time I make a change in the custom actions and quit DAZ Studio, in the next session none of the changes are in my interface.

    I have to use 4.12 to make the changes to have them in 4.20.

    Is there some new hidden option that I don't know about?

    4.20.what.which?

  • Imago Posts: 5,155

    4.20.0.17, but the latest beta too seems not to write the changes to the custom actions.

  • no__name Posts: 88
    edited June 2022

    Some assets outright crash Iray when hitting the render button ("unable to render") with 4.20.1.43. The Iray preview works just fine.

    Since the Iray preview works fine, I believe something went wrong with the SubD level difference between View SubD lvl and Render SubD lvl.

    Everything is fine on 4.16.0.3.

    Example: Teen Bedroom

    Msg : [WARNING] :: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(369): Iray [WARNING] - API:DATABASE ::   1.0   API    db   warn : Transaction is released without being committed or aborted. Automatically aborting.

    Nvidia Driver: 512.96

    Edit: If I select every mesh and set "View SubD lvl = Render SubD lvl" -> the scene renders correctly.

     

    Post edited by no__name on