Daz Studio Pro BETA - version 4.9.3.166! Release Candidate 5 (*UPDATED*)


Comments

  • OK, so we are almost on the same page. I was explaining what the next level of "clean install" would be beyond what I did do. I went through the (not-as-clean) install a lot a while back while testing graphics cards with Iray. At the time, 347.88 was the only stable Iray driver.

  • nicstt Posts: 11,715
    edited December 2016

    I always do a "clean install" when updating the drivers.

    Agreed, and try Display Driver Uninstaller too.

    Just uninstalling is often not sufficient; DDU will boot into safe mode to complete the process.

    Post edited by nicstt on
  • mjc1016 Posts: 15,001
    nicstt said:

    I always do a "clean install" when updating the drivers.

    Agreed, and try Display Driver Uninstaller too.

    Just uninstalling is often not sufficient; DDU will boot into safe mode to complete the process.

    DDU is the only way to go, especially if you start having odd ball problems.

  • OK, I removed that PhysX thing and the '3D Vision experience' thing (does anything actually need that? SonarX, Iray, Headus, Hexagon, Eagle CAD, LASI 7, etc.), and now video playback appears to be a tad happier, with no spontaneous driver crashes as of yet. And I found yet another hard drive power connector with a flaky connection (are humans incapable of making a hard drive power connector that works these days?), grrrrr.

  • mjc1016 Posts: 15,001

    OK, I removed that PhysX thing and the '3D Vision experience' thing (does anything actually need that? SonarX, Iray, Headus, Hexagon, Eagle CAD, LASI 7, etc.), and now video playback appears to be a tad happier, with no spontaneous driver crashes as of yet. And I found yet another hard drive power connector with a flaky connection (are humans incapable of making a hard drive power connector that works these days?), grrrrr.

    Umm... I think the problem isn't the humans, but the machines. I had one hard drive, not too long ago, that had the crappiest soldering I've seen in ages on the power connector, like there was barely enough solder to hold the pins in place.

    3D Vision... not really needed. PhysX, maybe, but if something needs it, it will complain about it not being there.

  • barbult Posts: 24,781

    I tried to install Pink Oak Room as an offline install with Daz Connect. I downloaded the .sep and .sea to the install directory. When I click on Install Offline Packages, Daz Studio 4.9.3.153 crashes to the desktop. It is repeatable. I submitted a help request.

  • wizard1200 Posts: 239
    edited December 2016

    Any news regarding the release date of the non-beta version?

    Post edited by wizard1200 on
  • Any news regarding the release date of the non-beta version?

    When pigs fly, most likely. It would be nice if they could get their crap together and release things faster, rather than making us wait years for each new version.

  • namffuak Posts: 4,176

    Any news regarding the release date of the non-beta version?

    When pigs fly, most likely. It would be nice if they could get their crap together and release things faster, rather than making us wait years for each new version.

    I think they're real close now. The big hold-up was getting Iray support for the Pascal (10-series) GPUs - and then fixing the problems caused by that Iray upgrade. The current beta supports the Pascal cards, works with the old NVIDIA cards on older Macs, and doesn't crash with older AMD CPUs. So it depends on whether or not other serious issues crop up, including what barbult just reported - and on when NVIDIA releases this version of Iray for general use, as it also seems to be a release candidate, if I'm reading the logs correctly.

  • TheKD Posts: 2,696
    edited December 2016

    Is it just me, or is the beta way worse with VRAM management than the general release? I was attempting to render a simple scene (G3F, a hair, and a paper roll in the background) and it kept getting kicked to CPU; I had no problem rendering it on my 960 in the general release.

    Post edited by TheKD on
  • RAMWolff Posts: 10,231

    Well, one improvement I've noticed (it was a niggle of mine too): when I'm working on a product and I make changes to a texture file in Photoshop, the update is almost instantaneous, unlike past releases where sometimes it would update and sometimes I'd have to resave the file AGAIN to finally see the changes. PLEASE don't break this bit of code. It works very well now, so lock it in! Thanks!

  • zaz777 Posts: 115
    TheKD said:

    Is it just me, or is the beta way worse with VRAM management than the general release? I was attempting to render a simple scene (G3F, a hair, and a paper roll in the background) and it kept getting kicked to CPU; I had no problem rendering it on my 960 in the general release.

    I have noticed more problems with the beta running out of VRAM as well.

    I've also had a few more situations where, once the GPU became unavailable in DS, it wouldn't return to full speed until I rebooted. These normally happened when I was using the Iray draw mode set to Photoreal.

  • Cake One Posts: 381
    Cake One said:

    Hi everyone

    I'm on Mac OS X 10.11, CUDA 8.

    So far, the JIT compilation used in some shaders used to crash CUDA totally (so renders were done using CPU only).

     

    On the last BETA release: no more JIT error.

    So thank you for that

    BUT

    (and that is a big BUT)

    Shaders don't crash CUDA anymore, but they don't look the way they're supposed to:

    Here is OOT Cathy hair with the Hairblending shader (known to crash CUDA on Mac) rendered in Daz Studio 4.9:

    https://gyazo.com/6e3e90ea99ba62659b5fb7decae37e44

    It looks like it should, based on the highlighted DUF file on the left. It was rendered by CPU, obviously, as CUDA crashed.

     

    Here is the same scene, rendered in the last beta:

    https://gyazo.com/26beb0107b46cbb81d417afcb5e18ed3

     

    CUDA doesn't crash and the GPU renders, but the hair looks nothing like it should.

    Can any other Mac user confirm this difference in JIT shaders between the two versions?

    Thanks

    C.

    Could you please attach a simple scene which reproduces the issue, listing any needed content.

    Sorry, I didn't get any notification on this thread.

    No need to attach a scene; it's the default scene, with default lighting.

    The sets that cause the issues shown are OOT hairs, any one that has blending shaders applied.

    Just load the hair and click Iray preview.

    In the beta, the shaders go mad (but render on the GPU).
    In the "normal" DS, the shaders render correctly, but crash CUDA (rendering is then CPU only).

     

    C.

  • Celexa Posts: 73
    edited December 2016

    Noticed an odd thing: my textures aren't updating. I loaded an old scene and proceeded to change the material on the arm to an exact duplicate of the material, but with a different tattoo. The old tattoo/material stayed and the new one never appeared. This wasn't just limited to the viewport; rendering also showed the old file, even though the Surfaces tab showed it was the new mat. Any clue how to fix this?

    Edit: I also see "migrating items" at the bottom of the screen. I was using DAZ 4.8 before this beta; does this mean I have the Valentina system instead of PostgreSQL, even though I saw PostgreSQL in my processes when I loaded 4.8? Will it do this every time I load a new scene?

    Post edited by Celexa on
  • namffuak Posts: 4,176
    Celexa said:

    Noticed an odd thing: my textures aren't updating. I loaded an old scene and proceeded to change the material on the arm to an exact duplicate of the material, but with a different tattoo. The old tattoo/material stayed and the new one never appeared. This wasn't just limited to the viewport; rendering also showed the old file, even though the Surfaces tab showed it was the new mat. Any clue how to fix this?

    Edit: I also see "migrating items" at the bottom of the screen. I was using DAZ 4.8 before this beta; does this mean I have the Valentina system instead of PostgreSQL, even though I saw PostgreSQL in my processes when I loaded 4.8? Will it do this every time I load a new scene?

    DAZ reworked some of the database tables in 4.9 to handle Connect and to clean up some of the errors people were having with Smart Content in 4.8. "Migrating items" means that the 4.8 definitions are being copied over to the new 4.9 tables. Anything you install with DIM goes into the 4.8 structure as well, so any new products you add should also trigger the migrating items message.

  • takezo_3001 Posts: 1,997
    edited December 2016

    Is it just me, or is the beta way worse with VRAM management than the general release? I was attempting to render a simple scene (G3F, a hair, and a paper roll in the background) and it kept getting kicked to CPU; I had no problem rendering it on my 960 in the general release.

    Grrrrr! This build's render goes straight to CPU! The only time it renders on the GPU is via the viewport preview mode! Thing is, it doesn't even matter whether I have CPU checked for the main renderer or not; it defaults to CPU!

    I'm also using 20 GB out of 32 GB of RAM for a sky-and-dome-lit scene with two G3s and Stonemason's lake scene file... and my CPU is doing the entire render!

    Win 10 64-bit, AMD 8350, 32 GB G.Skill RAM, Gigabyte GTX 1080 G1

    Post edited by takezo_3001 on
  • Celexa Posts: 73
    edited December 2016
    namffuak said:

    DAZ reworked some of the database tables in 4.9 to handle Connect and to clean up some of the errors people were having with Smart Content in 4.8. "Migrating items" means that the 4.8 definitions are being copied over to the new 4.9 tables. Anything you install with DIM goes into the 4.8 structure as well, so any new products you add should also trigger the migrating items message.

    Thanks. It should only do this once, I'm assuming, not every time I open the beta?

    Post edited by Celexa on
  • L'Adair Posts: 9,479

    Is it just me, or is the beta way worse with VRAM management than the general release? I was attempting to render a simple scene (G3F, a hair, and a paper roll in the background) and it kept getting kicked to CPU; I had no problem rendering it on my 960 in the general release.

    Grrrrr! This build's render goes straight to CPU! The only time it renders on the GPU is via the viewport preview mode! Thing is, it doesn't even matter whether I have CPU checked for the main renderer or not; it defaults to CPU!

    I'm also using 20 GB out of 32 GB of RAM for a sky-and-dome-lit scene with two G3s and Stonemason's lake scene file... and my CPU is doing the entire render!

    Win 10 64-bit, AMD 8350, 32 GB G.Skill RAM, Gigabyte GTX 1080 G1

    The 1080 only has 8GB and if you don't have onboard video or another card, some of that is being allocated for system video, like your monitor. If your scene requires more than the available memory on the card, it will go to CPU only.

    When I run into this, I look for objects that don't impact the scene and remove them. If it's a wide shot, I look for objects with hi-rez materials and resize them in Photoshop. If an object is in the background, 20-30 "feet" from the camera, it doesn't need materials that are 4096 x 4096 pixels; 2048 or 1024 should be fine. Once I get the textures for small or distant objects cut down in size, I can usually get my large scenes to render using my 1080.

    To be honest, I should do the same with all my scenes, even if there are fewer objects I can "optimize"... so my images will render at the best speed possible.
    smiley
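    As a rough illustration of why that resizing helps (assuming uncompressed 8-bit RGBA textures and the usual ~1/3 mipmap overhead; Iray's actual allocation is not public, so treat these as ballpark figures), a short Python sketch:

        # Rough VRAM cost of one uncompressed 8-bit RGBA texture,
        # including the ~1/3 extra that mipmaps typically add.
        # Assumptions for illustration, not Iray's real allocator.
        def texture_mb(size_px, channels=4, mipmaps=True):
            base = size_px * size_px * channels
            total = base * 4 / 3 if mipmaps else base
            return total / (1024 * 1024)

        for size in (4096, 2048, 1024):
            print(f"{size} x {size}: ~{texture_mb(size):.0f} MB")
        # 4096 -> ~85 MB, 2048 -> ~21 MB, 1024 -> ~5 MB: each halving
        # of resolution cuts that map's footprint to a quarter.

    So cutting a handful of 4096 maps on background props down to 1024 can free hundreds of megabytes on the card.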

  • barbult Posts: 24,781
    L'Adair said:


    The 1080 only has 8GB and if you don't have onboard video or another card, some of that is being allocated for system video, like your monitor. If your scene requires more than the available memory on the card, it will go to CPU only.

    When I run into this, I look for objects that don't impact the scene and remove them. If it's a wide shot, I look for objects with hi-rez materials and resize them in Photoshop. If an object is in the background, 20-30 "feet" from the camera, it doesn't need materials that are 4096 x 4096 pixels; 2048 or 1024 should be fine. Once I get the textures for small or distant objects cut down in size, I can usually get my large scenes to render using my 1080.

    To be honest, I should do the same with all my scenes, even if there are fewer objects I can "optimize"... so my images will render at the best speed possible.
    smiley

    After you resize the texture files in Photoshop, do you have to go into the surface settings and select each image map and browse for your modified version? That seems cumbersome. I'm hoping there is a shortcut I'm not aware of.

  • barbult said:

    After you resize the texture files in Photoshop, do you have to go into the surface settings and select each image map and browse for your modified version? That seems cumbersome. I'm hoping there is a shortcut I'm not aware of.

    Overwriting the maps would make them apply automatically, but they would then be replaced by the next update. A systematic renaming scheme (either of the files, or keeping the filenames but placing them in a new folder) should make the updating scriptable (or search-and-replaceable using a copy of the original materials presets).
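    A minimal sketch of that search-and-replace route in Python, assuming DSON/.duf presets that are JSON (sometimes gzip-compressed) and assuming the resized copies keep their filenames inside a "reduced" subfolder; the filenames here are hypothetical:

        import gzip, json, shutil

        def load_duf(path):
            # .duf files are DSON (JSON), sometimes gzip-compressed.
            raw = open(path, "rb").read()
            if raw[:2] == b"\x1f\x8b":  # gzip magic number
                raw = gzip.decompress(raw)
            return json.loads(raw.decode("utf-8"))

        def redirect_maps(node, folder="reduced"):
            # Walk the DSON tree; any string value that looks like a
            # texture path gets pointed at the copy in the subfolder.
            if isinstance(node, dict):
                for key, value in node.items():
                    if isinstance(value, str) and value.lower().endswith(
                            (".jpg", ".jpeg", ".png", ".tif", ".tiff")):
                        head, _, tail = value.rpartition("/")
                        node[key] = f"{head}/{folder}/{tail}" if head else f"{folder}/{tail}"
                    else:
                        redirect_maps(value, folder)
            elif isinstance(node, list):
                for item in node:
                    redirect_maps(item, folder)

        # Work on a copy so the original preset stays untouched.
        shutil.copy("materials.duf", "materials_reduced.duf")
        data = load_duf("materials_reduced.duf")
        redirect_maps(data)
        json.dump(data, open("materials_reduced.duf", "w"), indent=1)

    Because only the copy is rewritten, the next product update cannot clobber the reduced maps.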

  • barbult Posts: 24,781
    edited December 2016
    barbult said:

    After you resize the texture files in Photoshop, do you have to go into the surface settings and select each image map and browse for your modified version? That seems cumbersome. I'm hoping there is a shortcut I'm not aware of.

    Overwriting the maps would make them apply automatically, but they would then be replaced by the next update. A systematic renaming scheme (either of the files, or keeping the filenames but placing them in a new folder) should make the updating scriptable (or search-and-replaceable using a copy of the original materials presets).

    I hadn't thought of the search and replace option. Notepad++ could handle that easily. We need a script master to make a tool to do the whole thing (reduce the images, plug them into the right slots) for us. V3Digitimes is a script master. I've suggested that she make something like this for the store. You are a script master, too. Maybe the two of you can collaborate.

    Edit: It would be nice to just select an item in the scene pane and click something to make it use less video memory by reducing texture size. Maybe there could be options for how much to reduce it - 50%, 25%, etc.

    Post edited by barbult on
  • ZarconDeeGrissom Posts: 5,412
    edited December 2016


    And then convincing Daz it is worth having in the store is the other side of getting something into the store. And don't feel bad about not having enough video memory; I just ran into something similar in 3DL while looking to do a line-up of all the 3DL 'AltShader' mat presets I am working on. After fourteen figures, I ran out of room on the 4GB scratch disk for the textures and stuff.

    And here I was thinking "I have 32GB of RAM for 3DL, no sweat; I'm not limited by NVIDIA's willful memory inadequacies, I can do this," lol.

    My solution is to cough up some $$$ for the RAM-disk program so I can make it larger than 4GB. There is no such upgrade option for NVIDIA graphics card memory negligence, lol.

    FYI,

    Front L-R, Des Grace, Des Kaia, FR Agnes, FR Enya, Raiya Kelly, Raiya Jolina, LY Hanny.

    Back L-R, Fwsa Grace, Fwsa Paloma, Fwsa Sushmita, Fwsa Yulia, ADSI Glimmer, RS Lindsey, LY Jessenia.

    Post edited by ZarconDeeGrissom on
  • L'Adair Posts: 9,479
    barbult said:

    After you resize the texture files in Photoshop, do you have to go into the surface settings and select each image map and browse for your modified version? That seems cumbersome. I'm hoping there is a shortcut I'm not aware of.

    I don't use a shortcut. I suppose you could copy the original textures into a subfolder, then save the smaller images with the original filenames. I think the Ctrl+I command would update the images. Or you could close DS to clear memory, restart DS and open the file.

    I don't use it often myself, but there is also the Texture Atlas.

    Overwriting the maps would make them apply automatically, but they would then be replaced by the next update. A systematic renaming scheme (either of the files, or keeping the filenames but placing them in a new folder) should make the updating scriptable (or search-and-replaceable using a copy of the original materials presets).

    Or what Richard said...

    I like that even better. Create a sub-directory for your modified files, and use a script to update the images... not that I have any idea how to write any script...
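    A minimal sketch of that subfolder approach in Python with the Pillow library (the folder names are assumptions; it halves each image's dimensions and keeps the original filename, so presets only need a path change):

        from pathlib import Path
        from PIL import Image  # pip install Pillow

        SRC = Path("textures")   # assumed folder of original maps
        DST = SRC / "reduced"    # smaller copies keep their filenames
        DST.mkdir(exist_ok=True)

        for img_path in SRC.glob("*.*"):
            if img_path.suffix.lower() not in (".jpg", ".jpeg", ".png", ".tif", ".tiff"):
                continue
            with Image.open(img_path) as im:
                # Halve width and height: a quarter of the pixels.
                half = im.resize((im.width // 2, im.height // 2), Image.LANCZOS)
                half.save(DST / img_path.name)
                print(f"{img_path.name}: {im.size} -> {half.size}")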

  • barbult Posts: 24,781
    L'Adair said:

    I don't use a shortcut. I suppose you could copy the original textures into a subfolder, then save the smaller images with the original filenames. I think the Ctrl+I command would update the images. Or you could close DS to clear memory, restart DS and open the file.

    I don't use it often myself, but there is also the Texture Atlas.

    Overwriting the maps would make them apply automatically, but they would then be replaced by the next update. A systematic renaming scheme (either of the files, or keeping the filenames but placing them in a new folder) should make the updating scriptable (or search-and-replaceable using a copy of the original materials presets).

    Or what Richard said...

    I like that even better. Create a sub-directory for your modified files, and use a script to update the images... not that I have any idea how to write any script...

    I tried Texture Atlas right after Iray came out, when I only had 2GB in my graphics card. Texture Atlas doesn't handle most of the Iray Uber shader channels. I found it unhelpful for most things. I wrote a help request, but nothing was ever done about it.

  • hphoenix Posts: 1,335
    L'Adair said:

    I don't use a shortcut. I suppose you could copy the original textures into a subfolder, then save the smaller images with the original filenames. I think the Ctrl+I command would update the images. Or you could close DS to clear memory, restart DS and open the file.

    I don't use it often myself, but there is also the Texture Atlas.

    Overwriting the maps would make them apply automatically, but they would then be replaced by the next update. A systematic renaming scheme (either of the files, or keeping the filenames but placing them in a new folder) should make the updating scriptable (or search-and-replaceable using a copy of the original materials presets).

    Or what Richard said...

    I like that even better. Create a sub-directory for your modified files, and use a script to update the images... not that I have any idea how to write any script...

    GAH. I've got to get my scripts cleaned up and the install/usage documentation for my image-resizing stuff done. I have a set of scripts (that use ImageMagick) so that you can automatically resize all or selected nodes' material images (it creates new resized images in the same location as the original, with a suffix on the name) and also select which ones are being used (i.e., you select the nodes you want to change to one of those newly resized images, and the size, and it makes the changes for you). It's set up to scale by 1/4 (half x half) or 1/16 (one-fourth x one-fourth), or both. So a 4k x 4k image named 'image.png' would have one or two new images created: a medium image at 2k x 2k named 'image_med.png' and maybe a small image at 1k x 1k named 'image_sml.png'.

    It does require you to install a certain build (not version, just the statically linked build) of ImageMagick for the script to use, and you set the path to it in the script the first time. (ImageMagick is an open-source, freely available library of image-manipulation routines and utilities, and includes a GUI front-end as well.)

    It's pretty easy to use, and as people keep running into these memory issues, I should probably get the UI cleaned up and the docs written.

    Don't expect HUGE gains in memory.  I did some experimentation, and saw around 10%-25% savings in memory usage in the GPU.  Larger scenes with a lot more figures would probably see more.
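    For anyone who wants to try the idea before those scripts are released, a hedged Python sketch of the same naming scheme, shelling out to ImageMagick (assumed to be installed and on the PATH; on newer Windows installs the binary is "magick" rather than "convert"; the input filename is hypothetical):

        import subprocess
        from pathlib import Path

        def make_variants(image_path):
            # Create image_med (1/4 the pixels) and image_sml (1/16 the
            # pixels) next to the original, mirroring the suffix scheme
            # described above.
            src = Path(image_path)
            for suffix, percent in (("_med", "50%"), ("_sml", "25%")):
                dst = src.with_name(src.stem + suffix + src.suffix)
                # -resize takes a percentage of each dimension, so 50%
                # halves width and height (a quarter of the pixels).
                subprocess.run(["convert", str(src), "-resize", percent, str(dst)],
                               check=True)
                print(f"wrote {dst}")

        make_variants("image.png")  # hypothetical 4k x 4k input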

     

  • barbult Posts: 24,781
    hphoenix said:

    GAH. I've got to get my scripts cleaned up and the install/usage documentation for my image-resizing stuff done. I have a set of scripts (that use ImageMagick) so that you can automatically resize all or selected nodes' material images (it creates new resized images in the same location as the original, with a suffix on the name) and also select which ones are being used (i.e., you select the nodes you want to change to one of those newly resized images, and the size, and it makes the changes for you). It's set up to scale by 1/4 (half x half) or 1/16 (one-fourth x one-fourth), or both. So a 4k x 4k image named 'image.png' would have one or two new images created: a medium image at 2k x 2k named 'image_med.png' and maybe a small image at 1k x 1k named 'image_sml.png'.

    It does require you to install a certain build (not version, just the statically linked build) of ImageMagick for the script to use, and you set the path to it in the script the first time. (ImageMagick is an open-source, freely available library of image-manipulation routines and utilities, and includes a GUI front-end as well.)

    It's pretty easy to use, and as people keep running into these memory issues, I should probably get the UI cleaned up and the docs written.

    Don't expect HUGE gains in memory.  I did some experimentation, and saw around 10%-25% savings in memory usage in the GPU.  Larger scenes with a lot more figures would probably see more.

     

    I think this would sell like hot cakes. Can it even handle normal maps? I'm not sure of the correct way to edit or resize a normal map. Can it do JPG, TIFF, and PNG? I'll be a release-day buyer, or even a beta tester if you need one. I'm unavailable between January 4 and 11, but before and after, I am at your service if you need me.

  • mjc1016 Posts: 15,001
    hphoenix said:
     

    Don't expect HUGE gains in memory.  I did some experimentation, and saw around 10%-25% savings in memory usage in the GPU.  Larger scenes with a lot more figures would probably see more.

     

    Yes... you can also save even more by eliminating some maps altogether, especially on mid/long shots. Few if any control maps are needed for those.

    Also, one needs to double-check that alternate options, like makeup, are not loading another map instead of replacing one... a quick scan of the scene file, sketched below, can help spot that.
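    One hedged way to run that check, assuming the saved scene is a plain or gzip-compressed DSON/.duf file (the scene filename is hypothetical): inventory every image path the file references and eyeball the list for maps that should have been replaced rather than added:

        import gzip, json
        from collections import Counter

        def list_maps(duf_path):
            raw = open(duf_path, "rb").read()
            if raw[:2] == b"\x1f\x8b":  # gzip-compressed .duf
                raw = gzip.decompress(raw)
            counts = Counter()

            def walk(node):
                if isinstance(node, dict):
                    for value in node.values():
                        walk(value)
                elif isinstance(node, list):
                    for item in node:
                        walk(item)
                elif isinstance(node, str) and node.lower().endswith(
                        (".jpg", ".jpeg", ".png", ".tif", ".tiff")):
                    counts[node] += 1

            walk(json.loads(raw.decode("utf-8")))
            for path, n in counts.most_common():
                print(n, path)

        list_maps("myscene.duf")  # hypothetical scene file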

  • ZarconDeeGrissom Posts: 5,412
    edited December 2016
    hphoenix said:

    GAH. I've got to get my scripts cleaned up and the install/usage documentation for my image-resizing stuff done. I have a set of scripts (that use ImageMagick) so that you can automatically resize all or selected nodes' material images (it creates new resized images in the same location as the original, with a suffix on the name) and also select which ones are being used (i.e., you select the nodes you want to change to one of those newly resized images, and the size, and it makes the changes for you). It's set up to scale by 1/4 (half x half) or 1/16 (one-fourth x one-fourth), or both. So a 4k x 4k image named 'image.png' would have one or two new images created: a medium image at 2k x 2k named 'image_med.png' and maybe a small image at 1k x 1k named 'image_sml.png'.

    It does require you to install a certain build (not version, just the statically linked build) of ImageMagick for the script to use, and you set the path to it in the script the first time. (ImageMagick is an open-source, freely available library of image-manipulation routines and utilities, and includes a GUI front-end as well.)

    It's pretty easy to use, and as people keep running into these memory issues, I should probably get the UI cleaned up and the docs written.

    Don't expect HUGE gains in memory.  I did some experimentation, and saw around 10%-25% savings in memory usage in the GPU.  Larger scenes with a lot more figures would probably see more.

    Well, the other half of the GB usage is the geometry. Iray has no 'displacement' trickery like 3DL, so some texture detail must be done with actual something-gons, and that over-sampled SubD mesh can eat up lots of GPU memory very quickly.

    10-25% is very good and something I'm impressed with. 3DL has a few built-in things to optimize map sizes, and I thought Iray had something similar. Your experiments clearly indicate that if Iray has a map-optimizer thing, it has potential for improvement. It still does not make up for NVIDIA launching brand new cards with Iray-capable CUDA and inadequate memory, lol.

    Post edited by ZarconDeeGrissom on
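    For a sense of how fast subdivision eats memory: each Catmull-Clark SubD level splits every quad into four, so the face count quadruples per level. A rough Python illustration (the base mesh size and per-vertex byte cost are assumptions for illustration, not Iray's actual format):

        # Faces grow 4x per Catmull-Clark subdivision level.
        base_faces = 17_000  # assumed base-resolution figure mesh
        for level in range(5):
            faces = base_faces * 4 ** level
            # Very rough: ~1 vertex per quad at high density, ~32 bytes
            # of position/normal/UV data per vertex (an assumption).
            approx_mb = faces * 32 / (1024 * 1024)
            print(f"SubD {level}: {faces:>12,} faces, ~{approx_mb:,.1f} MB")
        # Render SubD levels of 3-4 on several figures add up quickly,
        # before a single texture is even loaded.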
  • hphoenix Posts: 1,335
    edited December 2016
    barbult said:

    I think this would sell like hot cakes. Can it even handle normal maps? I'm not sure of the correct way to edit or resize a normal map. Can it do JPG, TIFF, and PNG? I'll be a release-day buyer, or even a beta tester if you need one. I'm unavailable between January 4 and 11, but before and after, I am at your service if you need me.

    Well, since it uses an external free program the user has to install, it's going to be priceless.

    Or more succinctly, it'll have no price... a freebie. The scripts aren't that complicated, and the ImageMagick libs are freely available (for Win/Mac/Linux).

    It doesn't currently differentiate between types of maps. If the normal/bump/opacity/whatever map causes an issue at render time, switch that one back to using a higher-res image.

    It handles all those image formats, and more. Normal maps resize the same as any map; you'll just lose detail, just like with any mapped channel.

    (I'm going to be away for the holidays starting tomorrow until early next week... I'll try to get it done and uploaded somewhere late next week.)

     

    Post edited by hphoenix on
  • takezo_3001 Posts: 1,997
    edited December 2016

     

    L'Adair said:

    The 1080 only has 8GB and if you don't have onboard video or another card, some of that is being allocated for system video, like your monitor. If your scene requires more than the available memory on the card, it will go to CPU only.

    When I run into this, I look for objects that don't impact the scene and remove them. If it's a wide shot, I look for objects with hi-rez materials and resize them in Photoshop. If an object is in the background, 20-30 "feet" from the camera, it doesn't need materials that are 4096 x 4096 pixels; 2048 or 1024 should be fine. Once I get the textures for small or distant objects cut down in size, I can usually get my large scenes to render using my 1080.

    To be honest, I should do the same with all my scenes, even if there are fewer objects I can "optimize"... so my images will render at the best speed possible.
    smiley

    Thanks for the info. Heh, I got it to run GPU-only by taking the render resolution down (previously 1920x1200, now 1440x900).

    It still took more than 3 hours to render, though! I couldn't even game or watch videos as it was using 100% of my GPU; at least the temps were a respectable 50°C! Maybe the Octane renderer is a possible purchase if I want to render any animated videos.

    Post edited by takezo_3001 on