GPU to CPU (it's not going away anytime soon)

135 Comments

  • fdsphoto Posts: 62
    edited January 2020

    @ marble

    How do you find ON1 Resize - do you notice any artefacts in the upscaled image? I tried Topaz Gigapixel AI (they have a trial download) and was impressed at first until I noticed that it made the results unrealistically sharp and had noticeable artefacts. Still, it seemed to work better than the normal bicubic or Lanczos options in most image editors.

    To be frank, I find all these tricks and workarounds very frustrating after paying a considerable amount to buy one of the recommended NVidia cards. I don't have the luxury of 11GB because I only have a 1070 but all of these problems since the release of 4.12 and NVidia RTX have rather dampened my enthusiasm for DAZ Studio and I'm looking at other options for a creative hobby.

    I think it does pretty well, see attachments.

    The ON1 image is at 100% in the viewer, after resizing the original image to 24 x 36 @ 300 PPI (7200 x 10800). The original image is at 100% in the Windows 10 Photo viewer.

    Of note -

    1. The original image is 1000 W x 1500 H. I never intended to resize it, but ON1 handled it quite well.

    2. I'm a photographer, and as a photographer, the resize would pass muster, as no one ever views a 24 x 36 that close up ;) You'd need to take a couple of steps back to enjoy the full essence of a portrait that big.

    3. The resize took 1 min to complete in ON1 Resize.

    4. The DAZ Studio memory consumption on my video card didn't pass 3.8GB during this render, so I still have plenty of room to play with while also leveraging the full power of the card. Should I ever decide to print it bigger, I can always re-render at a bigger starting size as long as I keep at or below my 75% mark.

    5. Also, use .TIFF files. JPEGs will more than likely lose detail just from being copied alone.

    (Original image is in JPEG because, you know, size lol, but the images in the .PNG screenshot were TIFF format. Also, I didn't make any image adjustments in ON1.)

     

    Note: ON1 Resize has a recommended "no more than 10x" maximum of the original image size. For an image like this, that would be 10000 x 15000, or 40" x 60" at 250 PPI (250 PPI is the recommended PPI for lab printing).
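
    (For anyone who wants to sanity-check the numbers above, here is a minimal sketch of the pixel/print-size arithmetic; the helper name is hypothetical and not part of ON1 or DAZ Studio.)

    ```python
    def print_size_inches(width_px: int, height_px: int, ppi: int) -> tuple:
        """Print dimensions in inches for a given pixel size and PPI."""
        return width_px / ppi, height_px / ppi

    # The 1000 x 1500 original resized to 7200 x 10800 prints at 24" x 36" at 300 PPI:
    print(print_size_inches(7200, 10800, 300))    # (24.0, 36.0)

    # ON1's suggested ~10x ceiling on a 1000 x 1500 source is 10000 x 15000,
    # which is 40" x 60" at 250 PPI:
    print(print_size_inches(10000, 15000, 250))   # (40.0, 60.0)
    ```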

     

    https://www.daz3d.com/gallery/#images/919501/

    [Attachment: Capture.PNG (1921 x 1157, 3M)]
  • fdsphoto Posts: 62
    edited January 2020
    dawnblade said:
    Thanks for the detailed steps. Just a note that the architectural optimizer was disabled in a recent update, so it shouldn't be causing issues with vram or cpu.

    4.12 Pro still has it. It seems like it was eliminated in the 4.12 Beta - which I'm using.

    Mainly because I noticed a high memory consumption in DAZ 4.12 Pro vs. the 4.12 Beta.

    Of note: the DAZ Support Team seemed to imply that only the 4.12 Beta can fully leverage the power of the RTX cards the last time I reached out to them. They insisted that I use the Beta with the RTX 2080 Ti.

  • Ivy Posts: 7,165
    edited January 2020

    I don't mind the tricks and workarounds. But I do hate being forced to mothball an expensive GPU investment a year old to upgrade to another expensive GPU just to play with Iray toys I have already bought. NVIDIA must be taking a page from Microsoft's playbook for killing support for things it can no longer make money from. It makes you feel like you are forced to buy more products to keep using the content you have already paid for, and if you don't update the driver you're vulnerable to memory leaks and outside influences penetrating your system; it's like some kind of scam. It's just like the BS I am having with my Windows 10 laptop forcing me to take a driver update that breaks my Adobe office suite programs, and when I roll back the driver to get the programs to work again, that screwy W10 reinstalls the damn driver on the next startup of Windows. Not being able to refuse Windows updates is a maddening, endless loop of software shame.

    I have 2 GTX 1080 Ti SC cards, 11 gig each, and at this point, if I want to use the latest version of DAZ Studio 4.12.040, they are no better than your GTX 1070, because it's GTX technology and it's now broken because of RTX technology. So I'll stick with DAZ 4.10 & 4.12.83 and driver 436.30 and stop purchasing any new Iray products until this mess is straightened out. I can't afford to shell out $1000 or more for a new GPU every year; that is just BS.

    I have Poser, but I always thought the renders are not as clean, and working in rooms in Poser is awkward compared to working in tabs in DAZ Studio. I'm pretty good at setting up 3Delight in DAZ Studio for animation, but there is limited support for 3Delight in the DAZ store because most of the PAs are creating for Iray only. So it may be time to dust Poser off and play with all the wonderful content that I bought that still works in the version of Studio I have, for a while.

    Like you said, maybe I'll look for a new hobby to start investing in if I'm going to have to throw $1000 away every time NVIDIA releases a new GPU. Screw that noise.

    Besides, it might be fun to learn something new, like drawing with paper and pencil; that never seems to require an upgrade to work. :D

    marble said:
     

    To be frank, I find all these tricks and workarounds very frustrating after paying a considerable amount to buy one of the recommended NVidia cards. I don't have the luxury of 11GB because I only have a 1070 but all of these problems since the release of 4.12 and NVidia RTX have rather dampened my enthusiasm for DAZ Studio and I'm looking at other options for a creative hobby.

     

  • scorpio Posts: 8,415
    Ivy said:

    I don't mind the tricks and workarounds. But I do hate being forced to mothball an expensive GPU investment a year old to upgrade to another expensive GPU just to play with Iray toys I have already bought. NVIDIA must be taking a page from Microsoft's playbook for killing support for things it can no longer make money from. It makes you feel like you are forced to buy more products to keep using the content you have already paid for, and if you don't update the driver you're vulnerable to memory leaks and outside influences penetrating your system; it's like some kind of scam. It's just like the BS I am having with my Windows 10 laptop forcing me to take a driver update that breaks my Adobe office suite programs, and when I roll back the driver to get the programs to work again, that screwy W10 reinstalls the damn driver on the next startup of Windows. Not being able to refuse Windows updates is a maddening, endless loop of software shame.

    I have 2 GTX 1080 Ti SC cards, 11 gig each, and at this point, if I want to use the latest version of DAZ Studio 4.12.040, they are no better than your GTX 1070, because it's GTX technology and it's now broken because of RTX technology. So I'll stick with DAZ 4.10 & 4.12.83 and driver 436.30 and stop purchasing any new Iray products until this mess is straightened out. I can't afford to shell out $1000 or more for a new GPU every year; that is just BS.

    I have Poser, but I always thought the renders are not as clean, and working in rooms in Poser is awkward compared to working in tabs in DAZ Studio. I'm pretty good at setting up 3Delight in DAZ Studio for animation, but there is limited support for 3Delight in the DAZ store because most of the PAs are creating for Iray only. So it may be time to dust Poser off and play with all the wonderful content that I bought that still works in the version of Studio I have, for a while.

    Like you said, maybe I'll look for a new hobby to start investing in if I'm going to have to throw $1000 away every time NVIDIA releases a new GPU. Screw that noise.

    Besides, it might be fun to learn something new, like drawing with paper and pencil; that never seems to require an upgrade to work. :D

    marble said:
     

    To be frank, I find all these tricks and workarounds very frustrating after paying a considerable amount to buy one of the recommended NVidia cards. I don't have the luxury of 11GB because I only have a 1070 but all of these problems since the release of 4.12 and NVidia RTX have rather dampened my enthusiasm for DAZ Studio and I'm looking at other options for a creative hobby.

     

    This, for me, was the last straw.

  • Please stop posting this to every DS thread. It is not the case that GTX cards are no longer supported. It is true, as far as I am aware, that OptiX Prime is now always used for non-RTX cards, so if it was not used in the past that adds a new memory overhead and will lead to more scenes dropping to CPU without tweaking (adjusting texture sizes through Scene Optimiser, for example). I was able to render https://www.daz3d.com/rog-medieval-fantasy-bedroom on my 750Ti in 4.12 (not sure if it was the release or the beta) following the use of Scene Optimiser.
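
    (A rough illustration of why shrinking texture maps frees so much memory; these are uncompressed sizes only, so Iray's actual on-GPU footprint will differ once its own compression kicks in.)

    ```python
    def texture_mib(side_px: int, channels: int = 4, bytes_per_channel: int = 1) -> float:
        """Uncompressed size in MiB of a square texture."""
        return side_px * side_px * channels * bytes_per_channel / 2**20

    for side in (4096, 2048, 1024):
        print(f"{side} x {side}: ~{texture_mib(side):.0f} MiB")
    # 4096 x 4096: ~64 MiB, 2048 x 2048: ~16 MiB, 1024 x 1024: ~4 MiB
    # Halving a map linearly cuts its memory to a quarter, which is why a tool like
    # Scene Optimiser can be enough to keep a scene on the GPU.
    ```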

  • Ivy Posts: 7,165
    edited January 2020

    Please let me tell you about my experience, Richard, as someone who requires DAZ to render more than one image at a time for animation. I can assure you GTX cards are obsolete and no longer acceptable for use in DAZ Studio with Iray for animations. So you're right, maybe saying it's not supported is the wrong word; broken would be more correct. I do not believe it is DAZ Studio so much as the new GPU drivers, but when you have two 11-gig GPU cards and they throw back to CPU after about 5 images rendered with DAZ Studio 4.12.1.04, that to me is useless, pure and simple, no other description needed. You can preach Scene Optimizer or texture resource savers all day; it will not help when the GPU maxes out its VRAM and throws renders to CPU when rendering an image series, nor will it help when you load more than 2 fully dressed characters with hair in a 2K HDRI and your render gets thrown to CPU. And not once in a while, but every single blessed time.

    So for someone in my shoes who uses DAZ for animation, I will state with a clear conscience that DAZ with a GTX GPU is useless, and seeing as I paid about $900 each for these 1080 Ti GPUs about a year ago, I won't be buying any new RTX cards anytime soon. So yeah, maybe "not supported" is a bad choice of words to describe this issue; "broken" would be the proper adjective, because it is.

  • ???? 

    Ivy - what is your system setup?

  • fdsphoto Posts: 62
    edited January 2020

    Please stop posting this to every DS thread. It is not the case that GTX cards are no longer supported. It is true, as far as I am aware, that OptiX Prime is now always used for non-RTX cards, so if it was not used in the past that adds a new memory overhead and will lead to more scenes dropping to CPU without tweaking (adjusting texture sizes through Scene Optimiser, for example). I was able to render https://www.daz3d.com/rog-medieval-fantasy-bedroom on my 750Ti in 4.12 (not sure if it was the release or the beta) following the use of Scene Optimiser.

    This is the way. - lol

     

    Which Optimizers do you use?

  • Ivy Posts: 7,165
    fdsphoto said:

    ???? 

    Ivy - what is your system setup?

     

    [Attachments: card Capture.jpg (1816 x 954, 124K); capture1.jpg (477 x 841, 66K); Capture.JPG (731 x 547, 99K); 2f57dc08188568766017d387accba4.jpg (1440 x 1080, 125K)]
  • Richard Haseltine Posts: 100,842
    edited January 2020
    fdsphoto said:

    Please stop posting this to every DS thread. It is not the case that GTX cards are no longer supported. It is true, as far as I am aware, that OptiX Prime is now always used for non-RTX cards, so if it was not used in the past that adds a new memory overhead and will lead to more scenes dropping to CPU without tweaking (adjusting texture sizes through Scene Optimiser, for example). I was able to render https://www.daz3d.com/rog-medieval-fantasy-bedroom on my 750Ti in 4.12 (not sure if it was the release or the beta) following the use of Scene Optimiser.

    This is the way. - lol

     

    Which Optimizers do you use?

    For that product I used Scene Optimiser (https://www.daz3d.com/scene-optimizer); I can't recall if I halved or quartered (linearly) the more distant maps to get it rendering.

    ------------------------------------

    In general, I should add that the update designed to help with image sequences is listed in the change log but not yet available, so it's too soon to say how much it will help: http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log#4_12_1_41

  • marble Posts: 7,500
    Ivy said:
    fdsphoto said:

    ???? 

    Ivy - what is your system setup?

     

    Jeez, Ivy, I thought my desk was crammed! Do you have space to move your mouse? ;)

  • marble Posts: 7,500
    edited January 2020
    fdsphoto said:

    Please stop posting this to every DS thread. It is not the case that GTX cards are no longer supported. It is true, as far as I am aware, that OptiX Prime is now always used for non-RTX cards, so if it was not used in the past that adds a new memory overhead and will lead to more scenes dropping to CPU without tweaking (adjusting texture sizes through Scene Optimiser, for example). I was able to render https://www.daz3d.com/rog-medieval-fantasy-bedroom on my 750Ti in 4.12 (not sure if it was the release or the beta) following the use of Scene Optimiser.

    This is the way. - lol

     

    Which Optimizers do you use?

    For that product I used Scene Optimiser (https://www.daz3d.com/scene-optimizer); I can't recall if I halved or quartered (linearly) the more distant maps to get it rendering.

    ------------------------------------

    In general, I should add that the update designed to help with image sequences is listed in the change log but not yet available, so it's too soon to say how much it will help: http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log#4_12_1_41

    You are right about the same subject coming up in different threads - I just asked you a question about that changelog in the other thread so I won't repeat it here.

    By the way, Scene Optimizer helps but I use it automatically these days and I still get 4.12 (Beta) falling back to CPU for scenes that will render fine in 4.11. It seems to me, but I can't prove it, that 4.12 allocates VRAM somewhat differently (aside from Optix) than pre-RTX days so it gets to the upper limits with less content than before. I did find a setting that seemed to help with this but can't remember what it was. - Ah, maybe it was the compression setting? I'm not at my DAZ PC right now.

  • fdsphoto Posts: 62
    edited January 2020

    ???? 

    Ivy - what is your system setup?

    Hey Ivy,

    It seems like you're overclocking your cards, and I'm not sure which version of DAZ Studio you're using.

    Have you tried setting your GPUs to their default timings (not overclocked), and have you tried the latest 4.12 Beta (64-bit)?

    Even my RTX 2080 Ti starts doing weird crap and kicking back to CPU when overclocked while using DAZ.

     

  • I just put together a much improved computer and am looking to do some stills and (hopefully) animation in DAZ Studio, though my preferred program for the latter is Carrara.

    Are there certain panes in DAZ Studio or Windows utilities I'll need to watch to determine if my renders are using GPU or falling back to CPU - or will that info pop up automatically? Thanks.

  • scorpio Posts: 8,415

    Please stop posting this to every DS thread. It is not the case that GTX cards are no longer supported. It is true, as far as I am aware, that OptiX Prime is now always used for non-RTX cards, so if it was not used in the past that adds a new memory overhead and will lead to more scenes dropping to CPU without tweaking (adjusting texture sizes through Scene Optimiser, for example). I was able to render https://www.daz3d.com/rog-medieval-fantasy-bedroom on my 750Ti in 4.12 (not sure if it was the release or the beta) following the use of Scene Optimiser.

    I'm not posting it to every DS thread, and I was merely going on information in a post by a Daz vendor.

  • fdsphoto Posts: 62
    edited January 2020

    I just put together a much improved computer and am looking to do some stills and (hopefully) animation in DAZ Studio, though my preferred program for the latter is Carrara.

    Are there certain panes in DAZ Studio or Windows utilities I'll need to watch to determine if my renders are using GPU or falling back to CPU - or will that info pop up automatically? Thanks.

    Windows Task Manager. You can view your CPU, RAM, and GPU usage (in the GPU stats, you can also see your CUDA usage and GPU memory usage).

    DAZ itself will also tell you, in the status bar or the render dialog, that the render has been kicked to CPU.
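
    (If Task Manager is ambiguous - its default GPU graph tracks the 3D engine rather than CUDA compute - here is a minimal, hypothetical watcher using NVIDIA's nvidia-smi tool, assuming the NVIDIA driver is installed and nvidia-smi is on the PATH.)

    ```python
    import subprocess
    import time

    # Poll nvidia-smi once per second during a render. If memory.used collapses and
    # utilization.gpu sits at 0% mid-render, the render has most likely dropped to CPU.
    QUERY = [
        "nvidia-smi",
        "--query-gpu=timestamp,name,memory.used,memory.total,utilization.gpu",
        "--format=csv,noheader",
    ]

    while True:
        result = subprocess.run(QUERY, capture_output=True, text=True)
        print(result.stdout.strip())
        time.sleep(1)
    ```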

  • scorpio said:

    Please stop posting this to every DS thread. It is not the case that GTX cards are no longer supported. It is true, as far as I am aware, that OptiX Prime is now always used for non-RTX cards, so if it was not used in the past that adds a new memory overhead and will lead to more scenes dropping to CPU without tweaking (adjusting texture sizes through Scene Optimiser, for example). I was able to render https://www.daz3d.com/rog-medieval-fantasy-bedroom on my 750Ti in 4.12 (not sure if it was the release or the beta) following the use of Scene Optimiser.

    I'm not posting it to every DS thread, and I was merely going on information in a post by a Daz vendor.

    The comment wasn't addressed to you.

  • Mosk the Scribe Posts: 888
    edited January 2020
    fdsphoto said:

    I just put together a much improved computer and am looking to do some stills and (hopefully) animation in DAZ Studio, though my preferred program for the latter is Carrara.

    Are there certain panes in DAZ Studio or Windows utilities I'll need to watch to determine if my renders are using GPU or falling back to CPU - or will that info pop up automatically? Thanks.

    Windows Task Manager. You can view your CPU, RAM, and GPU usage (in the GPU stats, you can also see your CUDA usage and GPU memory usage).

    DAZ itself will also tell you, in the status bar or the render dialog, that the render has been kicked to CPU.

    Thanks for the info. Unfortunately, it looks like DAZ isn't using my GPU (RTX 2070 Super with nVidia Studio Driver Version 441.66) at all.

    It pegs all of the cores of my processor (Ryzen 3950x) at or near 100%, which gives results much faster than my old system, but would sure like to get GPU working with DAZ.

    Using Task Manager, memory usage stays around 13GB RAM (out of 128GB DDR4 RAM, 2666MHz) and GPU at 0%.

     

    I did a test scene in Carrara out of curiosity, and there my CPU only goes up to 25-30% on about 8 cores (rather than nearly 100% for all 32 cores) and my GPU goes to around 5% rather than 0% - but still relying mostly on CPU rather than GPU there as well.

    Also, where in the DAZ interface should I see the status bar or render dialogue?  I have the area at bottom that shows lessons (which I think is the status bar) but didn't see any notes there - and didn't notice any render dialogue pop up while I left an animation rendering for several minutes.

    Would welcome any suggestions on best way to proceed. Thanks.

     

     

  • fdsphoto Posts: 62
    edited January 2020
    fdsphoto said:

    I just put together a much improved computer and am looking to do some stills and (hopefully) animation in DAZ Studio, though my preferred program for the latter is Carrara.

    Are there certain panes in DAZ Studio or Windows utilities I'll need to watch to determine if my renders are using GPU or falling back to CPU - or will that info pop up automatically? Thanks.

    Windows Task Manager. You can view your CPU, RAM, and GPU usage (in the GPU stats, you can also see your CUDA usage and GPU memory usage).

    DAZ itself will also tell you, in the status bar or the render dialog, that the render has been kicked to CPU.

    Thanks for the info. Unfortunately, it looks like DAZ isn't using my GPU (RTX 2070 Super with nVidia Studio Driver Version 441.66) at all.

    It pegs all of the cores of my processor (Ryzen 3950x) at or near 100%, which gives results much faster than my old system, but would sure like to get GPU working with DAZ.

    Using Task Manager, memory usage stays around 13GB RAM (out of 128GB DDR4 RAM, 2666MHz) and GPU at 0%.

     

    I did a test scene in Carrara out of curiosity, and there my CPU only goes up to 25-30% on about 8 cores (rather than nearly 100% for all 32 cores) and my GPU goes to around 5% rather than 0% - but still relying mostly on CPU rather than GPU there as well.

    Also, where in the DAZ interface should I see the status bar or render dialogue?  I have the area at bottom that shows lessons (which I think is the status bar) but didn't see any notes there - and didn't notice any render dialogue pop up while I left an animation rendering for several minutes.

    Would welcome any suggestions on best way to proceed. Thanks.

    So full disclosure, I am a Hardware Support Engineer for Advanced Electronics and Computer Systems by day.

    I'm in a bit of shock, as I see most of you guys are running monstrous PC systems compared to my small and humble setup with a 2080 Ti in it.

    I'm pretty confident that this may be more of an issue with how your systems are configured, in regard to your OS and possibly system overclocking settings (if any).

    The only thing I can suggest at this point is to put your GPU settings back to their out-of-the-box defaults. You can do this in the Nvidia Control Panel. Also, disable any and all overclocking on your system.
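
    (A quick, hedged way to check whether an overclock profile is still active, again via nvidia-smi; the stock clock values to compare against depend on the specific card model.)

    ```python
    import subprocess

    # Print current graphics/memory clocks, power draw, and temperature. If the
    # graphics clock sits well above the card's reference boost clock while idle
    # or rendering, an overclock (factory OC profile, Afterburner, etc.) is
    # likely still applied.
    fields = "clocks.gr,clocks.mem,power.draw,temperature.gpu"
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={fields}", "--format=csv"],
        capture_output=True, text=True,
    )
    print(result.stdout)
    ```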

    My setup (a $200 Ebay Special)

    HP Compaq 8200 Elite Small Form Factor PC (2012) with:

    Intel i7 3770 - 3rd Generation Quad-Core (Released in the 2nd Quarter of 2012)

    Amazon Upgrades:

    32GB Ram (4x8), A-Tech DDR3 1600 Mhz (PC3 12800) UDIMM 

    800W Corsair Modular Power Supply

    Nvidia RTX 2080 ti

    And, of course, the perks of the job (lol):

    2 - Intel 800GB SSDs (3500 series)

    3 - Intel 1200GB SSDs (3800 or 3900 series)

    -----------------------------------------------------------------------------------------

    Of note, is your Power Supply big enough to run your entire system?

    The 2080 ti alone sucks up close to 500W when overclocked.

     

    Hey @fdsphoto - regarding my system: for the last 5-6 years, I've been using an i7-4770K with 32GB DDR3 RAM and a 2GB nVidia GTX 660 Ti for my GPU. The recent upgrade was triggered by relatively affordable RAM, powerful GPUs, and the new Ryzen chips, which will (hopefully) let me dive into Houdini and run simulations while speeding up my workflow in other 2D and 3D animation and special effects programs.

    I have an 850W PSU for the new system, which looked like it should be plenty since I'm just running a single GPU (RTX 2070 Super).

    I just installed the current Beta of DAZ Studio and tested an animation there using the Iray renderer - but once again, my GPU isn't used at all and my CPU pegs out at 100%.

     

  • Mosk the Scribe Posts: 888
    edited January 2020

    Further testing in both the 4.12 general release and the Beta (which is supposed to include support for RTX cards, I believe) shows that the GPU isn't contributing anything to renders in either version. If I untick the box that allows the CPU to contribute (in Render Settings > Advanced, where there are checkboxes next to CPU and GPU), then everything slows to a crawl.

    Are people with nVidia RTX cards simply not using their GPUs with DAZ Studio for the time being - or is there a workaround?

    Is there any earlier nVidia Driver that works with the latest version of DAZ Studio?

    Thanks for any suggestions.

  • SnowSultan Posts: 3,595

    compared to my small and humble setup with a 2080 Ti in it.

    LOL, is that setup in your very modest and humble four bedroom, five bathroom Central Park penthouse too?   ;)

     

    I haven't read this whole discussion, but has anyone mentioned the workaround we talked about in another thread: do not adjust surface settings while in Iray preview mode; that will revert even a humble 2080 Ti to CPU.

  • PDSmith Posts: 712
    edited January 2020

    @Mosk

    Looking over your specs, I'm not seeing anything that might cause the problems you're having. I'm not trying to sound condescending, so bear with me as I give a few suggestions.

    1) In the Windows system information for your drivers, does it show your card as clear and available, or is it shown with a 'yield sign'? That would mean your card isn't working at a Windows level and your CPU is carrying the load to run everything.

    2) Driver version. I'm running the 441 game dev version of the driver. Nvidia offers a Studio version, but it does not support anything below the GTX 1000 series of cards, so my two GTX 960s would be essentially bricks inside my tower case. Your version of the driver may be pre-2070; just guessing.

    3) Overclocking... nope, not a bit. And I still get the GPU-to-CPU drop, i.e. why I started this thread.

    4) Roll back to an earlier driver and use the clean setup if offered. This is invasive, but it will delete any and all other video drivers currently on your system and give you essentially a fresh start.

    5) Dumb idea, but check your power cord to the card. A couple of years back one of my cables was shipped with a shorter prong, so it didn't plug into the power supply. The fans ran, but that was all: Windows said I had a card, but when it came time to use it, it simply wasn't there.

    6) Test your card with some game. Anyone for StarCraft II?

    7) The RTX series of cards is supported from 4.10 and up; none of those versions have shown themselves to be as stable as 4.9 was, but as said, RTX cards won't work there. So I'm waiting impatiently to test out the next beta (if it's a beta) where we can shut off dropping to CPU. (Why would we want to drop to CPU rendering? OTOY's Octane uses system RAM to offset what VRAM is missing and keeps all cards online and chugging away.) I've included a copy/paste of the change log for Studio/Iray.

    a.    Added an option, to the Hardware page under NVIDIA Iray Advanced Render Settings, that allows the user to enable/disable Iray CPU fallback; requires restart to apply changes

    b.    Added an option, to the Hardware page under NVIDIA Iray Advanced Render Settings, that allows the user to enable/disable Iray GPU detection; requires restart to apply changes

    c.     Added an option, to the Hardware page under NVIDIA Iray Advanced Render Settings, that allows the user to enable/disable Iray GPU driver version check; requires restart to apply changes; depends on GPU detection being enabled

    d.    Added an option, to the Photoreal Mode section of the Hardware page under NVIDIA Iray Advanced Render Settings, that allows the user to set Iray NVLINK peer group size; requires supported hardware; depends on GPU detection being enabled.

    8) https://www.geforce.com/drivers - go to the manual driver search and enter your data, paying close attention to the fact that Nvidia keeps pushing the notebook drivers. If that's not what you have, then go with the other option.

    And that's all I have to offer.

  • fdsphoto Posts: 62
    edited January 2020

    Further testing in both the 4.12 general release and the Beta (which is supposed to include support for RTX cards, I believe) shows that the GPU isn't contributing anything to renders in either version. If I untick the box that allows the CPU to contribute (in Render Settings > Advanced, where there are checkboxes next to CPU and GPU), then everything slows to a crawl.

    Are people with nVidia RTX cards simply not using their GPUs with DAZ Studio for the time being - or is there a workaround?

    Is there any earlier nVidia Driver that works with the latest version of DAZ Studio?

    Thanks for any suggestions.

    Nvidia Driver 441.36 was the version I stayed on until Nvidia 441.66 Studio Driver was released and fixed the issues that I was having.

  • Saxa -- SD Posts: 872
    edited January 2020
     

    I haven't read this whole discussion, but has anyone mentioned the workaround we talked about in another thread: do not adjust surface settings while in Iray preview mode; that will revert even a humble 2080 Ti to CPU.

    Don't hate me, but I am on 4.12.0.86 and I can change surfaces all day long; on 4.12.1.40 I cannot. In fact, 4.12.0.86 works incredibly well for my PC setup, but I am missing timeline improvements like deleting Pose Keys (hidden properties). As posted on page 2 of this thread, my specs are Win7 Pro 64-bit, DS 4.12.0.86, driver 441.66, and you know my card lol.

    @Mosk the Scribe:

    PDSmith's first suggestion is the most important: you need to know if DAZ Studio is recognizing your card. If it does show, then try to render one G8 or G3 - with your specs that would be easy-peasy. If it doesn't work, then it means you probably need to reinstall this or that to get DS to recognize your card. Well, unless you just got your GPU and, like PDSmith wrote, it's not actually tested and part of your PC.

     

     

  • Hi - 

    Thanks for all of the comments and suggestions.

    Yes, my GPU is recognized as working normally in Device Manager. I also downloaded some benchmark software and it seems to be working fine.

    Note: when I render in DAZ, although my GPU stays at roughly 0% in Task Manager, if I go to the Performance tab and select CUDA, that goes to roughly 100% and stays there for the duration.

    Note: I also restarted my system after seeing it mentioned that the system needed to be restarted before certain driver changes would take effect (not sure if that was in this thread or elsewhere). As I was running this test, the GPU actually went up to 100% as well and stayed there for a couple of minutes before dropping back to 0.

    Now it's back to 0-1%, and part of Task Manager closed, so I'm going to restart the system again and try some more.

     

  • SnowSultan Posts: 3,595

    Interesting, I think I have 4.12.0.86 (that's the public release?) and I can't change surfaces there while Iray preview is on either. I'm using the Creator drivers or whatever they're called on my 2080ti now.

  • JP Posts: 60

    I have a 1080 Ti and am running 4.12. Just adding two characters with a few clothing items will max out the card's RAM (apparently), and the CPU is the real workhorse. On some very small basic scenes with a simple primitive, the GPU and CPU will both render, and on my system that means it is using about 600 watts! And it does not seem that fast either. The CPU can crank out a decent, noiseless 10,000 x 5,625 render in 2 hours with 10 characters and props. Is there any GPU on the market that can do that?!

    So at the moment Iray with the GPU is basically useless. How much VRAM is really required to get GPU rendering going? 24 GB? It also seems like the CPU is rendering faster than the GPU. I have a 10-core i9 with 20 threads, and it is the one getting the job done. I have 128 GB of RAM, so huge scenes are not an issue. I just wish I could see the GPU in action - at this point I am trying to understand what all the fuss about GPU rendering is about?!
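
    (A very rough back-of-envelope on where VRAM goes in a scene like that; the buffer format and per-character texture count below are assumptions for illustration, not Iray-specific figures.)

    ```python
    # Illustrative arithmetic only.
    width, height = 10_000, 5_625

    # Assumption: the renderer accumulates the image as 32-bit float RGBA.
    framebuffer_gib = width * height * 4 * 4 / 2**30           # ~0.84 GiB

    # Assumption: each character carries ~12 uncompressed 4096 x 4096 8-bit RGBA maps.
    per_character_gib = 12 * 4096 * 4096 * 4 / 2**30           # ~0.75 GiB

    print(f"frame buffer   ~{framebuffer_gib:.2f} GiB")
    print(f"10 characters  ~{10 * per_character_gib:.1f} GiB in textures, before geometry and HDRI")
    ```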

  • From primitives to ten characters and props is a pretty large step.
