RTX 3060 GFX vs my old GTX 1080ti card for DAZ Studio IRAY, did I waste my money?

24 Comments

  • Torquinox Posts: 3,571

    N00b4Ever said:

    Torquinox said:

    N00b4Ever said:

    IMO I wouldn't buy any XX60 if I need real performance, not even if it comes with a "ti" attached to it.  Take it as an expensive learning experience. 

    What is real performance? Are we discussing gaming performance or rendering performance? If you buy an 8GB card and it won't hold your scene, your DS performance is whatever your CPU can provide. If you buy a 3090, you have to be sure the rest of your system can support it, and total cost goes up considerably. Info online suggests 3080ti is a waste of money, might as well buy the 3090 for rendering.

    For the OP it's clearly rendering... for me it's both, as I'm a gamer as well (and throw in some video editing on the side). I jumped from my GTX970 to my RTX3090 precisely 'cause my GTX970 couldn't handle much. So perhaps I didn't choose the right word; I guess "real performance" was more of an overall performance, a GPU that can tackle most things. Now my problem is that the rest of my PC components are falling apart, but I'm waiting on DDR5 to become more popular next year so I can drain my bank account and upgrade everything around my GPU.

    But back to the topic: if I were the OP I wouldn't spend $1,000 for just 1 GB extra. And again, not on an XX60.

    Cheers.

    I see. Thanks for the reply. :)

  • Wow, thank you everyone for all of the assistance on this issue, I really appreciate it! ^_^ Sadly though guys, I had 'thought' that I did enough research to make such an expensive decision, but obviously I did not. -_- I did look at the benchmarks between the two cards and yes, the 1080ti still outperformed the RTX 3060 across the board. I guess I fell into the novice pitfall of assuming that newer cards must well exceed their older counterparts. But I was 100% wrong, I see! I do play online games, sure, but this card was purchased primarily for DAZ Studio Iray renders. I was reading lots of comments and opinions on which card works best (i.e. CUDA cores) when rendering, and I realize that the RTX 3060 and GTX 1080ti have identical CUDA core counts and one has 1 GB more VRAM than the other. I couldn't afford any of the higher-end models and should have just waited, as paying this much for a card that is probably worse than my old GTX 1080ti, well, as someone here said, 'chalk it up to experience' even if it is a HARD $1,000 one.

    I was thinking about pairing the two cards as well, but I don't know how to do that. My MB is SLI compatible and I do have room for both cards, but I also read that I need a bridge as well as an SLI connector. I cannot afford a bridge as Amazon has them for $100-$200, and I don't want to shell out that much just to connect both cards. I mean, is it really that bad to use the 3060 vs the 1080ti with IRAY? I was rendering full scenes with 2 Gen 8 or lower characters, clothes, hair, house, props, in about 5 1/2 hours with the 1080ti. I hadn't tested that with the RTX 3060, but as someone mentioned here, I should probably try it with a few different renders and see if the RTX shaves off a little time compared to the 1080ti. At least then I won't feel like I wasted $1,000 on a SEVERELY overpriced card, which deep down, I know is exactly what I did. >_<

     

    But thank you everyone for your assistance and opinions, I really appreciate all of the help! :)

     

    Merc 

  • PerttiA Posts: 10,024

    Mercblue22 said:

    I mean, is it really that bad to use the 3060 vs the 1080ti with IRAY? I was rendering full scenes with 2 Gen 8 or lower characters, clothes, hair, house, props, in about 5 1/2 hours with the 1080ti.

    That sounds like rendering done on the CPU instead of the GPU... I have a 2070 Super, and it rarely takes more than an hour, and that's with complex scenes.

    I still don't understand where you got the idea that the RTX 3060 performs worse than your old card in Iray rendering; the benchmark thread clearly shows that it renders the scene in half the time it takes on the 1080Ti.

  • kyoto kid Posts: 41,202

    Ghosty12 said:

    AgitatedRiot said:

    kyoto kid said:

    ...it will come in handy, particularly as you get more ambitious with your scenes, or if, say, a 16 or 20 GB card comes out. 

    RTX 3090s have 24 GB of VRAM. 

    The funny thing is that the 3090 here in Australia, while expensive as heck at $3,150 to $4,500 AUD, is more worth it because of the 24 GB of VRAM.. Although here one could buy a somewhat decent car for that price.. lol

    ...I actually was considering an RTX A4000 with 16 GB, as for the cost it would have given more performance per dollar than the prices being charged for 3060s (upwards of 800 - 900 USD, and a couple were about the same as the A4000s I was pricing).  The markup on the A-series cards is not as steep, pretty much reflecting the chip shortage's effects.  For example, the MSRP on the A4000 is 1,000 USD and I was seeing them offered for between 1,150 and 1,250 USD on average.  With 16 GB of VRAM and 6144 cores it would have been a reasonable jump to get a lower per-GB cost than the 3060s on the market. 
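That per-gigabyte comparison is easy to reproduce from the prices quoted above. A quick sketch; the midpoints of the quoted street-price ranges are my assumption, not figures from the post:

```python
def usd_per_gb(price_usd, vram_gb):
    """Cost per gigabyte of VRAM, the metric kyoto kid is weighing."""
    return price_usd / vram_gb

# Late-2021 prices quoted in the post above:
a4000_per_gb  = usd_per_gb(1200, 16)  # A4000, midpoint of the 1,150-1,250 USD range
scalped_3060  = usd_per_gb(850, 12)   # scalped 3060 12 GB, midpoint of 800-900 USD
msrp_3060     = usd_per_gb(409, 12)   # the EVGA 3060 actually bought

print(round(a4000_per_gb, 1), round(scalped_3060, 1), round(msrp_3060, 1))
```

At those midpoints the A4000 and a scalped 3060 land in roughly the same per-GB ballpark (about 75 vs 71 USD/GB), which is why the A4000 looked competitive, while the $409 3060 comes in at less than half either figure.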

    When I received an email from EVGA that they had a 3060 12 GB for me at 409 USD, I jumped on it, as while the A4000 is definitely superior, it was over an 800 USD difference and I could have it now rather than wait many months dining on "mac & orange powder in a box" until I could afford an A4000. Another matter is the impending loss of Iray support for Maxwell, as well as the VRAM used by OptiX Prime acceleration that is forced on all GTX series cards in the current version of Iray.

  • Ghosty12 Posts: 2,065
    edited November 2021

    kyoto kid said:

    Ghosty12 said:

    AgitatedRiot said:

    kyoto kid said:

    ...it will come in handy, particularly as you get more ambitious with your scenes, or if, say, a 16 or 20 GB card comes out. 

    RTX 3090s have 24 GB of VRAM. 

    The funny thing is that the 3090 here in Australia, while expensive as heck at $3,150 to $4,500 AUD, is more worth it because of the 24 GB of VRAM.. Although here one could buy a somewhat decent car for that price.. lol

    ...I actually was considering an RTX A4000 with 16 GB, as for the cost it would have given more performance per dollar than the prices being charged for 3060s (upwards of 800 - 900 USD, and a couple were about the same as the A4000s I was pricing).  The markup on the A-series cards is not as steep, pretty much reflecting the chip shortage's effects.  For example, the MSRP on the A4000 is 1,000 USD and I was seeing them offered for between 1,150 and 1,250 USD on average.  With 16 GB of VRAM and 6144 cores it would have been a reasonable jump to get a lower per-GB cost than the 3060s on the market. 

    When I received an email from EVGA that they had a 3060 12 GB for me at 409 USD, I jumped on it, as while the A4000 is definitely superior, it was over an 800 USD difference and I could have it now rather than wait many months dining on "mac & orange powder in a box" until I could afford an A4000. Another matter is the impending loss of Iray support for Maxwell, as well as the VRAM used by OptiX Prime acceleration that is forced on all GTX series cards in the current version of Iray.

    And this is the problem. I would have done similar and gone for a Quadro card, as they are more suited for this type of work, but this time around I had to work within a budget.. lol The 3080ti looks nice and all, but for what you get it is not worth the price being asked.. And this is why I bought a 3060: it was within my budget and it has a decent amount of VRAM on it..

    Here is a listing for an A5000 24GB https://www.mwave.com.au/product/leadtek-rtx-a5000-24gb-professional-video-card-ac46031 and when you look at the cost of the cheapest 3090 https://www.mwave.com.au/product/evga-geforce-rtx-3090-ftw3-gaming-24gb-video-card-ac38482 and the cheapest 3080ti https://www.mwave.com.au/product/gigabyte-geforce-rtx-3080-ti-eagle-12gb-video-card-ac45023, it is not bad at all..

    There is a listing for an A4000 but there is no stock; the interesting part is that it has more RAM and is miles cheaper than a 3080ti https://www.mwave.com.au/product/leadtek-nvidia-quadro-rtx-a4000-16gb-video-card-ac45138 and, like the A5000, they are more suited for heavy workloads like rendering and so on..

    Post edited by Ghosty12 on
  • Torquinox Posts: 3,571

    Mercblue22 said:

    I was thinking about pairing the two cards as well, but I don't know how to do that. My MB is SLI compatible and I do have room for both cards, but I also read that I need a bridge as well as an SLI connector. I cannot afford a bridge as Amazon has them for $100-$200, and I don't want to shell out that much just to connect both cards. I mean, is it really that bad to use the 3060 vs the 1080ti with IRAY? I was rendering full scenes with 2 Gen 8 or lower characters, clothes, hair, house, props, in about 5 1/2 hours with the 1080ti. I hadn't tested that with the RTX 3060, but as someone mentioned here, I should probably try it with a few different renders and see if the RTX shaves off a little time compared to the 1080ti. At least then I won't feel like I wasted $1,000 on a SEVERELY overpriced card, which deep down, I know is exactly what I did. >_<

    As I understand it, you can't pair them for pooled VRAM, but you can install both in your computer (power supply and slots permitting). If the scene fits in each card's VRAM, DS will use both cards to help with rendering. If the scene only fits in one card, DS will use that card. If the scene doesn't fit in either card, you're back to the CPU.
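That fallback behavior can be modeled as a simple per-device check. The sketch below only illustrates the logic described above, with hypothetical names; it is not an actual Daz Studio or Iray API:

```python
def pick_iray_devices(scene_vram_gb, cards):
    """Model of the fallback described above: every card whose VRAM can
    hold the scene participates in the render; if none can, rendering
    drops to the CPU. VRAM is never pooled across cards.
    (Hypothetical helper, not a real Daz Studio/Iray function.)"""
    usable = [name for name, vram_gb in cards.items() if scene_vram_gb <= vram_gb]
    return usable or ["CPU"]

cards = {"RTX 3060": 12, "GTX 1080 Ti": 11}
print(pick_iray_devices(8, cards))     # both cards share the render
print(pick_iray_devices(11.5, cards))  # only the 12 GB card holds the scene
print(pick_iray_devices(14, cards))    # neither fits, so CPU fallback
```

The practical upshot for the OP: a 12 GB 3060 keeps slightly larger scenes on the GPU than an 11 GB 1080ti, and installing both cards speeds up any scene that fits in both.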

  • MelissaGT Posts: 2,611

    PerttiA said:

    Mercblue22 said:

    I mean, is it really that bad to use the 3060 vs the 1080ti with IRAY? I was rendering full scenes with 2 Gen 8 or lower characters, clothes, hair, house, props, in about 5 1/2 hours with the 1080ti.

    That sounds like rendering done on the CPU instead of the GPU... I have a 2070 Super, and it rarely takes more than an hour, and that's with complex scenes.

    When I was rendering with my 1080TI, especially after the move past DS 4.11, scenes could take upwards of 4 or 5 (or more) hours to render. I'd commonly just let it go overnight. That was for one or two G8/8.1 figures, dForce clothing on both, SubD 4, dForce hair, etc. So, complex scenes. This scene would be an example... this one took several hours to render at ~5000px, and I used an older version of DS (4.12) because it wasn't possible to render at all on 4.15. It did render out on the GPU though, so no CPU rendering... I did one pass with everything but the horse's LAMH, and then I did a second render with the LAMH. When I moved to the 3090, I was able to render everything all at once, and it would have taken maybe 30 min (if that). So it's a big difference between the GTX and RTX architectures. 

  • PerttiA said:

    Mercblue22 said:

    I mean, is it really that bad to use the 3060 vs the 1080ti with IRAY? I was rendering full scenes with 2 Gen 8 or lower characters, clothes, hair, house, props, in about 5 1/2 hours with the 1080ti.

    That sounds like rendering done on the CPU instead of the GPU... I have a 2070 Super, and it rarely takes more than an hour, and that's with complex scenes.

    I still don't understand where you got the idea that the RTX 3060 performs worse than your old card in Iray rendering; the benchmark thread clearly shows that it renders the scene in half the time it takes on the 1080Ti.

    Which version of Daz Studio are you using (Help > About Daz Studio) and what driver version (right-click on the Windows desktop > NVIDIA Control Panel)? It does sound like the card is not being seen/used.

  • Wonderland Posts: 7,027
    edited November 2021

    I can't use DS at all with a 1080ti in DS 4.15. Even when I just turn Iray preview on, it immediately crashes. I'm using the DS 4.14 beta now and it doesn't crash as much, but even portraits with dForce hair and no clothes or just a top can take hours. Especially G8.1. There are some G8.1 characters I can't render at all on a 1080ti unless I lower the SubD to 1, or it crashes. Most things with fur crash. I can't afford to upgrade now, but I don't know how anyone is using a 1080ti in DS 4.15, especially with G8.1 characters or fur.

    Post edited by Wonderland on
  • marble Posts: 7,500

    Ghosty12 said:

    AgitatedRiot said:

    kyoto kid said:

    ...it will come in handy, particularly as you get more ambitious with your scenes, or if, say, a 16 or 20 GB card comes out. 

    RTX 3090s have 24 GB of VRAM. 

    The funny thing is that the 3090 here in Australia, while expensive as heck at $3,150 to $4,500 AUD, is more worth it because of the 24 GB of VRAM.. Although here one could buy a somewhat decent car for that price.. lol

    I count myself as one of the lucky ones. I had been bemoaning the lack of VRAM in my 1070, so my son suggested I give it to him as it was good enough for his gaming, and he then helped me (massively) to buy a 3090 as soon as they appeared here in New Zealand. I had sold my iMac in order to upgrade the PC with a decent CPU/RAM/motherboard/PSU, so basically I had a new PC. The price I paid for the 3090 was only a couple of hundred NZ$ over MSRP which, shortly after I bought it, shot up to well over NZ$1000 over MSRP. So I count myself lucky to have such a generous son and for getting the timing right.

    Render times are no longer a problem, but I wish it had made a bigger difference to dForce simulation times, which are still annoyingly slow.

  • MelissaGT Posts: 2,611
    edited November 2021

    marble said:

    Ghosty12 said:

    AgitatedRiot said:

    kyoto kid said:

    ...it will come in handy, particularly as you get more ambitious with your scenes, or if, say, a 16 or 20 GB card comes out. 

    RTX 3090s have 24 GB of VRAM. 

    The funny thing is that the 3090 here in Australia, while expensive as heck at $3,150 to $4,500 AUD, is more worth it because of the 24 GB of VRAM.. Although here one could buy a somewhat decent car for that price.. lol

    I count myself as one of the lucky ones. I had been bemoaning the lack of VRAM in my 1070, so my son suggested I give it to him as it was good enough for his gaming, and he then helped me (massively) to buy a 3090 as soon as they appeared here in New Zealand. I had sold my iMac in order to upgrade the PC with a decent CPU/RAM/motherboard/PSU, so basically I had a new PC. The price I paid for the 3090 was only a couple of hundred NZ$ over MSRP which, shortly after I bought it, shot up to well over NZ$1000 over MSRP. So I count myself lucky to have such a generous son and for getting the timing right.

    Render times are no longer a problem, but I wish it had made a bigger difference to dForce simulation times, which are still annoyingly slow.

    So many folks say that it is slow, but out of curiosity, what is slow? I've been loving dForce from the start and I typically simulate on a timeline over the course of 30 - 60 frames, and it doesn't feel slow to me. Maybe 5 min per simulation, if that. I can probably count on one hand the times it has taken longer than 10 or so minutes, and most of the time it's a Linday hair with geoshells that does it. I went from a 1080TI to a 3090 and simulation is still pretty good. What can slow it down is if you have a lot of morphs and JCMs on your figure... as it goes through the pose transition on the timeline and all those JCMs have to adjust with each frame, that can bog it down hardcore, but that's not the fault of dForce. I have the same problem with ActivePose... so many JCMs on certain characters... it's hideous how boggy it can get. 

    Post edited by MelissaGT on
  • outrider42 Posts: 3,679
    edited November 2021

    I used a 1080ti with 4.15, so it worked for me. I had two 1080tis. My issue is that I could get crashes when using both GPUs to render after a few renders. It doesn't make sense, because I could get these crashes even on small renders using less than half the VRAM capacity. But using one by itself never presented an issue. That is not optimal though, as the whole point was to have both GPUs run to go faster. This issue has been present for a while across several versions of Daz for me. I can do a few renders, but then seemingly at random it crashes. Sometimes it crashes on the first render; I just never know. I have done clean GPU driver installs. I have even built an entirely new PC! That this happens across multiple versions and multiple desktops would indicate Daz Studio's Iray has a real problem.

    I believe the issue is how Daz and Iray manage memory. I close Daz Studio and it takes FOREVER for Daz to exit my system RAM. The larger the scene, the longer it takes. It can take 10, 15 minutes for Daz to stop showing up on MS Task Manager. I can sit there with Task Manager on and watch as Daz VERY SLOWLY drops in RAM usage. Literally 50MB per second, if that. So do the math. I can have scenes that are 50GB in size, and closing Daz only 50MB ticks off per second. This is not an exaggeration. As the Task Manager shows Daz drop 50MB per second, a 50GB scene takes about 17 minutes to clear completely and drop off the Task Manager list. What the hell is going on here? No other program I have ever used works this way. When I close pretty much any other program, it leaves my RAM in seconds. Daz seriously needs to address this because users are getting sick of it.
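The back-of-the-envelope math in that post checks out, assuming the roughly 50 MB/s drain rate reported above:

```python
scene_gb = 50          # size of the scene in system RAM, per the post above
drain_mb_per_s = 50    # RAM release rate observed in Task Manager

seconds = scene_gb * 1024 / drain_mb_per_s   # 1024 MB per GB
minutes = seconds / 60
print(f"{minutes:.1f} minutes to clear")     # ~17.1 minutes, matching the post
```

So the "about 17 minutes" figure follows directly from the observed rate; the time to exit scales linearly with scene size at a fixed drain rate.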

    I managed to get a 3090 in October, and at the $1500 MSRP from Best Buy, who appears to be the only one to sell them for MSRP. I used the Hot Stock app to get the alerts, but it took a few months to finally score.

    The strange multiple GPU issue still happens. If I use both the 3090 and 1080ti I may get a crash. Sometimes it works, sometimes it doesn't. I tried the Iray benchmark and got a crash. That is why I didn't post a bench time with both the 3090 and 1080ti, they just crash. I don't know why. However, since the 3090 is more than twice as fast as my two 1080tis combined, it hasn't bothered me to leave the 1080ti out. Plus running both of these does warm up my computer room, LOL. I can use the 1080ti to do the viewport and stuff to give the 3090 as many resources as possible.

    Anyway, unless you get lucky like I did, today is the worst time in history to buy a GPU. It has never been like this before. If you can ride it out with what you have, I strongly suggest doing so. You could try using something like Hot Stock to grab one when they pop up, and that is about all you can do. Otherwise the prices are completely out of control.

    You may be able to play off these prices if you have something worth selling. The 1080ti, for example, can be sold for $700 on eBay with ease. That could take some of the sting out of overpaying for another card to replace it.

    It is very easy to use multiple GPUs with Iray. All you have to do is plug them in! No SLI is needed for Iray. BUT you need to verify your power supply can handle both of these cards. I would assume it can; if you had a 1080ti to begin with, you probably got a decent PSU. But you still need to check. Though I said I had some problems using 2 GPUs, it is still something I would do, because aside from the crashes, when it worked it doubled my rendering speed.

    The 3060 is almost certainly faster than the 1080ti at rendering Iray. The 3060 is not too far off the 1080ti at gaming, for that matter. There are games that are better optimized for the 3060.

    Additionally, if you ever happen to use Steam Remote Play, the experience with Ampere is much improved over the Pascal-based 1000 series. I use Steam Remote Coop sometimes; this is where you stream your game to your coop partner, and they do not even need a copy of the game because you are playing it off your machine like a server. My coop partners actually refused to play any more Remote Coop games because the experience was just that bad for them, even when playing simple 2D games that barely use the CPU/GPU. The image quality was blurry and the lag was very bad. When I built my new PC with a 5800X, that did not help Remote Play at all. I assumed it just wasn't possible, that something in our connection was the issue. But when I got my 3090, I was able to convince them to try Remote Play again. Not only was the image quality better on their end, he even said he couldn't feel any lag playing. The experience for him was near native quality. So this has been wonderful, as Remote Coop opens the doors to so many games that never had a proper online feature. Turing and Ampere each took steps to improve their streaming abilities, which makes sense given how popular game streaming is with Twitch and YouTube.
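On the PSU question raised above, a rough headroom check can be sketched from board-power (TDP) figures. The TDPs below are NVIDIA/AMD reference values (partner cards can draw more), and the keep-sustained-load-under-80% rule of thumb is my assumption; treat this as an estimate, not a substitute for checking your PSU's actual rating and connectors:

```python
# Reference board power draws in watts; factory-overclocked cards draw more.
GTX_1080_TI = 250
RTX_3060 = 170
RYZEN_5800X = 105        # the CPU mentioned in the post above
REST_OF_SYSTEM = 75      # rough allowance for board, RAM, drives, fans

total_w = GTX_1080_TI + RTX_3060 + RYZEN_5800X + REST_OF_SYSTEM
recommended_psu_w = total_w / 0.8   # stay under ~80% of the PSU's rating
print(total_w, "W peak draw ->", round(recommended_psu_w), "W PSU recommended")
```

By this estimate, a quality 750 W unit covers a 1080ti + 3060 pairing with headroom, which is consistent with the advice that a PSU bought for a 1080ti build is probably, but not certainly, adequate.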

    And then there is Iray. Obviously, Ampere gives you ray tracing cores, and that makes a massive difference in Iray. The actual speed improvement might vary by scene, because the amount of ray tracing versus shading can vary a lot. To put it simply, the more geometry you have in your scene, the more RTX can speed up the rendering.

    So there may be situations where the 1080ti gets close to the 3060. But I honestly doubt that is too common. When I get some time, I plan on doing some testing to try to work some of this out, since I still have a 1080ti and a 3090 installed.

    If you have both GPUs installed, then you also have the option of using one or the other to play games. The 1080ti should have the edge in most games, especially older ones. But some newer games may run better on the 3060, plus the 3060 gives you DLSS and ray tracing. With DLSS the 3060 might beat the 1080ti. And it will beat the 1080ti in any form of ray tracing if you were to use it in games. Though I am not sure you would want to use the 3060 for ray-traced games unless the effect is very light, as ray tracing is a very taxing process. Still, the 3060 can play some games with ray tracing on at 1080p and even 1440p. Also, the 3060 uses a lot less energy than the 1080ti, if that is any concern for you.

    Post edited by outrider42 on
  • marble Posts: 7,500
    edited November 2021

    MelissaGT said:

    marble said:

    Ghosty12 said:

    AgitatedRiot said:

    kyoto kid said:

    ...it will come in handy, particularly as you get more ambitious with your scenes, or if, say, a 16 or 20 GB card comes out. 

    RTX 3090s have 24 GB of VRAM. 

    The funny thing is that the 3090 here in Australia, while expensive as heck at $3,150 to $4,500 AUD, is more worth it because of the 24 GB of VRAM.. Although here one could buy a somewhat decent car for that price.. lol

    I count myself as one of the lucky ones. I had been bemoaning the lack of VRAM in my 1070, so my son suggested I give it to him as it was good enough for his gaming, and he then helped me (massively) to buy a 3090 as soon as they appeared here in New Zealand. I had sold my iMac in order to upgrade the PC with a decent CPU/RAM/motherboard/PSU, so basically I had a new PC. The price I paid for the 3090 was only a couple of hundred NZ$ over MSRP which, shortly after I bought it, shot up to well over NZ$1000 over MSRP. So I count myself lucky to have such a generous son and for getting the timing right.

    Render times are no longer a problem, but I wish it had made a bigger difference to dForce simulation times, which are still annoyingly slow.

    So many folks say that it is slow, but out of curiosity, what is slow? I've been loving dForce from the start and I typically simulate on a timeline over the course of 30 - 60 frames, and it doesn't feel slow to me. Maybe 5 min per simulation, if that. I can probably count on one hand the times it has taken longer than 10 or so minutes, and most of the time it's a Linday hair with geoshells that does it. I went from a 1080TI to a 3090 and simulation is still pretty good. What can slow it down is if you have a lot of morphs and JCMs on your figure... as it goes through the pose transition on the timeline and all those JCMs have to adjust with each frame, that can bog it down hardcore, but that's not the fault of dForce. I have the same problem with ActivePose... so many JCMs on certain characters... it's hideous how boggy it can get. 

     

    See - there's the difference - 5 minutes to simulate a dress is far too slow for me. I have Marvelous Designer 8 (*before* they introduced GPU simulation) and what takes minutes in dForce takes seconds in MD. I bought the VWD cloth sim product (unfortunately it crashed my PC, so I had to stop using it) but that was way faster than dForce too. And with both of those products, it is possible to manipulate the cloth while simulating - something that should be a basic requirement (I think the Blender cloth sim can do that too). 

    I'm told that dForce uses the GPU but I can't say that I have noticed a difference in simulation times compared to render times when I upgraded to a 3090. I'm almost convinced that there is no difference.

    By the way ... I feel guilty that I have MD sitting there and I still buy all clothing from the store instead of getting down to the task of learning how to make it myself.

    Post edited by marble on
  • outrider42 said:

    I used a 1080ti with 4.15, so it worked for me. I had two 1080tis. My issue is that I could get crashes when using both GPUs to render after a few renders. It doesn't make sense, because I could get these crashes even on small renders using less than half the VRAM capacity. But using one by itself never presented an issue. That is not optimal though, as the whole point was to have both GPUs run to go faster. This issue has been present for a while across several versions of Daz for me. I can do a few renders, but then seemingly at random it crashes. Sometimes it crashes on the first render; I just never know. I have done clean GPU driver installs. I have even built an entirely new PC! That this happens across multiple versions and multiple desktops would indicate Daz Studio's Iray has a real problem.

    I believe the issue is how Daz and Iray manage memory. I close Daz Studio and it takes FOREVER for Daz to exit my system RAM. The larger the scene, the longer it takes. It can take 10, 15 minutes for Daz to stop showing up on MS Task Manager. I can sit there with Task Manager on and watch as Daz VERY SLOWLY drops in RAM usage. Literally 50MB per second, if that. So do the math. I can have scenes that are 50GB in size, and closing Daz only 50MB ticks off per second. This is not an exaggeration. As the Task Manager shows Daz drop 50MB per second, a 50GB scene takes about 17 minutes to clear completely and drop off the Task Manager list. What the hell is going on here? No other program I have ever used works this way. When I close pretty much any other program, it leaves my RAM in seconds. Daz seriously needs to address this because users are getting sick of it.

    That is Daz Studio cleaning up the ERC links between properties; it affects load time and exit time, but Iray just gets the final shape (at the final resolution), it doesn't care about rigging or morphs. You will find that a geometry-heavy scene with few linked properties is likely to exit much more quickly than a relatively low-polygon figure with a lot of morphs.

    I managed to get a 3090 in October, and at the $1500 MSRP from Best Buy, who appears to be the only one to sell them for MSRP. I used the Hot Stock app to get the alerts, but it took a few months to finally score.

    The strange multiple GPU issue still happens. If I use both the 3090 and 1080ti I may get a crash. Sometimes it works, sometimes it doesn't. I tried the Iray benchmark and got a crash. That is why I didn't post a bench time with both the 3090 and 1080ti, they just crash. I don't know why. However, since the 3090 is more than twice as fast as my two 1080tis combined, it hasn't bothered me to leave the 1080ti out. Plus running both of these does warm up my computer room, LOL. I can use the 1080ti to do the viewport and stuff to give the 3090 as many resources as possible.

    Anyway, unless you get lucky like I did, today is the worst time in history to buy a GPU. It has never been like this before. If you can ride it out with what you have, I strongly suggest doing so. You could try using something like Hot Stock to grab one when they pop up, and that is about all you can do. Otherwise the prices are completely out of control.

    You may be able to play off these prices if you have something worth selling. The 1080ti, for example, can be sold for $700 on eBay with ease. That could take some of the sting out of overpaying for another card to replace it.

    It is very easy to use multiple GPUs with Iray. All you have to do is plug them in! No SLI is needed for Iray. BUT you need to verify your power supply can handle both of these cards. I would assume it can; if you had a 1080ti to begin with, you probably got a decent PSU. But you still need to check. Though I said I had some problems using 2 GPUs, it is still something I would do, because aside from the crashes, when it worked it doubled my rendering speed.

    The 3060 is almost certainly faster than the 1080ti at rendering Iray. The 3060 is not too far off the 1080ti at gaming, for that matter. There are games that are better optimized for the 3060.

    Additionally, if you ever happen to use Steam Remote Play, the experience with Ampere is much improved over the Pascal-based 1000 series. I use Steam Remote Coop sometimes; this is where you stream your game to your coop partner, and they do not even need a copy of the game because you are playing it off your machine like a server. My coop partners actually refused to play any more Remote Coop games because the experience was just that bad for them, even when playing simple 2D games that barely use the CPU/GPU. The image quality was blurry and the lag was very bad. When I built my new PC with a 5800X, that did not help Remote Play at all. I assumed it just wasn't possible, that something in our connection was the issue. But when I got my 3090, I was able to convince them to try Remote Play again. Not only was the image quality better on their end, he even said he couldn't feel any lag playing. The experience for him was near native quality. So this has been wonderful, as Remote Coop opens the doors to so many games that never had a proper online feature. Turing and Ampere each took steps to improve their streaming abilities, which makes sense given how popular game streaming is with Twitch and YouTube.

    And then there is Iray. Obviously, Ampere gives you ray tracing cores, and that makes a massive difference in Iray. The actual speed improvement might vary by scene, because the amount of ray tracing versus shading can vary a lot. To put it simply, the more geometry you have in your scene, the more RTX can speed up the rendering.

    So there may be situations where the 1080ti gets close to the 3060. But I honestly doubt that is too common. When I get some time, I plan on doing some testing to try to work some of this out, since I still have a 1080ti and a 3090 installed.

    If you have both GPUs installed, then you also have the option of using one or the other to play games. The 1080ti should have the edge in most games, especially older ones. But some newer games may run better on the 3060, plus the 3060 gives you DLSS and ray tracing. With DLSS the 3060 might beat the 1080ti. And it will beat the 1080ti in any form of ray tracing if you were to use it in games. Though I am not sure you would want to use the 3060 for ray-traced games unless the effect is very light, as ray tracing is a very taxing process. Still, the 3060 can play some games with ray tracing on at 1080p and even 1440p. Also, the 3060 uses a lot less energy than the 1080ti, if that is any concern for you.

  • kyoto kid Posts: 41,202
    edited November 2021

    ...as I mentioned on one of the other forums, before closing out I click "New" to clear the scene and scene data, after which I exit the programme and it clears from system memory much quicker. Of course I'm not yet up to doing 50GB scenes until I can afford to perform the planned upgrade to a Ryzen 9 5900X and 64 GB system memory.

    As to rendering performance, my old Titan-X doesn't do too bad in 4.15.0.30 save for the hit on VRAM, as OptiX Prime is always on for GTX cards (can't install any of the newer beta versions as they will overwrite the 4.11 beta I use for rendering without OptiX).  Single character render tests using a photo studio IBL, with simple clothing, and hair render in about a minute or two.  I rarely do portraits, so I have little need for SSS or fine details like pores and such, which lowers the texture load a bit.

    Interesting that on the Tech Power Up specs page, the 3060 12 GB is rated 7% below the 1080Ti for gaming purposes (in contrast, the Maxwell Titan-X is rated 39% below the 1080Ti).  Both the 1080 Ti and the 3060 12 GB have 3584 shading units (CUDA cores); however, in addition to the Tensor and RT cores, the 3060 also has a higher memory clock and boost clock. Used/refurbished 1080Tis are going for just over double what I paid for my 3060, and new ones are priced at over four times that.

    Post edited by kyoto kid on
  • Ghosty12 Posts: 2,065

    marble said:

    Ghosty12 said:

    AgitatedRiot said:

    kyoto kid said:

    ...it will come in handy, particularly as you get more ambitious with your scenes, or say a 16 or 20 GB card comes out. 

    The RTX 3090 has 24 GB of VRAM. 

     The funny thing here is that the 3090, while expensive as heck at $3150 to $4500 AUD, is more worth it here in Australia because of the 24 GB of VRAM.. Although one could buy a somewhat decent car for that price.. lol

     I count myself as one of the lucky ones. I had been bemoaning the lack of VRAM in my 1070, so my son suggested I give it to him, as it was good enough for his gaming, and he then helped me (massively) to buy a 3090 as soon as they appeared here in New Zealand. I had sold my iMac in order to upgrade the PC with a decent CPU/RAM/motherboard/PSU, so basically I had a new PC. The price I paid for the 3090 was only a couple of hundred NZ$ over MSRP which, shortly after I bought it, shot up to well over NZ$1000 over MSRP. So I count myself lucky to have such a generous son and for getting the timing right.

    Render times are now not a problem but I wish it had made a bigger difference to dForce simulation times which are still annoyingly slow.

    I love my 1070ti and still have it just in case, but I bought a 3060 for many reasons, from rendering speed to not having to worry so much about crunching down textures and so on for scenes.. And you were very lucky to get a 3090 cheap; my 3060 was somewhat reasonable at $1150 AUD. The other reason was that I was burning through the money I had set aside, so I bought it before I had no money left.. lol

  • kyoto kid Posts: 41,202

    ...that's about the price I'm seeing here in the States when converted to USD.  However, I believe you folks in Australia also have a VAT, correct?

  • Ghosty12 Posts: 2,065

    kyoto kid said:

    ...that's about the price I'm seeing here in the States when converted to USD.  However, I believe you folks in Australia also have a VAT, correct?

    Yeah, we have a 10% GST added on pretty much everything we buy.. Plus we are at the whims of everyone from the AIBs to the distributors as well, since we can't get, or find it difficult to get, Founders Edition cards here..

  • kyoto kid Posts: 41,202
    edited November 2021

    ...is that included in the price? 

    Yeah, Founders Editions tend to command a high price if you can find them, as they were pretty much snapped up quickly.  I got my 3060 XC from EVGA for as close to Nvidia's suggested price as I've seen (79 USD more) without having to "camp out" at a Best Buy overnight (I'm in my late 60s).

    Post edited by kyoto kid on
  • Ghosty12 Posts: 2,065
    edited November 2021

    kyoto kid said:

    ...is that included in the price? 

    Yeah, Founders Editions tend to command a high price if you can find them, as they were pretty much snapped up quickly.  I got my 3060 XC from EVGA for as close to Nvidia's suggested price as I've seen (79 USD more) without having to "camp out" at a Best Buy overnight (I'm in my late 60s).

    Yeah, GST is included in the final price.. I got my Gigabyte RTX 3060 Eagle 12G R2.0 at $1149 AUD, which at the time was considered cheap.. lol It was actually quite a chore to get one, as computer shops here were told by their suppliers that, to make sure folks could get new video cards, it was preferred that the customer buy a whole system in order to get one.. crying

    Post edited by Ghosty12 on
  • kyoto kid said:

    ...as I mentioned on one of the other forums, before closing out I click "New" to clear the scene and scene data, after which I exit the programme and it clears from system memory much quicker. Of course I'm not yet up to doing 50GB scenes until I can afford to perform the planned upgrade to a Ryzen 9 5900X and 64 GB system memory.

    For me File>New takes longer than just closing DS. As I said, there do seem to be other factors in play which mean that different people see different behaviours (some of us find applying a Character preset much slower than just loading the base figure and applying shaping and materials, others don't).

    As to rendering performance, my old Titan-X doesn't do too bad in 4.15.0.30 save for the hit on VRAM, as OptiX Prime is always on for GTX cards (can't install any of the newer beta versions as they will overwrite the 4.11 beta I use for rendering without OptiX).  Single character render tests using a photo studio IBL, with simple clothing, and hair render in about a minute or two.  I rarely do portraits, so I have little need for SSS or fine details like pores and such, which lowers the texture load a bit.

    Interesting that on the Tech Power Up specs page, the 3060 12 GB is rated 7% below the 1080Ti for gaming purposes (in contrast, the Maxwell Titan-X is rated 39% below the 1080Ti).  Both the 1080 Ti and the 3060 12 GB have 3584 shading units (CUDA cores); however, in addition to the Tensor and RT cores, the 3060 also has a higher memory clock and boost clock. Used/refurbished 1080Tis are going for just over double what I paid for my 3060, and new ones are priced at over four times that.

  • outrider42 Posts: 3,679

    That is Daz Studio cleaning up the ERC links between properties; it affects load time and exit time, but Iray just gets the final shape (at the final resolution), it doesn't care about rigging or morphs. You will find that a geometry-heavy scene with few linked properties is likely to exit much more quickly than a relatively low-polygon figure with a lot of morphs.

    I do not care what the reason for staying in memory is. A better question is WHY does the program need to clean up anything at all when it is being shut down? JUST SHUT DOWN.

    If I shut down Blender, it doesn't matter how massive the scene is. It doesn't matter if I have several millions of polygons, with lots of stuff going on, with lots of posing. BLENDER SHUTS DOWN. Oh, and these are Daz characters exported to Blender. Why is it that Blender handles Daz models better than Daz Studio handles Daz models? There is a fundamental flaw with the way Daz Studio handles these properties.

    And yes I am repeating myself on purpose to drive the point home. This is getting ridiculous. Daz Studio is becoming too clunky and bloated. Rendering speed is cool, but the program itself needs to be usable and responsive for its users. There is no good reason why it takes so long to load this stuff, nor is there any good reason for it to take so long shutting down.

    Interesting that on the Tech Power Up specs page, the 3060 12 GB is rated 7% below the 1080Ti for gaming purposes (in contrast, the Maxwell Titan-X is rated 39% below the 1080Ti).  Both the 1080 Ti and the 3060 12 GB have 3584 shading units (CUDA cores); however, in addition to the Tensor and RT cores, the 3060 also has a higher memory clock and boost clock. Used/refurbished 1080Tis are going for just over double what I paid for my 3060, and new ones are priced at over four times that.

    It is just like I said, the 1080ti is faster at gaming generally, but not that much faster, and the 3060 can even beat it in some new games. It depends on the game. So I don't really understand why some look at the 3060 as such a downgrade from the 1080ti for gaming... it isn't. If anything, they are pretty much the same performance-wise, and the latest games might run better on the 3060. It is interesting that the 3060 has the same number of shaders, but that is not important because architectures change so much between generations. What is interesting is to see the difference in power draw. The 3060 can use a full 100 watts less energy than the 1080ti, which could be a factor for some people.

    Because the 3060 is decently efficient, it is easier to double up if one desires. You can pop two 3060s into a PC and still use less energy than a 3080. While the 3080 will obviously torch the 3060 at gaming since there is no SLI anymore, the two 3060s will probably render faster than a single 3080 while using around the same energy AND having more VRAM, since the 3080 only has 10 GB. Plus, if market conditions were not so horrible, two 3060s might even be cheaper than a 3080. In the bench scene the 3080 did around 12 iterations per second, while the 3060 did 8, so two 3060s would beat a 3080 fairly easily here. 
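    To make that comparison concrete, here is a rough back-of-the-envelope sketch using the iteration rates quoted above (~12 it/s for the 3080, ~8 it/s for the 3060). Iray scales close to linearly across matched GPUs; the `efficiency` factor here is an assumed fudge for multi-GPU overhead, not a measured number:

    ```python
    # Back-of-the-envelope Iray multi-GPU estimate.
    # Assumes near-linear scaling across identical cards; the 0.95
    # efficiency factor is an assumption, not a benchmark result.
    def combined_rate(per_card_rate, num_cards, efficiency=0.95):
        """Estimated iterations per second for several identical GPUs."""
        return per_card_rate * num_cards * efficiency

    dual_3060 = combined_rate(8.0, 2)     # two 3060s sharing the render
    single_3080 = combined_rate(12.0, 1)  # one 3080 on its own

    print(f"2x 3060: ~{dual_3060:.1f} it/s vs 1x 3080: ~{single_3080:.1f} it/s")
    ```

    Even with some overhead assumed, the pair comes out comfortably ahead, which matches the claim above. Real results will vary by scene.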

    The 3060 also beat the 2080 from last gen in the Iray benchmark scene, which is pretty impressive, and again, with 12 GB, it offers more memory. So for Iray the 3060 makes even the 2080 obsolete. These two factors make the 3060 uniquely ideal for Iray, because of how important VRAM is to Iray. Because of how weird Nvidia had to be, the 3060 offers the 2nd largest amount of VRAM in the 3000 series so far. Only the 3080ti equals it, and only the 3090 offers more (much more).

    So I would not feel any shame in buying a 3060. Just know what you are getting, and if you do play video games, that might alter the choice a bit. There are countless gaming benchmarks out there to compare, but Iray doesn't really have anywhere to compare other than the little benchmark thread here at Daz. Iray is just too niche for hardware reviewers to take a closer look at it.

  • marble Posts: 7,500
    edited November 2021

    @outrider42 you make a good point about Blender shutdown. In fact, I can't remember ever having a delay shutting any other program down either. Not that I tend to play with other DAZ Studio/Blender type software. I've had big, bloated games on my PC which shut down instantly. And I have VR stuff for my Oculus Rift (including a "game" that uses DAZ content) with no problem. In fact, until I saw the discussions here about the clear-memory delays, I assumed it was a problem with my PC and was looking at all kinds of memory diagnostic utilities.

    I still don't understand why, if I remain in DAZ Studio and click "New", there is no delay. I can just start on a new scene. 

    Post edited by marble on
  • kyoto kid Posts: 41,202

    ...I've clicked "New" to clear the scene with even fairly involved scenes and Daz still closes out of system memory pretty quick.

  • UHF Posts: 515

    Sometimes I just kill Daz when it hangs on exit, or worse, when you try to cancel something... anything.  It's just not worth the wait.

  • Torquinox Posts: 3,571

    Kyoto Kid and Outrider42, thanks for your thoughts about cards and computers here. I find what you're saying quite reassuring. enlightened

  • jmtbank Posts: 175
    edited November 2021

    I had a 3060 recently and it was way faster than my old 1080ti, not far off twice the speed, as stated in the benchmark figures earlier in this thread.  I ran the benchmark underclocked to 120 watts but with a +1000 memory overclock, and got a slightly faster result than the ones in the linked benchmark thread.  The card is very efficient.

    I can also confirm that you have a lot more available memory in recent Daz revisions than the 12 vs 11 GB comparison would suggest, for the reasons kyoto kid has stated.  I had given up running recent Daz at all on my 1080ti and went back to CPU rendering.  The 3060 made GPU rendering viable again for me.

    That said, I think I'd rather do CPU rendering than spend the stated $1K on one.  I paid $400 for the 3060 a few months back when Ethereum bottomed at $1800.  Are the 3090 Founders just no longer dropping where you live?  You would pay the extra $500 for that instead of $1000 on a 3060.  At that price I would return the 3060, if it is still within the change-of-mind period and you still feel as you did when starting the thread.

    Post edited by jmtbank on
  • rrward Posts: 556

    Torquinox said:

    I've been shopping Alienware systems on the Dell site. If I spec a system with a 12GB 3060 and 64GB RAM, the 3090 is a $1274 upgrade, and I must add another ~$500 to get the RAM up to 128GB. I already selected cases with a 750W or 1000W PSU, depending on processor generation and all that. So, it's an $1800 premium to add in the 3090 and get max value out of it. 

    Don't buy from Dell until you look at videos put out by Gamer's Nexus. Dell's consumer machines, even the Alienware boxes, are really dodgy. 

  • Torquinox Posts: 3,571
    edited November 2021

    rrward said:

    Torquinox said:

    I've been shopping Alienware systems on the Dell site. If I spec a system with a 12GB 3060 and 64GB RAM, the 3090 is a $1274 upgrade, and I must add another ~$500 to get the RAM up to 128GB. I already selected cases with a 750W or 1000W PSU, depending on processor generation and all that. So, it's an $1800 premium to add in the 3090 and get max value out of it. 

    Don't buy from Dell until you look at videos put out by Gamer's Nexus. Dell's consumer machines, even the Alienware boxes, are really dodgy. 

    I see. Hmm... I've had Dells for a long time. What would you recommend? 

    Add:

    The Gamer's Nexus folks say ABS for build quality, but they don't seem to have AMD processors.

    Post edited by Torquinox on
  • Torquinox Posts: 3,571

    What do people here think about HP Omen computers? I can configure to suit and for less than Dell.
