Quick question on nVidia cards

I have always believed in buying the best you can get for your budget. Looking around on Newegg, I see that I can either get a 1070 8GB or a 2060 6GB. From what I can see, they have the same number of CUDA cores and seem to run at the same speed. So logic suggests that the 1070 is actually the better card, because it has more RAM. So, before I drop this kind of money, I thought I would ask: Is there any reason to pick the 2060 over the 1070? I've been brand loyal to ATI for decades, even after AMD bought them. nVidia has done a great job of crowding everyone else out of the market, and I've decided that faster render times would, indeed, be worth a switch. Thank you for your help.

Comments

  • FSMCDesigns Posts: 12,781

    The 2060 is newer tech, but all Iray cares about is DDR and CUDA, so I would go with the 1070.

  • Thanks
  • kenshaw011267 Posts: 3,805

     Newer-generation CUDA cores are faster, and the 20xx cards use GDDR6 while the 10xx cards use GDDR5. So if you can fit your scenes in 6GB, the 2060 should render faster than the 1070, and that assumes the RTX features are never enabled for the 20xx cards. If the RTX features ever get enabled, that would provide a further substantial render speed improvement. So the question has to be: how important is 8GB to you?

  •  Newer-generation CUDA cores are faster, and the 20xx cards use GDDR6 while the 10xx cards use GDDR5. So if you can fit your scenes in 6GB, the 2060 should render faster than the 1070, and that assumes the RTX features are never enabled for the 20xx cards. If the RTX features ever get enabled, that would provide a further substantial render speed improvement. So the question has to be: how important is 8GB to you?

    The GDDR6 comes with higher latency, too. I think the extra $60 for a feature (RTX) that doesn't even work right now isn't worth it. I also didn't see any proof that the CUDA cores are actually different. Clock speed is the key, and they were identical. I think the bigger difference is a smaller process node, which doesn't have a net effect on anything. Same thing with AMD cards: some generations are literally the same as the previous one, but with a more efficient build design, maybe with a different DirectX version enabled (which is more driver-based).

    I don't use 6GB now, doing CPU-only renders, so I figure the extra 2GB isn't that big a deal. I do figure that the xx60 line is the "budget-minded" line. Generally, these come with fans that aren't as good, weaker heat-pipe designs, etc. The real difference between most budget cards and the next grade up is almost always the cooling apparatus, which is why you'll hear about folks who go to water cooling and push budget cards well beyond their rated speeds via overclocking.

    I figure by the time RTX actually does anything, the price will have come down. Then again, I might also find that rendering a 4K scene on a 1070 is already so much better than the CPU-only renders I do now that I won't see any reason to switch.

    If all Iray cares about is CUDA and RAM, then the 1070 will outperform the 2060 because of the RAM latency. All that other stuff has no effect on Iray. Heck, right now I'm rocking a fantastic (now one generation old) top-of-the-line AMD card which TECHNICALLY had much greater specs than a 1070. Only the software doesn't care, because it wants CUDA. I'm a bit unhappy that Daz doesn't support the AMD render engine, but hey. Gotta work with what I've got.

    I'm starting work on a webcomic, and these 6-, 7-, 9-hour render times for simple 1920x1080 images are killing me. When you need 5-7 panels per page... ugh. And I refuse to reduce quality further. So I'm making the switch to a company that I dislike, which charges a lot more for cards with technically lower specs and gets away with it because they sponsor everything. They push companies to write software to be "nVidia optimized," which is a fancy way of saying "make it suck on AMD-based systems."

    Anyway, not meaning to rant. I just wanted to figure out if spending the extra money on the 2060 was going to be worth it. Since Daz Studio only cares about CUDA cores and RAM size/speed, I have my answer.

  • damselnoir007 Posts: 24
    edited March 2019

    Regardless of RAM and speed, MAKE SURE the card you pick works with the nVidia render feature of DAZ3D. I have a 2080 in my computer, and after nVidia's last software update, it stopped rendering for DAZ. I even rolled the nVidia driver back to the original version, but no luck. Pretty fricking shitty on nVidia's part, so be careful.

  • Richard Haseltine Posts: 102,730

    Regardless of RAM and speed, MAKE SURE the card you pick works with the nVidia render feature of DAZ3D. I have a 2080 in my computer, and after nVidia's last software update, it stopped rendering for DAZ. I even rolled the nVidia driver back to the original version, but no luck. Pretty fricking shitty on nVidia's part, so be careful.

    417.71 and the new 419.x should be OK; any 418.x is known to be bad (for Iray, and I think Octane, in general, not just in DS).
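    If you're not sure which driver version you're currently on, GPU-Z or the nVidia control panel will show it; as a programmatic alternative, here is a minimal sketch using the nvidia-ml-py (pynvml) Python package (my assumption, not something mentioned in the thread):

        # check_driver.py - print the installed NVIDIA driver version via NVML
        import pynvml

        pynvml.nvmlInit()
        version = pynvml.nvmlSystemGetDriverVersion()
        if isinstance(version, bytes):  # older pynvml releases return bytes
            version = version.decode()
        print("NVIDIA driver version:", version)  # e.g. 417.71
        pynvml.nvmlShutdown()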

  • I got the new 1070. Started a render. I have Task Manager open to see how it's doing. At 11.5 minutes, I'm at 3% of the render, and it says the GPU is working at 0%. I'm a bit confused. It must be doing something, because I took the CPU out of the render. Shouldn't it be pushing the GPU? Have I done something wrong? Have I discovered another installation bug to eradicate? Or is this normal?
  • SixDs Posts: 2,384

    There will usually be a slight delay at the beginning as everything loads into memory, but not what you have indicated. I would suggest that you have a look at the DAZ Studio log file (Help > Troubleshooting > View Log File; scroll right to the bottom) to see if it can provide some insight as to what is going on.

  • Richard Haseltine Posts: 102,730

    I recommend using GPU-Z, not Task Manager, to monitor this: https://www.techpowerup.com/download/techpowerup-gpu-z/
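    A likely explanation for the 0% reading: Task Manager's GPU graphs default to the 3D/Copy/Video engines, and CUDA compute work only appears if you switch one of the graphs to "Cuda" via its dropdown. If you'd rather poll from a script than install GPU-Z, here is a minimal sketch, again assuming the nvidia-ml-py (pynvml) package; it reads the same utilization and VRAM numbers GPU-Z reports:

        # gpu_watch.py - poll GPU utilization and VRAM use once per second via NVML
        import time
        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; change for multi-GPU rigs
        try:
            while True:
                util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # .gpu is a percentage
                mem = pynvml.nvmlDeviceGetMemoryInfo(handle)         # .used/.total in bytes
                print(f"GPU {util.gpu:3d}%  VRAM {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GB")
                time.sleep(1)
        except KeyboardInterrupt:
            pass
        finally:
            pynvml.nvmlShutdown()

    Run it in a console while the render is going; the GPU percentage should climb once Iray actually starts iterating, and the VRAM figure will also tell you whether a scene would fit in the 2060's 6GB.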
