New AMD hardware & DAZ3D

Hi folks.

I'm about to buy a brand-new desktop for my DAZ3D venture and noticed that AMD is about to release some awesome new hardware: Ryzen processors and Navi GPUs.

(Scheduled for July 7th)

My main question is: Will this new hardware work nicely with the DAZ3D software and enable lightning-fast rendering, etc.?

(I guess there will be teething problems, but that's not unexpected.)

Happy to pile a shedload of money into this new hardware IF it all works well with the software.

Any feedback is most welcome.

Thanks, N.

Comments

  • kenshaw011267 Posts: 3,805

    The Navi GPUs will not work with Iray at all.

    The Ryzen 3000 CPUs will be very nice, but don't expect some giant increase in rendering speed from these CPUs in Iray. If you render in 3Delight you should see some gains if your present hardware is older.

  • LenioTG Posts: 2,118

    You can pick a nice Ryzen 3000 CPU without any problem, but you absolutely need an Nvidia GPU to use Iray properly!!!

    It's a good idea to wait for AMD GPUs to come out anyway, since Nvidia may lower the prices accordingly (most GPUs are sold to gamers, and they can use AMD as well).

  • Not a huge fan of Daz relying on such a specific brand of GPUs (and especially architecture, since RTX cards are only supported in the next version of DS, currently in beta).  I've never been a brand loyalist, and even have an Nvidia card in my current machine, but if I wanted to upgrade - specifically for Daz Studio - I'd have to pay an outrageous price.  I can see the tides turning very soon, since AMD supposedly boasts 1080-level performance at a much more affordable price; whereas if you wanted a 1080, you'd have to pay an inflated price since Nvidia pulled production.  For people who don't want to pay extra for the ray-tracing features, that doesn't leave very many options.

  • fastbike1 Posts: 4,078

    @harrcj101

    There are RTX cards at all price levels comparable to the 1000 series.

  • Sure, but if you aren't looking for 'real-time ray tracing in video games,' you'd be paying a lot for a feature you don't want.  It's not even a widely used feature yet, and you wouldn't be able to benefit from it much if you went for a 2050.

    To be a little more clear, I was referring to the GTX 1660/Ti as the only reasonably-priced card for the Daz-Studio-But-Not-Gaming Enthusiast.  I kinda wish they'd make a 1680.

  • kenshaw011267 Posts: 3,805

    You are aware an RTX-enabled version of Iray is due in the next six months?

  • I even acknowledged it in my first post.

  • kenshaw011267 Posts: 3,805
    harrcj101 said:

    I even acknowledged it in my first post.

    No, you didn't. Supporting the RTX cards is not the same as enabling RTX features on them in Iray. When RTX Iray comes out, the RTX cards will, based on the V-Ray results, see somewhere between a 40% and 100% increase in rendering performance. Recommending that people not buy them makes no sense.

  • mclaugh Posts: 221
    harrcj101 said:

    I even acknowledged it in my first post.

    No, you didn't. Supporting the RTX cards is not the same as enabling RTX features on them in Iray. When RTX Iray comes out, the RTX cards will, based on the V-Ray results, see somewhere between a 40% and 100% increase in rendering performance. Recommending that people not buy them makes no sense.

    It doesn't make sense for YOUR use case; there are lots of other use cases where buying them doesn't make sense.

  • harrcj101 said:

    I even acknowledged it in my first post.

    No, you didn't. Supporting the RTX cards is not the same as enabling RTX features on them in Iray. When RTX Iray comes out, the RTX cards will, based on the V-Ray results, see somewhere between a 40% and 100% increase in rendering performance. Recommending that people not buy them makes no sense.

    Well, I guess that's a problem of equivocation then.  I thought I was making an umbrella statement that included RTX support (as in, it works in DS 4.11) AND taking advantage of the features. I'll be more specific next time.  Speaking of misunderstanding, where did I recommend that people not buy them?

  • nicstt Posts: 11,715

    I'm sympathetic with the OP.

    I bought a 980 Ti and have had good use out of it; to get the same value out of a 2080 Ti or another 20-series card, I'd need to keep it nearly twice as long before upgrading. So I'm still thinking about it, including watching how Octane does with enabling AMD cards; that is much more interesting. I wish Daz would better integrate Octane into Studio.

  • kenshaw011267 Posts: 3,805
    harrcj101 said:
    harrcj101 said:

    I even acknowledged it in my first post.

    No, you didn't. Supporting the RTX cards is not the same as enabling RTX features on them in Iray. When RTX Iray comes out, the RTX cards will, based on the V-Ray results, see somewhere between a 40% and 100% increase in rendering performance. Recommending that people not buy them makes no sense.

    Well, I guess that's a problem of equivocation then.  I thought I was making an umbrella statement that included RTX support (as in, it works in DS 4.11) AND taking advantage of the features. I'll be more specific next time.  Speaking of misunderstanding, where did I recommend that people not buy them?

    "To be a little more clear, I was referring to the GTX 1660/Ti as the only reasonably-priced card for the Daz-Studio-But-Not-Gaming Enthusiast.  I kinda wish they'd make a 1680."

  • kenshaw011267 Posts: 3,805

    This claim keeps coming up. I work in a datacenter as its IT manager, and we run a lot of CUDA applications for clients. To the best of my knowledge, none of that software was written with support from Nvidia; I've certainly never dealt with anyone from Nvidia except when ordering cards, and none of the devs I've spoken to about problems with their software have ever said anything about being in contact with Nvidia. Since Nvidia makes at least an order of magnitude more from datacenter and other Quadro sales than it does from gaming-card sales, I'd think that if Nvidia really were doing what is claimed, it would be doing it in the datacenter market.

    Has Nvidia invested a small amount, compared to what the company makes, in getting game companies to adopt its specific techs? Yes. But that hasn't killed off Radeon. What is killing Radeon is years and years of not coming anywhere near the performance of Nvidia's flagship products. When was the last time a Radeon card matched the xx80 Ti card of the latest generation? Maybe the 780 Ti and the R9 290X? I won't even hazard a guess as to when it was true on the professional side. Radeon Pros and FirePros haven't been comparable to the highest-end Quadros since at least 2007, when CUDA was introduced.

    Are the Turing gaming cards overpriced? Yes. But that is not true of older generations of cards. The 1080 Ti's performance, and its uplift over earlier cards, made it a real bargain for a gamer or content creator not willing to spend on a Quadro. Can anyone really argue that the $700 price of a 1080 Ti wasn't worth it compared to either the RX 590 or the Vega 64 if you wanted to game at 4K, or at high refresh rates at lower resolutions, or were a content creator? TBH, Nvidia's execs are probably regretting the decision to end production of Pascal chips. People have been very slow to buy Turing, as the imminent price drop shows, while there is a lot of evidence that in both the commercial and pro spaces Pascal was their best-selling microarchitecture. I know that of the several hundred Quadros in my center, fewer than 50 are RTX cards, while I'm pretty sure even fewer are Maxwell or Kepler. IIRC we have one GV100, so Volta was a complete bust for Nvidia, which they clearly knew, since they never even tried to expand the line.

    Look at what is finally coming out of AMD next month: Navi's top card at release is a 2070 competitor. That might be where the meat of the market is, and AMD may simply have decided not to compete against Nvidia's top tier because the number of sales wouldn't justify the costs, but I can guarantee you that flagships drive the market even if most people don't buy them. The reviews will certainly make clear that the card is outclassed by the 2080 and 2080 Ti, and if the RTX price drop does happen, it may well be more expensive than the 2070. That is no way to gain market share.

  • Psyckosama Posts: 495
    edited June 2019

    This claim keeps coming up. I work in a datacenter as its IT manager, and we run a lot of CUDA applications for clients. To the best of my knowledge, none of that software was written with support from Nvidia; I've certainly never dealt with anyone from Nvidia except when ordering cards, and none of the devs I've spoken to about problems with their software have ever said anything about being in contact with Nvidia. Since Nvidia makes at least an order of magnitude more from datacenter and other Quadro sales than it does from gaming-card sales, I'd think that if Nvidia really were doing what is claimed, it would be doing it in the datacenter market.

    Has Nvidia invested a small amount, compared to what the company makes, in getting game companies to adopt its specific techs? Yes. But that hasn't killed off Radeon. What is killing Radeon is years and years of not coming anywhere near the performance of Nvidia's flagship products. When was the last time a Radeon card matched the xx80 Ti card of the latest generation? Maybe the 780 Ti and the R9 290X? I won't even hazard a guess as to when it was true on the professional side. Radeon Pros and FirePros haven't been comparable to the highest-end Quadros since at least 2007, when CUDA was introduced.

    Are the Turing gaming cards overpriced? Yes. But that is not true of older generations of cards. The 1080 Ti's performance, and its uplift over earlier cards, made it a real bargain for a gamer or content creator not willing to spend on a Quadro. Can anyone really argue that the $700 price of a 1080 Ti wasn't worth it compared to either the RX 590 or the Vega 64 if you wanted to game at 4K, or at high refresh rates at lower resolutions, or were a content creator? TBH, Nvidia's execs are probably regretting the decision to end production of Pascal chips. People have been very slow to buy Turing, as the imminent price drop shows, while there is a lot of evidence that in both the commercial and pro spaces Pascal was their best-selling microarchitecture. I know that of the several hundred Quadros in my center, fewer than 50 are RTX cards, while I'm pretty sure even fewer are Maxwell or Kepler. IIRC we have one GV100, so Volta was a complete bust for Nvidia, which they clearly knew, since they never even tried to expand the line.

    Look at what is finally coming out of AMD next month: Navi's top card at release is a 2070 competitor. That might be where the meat of the market is, and AMD may simply have decided not to compete against Nvidia's top tier because the number of sales wouldn't justify the costs, but I can guarantee you that flagships drive the market even if most people don't buy them. The reviews will certainly make clear that the card is outclassed by the 2080 and 2080 Ti, and if the RTX price drop does happen, it may well be more expensive than the 2070. That is no way to gain market share.

    Three things.

    1) They're going for a mid-range card because that's what people buy.

    2) AMD cards are actually monstrously good at compute. I have a friend who owns both a Radeon VII and an RTX 2080 Ti. In compute-based operations like liquid simulation, the Radeon eats the 2080 Ti alive. It also hasn't had to be RMAed twice.

    3) CUDA is not made of magical marshmallows. Programming something that is platform-agnostic is in no way a detriment to anyone but Nvidia.

  • kenshaw011267 Posts: 3,805

    This claim keeps coming up. I work in a datacenter as its IT manager, and we run a lot of CUDA applications for clients. To the best of my knowledge, none of that software was written with support from Nvidia; I've certainly never dealt with anyone from Nvidia except when ordering cards, and none of the devs I've spoken to about problems with their software have ever said anything about being in contact with Nvidia. Since Nvidia makes at least an order of magnitude more from datacenter and other Quadro sales than it does from gaming-card sales, I'd think that if Nvidia really were doing what is claimed, it would be doing it in the datacenter market.

    Has Nvidia invested a small amount, compared to what the company makes, in getting game companies to adopt its specific techs? Yes. But that hasn't killed off Radeon. What is killing Radeon is years and years of not coming anywhere near the performance of Nvidia's flagship products. When was the last time a Radeon card matched the xx80 Ti card of the latest generation? Maybe the 780 Ti and the R9 290X? I won't even hazard a guess as to when it was true on the professional side. Radeon Pros and FirePros haven't been comparable to the highest-end Quadros since at least 2007, when CUDA was introduced.

    Are the Turing gaming cards overpriced? Yes. But that is not true of older generations of cards. The 1080 Ti's performance, and its uplift over earlier cards, made it a real bargain for a gamer or content creator not willing to spend on a Quadro. Can anyone really argue that the $700 price of a 1080 Ti wasn't worth it compared to either the RX 590 or the Vega 64 if you wanted to game at 4K, or at high refresh rates at lower resolutions, or were a content creator? TBH, Nvidia's execs are probably regretting the decision to end production of Pascal chips. People have been very slow to buy Turing, as the imminent price drop shows, while there is a lot of evidence that in both the commercial and pro spaces Pascal was their best-selling microarchitecture. I know that of the several hundred Quadros in my center, fewer than 50 are RTX cards, while I'm pretty sure even fewer are Maxwell or Kepler. IIRC we have one GV100, so Volta was a complete bust for Nvidia, which they clearly knew, since they never even tried to expand the line.

    Look at what is finally coming out of AMD next month: Navi's top card at release is a 2070 competitor. That might be where the meat of the market is, and AMD may simply have decided not to compete against Nvidia's top tier because the number of sales wouldn't justify the costs, but I can guarantee you that flagships drive the market even if most people don't buy them. The reviews will certainly make clear that the card is outclassed by the 2080 and 2080 Ti, and if the RTX price drop does happen, it may well be more expensive than the 2070. That is no way to gain market share.

    Three things.

    1) They're going for a mid-range card because that's what people buy.

    2) AMD cards are actually monstrously good at compute. I have a friend who owns both a Radeon VII and an RTX 2080 Ti. In compute-based operations like liquid simulation, the Radeon eats the 2080 Ti alive. It also hasn't had to be RMAed twice.

    3) CUDA is not made of magical marshmallows. Programming something that is platform-agnostic is in no way a detriment to anyone but Nvidia.

    1) I acknowledged that that might be the reason. However, they're going to be savaged in reviews, which will cost them sales.

    2) No idea what software your friend is using, but I have a datacenter full of servers running all kinds of software and not a single Radeon Pro, because no customer has ever specified that they wanted one.

    3) Actually, programming portable compute code is a big problem. There is no real equivalent to DirectX or Vulkan on the compute side, so coders have to write their code for the specific features of Nvidia or AMD, or write the code twice.
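
For anyone unfamiliar with what "writing the code twice" means in practice, here is a minimal, purely illustrative sketch (not taken from any post in this thread): the vector-add kernel and host setup below are written against Nvidia's CUDA runtime, so running the same math on an AMD card would mean porting it to HIP or OpenCL, or maintaining a second code path.

```cuda
// Illustrative only: a trivial element-wise add written against Nvidia's CUDA
// runtime. The kernel syntax (__global__, <<<grid, block>>>) and the host calls
// (cudaMallocManaged, cudaDeviceSynchronize) are CUDA-specific; an AMD build
// would need HIP or OpenCL equivalents, or a second code path.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Unified (managed) memory keeps the host code short for this sketch.
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int block = 256;
    const int grid = (n + block - 1) / block;       // enough blocks to cover n
    vecAdd<<<grid, block>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);                    // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

AMD's HIP project does provide tools (such as hipify) for translating this style of code, but Iray itself is built on CUDA, which is why the thread keeps coming back to the Nvidia requirement.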
