RTX 2080ti or x2 2070

Hi


I'm looking into getting a new computer, and I understand that the more CUDA cores your graphics card has, the faster the render will be (in general).

If I buy two RTX 2070s I will have more CUDA cores than a single RTX 2080 Ti, and the price would be lower too.

Does DAZ Studio take advantage of dual cards, or does the software use only one card when rendering?


Thanks.

Comments

  • DAZ Studio will use both, but not both cards' memory! So with the 2080 Ti you will have more VRAM.

  • kenshaw011267 Posts: 3,805
    edited February 2020

    DAZ Studio will use both, but not both cards' memory! So with the 2080 Ti you will have more VRAM.

    Today. The Beta version says it supports VRAM pooling through NVLink, so two 2070s with an NVLink connector would get nearly 16 GB for rendering.

    But even now the two cards would be used.
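    The arithmetic behind that point can be sketched in a few lines of Python (an illustrative helper, not anything from DAZ Studio or Iray itself): without pooling, a scene must fit entirely on each card, so the smallest card's VRAM is the limit; with NVLink pooling the cards' memory roughly adds up.

    ```python
    # Rough usable-VRAM rule for multi-GPU rendering (assumption: without
    # pooling, the scene must fit on every card individually).
    def usable_vram_gb(cards_gb, pooled=False):
        """cards_gb: list of per-card VRAM in GB."""
        if not cards_gb:
            return 0
        # Pooled (NVLink): memory roughly adds up; unpooled: smallest card limits.
        return sum(cards_gb) if pooled else min(cards_gb)

    print(usable_vram_gb([8, 8]))               # two 2070s, no pooling -> 8
    print(usable_vram_gb([8, 8], pooled=True))  # two 2070s with NVLink -> 16
    print(usable_vram_gb([11]))                 # one 2080 Ti -> 11
    ```

    (Real pooling carries some overhead, hence "nearly 16 GB" rather than a full 16.)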

    Post edited by kenshaw011267.
  • DAZ Studio will use both, but not both cards' memory! So with the 2080 Ti you will have more VRAM.

    Today. The Beta version says it supports VRAM pooling through NVLink, so two 2070s with an NVLink connector would get nearly 16 GB for rendering.

    But even now the two cards would be used.

    Good news here...yes

  • alex86fire Posts: 1,130

    I would suggest one 2070 Super to start. If you can wait until the end of the year with it, see what the new generation brings; there are a lot of rumours of good things. If you can't wait, you can get a second 2070 Super.

    I have one at the moment and am very happy with the render times. There is a thread about render times around here, and for render speed I feel the 2080 Ti is not worth its price.

  • DAZ Studio will use both, but not both cards' memory! So with the 2080 Ti you will have more VRAM.

    Today. The Beta version says it supports VRAM pooling through NVLink, so two 2070s with an NVLink connector would get nearly 16 GB for rendering.

    But even now the two cards would be used.

    Are you sure that that will work with the 2070?

  • fastbike1 Posts: 4,078
    edited February 2020

    Definitely a 2070 Super instead of a 2070. Same price. Better card.

    FWIW, the Nvidia site shows NVLink support only for the 2080 Ti, 2080 Super, 2080, and 2070 Super.

    Post edited by fastbike1.
  • With my new 2080 Ti I'm able to render scenes in a tenth of the time compared to my 1070. But the price... the price... is sooooo insane.

  • Leana Posts: 11,821

    DAZ Studio will use both, but not both cards' memory! So with the 2080 Ti you will have more VRAM.

    Today. The Beta version says it supports VRAM pooling through NVLink, so two 2070s with an NVLink connector would get nearly 16 GB for rendering.

    But even now the two cards would be used.

    Are you sure that that will work with the 2070?

    According to this post it requires NVLink 2.0, so it works only with some cards:

    https://www.daz3d.com/forums/discussion/comment/5289361/#Comment_5289361

  • DAZ Studio will use both, but not both cards' memory! So with the 2080 Ti you will have more VRAM.

    Today. The Beta version says it supports VRAM pooling through NVLink, so two 2070s with an NVLink connector would get nearly 16 GB for rendering.

    But even now the two cards would be used.

    Are you sure that that will work with the 2070?

    I forgot the 2070 has no connector. My bad. If you intend to do this, get two 2070 Supers.
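    Putting the thread's compatibility notes together (a hypothetical helper, using the card list quoted from Nvidia's site earlier in the thread): only some 20-series cards carry the NVLink connector, and pooling needs it on both cards, plus a matching bridge.

    ```python
    # NVLink-capable 20-series cards, per the list quoted from Nvidia's
    # site in this thread (assumption: that list is current and complete).
    NVLINK_CAPABLE = {"RTX 2080 Ti", "RTX 2080 Super", "RTX 2080", "RTX 2070 Super"}

    def can_pool(card_a, card_b):
        # Both cards need the connector (a matching NVLink bridge is also required).
        return card_a in NVLINK_CAPABLE and card_b in NVLINK_CAPABLE

    print(can_pool("RTX 2070", "RTX 2070"))              # False: plain 2070 lacks the connector
    print(can_pool("RTX 2070 Super", "RTX 2070 Super"))  # True
    ```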

  • Gordig Posts: 10,174

    Has anyone posted any success stories of memory pooling in DS yet?

  • Gordig said:

    Has anyone posted any success stories of memory pooling in DS yet?

    Not that I've seen. Not sure how many people have the correct hardware.

  • Robinson Posts: 751
    edited February 2020

    It's a bad time to be buying right now, both CPU and GPU.  NVIDIA are going to release their next level stuff (30xx) first half of this year and AMD will release Zen 3 towards the end of the year.  I'd wait.

    Take this with a pinch of salt but I read, "From what we’ve heard thus far, the alleged 7nm graphics cards may be up to 50% faster than current Turing GPUs, although that may be when it comes to ray tracing only – improving the performance of the latter is apparently going to be a major focus for Ampere (which wouldn’t be surprising)."

    Post edited by Robinson.
  • kenshaw011267 Posts: 3,805
    edited February 2020
    Robinson said:

    It's a bad time to be buying right now, both CPU and GPU.  NVIDIA are going to release their next level stuff (30xx) first half of this year and AMD will release Zen 3 towards the end of the year.  I'd wait.

    Take this with a pinch of salt but I read, "From what we’ve heard thus far, the alleged 7nm graphics cards may be up to 50% faster than current Turing GPUs, although that may be when it comes to ray tracing only – improving the performance of the latter is apparently going to be a major focus for Ampere (which wouldn’t be surprising)."

    I strongly, strongly doubt that there will be Ampere cards for sale by the end of June. The absolute earliest Nvidia is going to announce the cards, which are still not officially announced even to the datacenter market, is GTC at the end of March, and those announcements have preceded the card releases by 3 to 6 months.

    Zen 3 is likely to be announced at Computex in the first week of June (they passed on announcing at CES in January, which has been their practice). Their lead time from announcing a product to releasing it has been roughly 6 months, which makes a release of Zen 3 before Christmas unlikely.

    If you wait for the next cool thing you'll never pull the trigger. Tech is always about to release the next cool thing.

    Post edited by kenshaw011267.
  • Robinson Posts: 751
    If you wait for the next cool thing you'll never pull the trigger. Tech is always about to release the next cool thing.

    I suppose the optimal point to pull the trigger depends on how much money you have and how often you can do it. My upgrade cycle is 4-5 years, which is an age in technology terms.

  • Robinson said:
    If you wait for the next cool thing you'll never pull the trigger. Tech is always about to release the next cool thing.

    I suppose the optimal point to pull the trigger depends on how much money you have and how often you can do it. My upgrade cycle is 4-5 years, which is an age in technology terms.

    That's not a terrible cycle if you buy above average when you do buy. If you bought a GTX 970 and an i7 5000-series in late 2014, you'd be fine in most tasks even today. Iray is pretty GPU-intensive, though, so that single 970 would not be holding up well; you might have pulled the trigger after only 4 years and gone to an R7 2700 and RTX 2070 in late 2018, in which case you'd be in very good shape for at least the next 2 years. I hope so, as that's the hardware I have and I really don't want to build a new system before 2022 or '23.

  • fastbike1 Posts: 4,078

    @kenshaw011267 "That's not a terrible cycle if you buy above average when you do buy."

    True that. I went from a 780 Ti to a 980 Ti a couple of months after the 10XX GPUs were available. I'm about due for a new system, since it dates from the 780 Ti. The new one will have at least a 2080 Super (but probably a 2080 Ti).
