How Many GPUs Can Daz Really Use for Iray?

In anticipation of a massive NVIDIA 1080 Ti price drop, I'm dreaming about a little mini Iray render farm.

But I've heard that Iray is restricted by the number of threads on the processor... or is it the number of cores?

So, for example, if you got a mining motherboard that could host 19 GPUs, would you be restricted to the 16 threads the processor could supply?

Or is there a bottleneck in Windows 10 or the NVIDIA drivers?

I've read elsewhere that you can only use 8 NVIDIA GPUs in any Windows 10 motherboard setup. Has anyone else looked into this and come to any conclusions?

Comments

  • Daz Jack Tomalin Posts: 13,501
    edited August 2020

    It really is theoretical vs. actual...

    I couldn't get a machine stable with 7 GPUs... so your mileage will definitely vary. I think if you run some enterprise-level hardware (motherboard etc.) and Linux, you'll be OK... but I certainly had very little luck with a Windows build that should have worked on paper (Asus X299 SAGE/10).

  • kenshaw011267 Posts: 3,805

    Iray's technical documentation says 1 thread per GPU.

    No mining motherboard supports a CPU with enough threads to fully populate all of its PCIe connections in Iray.

    There has been a bunch of very confusing info about the number of GPUs supported by Win10. 8 is definitely supported; more might be, but I've been unable to verify it.
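
    If you want to sanity-check that budget on a given box, here's a minimal sketch (hypothetical; it assumes Python is installed, nvidia-smi is on the PATH, and it takes the 1-thread-per-GPU rule above at face value) that compares visible GPUs against available CPU threads:

        import os
        import subprocess

        # List NVIDIA GPUs visible to the driver, one line per card.
        gpus = subprocess.run(
            ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.splitlines()

        threads = os.cpu_count()  # logical CPU threads

        # Assumption from this thread: Iray wants one CPU thread per GPU.
        usable = min(len(gpus), threads)
        print(f"{len(gpus)} GPU(s) visible, {threads} thread(s): "
              f"at most {usable} usable under a 1-thread-per-GPU rule.")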

  • I've seen a few mining motherboards that can support up to 19 GPUs. They all use LGA 1151, and the most threads on any of those processors is 16. But I've heard there's some sort of bottleneck on the number of NVIDIA cards that can actually be used. Maybe 8 is the limit?

  • I think the limit is in Iray's implementation, not in CUDA itself, which is what the mining boards would be using.

  • The socket is nearly irrelevant; what matters is the chipset. I think those were all B250, which would max out at the i7-7700, and that's 4c/8t. So you'd be stuck at 8 GPUs no matter how many PCIe risers you had.

  • Biostar seems to have a few motherboards that take 9th-gen LGA 1151 CPUs, so this one could conceivably run 16 GPUs, and I have read reports that the NVIDIA GPU limits are gone now in Win10:

    https://www.biostar.com.tw/app/en/mb/introduction.php?S_ID=928

    Irrespective of that, I wonder if a device like this could actually bypass the Iray thread requirement?

    https://www.newegg.com/p/0EP-00MN-00323

  • That could do 16 GPUs with a 9900.

    No, that card just splits an x16 slot into 8 USB 3 ports. Those can then be converted back to PCIe risers by other adapters, but Iray would still see the individual GPUs, so it wouldn't get around the thread requirement.
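
    As a quick way to see that for yourself, a sketch along these lines (again hypothetical Python, assuming nvidia-smi is available) lists each GPU with its PCIe bus ID; cards hanging off risers or adapter cables still enumerate as ordinary individual PCIe devices:

        import subprocess

        # Each card reports its own PCIe bus ID, regardless of whether it
        # sits in a slot, on a riser, or behind a USB-style adapter cable.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=index,name,pci.bus_id",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout

        for line in out.splitlines():
            print(line)  # e.g. 0, GeForce GTX 1080 Ti, 00000000:01:00.0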

  • Geez, I just noticed the GeForce 3090 is going to have over 10,000 CUDA cores per card. Maybe time to rethink this endeavor. :-D

  • I don't think the 3090 is worth the $1,499 asking price unless you absolutely need the 24GB, which seems unlikely. For the price of two 3090s at $3,000 and only 20k CUDA cores, why not just pick up four 3080s and have 34,816 CUDA cores instead?

    Personally, I already have two 2080 Ti cards and want to get a single 3080 to replace them, but I also don't want to re-sell or just get rid of my once-very-expensive 2080 Ti cards. So the plan is to experiment with adding them to eGPU enclosures so that I can have three cards running in tandem, pushing just over 17k cores, for only the cost of the enclosures and the new 3080 at $699. That would literally double the performance of what I have now, which is kind of what I wanted to begin with.
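
    To make that math explicit, here's a quick back-of-the-envelope sketch (a hypothetical Python snippet, using the published core counts: 10,496 for the 3090, 8,704 for the 3080, and 4,352 for the 2080 Ti, with the launch prices quoted above):

        # Published CUDA core counts and launch MSRPs (USD).
        cores = {"3090": 10_496, "3080": 8_704, "2080ti": 4_352}
        price = {"3090": 1_499, "3080": 699}

        # Two 3090s vs. four 3080s at roughly the same total spend.
        print(2 * cores["3090"], "cores for $", 2 * price["3090"])  # 20992 for $2998
        print(4 * cores["3080"], "cores for $", 4 * price["3080"])  # 34816 for $2796

        # Keeping two 2080 Tis and adding one 3080.
        print(2 * cores["2080ti"] + cores["3080"], "cores")         # 17408

    One caveat on comparing raw totals: Ampere counts its FP32 units differently than Turing, so CUDA core numbers alone overstate the gap against the 2080 Ti.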

  • nicstt Posts: 11,715

    For Iray users it is the only option, considering that among the new cards the next in line has only 10GB. For those with the budget to spend, it's the equivalent of a Titan, cheaper than previously available.

    CPU rendering isn't really an option, although some (surprisingly) put up with it.

    10GB is semi-decent, and IMO the minimum that should be recommended for Studio - note I said recommended. I have a 980 Ti with 6GB; I decided to move to Blender for my rendering (IMO a far superior platform). I expect that with more VRAM I would occasionally do renders in Iray, depending on the rendering time differences between the two platforms.
