Two graphics cards for rendering + one for monitors

I'm starting a project to build my new computer, based on a dual-Xeon board with two E5-2697 v4 processors.
I plan to use two RTX 3060 12GB graphics cards for GPU rendering.
I want to use 3 monitors. These cards with outputs for 4 monitors currently have an excellent price:
- Asus GT 730 2GB GDDR5
- Asus GT 710-4H-SL 2GB GDDR5
- NVIDIA Quadro M2000 4GB GDDR5
So I plan to install one of these graphics cards for video output, and use the two RTX 3060 cards purely for GPU rendering.

After this long exposition... my question is:
- Does installing a graphics card exclusively for monitor output relieve the workload on the RTX 3060s?
I'm aware that video output doesn't use the RTX 3060's CUDA cores, but obviously something must be stressing these cards.

Or does using the video outputs on the RTX 3060 cause almost zero workload on these cards, making the installation of a dedicated video-output card unnecessary?

Comments

  • oddbob Posts: 396

    Can't answer the question directly but I have a strong suspicion that life would be much easier with a consumer platform from the last few generations, a mid range CPU and a used 3090.

  • PerttiA Posts: 10,024

    If the cards you have listed are not using the same GPU drivers as the RTX 3060, I would not mess up the system with them.

  • Plizze Posts: 11

    oddbob said:

    Can't answer the question directly but I have a strong suspicion that life would be much easier with a consumer platform from the last few generations, a mid range CPU and a used 3090.

    These are the RTX 3060 vs 3090 data
    - RTX 3090
    Price (new): €1300
    Price (used): ≈ €850-900
    CUDA cores: 10752

    - RTX 3060
    Price (new): €390
    Price (used): ≈ €200
    CUDA cores: 3584

    Passmark Score:
    (I have not found a performance comparison in Cinebench)
    source: https://technical.city/es/video/GeForce-RTX-3090-vs-GeForce-RTX-3060
    RTX 3090: 26963
    RTX 3060: 17202
    RTX 3090 has a +56.7% higher performance

    Performance/price (based on second-hand prices):
    RTX 3090: 26963/900 = 29.96
    RTX 3060: 17202/200 = 86.01

    Difference:
    86.01/29.96 ≈ 2.87

    For the price of one RTX 3090 card you can buy almost three RTX 3060 cards. If you decide to buy 2 RTX 3060 cards you have an estimated combined score of 17202x2 = 34404.
    Improvement, 2x RTX 3060 vs 1x RTX 3090:
    34404 vs 26963 = 27.6% more performance, and for much less money
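    The arithmetic above can be sketched as a quick script (prices and Passmark scores are the quoted second-hand figures, and perfect dual-GPU scaling is assumed — this says nothing about Iray itself):

```python
# Price/performance sketch using the quoted second-hand prices and
# Passmark scores (not my own benchmarks; dual-GPU scaling assumed ideal).
def perf_per_euro(score: int, price_eur: float) -> float:
    """Benchmark points per euro spent."""
    return score / price_eur

ppe_3090 = perf_per_euro(26963, 900)   # ~29.96 points/EUR
ppe_3060 = perf_per_euro(17202, 200)   # ~86.01 points/EUR

ratio = ppe_3060 / ppe_3090            # ~2.87x better points per euro
dual_3060 = 2 * 17202                  # 34404, assuming perfect scaling
gain = dual_3060 / 26963 - 1           # ~0.276, i.e. ~27.6% over one 3090

print(f"{ppe_3090:.2f} {ppe_3060:.2f} {ratio:.2f} {gain:.1%}")
```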

    But the question was:
    Does a graphics card used for GPU rendering suffer a performance loss from driving monitors on 3-4 of its video outputs?

  • PerttiA Posts: 10,024

    There is no benchmark for Iray rendering in DS other than the thread here
    https://www.daz3d.com/forums/discussion/341041/daz-studio-iray-rendering-hardware-benchmarking#latest

    The other benchmarks you find on the net are about gaming performance, and that's comparing apples to oranges.

  • TimberWolf Posts: 288
    edited April 2023

    To answer this directly, yes, it is worth having a GPU dedicated to providing the output for Windows. Windows 10/11 will happily eat its way through ~2GB of VRAM depending on what you have open so, if you don't opt for a dedicated monitor card, one of your 3060 GPUs would drop out if the scene to be rendered used more than about 10GB. You'd have to decide if this limitation makes investing in a 3rd GPU worthwhile or not.

    In terms of rendering performance, the actual impact on rendering *time* would be negligible without it. If you are going to run 3 GPUs, do be absolutely certain your cooling is up to the job, as this machine will get hot...

    However, as someone else has pointed out, you really want to be using one unified driver across all of your GPUs which leaves the M2000 as the only realistic option. That, too, will be left behind in short order.

    I'd probably be considering a 10xx series as a minimum. Our production machine uses a similar setup to your proposed one (albeit with two Quadro GPUs for rendering) and we use a 1660 to drive the monitors.

    Post edited by TimberWolf on
  • outrider42 Posts: 3,679

    Passmark is not like Iray at all, so none of that data has any relevance whatsoever to Daz Studio.

    We have a benchmark thread here, that data is more relevant as a general performance guide. Since every scene can be different, actual performance might vary somewhat. Still, the bench is helpful for that generalized performance.

    I actually own both a 3090 and a 3060, so I can make this comparison directly on my machine so they share the same exact hardware.

    With the bench scene we have, my 3090 took 107 seconds to render. My 3060 took 272 seconds to render. The 3090 is not 56% faster like it is in Passmark; it is VASTLY faster at Iray, about 154% faster. You cannot even match the 3090 with TWO 3060s together. You need three 3060s to go faster, and just think about how much hardware, space, and power you need to run 3 GPUs just to beat one 3090. That doesn't sound so ideal to me at all. Do not ever make the mistake of using Passmark as a performance guide for rendering. This is not a video game.
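    The speedup those two timings imply can be checked in a few lines (107 s and 272 s are the measured times quoted above; ideal scaling is assumed for the dual-3060 figure):

```python
# Convert two measured Iray render times for the same scene into a
# relative speedup. Times (seconds) are the figures quoted above.
def speedup_pct(slow_s: float, fast_s: float) -> float:
    """How much faster the fast card is, as a percentage."""
    return (slow_s / fast_s - 1) * 100

t_3090, t_3060 = 107, 272
print(round(speedup_pct(t_3060, t_3090)))   # ~154 (% faster at Iray)

# Two 3060s with perfect scaling would still trail one 3090:
t_dual_3060 = t_3060 / 2                    # 136 s vs 107 s
print(t_dual_3060 > t_3090)                 # True
```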

    But there is more to these cards than just raw performance. VRAM is a big difference. I am sure you know that if you run out of VRAM, your GPU will not render at all. The 3090 has twice the VRAM capacity of the 3060, and this fact alone is worth the price difference between them in many Daz users' opinions. You might be able to buy multiple 3060s, but you will not be able to render any scenes that exceed their 12GB of capacity, ever. You do not need to make such sacrifices with a 3090. When a scene exceeds 12GB, my 3060 will not render, leaving my 3090 to render alone. It is honestly not a huge loss when that happens, lol.

    Additionally, my 3090 has a good cooler design and is actually much quieter than my 3060. This can be an issue with 3060s, as they are considered "budget" GPUs, they will often have cheap coolers on them. That may not be ideal for rendering long periods of time. It also may not be ideal for your ears. There is a lot to consider.

    Anyway, back to the primary question, it depends on the GPU and its resources. Rendering is a big task, but there can still be enough resources to drive some displays. I don't have multiple displays, but I do remote a lot. Remote is similar to using extra displays, and I also frequently use HDMI dummy plugs (this helps some remote access work better). I can remote just fine with either GPU even when rendering. I can be watching youtube or even using Blender while rendering. It might not be perfect, there might be a small delay in Blender or a video does not play at full resolution as fast. But still very functional.

    But it is not free. Running extra displays will require more VRAM. I don't know exactly how much, but anybody who wants as much VRAM as possible for Daz will want to be aware of this. So once again, we have a situation where the 3090's extra VRAM has benefits. With the 24gb of VRAM, you have the extra resources for multiple displays. The higher the resolution of each display, the more VRAM you need. So VRAM is the biggest loss you will have when rendering with multiple displays.
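    As a very rough lower bound, the raw framebuffer cost of a display can be estimated from its resolution (assumed: 4 bytes per pixel and triple buffering; real Windows/DWM usage is considerably higher because of composition surfaces and application windows):

```python
# Rough lower-bound estimate of framebuffer VRAM per display.
# Assumes 4 bytes/pixel (8-bit RGBA) and triple buffering; actual
# Windows DWM usage is considerably higher than this floor.
def framebuffer_mb(width: int, height: int, buffers: int = 3,
                   bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB per display")
```

    The point stands either way: the higher the resolution and the more displays, the more VRAM is taken away from rendering.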

    Also, I would not use such an old GPU for display, especially multiple displays, mainly because of their limited VRAM. 2GB to drive multiple monitors will not cut it. Those cards have low prices because they are junk. Seriously. The GT 710 or 730 are not worth the sand used to make the silicon in them, not to mention they are no longer supported by Nvidia. So you cannot use them with a new GPU, because the new GPU will need modern drivers to run Daz Studio. You will want something more recent if you wish to use a separate GPU for displays. The Quadro card uses different drivers than a gaming card like the 3090, so this is also not ideal.

  • TimberWolf Posts: 288

    The OP's hardware choice is a little odd to my eyes as well (dual Xeons?) but I prefer to trust they know why they are configuring their hardware the way they are.

    I have no idea what other applications this machine is meant to run, and I have a suspicion that it is not just Studio. Dual Xeons suggest other, graphics-intensive CPU-based work. I run a company that uses Studio professionally for part of our work and you could never convince me to bolt a 3090 into any of our machines. They're great cards for some people but for *our* particular workload they have issues.

    Also, certainly in my corner of Europe, the 3090 is at price parity with the 4090. Odd, but there you go.

    Where I do agree with you is that the OP's choice of cheap monitor-driving GPUs is not ideal. The M2000 could be made to work, but its questionable usability into the future means it would have to be about as cheap as a takeaway for it to be worthwhile.


  • PerttiA Posts: 10,024

    And as far as using the rendering GPU to drive the monitors goes... I'm using 3 monitors with my rendering GPU (RTX 3060 12GB) and my benchmark scores are the same as the other RTX 3060 results in the benchmark thread. W7 takes just 200MB of VRAM from it, so that's not really an issue either, although W10 takes more (around 1GB).

  • Plizze Posts: 11
    edited April 2023

    First, thank you all for your kindness in answering me.
    Bear in mind that everything is related to the availability of money.
    Here in Spain an average salary is around €1300. You can throw your hands up, but a coffee and a croissant are €2, a dozen eggs €1.50... The standard of living is very good for our salary. How much does a coffee and a croissant cost in New York? A higher salary does not imply a better standard of living compared to another country.
    The problem is buying technology. An RTX card may be half a monthly salary for you; for me it is two.
    I was thinking about the possibility of a Ryzen 9 7900X, but the price of the board and DDR5 memory was excessive. And it was going to be a boring computer, like my current i7.
    My computer is my main hobby, not a work tool, so I must find the balance PSP (Price/Satisfaction/Performance).

    Because old Xeon-based servers are being upgraded, Chinese assemblers are removing the chipsets from those boards and installing them on newly built boards like the Huananzhi X99-F8D Plus (dual socket), a motherboard with 10xSATA, 6xPCIe, 2xNVMe,...
    The Xeon E5-2697 v4 processor has 18 cores and 36 threads, so with two processors I have 36 cores and 72 threads. (I know, a Ryzen is better.)
    This board uses DDR4 ECC memory; since nobody wants this type of ECC memory now (it is not valid for standard PCs), on Wallapop (the most used second-hand platform in Spain) I bought 8 Samsung 32GB modules (256GB) for €150.
    In addition, I have bought 11 hard drives, IBM 4TB SAS, at €25 each. I will use 8 disks to create a RAID-10 array and keep the remaining three in reserve for possible drive failures. The same seller also sold me an Adaptec SAS 78165 controller for €45.

    My computer is not going to be inside a standard PC case. I want to enjoy building it.
    It will be mounted on three floors or height levels https://aliexpress.com/item/1005005092594578.html
    Ground floor: 2 power supplies, hard drive rack, power-on and reset via a mobile app, a random audio welcome-message module, auxiliary fan speed control by temperature,... On the front panel: voltmeters, thermometers; I'm even thinking about the possibility of using vintage analog gauges https://aliexpress.com/item/1005003489268646.html
    Floor 2: the motherboard with processors and fans.
    Floor 3: the level for graphics cards. For now, only my Asus Strix 980 Ti.
    And so that my friends' children don't say that this is an old man's computer... screens and more lights than a Christmas tree, activated/deactivated through a mobile app. hehe, sorry.

    I want to enjoy building my computer, a different computer. But I want efficiency, that's why I have asked you for advice on what I should do with the graphics cards.

    Post edited by Plizze on
  • PerttiA Posts: 10,024

    Just go with 2 RTX 3060 12GB's with one driving your monitor(s)

  • alexhcowley Posts: 2,386

    I have been using two GPUs (one for the screen and one for Iray) for several years. I currently use a 12GB 3080 Ti for rendering and a modest 4GB card for the screen.

    Is there some advantage to using two 3060s rather than a single big card?

    I would also agree that using an AMD GPU for the screen and Nvidia GPUs for rendering is probably a bad idea.

    Cheers,

    Alex.


  • jmtbank Posts: 175

    PerttiA said:

    Just go with 2 RTX 3060 12GB's with one driving your monitor(s)

     Seriously? 24GB is only just enough for me these days. 2nd hand price on 3060s is about 2/3 of the 2nd hand price of a 3090. How can it not be worth paying the extra? 

  • PerttiA Posts: 10,024
    edited April 2023

    jmtbank said:

    PerttiA said:

    Just go with 2 RTX 3060 12GB's with one driving your monitor(s)

     Seriously? 24GB is only just enough for me these days. 2nd hand price on 3060s is about 2/3 of the 2nd hand price of a 3090. How can it not be worth paying the extra? 

    Did you read what the OP wrote? 

    Post edited by PerttiA on
  • jmtbank Posts: 175
    edited April 2023

    "I plan to use 2 3060s".

    Not 'I already own one'.

    Admittedly, by the time I'd gotten to the bottom of the thread, I had indeed completely forgotten about the OP's post.

    Post edited by jmtbank on
  • Plizze Posts: 11

    Consulting the Nvidia forum and ChatGPT, it is advisable that all graphics cards use the same driver.
    Thus, if we decide to install RTX 3060/3090 cards for rendering, the card for the monitors should use the same driver.
    The cheap models compatible with the RTX 3060/3090 driver are:
    GeForce RTX 20 Series
    GeForce RTX 2080 Ti, GeForce RTX 2080 SUPER, GeForce RTX 2080, GeForce RTX 2070 SUPER, GeForce RTX 2070, GeForce RTX 2060 SUPER, GeForce RTX 2060

    GeForce 16 Series
    GeForce GTX 1660 SUPER, GeForce GTX 1650 SUPER, GeForce GTX 1660 Ti, GeForce GTX 1660, GeForce GTX 1650, GeForce GTX 1630

    GeForce 10 Series
    GeForce GTX 1080 Ti, GeForce GTX 1080, GeForce GTX 1070 Ti, GeForce GTX 1070, GeForce GTX 1060, GeForce GTX 1050 Ti, GeForce GTX 1050

    This is where it gets interesting. Online it is possible to find GTX 1050/1060 cards at a very low price.
    There are countless models; we just have to look for one that has 3-4 monitor outputs.
    A cheap and compatible solution.

    [Attachments: Captura2.PNG (1128 x 962, 399K), Captura.PNG (1107 x 959, 760K)]
  • Padone Posts: 3,700
    edited April 2023

    To reply to your question: I use a 1060 for rendering and have the monitor connected to the Ryzen APU. There's zero VRAM used on the 1060 until I render. That said, the more powerful your card is, the less relevant the Windows overhead becomes, so a dedicated display GPU makes more sense with mid-level cards, as in my case.

    https://www.daz3d.com/forums/discussion/comment/8125751/#Comment_8125751

    Post edited by Padone on