FYI: some rendertime numbers on various xeon CPUs

user 2327671 Posts: 0
in Daz Studio Discussion


Abstract:
-----------
It seems like DAZ Studio 3D makes sufficient use of many cores to justify buying a dual-CPU Xeon E5-2680 v2 system (2x 10 cores + hyperthreading).

Test method:
----------------
I rendered 3 different scenes, A, B and C, using DAZ Studio 4.5 under Windows 7 Professional on 3 different machines. One of the scenes I also rendered again on the E5-2680 v2, but using msconfig.exe to limit Windows to only 10 cores (+ hyperthreading).


Numbers:
-------------


CPU         Cores  Scene  Minutes  Notes
E5620       8      A      60
E5620       8      B      106
E5620       8      C      106
E5645       12     A      41
E5645       12     B      73
E5645       12     C      74
E5-2680 v2  10     A      34       (reduced using msconfig.exe)
E5-2680 v2  20     A      20
E5-2680 v2  20     B      33
E5-2680 v2  20     C      34

Scenes B and C are apparently about the same complexity, while scene A is a little over half as complex (based solely on render time). I cannot disclose the scenes, so don't bother asking.

Notice that reducing the E5-2680 v2 from 20 cores to 10 cores, i.e. a single CPU vs. dual CPUs, makes the render time jump from 20 minutes to 34 minutes. That is significant enough for me to justify buying dual-CPU machines, but your results might be different.
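That jump can be sanity-checked with a quick back-of-the-envelope calculation (a sketch using the scene A times from the table above; "efficiency" here just means the achieved speedup divided by the ideal 2x):

```python
# Sketch: parallel speedup and efficiency for scene A on the E5-2680 v2,
# using the render times reported above (10 cores: 34 min, 20 cores: 20 min).
def speedup(t_fewer: float, t_more: float) -> float:
    """How many times faster the larger configuration rendered."""
    return t_fewer / t_more

def efficiency(t_fewer: float, t_more: float, core_ratio: float) -> float:
    """Fraction of the ideal (linear) speedup actually achieved."""
    return speedup(t_fewer, t_more) / core_ratio

s = speedup(34, 20)          # 1.7x faster with the second CPU
e = efficiency(34, 20, 2.0)  # 0.85 -> 85% of ideal linear scaling
print(f"speedup {s:.2f}x, efficiency {e:.0%}")
```

85% efficiency when doubling the core count is quite good scaling for a renderer.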

I used msconfig.exe to actually reduce from 20 cores to 10, but note that msconfig and Windows count each hyperthread as a real core, so inside msconfig.exe the reduction was from 40 to 20. I have not tested disabling hyperthreading.
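As an aside, Windows can also restrict a single process (rather than the whole OS) via `start /affinity`, which takes a hexadecimal bitmask of logical processors. A small sketch of how such a mask could be computed; the executable name in the usage comment is hypothetical, and msconfig remains the method actually tested here:

```python
# Sketch: build the hex affinity bitmask that "start /affinity" expects.
# Each bit selects one logical processor (hyperthreads count, as with msconfig).
def affinity_mask(logical_processors: int) -> str:
    """Mask selecting the first N logical processors, as uppercase hex."""
    return format((1 << logical_processors) - 1, "X")

# 10 physical cores + hyperthreading = 20 logical processors
print(affinity_mask(20))  # -> FFFFF
# Hypothetical usage from cmd.exe:  start /affinity FFFFF DAZStudio.exe
```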

Hardware:
-------------
Two of the machines differ only in their Xeon CPUs; the third machine is a newer generation.

The old machines are http://www.supermicro.nl/products/system/2U/6026/SYS-6026TT-HIBQRF.cfm?INF= with either dual 4-core Xeon E5620 2.4GHz CPUs or dual 6-core Xeon E5645 2.4GHz CPUs.

The new machine is http://www.supermicro.com/products/system/2U/6027/SYS-6027TR-HTRF.cfm with dual 10-core Xeon E5-2680 v2 2.8GHz CPUs.

Comments

  • cwichura Posts: 1,042

    3D rendering in general is well suited to parallel work loads. I can personally attest to LuxRender crushing a dual deca-core Xeon system: http://fav.me/d79h74h

  • SickleYield Posts: 7,644

    People have shown off these server builds a few times on the forum, and while impressed, I am still confused. Even if I were willing to pillage my New Car fund ($2800 is a bit much for just the barebones right now), I wouldn't know how to set that thing up or use it properly. :D

    How is this different from setting up a regular build with one CPU and a standard single-socket motherboard?
    What are its case/heat removal needs?
    Does it need a special version of the OS? Can it even work with Windows?
    What do I have to do to get this from "ordered from Newegg" to "ready to render" that's different from what I would do if I just bought a Core i7 setup?

    I'm sure those are silly questions to you, but I'm asking quite honestly and with definite interest in the answers. Google has not been helpful.

  • cwichura Posts: 1,042
    edited August 2014

    I'll be the first to admit that Dual Xeon builds are not for most people. They are very expensive, and very few workloads require or even benefit from the available compute power in such a system. That said, 3D rendering does scale very well with increased cores, and even with hyperthreaded cores. It's an ideal workload for these types of systems.

    If you are using rendering software that supports distributed rendering, from a hardware cost perspective, you'd probably be much better off buying a farm of older, used server blades than one big honkin' current machine. (Depreciation is a cold-hearted expletive for the original owner, but great for you as the buyer.) However, a server farm, especially with 2-3 generation old CPUs, will draw a LOT more power than the single honker, so over the long term, your operational costs will even out the cost in hardware. That's my main view on your first question: single-CPU systems are quite capable, but when you start stacking them, they lose big time in power efficiency. Also, when it comes to 3Delight in Studio, a dual-CPU system will have more CPU available to Studio, whereas 3Delight standalone licensing costs make it very prohibitive to run a distributed rendering farm. (I render all my stuff using LuxRender, which does support distributed rendering, and I make use of that even with the honker. At the resolutions I am rendering, it still usually takes my combined compute resources (I use a couple old servers at work as slave render nodes) 100+ hours to sufficiently converge one scene.)

    For dual CPUs, you're probably better off with a server case (e.g., rack mount) than a desktop. My honker replaced a 10-year-old (I kid you not!) desktop machine, so I built it in a large desktop case (Corsair 900D), which is so massive it won't fit under my computer table and has to sit to the side of it. I have Corsair H80i liquid coolers on each of the two Xeons in the system. (Part of the problem with smaller cases is having enough external vent space for whatever cooling you go with, as most cases are designed around cooling a single CPU. I actually had to return the first case I purchased, as it just didn't have the clearance for two water coolers.) My system was built from parts, as premade dual-Xeon systems have a pretty hefty price penalty over ordering the parts yourself (at least when I looked into it).

    I am running Windows 7 Pro on mine. This allows up to 4 CPUs and 192GB of RAM in the system. (I put 64GB of RAM in mine -- 32GB would have been sufficient, but I've gotten close to 25GB before, so I figured 64GB would hold me for a long time and be done with it, especially considering RAM is a lot cheaper these days than it used to be.) Lower versions of Windows have various artificial restrictions on the number of CPUs and total memory in the system, so check the details on Microsoft's website. But it's really just a Microsoft licensing thing. It's just the 64-bit Windows that you know and love/hate. If you are making a dedicated render farm, then I'd put Linux on the nodes instead, since there is no licensing cost and Linux will happily use however many CPUs the box has. And most rendering engines are technically a wee bit faster on Linux than on Windows (at least LuxRender is slightly faster, though not enough to shed tears over).

    As to getting to 'ready to render', it really boils down to assembling all the parts and then installing the OS yourself. (Of course, installing the OS yourself is prudent anyway, since pre-made systems tend to have a lot of pack-in garbageware...) If you are comfortable building a single-CPU system, then you'd be just as comfortable building a multi-CPU system. Much of it is the same, just a larger physical case, a multi-CPU motherboard (I used the Asus P9PE-D8 WS -- there aren't many options for dual-CPU motherboards, since it's a niche market), and the extra CPU with its own cooling. And you have to use the more expensive Xeon processors, as 'Core' processors do not support multi-CPU configurations. For 3D rendering, however, a Xeon is actually desirable, as the server-targeted CPUs have more cache memory and other optimizations that make them perform better for compute-intense workloads.

    And I can play WoW with everything on Ultra and have LuxRender killing all 40 logical cores in the background and not even notice it. (I set LuxRender to low priority when I start it, though).

    Clear as mud? :)

  • SickleYield Posts: 7,644

    Oh yes, I always build my own. Have to, it's way cheaper. :D And as you say, installing your own OS cuts right off the shovelware (though I had to fight this laptop to uninstall Windows 8 and install Windows 7).

    And I very much appreciate the information. It sounds like it's not going to be worth it, though. I can't afford to spend 100 hours (!) on any one render. It's actually rare that I let one go for more than 4, which is why I still render in DAZ Studio with 3Delight. It's very fleet when I need to get ten promos done in a maximum of two days and get decent (though stylized rather than realistic) results. Making the jump to 3Delight standalone with a system like this would be nice, but the costs are too high (and water cooling? Forget it). It sounds like I personally would be better off, when I have an extra thousand, putting a couple of good graphics cards in a second machine and working on the first one while I render in Octane, Cycles, or another unbiased renderer with a known workflow from DS that's ostensibly less fiddly than Luxus/Reality.

  • user 2327671 Posts: 0

    SickleYield said:
    People have shown off these server builds a few times on the forum, and while impressed, I am still confused. Even if I were willing to pillage my New Car fund ($2800 is a bit much for just the barebones right now), I wouldn't know how to set that thing up or use it properly. :D
    We use them professionally and have a render farm in a rack at the office. It makes a lot of noise, so you want to hide it away.


    SickleYield said:
    How is this different from setting up a regular build with one CPU and a standard single-socket motherboard?
    What are its case/heat removal needs?
    Does it need a special version of the OS? Can it even work with Windows?
    What do I have to do to get this from "ordered from Newegg" to "ready to render" that's different from what I would do if I just bought a Core i7 setup?

    I'm sure those are silly questions to you, but I'm asking quite honestly and with definite interest in the answers. Google has not been helpful.


    It is not really different, other than that we have 12 machines ready to render, and that those machines are in a rack in a room with sufficiently efficient AC cooling. Our AC provides 5.2 kW of cooling.

    It works fine with standard Windows 7 Pro. Remember that DAZ needs an OpenGL graphics card just to start, so we fit an old half-length, half-height graphics card that just barely gives us enough OpenGL to start DAZ. It doesn't matter for the render time, because rendering uses the CPU only.

    The reason we buy these Supermicro machines is that there are 4 separate computers inside a 2U chassis, and Supermicro is the only vendor we found willing to put a little graphics card into the machines without voiding the warranty.

  • user 2327671 Posts: 0

    cwichura said:
    If you are using rendering software that supports distributed rendering, from a hardware cost perspective, you'd probably be much better off buying a farm of older, used server blades than one big honkin current machine.
    I don't think that is usable for DAZ, because I don't think blades support putting in a graphics card that can deliver OpenGL in a version that DAZ requires.

    cwichura said:
    (Depreciation is a cold-hearted expletive for the original owner, but great for you as the buyer.) However, a server farm, especially with 2-3 generation old CPUs, will draw a LOT more power than the single honker, so over the long term, your operational costs will even out the cost in hardware.

    Yes, operational costs like electricity and cooling can have a big impact on the total cost.

    cwichura said:
    That's my main view on your first question: single-CPU systems are quite capable, but when you start stacking them, they lose big time in power efficiency. Also, when it comes to 3Delight in Studio, a dual-CPU system will have more CPU available to Studio, whereas 3Delight standalone licensing costs make it very prohibitive to run a distributed rendering farm. (I render all my stuff using LuxRender, which does support distributed rendering, and I make use of that even with the honker. At the resolutions I am rendering, it still usually takes my combined compute resources (I use a couple old servers at work as slave render nodes) 100+ hours to sufficiently converge one scene.)
    Hmm, maybe I should look into LuxRender, is it this one? http://www.luxrender.net/en_GB/index

    How do you convert your DAZ scenes into something LuxRender can render? Or to any other renderer that can be distributed?

  • cwichura Posts: 1,042

    user 2327671 said:
    I don't think that is usable for DAZ, because I don't think blades support putting in a graphics card that can deliver OpenGL in a version that DAZ requires.
    While there are considerable licensing costs involved, if rendering with 3Delight, a farm would be better suited to running the stand-alone version of 3Delight, which does not require any OpenGL at all.

    user 2327671 said:
    How do you convert your DAZ scenes into something LuxRender can render? Or to any other renderer that can be distributed?


    I use the Reality plugin for Studio. Others favor the Luxus plugin. To each their own. Regardless of which plugin you choose, there will be a learning curve around how best to adapt Studio material settings to LuxRender material settings. Both plugins take an initial stab at this for you, but for best results, you really need to tweak/optimize the material settings yourself. The same is true of any other export plugin (e.g., the one for Octane, sending to Blender to use Cycles, etc.).
  • StratDragon Posts: 3,249
    edited August 2014

    user 2327671 said:
    How do you convert your DAZ scenes into something LuxRender can render? Or to any other renderer that can be distributed?


    You can approximate most of the settings in Reality before you bring up its interface by adjusting them in DAZ Studio (DS), so less tweaking is needed later, but tweaking is nearly unavoidable, as cwichura pointed out. Generally, if you leave the settings where they are, you may find different artists lean towards subtle changes in their preset surfaces for 3Delight, which can become more "diversified" in Reality. Reality 3 (R3) is supposed to integrate directly into DS when it comes out. Luxus already does this at present. R3 will be DS4-only, as Luxus is as well.

    I want to add that I have a dual Xeon (2.26 GHz) and an i7 (2.4 GHz). The i7 seems much better for my overall Studio and Blender experience, right up until CPU rendering (3Delight / LuxRender / Cycles), at which point the dual Xeon simply clobbers it.

  • Kendall Sears Posts: 2,995

    cwichura said:

    ...

    I am running Windows 7 Pro on mine. This allows up to 4 CPUs and 192GB of RAM in the system (I put 64GB of RAM in mine -- 32GB would have been sufficient, but I've gotten close to 25GB before, so figured 64GB and be done will hold me for a long time especially considering RAM is a lot cheaper these days than it used to be). Lower versions of Windows have various artificial restrictions on number of CPUs and total memory in the system, so check the details on Microsoft's website. But it's really just a Microsoft licensing thing. It's just 64-bit Windows that you know and love/hate. If you are making a dedicated render farm, then I'd put Linux on the nodes instead, since there is no licensing cost and Linux will happily use however many CPUs the box has. And most rendering engines are technically a wee bit faster on Linux than on Windows (at least LuxRender is slightly faster, though not enough to shed tears over).

    The part about supporting up to 4 CPUs is incorrect. In order to support more than 2 physical CPUs, you must use Windows *SERVER* software. The Professional and Ultimate versions only support *2* physical CPUs.


    http://answers.microsoft.com/en-us/windows/forum/windows_7-hardware/whats-the-maximum-number-of-cpu-windows-7-can/7059bb84-b76f-4a6a-befa-8deb1f8fdeea

    I have several 4-CPU systems and have to run Windows Server or Linux to support more than 2 CPUs.

    Kendall

  • Mattymanx Posts: 6,949

    I would like to know what your advanced render settings were for all 3 scenes.

  • pciccone Posts: 661

    As cwichura said, rendering is an optimal use of parallel execution. Keep in mind also that clock speeds have barely been inching forward in the past few years, compared to the doubling we were used to seeing just a few years ago. Instead, multiple cores are becoming commonplace.
    There is one big advantage in investing in a multi-core machine: longevity.

    I bought my MacPro in 2009. It's a dual-CPU machine with two 4-core Xeons, which give me 16 threads of concurrent rendering (8 cores x 2 with hyperthreading). Five years later this machine is still very fast, and I barely feel the need for something bigger. Also consider that I develop software for a living, so my needs are generally quite a bit higher than the average user's.

    So, a bit of investment in a bigger CPU today goes a long way.

    Cheers.
