Advice on PC Specs for Iray

juzduit Posts: 33

Hi guys,

So I am looking to purchase a PC to use for MD and DAZ3D. 95% of my work would be image rendering (seen below), and it would be for work projects, so I expect continuous use of at least 12 hours a day of rendering.
After extensive research I have come up with the following setup:


Graphics Card: 1x Gigabyte GTX 980 Ti Xtreme 6GB GDDR5  OR  Asus GTX TITAN X 12GB GDDR5 (GTXTITANZ-12GD5) NVidia PCI Exp.
Processor: Intel i7 5930K (Box) (3.5GHz, 15MB cache) Intel LGA 2011-v3  OR  Intel Xeon E5-2630 v3 (2.4GHz/15MB, 8 cores/16 threads) LGA 2011
MB: ASRock X99X Killer, Intel Socket 2011-v3  OR  Asus Z9PA-D8 (LGA 2011, supports Xeon E5-2600)
RAM: Corsair CMD32GX4M2B3200C15 (16GB x 2) DDR4
Cooling Fan: Corsair H105
PSU: Corsair HX1000i 1000W
Case: Corsair Carbide 600C

Are these specs good for minimal rendering times (<10 secs)? Would they produce maximum results in Iray with my images? Or are some items overkill for the work I am doing?

Help is greatly appreciated :) Thanks!


Post edited by Chohole on

Comments

• AndyGrimm Posts: 910
    edited April 2016

If you are doing mainly architectural renders, 6GB GPUs are fine. The 980 Ti and Titan X have similar speed; I would invest in 2x 980 Ti (render speed).

Clearly I would pick the X99 (2011-v3) platform and not LGA 2011. DDR4 RAM is way faster, and prices have come down over the last few months.

I would go with an X99 board with a minimum of 4x PCIe 3.0 slots. If rendering is your daily work, you can't have enough slots for GPU cards!

Gigabyte X99 boards save on M.2 speed and offer 4 PCIe 3.0 slots instead. If your budget is not limited, go for an ASUS X99 with 4-7 PCIe 3.0 slots.

• fastbike1 Posts: 4,078

If you want 12 hours a day, water cool the CPU AND the GPU. IMO, motherboard, RAM, and GPU manufacturer (e.g. Corsair, Gigabyte, etc.) won't matter that much. As long as you get an i7, 32GB RAM, and the GTX 980 Ti or Titan X, you'll be happy with the performance.

• AndyGrimm Posts: 910

2011-v3 and 2011 CPUs don't have integrated graphics. If you are doing OpenGL-hungry CAD work or Adobe stuff while rendering in the background, you will appreciate additional cards!

• juzduit Posts: 33
    fastbike1 said:

If you want 12 hours a day, water cool the CPU AND the GPU. IMO, motherboard, RAM, and GPU manufacturer (e.g. Corsair, Gigabyte, etc.) won't matter that much. As long as you get an i7, 32GB RAM, and the GTX 980 Ti or Titan X, you'll be happy with the performance.


    AndyGrimm said:

2011-v3 and 2011 CPUs don't have integrated graphics. If you are doing OpenGL-hungry CAD work or Adobe stuff while rendering in the background, you will appreciate additional cards!


Thanks for the reply! There would not be any animation done at all (yet); however, I do want the PC to be future-proof just in case I do animation in the future.


However, is a Xeon processor necessarily better than an i7 5930K in this case?

• AndyGrimm Posts: 910
    edited April 2016

It really depends. I am an ex-professional architectural CAD and drafting guy, with a parallel career in advertising and photography. If I learned ONE THING:
there is no future-proof system! Instead you must do a yield calculation over a time period (2-4 years).

I understood that you offer commercial render images, and then render time is pure money. So my postings and answers always have a commercial background, because I paid WAY too much in the past for hardware and licenses.

The Xeon which I recommended in another thread is a BETTER alternative to the i7 5820K (which is limited in PCIe lanes) for a workstation with HIGH render speed (i.e. 2 or 4 GPU cards, or more!). If you can afford the 5930K (or a better Xeon) and a good board (trust me, you want 4 PCIe 3.0 slots or more), then you have one of the best and as "future-proofed" a system as is possible at the moment.

If you render mainly architectural visualisation, then you want GPU and not CPU power (every i7 or Xeon > 2.5GHz will do fine, with 40 PCIe lanes!). And you don't need more than 6GB VRAM cards (but as many as you can get!). If I understood you wrong, give me some hint about what you are actually doing or planning to offer, and I can give you more concrete advice :)

Cooling is always a problem, but watercooling systems are not needed anymore (though nice to have) if you pick a case and ventilation smartly (AND the right GPU cards; they differ a lot in COOLING and overclocking!).


• juzduit Posts: 33
    AndyGrimm said:

It really depends. I am an ex-professional architectural CAD and drafting guy, with a parallel career in advertising and photography. If I learned ONE THING:
there is no future-proof system! Instead you must do a yield calculation over a time period (2-4 years).

I understood that you offer commercial render images, and then render time is pure money. So my postings and answers always have a commercial background, because I paid WAY too much in the past for hardware and licenses.

The Xeon which I recommended in another thread is a BETTER alternative to the i7 5820K (which is limited in PCIe lanes) for a workstation with HIGH render speed (i.e. 2 or 4 GPU cards, or more!). If you can afford the 5930K (or a better Xeon) and a good board (trust me, you want 4 PCIe 3.0 slots or more), then you have one of the best and as "future-proofed" a system as is possible at the moment.

If you render mainly architectural visualisation, then you want GPU and not CPU power (every i7 or Xeon > 2.5GHz will do fine, with 40 PCIe lanes!). And you don't need more than 6GB VRAM cards (but as many as you can get!). If I understood you wrong, give me some hint about what you are actually doing or planning to offer, and I can give you more concrete advice :)

Cooling is always a problem, but watercooling systems are not needed anymore (though nice to have) if you pick a case and ventilation smartly (AND the right GPU cards; they differ a lot in COOLING and overclocking!).


AndyGrimm, thanks for your reply.

I would be importing garments I created and styled in MD, then posing them on a DAZ3D character, then placing them in front of an architectural background, thus creating a very realistic 3D photoshoot which mimics a fashion catalog.

I agree there is no future-proof system; perhaps I should change it to "the best system currently for my needs" :)

• AndyGrimm Posts: 910
    edited April 2016

Then I would pick a Gigabyte or MSI X99 board with 4 PCIe slots (the cheaper ones are fine, the X99 SLI for example), add the 5930K with 6 cores, and you have the best upgrade path possible at the moment. You pay about 400 USD more than for the Skylake platform, but this is well-invested money if you use your system professionally.

You could start with one 980 Ti and check whether 6GB is enough for your work. One or two DAZ models, fully dressed, should always fit in memory, even with a large architectural background. To free up the whole 6GB for rendering, you could add a cheaper 740/750 or 960 4GB dedicated to your monitors (or helping render smaller scenes), and if you have scenes where you need more than 6GB, you can always add a Titan X later.

Look for a case which is made for 4-way SLI (airflow), and pick cards with optimized ventilation that come already overclocked, and you won't have problems with cooling.

• juzduit Posts: 33

I have these GPUs as options:

Asus GTX 980 Ti Matrix Platinum 6GB GDDR5 NVidia PCI Exp

Gigabyte GTX 980 Ti Xtreme 6GB GDDR5 (GV-N98TXTREME-6GD) NVidia PCI Exp

Asus GTX 980 Ti Poseidon 6GB 384-bit GDDR5 NVidia PCI Exp

MSI GTX 980 Ti Lightning 6GB 384-bit GDDR5 NVidia PCI Exp.

I have heard rave reviews about MSI, but there is very little difference in pricing between these. Is picking any of these alright? Or is there a reason I should pick one instead of another?

• AndyGrimm Posts: 910
    edited April 2016

One thing you should look for is the dimensions (thickness), if you plan to stick more than 2 cards in your system later; every mm of extra space between the cards helps with airflow.
Because I don't own any 980 Tis yet, I can only point at specs.
But I am sure there are other users here who already have 2 or more 980s or better in their systems, so any advice and experience is appreciated.

Just a quick look at the specs -> Asus GTX 980 Ti Matrix Platinum 6GB GDDR5: up to 2mm more space than similar (thinner) cards, plus copper. That one looks very good to me.

Edit: need to check the mainboards. Most should fit 2.5-slot cards (distance between PCIe slots), but here too: check the dimensions!

• AndyGrimm Posts: 910
    edited April 2016

Mainboards: while many advertise 3- or 4-way SLI, some DON'T have the needed space!

Here are examples. Image one shows the ASUS Rampage X99 and ASRock X99 (layered); note that the ASRock has much less space for GPUs, so overheating will be more of a problem, and 4 980 Tis will NOT fit.

Asus-Rampage-V-Extreme - AS ROck.jpg

  • AndyGrimm Posts: 910
    edited April 2016

While trying to get dimensions and overlay the boards in Photoshop, I noticed that they CHEAT with the images: they distort the ortho photos to make them look more spacious (longer)!

The Asus Rampage is an extended ATX board (3cm more space for GPUs!); the others are ATX.

The best one after the more expensive Asus still seems to be the Gigabyte X99 SLI, which copies the Asus design (but on ATX); they get the most space out of the dimensions.

MSI and ASRock won't fit 4 cards.

msi.jpg
Asus-Rampage-V-Extreme-Large-BSN-.jpg
gigabyte.png
asrock.jpg

  • nicstt Posts: 11,715

    Main advice: wait for Pascal. It might not alter what you get, but it might, or it might make it cheaper.

• AndyGrimm Posts: 910
    edited April 2016

@nicstt, right. I am thinking about building a system with an Asus or Gigabyte board and just buying a second-hand Titan 6GB to have something to start with. My goal is to have a system where I CAN plug in 4 or more cards later (maybe even external expander boxes in the future).

• AndyGrimm Posts: 910
    edited April 2016

Conclusion after looking at GPU card dimensions: clearly an extended ATX board is needed for 4 cards, or there will be close to zero airflow between cards (on the bottom card).

Here is another option which is cheaper than the Asus and built on an extended ATX board:
EVGA X99 Classified (151-HE-E999-KR) LGA 2011-v3 Intel X99 SATA 6Gb/s USB 3.0 Extended ATX Intel Motherboard

That one runs for 319 USD.

The Gigabyte GA-X99-Gaming 5P is also extended, with the slots in the right places, and there are others. But it must be an E-ATX board :)

• nicstt Posts: 11,715

I nearly went dual Xeon about three years ago; I didn't, but it is still a configuration I contemplate whenever I consider an upgrade. A dual Xeon board with room for four Titans would be the ideal MB; they have a longer shelf life and cost less to run, as a rule.

• AndyGrimm Posts: 910
    edited April 2016

Yes, I also checked out such a dream workstation; way out of my budget, unfortunately :). If they didn't charge 2500 USD for an external 4-slot PCIe box (without cards), that would be on my wishlist too. But maybe they will be affordable in 2-3 years; that's another reason why I want 40 PCIe lanes.

Actually, what I miss is a modular system: a mini PC with expanders, no cooling problems, where you just add cards with cases when and where you need them. I think we will see such systems soon.

• nonesuch00 Posts: 18,142
    edited April 2016

I think of doing something like this, but the GPU service life is even shorter than the CPU service life nowadays. I'm thinking of this as a base machine:

    http://smile.amazon.com/MSI-Nightblade-MI2-001BUS-Skylake-Barebone/dp/B017K5ZVME?ie=UTF8&psc=1&redirect=true&ref_=gno_cart_title_3&smid=ATVPDKIKX0DER

but dang, it's $300 sans monitor, CPU, GPU, SSD, RAM, and everything else but the MB, when something like this costs $850:

    http://smile.amazon.com/gp/product/B015PYZI8E/ref=gno_cart_title_0?ie=UTF8&psc=1&smid=ATVPDKIKX0DER

However, I can easily upgrade the video card later on the MSI barebones, although I've learned CPU upgrades usually mean MB upgrades.

Makes me think I should just spend $600 every 5 years on the most capable machine that $600 will buy, regardless of form factor, and be done with it. :-(

• AndyGrimm Posts: 910
    edited April 2016

I studied professional GPU solutions a little. They all come in a 19" rack and are stacked with up to 16 GPU cards (8 GPUs per Xeon with 40 lanes), with simple ventilation (including racks with Titan X, not just the cooler Tesla).

To me it is logical that a motherboard must be flat (horizontal), so that warm air rises directly to the empty top of the chassis where the ventilation airstream is. (This solves the problem of the card in slot one overheating, like they usually do in common towers.)

Does somebody here have experience with quad GPUs or more in a rack or any other horizontal chassis? I am looking around, but all the overclocking gurus seem to have fallen in love with watercooling, which in the end looks more like a racing car than a GPU workstation :).

(And I really, really don't want LEDs on my motherboard or fans!)

• AndyGrimm Posts: 910
    edited April 2016

I am also asking myself how well the cheaper PCIe x1 (Gen 2, 500MB/s) splitters do. I have an old 19" 4U rack which I could use to build a GPU expander box, but x8/x16 upstream technology (backplanes) is simply too expensive.

Is 5Gb/s upstream, split between 3 or 4 cards, enough to work effectively with Iray?

Image one -> this is what I want
Image two -> this is what I can afford :)

backplane.jpg
splitter.jpg

  • leo04 Posts: 336

This is my PC; I built it about 3 years ago, I think.

I get very nice renders, though some are so complex they can take several hours.

    Operating System
        Windows 10 Home 64-bit
    CPU
        Intel Core i7 3770K @ 3.50GHz    64 °C
        Ivy Bridge 22nm Technology
    RAM
        32.0GB Dual-Channel DDR3 @ 798MHz (9-9-9-24)
    Motherboard
        Intel Corporation DZ77SL-50K (CPU 1)    44 °C
    Graphics
        HP W2072a (1600x900@60Hz)
        HP W2072a (1600x900@60Hz)
        HP W2072a (1600x900@60Hz)
        2047MB NVIDIA GeForce GTX 760 (MSI)    33 °C
    Storage
        232GB Samsung SSD 840 Series (SSD)    32 °C
    Optical Drives
        HL-DT-ST DVDRAM GH22NS40
    Audio
        Realtek High Definition Audio

• AndyGrimm Posts: 910
    edited April 2016

    @leo04
     

Yes, this is still a nice, fast PC, fully loaded with 32GB RAM. But you could use exactly such a PCIe expansion box (if they were more affordable): just add one or two faster cards and you are set again for the next few years.

The technology is there, but one brand has the patent on PCIe switching over iPass x8/x16 (PCIe over cable), and instead of targeting the mass market they build racks with redundant power supplies. But I'll do some more research; there must be a cheap copycat somewhere :)

• juzduit Posts: 33

Thanks for the help, guys.

After extensive research I have come up with the following setup:

    PCPartPicker part list / Price breakdown by merchant

    CPU: Intel Core i7-5930K 3.5GHz 6-Core Processor ($554.99 @ SuperBiiz) 
    CPU Cooler: Noctua NH-D15 82.5 CFM CPU Cooler ($88.49 @ Amazon) 
    Motherboard: Asus RAMPAGE V EXTREME EATX LGA2011-3 Motherboard ($505.98 @ Newegg) 
    Memory: Corsair Vengeance LPX 32GB (4 x 8GB) DDR4-2666 Memory ($134.99 @ Newegg) 
    Storage: Sandisk Solid State Drive 256GB 2.5" Solid State Drive ($144.95 @ Amazon) 
    Video Card: Asus GeForce GTX 980 Ti 6GB Video Card ($709.99 @ Amazon) 
    Case: Corsair 750D ATX Full Tower Case ($145.44 @ Mac Mall) 
    Power Supply: Corsair 1000W 80+ Platinum Certified Fully-Modular ATX Power Supply ($188.79 @ B&H) 
    Optical Drive: LG WH16NS40 Blu-Ray/DVD/CD Writer ($54.88 @ OutletPC) 
    Case Fan: Corsair Air Series AF140 Red 66.4 CFM 140mm Fan ($15.09 @ Amazon) 
    Total: $2543.59


2 questions though:


* Do I need extra cooling fans, since the PC would be working intensively? If so, what should I add?
* Is a 1000W power supply enough?

• hphoenix Posts: 1,335
    edited April 2016

Might want to shop around a bit...

CPU: i7-5930K 3.5GHz 6-core is $499.99 ($50 cheaper)

Motherboard: ASUS X99 Sabertooth LGA2011-3 is $309.99 (the RAMPAGE V Extreme is out of stock, but priced at $399.99). The Sabertooth is about $200 cheaper; if they had the Rampage, it'd be about $100 cheaper.

SSD: 250GB Crucial MX200 (available as either SATA 6Gb/s or M.2) for $89.99 ($55 cheaper)


Those are at microcenter.com. If you don't need four PCIe slots but can get by with 3 PCIe slots and DDR4-2400 RAM, you can get the whole thing about $300 cheaper.

That's savings halfway to another GTX 980 Ti GPU...


The 1000W supply is fine for a single 980 Ti GPU and the rest. Probably okay for two, but if you are going to go to 3 or more GPUs, you'll want at least 1200W, probably 1500W.

I'd recommend several additional case fans, to make sure your airflow keeps cool air entering the case and warmer air being exhausted. 4 case fans, properly set up, can keep the machine running cool. Just make sure to take air in from the front and push it out the back.

• AndyGrimm Posts: 910
    edited April 2016

    @juzduit

Looks perfect to me.

As a rule of thumb:

Count 250 watts for the CPU/RAM/SSD (overclocked, add 50 watts).
Then add the wattage for the GPUs. If you plan to use 4 cards with 200 watts peak each, maybe overclocked (add 50 watts per card), that's 1000 watts just for the GPUs.

While you can add cards and additional ventilation step by step, you must know ahead of time how much power you will need.
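As a quick sketch, that rule of thumb works out like this in Python (the 250 W base and per-card figures are the rough estimates above, not measured values):

```python
# Rough PSU sizing per the rule of thumb above.
# Base (CPU + RAM + SSD) and per-card figures are estimates, not measurements.
def psu_estimate(num_gpus, gpu_peak_watts=200, overclocked=False):
    base = 250 + (50 if overclocked else 0)            # CPU/RAM/SSD budget
    per_gpu = gpu_peak_watts + (50 if overclocked else 0)
    return base + num_gpus * per_gpu

print(psu_estimate(4, overclocked=True))  # 4 overclocked cards -> 1300 W
```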

The case is a good one (also on my list), but I prefer the Corsair Carbide 540: 2 chambers, one for the mainboard/CPU/GPUs, the other for the power supply. I saw some 4-GPU builds without watercooling using this case.

  • hphoenix Posts: 1,335

As AndyGrimm said, calculate your required power. But also realize that you'll want about 1.25 times what you calculate (80 PLUS certified), so that max power consumption is 80% of your rated supply power.

So, if you have 300W for the CPU/MB/RAM/SSD/HDD/optical/etc., and then 300W for each GTX 980 Ti (the TDP rating is 250W, but actual power consumed has been measured as high as 420W peak draw!), then with 1 GPU you'd need 600W * 1.25 = 750W. For 2 GPUs you'd need 900W * 1.25 = 1125W, and for 3 GPUs you'd need 1200W * 1.25 = 1500W to keep your power at efficient levels with no drop-outs on voltages.

(80 PLUS certification indicates that the power supply has been tested to keep its efficiency at 80% at full load. That means 20% is wasted as dissipated heat. 80 PLUS Bronze is 82% at full load, Silver is 85%, Gold is 87%, Platinum is 89%, and Titanium is 90%. Less power wasted as heat means less heat in the case and less power consumed from the mains: an 80 PLUS certified supply delivering 100% of 900W is consuming 1125W from the outlet. And as you go up to Bronze or better, a less-than-100%-of-rated load gives even better efficiency; 80 PLUS Gold at 50% rated power draw is 90% efficient. It's small, but it adds up if your PC is running 24/7 doing multi-hour renders.)
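The arithmetic above can be sketched in a few lines of Python (the 300 W base and 300 W per-card numbers are hphoenix's estimates, not measurements):

```python
# PSU sizing with 1.25x headroom, so the supply runs at <= 80% of its rating.
def recommended_psu(num_gpus, base_w=300, gpu_w=300, headroom=1.25):
    load_w = base_w + num_gpus * gpu_w    # worst-case DC load estimate
    return load_w * headroom

# Power pulled from the wall outlet for a given DC load at a given efficiency.
def wall_draw(load_w, efficiency=0.80):
    return load_w / efficiency

print(recommended_psu(1))  # 750.0 W
print(recommended_psu(3))  # 1500.0 W
print(wall_draw(900))      # 1125.0 W from the outlet at 80% efficiency
```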


• AndyGrimm Posts: 910
    edited April 2016

All that hphoenix posted is absolutely correct. Only one thing: rarely or never would all 4 cards peak at 420 watts at the same time; that's why most builds with 4 x80 Ti or Titan cards do fine with a 1500 or 1600 watt power supply.

(Edit: this test setup article says 1200 watts is enough for X99 plus 4 Titan X -> http://us.hardware.info/reviews/6033/nvidia-geforce-gtx-titan-x-sli--3-way-sli--4-way-sli-review-insane-performance)

• mjc1016 Posts: 15,001
    AndyGrimm said:

Only one thing: rarely or never would all 4 cards peak at 420 watts at the same time...

While that may be true, don't discount that it CAN happen; but don't be too hung up over it. Most 'good' supplies (certified Bronze or better) should be able to handle brief/occasional spikes. It's only a problem when the spikes aren't infrequent.

• AndyGrimm Posts: 910
    edited April 2016

I am not an electrical specialist, but the worst that could happen if a GPU peaks and cannot get the needed power is a VRAM error; that's how I understand it. While it is in theory possible for 3 or 4 cards to peak at the same time, such peaks last only milliseconds, and in practice they never happen together, because the peak (spike) is a reaction to what the bus (PCIe) can deliver, and by the nature of bandwidth and the PCIe lane switch chip (clock), this never happens at the same time for all cards.

But right: we have those who say 1200 watts is enough (I posted a link with different 4-GPU tests of 980 Tis and Titans driven with 1200 watts, and they never measured more than 800 watts), and we have the theory. So I think it is safe to say that 1500 watts is fine and more than enough.

I also think that Nvidia already has enough reserve in their specs (this changes only with extreme overclocking).

• FrankTheTank Posts: 1,131


I have 2x EVGA 980 Ti and I peak at 550-600 watts. I use a Gold-level EVGA 1000 watt power supply and wouldn't feel comfortable using any less.

So with 4 cards I would want a 1500 watt power supply for sure.

You don't want to run a power supply anywhere near peak for 15-hour renders. (My last render was 22 hours, and that was for 30 seconds of animation.)

    Also, scrap that air cooler.

I had a big huge Noctua air cooler like that and it could not handle the load under long renders, even though the CPU was not even being hit, since I use Iray. All it did was circulate hot air around inside the case. A water cooler really helps to keep the rest of the case cool. I know this because I did long renders for a week with the Noctua before I sent it back. So despite all the rave reviews for gaming setups, it does not do well in a render setup. I would avoid that ugly beast, not to mention it's a pain in the ass to install and get around in your case.

I wound up going with a Corsair H100i water cooler. Everything stays nice and cool now. My GPUs used to hit 75°C doing long renders with the Noctua; now with the Corsair, the GPUs never go above 62°C. Doesn't sound logical, I know, as the CPU cooler should not affect that, but it does: it keeps the ambient temperature around them cooler and thus helps keep the GPUs cool as well.


• AndyGrimm Posts: 910
    edited April 2016

@Dominic Tesla, thank you. I think real experience is much better than theory and tests. While I posted the 1200 watt test just to show that there are other opinions, I think it is safe to say that 1500 watts is fine for 4 cards, 1250 watts for 3 cards, and 1000 watts for 2 cards. Those who like to save on the power supply can run one more card on the same specs (but at their own risk).
