GPU Power Supply Requirements

ebergerly Posts: 3,255
edited February 2021 in The Commons

For a long time I've had a power meter connected to my main computer. It measures the total power drawn by my computer from the wall outlet: CPU, GPUs, hard drives, fans, power supply losses, etc. The whole thing. 

And recently I dusted off my RTX-2070 Super that's been sitting in a closet for a couple of years. And out of curiosity, based on a post I saw on some forum, I decided to go to one of the power supply manufacturers' websites where they provide a calculator to tell you the MINIMUM power rating of the supply you need. Enter the GPUs you have, CPUs, hard drives, etc., and it tells you what to buy. Cool, right?

So I've been monitoring my computer during renders with both my RTX-2070 Super and GTX-1080ti running flat out. And after many renders, the most total power used by my entire computer (with both GPUs) is around 450 watts. Which seems reasonable, since the 1080ti has a 250 watt rating, and the 2070 Super has a 215 watt rating. And 250 + 215 = 465 watts.

Although that assumes zero CPU usage, zero power supply losses (inefficiency), and zero anything else. So maybe when the GPU is running "flat out" it's not always running at its top rated power. Running at absolute maximum rated power requires software that's optimized to utilize the hardware at 100%. Which it rarely is. 
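The back-of-envelope comparison above can be sketched in a few lines. The wattages are the ones quoted in this thread; the 90% PSU efficiency is an assumed illustrative figure (typical for an 80 Plus Gold unit at mid-load), not something measured here:

```python
# Rough sanity check: rated GPU power vs. measured wall draw.
# Component figures are from the thread above; psu_efficiency is an
# assumed illustrative value, not a measurement.

rated = {"RTX 2070 Super": 215, "GTX 1080 Ti": 250}  # watts, manufacturer ratings
measured_wall = 450       # watts, peak total draw at the outlet during a render
psu_efficiency = 0.90     # assumed; conversion losses mean DC draw < wall draw

rated_total = sum(rated.values())           # 465 W for the two GPUs alone
dc_power = measured_wall * psu_efficiency   # ~405 W actually delivered inside the case

print(f"Rated GPU total: {rated_total} W")
print(f"Estimated DC power for the whole system: {dc_power:.0f} W")
# Even counting zero watts for the CPU, drives, and fans, the whole
# system's estimated DC draw comes in below the two GPUs' combined rating.
```

The point of the sketch is just the inequality: the whole-system draw at the wall, even before subtracting PSU losses, is below the sum of the GPU ratings alone.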

So anyway, I filled my computer data into the online calculator, and the result was (drumroll...) 1300 watts. MINIMUM. 

Now that's just too funny. And people actually use those things and believe them. It's physically impossible for me to get anywhere near that. I'm sure they'll say, "oh, we're allowing for future expansion in case you buy 3 more GPUs". And "oh, we're assuming everything is running at maximum, simultaneously" (which it never does, and even if it did it wouldn't get anywhere near that value). And "oh, we're assuming everyone will overclock their CPUs and GPUs so they draw twice their rated power", which is ludicrous. And "oh, you need to run at 50% rated power so you can save $5 per year in electricity bills". 

Anyway, I guess consider this as a PSA. Just beware, and don't believe everything you see on the internet.  


Comments

  • namffuak Posts: 4,191

    I don't have a power meter as such - but my system is the only thing on a 1350 VA UPS, and with my 1080 Ti and 980 Ti both running at 97% GPU load I get a reading of 542 watts used from the PowerChute monitor, and both cards peak at 190 W each per GPU-Z. The CPU is not involved in the render; CoreTemp indicates 45 W for it.

    So - yeah, the power requirements are seriously overstated.

  • LeatherGryphon Posts: 11,672
    edited February 2021

    I've also noticed by reading the front panel on my UPS that my big machine with an 850 W power supply runs an i7-10700 CPU, 32GB RAM, 11TB storage, a GTX-1660 graphics card, and a monitor, but only gets up to 300 W while rendering Iray in DAZ Studio.  So, I have lots of wiggle room in my power supply.  Better too much than not enough.

  • Richard Haseltine Posts: 102,764
    edited February 2021

    Iray can max out some aspects of the GPU, but not all, so it doesn't draw the maximum power that the hardware might.

  • ebergerly Posts: 3,255
    edited February 2021

    Richard Haseltine said:

    Iray can max out some aspects of the GPU, but not all, so it doesn't draw the maximum power that the hardware might.

    Absolutely true, and a point I've repeated over and over in many ways. Hardware usage depends on software. Is the software written to take maximum advantage of the available hardware? Is the software efficient and well written? I know we love to focus on hardware and talk about how our hardware is a beast, but all of that is irrelevant if your software doesn't take advantage of it. RTX is a perfect example. It has separate hardware components designed to solve specific ray-tracing and rendering related problems. But if your scene doesn't have those problems or requirements, and/or the software isn't written to take advantage of all that, or the drivers aren't yet ready, then your hardware can only do what it's told. And as someone who has written software for years, I can assure you IT'S VERY DIFFICULT to write perfect code that is super efficient and takes maximum advantage of hardware. Sometimes (often) impossible. 

    So for those who hear a manufacturer's rating for a GPU and assume, "oh, my GPU will always require all that power", think again. Chances are you'll be using much less. 

    Which is, again, why I always strongly recommend that those interested in hardware spend $30 on a meter that plugs into the wall outlet (like a Kill-A-Watt). It measures total power draw, voltage at the outlet, and a lot more. 

  • ebergerly Posts: 3,255
    edited February 2021

    BTW, I checked the actual, individual max power draws (using HWMonitor) for each component during a render (with a total power consumption from the wall of around 450 watts), and here are the results:

    1. RTX-2070 Super: 145 watts (215 watts rated)
    2. GTX-1080ti: 185 watts (250 watts rated)
    3. CPU: 35 watts 

    So if you add those up, the total is 365 watts. Which means that other stuff in the computer is taking something like 85 watts. Power supply, motherboard, fans, whatever. And the GPUs are only running at around 70% of their rated power, and the CPU below 50%. And that's with the "utilization" graphs showing the GPUs running at "maximum". 

    So, don't assume your hardware will use anywhere near its rated power.  
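The arithmetic behind that breakdown can be sketched as follows, using the HWMonitor figures quoted above (no CPU rating was stated in the post, so no utilization is computed for it):

```python
# Fraction of rated power each GPU actually drew during the render,
# using the measured and rated figures quoted in the post above.

components = {
    "RTX 2070 Super": (145, 215),  # (measured W, rated W)
    "GTX 1080 Ti":    (185, 250),
}
cpu_measured = 35   # watts; its rating wasn't stated, so no ratio for the CPU
wall_total = 450    # watts, measured at the outlet

component_total = sum(m for m, _ in components.values()) + cpu_measured
overhead = wall_total - component_total  # motherboard, fans, PSU losses, etc.

for name, (measured, rated) in components.items():
    print(f"{name}: {measured}/{rated} W = {measured / rated:.0%} of rated")
print(f"Measured components total: {component_total} W")
print(f"Unaccounted overhead: {overhead} W")
```

The two GPU ratios come out to roughly 67% and 74% of rated power, which is where the "around 70%" figure above comes from.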

  • Zylox Posts: 787

    This is fine if DAZ Studio is either the only or the most resource-intensive program you use. Personally, I also play computer games, which can use a lot more resources. At a minimum, having an insufficient power supply can cause your computer to freeze and shut down occasionally. At worst, it can cause damage to your computer components. Is saving $100 on a power supply really worth the chance of destroying your $1k+ graphics cards? Just something to consider.

  • ebergerly Posts: 3,255

    Zylox said:

    This is fine if DAZ Studio is either the only or the most resource-intensive program you use. Personally, I also play computer games, which can use a lot more resources. At a minimum, having an insufficient power supply can cause your computer to freeze and shut down occasionally. At worst, it can cause damage to your computer components. Is saving $100 on a power supply really worth the chance of destroying your $1k+ graphics cards? Just something to consider.

    I just encourage folks to not rely on vague generalizations, but rather justify stuff based on facts.

    Keep in mind that ATX power supplies are required to have internal protection so they don't damage any of the computer's equipment. And I think most users would be very hard pressed to come up with a computer that requires much more than a 600 watt power supply, if that. As I showed, getting much over 450 watts is a challenge.
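    One middle-ground approach, for anyone who does have a wall meter, is to size the supply from a measured peak plus some margin rather than from a worst-case calculator. A minimal sketch, where the 90% PSU efficiency and the 30% headroom are assumed illustrative values (not figures from this thread):

```python
# Size a PSU from a measured wall-outlet peak rather than a worst-case
# calculator. Both psu_efficiency and headroom are assumed illustrative
# values; adjust them to taste.

def suggested_psu_watts(measured_wall_peak, psu_efficiency=0.90, headroom=0.30):
    """Suggest a PSU rating (watts) from a measured wall-outlet peak."""
    dc_peak = measured_wall_peak * psu_efficiency  # power the PSU must deliver
    return dc_peak * (1 + headroom)                # margin for spikes and extras

# With the ~450 W wall peak measured earlier in this thread:
print(f"Suggested PSU rating: {suggested_psu_watts(450):.0f} W")
# Roughly 527 W - comfortably covered by a 600 W unit, nowhere near 1300 W.
```

    Even with a generous margin, this lands in the same ballpark as the 600 watt figure above, not the calculator's 1300 watts.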

    I'm sure we can all come up with internet stories of someone whose GPU was fried, and they blame the power supply or whatever else they can think of, when in fact they overclocked it or mis-installed it or any one of dozens of user errors. 

    So yeah, spend $100 if you want, but alternatively you can use that $100 for something that's really useful.  
