Two graphics cards?


Comments

  • ebergerly said:

    I vote swordkensia gets the DAZ Forums award for most useful post in September 2017. And if there isn't an award we should make one smiley

    And the winner is...........

    Bless you Sir. blush

    I think I will quit whilst I'm ahead!

    S.K.

     

  • JamesJAB Posts: 1,760

    Let's try this again. cheeky

    I am fortunate to have a three-machine mini render studio.

    Machine 1 runs a Titan Z + Titan X.

    Machine 2 runs a pair of 1070s.

    Machine 3 runs a 1080 Ti + Titan Black.

    In the image sample attached, I used Machines 2 and 3 for the comparison test. As you can see, the scene is an interior shot with a lot of reflective surfaces, glass, etc., and five mesh lights as light sources (some of which are off shot).

    On both machines the 1070 and 1080 Ti were set as the primary cards, also running the display; the secondary cards were disabled. Both machines run the same OS (Win7 64-bit), have the same amount of RAM (16 GB) and run comparable motherboard and processor hardware. Actually, Machine 2 has a better motherboard and processor than Machine 3.

    The scene was set to render for 2000 samples, and OptiX Prime Acceleration was enabled on both machines. The Architectural Sampler was enabled and Rendering Quality was set to 5.

    Machine 2: 1070 Scene Load Time: 4 minutes, 30 seconds.

    Machine 3: 1080ti Scene Load Time: 3 minutes, 10 seconds.

    Machine 2: 1070 Render time to 2000 samples: 1 hour, 28 minutes, 19 seconds.

    Machine 3: 1080ti Render time to 2000 samples: 45 minutes, 41 seconds.

    This is an example of a processor- and calculation-intensive scene, where better core performance and core count clearly deliver a considerable saving in time. Sure, if you are rendering scenes which typically take 20-30 minutes, then shaving off 6-10 minutes or so may not make much difference; however, if you are doing more intensive scenes, the figures clearly speak for themselves.

    Cheers,

    S.K.

    Is there any chance that you could run the GTX 1070 system render again, but using both cards?
    This topic is after all based around the two lesser cards vs. one faster card.
    Would be nice to see how well the render time scales with two cards on a more complex scene.

  • ebergerly Posts: 3,255
    edited September 2017

       

    JamesJAB said:

    Is there any chance that you could run the GTX 1070 system render again, but using both cards?
    This topic is after all based around the two lesser cards vs. one faster card.
    Would be nice to see how well the render time scales with two cards on a more complex scene.

    That would be interesting. To the extent you believe the results for the benchmark scene from Sickleyield, it appears that the results should be fairly close. A single 1080 Ti rendered at 2 minutes, a single 1070 at 3 minutes, and dual 1070s at 1.75 minutes. Given the margin of error I'm guessing it's a draw between the 1080 Ti and the dual 1070s. It will be interesting to see if that holds with a larger scene.
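
    For a rough sense of what those times imply, here is a small Python sketch (purely illustrative, using just the benchmark times quoted above) that converts them to relative throughput:

    ```python
    # Relative throughput implied by the benchmark times quoted above
    # (shorter render time = higher throughput; single 1070 = 1.0x baseline).
    times_min = {"1x 1070": 3.0, "1x 1080 Ti": 2.0, "2x 1070": 1.75}
    baseline = times_min["1x 1070"]
    for card, minutes in times_min.items():
        print(f"{card}: {baseline / minutes:.2f}x the throughput of a single 1070")
    # 1080 Ti ~1.5x, dual 1070s ~1.7x; on a 2-3 minute benchmark the gap is
    # small enough that calling it a rough draw is reasonable.
    ```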

    Benchmark.jpg
    Post edited by ebergerly on
  • JamesJAB said:

    Let's try this again. cheeky

    I am fortunate to have a three-machine mini render studio.

    Machine 1 runs a Titan Z + Titan X.

    Machine 2 runs a pair of 1070s.

    Machine 3 runs a 1080 Ti + Titan Black.

    In the image sample attached, I used Machines 2 and 3 for the comparison test. As you can see, the scene is an interior shot with a lot of reflective surfaces, glass, etc., and five mesh lights as light sources (some of which are off shot).

    On both machines the 1070 and 1080 Ti were set as the primary cards, also running the display; the secondary cards were disabled. Both machines run the same OS (Win7 64-bit), have the same amount of RAM (16 GB) and run comparable motherboard and processor hardware. Actually, Machine 2 has a better motherboard and processor than Machine 3.

    The scene was set to render for 2000 samples, and OptiX Prime Acceleration was enabled on both machines. The Architectural Sampler was enabled and Rendering Quality was set to 5.

    Machine 2: 1070 Scene Load Time: 4 minutes, 30 seconds.

    Machine 3: 1080ti Scene Load Time: 3 minutes, 10 seconds.

    Machine 2: 1070 Render time to 2000 samples: 1 hour, 28 minutes, 19 seconds.

    Machine 3: 1080ti Render time to 2000 samples: 45 minutes, 41 seconds.

    This is an example of a processor- and calculation-intensive scene, where better core performance and core count clearly deliver a considerable saving in time. Sure, if you are rendering scenes which typically take 20-30 minutes, then shaving off 6-10 minutes or so may not make much difference; however, if you are doing more intensive scenes, the figures clearly speak for themselves.

    Cheers,

    S.K.

    Is there any chance that you could run the GTX 1070 system render again, but using both cards?
    This topic is after all based around the two lesser cards vs. one faster card.
    Would be nice to see how well the render time scales with two cards on a more complex scene.

    Yes indeed, 

    I have just started the render now. 

    The reason I posted the single-card head-to-head was just to demonstrate the actual performance difference between these two cards.
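
    As a quick sanity check, a minimal Python sketch (just arithmetic on the times quoted above) puts that gap in ratio form:

    ```python
    # Quick sanity check of the single-card head-to-head above.
    # Times are the figures quoted in the earlier post.
    gtx_1070_s   = 1 * 3600 + 28 * 60 + 19   # 1 h 28 m 19 s -> 5299 s
    gtx_1080ti_s = 45 * 60 + 41              # 45 m 41 s     -> 2741 s

    speedup = gtx_1070_s / gtx_1080ti_s
    print(f"The 1080 Ti is about {speedup:.2f}x faster than a single 1070 on this scene")
    # -> roughly 1.93x on this particular interior scene
    ```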

    I should have the two-1070 result for you in about 50 minutes! ;)

    Cheers,

    S.K.

  • JamesJAB Posts: 1,760
    edited September 2017
    drzap said:

    At the current state of the market, it is unwise to buy two 1070s over one 1080 Ti. The one 1080 Ti is cheaper, doesn't have to deal with load balancing between two cards, uses only two slots in your computer, and draws less power. I see no advantage of a 1070 in your case. If you want the most optimal and efficient performance, buy the hotrod. This will also free up two slots for more options in the future.

    "I wish we could re-direct our arguments away from the generic, somewhat irrelevant "more, better, faster" and towards the HOW MUCH more, better and faster, and whether that matters to a real person. "

    The GTX 1080 Ti is a significant performance improvement over its classmates in every measurable category. This only matters to someone who is looking for more convenience and faster render times in GPU applications. I think the original poster cares about the performance improvement because she asked for advice about choosing between two 1070s and a single 1080 Ti. The advice given is in reference to her inquiry. Those who are posting their advice need not consider the view of someone who admittedly sees Daz rendering as a casual diversion that he does between watering the plants and whatever else needs to be done throughout the day. People who are looking at high-end solutions are not casual or typical users. Apparently, you think everyone who wants to buy a 1080 Ti should reconsider their choice and consider putting up with the longer render times and the inconvenience because 30% isn't enough to justify the extra expense (their money).

    By almost every expert opinion, the 1080 Ti is the king of consumer cards (well, theoretically the Titan XP is a consumer card) and they wholeheartedly recommend it for those who are shopping in its price range. The 1080 is a dud and I don't know why it exists. The 1070 is an excellent budget card if the price ever comes down and if you can deal with the memory limitation. The 1070 Ti rumor sounds very exciting. I am speccing a render station for when I start full-time rendering and I will fill the box with three 1080 Tis and a special single-slot 1070, but that, of course, is far from typical.

     

    The argument about the GTX 1080 being a dud is not relevant to its primary purpose.  Keep in mind that first and foremost these are video gaming cards.  Also keep in mind that the GTX 1080 came out almost a year before the 1080 Ti.
    The original pecking order was as follows:
    $199 - GTX 1060 3GB
    $249 - GTX 1060 6GB
    $449 - GTX 1070 8GB
    $699 - GTX 1080 8GB GDDR5X
    $1200 - Nvidia Titan X 12GB GDDR5X

    Earlier this year the 1080 Ti and Titan XP came out and shifted the pecking order (this list is based on list pricing, not mining-inflated prices):
    $109 - GTX 1050 2GB
    $139 - GTX 1050 ti 4GB
    $199 - GTX 1060 3GB
    $249 - GTX 1060 6GB
    $379 - GTX 1070 8GB
    $449 - GTX 1070 ti 8GB -Rumored- -Specs leaked accidentally by card manufacturers-
    $499 - GTX 1080 8GB GDDR5X
    $699 - GTX 1080 ti 11GB GDDR5X (Replaced the Titan X, minus 1GB of RAM and associated internal hardware, but same number of CUDA cores)
    $1200 - Nvidia Titan XP 12GB (full unlocked Pascal chip)

    For gamers the GTX 1080 still fits nicely between the 1070 and the 1080 Ti in both price and game FPS, though that may change when the 1070 Ti comes out.
    For Iray rendering, the GTX 1070 and 1080 both having 8GB of VRAM removes most of the incentive for the more expensive card.  The only version of the GTX 1080 I would consider getting is the one inside the Nvidia Quadro P5000, because it is paired with 16GB of GDDR5X VRAM, though it is a $2000+ card right now.

    Post edited by JamesJAB on
  • drzap Posts: 795

     

    JamesJAB said:
    drzap said:

     

    For Gamers the GTX 1080 still fits nicely between the 1070 and 1080 ti ....

     

    Yeah, there's gaming.  The mining craze has messed up the normal price structure and I think the 1070 Ti is a reaction to that, although the move by Intel isn't unprecedented.  I recall the GTX 780 Ti (or was it the 750 Ti?) was a wicked card.

  • swordkensia Posts: 348
    edited September 2017

    So the results are in!

    Machine 2: running 2 x 1070s, exact same render settings as the prior test.

    Scene Load Time: 3 minutes, 19 seconds (virtually the same load time as for a single card).

    Scene Render Time: 44 minutes, 30 seconds.

    Card scaling was actually better than I expected, but it definitely helps to have two cards of the EXACT SAME MAKE AND MODEL when it comes to scaling.
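
    For reference, a quick scaling check (a small Python sketch using the single-1070 time from the earlier test and the dual-1070 time above):

    ```python
    # Scaling efficiency of the dual-1070 run vs. the earlier single-1070 run.
    single_1070_s = 1 * 3600 + 28 * 60 + 19   # 1 h 28 m 19 s
    dual_1070_s   = 44 * 60 + 30              # 44 m 30 s

    speedup = single_1070_s / dual_1070_s
    efficiency = speedup / 2                  # two cards -> ideal speedup is 2x
    print(f"Dual-1070 speedup: {speedup:.2f}x (scaling efficiency ~{efficiency:.0%})")
    # -> ~1.98x, i.e. near-perfect scaling, and within a minute of the single 1080 Ti.
    ```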

    Cheers,

    S.K.

    Post edited by swordkensia on
  • The merits of two cards vs. one is an interesting topic; it doesn't need spicing up with vinegar. Please keep the discussion civil and drop the digs.

  • ebergerly Posts: 3,255
    edited September 2017

    So given the render results, allow me to attempt a simple cost/benefit analysis to bring together all the issues that have been discussed, so we have a clear picture of which option, if any, is preferable. Feel free to correct, add, or modify what I have:

    Cost:

    Retail prices: 1080ti: $750, 2x 1070: $900

    Cost of electricity: negligible difference for most users

    Additional power supply requirements: 50 watts for 2 x 1070 (1080 Ti: 250 watts vs. 2 x 150 watts = 300 watts)

    PCI space requirements: additional PCI slot for 2 x 1070

    Benefits:

    VRAM: 11GB vs 8GB

    Render speed: negligible difference

    Redundancy in case of failure with 2 x 1070

    So from my perspective (and I may be missing something) the real cost vs. benefit comes down to this:

    With the 1080 Ti you're paying $150 less than the cost of the 2 x 1070, you MIGHT be saving the cost of a larger power supply, you're saving an additional PCI slot, and you're getting 11GB of VRAM compared to the 1070's 8GB, but you're getting the same render performance as 2 x 1070.

    So what's the benefit of 2 x 1070? Not much. Seems like you're paying $150 for redundancy protection, if that really matters to you. And maybe you're paying more if you have to change your power supply to accommodate the two cards. 

    So bottom line, in the choice between 1 x 1080ti and 2 x 1070, it doesn't seem like there's any reason to pay the extra $150 for the 2 x 1070.   
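
    To put the figures above side by side, here is a small Python sketch; the electricity rate and yearly render hours are illustrative assumptions, not measurements:

    ```python
    # Back-of-the-envelope comparison using the figures listed above.
    # The electricity rate and yearly render hours are illustrative assumptions.
    price_1080ti, price_2x1070 = 750, 900        # USD, retail prices quoted above
    power_1080ti, power_2x1070 = 250, 2 * 150    # watts
    rate_kwh, hours_per_year   = 0.16, 1000      # assumed $/kWh and render hours/year

    extra_purchase = price_2x1070 - price_1080ti
    extra_energy   = (power_2x1070 - power_1080ti) / 1000 * hours_per_year * rate_kwh
    print(f"2 x 1070: ${extra_purchase} more up front, "
          f"~${extra_energy:.0f}/year more in electricity under these assumptions")
    # -> $150 up front plus roughly $8/year, for essentially the same render speed.
    ```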

    Post edited by ebergerly on
  • That's what the numbers say.

    It's all pretty consistent, which is pleasantly surprising.

    Purchasing choices and the whys and wherefores of such things are very much a personal thing. I bought my pair of 1070s back in March, before the price hike and JUST before the 1080 Ti was released. I picked up the pair for £680.00, which is comparable to a typical 1080 Ti price today.

    My reason to purchase a pair then was that, at the time, the price of the 1080 Ti was not confirmed, nor was its relative render performance, so I took the gamble and got the 1070s (a purchase which I don't regret at all). Consider this: I was getting virtually the same performance as a Titan X/980 Ti for HALF the cost. It was a no-brainer. But this market moves fast, fast, fast, and the eventual price of the 1080 Ti was a shocker; considering the Titan XP, which is virtually the same card, was selling for around £1100.00, it was not in the realms of improbability that NVIDIA would be asking for figures around £800.00+ for its 1080 Ti.

    Cheers,

    S.K.

     

     

  • drzap Posts: 795

    Consider this: I was getting virtually the same performance as a Titan X/980 Ti for HALF the cost. It was a no-brainer. But this market moves fast, fast, fast, and the eventual price of the 1080 Ti was a shocker; considering the Titan XP, which is virtually the same card, was selling for around £1100.00, it was not in the realms of improbability that NVIDIA would be asking for figures around £800.00+ for its 1080 Ti.

    Cheers,

    S.K.

     

     

    And consider that the next-gen Volta is just around the corner, which promises a huge leap in performance; this market is very fast moving. By buying only one card today, you can leave a slot or two open for the GTX 1180 tomorrow. It's good to know that Iray scales so well. You get pretty much double the performance from two graphics cards compared to one. This is Octane and Redshift territory.

  • ebergerly said:

    So given the render results, allow me to attempt a simple cost/benefit analysis to bring together all the issues that have been discussed, so we have a clear picture of which option, if any, is preferable. Feel free to correct, add, or modify what I have:

    Cost:

    Retail prices: 1080ti: $750, 2x 1070: $900

    Cost of electricity: negligible difference for most users

    Additional power supply requirements: 50 watts for 2 x 1070 (1080 Ti: 250 watts vs. 2 x 150 watts = 300 watts)

    PCI space requirements: additional PCI slot for 2 x 1070

    Benefits:

    VRAM: 11GB vs 8GB

    Render speed: negligible difference

    Redundancy in case of failure with 2 x 1070

    So from my perspective (and I may be missing something) the real cost vs. benefit comes down to this:

    With the 1080 Ti you're paying $150 less than the cost of the 2 x 1070, you MIGHT be saving the cost of a larger power supply, you're saving an additional PCI slot, and you're getting 11GB of VRAM compared to the 1070's 8GB, but you're getting the same render performance as 2 x 1070.

    So what's the benefit of 2 x 1070? Not much. Seems like you're paying $150 for redundancy protection, if that really matters to you. And maybe you're paying more if you have to change your power supply to accommodate the two cards. 

    So bottom line, it doesn't seem like there's any reason to pay the extra $150 for the 2 x 1070.   

    I think a little correction on the pros of the 1080 Ti: one big one I think you missed...

    Much faster time to load the scene onto the card. This is huge when you're making little tweaks, like lighting changes, difficult fitting to fix poke-through (because HD changes in the mesh aren't shown in preview), etc.

     

    The electrical cost - if you're running AC, you can double it, since whatever extra heat you pump out makes your AC run harder. I pretty much run it all year, so it's fairly significant.

     

  • ebergerly Posts: 3,255

    The electrical cost - if you're running AC, you can double it, since whatever extra heat you pump out makes your AC run harder. I pretty much run it all year, so it's fairly significant.

    I have to strongly disagree on that one. We're talking the equivalent of a compact fluorescent light bulb. 50 watts. Yeah, it's easy to say "well, it will heat up the house and cause the A/C to turn on all day", but unless you have some data to support a single light bulb heating up the house, then I think it's just unfounded speculation. I really doubt a single light bulb will have any significant effect on house temps that will tip the A/C over the brink and cause it to go on when it otherwise wouldn't.

     

  • dragotx Posts: 1,138
    ebergerly said:

    The electrical cost - if you're running AC, you can double it, since whatever extra heat you pump out makes your AC run harder. I pretty much run it all year, so it's fairly significant.

    I have to strongly disagree on that one. We're talking the equivalent of a compact fluorescent light bulb. 50 watts. Yeah, it's easy to say "well, it will heat up the house and cause the A/C to turn on all day", but unless you have some data to support a single light bulb heating up the house, then I think it's just unfounded speculation. I really doubt a single light bulb will have any significant effect on house temps that will tip the A/C over the brink and cause it to go on when it otherwise wouldn't.

     

    When I am running a render, even on just one machine rather than both of my render rigs, the room that my computers are in is noticeably warmer than when I'm not running anything.  Definitely enough warmer that if there was a thermostat anywhere near it would most definitely kick on the AC.  So while it might not be able to heat up the entire house noticeably, it will affect your AC depending on where the temperature sensors are.

  • dragotx Posts: 1,138

    That's what the numbers say.

    It's all pretty consistent, which is pleasantly surprising.

    Purchasing choices and the whys and wherefores of such things are very much a personal thing. I bought my pair of 1070s back in March, before the price hike and JUST before the 1080 Ti was released. I picked up the pair for £680.00, which is comparable to a typical 1080 Ti price today.

    My reason to purchase a pair then was that, at the time, the price of the 1080 Ti was not confirmed, nor was its relative render performance, so I took the gamble and got the 1070s (a purchase which I don't regret at all). Consider this: I was getting virtually the same performance as a Titan X/980 Ti for HALF the cost. It was a no-brainer. But this market moves fast, fast, fast, and the eventual price of the 1080 Ti was a shocker; considering the Titan XP, which is virtually the same card, was selling for around £1100.00, it was not in the realms of improbability that NVIDIA would be asking for figures around £800.00+ for its 1080 Ti.

    Cheers,

    S.K.

     

     

    One thing I'm curious about: If I have a 1080 Ti and a 1070 running in the same machine, do I still get to use all of the VRAM on the 1080 Ti, or does the 1070 limit what I can use? (I know it won't pool the VRAM of both cards for loading the scene, and I'll be putting the 1080 Ti as primary, but I thought I'd read somewhere that the scene has to be able to fit on each card or it will still fall back to CPU.)

  • bluejaunte Posts: 1,923
    ebergerly said:

    If you are not a gamer, it is really not wise to make this argument. 

    Apparently not. I'm kinda surprised there's so much involved in the importance of frame rates for video games. I figured it's just a bunch of guys running around with machine guns shooting people and blowing stuff up. 

    So would you agree that when you get up past 60 FPS, and let's say the two cards you're comparing are both producing greater than 60 FPS: if card 1 is called "better" because it gives you 118 FPS compared to the other's 100 FPS, is card 1 really noticeably better, or is the difference irrelevant and not worth spending money on? 

    100 vs 118 may be irrelevant if that's what it really comes down to. However, resolutions for PC gaming keep going up as well. Many people today, like myself, have a 2560x1440 display, and 4K displays will soon be common. To run these kinds of resolutions it may be the difference between 45 and 60 fps, which is pretty substantial. Also, I run a 165 Hz monitor, which may sound like overkill, but the difference it makes has to be experienced to really understand how much the human eye can see. You would notice instantly, just by moving a window around on the desktop, that it feels rather like sliding a sheet of paper around on your table. There is no stuttering, juddering or whatever we want to call it. Which you probably feel isn't the case on your 60 Hz monitor either, but again one has to experience what it actually feels like at 165 Hz to appreciate that 60 Hz isn't nearly enough.

    Also, 100 vs 118 may be mostly irrelevant, but that's not how games work. They vary in framerate as you get to areas that are more demanding or more enemies are on screen; fps may drop to 50 instead of 60 in places, which would be more dramatic. Is that all worth 300 bucks though? That is of course for everyone to decide individually, and of course it's mostly down to the financial capabilities of people, if we're honest. There's a fair bit of vanity too, as you can usually drop some visual eye candy in the settings to make the game work fine on lower-end hardware, and it usually won't look completely horrible because of it. But there's the nagging thought of it not looking as good as it could. First world problems for sure.

    In general, fps myths seem to be some of the hardest to kill. In 2017 you still sometimes come across people saying that they are happy with 30 fps and that the eye cannot see past 15 fps anyway. They heard at some point in their life something about 15 fps, completely misunderstood the subject and then stick to that knowledge for the rest of their life. 15 fps is of course merely when the human eye begins to see motion instead of single images.

  • ebergerly Posts: 3,255
    edited September 2017

     

    dragotx said:

    When I am running a render, even on just one machine rather than both of my render rigs, the room that my computers are in is noticeably warmer than when I'm not running anything.  Definitely enough warmer that if there was a thermostat anywhere near it would most definitely kick on the AC.  So while it might not be able to heat up the entire house noticeably, it will affect your AC depending on where the temperature sensors are.

    That's fine, but we're talking about a difference of 50 watts, not a total system power of say 300 watts or 500 watts or more that you're referring to. 

    It depends on a lot of stuff. The amount of time you're rendering, the size of the room, the insulation in the walls, the outside temperature, the settings of the A/C. 

    Yeah, 2 x 1070 might warm the room a couple degrees more than 1 x 1080 Ti, but is that enough to keep the A/C on longer than it normally would? It's very complicated, and the answer depends a lot on each person's situation. There's no way you can make a blanket statement. 

    It's a bit like me claiming an additional benefit of only one 1080ti is that the 2 x 1070's generate a lot more harmful radiation that will cause cancer and make me die. Yeah, okay, sounds like one of those things that might have some effect, but without data it's just wild speculation. 

    Post edited by ebergerly on
  • dragotx said:

    That's what the numbers say.

    It's all pretty consistent, which is pleasantly surprising.

    Purchasing choices and the whys and wherefores of such things are very much a personal thing. I bought my pair of 1070s back in March, before the price hike and JUST before the 1080 Ti was released. I picked up the pair for £680.00, which is comparable to a typical 1080 Ti price today.

    My reason to purchase a pair then was that, at the time, the price of the 1080 Ti was not confirmed, nor was its relative render performance, so I took the gamble and got the 1070s (a purchase which I don't regret at all). Consider this: I was getting virtually the same performance as a Titan X/980 Ti for HALF the cost. It was a no-brainer. But this market moves fast, fast, fast, and the eventual price of the 1080 Ti was a shocker; considering the Titan XP, which is virtually the same card, was selling for around £1100.00, it was not in the realms of improbability that NVIDIA would be asking for figures around £800.00+ for its 1080 Ti.

    Cheers,

    S.K.

     

     

    One thing I'm curious about: If I have a 1080 Ti and a 1070 running in the same machine, do I still get to use all of the VRAM on the 1080 Ti, or does the 1070 limit what I can use? (I know it won't pool the VRAM of both cards for loading the scene, and I'll be putting the 1080 Ti as primary, but I thought I'd read somewhere that the scene has to be able to fit on each card or it will still fall back to CPU.)

    You would be limited to the card with the lowest VRAM if using both cards together to render a scene.

    However, if you disable the 1070 and leave the 1080 Ti enabled, you will be able to use all of that card's VRAM, but of course you will only be able to render on that card. I sometimes have to do this for larger scenes.
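
    To illustrate the rule being described, here is a minimal Python sketch; the card list, scene sizes and helper function are hypothetical examples, not a DAZ Studio or Iray API:

    ```python
    # Illustration of the rule described above: every GPU that participates in a
    # render must hold the ENTIRE scene, so a card whose VRAM is too small drops
    # out, and if nothing fits the render falls back to CPU.
    # The card list and scene sizes are made-up examples, not queried from any API.
    def usable_gpus(scene_vram_gb, cards):
        return [name for name, vram_gb in cards.items() if vram_gb >= scene_vram_gb]

    cards = {"GTX 1080 Ti": 11, "GTX 1070": 8}
    for scene_gb in (6.5, 9.5, 12.0):
        gpus = usable_gpus(scene_gb, cards)
        print(f"{scene_gb} GB scene -> {gpus if gpus else 'CPU fallback'}")
    # 6.5 GB  -> both cards render
    # 9.5 GB  -> 1080 Ti only (disable the 1070, as described above)
    # 12.0 GB -> neither card fits; CPU fallback
    ```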

    Cheers,

    S.K.

  • Gator_2236745 Posts: 1,312
    edited September 2017
    ebergerly said:

     

    dragotx said:

    When I am running a render, even on just one machine rather than both of my render rigs, the room that my computers are in is noticeably warmer than when I'm not running anything.  Definitely enough warmer that if there was a thermostat anywhere near it would most definitely kick on the AC.  So while it might not be able to heat up the entire house noticeably, it will affect your AC depending on where the temperature sensors are.

    That's fine, but we're talking about a difference of 50 watts, not a total system power of say 300 watts or 500 watts or more that you're referring to. 

    It depends on a lot of stuff. The amount of time you're rendering, the size of the room, the insulation in the walls, the outside temperature, the settings of the A/C. 

    Yeah, 2 x 1070 might warm the room a couple degrees more than 1 x 1080 Ti, but is that enough to keep the A/C on longer than it normally would? It's very complicated, and the answer depends a lot on each person's situation. There's no way you can make a blanket statement. 

    It's a bit like me claiming an additional benefit of only one 1080ti is that the 2 x 1070's generate a lot more harmful radiation that will cause cancer and make me die. Yeah, okay, sounds like one of those things that might have some effect, but without data it's just wild speculation. 

    Depends on your usage. If it's renders here and there, it's not enough to worry about. I think it's fair to say it will cost at least as much to cool down the extra heat as it cost to generate. That 50 watts is now 100. If you're rendering 24/7 in a hot area like me, that's about an extra $100-150 per year depending on your electricity cost (a member said 16 cents per kWh in CA). For a full-time renderer that, say, keeps the card a few years before upgrading, that's 200-300 dollars.
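
    For what it's worth, the arithmetic behind that estimate looks roughly like this (a small Python sketch; the rate and the doubling for AC are the assumptions stated above):

    ```python
    # Rough yearly cost of an extra 50 W running 24/7, doubled to account for the
    # AC pumping that heat back out (the assumption made above).
    extra_watts  = 50
    rate_per_kwh = 0.16                       # $/kWh, the CA figure quoted above
    kwh_per_year = extra_watts / 1000 * 24 * 365
    direct  = kwh_per_year * rate_per_kwh     # ~$70/year for the extra draw alone
    with_ac = direct * 2                      # ~$140/year once cooling is included
    print(f"~${direct:.0f}/year direct, ~${with_ac:.0f}/year including AC")
    ```
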
    Post edited by Gator_2236745 on
  • ebergerly Posts: 3,255
    edited September 2017

    Thanks bluejaunte. I guess when I watch GPU benchmark videos for various games it seems like the environments and textures, while relatively high rez, are still clearly CG, and don't seem anywhere near worthy of 2k or 4k displays. So the idea of needing these high framerates, especially when every game seems to be nothing more than a guy/girl running and shooting in a clearly CG environment, just doesn't compute for me. 

    I suppose I should actually try a video game, but since my eyes have some difficulty even with 1920 x 1080 I'm probably not the best judge. 

    Post edited by ebergerly on
  • ebergerly Posts: 3,255
    edited September 2017

    Depends on your usage. If it's renders here and there, it's not enough to worry about. I think it's fair to say it will cost at least as much to cool down the extra heat as it cost to generate. That 50 watts is now 100. If you're rendering 24/7 in a hot area like me, that's about an extra $100-150 per year depending on your electricity cost (a member said 16 cents per kWh in CA). For a full-time renderer that, say, keeps the card a few years before upgrading, that's 200-300 dollars.

    Of course anyone can come up with an extreme scenario to prove the point, but practically I think it would be very tough to prove a 100-watt light bulb affects your A/C bill. On a really hot day, your A/C may go on at say 10 am, stay on all day, and go off at say 7 pm just because of the outside heat. Your 100-watt light bulb has no effect on that whatsoever. The A/C is on whether your 100 watts is rendering or not. So does it keep your 1 kW air conditioner on longer than it would otherwise be on? Does it stay on all night because of a 100-watt 2 x 1070, if it's even still rendering? Probably not, because it got cool outside and the house cooled off by itself.

    I think if you give it some thought you'll quickly realize, at best, it's a VERY slight difference in A/C bills, if at all.  

    Post edited by ebergerly on
  • ebergerly said:

    Depends on your usage. If it's renders here and there, it's not enough to worry about. I think it's fair to say it will cost at least as much to cool down the extra heat as it cost to generate. That 50 watts is now 100. If you're rendering 24/7 in a hot area like me, that's about an extra $100-150 per year depending on your electricity cost (a member said 16 cents per kWh in CA). For a full-time renderer that, say, keeps the card a few years before upgrading, that's 200-300 dollars.

    Of course anyone can come up with an extreme scenario to prove the point, but practically I think it would be very tough to prove a 100-watt light bulb affects your A/C bill. On a really hot day, your A/C may go on at say 10 am, stay on all day, and go off at say 7 pm just because of the outside heat. Your 100-watt light bulb has no effect on that whatsoever. The A/C is on whether your 100 watts is rendering or not. So does it keep your 1 kW air conditioner on longer than it would otherwise be on? Does it stay on all night because of a 100-watt 2 x 1070, if it's even still rendering? Probably not, because it got cool outside and the house cooled off by itself.

    I think if you give it some thought you'll quickly realize, at best, it's a VERY slight difference in A/C bills, if at all.  

    Thermodynamics says it will be an equal cost to cool vs. to heat (in a perfect world with no efficiency loss). Like dragotx, the area where my computers are is noticeably warmer. The other rooms feel a bit cold, since the thermostat is in the same room as the computers. I have one machine dedicated to rendering that goes pretty much 24/7 all the time.
  • ebergerly said:

    Depends on your usage. If it's renders here and there, it's not enough to worry about. I think it's fair to say it will cost at least as much to cool down the extra heat as it cost to generate. That 50 watts is now 100. If you're rendering 24/7 in a hot area like me, that's about an extra $100-150 per year depending on your electricity cost (a member said 16 cents per kWh in CA). For a full-time renderer that, say, keeps the card a few years before upgrading, that's 200-300 dollars.

    Of course anyone can come up with an extreme scenario to prove the point, but practically I think it would be very tough to prove a 100-watt light bulb affects your A/C bill. On a really hot day, your A/C may go on at say 10 am, stay on all day, and go off at say 7 pm just because of the outside heat. Your 100-watt light bulb has no effect on that whatsoever. The A/C is on whether your 100 watts is rendering or not. So does it keep your 1 kW air conditioner on longer than it would otherwise be on? Does it stay on all night because of a 100-watt 2 x 1070, if it's even still rendering? Probably not, because it got cool outside and the house cooled off by itself.

    I think if you give it some thought you'll quickly realize, at best, it's a VERY slight difference in A/C bills, if at all.  

     

    Thermodynamics says it will be an equal cost to cool vs. to heat (in a perfect world with no efficiency loss). Like dragotx, the area where my computers are is noticeably warmer. The other rooms feel a bit cold, since the thermostat is in the same room as the computers. I have one machine dedicated to rendering that goes pretty much 24/7 all the time.

    I live in Wisconsin and during the winter only turn on the heat enough to keep pipes from freezing because the room where I spend all my time is effectively heated by PCs. :\

    If the air is on all the time to start with there probably isn't any effect on a bill though, you'll just have a warmer room than you might like.

  • ebergerly Posts: 3,255
    edited September 2017

    Scott, since it's summer, maybe you can do a simple test for us. Track how many hours your air conditioner stays on today, and then tomorrow turn on a 100 watt light bulb all day and night and see if it stays on longer. 

    Post edited by ebergerly on
  • ebergerly said:

    Scott, since it's summer, maybe you can do a simple test for us. Track how many hours your air conditioner stays on today, and then tomorrow turn on a 100 watt light bulb all day and night and see if it stays on longer. 

    The temperatures and cloud coverage would have to be identical across every minute of the day for that to work. I don't get why basic thermodynamics is so difficult to understand. Your AC is working to pull energy out of the air in your home. For every unit of energy you pump in, it has to pump it out.
  • ebergerly is assuming a situation where the AC is running constantly and thus does not pump out the extra energy. In the situation where it isn't, you are obviously correct.

  • ebergerly Posts: 3,255
    edited September 2017

    Also the outside temperature is a huge factor in cooling of the house, just as it is with heating in the first place. As is the amount of insulation. For example, if it gets cool at night, depending on how good your insulation is, that may have a quick and significant effect on cooling, so that the A/C doesn't have to stay on. That also is basic thermodynamics, but recognizing there are other factors than A/C cooling. I think you're assuming that only the A/C cools the house? 

    And you're exactly right, there are a bunch of factors, like I said. My only point is that you can't make a blanket assertion without facts to support it. Just like my blanket assertion of radiation from a second 1070 causing cancer.

    We can't simplify based on general ideas and expect it to have legitimacy.  

    Post edited by ebergerly on
  • dragotx Posts: 1,138
    dragotx said:

    That's what the numbers say.

    It's all pretty consistent, which is pleasantly surprising.

    Purchasing choices and the whys and wherefores of such things are very much a personal thing. I bought my pair of 1070s back in March, before the price hike and JUST before the 1080 Ti was released. I picked up the pair for £680.00, which is comparable to a typical 1080 Ti price today.

    My reason to purchase a pair then was that, at the time, the price of the 1080 Ti was not confirmed, nor was its relative render performance, so I took the gamble and got the 1070s (a purchase which I don't regret at all). Consider this: I was getting virtually the same performance as a Titan X/980 Ti for HALF the cost. It was a no-brainer. But this market moves fast, fast, fast, and the eventual price of the 1080 Ti was a shocker; considering the Titan XP, which is virtually the same card, was selling for around £1100.00, it was not in the realms of improbability that NVIDIA would be asking for figures around £800.00+ for its 1080 Ti.

    Cheers,

    S.K.

     

     

    One thing I'm curious about: If I have a 1080 Ti and a 1070 running in the same machine, do I still get to use all of the VRAM on the 1080 Ti, or does the 1070 limit what I can use? (I know it won't pool the VRAM of both cards for loading the scene, and I'll be putting the 1080 Ti as primary, but I thought I'd read somewhere that the scene has to be able to fit on each card or it will still fall back to CPU.)

    You would be limited to the card with the lowest VRAM if using both cards together to render a scene.

    However, if you disable the 1070 and leave the 1080 Ti enabled, you will be able to use all of that card's VRAM, but of course you will only be able to render on that card. I sometimes have to do this for larger scenes.

    Cheers,

    S.K.

    Thanks for the info; that definitely adds another wrinkle to the calculations for me, since for me the primary reason to get the 1080 Ti over the 1070 is the extra VRAM. It sounds like I need to decide if I'd rather have the extra VRAM for the bigger scenes, or a second 1070 for faster renders. 

  • ebergerly is assuming a situation where the AC is running constantly and thus does not pump out the extra energy. In the situation where it isn't, you are obviously correct.

    Not likely to run an AC unit 100% of the time in a real-world scenario, but I suppose if you actually did (and the coils didn't freeze up on you) then you wouldn't notice the temperature difference 100 W pumped in would make.
  • ebergerly said:

    Also the outside temperature is a huge factor in cooling of the house, just as it is with heating in the first place. As is the amount of insulation. For example, if it gets cool at night, depending on how good your insulation is, that may have a quick and significant effect on cooling, so that the A/C doesn't have to stay on. That also is basic thermodynamics, but recognizing there are other factors than A/C cooling. I think you're assuming that only the A/C cools the house? 

    And you're exactly right, there are a bunch of factors, like I said. My only point is that you can't make a blanket assertion without facts to support it. Just like my blanket assertion of radiation from a second 1070 causing cancer.

    We can't simplify based on general ideas and expect it to have legitimacy.  

    You're pumping extra heat into your home; it's not a wild assertion that the AC will need to pump it out. That's how AC works. Just because it's not an amount that's easily perceivable doesn't mean it's not happening.