
Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2025 Daz Productions Inc. All Rights Reserved.
Comments
Bless you Sir.
I think I will quit whilst I'm ahead!
S.K.
Is there any chance that you could run the GTX 1070 system render again, but using both cards?
This topic is, after all, based around the 2 lesser cards vs 1 faster card.
Would be nice to see how well the render time scales with two cards on a more complex scene.
That would be interesting. To the extent you believe the results for the benchmark scene from Sickleyield, it appears that the results should be fairly close. A single 1080ti rendered at 2 minutes, a single 1070 at 3 minutes, and dual 1070's at 1.75 minutes. Given the margin of error I'm guessing it's a draw between the 1080ti and the dual 1070's. It will be interesting to see if that holds with a larger scene.
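For what it's worth, the scaling implied by those benchmark numbers can be sanity-checked with a quick calculation (the times are the figures quoted above from the Sickleyield benchmark, not independent measurements):

```python
# Benchmark render times in minutes, as quoted in the thread
times = {"1080ti": 2.0, "1070": 3.0, "2x1070": 1.75}

# Speedup of two 1070s over one 1070
dual_speedup = times["1070"] / times["2x1070"]    # ~1.71x

# How the dual-1070 setup compares to a single 1080 Ti
vs_1080ti = times["1080ti"] / times["2x1070"]     # ~1.14x

print(f"Dual-1070 speedup over single 1070: {dual_speedup:.2f}x")
print(f"Dual-1070 vs single 1080 Ti: {vs_1080ti:.2f}x")
```

With margins of error on a 2-minute benchmark, that ~1.14x is close enough to call it a draw, which matches the read above.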
Yes indeed,
I have just started the render now.
The reason I posted the single card head to head, was just to demonstrate the actual performance difference between these two cards.
I should have the dual-1070 result for you in about 50 minutes! ;)
Cheers,
S.K.
The argument about the GTX 1080 being a dud is not relevant to its primary purpose. Keep in mind that first and foremost these are video gaming cards. Also keep in mind that the GTX 1080 came out almost a year before the 1080 ti.
The original pecking order was as follows:
$199 - GTX 1060 3GB
$249 - GTX 1060 6GB
$449 - GTX 1070 8GB
$699 - GTX 1080 8GB GDDR5X
$1200 - Nvidia Titan X 12GB GDDR5X
Earlier this year the 1080 ti and Titan XP came out and shifted the pecking order (this list is based on MSRP, not mining-inflated prices):
$109 - GTX 1050 2GB
$139 - GTX 1050 ti 4GB
$199 - GTX 1060 3GB
$249 - GTX 1060 6GB
$379 - GTX 1070 8GB
$449 - GTX 1070 ti 8GB (rumored; specs leaked accidentally by card manufacturers)
$499 - GTX 1080 8GB GDDR5X
$699 - GTX 1080 ti 11GB GDDR5X (replaced the Titan X, minus 1GB of RAM and associated internal hardware, but with the same number of CUDA cores)
$1200 - Nvidia Titan XP 12GB (full unlocked Pascal chip)
For Gamers the GTX 1080 still fits nicely between the 1070 and 1080 ti in both price and game FPS, though that may change when the 1070 ti comes out.
For Iray rendering, the GTX 1070 and 1080 both having 8GB of VRAM removes most of the incentive for the more expensive card. The only version of the GTX 1080 I would consider getting is the one inside the Nvidia Quadro P5000, because it is paired with 16GB of GDDR5X VRAM, though it is a $2000+ card right now.
Yeah, there's gaming. The mining craze has messed up the normal price structure, and I think the 1070ti is a reaction to that, although the move by Nvidia isn't unprecedented. I recall the GTX780ti (or was it the 750ti?) was a wicked card.
So the results are in!
Machine 2: running 2x 1070s, exact same render settings as the prior test.
Scene Load Time: 3 minutes, 19 seconds (virtually the same load time as for a single card).
Scene Render Time: 44 minutes, 30 seconds.
Card scaling was actually better than I expected, but it definitely helps to have two cards of the EXACT SAME MAKE AND MODEL when it comes to scaling.
Cheers,
S.K.
The merits of two cards vs one is an interesting topic; it doesn't need spicing up with vinegar. Please keep the discussion civil and drop the digs.
So given the render results, allow me to attempt a simple cost/benefit analysis to bring together all the issues that have been discussed, so we have a clear picture of which option, if any, is preferable. Feel free to correct, add, or modify what I have:
Cost:
Retail prices: 1080ti: $750, 2x 1070: $900
Cost of electricity: negligible difference for most users
Additional power supply requirements: an extra 50 watts for 2 x 1070 (1080ti: 250 watts vs. 2 x 150 watts = 300 watts)
PCI space requirements: additional PCI slot for 2 x 1070
Benefits:
VRAM: 11GB vs 8GB
Render speed: negligible difference
Redundancy in case of failure with 2 x 1070
So from my perspective (and I may be missing something) the real cost vs benefit comes down to this:
With the 1080ti you're paying $150 less than the cost of the 2 x 1070, you MIGHT be saving a larger power supply, you're saving an additional PCI slot, and you're getting 11GB VRAM compared to the 1070's 8GB, but you're getting the same render performance as 2 x 1070.
So what's the benefit of 2 x 1070? Not much. Seems like you're paying $150 for redundancy protection, if that really matters to you. And maybe you're paying more if you have to change your power supply to accommodate the two cards.
So bottom line, in the choice between 1 x 1080ti and 2 x 1070, it doesn't seem like there's any reason to pay the extra $150 for the 2 x 1070.
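The analysis above can be sketched as a quick calculation, using the retail prices and wattages quoted in this thread (not current market figures):

```python
# Prices and power draws as quoted in the thread
price_1080ti = 750
price_dual_1070 = 2 * 450          # $900 for the pair
watts_1080ti = 250
watts_dual_1070 = 2 * 150          # 300 W total

price_premium = price_dual_1070 - price_1080ti    # $150 extra for 2x 1070
extra_watts = watts_dual_1070 - watts_1080ti      # 50 W extra draw

print(f"Extra cost of 2x 1070: ${price_premium}")
print(f"Extra power draw: {extra_watts} W")
```

So the $150 premium buys redundancy and roughly equal render speed, while giving up 3GB of VRAM and a PCI slot.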
That's what the numbers say.
It's all pretty consistent, which is pleasantly surprising.
Purchasing choices, and the whys and wherefores of such things, are very much a personal thing. I bought my pair of 1070s back in March, before the price hike and JUST before the 1080 ti was released. I picked up the pair for £680.00, which is comparable to a typical 1080 ti price today.
My reason to purchase a pair then was that, at the time, the price of the 1080ti was not confirmed, nor was its relative render performance, so I took the gamble and got the 1070s (a purchase I don't regret at all). Consider this: I was getting virtually the same performance as a Titan X/980ti for HALF the cost. It was a no-brainer. But this market moves fast, and the eventual price of the 1080ti was a shocker; considering the Titan XP, which is virtually the same card, was selling for around £1100.00, it was not beyond the realms of probability that NVIDIA would be asking £800.00+ for its 1080ti.
Cheers,
S.K.
And consider that the next-gen Volta is just around the corner, which promises a huge leap in performance; this market is very fast moving. By buying only one card today, you can leave a slot or two open for the GTX 1180 tomorrow. It's good to know that Iray scales so well. You get pretty much double the performance from two graphics cards compared to one. This is Octane and Redshift territory.
I think a little correction on the pros of the 1080 Ti, one big one I think you missed...
Much faster time to load the scene onto the card. This is huge when you're making little tweaks: lighting changes, adjusting fits to fix poke-through (because HD changes in the mesh aren't shown in preview), etc.
The electrical cost: if you're running AC, you can double it, since whatever extra heat you pump out makes your AC run harder. I pretty much run it all year, so it's fairly significant.
I have to strongly disagree on that one. We're talking the equivalent of a compact fluorescent light bulb. 50 watts. Yeah, it's easy to say "well, it will heat up the house and cause the A/C to run all day," but unless you have some data to support a single light bulb heating up the house, I think it's just unfounded speculation. I really doubt a single light bulb will have any significant effect on house temps that will tip the A/C over the brink and cause it to go on when it otherwise wouldn't.
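To put rough numbers on the 50 W in dispute, here is a hypothetical back-of-envelope calculation; the $0.12/kWh rate and 8 hours/day of rendering are assumptions for illustration, not figures from this thread:

```python
# Assumed figures (illustrative only): 50 W extra draw,
# 8 hours of rendering per day, $0.12 per kWh
extra_watts = 50
hours_per_day = 8
rate_per_kwh = 0.12

kwh_per_day = extra_watts / 1000 * hours_per_day        # 0.4 kWh/day
cost_per_year = kwh_per_day * rate_per_kwh * 365

print(f"~{kwh_per_day:.1f} kWh/day, about ${cost_per_year:.2f}/year")
```

Even before any A/C multiplier, the direct cost of the extra draw is on the order of tens of dollars a year under these assumptions, which is why it's listed as negligible for most users.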
When I am running a render, even on just one machine rather than both of my render rigs, the room that my computers are in is noticeably warmer than when I'm not running anything. Definitely enough warmer that if there was a thermostat anywhere near it would most definitely kick on the AC. So while it might not be able to heat up the entire house noticeably, it will affect your AC depending on where the temperature sensors are.
One thing I'm curious about: If I have a 1080ti and a 1070 running in the same machine, do I still get to use all of the vram on the 1080ti, or does the 1070 limit what I can use? (I know it won't use both cards for available VRAM for loading the scene, I'll be putting the 1080ti as primary, but I thought I'd read somewhere that the scene has to be able to fit on either card or it will still fail to CPU)
100 vs 118 may be irrelevant if that's what it really comes down to. However, resolutions for PC gaming keep going up as well. Many people today, like myself, have a 2560x1440 display, and 4K displays will soon be common. At these kinds of resolutions it may be the difference between 45 and 60 fps, which is pretty substantial. I also run a 165Hz monitor, which may sound like overkill, but the difference it makes has to be experienced to really understand how much the human eye can see. You would notice instantly, just by moving a window around on the desktop, that it feels rather like sliding a sheet of paper around on your table. There is no stuttering, juddering, or whatever we wanna call it. You probably feel that isn't the case on your 60Hz monitor either, but again, one has to experience what it actually feels like at 165Hz to appreciate that 60Hz isn't nearly enough.
Also, 100 vs 118 may be mostly irrelevant but that's not how games work. They vary in framerate as you get to areas that are more demanding or more enemies are on screen, fps may drop to 50fps instead of 60 in places which would be more dramatic. Is that all worth 300 bucks though? That is of course for everyone to decide individually and of course it's mostly down to the financial capabilities of people if we're honest. There's a fair bit of vanity too, as you can usually drop some visual eyecandy in the settings to make the game work fine on lower end hardware and it usually won't look completely horrible because of it. But there's the nagging thought of it not looking as good as it could. First world problems for sure.
In general, fps myths seem to be some of the hardest to kill. In 2017 you still sometimes come across people saying that they are happy with 30 fps and that the eye cannot see past 15 fps anyway. They heard at some point in their life something about 15 fps, completely misunderstood the subject and then stick to that knowledge for the rest of their life. 15 fps is of course merely when the human eye begins to see motion instead of single images.
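One way to make the refresh-rate comparison above concrete is to look at milliseconds per frame rather than Hz; a quick sketch:

```python
# Convert a refresh/frame rate in Hz to the time budget per frame
def frame_time_ms(fps):
    return 1000.0 / fps

# Rates mentioned in the discussion above
for hz in (60, 100, 118, 165):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.2f} ms per frame")
```

Going from 60 Hz to 165 Hz cuts the per-frame interval from about 16.7 ms to about 6.1 ms, which is why the difference is so visible even when just dragging a window.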
That's fine, but we're talking about a difference of 50 watts, not a total system power of say 300 watts or 500 watts or more that you're referring to.
It depends on a lot of stuff. The amount of time you're rendering, the size of the room, the insulation in the walls, the outside temperature, the settings of the A/C.
Yeah, 2 x 1070 might warm the room a couple of degrees more than a single 1080ti, but is that enough to keep the A/C on longer than it normally would? It's very complicated, and the answer depends a lot on each person's situation. There's no way you can make a blanket statement.
It's a bit like me claiming an additional benefit of only one 1080ti is that the 2 x 1070's generate a lot more harmful radiation that will cause cancer and make me die. Yeah, okay, sounds like one of those things that might have some effect, but without data it's just wild speculation.
You would be limited to the card with the lowest VRAM if using both cards together to render a scene.
However, if you disable the 1070 and leave the 1080ti enabled, you will be able to use all of that card's VRAM, but of course you will only be able to render on that card. I sometimes have to do this for larger scenes.
Cheers,
S.K.
Thanks bluejaunte. I guess when I watch GPU benchmark videos for various games it seems like the environments and textures, while relatively high rez, are still clearly CG, and don't seem anywhere near worthy of 2k or 4k displays. So the idea of needing these high framerates, especially when every game seems to be nothing more than a guy/girl running and shooting in a clearly CG environment, just doesn't compute for me.
I suppose I should actually try a video game, but since my eyes have some difficulty even with 1920 x 1080 I'm probably not the best judge.
Of course anyone can come up with an extreme scenario to prove the point, but practically I think it would be very tough to prove a 100 watt lightbulb affects your A/C bill. On a really hot day, your A/C may go on at say 10am, stay on all day, and go off at say 7pm just because of the outside heat. Your 100 watt light bulb has no effect on that whatsoever. The A/C is on whether your 100 watts is rendering or not. So does it keep your 1kW air conditioner on longer than it would otherwise be on? Does it stay on all night because of a 100 watt 2 x 1070, if it's even still rendering? Probably not, because it got cool outside and the house cooled off by itself.
I think if you give it some thought you'll quickly realize, at best, it's a VERY slight difference in A/C bills, if at all.
I live in Wisconsin and during the winter only turn on the heat enough to keep pipes from freezing because the room where I spend all my time is effectively heated by PCs. :\
If the air is on all the time to start with there probably isn't any effect on a bill though, you'll just have a warmer room than you might like.
Scott, since it's summer, maybe you can do a simple test for us. Track how many hours your air conditioner stays on today, and then tomorrow turn on a 100 watt light bulb all day and night and see if it stays on longer.
ebergerly is assuming a situation where the AC is running constantly and thus does not pump out the extra energy. In the situation where it isn't you are obviously correct.
Also the outside temperature is a huge factor in cooling of the house, just as it is with heating in the first place. As is the amount of insulation. For example, if it gets cool at night, depending on how good your insulation is, that may have a quick and significant effect on cooling, so that the A/C doesn't have to stay on. That also is basic thermodynamics, but recognizing there are other factors than A/C cooling. I think you're assuming that only the A/C cools the house?
And you're exactly right there are a bunch of factors, like I said. My only point is that you can't make a blanket assertion without facts to support. Just like my blanket assertion of radiation from a second 1070 causing cancer.
We can't simplify based on general ideas and expect it to have legitimacy.
Thanks for the info, that definitely adds another wrinkle to the calculations for me, since for me the primary reason to get the 1080ti over the 1070 is the extra VRAM. It sounds like I need to decide if I'd rather have the extra VRAM for the bigger scenes, or a second 1070 for faster renders.