Two graphics cards?

Can someone explain how two graphics cards in one computer work? Is it worth it for Daz? Does it affect gaming at all? Is this the SLI thing I keep seeing when I go to look at graphics cards? Would it be worth buying two cheaper cards over one 1080 Ti? (my budget lowered recently) (with Windows 10)
Comments
decent video
some things left out of the video.
2 video cards with 4 gig of memory each still only = 4 gig of memory
When rendering with Iray in Daz you want more CUDA cores and memory
For video games, one video card renders the even-numbered lines on the screen, while the second card renders the odd-numbered lines. The problem is that many games/applications do not utilize SLI properly or at all (you typically get 20-30% more frame rate in games where SLI is properly utilized). If game developers optimized their games for SLI, the performance gain would be great. But in my experience SLI is more of a headache: many games and apps have problems with it, such as frame-rate stuttering or crashes, unless you disable SLI.
A single more powerful video card over 2 cheaper cards is better in my opinion. A single 1080 Ti is an absolute monster with Daz. Keep in mind that if you choose to use 2 video cards, the video memory does not stack. So if you have 2x video cards each with 4GB, your PC will not have 8GB of video memory, it will only have 4GB. The 1080 Ti has 11GB. I used to have 2x GeForce 970's in SLI, which have more total CUDA cores combined than the 1080 Ti, but when I switched to a single 1080 Ti, the performance gain was significant.
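A quick sketch of the VRAM rule above (the card sizes and core counts are illustrative numbers, not a real query of any hardware):

```python
# Illustrative sketch of how Iray "sees" multiple cards: VRAM does
# NOT stack (each card must hold the whole scene), while CUDA cores
# do roughly add up for render speed. Numbers are examples only.

def usable_vram_gb(cards_gb):
    """The usable scene budget is the smallest card, not the sum."""
    return min(cards_gb)

def total_cuda_cores(cores_per_card):
    """Render throughput, by contrast, scales roughly with total cores."""
    return sum(cores_per_card)

two_970s = usable_vram_gb([4, 4])    # -> 4, not 8
one_1080ti = usable_vram_gb([11])    # -> 11
```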
...one reason to have two cards. One with enough memory to drive the monitor(s) and a high-VRAM one solely dedicated to rendering.
You can have two cards without SLI? How does that work? Because that sounds like a really good idea. Especially if I listen to my sister and use both monitors.
...yes you can run two GPU cards without SLI.
If you are going to be using two cards for rendering you don't want to use SLI.
To run two cards without SLI, just install both cards without linking them with an SLI bridge (or disable SLI in the NVIDIA Control Panel), then tick each GPU you want Iray to use under Render Settings in Daz Studio.
I agree with the above post. A single graphics card will always be better for Iray rendering than 2 cards that add up to the same price.
This is very important in Iray because of VRAM considerations. (the entire scene needs to fit into the VRAM of EACH card participating in the render job).
For Iray usage, 1x GeForce GTX 1080 with 8GB of VRAM is better than 2x GeForce GTX 1060 6GB cards for these reasons:
You're much better off with a 1080 Ti than with two smaller cards.
1. For Iray the VRAM will be the biggest benefit; as mentioned, it doesn't add up across multiple cards. Each card has its own total and will drop out if there isn't enough VRAM.
2. With the same number of cores, the 1080 Ti will likely be faster to some degree.
3. It's likely to draw less power. Besides the electricity, which won't be a huge cost, the bigger power supplies needed to support multiple video cards are more expensive.
4. For gaming, SLI does not scale nearly as well as Iray does with multiple cards. Also, some games don't support it, and then you're stuck at the performance of one card. I don't run into too many that don't support it, but it happens, and I don't game very much.
ETA: If you do run multiple cards, then you may wind up turning SLI on and off constantly as you go back and forth between gaming and rendering which is a bit of a PITA as you have to close out of many applications.
I"ve been wrestling with the same question for a while. And what has been said is true, though there are some caveats IMO:
WinterFlame will also use it for gaming, and it also happens to be great for gaming.
And it's an awesome value: less than a year ago, for that performance you had to spend $1200 on a Titan X Pascal. Now for $700 you can get about the same performance, basically just dropping from 12GB to 11GB. Last year you could buy a 1070 for a little less than they cost now.
The benchmark thread isn't perfect; we specifically know it has taken out loading the scene into VRAM. It's also been pointed out that it's a simple scene whose performance may not match larger, more complex scenes.
Scott, "great" and "awesome" are meaningless. It depends.
For gaming, are you running a game where the difference between the GTX 1080 Ti and, say, a GTX 1060 is going from 90 FPS to 120 FPS? Do you care? Is it a graphics-intensive game that taxes the system or not? Are you running 4K and really, really need a 1080 Ti to get 60 FPS, where without it you only get 40 FPS?
It depends.
And whether the benchmark is perfect is irrelevant. If you don't like the numbers, then add or subtract as much as you want and make your determination based on that. If you don't believe the 35% improvement that NVIDIA quotes for a 1080 Ti over a 1080, then use your own numbers. Whether it's 35% or 40% is probably irrelevant for most people.
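For anyone who wants to put actual numbers to it, the percentage math is simple (the FPS figures below are illustrative, not measured benchmarks):

```python
# Quick arithmetic behind the "is the 35% claim legit?" debate.
# FPS numbers are made-up examples for illustration only.

def pct_improvement(base_fps, new_fps):
    """Percentage gain of new_fps over base_fps."""
    return (new_fps - base_fps) / base_fps * 100

# A claimed 100 -> 135 FPS jump is a 35% improvement:
claimed = pct_improvement(100, 135)   # 35.0

# A minimum-framerate bump like 44 -> 48 FPS is only about 9%,
# which may land in the "barely noticeable" category:
minimums = pct_improvement(44, 48)    # ~9.1
```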
Ahh, yeah gaming just came to mind too.
In all my years of gaming, I have never thought to myself that I have too much vidya card power. Never. Modern games are written to take advantage of current hardware, and most of the time require turning some details down to get acceptable performance.
Maximum framerates only have meaning for benchmarks. Average and lowest mean the most to the gaming experience. Max doesn't matter when you jump into a congested area and the game turns into a slide show.
One 1080 Ti for 4K gaming? LOL I wish.
I had to turn down settings at 4K with my old Titan X's. Only with my newer SLI Titan X Pascals can I run a game like Battlefront at Ultra and get a mostly consistent 60 FPS - and I still saw some frame-rate drops on a few maps. ONE 1080 Ti? Yeah, you're not going to be using max settings, and that game came out 2 years ago.
Scott, I'm not a games guy (actually never played a video game), but I've seen enough tech videos to know that "awesome" results might be irrelevant for some. I'm looking at one right now where the lowest framerates for a game with the GTX 1080 were 44-46 FPS, and the exact same game with the GTX 1080ti were 48-58 FPS (the range is for 0.1% to 1% average).
Now, how many people would consider that "awesome" and want to pay the extra money? Seems to me that would be in the "barely noticeable" category.
Now expand that to the same % improvement in a game with minimums in the 80-90 FPS range. Is that high a framerate REALLY noticeable? Heck, movies on a big screen in the theater run at about 24 FPS. Can you really tell the difference?
If you are confident that you will not need more than 8GB of GPU RAM, then the 1080 or 1070 may well make sense on price/performance. However, if you are concerned that you will need more memory (but not more than the 1080 Ti's 11GB, less OS overhead), then the calculations change, and it becomes a question of whether avoiding CPU rendering will happen often enough to matter. Just looking at the performance in the benchmark scene can do nothing to address that question.
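The drop-out/fallback behavior described in the thread can be sketched roughly like this (sizes in GB are made up, and the function is a hypothetical illustration, not an actual Iray API):

```python
# Rough sketch of Iray's per-card behavior as described above: any
# card whose VRAM can't hold the WHOLE scene drops out of the render
# job, and if no card qualifies, Iray falls back to the much slower
# CPU. Sizes in GB are illustrative.

def iray_participants(scene_gb, card_vram_gb):
    """Return the cards that can render the scene, or a CPU fallback."""
    gpus = [vram for vram in card_vram_gb if vram >= scene_gb]
    return gpus if gpus else "CPU fallback"

iray_participants(5.0, [4, 4])   # -> "CPU fallback": neither 4GB card fits
iray_participants(5.0, [11])     # -> [11]: the 11GB card handles it alone
```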
Agreed. Running out of VRAM is deadly, IMO. Everything in Iray grinds to a halt, even with a many-core CPU. But personally, I have to work hard to make a scene that uses all of my 8GB.
And keep in mind the one factor nobody discusses, scene management, which can do wonders to cut down your VRAM usage. Instead of throwing everything on your hard drive into the scene, break it into sub-scenes for much better response.
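As a rough illustration of why scene management pays off so quickly: an uncompressed RGBA texture costs width × height × 4 bytes, so halving a texture's resolution quarters its memory (this ignores mipmaps and compression, which vary by renderer):

```python
# Back-of-the-envelope texture memory math. Assumes uncompressed
# 8-bit RGBA (4 bytes per pixel); real renderers may add overhead.

def texture_mb(width, height, bytes_per_pixel=4):
    """Approximate VRAM cost of one texture, in megabytes."""
    return width * height * bytes_per_pixel / (1024 ** 2)

full = texture_mb(4096, 4096)   # 64.0 MB
half = texture_mb(2048, 2048)   # 16.0 MB - a quarter of the cost
```

With 4-10 characters in a scene, each carrying multiple 4K maps, shrinking the textures nobody will see up close frees gigabytes fast.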
I dunno where you saw that, I quickly searched and the first two I saw show the 1080 Ti performing a lot higher than a 1070.
http://hwbench.com/vgas/geforce-gtx-1080-ti-vs-geforce-gtx-1070
I looked for another because the first doesn't say what it's measuring, average or max (I'm guessing max). So I found Tom's Guide, which is highly respected in the gaming & benchmarking community.
https://www.tomsguide.com/us/nvidia-gtx-1080-ti-benchmarks,review-4241.html
Notice how the average and minimum framerates are much higher in everything.
No offense, but you seem to have cherry-picked something, and also chose a 1080 vs. the 1070 you have, to make it seem like a 1080 Ti doesn't perform much better than a 1070. It does. Either it was an accident (whatever benchmark that was happened to be the first you clicked on, and you looked at nothing else), or you're intentionally deceiving yourself and others here into thinking a 1080 Ti isn't that much better than a 1070 and not worth the money.
Scott, I don't care what anyone gets. I just happened to be watching Bitwit's channel from March, when the 1080 Ti came out, and he happened to be comparing it to the 1080. He was asking whether NVIDIA's claim of a 35% improvement over the 1080 was legit. And he found it wasn't, at least in the FPS measurements he made in the games he tested, UNLESS you overclock it. Without overclocking it was something closer to a 20% improvement.
All I'm saying is I wish people would put actual numbers to their recommendations rather than some hand waving and saying it's "great" and "awesome" and "better".
BTW, I've already mentioned many times the % improvement many have gotten with the Iray benchmark comparing the 1080 Ti to the 1070. If you don't like that it's fine, but I think it's hard to argue the improvement exceeds 40% when it comes to Iray.
And also BTW, Bitwit's testing ended up determining that the best value in $$ per frame was the 1080 over the 1080ti, even if you overclock the 1080ti.
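The dollars-per-frame value math is easy to redo with your own numbers (the prices and frame rates below are made-up examples, not Bitwit's actual figures):

```python
# "Dollars per frame" value comparison: lower is better value.
# Prices and FPS are illustrative placeholders; plug in current ones.

def dollars_per_frame(price_usd, avg_fps):
    """Cost per frame of sustained performance."""
    return price_usd / avg_fps

gtx_1080   = dollars_per_frame(500, 100)   # 5.0  $/frame
gtx_1080ti = dollars_per_frame(700, 120)   # ~5.83 $/frame - worse value
```

On numbers like these, the faster card can still lose the value contest: it delivers more frames, but not enough more to offset the price premium.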
Easy to see the difference with charts... Look at the second link I provided. Pretty plain that it backs up the assertion that a 1080 Ti is better.
And here's the link to the tests I was referring to:
https://www.youtube.com/watch?v=lxX1_mmmQAA
And Scott I checked your link to Tom's, and it seems that when you compare the 1080ti to the 1070, the performance increase on those particular games ranges from around 40-60%. And using the iray benchmark in this forum it seems about the same range, though closer to 40%.
Nobody would argue it's not "better". But the point is "how much better, and is it worth the money?"
At the end of the day, I'm hoping we can come to a consensus on something like this. I'm winging some of this from memory, but I think we're at least in the right ballpark. For those who have to get down to the exact 1.23%, be my guest:
Hopefully that will help clarify exactly what benefit you're getting for the cost. Again these are BALLPARK numbers, and exact numbers vary depending on the game or application. And I'm being a bit generous to the 1080ti considering the iray benchmark results people have obtained are a bit lower.
You're still leaving out the extra VRAM.
Since 90% of the scenes I render won't fit in a 1070 or a 1080, that's a performance number of zero. I can say my 1080 Ti's perform a bazillion times better.

Perhaps you missed the post where I said "running out of VRAM is deadly" ?
And also that scene management can help you with those issues?
But do what your budget lets you. Not everyone has $700 - $1000 to shell out for a 1080 Ti card.
The question isn't a smaller card vs. a 1080 Ti, it's two smaller cards vs. a 1080 Ti. 1070's are about $450 a pop, so it's a question of $900 for two 1070s vs. $709 for one 1080 Ti.
Absolutely. For many people, going from a 30 minute render to a 15-20 minute render isn't worth the $300 premium. It's still a long damn render.
Isn't there a side benefit that the larger RAM on the 1080Ti will allow you to render more complex scenes with the GPU instead of having DAZ kick it over to CPU rendering?
Absolutely, and that's one of the points being made. The only question is whether a user really needs it for what he or she is doing. Maybe, but maybe not.
Yep. This is the truly huge benefit of a 1080ti that stacking cards won't get you.
For me, I'm looking at a 1080 Ti not for getting a 30 minute render down to 15, but for getting an 8 to 10 hour render down to whatever I can get it to. And that's on ones that do manage to stay in the memory on my 1070. I render at what I admit is an absurd resolution for most people, and that's why it takes so long, but it's the resolution I want to render at. If I use an HDRI for the environment and only one character, then I can knock them out in 10 to 15 minutes. But for what I usually render there aren't any HDRIs that really fit the image I'm going for, so I have to stick with full 3D environments. Also, I find myself sticking anywhere from 4 to 10 characters in the scene. So for those renders, having that much more VRAM is important all on its own, not even considering the performance boost over the 1070. Between the extra memory and IF I can really get a 40% boost by adding a 1080 Ti to my system, then $300 is a lot easier to justify.
...some of the scenes I create could come close to challenging even a 1080 Ti. My biggest scene is about 8.7 GB without the added load of the Daz programme when open. My compositing skills are terrible, and not having a steady hand rules out digital painting of elements like shadows that may be lost in layered rendering, so I need to get as much out of the render pass as I can. When there are a lot of elements in a scene, reducing texture size in a 2D programme can become an exercise in diminishing returns compared to the time saved.