CPUs with integrated graphics?
RexRed
Posts: 1,374
in The Commons
Can a CPU with "integrated graphics" help the graphics card with rendering more than a CPU without integrated graphics can?
Is there a boost over a CPU without integrated graphics?
Comments
There was a time when AMD iGPUs could work in tandem with discrete AMD GPUs using Crossfire, but from what I've read this is no longer supported. It would also have needed motherboard support to enable it. And of course there are no current desktop processors with Nvidia graphics built in, so SOL there.
I don't even know if Crossfire and SLI are a thing anymore like they once were, but that's how they would be combined.
The motherboard would have to have some kind of 'hardware/software bridge' built into it.
Integrated GPUs were mainly designed to allow for more compact systems.
From my experience with my systems, I would say no - make that an absolute no - since on most motherboards the video output actually comes from the CPU. My i9's video is on the CPU. As for AMD, inception8 seems to know more about them. On my other system, the motherboard has an HDMI out but gives no video because of the i7 CPU.
Ahh. Yeah, I read your other post about the Nvidia 3090s, and it looks like you said you have an AMD CPU (presumably with integrated graphics). Incompatible hardware to start with. I wouldn't even know why you'd need a boost with two of those in a presumed SLI configuration.
In general, if one is using a dedicated GPU (an add-on card), one is better off if there is no integrated GPU, as it keeps the system less complicated and less prone to conflicts.
I agree. Graphics processing is basically about doing vast amounts of heavyweight arithmetic very fast, which also implies a very heavily loaded memory bus. Dedicated GPUs, with their own memory and memory bus, are bound to work a lot faster than a combined CPU and GPU, where the two functions will be constantly fighting one another for memory access.
Cheers,
Alex.
If the integrated GPU supports OpenCL (I believe most modern ones do), then it may help with dForce simulations, but it will only aid with rendering if it is an Nvidia GPU, and I think that is mostly not the case.
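If you want to check whether your iGPU actually exposes OpenCL, here is a minimal sketch that simply lists every OpenCL device your drivers report. It assumes Python with the pyopencl package installed and is only an illustration, not anything Daz-specific:

# List all OpenCL platforms and devices the installed drivers expose.
# An iGPU that supports OpenCL should show up here alongside any dGPU.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(f"{platform.name}: {device.name} "
              f"({device.global_mem_size // (1024 ** 2)} MB global memory)")

If the iGPU appears in that list, dForce should at least be able to see it as a simulation device.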
No, they are too slow compared to dedicated GPUs. The exception is if your scene is so big that it won't fit in VRAM and you are rendering in Blender.
An integrated GPU gives you some troubleshooting ability if your dGPU is not putting out a display. Too many people seem to overlook this. GPUs can have all kinds of issues, and being able to get a display up so you can try to fix them is very nice. Honestly, this is the best benefit of an iGPU, and if you do not have another GPU sitting around, it is very helpful.
However, the iGPU cannot run Iray, so it is of limited use for Daz. Now, you may have the option of using the iGPU as your display and not hooking a monitor to your dGPU. This would allow you to maximize the dGPU's memory for Iray, because driving the display will use a little VRAM. So if you are trying to get every drop of memory out of your GPU for Iray, this may be an option.
But none of these are as important as getting a good dGPU in the first place. There is no substitute here for GPU power when it comes to Iray. The rest of the PC you have might be all super beast parts, but if your GPU sucks you are going to hate rendering Iray. You will hate it a lot.
When it comes to Iray the GPU is king. Period. You can have a relatively weak CPU you picked out of a trash can, but if you have a 3090 you will be fine with Iray. That is just how it is. There are some other factors, like VRAM and RAM, which cannot be overlooked, but it comes down to GPU.
How does the CPU render Iray when the GPU's memory is insufficient?
The CPU is not Nvidia?
Just wondering.
If the CPU can render Iray, why can't the integrated graphics assist somehow in that process?
When I turn on my CPU to assist in rendering, it spikes intermittently on each iteration, all the way up to nearly 100%.
The RAM hits 35%, and my two 3090s are not in SLI mode yet either.
Because the integrated graphics device doesn't have CUDA cores.
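To see this for yourself, a small sketch like the one below (assuming Python and an installed Nvidia driver, which ships the nvidia-smi tool) lists only the CUDA-capable Nvidia GPUs the driver knows about - an Intel or AMD iGPU will never show up in that list, which is exactly why Iray cannot use it:

# List the CUDA-capable GPUs the Nvidia driver knows about.
# Integrated Intel/AMD graphics will never appear in this output.
import subprocess

result = subprocess.run(["nvidia-smi", "-L"],
                        capture_output=True, text=True, check=True)
print(result.stdout)   # e.g. "GPU 0: NVIDIA GeForce RTX 3090 (UUID: GPU-...)"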
I still have my old Alienware M17 laptop, which uses both integrated and dedicated GPU capabilities. It depends on the OEM how they set up their configurations. In the case of Alienware, they routed all external display ports via the integrated Intel graphics processor even though the system had a dedicated GTX 980M inside. Initially I was OK with that, but it soon got annoying once I started using a high-frequency, low-latency external monitor, as the Nvidia GPU would complain that the monitor was not connected directly to it. A lot of people use integrated graphics for driving the display unless they require very high resolutions or have multi-monitor setups.
I would say having integrated graphics helps as 1) a backup option in case the dedicated graphics card malfunctions, and 2) a way to offload display duties from the dedicated GPU. At the least, the dedicated GPU never becomes a single point of failure. The only trade-off I see is that you can't upgrade the integrated graphics as conveniently as a dedicated GPU, which should be fine as long as the display requirements don't change.
You can use it to run a monitor.
My second monitor uses my onboard AMD Vega graphics. I would run both monitors on it, but many programs require my Nvidia card to be connected to the display in order to use it.
Daz, Octane and Blender don't, so for those you can run all your monitors off the motherboard to free up your GPU for rendering.
I believe switching takes place in laptops; there's a handoff. This is built into the capabilities of the motherboard architecture by design: the GPU is the workhorse for all heavy graphics needs, and the iGPU handles everything else that doesn't need that kind of horsepower.
This would make sense in terms of power consumption.
So far nobody has provided any proof that using a lesser GPU to feed the monitors offers any benefit when using Nvidia "consumer" GPUs for Iray rendering.
I'm not sure it does, per se, though a relatively recent Windows update may have allowed it to have some impact (by reducing the resources used for unconnected devices). However, some applications do run their own code on the GPU (e.g. OpenGL features), and having those on a separate card should help - though of course the ideal would be to close other applications while rendering, which would make the point moot.
Good day. It can help, but only by a tiny percentage, and it adds a lot of wear and tear plus heat to your system. Not worth it.
To be more specific, when I started with Daz Studio I did not have a graphics card, just integrated graphics. It would take all night to render a scene. Later I got a moderate graphics card (300 cores), and the same scene would be done in an hour (adding the integrated part might shave a few minutes off the time). Later I got a more powerful card (3000 cores, now antiquated), and I could render the scene in 10 minutes.
Say what? That is literally what GPUs are designed for: running a display. Running a display requires some resources, the most important being VRAM. While Windows reserves a small amount of VRAM on all GPUs in a machine, the fact remains that having a display connected to a GPU uses MORE resources than not having a display connected. I am not sure how that statement can be disputed, especially with modern software, where so many of the GUI elements run on the GPU when available. Those things are not free.
I used to have two 1080 Tis running Daz Iray. I could have a scene where one of the 1080 Tis would drop out of rendering, leaving the one GPU that was NOT connected to a display to render the scene by itself. Before that, I had two cards with 4 GB of VRAM each, and I very frequently was able to get one card to drop and have the other one keep working, and again it was ALWAYS the card not connected to a display that kept working.
And just to verify, I have always used MSI Afterburner or other software to monitor how much VRAM each card uses. When that first card drops from Iray, Afterburner shows it has exceeded its VRAM. What a surprise. How much more evidence do you really need here?
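For anyone who wants to reproduce this without Afterburner, here is a rough sketch (assuming Python and nvidia-smi on the PATH) that polls per-GPU VRAM use while a render runs, so you can compare the display GPU against the headless one yourself:

# Poll per-GPU VRAM use so the display GPU can be compared with the headless one.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=index,name,memory.used,memory.total",
         "--format=csv,noheader,nounits"]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    for line in out.strip().splitlines():
        index, name, used, total = [field.strip() for field in line.split(",")]
        print(f"GPU {index} ({name}): {used} MiB / {total} MiB used")
    print("-" * 40)
    time.sleep(5)   # sample every 5 seconds

Run it in a console during a render and the difference between the card driving the monitors and the card that isn't shows up directly in the numbers.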
It is not a huge difference. But it exists, and yes, it can matter. If every byte of data is important, having a lesser GPU or iGPU take care of the display frees up as many resources as possible for the main GPU to render. If it is not important (and a 3090 with 24GB tends to help here), then it is not a big deal.
You are also one of the few people I have ever seen recommend against iGPUs. Things have changed. If iGPUs were a problem, it would be something people would talk about. Especially gamers. Gamers tend to look for literally any kind of advantage. Most of the Ryzen series lacked an iGPU, but I never saw that mentioned as an advantage over Intel because of some problem with iGPUs. There would be tech tubers jumping on this hot, juicy news, certainly the ones who favor AMD, and there are plenty of them. Whatever problems there might be are HIGHLY overshadowed by the benefits of having an iGPU. After I built my 5800X system, while I love my system, I miss having my iGPU.
I have a friend who had his GPU die on him. He had no backup GPU and no iGPU. So guess what? He had no computer while he waited a painfully long 4+ weeks for his RMA to get back. If only he had an iGPU, he would have at least been able to turn on his PC, browse stuff, and maybe even play some very old games. But he couldn't even do that. For an entire month. Not everybody has GPUs just sitting around.
So, I have 3 monitors at 1920x1200 connected to my 3060, and that takes 200 MB of VRAM (on W7). How much VRAM does Windows take on an Nvidia 'consumer' GPU that has no monitors connected to it, and which driver is the GPU using, WDDM or DCH?
Having W10 take 1 GB of VRAM (800 MB more than on W7 with 3 monitors) is proof that the amount of VRAM taken is not about the number of monitors connected to the card.
Evidence, as in tests and results - numbers that others can replicate and confirm, just like any other study. So far none has been given. See https://www.daz3d.com/forums/discussion/comment/7751846/#Comment_7751846 for an example.
My recommending against iGPUs is based on experience troubleshooting systems with 'strange' problems. You are assuming that those gamers etc. are knowledgeable enough to figure out and understand which problems are caused by which piece of hardware, drivers and/or software - when testing and troubleshooting, the first requirement is that one is able to track down the culprit and not just blame it on "computers are always crashing". Mine sure aren't.
"Whatever problems there might be"... Sounding like a teen installing any and all trial versions for whatever games/software he ever finds, and curses because "computers never work..."
Knowing the tricks to get a classic VW Beetle to win a race against a brand-new turbo Porsche does not mean those same tricks can be used to make that same Beetle tow a 20,000 lb trailer day in, day out, reliably for years.
If one is so afraid of having a GPU fail, one can buy a spare GPU for pocket change - no need to install a third axle on one's car just in case one gets a flat tyre at some point.
In general, the less complicated a system is kept, the less potential there is for problems/conflicts. A GPU is also such an important part of the system, operating at such a low level in the hierarchy, that having hardware and/or drivers for two GPUs, especially ones with different architectures, in the same system is asking for problems.
"Tests made using RTX 2070 Super (8GB), i7-5820K, 64GB's of RAM on W7 Ultimate and DS 4.15"
You tested with *ONE* GPU in a computer and concluded that using a separate GPU for the display would have no benefit.
Do you not see the problem with this testing method?
You did not even test the thing you said has no impact. Your conclusion is nothing more than a flawed assumption.
I HAVE tested with multiple GPUs. I have used multiple GPUs for years now. I have screenshots of my GPU stats while rendering. I have posted them in these forums at different times. And my data with my system shows that the display GPU is always using more VRAM than the GPU not connected to a display. And not only do I have these numbers, but I also have the experience of seeing a GPU drop out of rendering because it exceeded VRAM while the other continued rendering...and both GPUs had the same VRAM capacity. And I am far from the only person who can say this, since I am far from the only person who has used multiple GPUs. Pretty much anybody who uses multiple GPUs can tell you this. It doesn't have to be me.
If you want to make this bold claim, you need to actually test it properly instead of making assumptions. Put another GPU in your system, especially an 8 GB one to match your 2070. Then work on making a scene that comes as close to that 8 GB as possible. Eventually you WILL reach a point where the display GPU refuses to render while the secondary GPU does. I 100% guarantee this.
As for your iGPU claims, you did not back them up with any numbers. Since you do this for a living, how many instances do you see in a given week or month where you can be 100% sure the iGPU caused a problem? I am not saying that iGPUs never have an issue, but I am saying that these issues are uncommon to the point that it is silly to universally state that all iGPUs are bad.
I also have my very own iGPU experience, having used one for years without these weird issues you keep talking about. The only time I had issues was when I had a lightning strike. And Daz Studio likes to crash sometimes. But that is Daz. My gaming has always been more stable. My newest PC has no iGPU, but I am by no means terrified of buying a CPU that has one. I bought a Steam Deck recently, which uses a little iGPU. It has been pretty rad. That iGPU setup is what makes it possible as a mobile chip at all. iGPUs allow very low-power devices to exist. There are GPUs in practically everything now except toasters.
You also might want to have a chat with AMD, since all of their new 7000 series CPUs have an iGPU included. I guess they didn't get the memo. AMD is also very clear on what these iGPUs are: they only included 2 compute units, so they are not at all intended for gaming. They exist solely for the purpose of having a display adapter if needed. That sounds pretty practical to me, and come on, AMD could have saved millions by not including an iGPU in these chips. Building a CPU without an iGPU is far cheaper. But they chose to include one, so clearly there must be a reason why, and I don't think they did it just to annoy you.
Many iGPUs are FREE with the CPU, like the Ryzen 7000 series parts, since they don't even have a model without one. Intel only charges a paltry $20 for the iGPU, sometimes less than that. You will not find many dGPUs for $20. You will also not find many dGPUs that use less power than an iGPU does. So you spend more money and use more power just so you can avoid this evil and scary iGPU. I have to say this is just getting comical at this point.
Also, the tech tubers I talk about do not exist in a vacuum. They talk to each other, and they talk to AMD and Intel. They will raise any problems they have with these companies; they do not just go "Oh welp, it doesn't work!" Support is a part of the review process. To say that reviewers are always clueless about the equipment they review is just false. Some of the people who do reviews actually worked in the industry, some have side jobs, and at least a few have experience in PC repair.
There's more, too. Having the iGPU can give a real-world performance boost in various creative software. If you wanted proof that an iGPU can be a benefit, you haven't looked very hard. The difference between the K and KF SKUs can be 30% or more in these particular applications. Don't believe me?
Maybe you should have a chat with this guy. If by chance you believe this man is not telling the truth, then I suggest looking up Puget benchmarks.
This is a list of 12900KF Adobe Photoshop scores
https://www.pugetsystems.com/benchmarks/?age=30&benchmark=PugetBench+for+Photoshop&application=&specs=12900KF
This is a list using the 12900K
https://www.pugetsystems.com/benchmarks/?age=30&benchmark=PugetBench+for+Photoshop&application=&specs=12900K#results-table
If you look at the scores you will find a consistent trend: the 12900K scores 30% higher across the board, and it does so regardless of the dGPU used in the system. Adobe Photoshop is very much CPU-bound, but the software actually does leverage the iGPU found in Intel chips to boost performance. This is an actual benefit of the iGPU. There is no other difference between the 12900K and the 12900KF beyond the iGPU, so if the iGPU did nothing, these scores should in theory be the same - they are not. Now maybe this is not about Daz Studio, but I would wager that a good many Daz users use some photo editing software given what they do, and that many of them use Photoshop. So this is relevant to their interests.
I am not sure what else there is to say here. Having the iGPU can help performance in certain apps even with a good dGPU. Having an iGPU gives you the ability to troubleshoot in ways you cannot without one. You can use an iGPU for extra monitors without needing to spend money and energy on a second GPU. AMD must think they have value, since they include them across their latest product line. Intel certainly believes in iGPUs, as they have been standard for generations and are always included on their flagship SKUs. Obviously Intel must believe they are important, because their chips are still monolithic in design: by designing a chip with an iGPU, Intel is actively giving up space they could be using for CPU cores, yet they keep designing everything with the iGPU in mind. The F-SKU chips are simply chips that have the iGPU disabled.
This is like saying we need to ban cars because cars can be in crashes. But as with iGPUs, the benefits of a car far outweigh the negatives.
Keep the discussion civil, please.
Outrider42, the test results I linked to were an example of how one could show one's findings. You have been the one saying that using a lesser GPU to drive the monitors lets the GPU used for rendering devote more of its resources to rendering. I'm still waiting for the numbers, because I want to see whether there is in fact a benefit to having such a setup.
iGPUs were invented and brought to market so that one could sell cheaper computers to the masses who do just simple things with their computers. When looking at systems meant for 'serious' business, there are no iGPUs there.
As I have said many times: the less complicated a system is kept, the less potential for problems there is. Ever heard of drivers that just don't want to work together in the same computer?
Well, at least I can use Daz Studio with the latest Nvidia driver and run dForce simulations with my iGPU.
I learned the hard way that APUs are a bad way to go. They're bottlenecked like crazy, and because they don't have any dedicated RAM, they pinch from your system RAM.