Dual GPU AMD/Nvidia

I brought up this discussion in the newb forum, but now that I've narrowed down the issue, I think this forum is more appropriate to ask for a solution.

I currently have two GPUs in my comp: one is an AMD R9 290, the other an RTX 2070 Super. The AMD card is needed because I have an old monitor (don't ask, lol.. I've already looked into it, adapters will NOT work with this monitor). I may even keep the AMD card in there for now (even after a monitor upgrade), as I've read something in the Daz log that says I can speed up my renders if I turn off the video output on the Nvidia card.

That's the basic information to explain why I have the current issue. 

The issue itself is that with both cards in (I've tried the cards in either slot, same result), Daz refuses to load the Nvidia card as the main card. It does recognize the Nvidia card and I'm able to select it in the device options, but it won't actually use the card in rendering unless I disable the AMD card before loading the program (I can re-enable it after the program is open). I've included a copy of the relevant portion of the log file that ran on start. One half of it is with both cards enabled, the other half is with the AMD card disabled.

I have half-solved the problem by just disabling the AMD card before loading Daz, but I'm hoping someone here might be able to tell me how to get Daz to just use the Nvidia card without doing this.
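In case it helps anyone stuck on the same workaround, here's a rough Python sketch of scripting it. This assumes Windows PowerShell's Get-PnpDevice/Disable-PnpDevice cmdlets are available and the script runs as admin; the "*R9 290*" name filter and the Daz Studio install path are assumptions you'd need to adjust for your system:

    import subprocess
    import time

    def set_gpu_enabled(name_filter, enabled):
        # Toggle a display adapter by friendly name via PowerShell's PnpDevice cmdlets.
        verb = "Enable-PnpDevice" if enabled else "Disable-PnpDevice"
        cmd = (f'Get-PnpDevice -Class Display -FriendlyName "{name_filter}" | '
               f'{verb} -Confirm:$false')
        subprocess.run(["powershell", "-Command", cmd], check=True)

    set_gpu_enabled("*R9 290*", False)   # disable the AMD card first
    daz = subprocess.Popen([r"C:\Program Files\DAZ 3D\DAZStudio4\DAZStudio.exe"])  # install path is an assumption
    time.sleep(60)                       # give Daz time to start up and enumerate GPUs
    set_gpu_enabled("*R9 290*", True)    # then re-enable the AMD card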

Daz log.docx

Comments

  • RayDAnt Posts: 1,147
    edited November 2019

    GPUs can serve two primary functions in Daz Studio:

    1. Drive the Daz Studio application interface itself (via AMD/Nvidia-compatible OpenGL code) 
    2. Serve as the compute engine for Iray (via Nvidia's proprietary CUDA code)

    Function #1 is always performed by whichever GPU is being used to physically drive the display. And to the best of my knowledge there is no performance advantage to be gained from using an Nvidia card over an AMD one for it (since Daz Studio's OpenGL performance needs are so minimal that virtually any modern GPU - including CPU-embedded ones from Intel/AMD - is more than up to the task.)

    Function #2 is determined by which devices you have checked in the "Photoreal Devices" list under Render Settings > Advanced > Engine: NVIDIA Iray > Hardware.

     

    It does recognize the Nvidia card and I'm able to select it in the device options, but it won't actually use the card in rendering unless I disable the AMD card before loading the program (I can re-enable it after the program is open). I've included a copy of the relevant portion of log file that ran on start. 

    The log file excerpts you've posted only relate to whichever GPU is being used by the Daz Studio application interface - not whether or not your 2070 SUPER is being used for rendering with Iray. In order to know what's going on with that, you need to look at the log file starting from a line like:

    2019-10-19 10:30:12.124 Rendering image

    to the end.
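    If it helps, here's a quick Python sketch that pulls that portion out automatically; the log path below is the usual default location (an assumption), so adjust it if yours differs:

    import os

    log_path = os.path.expandvars(r"%APPDATA%\DAZ 3D\Studio4\log.txt")  # default location, adjust as needed
    with open(log_path, encoding="utf-8", errors="replace") as f:
        lines = f.readlines()

    # Find the last "Rendering image" marker and keep everything from there on.
    start = max((i for i, line in enumerate(lines) if "Rendering image" in line), default=0)
    print("".join(lines[start:]))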

    Based on what you've provided so far, there is no reason to think that your 2070 SUPER isn't being used for Iray rendering even with the AMD GPU active. Just that it isn't being used for the application interface when the AMD card is display primary (which is exactly how it should work.)

  • You cannot turn off the video outs on a 2070.

    If you can select the card but it doesn't render, the most likely culprit is the Nvidia control panel. (I'm not bothering with your log file since it isn't .txt.)

  • Padone Posts: 3,790
    edited November 2019

    Just to let you know that I currently use a vega gpu for the viewport and a 1060 for rendering and no issues here. I just plugged the monitor to the mobo and selected the 1060 in the iray panel.

  • RayDAnt said:

    Based on what you've provided so far, there is no reason to think that your 2070 SUPER isn't being used for Iray rendering even with the AMD GPU active. Just that it isn't being used for the application interface when the AMD card is display primary (which is exactly how it should work.)

    I guess there's a lot of newbs around, so I shoulda specified that I'm not an idiot, lol. The reason to think it isn't being used is because I said it isn't being used. The more relevant question would be why I think it isn't being used. That would be simple. When loaded with both GPUs active, Daz runs Iray preview at an impossibly slow speed, and crashes entirely before the first iteration of a render. And due to the aforementioned adapter issue (or rather the lack of ability to use an adapter on this particular monitor), I cannot plug the monitor into the Nvidia card. I mean, I could technically use an adapter, but this monitor will treat any adapter (even dual link, or DP, or HDMI) as a single-link adapter and give me a pixelated, headache-inducing display. So atm, turning the AMD card on and off beats swapping adapter cables every time I run Daz.

    But I suppose the prominent question in terms of fixing the issue at this point would be: is there a "hack" for Daz to make it load the Nvidia card as the primary/only card? Even something external to Daz, like a text-based trigger/condition in the shortcut, or a GPU management application, or even a plugin for Daz Studio itself?

     

    You cannot turn off the video outs on a 2070.

    If you can select the card but it doesn't render, the most likely culprit is the Nvidia control panel. (I'm not bothering with your log file since it isn't .txt.)

    That was one of the first things I checked. I even tried making a specific profile for Daz. But tbh, I think the first reply sounds most likely to be the issue. And I'm no expert on the video-outs thing, but I think you can, unless that's a function reserved for the Quadro (and similar) workstation cards, which I suppose wouldn't surprise me.

     

    Padone said:

    Just to let you know that I currently use a vega gpu for the viewport and a 1060 for rendering and no issues here. I just plugged the monitor to the mobo and selected the 1060 in the iray panel.

    Sadly not possible; that was actually my first shot at fixing the initial monitor issue before I decided to toss the AMD card back in. Thing is, my onboard Intel graphics is crap, and the DVI port only has a max resolution of 1080p in the mobo specs anyway; the amount it would actually allow was the same as the adapters/single-link cable, which was even lower than 1080p. (My monitor is 2.5K.)

  • RayDAnt Posts: 1,147
    edited November 2019

    When loaded with both GPUs active, Daz runs Iray preview at an impossibly slow speed, and crashes entirely before the first iteration of a render.

    @lazarus102 this isn't indicative of your Nvidia GPU not being used for Iray. In the absence of a functioning Nvidia CUDA-compatible GPU, Iray runs using the CPU as a compute unit. That will cause Iray liveview to run extremely slowly, but it will still run - whereas you are describing it crashing PRIOR to a first iteration appearing on screen. That indicates at least one other thing is amiss (eg. insufficient RAM/VRAM in your system.) The first step to diagnosing a problem like this is going to the Daz Studio log file and studying everything reported by Iray after Iray liveview or a render has been initiated - ie. after the line that looks like this:

    2019-10-19 10:30:12.124 Rendering image...

    If you want help diagnosing your problem, you need to post that portion of your log file (either as a text file attachment or as text pasted directly into your post.) Without that, there isn't really anything anyone here can tell you definitively about how to fix your problem, since it is impossible for any of us to know what your problem even is. Especially if the true culprit is actually multiple problems rolled into one (which it sounds to me like this very well may be.)

  • Padone Posts: 3,790
    edited November 2019
    When loaded with both GPUs active, Daz runs Iray preview at an impossibly slow speed, and crashes entirely before the first iteration of a render .. is there a "hack" for Daz to make it load the Nvidia card as the primary/only card?

    The Nvidia RMB (right-click) menu should allow you to choose the GPU to run your application with. I always run Daz Studio with the Vega, but it uses the 1060 for Iray preview as selected in the Iray panel. I agree you need to post the log - specifically where it reports the Nvidia device activation at startup, then while rendering before the crash.

    Also if there's a crash this is probably a driver issue anyway.

  • Only Quadros and Titans can be put into TCC mode.

    TCC mode is the only way that Nvidia cards can disable their video outs.

    If Daz is crashing before the first iteration and not simply dropping to CPU, it certainly sounds like a driver issue. I'd get DDU, get rid of both drivers, and install the latest drivers directly from AMD's and Nvidia's websites (I'd grab the zips before doing the uninstall).
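    For cards that do support it, the switch is made with nvidia-smi from an elevated prompt; here's a minimal sketch (device index 0 is an assumption, and a reboot is generally required afterwards):

    import subprocess

    # -dm 0 = WDDM (normal display mode), -dm 1 = TCC (compute-only, video outs off).
    # Only works on supported cards (Quadro/Titan) and needs admin rights.
    subprocess.run(["nvidia-smi", "-i", "0", "-dm", "1"], check=True)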

     

  • Only Quadros and Titans can be put into TCC mode.

    TCC mode is the only way that Nvidia cards can disable their video outs.

    If Daz is crashing before the first iteration and not simply dropping to CPU, it certainly sounds like a driver issue. I'd get DDU, get rid of both drivers, and install the latest drivers directly from AMD's and Nvidia's websites (I'd grab the zips before doing the uninstall).

     

    It crashed before the first iteration of a render; I didn't say that it crashed on Iray preview. It also ran the living crap outta the CPU before attempting the render, then crashed. Granted, it wasn't a simple scene.

     

    Padone said:

    The Nvidia RMB (right-click) menu should allow you to choose the GPU to run your application with. I always run Daz Studio with the Vega, but it uses the 1060 for Iray preview as selected in the Iray panel. I agree you need to post the log - specifically where it reports the Nvidia device activation at startup, then while rendering before the crash.

    Also if there's a crash this is probably a driver issue anyway.

    I tried your suggestion; the only RMB menu that I get for Nvidia is for the control panel when right-clicking the desktop.

     

    RayDAnt said:

    When loaded with both GPUs active, Daz runs Iray preview at an impossibly slow speed, and crashes entirely before the first iteration of a render.

    @lazarus102 this isn't indicative of your Nvidia GPU not being used for Iray. In the absence of a functioning Nvidia CUDA-compatible GPU, Iray runs using the CPU as a compute unit. That will cause Iray liveview to run extremely slowly, but it will still run - whereas you are describing it crashing PRIOR to a first iteration appearing on screen. That indicates at least one other thing is amiss (eg. insufficient RAM/VRAM in your system.)

     

    Well, again, I didn't say the Iray preview was crashing, I said the renders were crashing. I did try an Iray preview using just a G8M; it ran extremely slow, but ran. And logic: if my Nvidia GPU was being used while both cards are on (during startup), and crashing during render, then why does it render just fine when I disable the AMD card before starting the program, but re-enable it after opening the program (and before rendering starts)? I'm tryin to tell ya, the key to the issue was in your first reply: with the Nvidia card loaded in a different configuration while the AMD card is on, it must be assigning parts of the program to the AMD card that are needed for rendering my scenes (at least that would make sense based on what's happening).

    Also, by using your initial information, I figured out a secondary workaround that is more acceptable than turning the card itself on and off every time I start Daz. If I switch my TV (connected to the Nvidia card cuz it's HDMI) to the primary display, then run Daz, it runs the Nvidia card as the primary and all works fine. Just like shown in the picture, if the AMD card is on (and connected to the "primary display"), the AMD card shows up in that information box.
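    For anyone who wants to verify which card Windows currently treats as the primary display without clicking around, here's a rough Python sketch using the Win32 EnumDisplayDevices API (standard library only; note this only reports display assignment, not what Iray is doing):

    import ctypes
    from ctypes import wintypes

    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x01
    DISPLAY_DEVICE_PRIMARY_DEVICE = 0x04

    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [("cb", wintypes.DWORD),
                    ("DeviceName", wintypes.WCHAR * 32),
                    ("DeviceString", wintypes.WCHAR * 128),
                    ("StateFlags", wintypes.DWORD),
                    ("DeviceID", wintypes.WCHAR * 128),
                    ("DeviceKey", wintypes.WCHAR * 128)]

    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break  # no more display adapters to enumerate
        if dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            tag = " (primary)" if dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE else ""
            print(f"{dev.DeviceName}: {dev.DeviceString}{tag}")
        i += 1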

    Btw, I'm not trying to be rude or anything. I am thankful for the help, but I've already beat a dead horse with the stuff that you're trying to explain to me, and linked log files on two other forums. The thing is, I don't need to search for another potential issue, I need a solve for the issue that I have. I don't generally come onto forums asking for help about something before I've already googled the crap outta it. Cuz  A: I don't like being told to "google it". B: I don't like to be "that guy" when it comes to missing simple solutions. Nor do I like to waste people's time by coming onto forums asking the kinds of questions that can be solved with answers like "Are you SURE it's plugged in??". Cuz nothing irritates me more than googling for the answer to an intermediate task and being flooded with results of people asking stupid questions that make me wonder how they remember how to breathe (much less operate a computer).

    Problem.png
  • RayDAnt Posts: 1,147
    edited November 2019

    Well, again, I didn't say the Iray preview was crashing, I said the renders were crashing.

    You don't understand. Iray liveview is a render. Programmatically speaking, turning it on is exactly the same thing as initiating an Iray "render" via the Render button/menu option. Meaning that if your GPU is failing to function for the one, it is going to fail to function for the other in exactly the same manner (since it is exactly the same task being performed.) This makes your description of your problems (liveview falling back to CPU/"renders" failing to initialize at all in the same sitting) contradictory. Hence why we need to know what your log file says about the rendering process itself before being able to recommend what you can do to get it to work properly. Since it is impossible to recommend a specific course of action to resolve your problems without first having at least some basic idea of what the underlying problem/problems are.

     

    If I switch my TV (connected to the Nvidia card cuz it's HDMI) to the primary display, then run Daz, it runs the Nvidia card as the primary and all works fine. Just like shown in the picture, if the AMD card is on (and connected to the "primary display"), the AMD card shows up in that information box.

    That information box has absolutely nothing whatsoever to do with whether or not Iray is successfully making use of a specific Nvidia GPU for rendering. The only place where that information is to be found is in the Daz Studio log file following an attempt to initiate either Iray liveview or a formal Iray render.

  • Padone Posts: 3,790

    I tried your suggestion; the only RMB menu that I get for Nvidia is for the control panel when right-clicking the desktop.

    From the control panel: menu > Desktop > add "run with gpu" to context menu.

  • nonesuch00 Posts: 18,302

    I brought up this discussion in the newb forum, but now that I've narrowed down the issue, I think this forum is more appropriate to ask for a solution.

    I currently have two GPUs in my comp: one is an AMD R9 290, the other an RTX 2070 Super. The AMD card is needed because I have an old monitor (don't ask, lol.. I've already looked into it, adapters will NOT work with this monitor). I may even keep the AMD card in there for now (even after a monitor upgrade), as I've read something in the Daz log that says I can speed up my renders if I turn off the video output on the Nvidia card.

    That's the basic information to explain why I have the current issue.

    The issue itself is that with both cards in (I've tried the cards in either slot, same result), Daz refuses to load the Nvidia card as the main card. It does recognize the Nvidia card and I'm able to select it in the device options, but it won't actually use the card in rendering unless I disable the AMD card before loading the program (I can re-enable it after the program is open). I've included a copy of the relevant portion of the log file that ran on start. One half of it is with both cards enabled, the other half is with the AMD card disabled.

    I have half-solved the problem by just disabling the AMD card before loading Daz, but I'm hoping someone here might be able to tell me how to get Daz to just use the Nvidia card without doing this.

    You're not having the 2 default viewports in DAZ Studio use iRay Preview mode, are you?

    If you are, that may be your problem. iRay Preview mode is primarily meant for nVidia GPU users, and having iRay Preview mode on when your AMD card is the monitor output is likely not a situation nVidia/DAZ 3D anticipated and coded for. Turn it off. It's true you can use iRay Preview when you have no nVidia cards, but it's extremely slow.

  • Padone Posts: 3,790
    edited November 2019

    .. having iRay preview mode on when your AMD card is the monitor output is likely not a situation nVidia/DAZ 3D anticipated and coded for ..

    I do it all the time and everything works fine here. I use the vega for the viewport and the 1060 for iray rendering and preview. And I run daz studio with the vega as default gpu.

  • p0rt Posts: 217
    You need a registry key to enable OpenCL on Nvidia cards if you have an AMD card installed; Nvidia will disable OpenCL by default, but you can find out how to enable it using Google.
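    For what it's worth, here's a quick Python sketch that lists the OpenCL drivers (ICDs) Windows has registered, so you can at least see whether an Nvidia OpenCL entry is present next to AMD's. Note this reads the standard Khronos ICD registry key, not any vendor-specific toggle key:

    import winreg

    path = r"SOFTWARE\Khronos\OpenCL\Vendors"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        i = 0
        while True:
            try:
                name, value, _ = winreg.EnumValue(key, i)  # name is the ICD dll path, data 0 = enabled
            except OSError:
                break  # no more registered ICDs
            print(f"{name} = {value}")
            i += 1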
  • lazarus102 Posts: 47
    edited November 2019
    RayDAnt said:

     

    You don't understand. Iray liveview is a render. Programmatically speaking, turning it on is exactly the same thing as initiating an Iray "render" via the Render button/menu option. Meaning that if your GPU is failing to function for the one, it is going to fail to function for the other in exactly the same manner (since it is exactly the same task being performed.) This makes your description of your problems (liveview falling back to CPU/"renders" failing to initialize at all in the same sitting) contradictory. 

     

    While you may be correct that the Iray preview does the same process as the render option, I believe it does so under differing options. For example, when I render, I generally do so at 3999x3999 resolution, whereas it is unlikely that the Iray preview would try to render at that high of a res. Trying to load that much more information is more likely to breach the limits of the CPU/memory. I only have a 2.5K monitor; if I had a 4K monitor, it may be more likely to try to run Iray preview at a res closer to that. As well, in full honesty, I didn't try to render the single male model; that was just a quick test to see if Iray preview would crash if it was given a scene that it was capable of rendering in a realistic timeframe. These days (since getting the RTX card) I've been running darker scenes that I was incapable of running in a realistic timeframe when I just had my AMD card. But again, this is stuff I've already been through. Primary concern is getting Daz to function the same way that it does when the Nvidia card is the only card it loads. It really has no reason to load the AMD card when it won't even work with Iray. Oh, PS: I also just remembered something. Preview isn't exactly the same as the render option. For example, the Bristol hair will not load in Iray preview, but shows up perfectly in render.

     

    Padone said:

    From the control panel: menu > Desktop > add "run with gpu" to context menu.

    https://www.addictivetips.com/windows-tips/force-app-to-use-dedicated-gpu-windows/ - one of the top few responses on that page states that this option is only available with Nvidia power-saving GPUs. Admittedly I didn't even know exactly where to look when you first told me, but ya, the actual option isn't even available under "desktop". Shame though, as I imagine that option would be handy.

    p0rt said:
    You need a registry key to enable OpenCL on Nvidia cards if you have an AMD card installed; Nvidia will disable OpenCL by default, but you can find out how to enable it using Google.

    This might just be the most sensible answer yet (although the previous answer would be better if it was available for my current card).

  • Padone Posts: 3,790
    edited November 2019

    @lazarus102 The "run with gpu" option is available for me, and I have a desktop with a 1060 on 431.86 drivers, so it's not only for laptops. I don't know if drivers may disable this option for RTX cards, but if so it sounds odd to me, since this feature is always useful to have.

    As for OpenCL, it is not used by Iray, so it doesn't matter for your specific issue. Below is a link with a more extended explanation, though. I already posted this information but some moderator deleted it.

    https://community.amd.com/message/2909519

  • RayDAnt Posts: 1,147
    edited November 2019
    RayDAnt said:

     

    You don't understand. Iray liveview is a render. Programmatically speaking, turning it on is exactly the same thing as initiating an Iray "render" via the Render button/menu option. Meaning that if your GPU is failing to function for the one, it is going to fail to function for the other in exactly the same manner (since it is exactly the same task being performed.) This makes your description of your problems (liveview falling back to CPU/"renders" failing to initialize at all in the same sitting) contradictory. 

     

    While you may be correct that the Iray preview does the same process as the render option, I believe it does so under differing options.

    Belief has nothing to do with anything here. It's the way Iray works internally. Iray "liveview" is a completely arbitrary distinction that Daz's developers implemented on the user interface side of the Daz Studio + embedded Nvidia Iray renderer package to enable various end-user workflows. Rendering is rendering, so far as Iray knows or cares.

    it is unlikely that the Iray preview would try to render at that high of a res.

    Iray used as a "liveview" always renders at the native screen resolution of the viewport in which it is enabled solely because Daz's developers decided to implement it that way in that context. There is no technical limit to render dimensions in "liveview" beyond those already present in the case of "regular" rendering because they are exactly the same task from Iray's end of things.

     

    Primary concern is getting Daz to function the same way that it does when the Nvidia card is the only card it loads. It really has no reason to load the AMD card when it won't even work with Iray.

    To reiterate - Daz Studio and Iray have independent pipelines for utilizing graphics hardware. Whether Daz Studio runs using OpenGL on your AMD or Nvidia card has no bearing on whether Iray is making full use of your Nvidia card for rendering. In order to assess that, you need to be looking at the END of the Daz Studio log file (for log lines with "IRAY" in them), since that is the only place where Iray operational/troubleshooting information is found.
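    A quick Python sketch for pulling just those lines, assuming the default log location:

    import os

    log_path = os.path.expandvars(r"%APPDATA%\DAZ 3D\Studio4\log.txt")  # adjust if your log lives elsewhere
    with open(log_path, encoding="utf-8", errors="replace") as f:
        iray_lines = [line for line in f if "IRAY" in line]
    print("".join(iray_lines[-50:]))  # the last 50 Iray messages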

     

    Oh, PS: I also just remembered something. Preview isn't exactly the same as the render option. For example, the Bristol hair will not load in Iray preview, but shows up perfectly in render.

    That's because Daz Studio defaults "Preview PR Hairs" under Line Tessellation to "Off" for strand-based hair sets in Iray "liveview" renders, to avoid overloading lower-tier systems in pre-rendering workflows. It has nothing to do with Iray being inherently different between "liveview" and Render use.

     

    Admittedly I didn't even know exactly where to look when you first told me, but ya, the actual option isn't even available under "desktop".

    You need to have both your GPUs installed and plugged into at least one display/HDMI/DVI cable prior to booting up your computer for this functionality to be active.

     

  • Padone Posts: 3,790
    RayDAnt said:

    Admittedly I didn't even know exactly where to look when you first told me, but ya, the actual option isn't even available under "desktop".

    You need to have both your GPUs installed and plugged into at least one display/HDMI/DVI cable prior to booting up your computer for this functionality to be active.

    In my desktop I never plugged the 1060 to a monitor and it is available anyway. I can choose between the vega and the 1060 for any app.

  • RayDAnt Posts: 1,147
    edited November 2019
    Padone said:
    RayDAnt said:

    Admittedly I didn't even know exactly where to look when you first told me, but ya, the actual option isn't even available under "desktop".

    You need to have both your GPUs installed and plugged into at least one display/HDMI/DVI cable prior to booting up your computer for this functionality to be active.

    In my desktop I never plugged the 1060 to a monitor and it is available anyway. I can choose between the vega and the 1060 for any app.

    That probably has to do with the fact that your Vega is embedded in your CPU, meaning that there is only one discrete GPU in your system. In systems with multiple discrete GPUs where only some of them are physically plugged into a display, Windows Vista and later assume that those cards are not being used and attempt to shut them down/ignore them as much as possible for power-saving purposes. The classic workaround is to have a dummy plug (or even just a disconnected display/HDMI/DVI cable) inserted in one of the GPU's outputs to keep Windows from deactivating it.

    Although that makes me think of something else I totally forgot. @lazarus102 have you checked your BIOS to make sure that multi-GPU is enabled and that the correct card is selected for primary display? Many motherboards require custom BIOS configuration for things to work properly when you have multiple discrete GPUs in a system.

  • RayDAnt said:
    Padone said:
    RayDAnt said:

    Admittedly I didn't even know exactly where to look when you first told me, but ya, the actual option isn't even available under "desktop".

    You need to have both your GPUs installed and plugged into at least one display/HDMI/DVI cable prior to booting up your computer for this functionality to be active.

    In my desktop I never plugged the 1060 to a monitor and it is available anyway. I can choose between the vega and the 1060 for any app.

    That probably has to do with the fact that your Vega is embedded in your CPU, meaning that there is only one discrete GPU in your system. In systems with multiple discrete GPUs where only some of them are physically plugged into a display, Windows Vista and later assume that those cards are not being used and attempt to shut them down/ignore them as much as possible for power-saving purposes. The classic workaround is to have a dummy plug (or even just a disconnected display/HDMI/DVI cable) inserted in one of the GPU's outputs to keep Windows from deactivating it.

    Although that makes me think of something else I totally forgot. @lazarus102 have you checked your BIOS to make sure that multi-GPU is enabled and that the correct card is selected for primary display? Many motherboards require custom BIOS configuration for things to work properly when you have multiple discrete GPUs in a system.

    My 2070 has never once been connected to a display, I use the 1080ti to drive my display. Both render in DS.

    I've never once seen a motherboard need a custom UEFI to have 2 GPUs plugged in.

    You seem to be referencing an issue with having two cards in the system both plugged into monitors and getting Windows to switch from one to the other depending on application. People do this on gaming laptops to have things like Chrome use the iGPU rather than the discrete one to save battery life. But even that has been fixed.

  • RayDAnt Posts: 1,147

    I've never once seen a motherboard need a custom UEFI to have 2 GPUs plugged in.

    Not talking about custom UEFIs - just the fairly standardized BIOS settings most motherboards have for managing multiple GPUs in a system at a time.

     

     My 2070 has never once been connected to a display, I use the 1080ti to drive my display. Both render in DS

    That's because they are both recent-generation Nvidia cards sharing the same driver code. Hence why they don't both need to be plugged in to keep Windows from ignoring either of them.

    In @lazarus102's case, the issue is with attempting to run a headless Nvidia card along with an AMD card as the primary display output. Which is just the sort of use case where issues with headless GPU operation and dummy outputs as a possible fix come into play.

     

    You seem to be referencing an issue with having two cards in the system both plugged into monitors

    I'm talking about the opposite case - where there are fewer GPUs connected to external displays in the system than there are GPUs in the system in total.

  • Padone Posts: 3,790
    RayDAnt said:

    Many motherboards require custom BIOS configuration for things to work properly when you have multiple discrete GPUs in a system.

    Yes, I can disable the Vega or set it as the primary display in the BIOS. I did set it as primary and it is wired to the monitor, while the 1060 is unplugged and only does rendering.

  • RayDAnt said:

    I've never once seen a motherboard need a custom UEFI to have 2 GPUs plugged in.

    Not talking about custom UEFIs - just the fairly standardized BIOS settings most motherboards have for managing multiple GPUs in a system at a time.

     

     My 2070 has never once been connected to a display, I use the 1080ti to drive my display. Both render in DS

    That's because they are both recent-generation Nvidia cards sharing the same driver code. Hence why they don't both need to be plugged in to keep Windows from ignoring either of them.

    In @lazarus102's case, the issue is with attempting to run a headless Nvidia card along with an AMD card as the primary display output. Which is just the sort of use case where issues with headless GPU operation and dummy outputs as a possible fix come into play.

     

    You seem to be referencing an issue with having two cards in the system both plugged into monitors

    I'm talking about the opposite case - where there are fewer GPUs connected to external displays in the system than there are GPUs in the system in total.

    There hasn't been a mainstream motherboard with a BIOS in years. People still call it BIOS, but it's actually UEFI. There are motherboards that let you set the primary display, but that is pretty irrelevant. My motherboard has no such setting, and even when there is one, it is only relevant when multiple displays are plugged into different cards. Windows can detect which GPUs have monitors plugged in, and you can set which one is primary in the display settings dialog.

    No. I've done a lot of looking into this and this claim is just contrary to what is out there. If you have a source that actually supports this claim, present it. I think you're giving out wrong info.

  • RayDAnt Posts: 1,147

    I've done a lot of looking into this and this claim is just contrary to what is out there.

    What are your sources for this conclusion?

  • RayDAnt said:

    I've done a lot of looking into this and this claim is just contrary to what is out there.

    What are your sources for this conclusion?

    That it comes up in no searches at all. Beyond that, I build systems and have never had this come up, and the people who buy from me get their tech support from me.

  • RayDAnt Posts: 1,147
    edited November 2019
    RayDAnt said:

    I've done a lot of looking into this and this claim is just contrary to what is out there.

    What are your sources for this conclusion?

    That it comes up in no searches at all.

    So in other words no sources.

     

    Beyond that, I build systems and have never had this come up, and the people who buy from me get their tech support from me.

    How many systems with multiple discrete GPUs from both AMD and Nvidia have you worked on? Because that's what's being troubleshot here - not multi-GPU setups in general.

  • RayDAnt Posts: 1,147
    edited November 2019
    Padone said:

    @lazarus102 The "run with gpu" option is available for me, and I have a desktop with a 1060 on 431.86 drivers, so it's not only for laptops. I don't know if drivers may disable this option for RTX cards, but if so it sounds odd to me, since this feature is always useful to have.

    As for OpenCL, it is not used by Iray, so it doesn't matter for your specific issue. Below is a link with a more extended explanation, though. I already posted this information but some moderator deleted it.

    https://community.amd.com/message/2909519

    @Padone is your signature correct in that you are still running Windows 10 1809? Because neither my main rendering rig (Titan RTX/Intel iGPU combo) nor my Surface Book 2 (GTX 1050/Intel iGPU combo), both of which currently run W10 1903, offers those menu options (although per-application GPU assignment still seems to be possible through the Windows Settings app.)

  • RayDAnt said:
    RayDAnt said:

    I've done a lot of looking into this and this claim is just contrary to what is out there.

    What are your sources for this conclusion?

    That it comes up in no searches at all.

    So in other words no sources.

     

    Beyond that, I build systems and have never had this come up, and the people who buy from me get their tech support from me.

    How many systems with multiple discrete GPUs from both AMD and Nvidia have you worked on? Because that's what's being troubleshot here - not multi-GPU setups in general.

    How precisely would there be a source that says this thing no one has ever heard of or had to do isn't a thing?

    I'd have to go through my records, but 4 that I recall. People who bought budget Radeon cards for gaming and cheap Quadros to do AI coding. Lots of people I know (I work in a datacenter) have serious workstations at work but do not want to drop $10k on such a system at home.

  • RayDAnt Posts: 1,147
    edited November 2019

    I'd have to go through my records, but 4 that I recall. People who bought budget Radeon cards for gaming and cheap Quadros to do AI coding.

    Is it safe to assume that these Quadros were all being set to TCC mode? Because if so, that would explain why you would have never come across any of the brand intercompatibility issues being discussed in this thread (since GPUs in TCC mode don't engage with the WDDM subsystem in the first place.) It's also worth reiterating that @lazarus102 is working with a 6+ year old Radeon R9 290 specifically. Meaning that whatever worked/didn't work hardware-compatibility-wise between AMD and Nvidia GPUs circa 2013 is primarily what's relevant here.

  • Padone Posts: 3,790
    edited November 2019
    RayDAnt said:

    .. you are still running Windows 10 1809? Because neither my main rendering rig .. both of which currently run W10 1903 offer those menu options ..

    Yes, I have 1809. I can set the default GPU for each app using both the Nvidia control panel and the Nvidia RMB menu. It seems odd to me that 1903 doesn't work the same, but it may be that.

    nvidia-rmb.jpg
  • RayDAnt Posts: 1,147
    edited November 2019
    Padone said:
    RayDAnt said:

    .. you are still running Windows 10 1809? Because neither my main rendering rig .. both of which currently run W10 1903 offer those menu options ..

    Yes, I have 1809. I can set the default GPU for each app using both the Nvidia control panel and the Nvidia RMB menu. It seems odd to me that 1903 doesn't work the same, but it may be that.

    Yeah, it's totally gone for me in both places (the Nvidia control panel only allows you to adjust Nvidia GPU settings for specific apps - not which GPU device an app runs on.) However, if I go to Start menu > Settings > System > Display > Graphics settings (very last clickable thing in the parameters area), I am then able to browse for specific apps and specify (via an additional Options button) which physical GPU they run on:

    • System default
    • Power saving ("Intel UHD Graphics 630" in this system)
    • High performance ("Titan RTX" in this system)

    So basically the same functionality. But accessed in a much more... indiscrete way.

    Would love to know what happens if you have three or more operational graphics devices in your system (eg. two dGPUs plus one iGPU), since there doesn't really seem to be any accommodation for that here.
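    Incidentally, that Settings page appears to store its choices in the registry under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, so the same assignment can be scripted; here's a sketch, with the Daz Studio path being an assumption to adjust:

    import winreg

    exe = r"C:\Program Files\DAZ 3D\DAZStudio4\DAZStudio.exe"  # assumed install path
    key_path = r"Software\Microsoft\DirectX\UserGpuPreferences"
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
        # GpuPreference=1 is "Power saving", 2 is "High performance".
        winreg.SetValueEx(key, exe, 0, winreg.REG_SZ, "GpuPreference=2;")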
