GPU upgrade and nVidia drivers

I was hoping that someone here could give me some advice on the best way to go about a graphics card update.
I am currently running a GTX 980ti with the nVidia 388.43 drivers. I haven't updated to newer drivers because everything is running fine with these (currently running Daz Studio 4.9.4.122).

Tomorrow I am getting a 1080ti delivered. Initially I will be swapping out the 980ti for the new card, and then in a few weeks I plan to upgrade the power supply and reinstall the 980ti as a secondary card for Iray rendering.

The question is, what would be the best way to go about the upgrade tomorrow:
A) Remove the nVidia drivers, swap in the new card and reinstall 388.43.
B) Remove the nVidia drivers, swap in the new card and install the latest nVidia drivers.
C) Swap the cards and leave the drivers alone.
D) Something else?
If anyone has had experience of this, I would greatly appreciate any advice.
Comments
Option C shouldn't give you any problems. The newest Nvidia drivers aren't always an improvement on ones that have been stable. If there are problems, you can always install a newer driver (no different from updating the driver with a GPU already installed).
I'm not sure about keeping the same drivers, since these cards are from different families and architectures... I would change the drivers to the same version but for the 10 series.
Won't matter. The 388 drivers came out after the 10XX series launched, so they support both cards.
Thank you for the advice. I want to try and end up with the 388.43 drivers as they have been stable for me for a couple of months, both in DS and in games with VR.
I'd try option C first; see if all functionality of the card is available, then compare your render times with those of other 1080ti owners. If all seems ok, don't fix what isn't broken.
I quite literally just upgraded and installed a new Nvidia card today.
1) Install the new card.
2) Boot up your computer and run the driver installer from the CD-ROM.
3) Use the advanced menu and tell it to do a clean install.
The software will uninstall your old driver if it isn't compatible, and install the drivers you need.
That's it...
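If you want a quick sanity check that Windows has picked up the new card and the driver version you expect, nvidia-smi (which installs alongside the driver - on Windows it usually lives under C:\Program Files\NVIDIA Corporation\NVSMI if it isn't on your PATH) can report both. A minimal Python sketch, assuming nvidia-smi is callable from your PATH:

    import subprocess

    # Ask the driver itself which GPU(s) it sees and what version it is running.
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    for line in result.stdout.strip().splitlines():
        name, driver = (field.strip() for field in line.split(","))
        print(f"GPU: {name}  driver: {driver}")
    # After the swap you'd expect something like:
    # GPU: GeForce GTX 1080 Ti  driver: 388.43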
Thank you everyone for your input.
New card arrived this morning. Installation went fairly smoothly (there was a short delay whilst I searched for 8-pin power cables, then realised that there was a single cable that clips onto the 6-pin that was already there).

The second problem is that I use a 42" TV with a VGA input as my monitor, via a DVI-I adaptor... and the new card uses DVI-D, which I don't have an adaptor for. I dug out a DisplayPort to VGA adaptor so that I could at least test the new card.

On startup, unfortunately, Windows would not associate the new card with the old drivers. I uninstalled them, rebooted, and then reinstalled 388.43.

The next problem was the DisplayPort to VGA adaptor: for some reason Windows does not want to recognise the resolution of the TV, and so won't run properly at 1920x1080. After some more digging through the box of unused parts, I found a DisplayPort to HDMI adaptor which gives the full HD resolution. I'm undecided whether to keep this connection or get a new DVI-D to VGA adaptor - the screen looks a little too crisp for my preference.
Anyway, the new 1080ti card is working fine after uninstalling the drivers and reinstalling them. I'm going to give it a few weeks to satisfy myself that everything is running properly, then look at upgrading the power supply so that I can add the 980Ti back in for Iray rendering.
Once again, thank you for the advice.
VGA doesn't reliably relay "screen size"... that is an HDMI/DVI thing. (Windows attempts to identify the connected device, but if you use an adaptor, it sees only what the adaptor reports, not the TV.)
You would have to manually set the computer's resolution to your desired output, then also adjust the TV input to match that expected input type. (Older TVs tend to fit the screen incorrectly unless they have a "Game" input mode or a specific "Computer/PC" input mode.)
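If you want to check what resolution Windows is actually driving (as opposed to what the TV claims to accept), here is a quick Windows-only Python sketch using the Win32 API; the SetProcessDPIAware call is just there so high-DPI scaling doesn't skew the numbers:

    import ctypes

    user32 = ctypes.windll.user32
    user32.SetProcessDPIAware()  # report real pixels, not DPI-scaled ones

    # SM_CXSCREEN = 0 and SM_CYSCREEN = 1: primary display width/height.
    width = user32.GetSystemMetrics(0)
    height = user32.GetSystemMetrics(1)
    print(f"Windows is outputting {width}x{height}")
    if (width, height) != (1920, 1080):
        print("Not at the TV's native 1920x1080 - set it manually in the "
              "Nvidia Control Panel or Windows display settings.")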
You are better off with an adaptor for HDMI or one of the DVI variants, if your TV has anything other than SVGA/VGA connections. HDMI will carry sound as well as video; DVI and SVGA/VGA will only do video output (unless your card passes audio over DVI via an HDMI adaptor).
Either digital connection will also give cleaner colour output than analogue SVGA/VGA, and dual-link DVI in particular supports larger resolutions and faster refresh rates than VGA.
Each card will pull a steady maximum of about 250 watts, with peaks hitting 300 watts, though sustained rendering doesn't normally hold those peaks. Expect your CPU and drives to consume another 180 to 250 watts, depending on the rest of the hardware (CPU type, cooler demands, drive types and quantity).
I have run a two-card system with a PSU as small as 550 watts, but it was more stable with a 650-watt unit, and loved the 750-watt setup even more (it ran a little cooler, with spare amps to handle demand). [NOTE: A 550-watt PSU will normally handle modest spikes up to around 680 watts, but it is only rated for a constant load of 550 watts. Beyond that, you are relying on the manufacturing quality of the PSU.]
At the moment, I use 1,400 and 1,600-watt PSUs in my systems, with four or five cards and water-cooled CPUs that are as hungry as the GPUs. Total overkill: measured "at the wall", they normally only hit 750 watts with four cards rendering. Playing games, though, it gets close to 1,200 watts at the wall, which is about 1,000 watts actually delivered to the devices; the other 200 watts is wasted along the way, consumed by the power regulators and ejected as PSU heat.
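As a rough sanity check with those numbers (the 25% headroom factor is just my own rule of thumb, not any official rating):

    # Back-of-the-envelope PSU sizing using the figures above.
    gpu_peak_w = 300        # per-card peak; steady max is ~250 W
    num_gpus = 2            # 1080ti + 980ti
    rest_of_system_w = 250  # CPU, cooling, drives - upper end of 180-250 W

    peak_load_w = num_gpus * gpu_peak_w + rest_of_system_w
    headroom = 1.25         # assumed ~25% margin so the PSU isn't at its rated limit
    print(f"Estimated peak load: {peak_load_w} W")                  # 850 W
    print(f"Suggested PSU rating: {peak_load_w * headroom:.0f} W")  # ~1060 W

Which lands right in the 1,000 to 1,200-watt class for a two-card setup.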
Thanks for the info on power usage. I'm currently running a 750W PSU, which might be borderline with both cards; as I'm also running a water-cooling system, an SSD and three HDDs, I don't think it will be enough, so I'm looking towards 1000W or 1200W.