If you want a safe ground, you can attach a simple ground wire with a metal pipe clamp to a metal pipe going into the ground, or buy a grounding rod, hammer it in, and run a wire to your outlet so you have a ground. I would strongly suggest getting a licensed electrician to do the installation if you don't have someone with the know-how.
Crappy house wiring is a fixable problem.
It seems the electrical standards should also be fixable, but that is harder for a householder to work around - and maybe impossible?
Two very different issues.
The rumors I am seeing say 450 Watts for the 4090. The 4090 actually looks a bit cut down, too, with over 16,000 cores instead of 18,000. The 600 Watt figure may be for the 4090ti, which may release at a later date. This makes some sense, as it gives Nvidia some leverage to see what AMD releases. If AMD manages to take the performance crown, then the 4090ti will unleash its 600 Watts sooner. It all depends on AMD. But the 4090ti will be the totally insane product; the regular 4090 will probably be the somewhat more tolerable 450W.
That is what some current rumors say.
I believe the 4000 series will continue the shift in tiers that kind of started with the 3090. The x90 is the new top-end card. We will still have an x80ti, but the x80ti is no longer the top SKU on the list, nor is it going to be close. The 3090 and 3080 are not far apart in performance, and the 3080ti is practically a 3090 with half the VRAM. That could change with the 4000 series, as the 4090 will probably have a bigger gap over the 4080. The 4090ti in turn will also have a larger gap, given the core counts. The full chip the 4090 is built on actually has over 18,000 cores, so the 4090 is cutting down about 2,000 of them, if this is true. I hope it is not, because IMO that leaves the door wide open for AMD.
Nvidia is being very shifty with their leakers. They are playing games, and different people have different information because that is what they are being told. It is very possible that these cards change multiple times before launch. Keep in mind that Nvidia managed to surprise everyone with the CUDA counts of Ampere, even the board partners. The AIBs all had printouts with the wrong CUDA counts, remember that? So it is important to keep this in mind, no matter how good the leakers may be, the information they have can be accurate...yet still wrong in the end.
I fully expect a couple of curve balls to be thrown when Nvidia finally announces the cards.
Earlier leaked specs claimed power spikes of up to 2,400 Watts at 100% GPU usage for the 4090. They might have lowered that for the final product, but I doubt a PSU under 1,300 Watts would be even remotely enough to guarantee stability with either a 4090 or a 4090Ti. Two PSUs would probably be required, unless 2,500 Watt+ PSUs with PCI-E 5.0 specs get released on the market.
For now I have an RTX 2060 and a GTX 1060 for game development. If the games I am developing with DAZ assets (for which I bought the Interactive License option) sell, then I might get a 4050 or a 4060, but nothing more. The 4070 and up are too power hungry, and where I live the power bill is really high: you quickly end up paying thousands of dollars/euros monthly if you run something like 1,500 Watt+ devices 15-18 hours a day or 24/7. If my first games don't make money, I can't afford that.
Don't forget the additional costs from the AC, as it has to move that 1.5 kW of heat out of the living space! ;)
Guys, this stuff is not even confirmed! We have absolutely no solid information saying how much energy a 4070 actually uses, much less a 4090. Remember, these are all rumors and speculation until we have real products on shelves. People can make educated guesses about what the power draw may be, but they are just guesses. Power draw is the very last thing that gets locked in during development. They can adjust the clocks and voltages to alter how much power the card uses right up until the last minute. There have even been launch-day BIOS updates for buyers that changed the clock speed behavior. While core counts and memory configs get locked in well in advance, the clock speeds are tuned throughout the card's development. A lot of factors go into what the final clocks end up being.
I seriously doubt Nvidia would release a GPU that requires a 1,000+ Watt power supply. I doubly doubt that anyone would be using over 1,000 Watts, period, for the whole system with a 4090. Nvidia would be a laughing stock if they did so; there are still Fermi memes out there from the 400 series. A crazy 4090ti variant like the Kingpin (built for overclocking pros) might do something like that, but not the Founders versions. I feel very safe saying the 4090 Founders will be a 450 Watt card. There may be others that go higher, but folks, you can control your GPUs and tune them to not use so much power. You are not locked into using the card at full power.
If anyone wants to calculate just how much the power cost of their GPU might be, here is a handy page to do so. https://www.rapidtables.com/calc/electric/electricity-calculator.html
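Or, if you'd rather script it, here is the same math as a quick sketch in Python (the wattage, hours, and rate below are placeholder numbers; plug in your own):

```python
# Rough electricity cost estimate for a GPU under load.
# All inputs are placeholders -- substitute your own numbers.
def gpu_power_cost(watts, hours_per_day, days, price_per_kwh):
    """Cost of running a `watts` load for the given time."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

# Example: a hypothetical 450 W card rendering 8 h/day for a 30-day month at $0.15/kWh.
print(f"~${gpu_power_cost(450, 8, 30, 0.15):.2f}/month")  # ~$16.20
```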
It also helps to keep in mind a few things:
1- Rendering in Iray NEVER uses as much power as a demanding video game does. It is not drastic, but it is indeed less. So if a 4090 uses 450 Watts, it is highly likely it uses less than 400 Watts when rendering Iray, if even that much.
2- GPUs do not consume a massive amount of power when idle. They are...you know...idling. You have to be actively rendering all the time to hit the peak numbers.
3- You can tune any GPU to use less power. Undervolting can save a ton of energy without sacrificing much performance. Igor's Lab has shown it is possible to undervolt a 3090ti to use just 300 Watts and only drop about 10% in performance. Not only did they knock 150 Watts off, it still beat the AMD 6900XT, which was using 359 Watts, as well as 3080s that were using well over 300 Watts. That is pretty impressive. https://www.techspot.com/news/94153-rtx-3090-ti-set-300w-rtx-3080-ti.html
Meaning you can get a 4070, and if the power draw is that scary, it can be undervolted to still give better performance than a 4060 while possibly using about the same or less energy.
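If tweaking voltage curves sounds intimidating, the driver also exposes a plain board power limit you can lower, which is what `nvidia-smi -pl` does. Here is a minimal sketch using the nvidia-ml-py (pynvml) bindings; it needs admin/root rights, and the 300 Watt target is just an illustrative number, not a recommendation for any particular card:

```python
# Lower a GPU's board power limit via NVML -- the same thing `nvidia-smi -pl` does.
# Needs admin/root privileges and the nvidia-ml-py package (`pip install nvidia-ml-py`).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# The driver reports the allowed range in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

target_mw = max(min_mw, 300_000)  # 300 W is illustrative; clamp to the card's floor
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)

print(f"Power limit set to {target_mw // 1000} W "
      f"(card allows {min_mw // 1000}-{max_mw // 1000} W)")
pynvml.nvmlShutdown()
```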
450 watts sounds about right. Nvidia wants its customers to spend more of their money on their cards rather than on bigger/exotic power supply units.
Given how power hungry these cards are becoming and the amount of heat they produce, I would NOT go with anything other than a Kingpin or something with a similar cooling solution. I currently use a 3090 series Kingpin, and the VRM 1-5 temps haven't gone past 59 C even with an ambient room temp of 32.2 C (90 F) while playing Bard's Tale IV or World of Tanks on ultra settings. Needless to say, I installed a window AC unit the next day.
I wonder what their next "Quadro" cards are going to be like. When I render with the A6000, the temps get up to 80 C, since it's a blower-type cooler as is typical with Quadro cards. I can't imagine them going much higher on the wattage for these workstation cards without making them much bigger to allow for bigger fans. Not a big fan of having even bigger cards. :P
My 3090 Founder's doesn't go past 64C on air, even with a 3060 sitting above it, blowing its hot air on it. And the 3060 has a cheap basic cooler; it hits around 80 easily even though it uses about half the power. The Ampere Founders cards really do have great coolers, unlike previous generations.
The Quadro series has a different set of challenges. They have to be 2 slots so they can do multiple GPUs. They also need to stay under a certain power level. All the A series cards use much less power than their gaming counterparts. I am going to assume next gen will be no different. The next one will stay around 300 Watts, maybe slightly more. It certainly will not be close to 400.
...indeed, I remember all the talk about the GTX 980Ti going to 8 GB of VRAM. In the end, it had 6.
The workstation cards have always pulled less power than their gaming counterparts. The A5000 has a TDP of 230 W, which is less than even my Titan X, even though it has twice the VRAM and 5,000 additional cores (not counting RT and Tensor cores). It is interesting that they use the blower style for cooling.
Which 3060 do you have?
My 3090 FE had a great cooler and lousy thermal pads. 102 C on the VRAM with the stock config is poor. It's like a 150 mph bike supplied with 130 mph tyres.
OK so the first of the ATX 3.0 PSUs are starting to appear and that standard will be used on the next gen cards:
http://www.archyde.com/atx-3-and-pcie-5-0-power-supplies/
So the more computer savvy of you may be able to answer this - does the new standard mean we'll possibly need a new PSU to get max performance out of the RTX 4 Series?
The 3090ti uses the new 16 pin plug and comes with an adaptor for three eight pin connectors from a current PSU. Three eight pins plus the power from the slot will supply 525 watts (3 x 150 + 75). If the new cards need more power than this then you may need a new PSU, unless four x eight pin to 16 pin adaptors become a thing. 16 pin straight from the PSU is 600w plus 75 from the slot. It's possible that some cards will come with 8 pin connectors and ignore the new 16 pin plug as was the case with 30 series cards and the 12 pin plug.
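To make that arithmetic explicit, here is a quick sketch using the nominal per-connector ratings (150 W per 8-pin, up to 600 W for the 16-pin, 75 W from the slot). Treat these as spec maximums, not a guarantee for any particular card:

```python
# Nominal board power budget from PCIe connectors (spec maximums, not measurements).
SLOT_W = 75          # PCIe x16 slot
EIGHT_PIN_W = 150    # per 8-pin PCIe connector
SIXTEEN_PIN_W = 600  # 16-pin (12VHPWR) at its highest rating

def power_budget(eight_pins=0, sixteen_pins=0):
    return SLOT_W + eight_pins * EIGHT_PIN_W + sixteen_pins * SIXTEEN_PIN_W

print(power_budget(eight_pins=3))    # 525 -> three 8-pins through the adaptor
print(power_budget(sixteen_pins=1))  # 675 -> native 16-pin plus the slot
```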
Everything I am saying in this post is opinion.
Like all things, competition is a huge factor. The competition versus the 980ti was not strong enough, so why bother with extra VRAM that was not going to make it sell better? Maxwell was when Nvidia really started taking over the market. AMD had some good cards, but nobody was buying them. AMD as a whole was on a big downswing that nearly killed the company.
The 3090 is a different story. This was going to be a Titan card, but the power budget killed its ability to be a Titan. Servers don't like 350 Watt cards, so there was no point in making it a Titan. So they nixed the Titan feature set and advertised it as a creator's card. The 3090 wound up using 350 Watts because Nvidia knew AMD had something good coming, and AMD has been riding high thanks to Ryzen. They were proven right, the 6900XT came real close to the 3090 and in some cases it can even win. The 3090 also wound up creating a new product stack, the x90 name plate. But it is just a name, they can name their top GPU whatever they want. Still, creating a new naming tier above the old x80ti means they can have a product priced higher. Notice that the 3080ti was in fact the same general price the 2080ti was (the 3080ti has no official MSRP, but most AIBs pin it to about $1200). The 3090 was priced at $1500, above the 2080ti, because well, it was a x90 now, just look at the 24GB, LOL.
That is what is going on with the 4000 series. It is all about competition, and Nvidia is cranking the clocks way high in order to win, and that has led to a 4090 that might use 450 Watts. They want the performance crown. Jensen Huang is extremely competitive, and also pretty smart most of the time. They know how important the crown is. The crown is vital to keeping the mindshare of gamers and other users. If AMD takes the performance crown, it creates a trickle-down effect all the way through the product stack. You often see people buying a 3050 or 1650 instead of the AMD competition simply because it is Nvidia, even when AMD has better products priced cheaper. If you sort GPUs by popularity at almost any store website, you will find Nvidia dominating, yet AMD actually beats Nvidia at several tiers in performance. Sure, Nvidia is the only card for Iray, but guys, Iray is pretty niche. There is no doubt that in general the CUDA platform is a huge advantage for Nvidia across content creation software. But in video games AMD has very compelling products to compete with, and that is still a prime target for these companies.
My 3060 is an EVGA Black model, which I believe is the most basic one. It was the only one I could sign up for back when they launched. I am fine with it, though; the temps are still in spec, it is not like it is actually overheating. The funny part is that all the noise comes from that card, even though it is the smaller, weaker card using less energy. The fans spin much harder to cool this little card. One of my 1080tis was an EVGA Black, and the temps on the 3060 Black are very similar to the temps of that 1080ti Black, even though the 1080ti used more power.
My main system is my laptop and when it starts rendering it sounds like an aircraft taking off and I can tell the render is done when the noise ends. The 3060 desktop threw me at first because it made no noise in comparison, I had to look at HWinfo to see if the fans were spinning. Of course my 3060 isn't nestled next to a monster 3090 like yours is!
My mates are like 'those are good gaming systems you have there' and my response is 'I don't game, I use them to make pictures of fairies' lol.
...I'm the same. Last PC computer games I played were Civ3 and Duke Nukem 3D.
Rather spend my time making pretty pictures and illustrations for stories.
Perhaps a fault? I haven't seen much over the low 90s Celsius in 'normal', non-crypto use on the 3090 FE memory temps, though I haven't used it during the middle of a summer yet. Are you sure you weren't mining in the background when you saw those mem temps?
Edit: On the main thread topic, I can't ever see myself using a >350W card. I won't even use the 3090 that high outside of mid winter.
Edit2: I take back my post from a couple of weeks ago. I've stopped mining, removed all the additional heatsinks I put on the backplate, and returned the case orientation to standard. Now it's noticeably much more noisy when Daz rendering. I didn't really give the Heath Robinson heatsink setup enough credit before.
No mining going on. Mine is an early production card, and it isn't uncommon to see people reporting those temps and worse while gaming or rendering. There's an awful lot of info out there about swapping the thermal pads and the benefits it can bring; maybe they changed them for something better later on, or there was some other silent revision.
I tried swapping the case fans, taking the front off the case, and undervolting, but the VRAM temps still hovered around 100 C with the core in the high 60s. I could live with the temps, but when the VRAM goes above 102 the fans ramp up and the thing sounds like a hoover. I tried to RMA mine and the store said that the behaviour was within the expected range. I wasn't the only one to get brushed off; I know of someone that bought multiple cards for video work and returned half of them for unacceptable fan noise or coil whine. Mine was also the worst card I've had for coil whine.
I gave up fighting it and stuck a water block on, which got the temps down and mostly cured the whine.
Good post and I agree. 800W would be too much. But home circuits can handle it.
15A circuit = 15A x 120V = 1,800 Watts
20A circuit = 20A x 120V = 2,400 Watts
Now you only want to draw about 80% max on either circuit, even though you technically could go higher; it is just a safe number to use. So 15A could support 1,440 Watts, and 20A, 1,920 Watts.
Some people go with 220V to overcome those limits and bigger wires and breakers.
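For anyone wanting to check their own circuit, here is the same math as a small sketch (the 80% figure is the usual continuous-load rule of thumb; swap in your own voltage):

```python
# Safe continuous load for a household circuit using the common 80% rule of thumb.
def safe_continuous_watts(amps, volts=120, derate=0.8):
    return amps * volts * derate

print(safe_continuous_watts(15))             # 1440.0 -> 15A @ 120V
print(safe_continuous_watts(20))             # 1920.0 -> 20A @ 120V
print(safe_continuous_watts(16, volts=230))  # 2944.0 -> a typical European circuit
```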
I have also mined cryptocurrencies with GPU cards and have run seven 1070TI and six RX580 cards for a total of 13 GPUs on one 20A grounded circuit. The rig would draw around 1400 watts continuous power on a 20A circuit. I had a few of these rigs and over 40 GPUs in the farm.
One rig was connected to a 20A quadruplex receptacle via two 865 watt UPS devices. Well, one of the outlets melted. I noticed the rig going offline and shut it off overnight. The next morning I went to investigate and discovered the melted outlet. It was in a metal box mounted on a CMU wall with conduit, so fortunately a fire did not break out. When I took the box apart, I discovered that the insulation on the neutral copper pigtail wire connecting to the conductor wire in the metal box had melted away. There was an oily residue in the box. I cut the melted wire off, was able to reuse the rest of the conductor wires, and cleaned the oil off the metal box. This time I really tightened all the connections much more. A connection that is not tight can lead to heat buildup and a voltage drop.
I have another rig that, due to its location in the house, had to be connected to two 15A circuits, of which one has a ground and the other is ungrounded. That way the load was split over two circuits: two power supplies ran off one circuit and another two off the other. The four power supplies were plugged into UPS backups. This worked fine for over 3 years. However, earlier this year my wife noticed a bad burning smell in the home. I was not home at the time, and of course all the rigs were mining Ethereum. I returned home and noticed the bad smell. I checked the GPU temps and all were fine; all the GPUs were mining. I started to track down the smell, and it turned out the ungrounded 15A outlet was starting to melt. I had been using one of those two-prong to three-prong adapters to power half of the rig off that outlet. It worked fine for over 3 years, but now it was failing. I ended up replacing the outlet with a duplex 15A ungrounded GFCI, and it's been fine since then. It has three prongs, so I don't need to use a little adapter on that outlet anymore. Getting a ground to that outlet would take some effort. I guess I could try to run a separate copper ground wire to the outlet box, but it would have to run to the ground bar in the electrical panel.
Never going to draw a lot of continuous power through a little two prong to three prong adapter again. Fortunately that outlet box was also metal and did not set anything on fire. However we have some stuffed dog toys near those outlets and if it had been in the wrong spot it could have combusted. A little bit reckless I know.
A single 800-watt GPU is doable depending on the PSU and the other hardware installed, but two of them wouldn't work without an upgraded circuit to whatever outlet the system will be getting its power from. I'm not even sure what kind of a PSU you would need for something like that. I've seen 2000-watt ATX PSU for mining rigs, but even one of those things wouldn't be enough to handle two 800-watt GPUs if they are prone to having the same power spikes as the other Nvidia GPUs. I think you would have to undervolt them, but that would defeat the purpose unless all you plan to do with them is mine crypto, which seems to have lost its viability in regards to profit.
The thing is, while yes it can work, that assumes you only have your PC on that particular breaker. If you add other devices, it all adds up. What if somebody has multiple monitors running, audio equipment, other devices? I imagine that your mining rigs were isolated from everything else.
But your experience with the non grounded outlet brings up another issue, old housing.
So while some places can handle it, I can be quite sure a number will have issues because of bad wiring or old, outdated wiring. If you have an old enough house...you could be in trouble. Plus that does not consider transient power spikes that can happen. GPUs can fluctuate rapidly in their power draw, with quick but very high spikes. This is becoming such a concern that some hardware tech channels are attempting to investigate it, given the rumors around the 4000 series. These spikes can be very fast, so they can be hard to catch. They can also trip the safety measures of a power supply, meaning the PC just shuts off suddenly without any warning or crash report.
Here is a video by Gamers Nexus that covers the issue. It is long, but very in depth. They explain a lot of things, like why this issue is becoming more noticeable now as opposed to years past, and why an 800 Watt GPU could pose a big problem even if everything was built to spec.
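For what it's worth, you can also log your own card's board power from software, with a big caveat: driver-level polling is far too slow to catch the millisecond-scale transients discussed in the video (that is why these channels use oscilloscopes and shunt rigs). Still, a rough sketch like this, again assuming the nvidia-ml-py (pynvml) bindings, will at least show sustained draw while you render:

```python
# Poll GPU board power every 100 ms and track the peak seen (Ctrl+C to stop).
# NVML polling cannot see millisecond transients -- this only shows sustained draw.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

peak_w = 0.0
try:
    while True:
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # reported in milliwatts
        peak_w = max(peak_w, watts)
        print(f"now: {watts:6.1f} W   peak: {peak_w:6.1f} W", end="\r")
        time.sleep(0.1)
except KeyboardInterrupt:
    print(f"\nHighest sustained draw observed: {peak_w:.1f} W")
finally:
    pynvml.nvmlShutdown()
```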
An EVGA 3090ti is 450 watts. Add another and you have 900 watts. 900 watts does not seem out of the question if the card does what two 3090ti cards do.
Most people in the developed world live in apartments, and those usually are held to much more stringent building codes than houses. In any case, people should own fire insurance (today more commonly called homeowner's insurance because of some auxiliary coverage included) and, if they can afford it, flood insurance. Apartment renters should buy renter's insurance. All of that adds up. To help protect against electrical circuit faults, some insurance companies have entered into an agreement with Ting: if you carry fire insurance with them, you get supplied, free, a device that you plug into your dwelling's circuits to monitor for electrical faults. You may be able to purchase the Ting service if you don't carry fire insurance.
If your dwelling uses a circuit breaker box, then you are usually in good shape, only lacking grounding in some of the older homes with circuit boxes. You can rectify that by buying GFCI outlets to replace the ungrounded old outlets, or pay to have your circuits and wiring upgraded. Not a big deal if you get the right electrical business involved.
I've read that one should buy a surge-protecting extension cord rated at least 2,000 joules, with 4,000 joules preferred, so keep that in mind when powering any device with sensitive electronic circuits. Trouble is, those are bulky and unsightly decor in your home. You can buy surge protectors that are not extension cords but plug directly into wall outlets, rated up to 1,650 or 1,500 joules, which is getting close to the advised 2,000 joule minimum. The average surge-protector extension cord is typically rated 650 to 900 joules. It does happen that electrical storms cause surges that destroy TVs, telephones, and other electrical devices; I have seen it happen first hand in multiple places in the USA over the decades. Not in Europe, though. I have trouble even remembering an electrical storm in Europe, although there must have been a couple in all the time I was there. We did once get caught in a storm in Ticino where 2 or 3 big trees fell on paths we had just exited on the way to the hut, but I don't remember thunder or lightning even for that.
Depends on the country, imo.
The UK is keen, of course. Old wiring, which is common enough, may cause issues, but our circuits will take 32 amps, with 13 amps from one socket, although 3 kW for a PC would be a shocking electric bill! (No pun intended!)
Electrical faults are an extremely common cause of fires, with tens of thousands of fires attributed to this over just a 4 year period.
The following comes from https://www.nfpa.org/News-and-Research/Data-research-and-tools/Electrical/Electrical
Report highlights
- Fire departments responded to an estimated average of 46,700 home fires involving electrical failure or malfunction each year in 2015–2019.
- Fire departments responded to an estimated average of 32,160 home fires involving electrical distribution and lighting equipment each year in 2015–2019.
- These fires caused an estimated average of 430 civilian deaths and 1,070 civilian injuries each year in 2015–2019, as well as an estimated $1.3 billion in direct property damage a year.
- Home fires involving electrical distribution and lighting equipment most often originated in a bedroom.
- Wiring and related equipment was involved in just over two-thirds of home fires caused by electrical distribution and lighting equipment.
- Approximately one-quarter of these fires occurred between midnight and 8 a.m., but they accounted for just over half of the deaths.
So no, we cannot just assume buildings are all built to code. It goes deeper than that, with many fires caused by people simply plugging too much stuff into an outlet, or using poor grounds. The US electrical standards are really quite lax, and there are a large number of people who are totally ignorant of their own safety. It is far too common that people use skinny extension cords for power hungry devices instead of properly sized cords. Using an improper cable bypasses any safety measures that a breaker or outlet may provide.
I am not saying that GPUs will start fires. But building codes don't mean a whole lot in the scheme of things. Old buildings can be totally overlooked by new standards. When I got my house, I was quite disappointed by how quickly the inspector went through the house. I can't help but think it is no wonder stuff slips through the cracks.
Computer power supplies have come a long way over the years, but there are still times when they fail to do their job. There are some famous cases where GamersNexus found power supplies that failed in spectacular fashion, with an explosive pop. I shudder to think what could happen if such a PSU were paired up with a power hungry GPU like the 4090.
Even GPUs themselves have come under scrutiny, with multiple failures being reported after the launch of the 3090, and that is "just" a 350 Watt card. It stands to reason that power hungry GPUs add to the complexity of the board design. Cooling something that is producing 600 Watts of heat is not an easy task, and probably is not that feasible in a consumer product. 450 Watts really has to be the ceiling here; going much past that is just insanity and would make the product look like a joke. I am not talking about the EVGA Kingpin cards, which are specially built for professional overclockers who are always trying to push the absolute limits and know what they are doing. Kingpin cards will always be an exception here. I am talking about real consumer products built in mass quantities.
...that's why I wish I could afford the pro grade cards, as they have a lower power demand than their gaming counterparts. For example, the RTX A5000 tops out 120w lower than a 3090. True, it doesn't have the same core counts, but it still does the job quite admirably. Two of them can be NVLinked for memory pooling as well, which also doubles the core count, and at peak a pair of A5000s would consume only 10w more than the 3090 Kingpin or a 4090.
You don't have to run cards at full power. I only ran the 3090 at 350W when I wanted it to warm my house in winter :D
The pro cards probably have a better binned chip, but not to the extent they are worth paying much extra for imo. They either have more memory, or they don't.
I never bother with overclocking anything in any computer. Too expensive for me and as you all point out, a higher risk of danger.
I was going to mention that too. It's only in recent years that our 3 pin socket kettles were de-rated from 3kW to 2.4kW to meet the lowest common standard across Europe.
Frustrating that the rest of the continent couldn't be upgraded to our standards.
...one other advantage of the RTX A series is size, as they are standard dual slot and thus smaller in overall dimensions (basically the same size as my old Titan X, which fits in my current case with just enough wiggle room).
Also, I don't have to underpower the card, as it already peaks at 20w less than my Maxwell Titan X (230w), which my PSU handles just fine.
Well, another rumor making the rounds now is that Nvidia may have delayed the launch completely, and we may not see the RTX 40xx till next year. One line of reasoning behind this is that there is an overabundance of new RTX 30xx cards still available, and one suspected reason for that is crypto going downhill.