Comments
You are totally fine! Like I said above, your biggest concerns are physical size and power draw. Your PC is quite solid. If you go with a 3090 that has 24GB, you might need additional RAM if you plan on using most of that VRAM. So you may consider going to 64GB if that is an option.
If 64GB is not an option, do not fear. You can try it out and see how it goes. But if you start running out of RAM then you might want to consider an upgrade. The 6700 should be able to handle 64GB, so the only question is if your motherboard can. If it doesn't, then you could just buy a new motherboard rather than an entire system, which of course would be expensive.
I'm no expert but I upgraded from this:
Intel i5 4670K, Gigabyte Z97X-Gaming 7, EVGA GTX 1070 Gaming, 2x 8GB Patriot Viper 3 DDR-3 1866 memory, Samsung 860 EVO 1TB SSD, Hitachi HDT721010SLA360 1TB HDD, Western Digital Blue WD20EZRZ 2TB HDD, Corsair HWX 1000 80 Plus PSU, HP DVD1720 optical drive, CoolerMaster CM 690 II Case, Samsung SyncMaster P2370 Monitor @ 1080p, Windows 10 Professional 64
To this:
AMD Ryzen 7 3700X, Gigabyte X570 Aorus PRO WIFI , EVGA GTX 1070 GAMING, 32 GB G.Skill Trident Z Neo DDR-4 3200 memory, Samsung 860 EVO 1TB SSD, 2 Western Digital Blue WD20EZRZ 2TB HDDs in Raid 1, EVGA Supernova 1200 P2 PSU, CoolerMaster H500P Mesh White case, Samsung SyncMaster P2370 Monitor @ 1080p, Windows 10 Professional 64.
I specifically upgraded to get PCIe 4.0 to get the best performance out of current and future GPUs, and I am very happy with it. My old system build was 5 years old except for the GTX 1070, which was an upgrade from a GTX 960 4GB card. I also replaced my power supply because that Corsair PSU was about 12 years old when I upgraded it last year, and I moved the EVGA PSU to this system. I do miss having drive bays and an optical drive in the system, but I have my DVD drive in a USB 3.0 enclosure in case I need it and have not needed it so far. Will probably need it when I wipe and reinstall the SSD later this year.
I also really like the CoolerMaster H500P case. It is like the Grand Canyon inside this thing, and the white paint reflects all the light from the installed CoolerMaster RGB fans. I haven't had a system with a bunch of bling since my old Antec Lanboy case with the 5.25" vertical drive bay next to the case window, where I mounted my WD Raptor X clear-lid drive with a blue bar light and a bunch of blue LED fans to show it off. It was a real hit at LAN parties, watching the read/write head move back and forth when I would defrag it.
Just waiting for the RTX 3000 series reviews to see if I will try to wait for a 16GB RTX 3070 or 20 GB RTX 3080, go for broke and max out my credit card for a RTX 3090, or see what the used market is and maybe pick up a pair of 2070 Supers cheap and an NV Link to get my 16GB of memory. I am wanting to do some comics that are mainly Sci-fi and the ship assets I have seem to go to CPU quite regularly and it is really frustrating at times.
I don't see much return on investment for the money needed to upgrade this setup unless you need an extra 30fps in gaming, or 300fps at 1080p for that new 360Hz GAMING monitor. Or you are a developer or professional content creator with specific needs.
I am not a gamer nor a content creator. I am just fed up with slow renders and ridiculously slow animation renders. I don't really think that a 3090 is realistic for my budget, other than I had a hint from a family member that I might get some help towards a PC upgrade. Just in case, I measured my PCIe slot space and I have 32 cm to the drive bay. That's just over 12.5 inches. NVidia say the FE 3090 is 31.3 cm (12.3"), so it is a squeeze. OTOH, a 3080 or 3070 would fit easily, but I will have to wait to see if the extra VRAM is going to come with the Ti or "Super" versions.
I have a 750W PSU so that should be ok.
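For anyone doing the same back-of-the-envelope fit check, here is a minimal sketch of the clearance math in Python. The 32 cm of clearance and the 31.3 cm FE 3090 length are the figures quoted above; the rest is just unit conversion, so treat the numbers as approximate:

```python
# Rough card-fit check using the figures quoted above (approximate).
CM_PER_INCH = 2.54

def spare_room_cm(clearance_cm, card_length_cm):
    """How much room (in cm) is left after installing a card of the given length."""
    return clearance_cm - card_length_cm

clearance_cm = 32.0   # measured space from the PCIe slot area to the drive bay
fe_3090_cm = 31.3     # NVidia's published length for the FE 3090 (12.3")

spare = spare_room_cm(clearance_cm, fe_3090_cm)
print(f"clearance {clearance_cm / CM_PER_INCH:.1f} in, "
      f"card {fe_3090_cm / CM_PER_INCH:.1f} in, spare {spare:.1f} cm")
```

That works out to roughly 0.7 cm to spare before accounting for the power connector, and third-party cards can run longer than the FE, so the margin can disappear quickly.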
...should fit in my P-193 as a Titan X does, with room to spare. The triple width isn't an issue either, as with 24 GB and 10,000 cores I pretty much only need one (it would have to go in the #2 PCIe slot). As that system is on W7 Pro, there's no issue with W10 WDDM robbing a significant portion of VRAM. Already have a 750W PSU (which is recommended) and an 850W on standby just in case.
It should fit, but note that the Titan X is 26.7 cm long, while the FE 3090 is 31.3 cm. The power cable on the FE fits into sort of the middle of the card. The 3rd party cards appear to have them towards the end like most other cards do. Some AIB cards might be even longer than the FE. If you are using the 2nd slot on the motherboard, you might want to make sure it has room under it for air since it takes 3 slots. You should have plenty of power unless yours is overclocked to the moon, which I don't think would suit Daz that well anyway.
If it's 'more VRAM at a cheap price', there are some nice deals on Titan RTXs. I'm keeping an eye on those Quadro 8000s too :)
If you have a recent PC that you are happy with and that supports PCIe 3, then I see no reason why you won't continue to be happy with it for some time.
See how it goes, before spending cash you might not need to.
Just compare the speeds you get with others, and take particular note of those with PCIe 4 systems and how their render times compare to yours.
Do some render tests using the scenes available.
Your two main concerns:
1) Will it fit?
2) Do you have sufficient power available?
Not all 750W PSUs are created equal; not all branded 750W PSUs are created equal.
An overtaxed PSU can blow, and it is possible (although not likely) that a PSU that fails can take components with it.
I think the questions of will it fit and is 750W enough are moot at the moment because I'm still thinking my most likely purchase will be a 3070 sometime next year, assuming the 16GB version is released. Nevertheless, if I were to luck into having the cash for a 3090, I can't see why 750W wouldn't be enough ... even NVidia say that this requirement is assuming an i9 10900K CPU. The GPU itself is rated at only 350W.
Can't see?
Seriously, whilst I agree that it should be, gambling hundreds isn't something I would do. If you don't know you are trusting to luck.
NVidia are going to have a lot of complaints if their cards suddenly start blowing 750W PSUs when the published specs recommend a 750W PSU. The one I have in the picture has some 5-star reviews, so it isn't some cheapo sub-standard knock-off.
...don't see the Titan RTX coming down in price to challenge the 3090 though. RTX Titans are still priced around $2,500 (or more; Walmart is asking over $3,100 for one).
Yeah, the RTX 8000 dropped from the lofty price of nearly $10,000 at introduction to around $6,000 depending on the outlet (Amazon has several at $5,230). Quadro 6000s are still in the $4,000-$4,500 price range, so that Quadro 8000 may end up being the better deal in the end. 48 GB would really be super. Though 10,000 CUDA cores, mmm, juicy.
...did some measurements, and my case has enough room: I have 14.5" of space from the back of the case to the drive cage and 6.5" from the interior of the left side panel to the MB. A tight fit, but not unmanageable.
...yeah, I have a 750W PSU and an 850W in reserve. Still running a 6-core Westmere Xeon X5660 that only peaks at 95W and 24 GB of DDR3 1333 memory, so that shouldn't pose an issue.
My case has room for 13.78" or 350 mm (35 cm) according to PCPartPicker.com, and I'm not buying a new one to run a 3090. I expect that in two generations of GPUs a new architecture will completely and totally blow the 3090 out of the water and be all I need for many years too, so I don't want to break the bank now on a 3090 anyway.
Hmmm... two 2080ti's to one 3090... If the test numbers go the way it looks like they should, this might be an update worth doing.
Well yeah, two generations on from any generation will pretty much blow the old one away. The 1080ti is now going to be two generations old. Looking at Daz Iray, the 2080ti already destroyed the 1080ti with its new ray tracing cores. Even the 2060 is faster at Iray than the 1080ti; it only lacked VRAM. The 3080 and 3090 will make the 1080ti feel quite old indeed.
You can go further back as well. Just look at the 980ti vs the 2080ti; that comparison would be a joke. The 1080ti vs the 780ti. We can go on and on.
The next generation is supposed to start using multiple chips per GPU, very similar to how AMD Ryzen uses chiplets right now. If this pans out, the implications will be literally game changing, as overused and dramatic as that may sound.
That is not just because of performance, though that will certainly be a factor. The real benefit to chiplet GPUs will be yields. Currently they pack as many transistors as possible on a chip, and to get more they need to scale up to larger chips. But with MCM that changes. They will be able to use smaller chips packed into a single package, which is exactly what AMD does with Ryzen. Because the chips can be smaller, that will make yields better, and chips can be potentially cheaper. The upsides are huge. That would mean it should be easier to manufacture in mass quantities.
The downside may be how fast the different chips can communicate with each other. That is the one and only thing that holds AMD Ryzen back... they have just a tiny amount of latency between the chiplets, which allows Intel the one and only performance advantage it still has left. GPUs are even more sensitive to this, which is why this has not been done yet. (And no, past dual-chip cards don't count here.)
This also raises other possible questions. With MCM, GPUs can scale up super high, so the only thing stopping them from going too wild will be power draw. So you guys think Ampere draws a lot of power? I think the next generation might actually draw even more if they go crazy with performance again. What would be stopping them from making a 450 Watt 4090? You might think that is crazy, but some of you thought 350 Watts would be crazy, yet here we are, and people still want it. People would still want a 450 Watt card as long as it had the performance to match. Keep in mind that some dual-chip GPUs in the past had insane power draws. The 4870x2 could draw over 300 Watts AT IDLE! You read that correctly. That was for the whole system, but holy smoke... like literally holy smoke. My system with two 1080tis runs between 70 and 85 Watts at idle. The same system cranks up to 750 total Watts under load. The 3090 isn't going to be doing that unless you pair it with some power-hungry server CPUs. The point being, they have built some wild GPUs in the past.
I do wonder, where do you personally draw the line on power? For me, I think the 3090's 350 Watts is about as high as I would want to go. My two 1080tis can make my room a couple of degrees warmer, so it does give me pause to consider going too much higher. A 3090 would use less power than two 1080tis, so there is that, though I am thinking of keeping a 1080ti if there is room and if it performs well with the 3090 when VRAM allows. That would be a very interesting pairing.
...if it gets too ridiculous, the average user will be left out, as you would need a 220V/30 amp line to run your computer. If you rent, you would be out of luck because the only 220V/30 amp circuit most apartments have is for the stove.
I think it costs about $600 to get an electric car charging station installed. But it will have to be adapted to that weird 12-pin connector.
...maybe Newegg will sell adapters.
The "average user" has no need for that kind of GPU setup.
They don't have to blow a lot, only 'mine'; it isn't my responsibility to check if other's PSUs are fit for purpose. It is, however, my responsibility to ensure my own is, it isn't something to abrogate to some third party.
I'm done discussing this, please yourself.
As someone who has a 980ti, I'm really looking forward to the upgrade; I have a 1200W PSU in great condition and enough space for two 3090s. I don't have plans for a second, but I may upgrade the other card, which will soon be the 980ti, as that will take over from the 970 that currently drives my monitors and does struggle a little at times.
That thing is only on the Nvidia cards. All the AIBs refused to put it on their versions. No one seems to know what Nvidia was trying to do, except annoy people.
On the power issue, 320W for the 3080 is high but not outrageous. If Nvidia is being honest, and there would be leaks from the AIBs if they weren't, there is no reason to expect this to be any different from previous such TDPs. It will hit this under things like FurMark and briefly in games, but otherwise it won't happen.
Mathieu, the creator of E-Cycles, told me that a 2080ti can draw 400W when its stated maximum is 250W.
I'll be adding a huge margin when choosing my PSU.
Did he say under what circumstances and for how long?
It was in the context of some flakiness I was seeing in the Linux driver (that has subsequently been corrected), so not really. But he did say something pretty damning:
And true enough, now that I have my work computer at home on the same circuit, I can't reliably render with all 4 GPUs anymore without occasionally tripping the 20A circuit breaker.
250W is the TDP, not the maximum power draw. The rule of thumb when buying a PSU is usually adding up all the TDPs and multiplying by 1.3/1.4 for a premium PSU and 1.5 for the cheaper ones. That's pretty much how Nvidia came up with the 750W PSU recommendation for their GPU rated at 350W TDP.
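As a rough sketch of that rule of thumb: the 350W GPU TDP is from the posts above, while the 125W CPU figure (an i9-10900K-class chip) and the ~100W allowance for the rest of the system are assumed ballpark values, not vendor numbers:

```python
# Back-of-the-envelope PSU sizing: sum the component TDPs, then apply a safety factor.
# Only the 350W GPU TDP comes from the thread; the other figures are assumed ballparks.

def recommended_psu_watts(tdps_watts, factor=1.3):
    """Apply ~1.3-1.4 for a premium PSU, ~1.5 for cheaper units."""
    return sum(tdps_watts) * factor

tdps_watts = [350, 125, 100]  # GPU (RTX 3090 TDP), CPU (assumed i9-10900K class), rest of system
print(recommended_psu_watts(tdps_watts))  # ~747.5 -> round up to the 750W Nvidia recommends
```

With those assumed numbers the rule lands almost exactly on Nvidia's 750W recommendation.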
Even knowing that, reading 400W still shocked me.
My current desktop, as seen in my signature, is drawing 6W - 12W basically at idle with Windows 10 build 2004, DS Pro Public Beta 4.12.2.6 (scene open but idle), and MS Edge with about 30 tabs open while I type this, so that's respectable. I'm quite happy with the speed, quietness, and even a reduced electric bill compared to when I was using my 2012 HP 8470P Elitebook. Things have improved a lot compute-wise with regards to power.
For a multi-chip GPU in two generations with 32GB+ VRAM that had the speed, I'd be willing to rebuild my PC, or build a new one, for the GPU to draw 500W and the rest of the system, mostly the CPU, to draw another 500W, understanding that it would only draw close to such power when compiling programs, rendering scenes, or doing extreme specialized math and AI fast. Those are really about the only personal scenarios, and it would finish them quickly enough to keep my electric bill affordable. I'm not talking about a scenario where I am using it to crunch cryptocurrencies 24/7/365, which would leave me struggling to pay the increased electric bill every month because my PC was drawing 1000W 24/7/365.