Curious as to what GPUs everyone is using.
The challenge with forums is trying not to ask a question that's likely already been asked (I thought searching on "GPU" would return too many hits). Then again, the technology changes so rapidly now that I figured I wouldn't be committing a huge faux pas.
So... I'm curious as to what everyone is using for GPUs. I have an NVIDIA GeForce GTX 1660 Ti with 6GB of GPU memory, and I have quickly realized it might be a wee bit too small. Any rendering of decent size seems to swallow up the GPU memory really fast, and my only solution is to close and re-open DAZ to get it to release the memory. I'm also a Data Scientist, so I use this same machine to train neural networks, which can also hammer the GPU's memory. So I'm considering upgrading, but I want to do some research before I shell out a few hundred bucks.
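For what it's worth, on the data-science side this is roughly how I check whether it's my own training code holding on to the memory - a minimal PyTorch sketch, assuming a single CUDA device (it can only release the cache held by this Python process, not VRAM held by DAZ itself):

import torch

def report_vram(tag):
    # How much VRAM this process has allocated/reserved on GPU 0, in GB.
    gib = 1024 ** 3
    alloc = torch.cuda.memory_allocated(0) / gib
    reserved = torch.cuda.memory_reserved(0) / gib
    total = torch.cuda.get_device_properties(0).total_memory / gib
    print(f"{tag}: {alloc:.2f} GB allocated, {reserved:.2f} GB reserved, {total:.2f} GB total")

report_vram("before training")
# ... training loop goes here ...
report_vram("after training")

# Drop tensor references first, then return cached blocks to the driver.
torch.cuda.empty_cache()
report_vram("after empty_cache")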
Thanks!
Dave
Comments
I think it's safe to say there is no commonly agreed-upon best choice for GPUs, for a multitude of reasons. And I also think it's safe to say that now is not a good time to be looking to buy a GPU, since availability and prices are a big issue. As in, there is no availability, and prices are insane.
Also, it was found years ago that the new RTX GPUs have performance that is very much tied to scene "complexity", and their performance can vary drastically depending on that. So there really is no simple benefit-to-cost ratio for GPUs now. It depends on what you're willing to spend, the content of the scenes you render, how much VRAM you need, how fast you want your renders to be, and on and on. Some will recommend you get the fastest, newest, coolest GPU out there, or wait a few months for the coming fastest, newest, coolest GPU. But if you don't need that, then it may be a bit of a waste. And at the end of the day, with all the hype, you personally might only see a 30% improvement in render times, which may not be worth it for you.
Others (like me) will suggest you first consider other ways to get faster renders and more lightweight scenes that require less hardware before shelling out big bucks.
So unfortunately there's no good answer, IMO. It depends.
I'm still on a GTX 1070 with 8GB of VRAM. I completely understand your pain with scene sizes. I think what Ebergerly said is probably wise, with the caveat that, in my experience/understanding, more onboard RAM will allow larger scene sizes. I personally saw a huge increase when I upgraded from the 970 (4GB), so my plan is to sink money into one of the higher-end cards - if/when the bitcoin miners stop buying them all up and the prices stabilize!!!
In the meantime, when I do larger scenes I often have to break them up into multiple renders and then put them together in Photoshop. I usually do this just by setting up the scene and then hiding various elements so I get a background, then a foreground - sometimes more layers. Anyway, that's my 2 cents. I'm certain more educated people will weigh in.
I've been in the hobby for almost 2 decades and I always feel like I'm 2 generations behind the available tech, and my workarounds are always useful. I can do bigger and more realistic images now, but the process is still the same!
2080 Super + 1070
I'm using a 3090 and I couldn't be happier.
980 Ti and 1080 Ti. I plan to add a 30xx when I can get it for not more than $100 over MSRP. That will demote the 980 Ti to running the monitors, and I can pull the GT 740 currently doing that.
The 1600 series of GPUs are crap-tier cards that should never have made it to market. Seriously, the GTX 1080, even the GTX 1070, is a far superior card to the entirety of the rotten 1600 series, and it was a previous generation!
I hate the 1600 series for its crappy performance versus the GTX 1070/1080 and its measly VRAM at only 4-6GB; too many people have been led astray from getting competent GPUs!
On topic, I was pushing the limits of what I could do with my stalwart 4-year-old 8GB VRAM GTX 1080, up until the point that I lucked out on a render-ten-scenes-at-once 24GB RTX 3090! (Seriously, you'll have a better chance at getting one with this site/app!) Now all I need is to upgrade my old 850W PSU to at least 1000W, as I already pushed the limits of my PC and caused a shutdown...
Seems like encoding 2 videos at once while gaming was a bit much for my rig!
I got mine at 100% MSRP via Hotstock. If you live in the US or UK, I'd go that route!
This is incorrect. If you live in the UK/US, you can get a 3090 via Hotstock; that's how I got mine on January 5th. Availability is strong, as the 3090 has been stocked twice a month - one just dropped on the 9th and another on the 25th via Best Buy at 100% MSRP...
I'm in NZ and we have 3090 stock on the shelves, although at somewhat inflated prices compared to MSRP. However, that is not unusual, because whatever is quoted as MSRP in US$ never translates, by exchange rate, to the same price in NZ$. I found that was also the case when I lived in the UK, where the US$ price would usually show up as the same number of UK pounds sterling.
I had saved for a 3070 but was disappointed that it came out with only 8GB of VRAM, which was no improvement on my existing GTX 1070. I was resigned to sticking with that 1070 until my family worked out a way to make up the difference between a 3070 and a 3090 (involving my savings, selling my 1070, and some generous help). So the 3090 is now blowing hot air through my PC vents and I'm a happy old man (for a while, anyway).
Congrats, I hope it serves you well over the years!
I was originally going to just go for the 3080, as the 3090's price was $300 over the $1200 I had saved up for the previously targeted 2080 Ti, but when they came out with the official specs, my mind was made up: screw the 3080, my rig needed a 3090 and that sweet-sweet 24GB of VRAM!
I use an Intel Mac mini, which works just fine. I also have a new M1 Mac mini, but since Daz doesn't currently work on the Big Sur version of macOS for either architecture, I set up screen sharing from the Intel machine and work on my M1's screen. Screen sharing works surprisingly well. I just have to make sure I don't accidentally allow the Big Sur OS update on the Intel mini until Daz support appears.
Twin 2070 Supers, NVLinked. Render speed isn't my bottleneck since I don't do animations, so it's fast enough, but that NVLink is pretty sweet for getting around those running-out-of-VRAM issues, which I used to hit a lot.
I almost got one via Hotstock at Best Buy but the bots were quicker, aided by a bug in the Best Buy site that kept my cart page on their site from being updated with the video card I had on hold in my cart.
I am using a 3080 and over the moon with it.
RTX 3090. Got lucky and managed to grab one before they disappeared and so far I've been really satisfied with it.
4x 2080 Ti.
The speed is good, but I'm limited by the 11GB of VRAM, and I never did get NVLink to work.
I will research better and upgrade to probably 2x 3090 at some point, but I'm not going to pay those crazy prices. I bought a K80 (for Blender Cycles) just to see if my scenes would fit into 24GB without using the magical simplify button, and it turns out that most of my scenes just barely exceed the 11GB. Thank god for hair particle systems.
I can't say that I understand how VRAM is allocated in DAZ Studio/Iray, but for all the time I was using my previous 1070, 8GB seemed to limit me to 3 characters at most in a scene. I could almost guarantee that an additional G8 figure would cause a drop to CPU rendering. Now that I have a 3090 with 24GB of VRAM, something odd is happening. Those same scenes with 3 characters are now showing (in GPU-Z) as occupying more than 8GB - usually around the 9.5GB mark - while four characters take me above 12 or 13GB. So it seems to me that Iray is somehow adaptive and will take more VRAM if it is available.
That really sucks that they got to Best Buy. I hope every one of those scalpers gets a rampant worm and/or has their PC stolen; I really detest those people!
I have a GTX 1060 in a laptop. Of the issues I have with my current setup, the GPU ranks low on the list - definitely behind too much glare from the sun, in terms of things that make rendering more difficult.
:-) Currently using 1x 3090 and 2x Titan RTX - I ordered two 3090s but only the first one has been delivered. When the second 3090 arrives, I will use one Titan for display and the two 3090s for rendering. Pretty satisfied with render times - not so much with the power bill...
GTX 1660 6GB is sufficient for my needs.
Thank you, everyone, for the thoughtful and insightful replies. I think I will aggregate the guidance and hold off on replacing the GPU for now (and yeah, I agree, the 1660 series is pretty sucky). Instead, I'll put more effort into more effective rendering. I'm sure I have WAY too much "stuff" in my scenes. I will delve into some tutorials on the subject.
Thank you again for taking the time to reply!
Dave
I don't see that as particularly surprising; this is a common idiom all over computing. The driver is probably smart and does indeed adapt to the memory that it has.
I can speak to Linux specifically: it will ALWAYS use all of your physical memory if it can, in order to improve your user experience by doing things like caching I/O and reading ahead in anticipation of future I/O requests, because it is extremely cheap to give the memory back to a process that actually needs it. It's easy to believe that your driver is doing something similar.
Yep, I remember that about Linux. I mentioned this VRAM thing because I imagine a lot of us determine what hardware we need by what GPU-Z is showing for the GPU we are already using. I'd include myself in that before I upgraded. So someone might look at what a 1070 can fit into VRAM and decide that the 12GB in a 3060 probably wouldn't be enough, whereas it might. My new 3090 went over the 12GB mark with 4 characters, but a 12GB 3060 might have been fine with the VRAM allocation tuned to that limit.
Nevertheless, I am very happy not to have to worry about 8GB or even 12GB, but I think that, if I had not had help acquiring the 3090, I would have settled for a 3060 with 12GB.
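If anyone would rather log this than sit watching GPU-Z, here's a minimal Python sketch using NVIDIA's management library (pip install nvidia-ml-py; the device index 0 is an assumption - swap in whichever GPU is your render card). It reports the same total-VRAM-in-use number GPU-Z shows, across all processes:

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumption: GPU 0 is the render card
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
gib = 1024 ** 3
print(f"VRAM used: {mem.used / gib:.2f} GB of {mem.total / gib:.2f} GB "
      f"({mem.free / gib:.2f} GB free)")
pynvml.nvmlShutdown()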
I got a Titan X a couple of years ago for the 12GB... used on eBay for $400, and it handles most everything I've thrown at it.
But I have gone to rendering in layers: background, foreground, and a couple in between. Turn off the draw dome to get the same lighting, and stack them up in PS.
Putting together a 10k x 5k right now... 5 separate renders in it.
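If anyone wants to script that stacking step instead of doing it by hand in PS, here's a minimal Pillow sketch (pip install Pillow). The filenames are placeholders; it assumes each layer was rendered as an RGBA PNG at the same resolution, background first:

from PIL import Image

# Placeholder filenames: one RGBA PNG per render layer, back to front.
layer_files = ["background.png", "midground.png", "foreground.png"]

result = Image.open(layer_files[0]).convert("RGBA")
for path in layer_files[1:]:
    layer = Image.open(path).convert("RGBA")
    result = Image.alpha_composite(result, layer)  # stack using the alpha channel

result.save("final_composite.png")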
Alan, thanks for the suggestions. Yes, I really need to learn more economical ways of rendering. I will make use of your examples - thank you!
3090 .. and the speed just makes my eyes wobble ..
2x 2080 Ti and a 3090. I bought the 3090 at the store for $2,500. My friends said I was crazy and that it was expensive for this GPU. But now this GPU is already $4,000+ in the same store. I was wondering: what if my GPU crashes? I won't be able to buy the same card at the old price anymore. The world has gone mad.
I have an 8GB RTX 2070 Super, and just a couple of hours ago I rendered a scene with seven G8 characters with clothing and hair in an interior setting. Overall VRAM usage did get over 7GB, but still not over the limit.
I have a 2080 Ti. It's fine.