Comments
My 1660 Ti 6GB taps out with larger scenes, but for previews it's indispensable. If you can go with a 20xx, they appear to be excellent. My concern with the 3000s is that, with C19, the release date is going to be pushed back, and the cost of parts due to scarcity might change that price point dramatically. My 16 has actually gone up in price since last August.
Thank you very much @DustRider for those insights! I took a look and Prostar does seem like a good option although it appears to be limited to 16GB RAM. I will check out Xotic PC! Thank you kind sir!
At some point, the potential time you save rendering on a faster card seems like it's counterbalanced by the time spent waiting to buy the perfect card.
The articles I read never claimed the Navi generation you are referring to would have HW ray tracing and outperform the 2080 Ti (they said similar, or maybe even better), but that the Navi generation due out in the 2nd half of 2020 "might" do both of those things, and that the "Big Navi" coming out would have real-time ray tracing, which is actually confirmed.
So I went and read more of their claims since this came up. Remember, it's Big Navi that is being used in both the next-gen Xbox and PlayStation consoles, so I think it's a given the performance and capabilities will be a major improvement over the expensive Radeon 5700 XT GPUs. They claimed the performance of Big Navi will be about on par with the nVidia 2080 Ti but lacking compared to the upcoming nVidia 30 series, and it appears from the AMD and game-console tech reveals that they are right. I'm not sure why they call that information 'leaks' when it was the tech reveals by the manufacturers of that hardware that revealed those things.
That's not evidence, but speculation at best. The article states them to be leaked specs. Leaks can be accurate, and can also not be. Then we have "up to" appearing in the CUDA core count.
... And then again, we have, at the start: "The last few days have been pretty crazy in the GPU world, with purported specs on NVIDIA's upcoming Ampere-based GeForce RTX 3080 Ti blowing us away". You'll notice the word "purported", or at least you should have, because that is the most important word in that entire sentence.
I like reading the rumours, but that is just what they are.
Some will turn out to be accurate, and as has happened previously, some will not.
@everyone
Please don't make factual claims when someone is looking for help with a purchase.
Tell that to my GPU...
But yeah, it does make sense, as I'm trying for high realism and my characters are pretty heavy as far as mats/shaders go. Plus, I use at least 4-8k HDRIs; as I've said, I'm going for high realism. I could go for a heavily optimized scene, but that would compromise the look I'm trying to achieve!
Rather than get into a big discussion over it, let's just keep it as a subjective thing between artists and leave it at that.
But no matter, once I get my 3k GPU, the issue will take care of itself!
Big Navi only became a thing once AMD released only mid-range cards in comparison to Nvidia. They aren't even, strictly speaking, Navi cards: Navi is RDNA 1, and the new cards will be RDNA 2.
Anyone who doubts that such claims were definitely made about Navi before 7/7/2019 can easily enough use Google.
The sites that spend almost all of their time pushing these rumors are wrong more often than not. They make money not by being reliable but by screaming headlines getting gullible people to click through. Just examine the absurd, and contradictory, "leaks" about Ampere. There was a made-up set of specs circulating, which got posted here, claiming the cards were just a generational improvement in performance but also consumed 10 to 25% more power while being on a smaller process node. Anyone, and I do mean anyone, who knows computer hardware would have known that was garbage. But there it was.
I only optimize when scenes exceed my 11GB card, which is pretty rare, as I know what it can handle after more than 3 years with it.
If by high realism you mean excessive subD (what else could you possibly mean?), then you're hamstringing yourself for no gain. I've checked: a subD 3 and a subD 4 figure are indistinguishable to the eye, and nearly so even using image comparison software on an extreme closeup. But heavy mats and shaders? If you're using 4k maps for lots of stuff, you are again pointlessly hamstringing yourself. Unless you are doing closeups so tight you can see the character's pores, you won't produce a visible difference.
But please test it for yourself. There are plenty of good image comparison programs out there.
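If you want to run that comparison yourself without dedicated software, a minimal sketch along these lines works, assuming Python with Pillow and NumPy installed; the filenames subd3.png and subd4.png are placeholders for your own two test renders:

```python
# Quick pixel-level comparison between two renders of the same scene.
# Sketch only: assumes Pillow and NumPy are installed, and that the two
# renders have identical dimensions and framing.
from PIL import Image
import numpy as np

a = np.asarray(Image.open("subd3.png").convert("RGB"), dtype=np.float32)
b = np.asarray(Image.open("subd4.png").convert("RGB"), dtype=np.float32)

if a.shape != b.shape:
    raise SystemExit("Renders must be the same size to compare.")

diff = np.abs(a - b)  # per-channel absolute difference on a 0-255 scale
print(f"mean difference: {diff.mean():.3f} / 255")
print(f"max difference:  {diff.max():.0f} / 255")
print(f"pixels off by more than 2/255: {(diff.max(axis=-1) > 2).mean():.2%}")
```

If the mean difference stays near zero outside of extreme closeups, the extra subD level is mostly costing VRAM and render time for no visible gain.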
I never use a 4k HDRI unless I just need light. If it appears in the render at all I use 8k, as anything else looks awful.
Yes, I do use 2-3 subD levels, as the low-poly artifacts are pretty ugly. I also use the HDRI pics as a background, hence the 4-8k resolution, not to mention my 4k multi-texture maps. I do multiple renders with both close-ups and far shots (about 2-3 feet for the far shots), as I post a "photo-shoot styled" series of pics. However, for my action/fantasy compositions/animations I can get away with lighter texture/SubD requirements.
Thanks though, for the suggestions!
Again, this will be moot once I can get an alleged 12GB Ti, as I won't know the true specs for sure until August/September, assuming the announcements are around that time. Who knows at this point? I hate the wait!
I'm not going to google that. Everything I read was from November 2019 and later, as I was researching a desktop build for myself. Anyway, I learned long ago that the news media implies a lot to escape culpability for their innuendo, and I treat them accordingly. I had never read that TweakTown guy before, though, but he was very concise and clear.
Being concise and clear is not a substitute for being accurate, though. Right now, all these people are doing is selling clicks.
Indeed, it can be very clear rubbish; we just don't know, which is what those stating as gospel what is going to happen don't get.
Some of it will turn out to be correct, some less so. Some not at all.
When he states a rumour he points it out, and when he doesn't he says it's "confirmed". And his articles are short, too. I like him as a tech journalist. Since it's clearly tech-fluff journalism I don't care so much; had it been supposed to be 'real news', I wouldn't read it, as I haven't been reading or watching any 'real news' by anyone anymore.
I do wonder how all these tech-fluff sites stay in business, though. I know 'real news' is a loss-laden business subsidized by billionaires, but I don't think these tech-fluff sites are.
So, I've gotten the 2070 Super with 8GB of VRAM. First takeaways...
Firstly, my rig has 3 video cards, a GTX 1650, and my new addition. A recent image I created (it's in my gallery) was used for aggressive testing alongside the 1650. The resolution is 3000 x 1525.
The 2070 Super is definitely faster than my 1650, but they do also seem to render quite nicely together. There was another graphic on the Internet that has been used to benchmark various cards, so I used it to test mine. The 2070 rendered it in 7 minutes, the 1650 in 18 minutes, but together it took 4 1/2 minutes.
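As a back-of-the-envelope sanity check on those numbers: if two cards split the render work in proportion to their speeds, the ideal combined time is the harmonic combination of the individual times. A tiny sketch using the times quoted above:

```python
# Ideal combined render time if two GPUs share iterations in proportion
# to their individual speeds (times taken from the post above).
t_2070_super = 7.0   # minutes, 2070 Super alone
t_gtx_1650 = 18.0    # minutes, GTX 1650 alone

ideal = 1.0 / (1.0 / t_2070_super + 1.0 / t_gtx_1650)
print(f"ideal combined time: {ideal:.1f} minutes")  # about 5.0 minutes
```

The reported 4 1/2 minutes is right in that ballpark; small deviations either way are expected, since convergence and iteration counts vary from run to run.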
So I will be getting another 2070 Super to replace the 1650 later this year and I will be good for producing renders for my graphic novel.
The 2070 Super has an NVLink connector. If you have two, you should consider getting the bridge to let you pool VRAM for bigger images.
Is that sold separately? I will definitely get it.
Yes, and it's also not cheap, about $100 US. I would suggest doing your best to get the same brand of 2070 Super if you intend to try it. SLI only works on exactly matched cards, and there is no information on whether NVLink requires that as well.
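As an aside, if you want to confirm that both cards and their VRAM are visible to the driver before worrying about bridges, a small script like this can help. It's only a sketch, assuming the pynvml package (pip install nvidia-ml-py) and a working NVIDIA driver:

```python
# List the NVIDIA GPUs the driver reports, with free/total VRAM per card.
# Sketch only: assumes pynvml (nvidia-ml-py) is installed.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older pynvml versions return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i}: {name}: {mem.free / 2**30:.1f} GiB free "
              f"of {mem.total / 2**30:.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```

Note that this reports per-card memory; whether a renderer can actually pool it over NVLink is up to the application.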
I just spent another arm and a leg on another 2070 Super and an EVGA NVLink bridge. So I will post some results on how NVLink works in Daz Studio... if it's supported at all.
NVLink is supported. People have gotten 2080ti's working and Daz says they tested it during the beta.
2080TIs are expensive...oh lord! I would love to see the benchmarks.
Yes, 2080 Tis are expensive, but they will never match the "ooh boy, that's expensive" cost of these: https://www.centrecom.com.au/leadtek-quadro-rtx5000-pcie-16gb-gddr6-work-station-graphics-card ... And they are not even the most expensive of the Quadro line; the RTX 8000 costs as much as a fairly decent car. :)
Yeah, I think I'll pass on that for now...lol