Comments
No clue. Just educated guesses and the official Daz Studio changelog to go by (which, btw, currently states that it's already been added to the private build channel.)
Daz released its first (non-beta) Titan/20XX series RTX supporting version of Daz Studio (4.11) last week. :)
All other things being equal, Titan RTXs are approximately 10-15% faster than 2080 Tis in all things (2080 Tis are actually just Titan RTXs with only partially functional GPU dies.)
It's already fully supported... on Linux.
Apparently something about the way Windows and MacOS handle graphics devices prevents software like Iray from taking full advantage of current Nvidia hardware for things like VRAM pooling and out-of-core memory functionality. Meaning that it's basically up to Microsoft... so hopefully sooner rather than later.
"Daz released its first (non-beta) Titan/20XX series RTX supporting version of Daz Studio (4.11) last week. "
My current version is 4.11.0.335. How do I get that "Titan supporting version"? Is it this: https://www.daz3d.com/forums/discussion/332751/daz-studio-4-11-pro-general-release/p1 ?
"Titan supporting version" as in it recognizes and renders successfully on Titan RTX/20XX cards but without RTCore acceleration (which is still in development.) Previous production versions (Daz Studio 4.10 or earlier) wouldn't render at all with any RTX card.
If RT core support for Iray does arrive some day, how much will it improve rendering? Will it be much faster, or will it have only a small effect? What should I expect from such an update? And I already purchased an NVLink for the Titans, but I guess it was in vain. Or do I benefit from it in some way at the moment?
Anywhere from around 50% to 400% faster, based on the preliminary results people have posted so far. Raytracing acceleration is INCREDIBLY scene dependent, hence the big range. And there's no easy way to determine in advance what scene content will get you closer to 400% vs 50%.
And yeah. Sorry to say that for the time being NVLink bridges are only marginally useful in improving rendering. That is, unless you are running Iray Server on a Linux box. In which case it will do everything expected and more.
Holy crap, 50-400%??? I hope they release the 4.12 beta with that RT core update very, very soon! But since 4.11 has just been released, the 4.12 beta will most likely still take a while. I don't think it will be released this year.
Check the guide I posted in the discussion/benchmark thread if you want to test Iray RTX. I found a way to export a scene from Daz Studio and import it for benchmarking in Iray Server; this way the new Iray RTX 2019.1.1 version is used for rendering. The Iray bridge protocol in Daz Studio is outdated: the scene can be exported to Iray Server, but then Iray Server only uses the Iray version supported by the source application (Daz Studio). https://www.daz3d.com/forums/discussion/321401/general-gpu-testing-discussion-from-benchmark-thread#latest
No one knows. Daz, historically, has almost never provided release dates for products; they don't tend to 'advertise' what is upcoming either, with the build logs being just about the only indication.
Wasn't Optix Prime incapable of using RT Cores?
Has anything changed or is this simply people confusing Iray with the chopped up version Daz has?
RT Cores replace OptiX Prime in its role as a raytracing accelerator on RTX cards. They both perform exactly the same function (raytracing), but OptiX Prime accomplishes it abstractly in software, whereas RT Cores do it natively in hardware - making the RT Cores exponentially faster. To the point that attempting to have both of them active at the same time would actually end up hurting overall rendering performance versus just letting the faster of the two handle it alone. Hence why OptiX Prime has little to no role on the RTX platform.
How many 'people' have posted 400%, and how reliable are they?
Do the scenes they have used reflect the scenes Daz users tend to produce?
Iirc 2-3 entities have put out numbers like that since RTX launched. And they were all R&D development labs of some sort.
No, because none of them have been based on DS specific content or shaders. As I've been pointing out for several months now, the only way to get a handle on how all these new performance optimizing technologies shake out for DS use is by testing it with DS content. And that still isn't directly possible.
If we can assume Octane's use of RT cores gives results similar to Iray's, then you can expect anywhere from 20% to, yes, 300% and more. It really depends on the type of scene; for example, a complex outdoor scene with leaves and trees and lots of geometry would benefit the most, as that's where the RT cores really shine, versus a simple scene without much geometry. So my guess would be: if I have a Vicky in a forest I'd see big gains; if I had Vicky by herself, I probably would not see much improvement in render speeds.
Yes, if you have money to burn, buying a new video card technology is OK. Especially if it makes you happy! Otherwise, it's sort of like buying a new car: once it leaves the lot you find out that it is not worth what you paid for it. I would suggest being patient and seeing what the reviewers online say about the card before spending so much money. Also, if you are patient, you can get the card new at better prices once the next generation comes out. I purchased two Titan Xs for around $350 each because they are the non-Pascals and are older tech, but guess what, they work fine! I have 12GB of video RAM, and with Win7 Pro I don't lose VRAM to the OS like Win10. The latest and greatest is not always the best. Oh, case in point >> I'm still using Daz Studio 4.10, for the same reason. ~ just my 2 cents
Here are some Iray RTX benchmarks... looks more like 20% to 30% more speed in most cases:
https://www.migenius.com/articles/rtx-performance-explained
To be clear, the speedup is the "3.00x", "1.06x", etc part of the captions for each example in that article (ie. 300%, 6%, etc faster.) Not the percentage part (which is the estimated proportion of the rendering workload that is just raytracing.)
After reading the article gerster linked, it appears that they are reporting about a 6-15% performance increase rendering indoor environments (with the exception of the museum scene). And those environments do resemble typical settings often used by the average Daz user. It's not completely apples-to-apples, as there are no actors rendered in those benchmarks, which can add complexity on the order of tens of millions of polys after SubD, but it seems that the performance increase afforded by RTX acceleration may not exactly live up to a lot of people's expectations except in a few very specific scenarios. Nonetheless, I'm wondering if some of the more complex hair products may particularly benefit from RTX acceleration.
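That scene dependence follows directly from how much of the total render workload is actually raytracing. Here's a rough back-of-the-envelope sketch of that Amdahl-style reasoning in Python; the fractions and acceleration factor are illustrative guesses, not numbers from the article:

```python
# Amdahl-style estimate: only the raytraced share of the frame time gets the
# RT-core speedup, so the overall gain is capped by that share.
# NOTE: rt_fraction and rt_speedup values below are illustrative guesses,
# not measurements from the migenius article.
def overall_speedup(rt_fraction: float, rt_speedup: float) -> float:
    """rt_fraction: share of render time spent raytracing (0..1).
    rt_speedup: factor by which RT cores accelerate that share."""
    return 1.0 / ((1.0 - rt_fraction) + rt_fraction / rt_speedup)

print(f"{overall_speedup(0.15, 5.0):.2f}x")  # ~1.14x: simple interior, little raytracing
print(f"{overall_speedup(0.80, 5.0):.2f}x")  # ~2.78x: geometry-heavy scene, mostly raytracing
```

In other words, a scene where raytracing is only a small slice of the frame time can't gain much no matter how fast the RT cores are.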
New DS Beta announcement includes RTX support (as well as some improved animation tools).
3.00x = 200% faster than 100%. 100% is 1x which is 0% faster than 100%
2.00x = 100% faster than 100%
1.00x = 0% faster than 100%
It's a play on words, it doesn't say it is 3x faster, it says it renders at 3x, which is 200% or 2x "faster" than 1x. (Faster than itself, with the wrong or incomplete drivers that didn't support RTX. It could have just been running 200% slower on the other drivers. That is hardly a fair comparison. There is no actual comparison to any other cards, which should show similar gains. That specific card isn't even available for a Daz render, through Daz. You couldn't honestly compare a professional grade $6,000 card's results, to a consumer grade $1,000 to $3,000 card.)
E.g., it renders in 1/3 the time, potentially, at the most, and 0% faster at the least. (Three cards would also render in 1/3 the time of one card, all the time, not just in certain scenes.)
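For what it's worth, here's a tiny Python helper (hypothetical, just to make the arithmetic concrete) that converts a render-time multiplier like the article's "3.00x" into the equivalent "% faster" and fraction-of-original-render-time figures:

```python
# Hypothetical helper: convert a render-time multiplier ("3.00x") into
# "% faster" and the fraction of the original render time it implies.
def describe(multiplier: float) -> str:
    percent_faster = (multiplier - 1.0) * 100.0
    time_fraction = 1.0 / multiplier
    return (f"{multiplier:.2f}x = {percent_faster:.0f}% faster; "
            f"render finishes in {time_fraction:.2f} of the original time")

for m in (1.00, 1.06, 2.00, 3.00):
    print(describe(m))
# 1.00x = 0% faster; render finishes in 1.00 of the original time
# 1.06x = 6% faster; render finishes in 0.94 of the original time
# 2.00x = 100% faster; render finishes in 0.50 of the original time
# 3.00x = 200% faster; render finishes in 0.33 of the original time
```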
I still want a Titan-RTX just for the 24GB of VRAM. Apparently I need it now. 12GB just isn't cutting it in my Titan-V cards or my Titan-Xp cards anymore, since I can't disable OptiX, or enable it, or whatever it is now.
For roughly the same money, two 2080 Tis will almost double the performance of a Titan. But one's modeling practices will always expand to render one's system slow again, just with more impressive results. The most convincing reason for wanting a Titan is, of course, the 24GB. I've hit the 11GB limit with grass, leaves, and hair. In Blender these kinds of problems are relatively easy to solve with a little thought and layers; Daz must have something similar, so I don't know if a little bit of convenience would be worth literally doubling one's outlay for GPUs.
It looks like the newest beta of Daz Studio, 4.12.1.55, will support NVLink and thus VRAM pooling! :)
That would be.... big!
I did get an older Titan 12GB to use... the first big render I did hit 11888MB... that wouldn't have flown on an 11GB card.
https://www.daz3d.com/resource-saver-shaders-collection-for-iray
Try this.
And they are hugely expensive paperweights if the scene doesn't fit on the card(s). Plus, they don't have 11GB of RAM after Windows has finished stealing it.
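If you want to see how much of that 11GB is actually still free before a render, something like this works (a quick sketch using the nvidia-ml-py/pynvml bindings, assuming they're installed; it's not something Daz Studio itself provides):

```python
# Quick check of how much VRAM is actually free before Iray loads a scene.
# Assumes the nvidia-ml-py package is installed: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older pynvml versions return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        gib = 1024 ** 3
        print(f"{name}: {mem.free / gib:.2f} GiB free of {mem.total / gib:.2f} GiB total")
finally:
    pynvml.nvmlShutdown()
```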
OK, you caught me :)
I render in Blender, which has MUCH better tools for making things fit. Between Render Layers and the Scatter plugin that only instantiates objects that are visible to the camera, I haven't been able to exhaust 11 gigs since I took the time to learn how Blender users more proficient than I am do it.
Architecture, like a huge cathedral, and backgrounds belong in their own layer. The characters can get by with much simplified surroundings, just to catch shadows, etc. (a rough scripted sketch of that layer split is below).
Even so, believe it or not, most of my memory usage is because of the dForce Long Curly hair that two of my main characters use, that I haven't converted to particle systems yet.
Also, Cycles often seems to go over the 11 gig limit, and even driving 3 4K displays, the render succeeds. I think the memory usage meter is way off.
In my experience, 11 gigs is really enough for just about anything. If it's not, it's the fault of the environment you're working in.
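For anyone curious what that render-layer split looks like in practice, here is a minimal sketch against the Blender 2.8x+ Python API, with made-up collection names ("Architecture", "Characters"); adapt it to whatever your scene actually uses:

```python
# Minimal view-layer split (Blender 2.8x+): render the heavy architecture and
# the characters as separate passes so neither pass needs the whole scene.
# "Architecture" and "Characters" are made-up top-level collection names.
import bpy

scene = bpy.context.scene

# Create (or reuse) one view layer per pass.
bg_layer = scene.view_layers.get("Background") or scene.view_layers.new("Background")
char_layer = scene.view_layers.get("Figures") or scene.view_layers.new("Figures")

def keep_only(view_layer, wanted):
    """Exclude every top-level collection except the ones named in 'wanted'."""
    for child in view_layer.layer_collection.children:
        child.exclude = child.name not in wanted

keep_only(bg_layer, {"Architecture"})    # cathedral/background pass
keep_only(char_layer, {"Characters"})    # figures with simplified surroundings
```

Each view layer then renders as its own pass and the results get composited back together, so no single pass has to hold the entire scene in VRAM.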
Indeed, I'd love Cycles in Studio.
I occasionally export to blender, but can't say I find any of the transfer options suit me. Maybe I'll set aside a weekend.
I agree. I've also tried the export options and have not been satisfied with the renders, but I am no Cycles expert and have no clue how to work with the Blender node system (just looking at tutorials about it puts me off). There is a discussion going on about an enhancement to the Diffeomorphic exporter with regard to IRay materials:
https://www.daz3d.com/forums/discussion/367571/what-if-blender-could-seamlessly-render-iray-materials
One of the main drawbacks to using the Reality plugin (before I got an IRay capable GPU) was the hours I would spend tweaking materials for Luxrender. I think I would have to do that with Cycles in order to be happy with the results but that forum link (above) suggests that there might be something being developed to help with that conversion process.
I actually prefer Cycles; I was using it before I first tried Studio, but I don't use it much now. But using Daz stuff in Blender would be great. I bring Blendered stuff to Studio, but I would prefer the alternative.
Better still would be Cycles in Studio... I would pay cash for that.
With a one-click export to Blender, I would probably give Cycles a fairer go. I can't be assed to waste another two years of my life relearning yet another material and render system though, lol. Did it too many damn times already. First Poser, then DS and 3DL, then Iray. And that's not counting all my failed attempts to get Daz people into other programs like Houdini, lol. I could get them into the program, but gave up trying to set up a good-looking skin shader that worked with the maps that come for Iray.