Comments
You know, when I hear news like "Might be next year"... I wish we had an actual date. Even something as vague as "not sooner than a month, but not longer than three"...
Anyway, that aside - lately I've noticed the huge price drop on GTX 980 Ti cards, and my question here - if someone could tell me - would it be worth buying one so I can use it for rendering while the 1080 sits pretty inside the case? (Let's assume I have the funds to buy it, so money is not an issue.)
Will the 980 Ti still be useful after the 10 series gets supported? Or will it be more like "there are no scenarios in which you would use the 980 Ti after the 1080 is supported"?
You can use them together, but you need to install the correct driver for each card one at a time: put one card in the system and install its driver, then put in the next card and install its driver. After the 10 series is supported in Iray you can use the 980 Ti as the display card and the 1080 for rendering alongside it, or use them both at the same time for rendering. But nobody knows yet how that will affect rendering speed in Iray, as the 980 Ti can bottleneck the 1080 and reduce the speed, since two different drivers and architectures are being used for the same task.
Just installed DAZ and the G3 Essentials and did a quick render test. It looks like I can render in Iray with the CPU and OptiX options selected, and DAZ can see both my GTX 1080 cards and auto-selected them. Whether it actually got anything out of them, I doubt, since you all said they haven't been updated for Iray yet. Anyway, Iray rendered the base G3F in underwear in 1 minute 59 seconds - via CPU, I'm guessing. An actual scene of any kind would probably take longer to render on the CPU than on my laptop with its GTX 850M. It will probably take several days to download and install all my content on my new one.
Thank you very much for the information.
I guess I'll pass on the GTX 980 ti then. Not willing to risk errors with cross architecture problems. Who knows how long it would take them to sort that out if it happened.
From what I understand, the scene size is limited by the amount of RAM on the video cards, and that amount will be limited to the lesser of the two cards if you use both in one machine. In other words, since the GTX 980 Ti has only 6 gigs of video RAM and the GTX 1080 has 8 gigs, you can't use the extra 2 gigs on the GTX 1080 and your scene size will be limited to 6 gigs. Think of this as similar to a canvas size for a painting - not quite like that, but similar in the way it limits you. Perhaps more like a combination of how big the canvas is and how many things you can paint onto that canvas, like figures and clothing and other objects. For a given amount of video RAM you could have a huge canvas with only one figure on it, standing in an empty field, or 20 figures all crammed into one small room. So it really depends on what you want to do with Daz3d. If you are just making single-frame renders of one character, then buy the GTX 980 Ti now and get on with it, since you need the CUDA cores to render quickly in Iray. But if you intend to make larger scenes, and perhaps larger animations with lots of stuff happening, then you're probably better off investing in another GTX 1080, joining them together with an SLI bridge, and being patient.
As far as I recall, there is no need for SLI when using Iray.
Also, the problem is not which card is better - the problem was "Will these 2 cards function correctly with one another or will there be some problems?". And MEC4D answered my question really well there - there might be driver issues or problems with the 980ti bottlenecking the 1080. And I would not want to risk that, so the obvious choice is to NOT buy the 980 ti.
I got rid of my other cards and replaced them all with the same Titan X. I did have issues with the 700 series and the Titan X (900 series), as sometimes the drivers got mismatched by the system. It would be better to wait for the support in Iray and buy yourself another 1080 later, instead of going backwards. It will always be better if your cards are from the same series, and 4-6 weeks is worth the wait.
Plus, if you ever want to play games using SLI, this mix is not really what you want, so 2 x 1080 would be just perfect.
I'll wait patiently. I'll prepare stuff and then go on a render spree.
Of course, since Pascal scales well, and the 1080, 1070, and even the 1060 operate at very similar clocks (or can be OC'd to operate at the same clocks pretty easily....stock 1080 with OC'd 1060 would be simple) mixing within the 1000 GTX line is pretty doable too. You can't SLI for gaming that way (without issues) but for Iray it's just more cuda cores.
(Yes, you can OC a 1060 to the same core and memory clocks as a 1080, with a little work. OC'ing a 1070 to match a stock 1080 is child's play. So at that point, it's just the number of CUDA cores per card.)
As to what someone said earlier about the memory limits: if one card has 6GB, the other has 8GB, and the scene requires 7GB, the scene will STILL render on the 8GB card, but the 6GB card won't be used. If the scene exceeds ALL of the enabled GPU cards' memory, then it dumps to CPU.
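Just to make that per-card behaviour concrete (this is not anything DAZ Studio exposes, only a rough illustration): each enabled GPU either holds the whole scene or gets dropped, and only if no card can hold it does the render fall back to CPU. A minimal Python sketch, where the 7 GB scene size and the use of the nvidia-smi command line are assumptions for the example:

```python
import subprocess

SCENE_GB = 7.0  # hypothetical scene size, matching the 7 GB example above

def gpu_free_memory_mib():
    """Return (name, free MiB) for every NVIDIA GPU, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name,memory.free",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    gpus = []
    for line in out.strip().splitlines():
        name, free = [field.strip() for field in line.split(",")]
        gpus.append((name, int(free)))
    return gpus

# A card is only useful if the whole scene fits in its own VRAM;
# cards that are too small are simply left out of the render.
usable = [(name, free) for name, free in gpu_free_memory_mib()
          if free >= SCENE_GB * 1024]
if usable:
    print("Cards that could hold the scene:", usable)
else:
    print("No GPU has enough free VRAM - Iray would drop to CPU.")
```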
Yeah, I'm thinking of getting one Nvidia 1060 to run my monitors and two Nvidia 1070s for renders as soon as Daz updates to the new CUDA. I would like to see the Octane feature of showing how much a scene will take up, to let me know if it will fit on my cards. Right now I have to test the waters and hope it fits.
SimTenero has a product for that.
You can also download GPU-Z; that will at least tell you how much of your GPU RAM is being used, and as the scene is loading you can watch the memory use go up, so you can see how much is being consumed. This gives you an idea of what will fit and what won't. (There's a small script sketch of the same idea after the link below.)
EDIT
http://www.daz3d.com/iray-memory-assistant
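For anyone who would rather watch those numbers from a script than from the GPU-Z window, here is a small, hypothetical Python sketch of the same idea: poll nvidia-smi once a second and print VRAM usage per card while the scene loads. (Running plain `nvidia-smi -l 1` in a console does much the same thing.)

```python
import subprocess
import time

# Poll once a second; press Ctrl+C to stop.
while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    for line in out.strip().splitlines():
        idx, used, total = [field.strip() for field in line.split(",")]
        print(f"GPU {idx}: {used} / {total} MiB")
    print("-" * 24)
    time.sleep(1)
```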
Does the plugin for DS also count how much is used for image rendering and OpenGL (display)? I don't think so, as I believe the info is fetched from the DS log files, which don't show all the data, and you also need to count how much VRAM will be used to render the image. Usually a scene that already loads 2.5GB onto my three cards (and the fourth display card, which is also used in rendering) shows a usage of 5GB while rendering, but that doesn't appear in the DS log files.
I prefer Precision X 16, as it shows me all the data of the cards and system at once, in one window, in real time, plus the temperatures of both CPUs and GPUs.
The Iray Memory Assistant was a great idea and something that would really have been helpful. I'd gladly repurchase it if it ever gets updated to provide reliable predictions of whether a scene will fit. It did provide some information on scene elements in a handy grid, so it is not useless. YMMV, as they say.
It would also need to predict how much VRAM the render itself will use, based on the rendering resolution of the image, because the scene can fit but the moment you hit render it will switch to CPU if there is not enough VRAM left for rendering. Usually it takes about 1GB for a full-HD render resolution. And just because one of my cards can fit 600 Genesis figures without textures (just the unique geometry) in the scene doesn't mean you can actually render that and work with it, as it starts to get choppy after about 50 unique figures.
That's why, when I have my GPU monitor open, it shows me everything in real time, including rendering usage... for free, while adding stuff to the scene ;)
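To put a rough number on that render-resolution overhead, here is a tiny back-of-the-envelope sketch that simply scales the "about 1 GB for a full-HD render" observation above linearly with pixel count. The 1 GB baseline is the poster's observation, not an official Iray figure, so treat the output as a ballpark only.

```python
# The ~1 GB-per-full-HD figure is the observation quoted above, not an
# official Iray number; this just scales it linearly with pixel count.
FULL_HD_PIXELS = 1920 * 1080
GB_PER_FULL_HD = 1.0  # observed render overhead at 1920x1080

def render_overhead_gb(width, height):
    """Ballpark extra VRAM the render needs, on top of the scene itself."""
    return GB_PER_FULL_HD * (width * height) / FULL_HD_PIXELS

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: roughly {render_overhead_gb(w, h):.1f} GB on top of the scene")
```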
I have Precision X 16, and I use it to control my fan speed, but beyond that, I am boggled by the interface. Is there a good tutorial anywhere on how to set it up and use it?
The Memory Assistant does not claim to be exact - it gives a guide that should help users make somewhat informed decisions in many situations.
I don't disagree with your statement, Richard. It might be a useful tool for some purposes, but I still don't think this tool will do what the Silver Dolphin is looking for.
Silver Dolphin stated "I would like to see the Octane feature of showing how much a scene will take up to let me know if it will fit on my cards. Right now I have to test the waters and hope it fits."
I gave this tool a good workout with many scenes, before concluding that it didn't satisfy that need for me. I offered my opinion and experience with it: Iray Memory Assistant is not accurate enough to tell whether a scene will fit before loading it. It might help someone identify some memory hogs, IF the memory hogs are items that the tool reports on. If your issue is Surface SubD settings, for example, the tool provides no help with finding that issue (which is especially unfortunate, now that you have clarified elsewhere, that surface SubD affects the entire model).
Cath,
I couldn't get GPU-Z to install on my render box since it apparently requires an internet connection. Where do I find Precision X?
Cheers,
Alex.
Precision X can be found on EVGA's website. Googling should give you a good result.
Alternatively, you can try MSI Afterburner, which is based on RivaTuner. Both are quite capable OC/monitoring programs.
You find the arrows on both sides of the name < PRECISION X 16 >; click until you see the monitor graph, then click on it - a new window opens with your GPU and system info. You can right-click between the graphs in that window to record, pause, or keep it in the foreground.
Hi Alex
http://www.evga.com/precision/
They have two: one for the older series up to the 900s (Precision X 16) and a new one for Pascal cards.
Thanks, Cath.
Alex.
I've downloaded MSI Afterburner and it's working a treat. My 970 has spent most of the day at 64 degrees centigrade, in a room temperature of 24 degrees. The CPU cores seem to range between 40 and 45 degrees. I'll be interested to see what readings I get if we have another heat wave.
Cheers,
Alex.
Hi Alex,
I used Afterburner in the past, but since EVGA Precision X 16 controls my GPU radiator fans, it is easier for me to just follow one app at a time. But glad to hear the other one is working for you.
If I'm reading this right, it looks like the Iray devs have completed their update for Pascal and released the relevant SDK.
http://blog.irayrender.com/post/149501723666/hello-since-june-you-are-working-on-iray-for-the
https://developer.nvidia.com/iray-sdk
They announced it on Aug 26th, any ideas on how long it would take for DAZ to update from this?
It always irks me a bit going to that website; it just looks like a random, low-key freeware dev blog, not a major wing of Nvidia development.
Nevertheless, I can't see Daz pushing out a hotfix; it will more likely be swallowed up by the next release in this cycle. So we could be looking at 1-2 months post-release for third-party integration. I imagine 3ds Max will have the same time frame.
I agree with you on this blog - it has sometimes had info that was not accurate at all.
It should be here in early fall, not 1-2 extra months. Nvidia announced it, but that does not mean DAZ got it - they did not. In our case more things need to be adjusted than just the new engine plugin, to support the new features, and after that comes testing.
Do you think that Daz3d 4.9 will be able to use any aspect of the CUDA cores in these cards by Christmas? Or do you think it is more likely to be late spring 2017, given that the dates keep moving further and further and further... away?
I'm going to stick to my original guess of mid to late October - and then copper my bets by saying it may just be a public beta at that time. :-)