Comments
Maybe we're saying it wrong, but it seems a natural way to distinguish 'RAM on the video card vs. RAM chips.'
Well technically, it's GDDR or video memory, vs system memory.
The log file that stores that info is located in C:\Users\[folder name on your computer]\AppData\Roaming\Daz3D\Installmgr
on mine it's C:\Users\Robert\AppData\Roaming\Daz3D\Installmgr
If you don't have a backup of that, then it's pretty much start all over.
I had the same problem a while back but fortunately had my users folder backed up
And thanks for reminding me that I need to update that folder
Hmm, thanks. I moved my AppData folder to the D drive and updated the paths; all the other items are there, but I don't see a Daz3D\Installmgr... just InstallManagerFileRegister. I really hate to have to download everything all over again just because DAZ hasn't made their app able to read an install folder to recreate its install DB like most other programs of this type. :/
If anyone has a file description for the index file, so I could write my own program to rebuild it from the folder contents, it would help, ty.
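No spec for that index file handy, but as a rough starting point, here's a small Python sketch that just inventories the package zips in a DIM download folder and writes them to a simple JSON list. The folder path and the JSON layout are my own placeholders, not DIM's actual format, so treat it as a sketch rather than a rebuild of InstallManagerFileRegister.

```python
# Rough sketch: inventory DIM package zips in a download folder.
# The path below is a placeholder; point it at the "Downloads" location
# shown in Install Manager's settings. The JSON output is NOT the
# InstallManagerFileRegister format, just a simple list for reference.
import json
import os
import zipfile

DOWNLOAD_DIR = r"D:\DAZ3D\InstallManager\Downloads"  # hypothetical path

packages = []
for name in sorted(os.listdir(DOWNLOAD_DIR)):
    if not name.lower().endswith(".zip"):
        continue
    path = os.path.join(DOWNLOAD_DIR, name)
    entry = {"file": name, "size_bytes": os.path.getsize(path)}
    try:
        with zipfile.ZipFile(path) as zf:
            # DIM packages usually carry a Manifest.dsx at the zip root.
            entry["has_manifest"] = "Manifest.dsx" in zf.namelist()
    except zipfile.BadZipFile:
        entry["has_manifest"] = False
    packages.append(entry)

with open("dim_package_inventory.json", "w") as out:
    json.dump(packages, out, indent=2)

print(f"Indexed {len(packages)} package zip(s).")
```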
My video card has DDR3 memory, not GDDR.
It ended up taking a bit of fiddling, but the downloaded items did show up. I had the install path set, but the download path was still pointing to an old location that had been moved.
Directory screen grab
Looks like you were figuring it out while I was gathering info to post
Yeah, we're gonna pass on the purchase of VCAs. LOL! I realize that most products like this aren't totally geared toward hobbyists and enthusiasts, but the idea of throwing more money at an item to get better, faster renders seems counterproductive to those of us in this group.
People who do this for a living may disagree. But, I can't argue with the results. My test render using IRAY looks great.
The upgrade costs when all of the K6000 cards in all of the VCAs just don't have enough RAM on each card to handle the scene that must be done for a contract... ouch. Upgrading a supercomputer is never cheap, lol. They're still cool to look at, though; pretty cabinets in beautiful racks.
They (Nvidia) don't need to sell millions of these to Daz users; just a few hundred large VCA clusters would do it, hypothetically. Truncated thought for this forum; too much technobabble, lol.
Perhaps a contender for the “Top500”, yes.
What video card/software is this? Onboard graphics uses system memory, whereas software will often show GDDR memory as 'DDR' in its reporting (GDDR is a type of DDR).
Looks like you were figuring it out while I was gathering info to post
Yes, thanks for your help. Always appreciate it when someone takes time out to help with something like this. :)
That last post did help also; I was still sorting out the bits. It turns out I had moved much of the content of %appdata%\Daz3D to "d:\!appdata\Daz3D\" but not the Install Manager folder. It looks like the Install Manager did rebuild itself once I sorted out all of my path issues, so kudos to DAZ. I had unfairly commented earlier on them not doing this, it seems. Apologies to the hard workers at DAZ. :)
On a side note, I might have to do a tutorial on all of this later, if for no other reason than so I can reference it myself after a long period away.
...yeah, but who has $2,500 for a Quadro K5200 (8 GB) or $5,000 for a Quadro K6000 (12 GB)? Heck, even $1,000 for a Titan Black (6 GB) is more than many of us can afford. Not me, lol. Give me a few less than that thousand CUDA cores, and more VRAM. It would be more productive for me, and probably consume far fewer watts.
Emphasis mine. Watts are the reason I currently have an i5 and a weak card. I wish I knew an electrician who could tell me what's safe for me in my aluminum-wired apartment. They did some work on the outlets a few years ago so that the aluminum wiring connects with copper at the outlet, but last winter there were too many bangs when the aluminum wiring, which shrinks as it cools, separated from the copper. It's scary. Once it turned on the dishwasher along with the bang, and I was right in front of it at the time.
My microwave oven is only 500 WATTS and it just turned 29!! It's a great little bugger (GE Spacesaver) and I'm afraid to replace it. So I'm going to think long and hard about my next PC.
I do know what you refer to, and even copper shrinks in cold temperatures. The biggest issue with aluminum is the heat it produces when you pull more watts through it, and the loose connections. If things don't go completely wrong, you still end up with a bigger electric bill that you never got to enjoy using most of.
My difficulty is that my two 20-amp circuits just are not enough, and I'd need to string up more if I get watt-hog graphics cards. Not to mention a few more batteries, a bigger inverter, and a bigger computer power supply. Something has to give here, or break (I hope not), lol.
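For a rough sense of the ceiling we're talking about, here's a back-of-the-envelope calculation, assuming standard 120 V circuits and the usual 80% continuous-load rule of thumb (your wiring and local code may differ):

```python
# Back-of-the-envelope circuit headroom: 120 V assumed, 80% continuous-load
# rule of thumb. Illustrative numbers only; check with an electrician.
VOLTS = 120
AMPS_PER_CIRCUIT = 20
CIRCUITS = 2

per_circuit_watts = VOLTS * AMPS_PER_CIRCUIT        # 2400 W nameplate
continuous_watts = per_circuit_watts * 0.8          # ~1920 W sustained
total_continuous = continuous_watts * CIRCUITS      # ~3840 W across both

print(f"Per circuit (continuous): {continuous_watts:.0f} W")
print(f"Both circuits combined:   {total_continuous:.0f} W")
```

And that total still has to cover everything else plugged into those circuits, not just the computer.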
I can live with hour-long full-quality renders; just make the preview and spot-renders less painful. No more of this "move a light a tad, start a spot-render, and walk away for a while," please. On that note, I've been digging a tad more.
It looks like the GT 730 (4 GB video memory) card that I'm looking at for a display card has two variants: a lobotomised version with only 96 CUDA cores, and a slightly better one with 384 CUDA cores. Compared to my card with zero useful CUDA cores for the Daz Studio interface, lol.
The only benchmarks I could find were on the 1 GB and 2 GB variants of the GT 730, and even they look really good compared to my seven-year-old card, looking purely at the OpenGL capabilities for the Daz Studio interface. I'm not worried about video or gaming, just the desktop. I don't play games any more, and even a Radeon 9250 AGP card has no issue with 1080p video playback.
I'm curious how the GT 740 does with OpenGL, and whether anyone has noticed viewport lag/stutter in Studio.
Yeah, my card is really bad. :red:
At one time it used to be an incredibly good card that only used 43 watts; not so much any more. A 23-watt bottom-of-the-barrel card looks incredible. What, it doesn't consume over a hundred watts? I may have an interest in GPUs again, lol.
The implication is, it is actually on the video card and not faked by sucking system-memory bandwidth, and it is for the video card's use.
That's what you would expect to see, as GDDR4 and GDDR5 are modified versions of DDR3 technology. :-)
So I am assuming once again that those of us not using NVIDIA cards and on 32 bit systems are out of luck on being able to use the IRAY engine? I would like to know before I download it.
Here are some test renders of one of my custom characters. The first one is with the IRAY renderer. The second is with Reality using the LuxRenderer. The 3rd is using the 3Delight renderer and the last one is a basic OpenGL render. All done in Daz Studio 4.8 Pro and all done out of the box with no additional configurations. Initial thoughts...
Of all four, obviously I love the IRAY one. It pops like nobody's business. :) However, it took 30 minutes to render and it slowed my PC down to a crawl. I did this one on the Photorealistic setting. I'm doing another one using Interactive. Still pops, but it's brighter than this one and it's not a resource hog either. I don't think I'm gonna buy an expensive card for this (I'll probably just wait for network rendering with the IRAY server whenever it comes out and if it's not too pricey), but I don't think I will be using Reality much longer. :)
There's really nothing else to say about the other renders. I know that there's more to rendering than just out-of-the-box settings obviously, and a lot of Daz's promos were probably done using the standard 3Delight renderer, so I know it can produce fantastic results.
However, the proof is in the pudding with this IRAY business. I love it. :)
The new 3Delight is still there, and it is also updated. Far better than the version in Studio 4.7, I will say. Far, far, far better. AND there are new photometric lights that work in both Iray and 3Delight.
Don't be put off if Iray (an Nvidia creation, not Daz's) doesn't run as fast on your system as you would like; other stuff has been greatly improved, and there isn't much at all yet in the way of Iray surface shaders, as it is still in beta.
Iray is 64-bit only.
Here's the render using the Interactive setting for IRAY. It's a little brighter than the Photorealistic one but still pops.
The new 3Delight is still there, and it is also updated. Far better than the version in Studio 4.7, I will say. Far, far, far better. AND there are new photometric lights that work in both Iray and 3Delight.
Don't be put off if Iray (an Nvidia creation, not Daz's) doesn't run as fast on your system as you would like; other stuff has been greatly improved, and there isn't much at all yet in the way of Iray surface shaders, as it is still in beta.
...heck, in pure CPU mode, it runs a lot faster than Reality/Lux
Okay, that one I'm gonna have to disagree with slightly... but only because Reality/Lux doesn't slow down my machine anywhere near as badly as IRAY does. However, it does look to be a better renderer just out of the box.
...I'm getting completed renders in 2-3 hours with Iray (CPU). With Lux, even after 10-11 hours I still have a lot of noise that, when compared to Iray, would put the process at about 35%-40% complete.
I also find being able to apply Iray shaders in the Daz surfaces tab to be a lot more "elegant" than doing so in a separate UI.
Looking at computer components to build myself something nice, I really can't afford to do 3D any more :(
I had the same problem without doing anything. I opened the beta (the first one--haven't installed the 2nd one yet) and let it just sit there a while after doing a grab with the snip tool. Came back and DS Beta was unresponsive.
I suspect there's something to that, i.e., when the rendering 'seems' to stop, the graphics chip is busy but nothing else is, just like when I left Studio unattended? Weird though. Do either of you have a GPU monitor to check whether the chip is actually doing something during that time?
I have integrated video and all my renders use the CPU. No GPU to monitor. Sadly. :)
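For anyone who does have an NVIDIA card and wants to see whether the GPU is actually busy during one of those apparent stalls, something like this rough sketch should do it, assuming the NVIDIA driver is installed so nvidia-smi is available (the two-second polling interval is arbitrary):

```python
# Poll GPU utilization and memory use via nvidia-smi every couple of seconds.
# Assumes an NVIDIA driver is installed and nvidia-smi is on the PATH.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())   # e.g. "87 %, 3120 MiB"
    time.sleep(2)
```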
The new 3Delight is still there, and it is also updated. Far better than the version in Studio 4.7, I will say. Far, far, far better. AND there are new photometric lights that work in both Iray and 3Delight.
Don't be put off if Iray (an Nvidia creation, not Daz's) doesn't run as fast on your system as you would like; other stuff has been greatly improved, and there isn't much at all yet in the way of Iray surface shaders, as it is still in beta.
Just confirming, Iray is 64-bit and won't run on a 32-bit system. However, there are significant improvements to 3Delight which should make it worth your while to download and use, Security 16.
Comparing these images you can see the problems if you switch between different renderers:
The lighting is completely different.
To get nice scenes you have to use the right lighting system, e.g. photometric lights for Iray.
But I will say the light is key, whether we're talking Iray or Lux or 3Delight. The mistake I see a lot in these renders is flat, flat lighting. People seem to be using the headlamp, so you get zero sense that this is a 3D object. Put the key or main light at a 45-degree angle to the left or right of the figure. Put another light opposite that, and make it considerably dimmer, to fill in the sharp shadows created by the first light. Put a third light in the air to rim the head and shoulders of the figure. This makes him stand out from the background. This is called three-point lighting, and goes back to live theater. Google 3 point lighting for more.
Find the headlight attached to your camera and turn it OFF. Then you will get renders that show M6's muscles, the curves of V6's face and fine body, or the drape of fabric. Otherwise you get something that looks like you're posting it to Instagram. Yeah, it may look like a photograph, but not a great one. We all spend a buttload of money on our art. Make it the most it can be by taking a few extra minutes to light it well. That is far more important than the video card. (Says the lunatic who just dropped 1500 on two NVIDIA cards and a new power supply to run them.)
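For anyone who prefers numbers to eyeballing it, here's a small sketch of the three-point geometry described above. The angles, distances, and relative intensities are just common starting values, not anything Daz-specific; you'd still place and tune the lights by hand in Studio.

```python
# Rough three-point lighting positions around a subject at the origin.
# Angles and intensities are typical starting points, not Daz-specific values.
import math

def light_position(azimuth_deg, elevation_deg, distance):
    """Return an (x, y, z) position with y up and the subject at the origin."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.sin(az)
    y = distance * math.sin(el)
    z = distance * math.cos(el) * math.cos(az)
    return (x, y, z)

lights = {
    # key: 45 degrees off to one side, a bit above eye level, full strength
    "key":  (light_position(-45, 30, 300), 1.00),
    # fill: opposite side, lower and much dimmer, to soften the key's shadows
    "fill": (light_position(45, 15, 300), 0.35),
    # rim: behind and above, to pick out the head and shoulders
    "rim":  (light_position(160, 55, 300), 0.60),
}

for name, (pos, intensity) in lights.items():
    x, y, z = pos
    print(f"{name:4s} pos=({x:7.1f}, {y:7.1f}, {z:7.1f}) intensity={intensity}")
```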
Okay, that one I'm gonna have to disagree with slightly... but only because Reality/Lux doesn't slow down my machine anywhere near as badly as IRAY does. However, it does look to be a better renderer just out of the box.
I don't currently have LuxRender installed, but as I recall it didn't run the CPU flat-out the way 3Delight and Iray in CPU mode do, which accounts for the snappier system performance while rendering but also contributes to longer render times. Of course you can lower the priority of 3Delight, or tell it to use fewer cores entirely, via Task Manager in Windows (I'm not sure how that would be done on a Mac).
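If you'd rather script it than dig through Task Manager every render, here's a hedged sketch using the psutil package (pip install psutil). The process name is an assumption; match whatever the render actually shows up as in Task Manager on your machine, and note the priority constant is Windows-only.

```python
# Lower the priority and restrict the cores of a running render process.
# Windows + psutil assumed; the process name below is a guess -- check
# Task Manager for the real name on your system before running this.
import psutil

TARGET_NAME = "DAZStudio.exe"   # or the standalone 3Delight process, if separate
ALLOWED_CORES = [0, 1, 2, 3]    # leave the remaining cores free for other work

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET_NAME:
        proc.nice(psutil.BELOW_NORMAL_PRIORITY_CLASS)  # Windows priority class
        proc.cpu_affinity(ALLOWED_CORES)
        print(f"Adjusted PID {proc.pid}: below-normal priority, cores {ALLOWED_CORES}")
```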
You can find an Iray light system here:
http://www.daz3d.com/forums/discussion/53797/