Comments
A number of you have heard me go on about my currently 16-year-old computer. I can use Daz; the only drawbacks are that I can't use dForce, Iray is a major pain, and even 3Delight has been giving me grief lately. However, that's mainly due to the fact I recently discovered how much dust is in this thing...lol The fans hate me right now...lol
Thanks to someone who heard my rants about my computer, I now have a newer one with lots more RAM and a 650 Nvidia card, and considering how much of an upgrade that is from the old one, I'm happy with that. Of course I do want a better GPU at some point, maybe even more RAM at some point, but that doesn't change the fact that Iray is far less of a headache and I can do dForce now, so yeah... I'm happy.
If I could spend 6 years on such an old computer, can't many of you spend a year or 2 on your current one until prices start to come down?
...between my Titan-X and system RAM it's about a two-to-one difference. A scene that takes 6 GB in system RAM comes to around 3 GB in VRAM (running MSI Afterburner to monitor the render process).
No, M1 is based on the ARM architecture. So CPU and GPU don't "fit together in the old way". This is all new, so it may come about that one day M1 (or most likely M1+n) can serve as an unbiased rendering platform, but it's not available now.
This is all future stuff, which is why I brought it up. This may just be the thing we need to get us out of the legacy rendering hole we're in. And because Apple is the one who decided to get a divorce from Intel and AMD (remember, the M1 replaces not just Intel's CPUs, but also the AMD graphic cards that currently get installed into Mac computers), then Apple may just be the company with the vision for that. Or maybe the M1 process will one day beget even more new ideas. TSMC makes the M1 chip and there are other ARM chip makers out there, so there's always a chance for innovation.
I didn't claim that it would happen "soon". New ways of thinking always take time to implement.
Pricing is not part of anything I said either. Pricing will adapt based on the rise and fall of supply versus demand.
I didn't say anything about stock and physical stores either. All I said was that it's possible that ARM architecture, and more specifically, Apple's implementation of it in the M1, may change everything for the better. But not right now. What's happening right now would just be the impetus for change. If the Intel, AMD, and Nvidia triumvirate were working well, there'd be no reason to consider doing anything differently, would there?
Just like when land-drilled oil became more scarce, we started looking for oil under water. And then we started looking at alternate sources of energy. The difficulty of finding energy became the impetus for change.
Prices didn't go down, partly because demand stayed high. After all, it takes fuel to rebuild, well, anything.
Yep, and I believe that's because we had so many roadblocks to rebuilding after Katrina. Unreasonable blockers just ended up raising the cost of recovery, which ended up slowing it down while also keeping prices high.
And you should be pessimistic. You're RIGHT to be pessimistic. The evidence is plain that recovery and growth were artificially hampered, and this certainly IS unconscionable. But if you're looking for somebody's feet at which to lay the blame, it probably should be spread around more than you might be inclined to.
Glad I pulled the trigger and bought a whole machine at $3,600 just for the RTX 3090 at the end of 2020, because the card itself is impossible to find. Now my RTX 2060 is worth about $2,000;
thinking about selling it.
...most likely with the years I have remaining I'll still be dealing with the standard CPU + GPU.
Not holding my breath on ARM technology yet, as while Apple has been an innovator, they've also had a few failures (Lisa, NeXT, the "trash can" Mac Pro, and the iMac Pro). They also lost me with their marketing and pricing philosophy, particularly when they went back to a "closed architecture" after the Mac II, PowerPC, and original "Cheesegrater" (which meant that upgrading was at their prices and sometimes involved buying a completely new, more powerful version of the same machine). Issues with software compatibility and their OS are another (for example, the fact that Daz still doesn't have a version of their software ready for the "Big Sur" OS after over five months).
PCs with separate CPUs and GPUs may seem "clunky" in comparison to the ARM concept, but they get the job done and at a more affordable cost to the end user (well, save for GPUs right now, thanks to the latest "Crypto Rush" and chip shortage).
Again, thankful for my old Titan-X.
Got my hands on an RTX 3090 (or rather, will as soon as it arrives in about a week) from a legit source: $4,000
One step closer to having all the parts for my build:
AMD Ryzen Threadripper 3960X Processor
Floe DX RGB 360 TT Premium Edition cpu cooler
ROG Zenith II Extreme AMD TRX40 E-ATX motherboard
G.Skill Royal Z DDR4-3200MHz CL14-14-14-34 1.35V 128GB (8x16GB)
EVGA GeForce RTX 3090 KiNGPiN HYBRID 24 GB GDDR6X
NVidia RTX A6000 48 GB GDDR6
Seagate FireCuda 520 2 TB M.2 NVMe PCIe Gen4 SSD
WD_BLACK 6 TB Performance Desktop Hard Drive
Corsair or EVGA 1600-watt Titanium-rated PSU
Riing Quad 12/14 RGB Radiator Fan TT Premium Edition(all radiator/case fans)
The case I haven't decided on yet, but so far I've picked out either:
Lian Li's PC-O11D-ROG (model O11DXL-X) or Thermaltake's View 71 Tempered Glass ARGB Edition. I think I'm leaning more towards the Lian Li because it's almost half the net weight of the View 71.
I think the next difficult part to get is going to be the 1600-watt PSU since those are being sucked up by a lot of Butcoin miners. The APC UPS might cost another arm & leg depending on what else I need to connect to it.
MSRP on the 3090 is $1,499.
I simply won't pay these ridiculous prices.
I'll just do without a new Nvidia card and my 2060 will have to be good enough.
Seems everyone is catering to the miners these days and by the looks of it that won't be changing any time soon.
I honestly believe Nvidia will abandon gaming and all other endeavors to focus their entire company on crypto-mining in the near future.
...is that the price for just the 3090? At that price I'd have gone with a second A6000 (only $600 more) and an NVLink bridge, along with double the system RAM, as it appears cost is no concern for this build.
Otherwise I don't understand the purpose of having both an A6000 and 3090 on the same rig as the A6000 has twice the VRAM of the 3090.
As far as I am concerned, this has destroyed what was once a community I enjoyed being part of.
In the old days, back in 2010, my lousy Dell laptop was able to run Poser and Carrara (though not DS3 terribly well);
my lack of hardware did not exclude me from using DAZ 3D assets.
(I admit I will still likely buy older products on sale sometimes, but I also don't wish to support, with my custom and money, a company that doesn't have my interests at heart.)
Certainly nothing G8.1, or anything using features introduced after DS 4.11, is of much use to me with my hardware.
Yeah, people forget that if this continues long, DAZ will lose its community simply because there are no cards for hobbyists, or even professionals, to render Iray with.
So it's absolutely wonderful to see how they thought NFT would be "great" for the community. Enough said.
https://en.wikipedia.org/wiki/Tulip_mania
+1
We are already there.
Getting a new GPU just isn't going to happen, unless you are willing to pay thousands of dollars for a card that retails for a few hundred.
Yeah, very glad I finally managed to build my current rig at the start of 2020, before everything hit. I'd intended to build it at the start of 2019, but my husband got laid off and that put the whole thing off for a year. If I'd waited two months longer, I'd have had to put it off again, and quite likely I'd still be putting it off now, or I'd have to suck it up and get on the waiting list for a prebuilt rig, or basically gut my old system for the graphics card. (That old system went into a custom-built virtual pinball rig, and we're stuck on that because I can't really upgrade that computer's graphics right now to take full advantage of the new screen I'm dropping into it in a couple of weeks.)
Apparently the specific chip shortage that's hitting graphics cards AND cars is some display controller IC.
I have no doubt that nVidia (and other makers of said cards) may be looking at the money they could make just making mining cards, or at least repurposing production that would normally have gone to graphics cards they can't build anyway, because the ICs those need (and mining cards don't) just aren't available in the quantities demand calls for.
I also expect we'll start seeing the MSRP on graphics cards rising as it's clear demand isn't abating.
Well, of course each of us has to decide the best course for our own personal situation.
I never said that ARM would create a utopia. Only that the current pinch may be the impetus for this new architecture to maybe make some big changes in the industry going forward. You're right that we can't keep going on like this with gaming cards disguised as $4,000 "everyday solutions". I would say that the current way is simply not supportable.
The more people who DON'T upgrade, the fewer people to make use of new software and technologies down the road. And THAT is not sustainable in any way.
"Clunky" isn't even on my radar; not sure where you got that, or if you were responding to me on that point. If they fit in the case and they're compatible with all the other stuff in a build, then they're not clunky. My laptop has an RTX 2080 in it. It fits into the laptop form factor and it's compatible, so by definition, it's not clunky. But it does get scaldingly hot under load, yes it does. So maybe this isn't a laptop after all, but a "lapcooker"? But I digress...
And you say "save for GPUs right now", but I submit to you that maybe this time it's a sea-change that we're going through, and not just a tide that comes in and goes out every 12.5 hours. Meaning that this is not necessarily a "right now" kind of situation, and that maybe, just maybe $4,000 GPU cards IS WHAT THEY COST.
The marketing people at Nvidia surely know that eventually the Crypto craze will experience major attrition. It must, because that is the way of things. People will figure out that they can't make money at it (or that only a very few big players actually have any chance of making money at it), and then they will start losing interest. People will stop buying and prices will come back to earth. And THEN, crypto miners will start getting out of it, and they'll put their used gear on the used market, which will absolutely crash the market for new GPUs for a period of time.
I suggest that Nvidia's and AMD's marketing soothsayers already know that this is a risk. They know that R&D needs constant funding injections. They know that if the market crashes, then a lot of things become more risky and business becomes more dangerous; more prone to failure.
Businesses don't like uncertainty. Markets hate uncertainty. And companies need to know what supplies, ping, power, pipe, and feedstocks are going to cost, and what they can charge for products. It would be in Nvidia's and AMD's best interests to be able to meet demand with current or somewhat growing supply. When the demand chart goes parabolic and starts to look like the left side of a Christmas tree or the Eiffel Tower, companies would do well to resist the temptation to lay out cash for new factories and so forth, because parabolic left-side charts often acquire a matching right-side. And then, look out below!
This is a perfect example of somebody doing exactly what they need to do. In this case, "pay up" is his decision. It's not an invalid decision!
And here is another person who has decided what he needs to do. Different from KK and Magog above.
People are already making their choices, in their own ways in their own time. Me? I've chosen to go outside and get my garden ready for spring planting. Then I'll come back in a couple hours and maybe grill a wild boar sausage for lunch.
It's not that I can't afford a $4,000 3090 card. It's what I want. But the price is a barrier for me because I plan to have a good, fun, and pleasant retirement one day, and maybe art will be a part of that (I sure hope it is). It's not that I have decided against a $4,000 card, oh no. But on balance, I'm definitely leaning AWAY from such a purchase, so it's more accurate, I think, to say that I have decided not to buy a $4,000 card today. Well, it's still early here in Florida, so maybe it's more accurate to say that I've decided definitely not to buy a $4,000 card before lunchtime. And probably not this afternoon either.
As the saying goes, we live in interesting times.
Very content with what I have, and if it means using older content, G3 and G8 (eventually G8 will be put out to pasture), so be it. I'm just a lowly hobbyist who wants to have fun and relax, so I'll stick with the PC I built 2 years ago. I read an article in PC Mag back in February where Nvidia was dusting off outdated GPUs because of the worldwide chip shortage. They planned on re-releasing GTX 1050 Ti and RTX 2060 cards. Whether this happened I'm not sure, since I'm not in the market for a graphics card, but given the seriousness of the shortage those cards could have doubled in price by now. This chip shortage is forecast to continue into late 2022.
But there is a bright side it's becoming golf weather!!
The M1 discussion is, as acknowledged, highly speculative and not really relevant to this thread - please drop it before it turns into platform warring.
Why would Daz have to lose its community just because other companies can't keep up with demand for a product? It's not Daz's fault. Also, don't most of you already have working computers turning out Iray renders, with lots of cores and RAM and other stuff?
Having the newest, most expensive, rarest, biggest...etc etc etc anything doesn't mean anyone's work will improve or that they'll be left behind. Everyone seems to be forgetting a very important aspect of what anyone does, it's called skill.
Skill means nothing when your hardware dies and you can't get anything new 'cause there's nothing to buy. Spending $4k+ for a $1,500 card or $1,500 for a $300 card is beyond the pale. It's stupid, and what is directly to blame? Did you say crypto and NFTs? And what is Daz now peddling? Did you say NFTs? Well then. My observation: we have a righteous rage brewing in this community, and it's only so long before people find other ways to occupy their time, other places to spend their money. It may not come to that, but some of us want some speedy new hardware.
Ok, so what happens when your current hardware goes belly up? Because it does have a finite lifespan. I'm running a 1080 Ti from August 2017. I also game with it. How many years does it have left? Last I checked, even 1080 Tis are selling for double MSRP (or more).
Agreed!
If this global price madness continues, then in the future I will use 3Delight-only rendering. I will buy or look for a used PC. I will use outdated PC components and probably an outdated software version, but I cannot give up rendering...
Sure. It's okay to limp along with substandard hardware, even though we all know the latest products are more resource intensive than ever. Maybe some of us are tired of limping along.
I'm already doing that. My system is old and still soldiering on. I was looking forward to being able to do IRay, to running 3D Coat without giving my system apoplexy, to doing cloth simulation and other resource intensive operations, to being able to efficiently render some animations, etc. Now, all that is caught in a holding pattern. Not happy, no.
Anyway, I vented a lot more in this thread than I usually do, but I will leave my posts as I am reasonably certain I am not the only one feeling this way right now.
Sure, I'll let it drop. To platform war for or against ARM in its current state would be akin to going to battle with a bag of Beanie Babies, soup ladles, ice cream scoops, the book "Catcher in the Rye", and catnip. There'd be no way to win such a battle, although the ensuing pandemonium might be funny to watch. But then anytime catnip is involved, pandemonium and hilarity usually follow forthwithedly; that other stuff is unneeded.
Sure it's stupid, but it is what it is. No amount of kicking & screaming is going to change it. I don't like it or butcoin mining any more than you do. The shortages are going to continue probably well into mid-2022, and I've already been kicking the can down the road long enough when it comes to replacing my laptop, so if anything it's probably even more stupid to defiantly sit around waiting for a miracle surplus to appear out of thin air when the prices are going to climb even higher. I probably could have even gone with a lesser card for half the price, but my system specs & the environment it's going to be in require a decent cooling solution for the card when rendering, which meant going with the Kingpin RTX 3090. Since I could not get two of them, the only available alternative was a high-end Quadro RTX A6000, which runs at lower wattages and doesn't throw off as much heat inside the case. I'll be able to game and have a render going at the same time with no issues, OR I'll be able to do it quicker using both cards.
So if I can swing it now, that's what I'm going to do. Stupid? Yeah, because I waited so long up to this point so I have to pay more. It would be even more stupid to wait longer; especially since I do need a new system and don't want to buy pre-builts or cookie-cutter systems that have limited options.
I should also point out that Dell and Digital Storm still have RTX cards (all the way up to and including 3090s) available for their systems, so unless you're like me and are very picky, or don't need a new desktop, you can get them without being scalped. The only catch is you need to purchase the whole system, so it might work for some while others not so much.
It's not going to happen. I could be wrong, but I don't think Nvidia is dumb enough to make that kind of move. They might ramp up production a little, but beyond that, they're not going to invest in new facilities & such just for the sake of selling more graphics cards to a market segment that is based on speculation and uncertainty with a whole lot of risk. And they're certainly not going to abandon gaming for crapto-currency. Gaming is more reliable economically in the long term, while crapto-currency is not.
The whole crapto-currency madness is no different than any other popular fad, past or present. Eventually, it will become a s***streak that everyone will want washed out. They're learning that in Turkey now, where the head of one crypto-currency exchange over there decided to leave with over $2 billion of everyone's money, leaving them high and dry.
I will probably never use 48 GB of VRAM, lol. The only time I would ever come close is if I use both the 3090 & A6000. I think the most I would ever use is 16-18 GB (for now). If & when I'm actually able to and need to use the full potential of the card, I'll either have a separate smaller workstation for it, or I'll just swap out the motherboard if necessary. This is just to hold me over until I have more options available to me, while having something that I can potentially build upon, because right now the market in general just sucks.
Absolutely not. I've been on this laptop with a GTX 765M since 2013, and at the time I thought it was a good replacement for a desktop. That held true until I started working in Daz. Practically every scene I render gets kicked off to the CPU, so any kind of rendering with Iray is painfully slow. I'm also unable to use dForce.
...I considered doing so a couple of years ago, and to that end purchased several 3DL utilities, including River Soft's/SickleYield's RSSY Iray to 3DL Converter, Parris's IBL Master, and Wowie's AweShader system. I was working on this scene as an experiment when a drive crash took everything. Render time: about 14 minutes. The Iray version: multiple hours (CPU mode).
Attachment does have a bit of post processing in Exposure 3 to give it a 1960s photo look.