But will it run DAZ Studio?

If your new RTX 3090 card won't cut it any more, maybe Nvidia's latest offering will give you the power you need. Four 80GB GPUs ought to give you decent performance on even the most complex scenes. No pricing is available yet, but it's expected to be in the realm of "if you have to ask, you haven't got it".
Comments
Bah. No RTX. I'll give you a hundred bucks.
I hope the scalper's bot accidentally buys ten of them.
I like how I got obsessively attached to a hobby where I get to make mental calculations like, "Can my computer handle this function? How much of my yearly salary is the computer that could handle this function? Even if I could afford that, if I plug it in am I going to cause a three-block brownout in my apartment complex?"
CHALLENGE ACCEPTED. Just as soon as my wealthy patron arrives to personally fund me and put me up in a penthouse downtown.
I can see something like that with improved RT & Tensors and more RAM being available to home consumers in 10 years, give or take. Why? Life and environment problems need to be solved (by advanced statistical AI methods) in the environment they occur in, & nowhere else can suitably replicate them.
Why...is Crysis 4 incoming?
Data processing is a huge money maker; there would be companies that would order shipping containers full of those and build environmentally controlled bunkers underwater to house them.
I remember seeing a video tour of a server farm full of racks of Titans and gasping.
LOL.
https://www.epicgames.com/store/en-US/product/crysis-remastered/home
It has a troll graphics setting that will bring any GPU on the market to its knees. My 2070 was doing, IIRC, 12 fps at 4k.
Why would someone do this? You can get that really cheap on AWS EC2. And if you no longer need it, just delete the virtual machine. I was even considering using EC2 for DAZ renders. You can go a long way for approx. $1 per render hour.
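For anyone who wants to play with the math, here's a rough rent-vs-buy sketch; the cloud rate is the ~$1/render-hour figure above, and the card price and workload are my own assumptions:

```python
# Back-of-the-envelope: renting cloud GPU time vs. buying a card outright.
# The cloud rate comes from the ~$1/render-hour estimate above; the card
# price and monthly usage are illustrative assumptions.

CLOUD_RATE = 1.00        # $/render-hour on an EC2 GPU instance (estimate)
LOCAL_CARD_COST = 1500   # assumed price of a consumer RTX card
HOURS_PER_MONTH = 40     # assumed rendering workload

break_even_hours = LOCAL_CARD_COST / CLOUD_RATE
months = break_even_hours / HOURS_PER_MONTH
print(f"Break-even after {break_even_hours:.0f} render hours (~{months:.0f} months)")
```

At a hobbyist's pace, renting stays cheaper for years; render around the clock and buying wins quickly.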
For Bitcoin miners $200K is nothing, well, if that behemoth does run cryptomining, of course ("not from the Jedi").
Remember those days when we got a nice ATI Radeon and assumed we were rich... hahaha
they are not rendering stuff
crunching numbers
and I know utterly nothing about the purposes, I just watched the video tours
Even for number crunching you can get started at $0.38 per hour on AWS... you have to do some very special things in your basement to need such a machine
The company I know of that markets containers of rendering servers sells them to the movie industry. They claim they can produce renders of the effects layered/matted with the live action shots within 12 hours of the live shots being made on site. They say the director etc. can then decide on retakes based on a "near" final pass of the scene. This isn't my field, but I somehow got on the mailing list and get the marketing material.
I imagine some movie shooting on location somewhere with spotty internet might consider this but who knows. I have no idea if they've ever leased one.
I am confused, wouldn't that AWS EC2 need hardware somewhere? Or does the big smiley mean you're joking?
I am a bit s l o w
Would the connection need to be spotty for the volume of data from a film shoot to be problematic? I'd suspect, regardless, that insurers might be twitchy about letting the data out onto the net - I'm sure it could be made safe, but insurers tend to be hyper-conservative over risk-reduction and might well stipulate on-site processing if it were an option.
Encryption is still a thing. Insurers could be an issue, I'm sure. Although there's no way you'd get every SFX house involved out in the middle of nowhere, so the studio would still be transmitting assets around.
But a container full of high-end HW, plus support equipment and personnel, delivered wherever, wouldn't be cheap. Some quick back-of-the-envelope calculations say the computer gear would cost right around $1 million, plus probably another 25% for exotic cooling, since some damn fool would want to put this in the Sahara. Even for a big-budget Hollywood movie that starts to get noticeable in the budget.
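Written out, that back-of-the-envelope looks like this (both figures are my rough guesses, not vendor pricing):

```python
# Rough cost of a containerized render farm on location.
# Both numbers are assumptions from the post above, not quotes.

gear_cost = 1_000_000      # assumed cost of the compute hardware
cooling_premium = 0.25     # assumed 25% extra for exotic cooling
total = gear_cost * (1 + cooling_premium)

shoot_days = 60            # hypothetical length of a location shoot
print(f"~${total:,.0f} total, ~${total / shoot_days:,.0f} per shoot day")
```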
It does require hardware. But it's Jeff Bezos' hardware.
They use NVIDIA Titan cards. You can rent 1 to 16 GPUs, starting from $.082 for the 1-GPU instance. The 16-GPU instance is $8.38 per hour.
That could have been the one I saw in the video with the racks upon racks of Titans, then.
Was it at Jeff's house?
320 GB of graphics processing power from NVIDIA... YouTube video link below...
The latest NVIDIA 64-core workstation computer with 512 GB system RAM, 7.68 TB NVMe SSD, and up to 4 NVIDIA A100 Ampere cards with either 40 GB or 80 GB each (320 GB max)... Note that each A100 GPU card is about $20,000. The complete workstation, which is designed for processing AI, will probably cost around $100,000 (my guesstimate).
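As a sanity check on that guesstimate (the per-card price is from above; the chassis figure is my assumption):

```python
# Sanity-checking the ~$100k guesstimate for the DGX Station A100.

card_price = 20_000   # ~$20,000 per A100 card, as noted above
num_cards = 4
chassis = 20_000      # assumed: 64-core CPU, 512 GB RAM, 7.68 TB NVMe, PSU, case

print(f"Estimated total: ${card_price * num_cards + chassis:,}")  # -> $100,000
```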
It would be a dream-come-true for iRay rendering, and there are both PCIe and NVlink variants of the A100 card (https://www.nvidia.com/en-us/data-center/a100/).
Introducing NVIDIA DGX Station A100
https://www.youtube.com/watch?v=TKtN04z7Q5Q&feature=emb_logo
The A100 is not a rendering card. It has no RT cores.
Dell will sell you the card for $15.5k.
https://www.dell.com/en-us/work/shop/nvidia-ampere-a100-pcie-250w-40gb-passive-double-wide-full-height-gpu-customer-install/apd/490-bgfv/graphic-video-cards
The server version of the DGX A100 was going for $75k (ish). The specs are essentially identical to the workstation so there is no reason to think the price will be all that different.
@kenshaw011267 The Dell link you provided is for the 40 GB variant of the A100, the 80 GB variant is a lot more expensive. Even 40 GB systems can range widely in price...
4-A100 pricing starting at $65,000
8-A100 pricing starting at $140,000
Source: https://lambdalabs.com/deep-learning/servers/hyperplane-a100
The original DGX A100 launched with a starting sticker price of $199,000 back in May.
Source: https://www.engadget.com/nvidia-dgx-station-a100-80gb-tensor-core-gpu-announcement-140027589.html
Of course you can render on the NVIDIA A100, it's a GPU after all. Daz Studio 4.14 supports the GA100 variant, including both the Octane Render engine (with raytracing) and the Streaming Multiprocessor (SM) architecture for devices of compute capability 8.0 (i.e., A100 GPUs):
https://www.daz3d.com/forums/discussion/comment/6200861/#Comment_6200861
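If you want to confirm what compute capability your own card reports, here's a quick sketch assuming PyTorch is installed (Iray and Octane do their own device detection; this is just for checking):

```python
# Print the compute capability of each visible CUDA device.
# An A100 (GA100) reports 8.0; an RTX 3090 (GA102) reports 8.6.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        major, minor = torch.cuda.get_device_capability(i)
        print(f"{torch.cuda.get_device_name(i)}: compute capability {major}.{minor}")
else:
    print("No CUDA device visible.")
```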
As I previously mentioned, the A100 is designed primarily for Artificial Intelligence (AI) deep learning and scientific data analysis, but as mentioned in the nVidia whitepaper, it can also do graphics rendering...
NVIDIA A100 Tensor Core GPU Architecture (whitepaper PDF):
https://www.nvidia.com/content/dam/en-zz/Solutions/Data-Center/nvidia-ampere-architecture-whitepaper.pdf
OTOY launches next generation NVIDIA A100 GPU nodes on Google Cloud for RNDR:
With Google Cloud's NVIDIA A100 instances on OTOY's RNDR Enterprise Tier, artists can leverage OctaneRender's industry-leading, unbiased, spectrally correct, GPU-accelerated rendering for advanced visual effects, ultra-high resolution rendering, and immersive location-based entertainment formats.
Urbach added, "OctaneRender GPU-accelerated rendering democratized visual effects enabling anyone with an NVIDIA GPU to create high-end visual effects on par with a Hollywood studio. Google Cloud's NVIDIA A100 instances are a major step in further democratizing advanced visual effects, giving any OctaneRender users on-demand access to state of the art NVIDIA GPUs previously only available in the biggest Hollywood studios."
Source: https://www.prnewswire.com/news-releases/otoy-launches-next-generation-nvidia-a100-gpu-nodes-on-google-cloud-for-rndr-301119804.html
NVIDIA Ampere A100 HPC Tensor Core GPU Becomes The Fastest GPU Ever Recorded in OctaneBench (Without Utilizing RTX):
The feat was shared by the CEO of OTOY, Jules Urbach. OTOY are the developers behind OctaneBench, a benchmark tool that lets users evaluate GPU performance using the Octane renderer. OctaneRender itself is a GPU render engine that supports NVIDIA's RTX raytracing hardware acceleration to deliver crisply rendered scenes.
Source: https://wccftech.com/nvidias-ampere-a100-becomes-the-fastest-gpu-ever-recorded-43-faster-than-turing/
All true, but even still, the new RTX 3090 has recently outpaced the A100 with RTX on in the latest OctaneBench results, and at a tiny fraction of the price. It would make no sense for anyone outside of a VFX studio that would utilize all 80GB (potentially 160GB and up) of VRAM to even consider buying one just for rendering. Even the DGX Station itself wasn't created to be a workstation for a single user, but as a GPU server for a small workgroup. For 99.9% of artists, dual 3090s would outperform anything else they could possibly afford, even a pair of A100s.
Yes, going forward the RTX 30-series is currently the best bet for most of us, but if you are doing a feature-length animated movie, or even an animated web series, it's nice to have the ability to tap into that rendering power via the cloud computing option. Of course, if money were no object, then having one of those A100 workstations sitting on your desk would probably make it so you never see your PC drop from GPU to CPU.
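To put rough numbers on the cloud option for animation (every figure here is an assumption for illustration):

```python
# Ballpark cloud cost for rendering one episode of an animated web series.

fps = 24
episode_minutes = 10                  # assumed episode length
frames = fps * episode_minutes * 60   # 14,400 frames

minutes_per_frame = 5                 # assumed average Iray render time per frame
gpu_hours = frames * minutes_per_frame / 60
rate = 1.00                           # assumed $/GPU-hour (see the EC2 estimate above)

print(f"{frames:,} frames -> {gpu_hours:,.0f} GPU-hours -> ~${gpu_hours * rate:,.0f}")
```

A ten-minute episode lands around 1,200 GPU-hours, which is why spreading the job across parallel cloud nodes starts to look attractive.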
You go for it, then. I pointed you to Dell's enterprise site. If you have the cash, I'm sure they'll get you an 80GB card. That a pair of 3090s in NVLink will crush it for less than a tenth the price may not matter to you, but for everyone else...
For rendering on the professional desktop, the A6000 is the real powerhouse. Based on the released specs it is the full GA102 GPU (the 3090 and 3080 are the GA102 with various elements fused off), presumably running at a lower clock, with a 300W TDP and 48 GB of the cooler GDDR6 VRAM. Depending on just how low the clock is set, it may set all sorts of GPU records when it gets out in the world. Two in NVLink are enough for most of the stuff Hollywood does (and most of the fluid dynamics sims and other sorts of things people want these for).
...I was about to ask that.
...true, but you can only link a maximum of two 3090s, for a total of 48 GB.
On the other hand 640 GB of raw rendering power. Crikey, I could render an "Alpha Channel" (the artist) type scene in one pass in the time it takes to make a cup of tea.
[looks for that Megabucks lotto ticket that will be drawn tomorrow]
I bet you could get one of these $100,000 systems, add dForce hair and still make it drop to the CPU.
You know guys, we wouldn't need these massively overpriced GPUs to render stuff in Daz if the Daz sellers would just properly optimize their models. Just look at modern videogames: they look better than most people's Daz renders by a long shot, and they don't require even a fraction of the processing power that Daz renders do. The reason is that videogame models are optimized for efficient, speedy rendering, relying mostly on shaders to create a beautiful photorealistic look rather than an excessive number of polygons. Most of the more complex items in the Daz market are so bloated with polygons that not even a high-end GPU can render them in any reasonable amount of time. The fewer polys you have to apply shaders to, the less time it takes to render.
I've gotten so tired of dealing with Daz's bloated models that if I want to use a particularly complex scenery piece in a render, I'll render out a 360 photosphere HDRI from that scenery alone, then use that instead of the actual model for all future renders, because rendering a character standing in a photosphere environment takes less than 5 minutes, and you can't really tell the difference if your photosphere is high-res enough. I typically render them at 16k for the best results.
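If you want to check whether your photosphere is "high res enough", here's the quick math (the camera FOV and output width are assumptions for illustration):

```python
# Compare the angular resolution of a 16k equirectangular HDRI
# with what the render camera actually needs.

hdri_width = 16384       # a 16k equirect map spans 360 degrees
hdri_px_per_deg = hdri_width / 360.0

render_width = 1920      # assumed output resolution
h_fov_deg = 40.0         # assumed horizontal field of view
camera_px_per_deg = render_width / h_fov_deg

print(f"HDRI: {hdri_px_per_deg:.1f} px/deg vs camera: {camera_px_per_deg:.1f} px/deg")
# ~45.5 vs ~48: a 16k map is roughly at parity with a 1080p-wide render.
```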
I've also gotten into porting videogame models into Daz, which is actually possible to do with minimal effort (even rigged characters and such) because I'm tired of Daz content either not working as advertised, or just bogging down my computer with poorly optimized content.
My 2060 Super GPU can't even handle some of the scenery models by themselves, in the viewport, which says a lot about the terrible optimization of these models. A 2060 Super should be able to render high quality scenes without issues. There is no need for 3d content to be so bloated that it can't even be displayed on high end equipment without slowing the computer to a crawl.
THAT is the problem, gentlemen. Our GPUs are not the issue, and you shouldn't need these fancy 100k dollar GPUs to run Daz scenes. If the Daz content sellers would simply optimize their releases, Daz would be useable by a lot more people, and almost any user could make absolutely beautiful renders without having to dish out half a million dollars on a fancy corporate rendering rig.
That's not remotely accurate. Videogame engines do not do many of the things Iray does that chew up VRAM, actual reflections for one. When raytracing became a thing they had to add specific hardware to the cards, and performance still tanked, and they still don't really do reflections.
Poly counts in DS models are lower than they are in most games, I'll put money on it. DS model geometry is actually, IMO, too low. Genesis model geometry has gone down each generation, which was to some extent great, as the original Genesis had way too much geometry, but G8 is way too smooth and counts on the texture for too much.
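Here's a ballpark of why textures, not polygons, are usually the VRAM hog (the figures are illustrative assumptions, not measured Iray numbers):

```python
# Rough memory comparison: mesh geometry vs. an uncompressed 4k texture set.

tris = 100_000           # assumed figure-level poly count
verts = tris // 2        # rough vert:tri ratio for a closed mesh
bytes_per_vertex = 32    # position + normal + UV, uncompressed
geometry_mb = verts * bytes_per_vertex / 1e6

maps = 8                 # assumed diffuse/normal/roughness/etc. maps
texture_mb = maps * 4096 * 4096 * 4 / 1e6   # 4k RGBA, uncompressed

print(f"Geometry: ~{geometry_mb:.1f} MB, textures: ~{texture_mb:.0f} MB")
# ~1.6 MB vs ~537 MB: the texture set dwarfs the mesh.
```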
Yes, I can tell the difference between a character standing in a rendered environment and a character in an HDRI. A character in an environment casts shadows on things, etc. To make it so there's no difference would require isolating the character from the rest of the scene. I have no idea why you'd want to do that, but if it makes you happy, good for you. Since I'm telling stories, I couldn't. Then again, I don't have these problems people keep reporting that I cannot seem to reproduce, like an 8 GB card not being able to render scenes.
LOL
A lot of this is subjective, and the stuff that isn't... well... I don't think it's right.