Comments
It depends on what you actually mean by static renders. Even if the entire industry moved to real-time engines, static renders would still be doable; real-time engines are just as capable as offline render engines. A static render is its own form of art, and it has been around far longer than animated or interactive art. It's not going anywhere.

If by "static renders" you were referring to offline rendering, then that's a slightly different story. Offline rendering has been a thing for a long time for the very reason that game engines couldn't produce realistic lighting or believable renders until recently. Now that game engines have advanced far enough to be called real-time tools for more than just games, we're entering the era where offline rendering will most likely begin to phase out of the production pipeline. They won't be completely obselete anytime soon, but that is where we're headed. I've been hearing word around the 3d community that real-time is the future.
It's not as scary as it might sound; adapting is the key. Like that one meme says...
Yeah, you'll still have "photos". What's going to change is that you'll get new options for rendering your images. Instead of rendering only inside Daz, you'll be able to render inside Unreal if you like it better than Daz. You'll get to choose the platform you work in: Daz, Unreal, Unity, Blender, Maya, or any of the other available 3D platforms, each with its own rendering options. You won't be limited to offline render engines; any platform will be able to handle monstrous scenes with trillions of polygons, and you'll simply choose whether you want them rendered in real time or not.
Yeah, but I very rarely see people doing static renders in the first place, especially in "pro" circles, outside of DAZ/Poser etc.
Or perhaps it's just not very visible on the surface?
If you want to see still renders done by artists outside the Daz/Poser community, I highly recommend looking through ArtStation, if you haven't already. Places like BlenderArtists, ZBrushCentral, and Polycount are also some of my personal favorites.
I have certainly seen single-image renders from Unreal, especially using the Octane plugin.
To be fair, if you want to dip your toes into high-poly modelling, there are some hobbyist (by which I mean free) options out there, namely Sculptris and Blender. Or if you don't even want to commit to downloading anything, there's SculptGL, which you can download but also run right on its page to sculpt high-resolution meshes in your web browser of choice.
See them all the time on ArtStation. Stills are still the most feasible thing for solo people to accomplish. Movies and games are so much more work that they're usually not doable without a team, and they require such a wide variety of skills that very few individuals have them all. Real-time engines becoming more common isn't going to change that much. It cuts render time dramatically, but render time honestly isn't the big issue. Production is.
Rendering fast really helps solo people.
Ian Spriggs used to take three months per portrait, but upgrading his computer and using "lookdev" cut that down to about a month and a half.
You can use rasterized previews instead of raytraced ones, but they just aren't the same.
It is the grail.
The preview of UE5 is impressive. I'm going to install UE4 and see what it looks like. Also interesting is that AMD figures to be in the raytracing business by the end of this year.
Oh yeah. I know AMD is in on it! NVidia's been throwing them a bone. Subtly, of course. The partnership between AMD and NVidia makes them practically unstoppable and I love it!
Thanks, I got Sculptris from the official site but haven't tried it yet.
Though initially I was more asking about...
What are the fields for low-poly, though? Mobile games, due to technical limits?
Meh, you only need Dreams on the PS4 and a whole lot of talent:
https://www.youtube.com/user/MartinNH
There is no bone. AMD and Nvidia will be offering totally different ray tracing solutions that compete against each other. Ray tracing is built into Microsoft DirectX 12 anyway, and Vulkan now supports it, too. All AMD or Nvidia are doing is adding a hardware component on top of that for additional performance. Nvidia has dedicated ray tracing cores to perform this task.

To explain it simply: in the PS5, the AMD GPU does NOT have dedicated ray tracing cores; rather, the existing cores are capable of doing accelerated ray tracing tasks. It is expected that this is how AMD GPUs will handle ray tracing in the future. It is still hardware based, but not using a dedicated core. This has positives and negatives which we will probably see when they finally launch later this year. Rumors say they will ray trace faster than Turing GPUs, but probably not as well as Nvidia Ampere GPUs.

However, on the flip side, because AMD does not have dedicated cores, they can use more of their chip for standard cores, and it is possible that they may actually have a faster GPU than the 3080 Ti when it comes to normal gaming and compute tasks. But that is all rumor; nothing is known for sure. Either way, AMD is going to be back in the GPU market in a big way. (It's about time.)
I think it's a brilliant solution while ray tracing is still in its early stages. They stand a much better chance of competing against Nvidia this way because most games will still not be ray traced. Nvidia's solution divides the processor into 3 sections. They have ray tracing cores and a section of Tensor cores. To be frank, most video games will only use 1 of those 3 cores, so only one third of the chip is even getting used. Those 3 cores ultimately compete for space to some degree on the chip. AMD does not have that issue. So even if they don't ray trace as fast, they can more easily build a chip that competes at standard rasterization. To match that, Nvidia would need to produce larger chips, which cost more to make. The result is competition, which should help keep prices in check for us all.
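Side note for anyone who wants to verify the "built into DirectX 12" part themselves: DXR capability is exposed through a standard D3D12 feature-support query rather than any vendor-specific API, so an application can simply ask the driver whether the GPU ray traces in hardware. A minimal sketch (Windows-only, link against d3d12.lib; the printed messages are just illustrative):

    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a device on the default adapter; feature level 12.0 is
        // enough to ask about ray tracing support.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("No D3D12 device available.");
            return 1;
        }

        // DXR support is reported through the OPTIONS5 feature data;
        // tier 1.0 or higher means hardware-accelerated ray tracing.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5{};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &opts5, sizeof(opts5))) &&
            opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
            std::puts("DXR is supported on this GPU.");
        } else {
            std::puts("DXR not supported; fall back to rasterization.");
        }
        return 0;
    }

An RTX card reports tier 1.0 or higher here; a GPU without hardware support comes back as not supported, and the application rasterizes instead.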
That post is quite the word salad. Let me work through all of it, because a lot of it is wrong.
"AMD and NVidia will be offering totally different ray tracing solutions that compete against each other."
NVidia and AMD have worked together in the past, especially after the Ryzen line of CPUs came out. There was a sudden interest. The truth is the ray tracing solutions are identical to one another, which I'll explain later.
"Ray tracing is built into Microsoft DX12 anyway, and Vulcan now supports it, too."
Let me explain a few things here: Microsoft's DirectX is a foundation of 3D graphics platforms, but you're forgetting one, and that is OpenCL, which is the platform AMD backs. This isn't DirectX vs. Vulkan, because one of them is a foundational 3D graphics platform and the other is used to take that foundation and rasterize it. What you have to look for is CUDA vs. Vulkan (see the Vulkan sketch after this post).
"To explain it simple, in the PS5, the AMD GPU does NOT have dedicated ray tracing cores, rather the existing cores are capable of doing accelerated ray tracing tasks."
This is also incorrect. First, the PS5 doesn't have a dedicated GPU; rather, it has an APU, which is half CPU, half GPU. The only reason the existing cores are capable of accelerated ray tracing tasks is that there's also other hardware specifically for that task. And do you know what that's called? Ray tracing cores. The only reason AMD doesn't call them that is that NVidia coined the term first. In other words, for legal reasons.
"It is expected that this is how AMD GPUs will handle ray tracing in the future. It is still hardware based, but not using a dedicated core."
The fact that you and others are "expecting" this means it's all speculation. Are you serious? In case you haven't noticed, all rasterization and ray tracing is hardware based. That's why we have a CPU and GPU in the first place; it's why you can see an image on your monitor at all. So, yes, it will be hardware based!
"This has positives and negatives which we will probably see when they finally launch later this year. Rumors say they will ray trace faster than Turing GPUs, but probably not as well as Nvidia Ampere GPUs."
You're talking about positives and negatives which we will probably see. News flash: we've been seeing the positives and negatives since the RTX series came out. In fact, we've been seeing positives and negatives since the first GPUs came out in the 90s. It's called power draw. It's called heat generation. It's called screen tearing. It's called latency. "Up there, but not quite." - AMD's motto since 2008.
"However, on the flip side, because AMD does not have dedicated cores, they can use more of their chip for standard cores, and it is possible that they may actually have a faster GPU than the 3080ti when it comes to normal gaming and compute tasks."
The last time AMD and NVidia tried to do real-time ray tracing without dedicated cores, the scene they used could barely hold 20 FPS. The standard cores have to be used for rasterizing; otherwise, you get the performance dip I just described. Much like running a server, no amount of coding can dictate how many of a piece of hardware's cores will be used for a set task, i.e. ray tracing. The hardware will automatically dedicate all of its power to whatever task demands the most, and in this case that task is real-time ray tracing. If AMD's new GPU dedicates all of its cores to ray tracing, then it cannot rasterize. Do you know what that's called? That's called offline rendering. And we've had that for at least a decade.
AMD GPU users, have fun getting one frame every five minutes. (That was a joke.)
"Either way, AMD is going to be back in the GPU market in a big way."
AMD's been back in the GPU market since the RX480. It's just that they've been disappointing everyone since the RX480.
"I think its a brilliant solution while ray tracing is still in its early stages. They stand a much better chance of competing against Nvidia this way because most games will still not be ray traced."
No, it isn't. No, they won't.
"Nvidia's solution divides the processor into 3 sections."
No, it doesn't. If you've got dedicated cores, then you're not dividing anything; to divide something, you have to split the same thing several ways. FYI, the Tensor cores handle anti-aliasing and anisotropic filtering.
"They have ray tracing cores and a section of Tensor cores."
I already answered that one.
"To be frank, most video games will only use 1 of those 3 cores, so only one third of the chip is even getting used."
Several games are already using two of the three, with a few using all three.
"Those 3 cores ultimately compete for space to some degree on the chip. AMD does not have that issue."
Because AMD hasn't done ray tracing before.
"So even if they don't ray trace as fast, they can more easily build a chip that competes at standard rasterization."
Search for the date when AMD was founded. That's how long they've been competing in rasterization. Trying to compete in standard rasterization is what a GPU is for! They've been doing this for years.
"To match that, Nvidia would need to produce larger chips, which cost more to make. The result is competition, which should help keep prices in check for us all."
In order to get a performance increase of any kind, you need to make a larger chip regardless. Any time streaming processors are added, it's a larger chip. Any time memory is added, it's a larger chip. Any time you put more fans on it, it's a larger chip. So it doesn't matter if it's AMD or NVidia, you're gonna need a larger chip.
Done. Sorry to everyone for the length of this post, but there was so much to unpack here.
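For reference on the Vulkan side of the exchange above: ray tracing support in Vulkan is exposed as an ordinary device extension, queried the same way on AMD and NVidia hardware. A minimal sketch, assuming the Vulkan SDK and a 1.2-capable loader are installed:

    #include <vulkan/vulkan.h>
    #include <cstdio>
    #include <cstring>
    #include <vector>

    int main() {
        // A minimal instance is enough; no window or surface is needed
        // just to inspect device extensions.
        VkApplicationInfo app{};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.apiVersion = VK_API_VERSION_1_2;

        VkInstanceCreateInfo ici{};
        ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        ici.pApplicationInfo = &app;

        VkInstance instance;
        if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
            std::puts("Failed to create a Vulkan instance.");
            return 1;
        }

        uint32_t gpuCount = 0;
        vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
        std::vector<VkPhysicalDevice> gpus(gpuCount);
        vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

        for (VkPhysicalDevice gpu : gpus) {
            // List the device extensions this driver advertises.
            uint32_t extCount = 0;
            vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
            std::vector<VkExtensionProperties> exts(extCount);
            vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

            bool rt = false;
            for (const auto& e : exts)
                if (std::strcmp(e.extensionName, "VK_KHR_ray_tracing_pipeline") == 0)
                    rt = true;

            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(gpu, &props);
            std::printf("%s: hardware ray tracing %s\n",
                        props.deviceName, rt ? "available" : "not advertised");
        }

        vkDestroyInstance(instance, nullptr);
        return 0;
    }

The point being: at the API level the extension is vendor-neutral; how fast each vendor's hardware services it is where the competition actually happens.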
Well, I just watched this Unreal 5 video and it does look impressive. It also says at the end of the video that an early pre-release version of UE5 is now available.
Can't wait to start learning Unreal, and then after that, Houdini. I tried to get the bridge working yesterday and start working on tutorials, but I couldn't get the script that ports Daz assets to Unreal to show up in the Script tab. I tried DIM and Daz Central. I need to get this figured out, but the tutorials I watched looked good.
That looks amazing, especially being rendered in real time on only an Xbox Series X. It's a yuge leap forward in engine technology, IMO.
Yeah, it certainly does look good. The problem at the moment is that the landscape is just a desert/arid biome. Would love to see what they do with forests and so on, and of course cities, towns, and villages.
The 'auto-adjustment' of the animations looks most interesting to me.
Just had a look at a starter scene yesterday with nearly nothing in it: about 40 fps with 5.0 where I got 120 fps with 4.26. The UI is totally new, all black, and I hope it can be customized in the future. Now on to playing with Nanite and Lumen and all the new stuff.
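For anyone else poking at Lumen in their own project rather than the demo: in the early-access build it is driven by renderer settings in the project's DefaultEngine.ini. A sketch of the relevant lines, with the caveat that the exact cvar names could shift between previews:

    [/Script/Engine.RendererSettings]
    ; 1 = Lumen for dynamic global illumination
    r.DynamicGlobalIlluminationMethod=1
    ; 1 = Lumen for reflections
    r.ReflectionMethod=1
    ; Mesh distance fields are required by Lumen's software tracing
    r.GenerateMeshDistanceFields=True

Nanite, by contrast, is enabled per static mesh in the mesh editor rather than through a global setting.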
Demos are nice, but please post some video made with Daz 3D assets and this new engine version.
Kevin, I'm really sorry to hear that. Glad you made it though, and it's great you can do art.
Thank you, Damsel - Slowly better but the right side is still lagging. I try to use my right hand more with basic things. Back to work thankfully but not like pre-stroke.