Iray - Titan X, Titan V, 1080Ti vs 2080/1180 RTX? (Turing Discussion)

Comments

  • branonig said:
    NewVision said:

    So it worked for you? I have an RTX 2080 as well and I haven't been able to get it to render anything in Iray at all (using DAZ 4.10 on Windows 10). Did you have to mess with any settings or anything? 

    Yes, no problem at all. Be aware that the driver and software may be an issue here. You should use driver 4.x (currently 4.11?) and at least CUDA 9. I did not do anything special and it worked. But as mentioned, on DAZ 4.10 it was not very impressive and I did not find any settings which could cause that.

  • DAZ_Rawb Posts: 817
    NewVision said:
    branonig said:
    NewVision said:

    So it worked for you? I have an RTX 2080 as well and I haven't been able to get it to render anything in Iray at all (using DAZ 4.10 on Windows 10). Did you have to mess with any settings or anything? 

    Yes, no problem at all. Be aware that the driver and software may be an issue here. You should use driver 4.x (currently 4.11?) and at least CUDA 9. I did not do anything special and it worked. But as mentioned, on DAZ 4.10 it was not very impressive and I did not find any settings which could cause that.

    4.10 doesn't have a new enough Iray to support the 2080 card, so the unimpressive speed on the renders was probably because that card was tossed out at render time and your CPU + other GPU were used instead.

     

    So for now, be sure to use the 4.11 public beta if you have a 20-series card.
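
    (A quick way to rule out driver-level problems, separate from anything Daz-specific: the standard CUDA runtime API can report whether the card is visible at all and what compute capability it has. Turing cards report 7.5, which an Iray build has to explicitly support before it will use them. The sketch below is only illustrative and is not taken from Daz Studio or Iray.)

        // deviceprobe.cu - minimal sanity-check sketch (hypothetical, not part of
        // Daz Studio or Iray): lists every GPU the CUDA runtime can see, with its
        // compute capability. Turing (RTX 20xx) reports 7.5; an Iray build must
        // ship support for that capability before it will render on the card.
        // Build with: nvcc deviceprobe.cu -o deviceprobe
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int count = 0;
            cudaError_t err = cudaGetDeviceCount(&count);
            if (err != cudaSuccess || count == 0) {
                std::printf("No CUDA devices visible (%s)\n", cudaGetErrorString(err));
                return 1;
            }
            for (int i = 0; i < count; ++i) {
                cudaDeviceProp prop;
                cudaGetDeviceProperties(&prop, i);
                std::printf("GPU %d: %s, compute capability %d.%d, %.1f GB VRAM\n",
                            i, prop.name, prop.major, prop.minor,
                            prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
            }
            return 0;
        }

    If a 20-series card shows up here with capability 7.5 but Iray still falls back to the CPU, the likely culprit is the Iray version bundled with that build of Studio rather than the driver.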

  • DAZ_Rawb:   "So for now, be sure to use the 4.11 public beta if you have a 20-series card."

    How do we know the 4.11 public beta supports the RTX 2080?  (Or do we?)  I want to purchase an RTX 2080, but not until I'm sure DS supports it.  Is there a planned release date for it?  I can't seem to find the info about it on the Daz3D site.

  • VEGA Posts: 86

    So my RTX 2080 Ti will be faster than your RTX 2080 + 1070? Or not?

  • I have similar questions.  I built a new PC this year with a Ryzen 2700X CPU and two GTX 1070 Ti cards, one for monitors and one for renders.  I can get an Iray preview image up in less than 10 seconds with a fairly complex scene; however, I am thinking about animation renders, either in DAZ or in UE4 or Unity with their real-time cinema SDKs, to make use of the RTX 2080's ray tracing abilities.  I can do interactive renders at high quality in little time versus standard photoreal, which still takes time.  So I want to move from DAZ to a game engine to animate the characters and capture cinematics.

    I'm looking to game engines to pick up on RT capabilities first.  If people don't make games that take advantage of the new RT ability of the cards, why get them?  I think if you are going to develop games and implement RT characteristics, then yeah, you need to get the new RT cards; if not, then wait 6-12 months until prices drop and the second generation comes out, and by then you will have a selection of games that take advantage of the RT cards.  

    Big question of Buy Now: are you going to develop using RT SDKs?  If yes, then yes, buy it now.  

    If you are just a gamer or working on renders, probably wait until the new tech is supported, and maybe not on the first run; let the bugs get worked out in 6-12 months.  

     

     

  • prixat said:

    I just watched the NVidia Twitch live stream where they announced the launch of the GeForce RTX 2070 ($499), RTX 2080 ($699) and RTX 2080 Ti ($999) on September 20th (pre-orders open now), and according to their infographics the RTX 2070 is light years ahead of the current-gen 1080 Ti and Titan series. I know real-life benchmarks will tell a slightly different tale, but I can't imagine that they would blatantly lie so boldly. They showed that it was between 4-6x faster than a 1080 Ti.

    Note that its claim is 4-6 TIMES, not percent, faster... :O

    You have to read those slides very carefully. They were talking about a specific function; in this case, 4-6x faster at ray tracing.

    That's a good point.  I often wonder the same when considering Quadro cards.  I sat through an Nvidia webinar on the new Quadros and asked about their performance vs GTX and RTX cards, but they wouldn't answer.  They had Hollywood studio guys in there bragging about the Quadros and showing off the movies they had made, but I think it comes down to the VRAM on some of the high-end Quadros that even puts them in the ballpark of the GTX cards.  

    On Cinebench, my system is shown as outperforming a Quadro P4000 system at nearly twice the render speed overall, but they score mine one point below it because its single core ran 2% faster, and they threw out my 16-core score, which again blew the other unit away.  So I have pretty much given up on building an Intel and Quadro PC workstation for renders and plan on building around a 1080 Ti or Titan from the GTX line, probably around Black Friday of this year.  I'm hoping the Threadripper 1950X goes on sale at that point; if not, I'll go with the AMD 2700X again.    

  • wolf-2473824 Posts: 14
    edited October 2018

    Guys... you have it all wrong.  Iray is no longer supported by NVIDIA.  They publicly announced a year ago that support for Iray was being handed over to the applications that used it.  https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/documents/Iray-plugin-FAQ-FINAL.pdf

    I did email DAZ about this and they officially stated that they would not take on that burden.  So 2080 cards, 2070 cards, Titan V, etc., are not going to magically work in Iray, NVIDIA is not going to make them work in Iray, and it's 100% up to DAZ if they want to do the work themselves. 

    So all of you pointing to NVIDIA, you are pointing in the wrong direction.  It's up to DAZ, so start emailing them... that's who you are paying anyway. 

    Post edited by wolf-2473824 on
  • Guys... you have it all wrong.  Iray is no longer supported by NVIDIA.  They publicly announced a year ago that support for Iray was being handed over to the applications that used it.  https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/documents/Iray-plugin-FAQ-FINAL.pdf

    I did email DAZ about this and they officially stated that they would not take on that burden.  So 2080 cards, 2070 cards, Titan V, etc., are not going to magically work in Iray, NVIDIA is not going to make them work in Iray, and it's 100% up to DAZ if they want to do the work themselves. 

    So all of you pointing to NVIDIA, you are pointing in the wrong direction.  It's up to DAZ, so start emailing them... that's who you are paying anyway. 

    You need to read that document more carefully; it's talking about Iray plugin products, not Iray itself - specifically Iray for 3ds Max, Iray for Maya, Iray for Rhino, and Iray Server. None of that affects Daz users.

    Look at their answer to the last question (my emphasis):

    NVIDIA will focus on bringing GPU accelerated ray tracing technology to every rendering product out there. Therefore, it further invests into core rendering technology, like:

    • NVIDIA OptiX and real-time ray tracing platforms for next-generation GPU architectures.
    • MDL, the open Material Definition Language.
    • NVIDIA IndeX, our platform technology for visualization and computing of multi-valued volumetric and embedded geometry data.
    • NVIDIA Iray for physically based rendering platforms focusing on the CAD, product and architectural design markets.

     

     

  •   I did email DAZ about this and they officially stated that they would not take on that burden.  So 2080 cards, 2070 cards, Titan V, etc., are not going to magically work in Iray,

    People are already using Daz Studio on their new 2080/2080Ti's... so..???

  • JD_Mortal Posts: 760
    Reality1 said:
    JD_Mortal said:
    Air-conditioners BTU power consumption in watts: https://www.generatorjoe.net/html/wattageguide.html
     

    You could always move, and heat with them (in the winter). *wink*

    In the winter, I need the windows open in my apartment. I should sell the excess heat! (Oddly, it doesn't keep my coffee warm.)

    As for some RTX discussion... I think I read that even without the dedicated RTX hardware-splits, the RTX process will assist in non-RTX cards too. It seemed like RTX was actually just a code-hack, not an actual hardware component. The 2080s are just a sub-level gamer-grade version of the Titan V, at the core. (Awesome budget rendering card, compared to the overpriced Titan V. Minus the double precision for the most accurate rendering. Game-quality vs. professional. No one will notice, unless they look real, real, real hard, at distant objects.)

    https://www.pugetsystems.com/labs/hpc/NVIDIA-RTX-2080-Ti-vs-2080-vs-1080-Ti-vs-Titan-V-TensorFlow-Performance-with-CUDA-10-0-1247/

     

  • ebergerly Posts: 3,255
    edited October 2018

    I'm certainly not knowledgeable about the differences between OptiX and Iray, but my sense is that Iray is more of a full rendering application: all you need to do is buy it (as a plugin), build your interface around it, and make sure your scene data is converted into the Iray format, and it does all the rest. Which means it does all the rendering calculations and interfaces with the GPU via CUDA, and so on. 

    But it seems like Optix is a lower level solution, where you need to build your own renderer around it, and it has the libraries to do the ray tracing, etc, but it's not a fully configured rendering solution. In fact it's probably a big library of functionality that can be used in many scientific, rendering, etc., fields. 

    And what wouldn't surprise me is if NVIDIA decides to distance itself from Iray since it is for a smaller, specific market, compared to Optix, which can be applied to far more solutions other than 3D rendering. And I'm guessing that's why they didn't include Iray in the Turing architecture development, but focused on Optix instead. And that's why there was never any discussion of Iray benchmarks. And perhaps why the RTX cards are so pricey...maybe they're not directed at the consumer doing renders or playing video games, but rather at a bigger market of production studios and scientific and business users who have the resources to develop applications from a lower level.

    And what also wouldn't surprise me is if the existing Iray developers are scrambling to figure out how they can move away from Iray, and maybe switch to a new, complete rendering solution. I'm guessing that to develop a complete renderer using Optix would be quite a challenge.

    Though I could be totally wrong on all of that. But bottom line, if I were trying to guess the future of Studio-type rendering, I'd prepare for at least a year or two of sticking with whatever I have now (GPUs and Studio software), and not expect any major developments to come along. But if my predictions on the RTX 20xx releases are any indication, I'm not being skeptical enough. *laugh*   

    Post edited by ebergerly on
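
    (To make the distinction above concrete: with a full product like Iray you hand over a scene and get pixels back, while a low-level library like OptiX expects you to supply the ray generation, intersection and shading logic yourself. The toy CUDA kernel below - not OptiX or Iray code, just a hypothetical illustration - shows roughly what that lowest layer looks like: one primary ray per pixel, tested against a single hard-coded sphere.)

        // raysketch.cu - deliberately tiny illustration (hypothetical, plain CUDA,
        // no OptiX) of the work a "build your own renderer" library leaves to you:
        // generate a camera ray per pixel, intersect it with geometry, write a result.
        // A product like Iray layers materials, lights, sampling and denoising on top.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void primaryRays(unsigned char* image, int width, int height) {
            int x = blockIdx.x * blockDim.x + threadIdx.x;
            int y = blockIdx.y * blockDim.y + threadIdx.y;
            if (x >= width || y >= height) return;

            // Pinhole camera at the origin looking down -z; one ray through each pixel.
            float u = (x + 0.5f) / width  * 2.0f - 1.0f;
            float v = (y + 0.5f) / height * 2.0f - 1.0f;
            float dx = u, dy = v, dz = -1.0f;
            float len = sqrtf(dx * dx + dy * dy + dz * dz);
            dx /= len; dy /= len; dz /= len;

            // Ray vs. unit sphere centred at (0, 0, -3); ray origin is (0, 0, 0),
            // so oc = origin - centre = (0, 0, 3). Hit if the discriminant is >= 0.
            float ocx = 0.0f, ocy = 0.0f, ocz = 3.0f;
            float b = 2.0f * (ocx * dx + ocy * dy + ocz * dz);
            float c = ocx * ocx + ocy * ocy + ocz * ocz - 1.0f; // radius 1
            float disc = b * b - 4.0f * c;                      // a == 1, d is unit length

            image[y * width + x] = (disc >= 0.0f) ? 255 : 0;    // white where the sphere is hit
        }

        int main() {
            const int w = 256, h = 256;
            unsigned char* d_img = nullptr;
            cudaMalloc(&d_img, w * h);

            dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
            primaryRays<<<grid, block>>>(d_img, w, h);

            unsigned char host[w * h];
            cudaMemcpy(host, d_img, w * h, cudaMemcpyDeviceToHost);
            cudaFree(d_img);

            // Dump a grayscale PGM so the silhouette can be eyeballed.
            FILE* f = std::fopen("sphere.pgm", "wb");
            if (!f) return 1;
            std::fprintf(f, "P5\n%d %d\n255\n", w, h);
            std::fwrite(host, 1, w * h, f);
            std::fclose(f);
            return 0;
        }

    Everything above that intersection test - materials, light transport, sampling, denoising - is what a finished renderer provides, which is why building one on top of a low-level library is a serious undertaking.
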
  • Kevin Sanderson Posts: 1,643
    edited October 2018

    DAZ_Rawb:   "So for now, be sure to use the 4.11 public beta if you have a 20-series card."

    How do we know the 4.11 public beta supports the RTX 2080?  (Or do we?)  I want to purchase an RTX 2080, but not until I'm sure DS supports it.  Is there a planned release date for it?  I can't seem to find the info about it on the Daz3D site.

    I'd still wait, but DAZ has had good luck so far with the latest DAZ Studio beta, as DAZ_Rawb said; ray tracing functionality is still a question mark - see more: https://direct.daz3d.com/forums/discussion/comment/3969916/#Comment_3969916

    Post edited by Kevin Sanderson on
  • NewVision Posts: 15
    edited October 2018
    VEGA said:

    So my RTX 2080 Ti will be faster than your RTX 2080 + 1070? Or not?

    I guess it will be approximately as fast as the 2080 + 1070 currently. That picture will change if Iray supports the ray tracing cores of the RTX cards, which, advertising says, perform 10 billion rays per second. Currently, I don't see a big advantage of the RTX cards over the GTX cards in performance. I would appreciate it if DAZ or Nvidia gave a statement about whether they will support the ray tracing cores and what the performance impact is. 

    Post edited by NewVision on
  • VEGA said:

    So my RTX 2080 Ti will be faster than your RTX 2080 + 1070? Or not?

    I guess it will be approximately as fast as the 2080 + 1070 currently. That picture will change if Iray supports the ray tracing cores of the RTX cards, which, advertising says, perform 10 billion rays per second. Currently, I don't see a big advantage of the RTX cards over the GTX cards in performance. I would appreciate it if DAZ or Nvidia gave a statement about whether they will support the ray tracing cores and what the performance impact is. 

    DAZ_Rawb said:
    NewVision said:
    branonig said:
    NewVision said:

    So it worked for you? I have an RTX 2080 as well and I haven't been able to get it to render anything in Iray at all (using DAZ 4.10 on Windows 10). Did you have to mess with any settings or anything? 

    Yes, no problem at all. Be aware that the driver and software may be an issue here. You should use driver 4.x (currently 4.11?) and at least CUDA 9. I did not do anything special and it worked. But as mentioned, on DAZ 4.10 it was not very impressive and I did not find any settings which could cause that.

    4.10 doesn't have a new enough Iray to support the 2080 card, so the unimpressive speed on the renders was probably because that card was tossed out at render time and your CPU + other GPU were used instead.

     

    So for now, be sure to use the 4.11 public beta if you have a 20-series card.

    Thanks for your answer! Hmm, there is a big difference in performance; DAZ 4.11 is much faster at rendering - but also on the 1070, which has no ray tracing cores. 
    So will DAZ Studio support the ray tracing cores of the 20x0 series / new Quadro series?

  • 10 billion rays per second from the RT cores does sound like a lot!

    How many is an ordinary 1080 Ti able to do, as a comparison of performance?

  • prixat Posts: 1,590
    edited October 2018
    j1039564 said:

    10 billion rays per second from the RT cores does sound like a lot!

    How many is an ordinary 1080 Ti able to do, as a comparison of performance?

    A fast Intel CPU (with Embree) ~100Mrays. 

    A 1080ti (with OptiX) ~1200Mrays.

    M6000 Quadro ~1500Mrays

    Post edited by prixat on
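
    (Taking those figures at face value - they are peak numbers measured under different conditions, so this is only rough arithmetic - the marketing claim of 10 billion rays per second works out to:

        10,000 Mrays/s (claimed RT cores)  /  ~1,200 Mrays/s (1080 Ti via OptiX)  ≈  8x

    which is in the same ballpark as the "4-6x faster at ray tracing" slides mentioned earlier, and it only materialises once the renderer actually dispatches its ray tracing to the RT cores.)
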
  • paulbungard Posts: 52
    edited October 2018

    To get Daz fully compatible with Turing GPUs, there are quite a few SDKs from Nvidia that need to be updated. Also, Iray uses OptiX, but if they're making a new version of Iray compatible with Turing, it won't be shown until March. The current version of Iray in the Daz beta does not officially support Turing; it just added compatibility for Volta.      

    Post edited by paulbungard on
  • olesehen Posts: 15
    edited October 2018

    To get Daz fully compatible with Turing GPUs, there are quite a few SDKs from Nvidia that need to be updated. Also, Iray uses OptiX, but if they're making a new version of Iray compatible with Turing, it won't be shown until March. The current version of Iray in the Daz beta does not officially support Turing; it just added compatibility for Volta.      

    That's interesting, so perhaps we will see a massive performance increase around that time. Right now, switching from the GTX 1080 to an RTX 2080 Ti, I easily cut my render times in half if not more (I haven't done extensive tests), but I am really impressed with the RTX card, and to get another increase from a future Iray update would be icing on the cake :-) 

    Post edited by olesehen on
  • Question about NVLink.

    Does anyone know how this works with Iray? NVLink is supposed to have "shared" memory; does this work with Iray?

    I.e., if you have two 2080 Tis with 11 GB each, does that give you 22 GB of VRAM to work with, or still only 11?

  • prixat Posts: 1,590

    Question about NVLink.

    Does anyone know how this works with Iray? NVLink is supposed to have "shared" memory; does this work with Iray?

    I.e., if you have two 2080 Tis with 11 GB each, does that give you 22 GB of VRAM to work with, or still only 11?

    Texture sharing across NVLink was introduced to Iray back in July. The big unknown is... does that apply to the cut-down version of NVLink on GeForce cards, or is it only available on the full NVLink Quadro cards?

  • prixat said:

    Question about NVLink.

    Does anyone know how this works with Iray? NVLink is supposed to have "shared" memory; does this work with Iray?

    I.e., if you have two 2080 Tis with 11 GB each, does that give you 22 GB of VRAM to work with, or still only 11?

    Texture sharing across NVLink was introduced to Iray back in July. The big unknown is... does that apply to the cut-down version of NVLink on GeForce cards, or is it only available on the full NVLink Quadro cards?

    The NVLink version on the consumer cards (RTX 2080/2080 Ti) is only half speed and does not support memory sharing - they are basically just SLI bridges with more bandwidth to handle the faster cards. The full-speed links on the RTX Quadro cards do support memory sharing.
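
    (For anyone who wants to poke at this on their own machine: peer-to-peer access between GPUs is the CUDA mechanism that NVLink memory sharing builds on, and the standard runtime API can report whether two cards can enable it. The sketch below is only a generic illustration; it says nothing about whether a particular Iray version will actually pool VRAM across the link.)

        // peercheck.cu - minimal sketch using only the standard CUDA runtime API:
        // reports, for every pair of GPUs in the machine, whether peer-to-peer
        // access can be enabled. NVLink memory sharing in a renderer requires
        // this at minimum, plus explicit support in the renderer itself.
        // Build with: nvcc peercheck.cu -o peercheck
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int count = 0;
            cudaGetDeviceCount(&count);
            if (count < 2) {
                std::printf("Fewer than two CUDA devices found.\n");
                return 0;
            }
            for (int a = 0; a < count; ++a) {
                for (int b = 0; b < count; ++b) {
                    if (a == b) continue;
                    int canAccess = 0;
                    cudaDeviceCanAccessPeer(&canAccess, a, b);
                    std::printf("GPU %d -> GPU %d peer access: %s\n",
                                a, b, canAccess ? "yes" : "no");
                }
            }
            return 0;
        }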
