RTX 3080 20GB Cancelled: Are 10GB Enough?

LenioTG Posts: 2,118
edited October 2020 in Daz Studio Discussion

Hi everyone!

I'm torn.

I was eagerly waiting to get an RTX 3080 20GB or an RTX 3070 16GB, but apparently they've been cancelled: https://www.pcmag.com/news/report-nvidia-cancels-rtx-3080-20gb-and-rtx-3070-16gb-graphics-cards

I know the RTX 3090 with 24GB exists, but €1500 is way too much for me to spend: I'm currently using a €320 RTX 2060, and I'd gladly upgrade to a €700 RTX 3080 10GB (once that's actually possible), but I can't go higher than that.

The question is: do you think 10GB will be enough in, let's say, 3 years?
PAs are using heavier and heavier textures.
Three years ago I actually managed to get by with 3GB of VRAM; nowadays I have to be careful even with 6GB.
And yes, you can optimize as much as you want, but ultimately that takes time.
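
To give a rough idea of where the VRAM actually goes, here's a back-of-envelope estimate in Python. It assumes uncompressed 8-bit RGBA maps and a guessed map count per character; Iray compresses textures and adds geometry and framebuffer overhead, so treat it as an order of magnitude, not a measurement:

    # Back-of-envelope VRAM estimate, assuming uncompressed 8-bit RGBA maps.
    # The map count per character is a guess; Iray's real usage differs.
    bytes_per_4k_map = 4096 * 4096 * 4               # ~67 MB per 4096x4096 map
    maps_per_character = 20                          # skin, clothing, hair...
    per_character_gb = bytes_per_4k_map * maps_per_character / 1024**3
    print(f"~{per_character_gb:.1f} GB of raw textures per character")
    print(f"Three characters: ~{3 * per_character_gb:.1f} GB before any environment")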

  • I don't want an NVLink setup with RTX 2000 cards because, as we've seen in the Iray benchmarks, the RTX 3000 cards crush them.
  • Still, the RTX 3080 doesn't support NVLink, so if an RTX 3080 Super with more VRAM doesn't come out next year, I'll be stuck with 10GB for a long time.
  • My PC is ready to mount two GPUs (X570 motherboard with 2x PCIe x16 slots, 1200W PSU), so I could still get the 3080 now and hope for a 3080S with more VRAM next year. (Indeed, if I put a 16/20GB GPU next to the 10GB one, at least the heavier renders wouldn't fall back to the CPU.)

What do you think I should do?
And how are you handling this VRAM shortage?

Post edited by LenioTG on

Comments

  • nicstt Posts: 11,715

    It was never announced as available, so we can't consider it cancelled; internal changes within an organisation are exactly that: internal.

  • LenioTG Posts: 2,118
    nicstt said:

    It was never announced as available, so we can't consider it cancelled; internal changes within an organisation are exactly that: internal.

    Still, many of us were waiting for it.

    That doesn't change the fact that we now have to work out what to do with the options actually available.

  • The VRAM shortage hasn't been confirmed, and no source in a position to know has come forward. I would just sit tight and not spend if you can't afford it. Adjust your budget to save for a 3090, which will still be cheaper than a Titan. Or really optimize your scenes and composite.

  • Robinson Posts: 751
    LenioTG said:

    What do you think I should do?
    And how are you handling this VRAM shortage?

    I would wait until 2021 because, as that somewhat misleading piece states, "it now seems more likely they will form part of a more substantial graphics card refresh in 2021".

  • Meraki Posts: 38

    You can't really do better than that for the price. The new AMD cards will ship with 16GB of GDDR6, but apart from those, only the Quadros, the Titan, or the RTX 2080 Ti have more VRAM, and those are really expensive... The RTX 3080 is your best shot.

  • LenioTG Posts: 2,118
    edited October 2020

    The VRAM shortage hasn't been confirmed, and no source in a position to know has come forward. I would just sit tight and not spend if you can't afford it. Adjust your budget to save for a 3090, which will still be cheaper than a Titan. Or really optimize your scenes and composite.

    It's not that: even if I had the money for a 3090, I wouldn't buy one, because the return on investment would take too long. So long that the RTX 4000 cards would be out first.
    In the early days I had a 2GB GPU, then a 3GB one, so I know how to optimize, but that takes time.

    Robinson said:

    I would wait until 2021 because, as that somewhat misleading piece states, "it now seems more likely they will form part of a more substantial graphics card refresh in 2021".

    I think they're going to release new GPUs every year now, because of AMD's schedule: the RTX 3000 now, the SUPER variants in about 12 months.
    But I don't think those will have 20GB of VRAM: if they did, it would be to counter the AMD launch.
    Beyond that, maybe they'd go for 12GB.

    The fact is that you don't need more than 10GB of VRAM to play games, and that's what most people buy GPUs for.
    As for us professionals, they always want to milk us by restricting features to the Titans and the like. :(

    Since an RTX 3080 would give me 4GB more VRAM and about 3x the performance I have now, I can't wait a whole year!

    But yes, I couldn't get an RTX 3000 before 2021 anyway: right now in Italy you can get an RTX 3080 for €300 over MSRP (€900-1000), delivered in January.
    I prefer to wait until prices become stable.
    They say the ASUS TUF RTX 3080 should have an MSRP of €750.

    Meraki said:

    You can't really do better than that for the price. The new AMD cards will ship with 16GB of GDDR6, but apart from those, only the Quadros, the Titan, or the RTX 2080 Ti have more VRAM, and those are really expensive... The RTX 3080 is your best shot.

    Be careful: AMD GPUs are useless for Iray!
    My hope is that they'll have enough stock to take the whole gaming market, leaving the RTX 3000s for us.

    They used to say the RTX 3080 20GB would cost €900-1000, and that would have been a nice price for it.

    It's not just that the Titan and the 2080 Ti are more expensive: they're also much weaker in Iray. It's time for us creators to get an RTX 3000.

    Post edited by LenioTG on
  • marble Posts: 7,500

    I'm in precisely the same boat as the OP, except that my hopes were pinned on a 16GB 3070: the 20GB 3080 would have been a huge challenge for my bank account, while a 3090 is out of the question.

    So I've been trying to glean from other discussions around this forum what my alternatives are. A couple of them take me out of DAZ Studio for rendering. Blender has Eevee and Cycles, which can use AMD cards. The Diffeomorphic bridge seems better than the DAZ option and allows for scene transfer to Blender. I just have not been able to reproduce Iray quality in Blender yet, but that is because I am not yet skilled with the Blender node system. However, anyone who claims that Diffeo material conversion can reproduce Iray quality must have altogether different eyesight from mine, because I just can't see it. Nevertheless, in Blender there's also out-of-core rendering and Render Simplify for Cycles, which help with scenes that would cause Iray to give up and drop to CPU.

    I just don't see the point in replacing my 8GB 1070 with another 8GB (or even 10GB) card when I am already optimising as much as I know how. As it is, I can't have more than 3 characters in a scene or use many of the sets I have bought (and often returned), because they exceed my VRAM even after running through Scene Optimizer. So if anyone has better suggestions for further optimisation, I would be happy to be educated. I've often seen comments such as "I have never had a problem with dropping to CPU - you guys need to learn how to optimise". Well, for those who claim that: speak up and tell us how.
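
    For what it's worth, the arithmetic behind that halving step is simple: a map's raw memory scales with the square of its resolution, so even one Scene Optimizer pass saves a lot. A quick sketch (uncompressed RGBA assumed, so only indicative):

        # Raw memory per uncompressed 8-bit RGBA map at each halving step.
        # Indicative only: Iray compresses textures, so real numbers differ.
        for res in (4096, 2048, 1024):
            mb = res * res * 4 / 1024**2
            print(f"{res}x{res}: ~{mb:.0f} MB per map")
        # Each halving is a 4x saving per map; two steps save 16x, but that
        # second step is where seams and artefacts tend to show up.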

  • Lenio, time is money. If you could swing getting a 3090 FE, it's $1500 and, in the US, only available online at Best Buy for now, but it's out of stock. It looks like you're going to have to wait no matter what. I would work on optimizing and compositing; that's the only way to make do with less. But if you could swing the $1500, your time is valuable, and keeping the render from dropping to CPU is well worth it if you render enough or animate.

  • Kevin Sanderson Posts: 1,643
    edited October 2020

    marble, have you ever tried rendering the set separately from the characters, or bringing the characters in with the set as a background image? It's similar to what Val at Dreamlight does with his movie maker sets. They render fast without much optimizing, and the results can look much better than Eevee.
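
    If you want to experiment with that kind of compositing outside of Daz, it only takes a few lines with Pillow. This is just a minimal sketch of the idea, with placeholder file names, assuming the character render has a transparent background and both images share the same resolution and camera:

        # Minimal compositing sketch with Pillow (pip install Pillow).
        # File names are placeholders; the character must be a PNG with alpha,
        # rendered at the same resolution and camera as the set.
        from PIL import Image

        set_render = Image.open("set_render.png").convert("RGBA")
        character = Image.open("character_render.png").convert("RGBA")

        combined = Image.alpha_composite(set_render, character)  # respects soft edges
        combined.save("final_frame.png")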

    Post edited by Kevin Sanderson on
  • nonesuch00 Posts: 18,284

    On Google's news aggregator, I had no sooner read about the 3070 16GB & 3080 20GB cancellation than I saw another article claiming there will now be a 3070 Super, built from cut-down 3080 & 3090 GPUs, before the new year.

    For me, all those rumours and problems mean one thing really: if you need to pinch pennies and you need an NVIDIA 30X0-series GPU, then you shouldn't be buying any 30X0 right now except the GeForce RTX 3090 24GB.

  • Sevrin Posts: 6,309

    It depends on what kind of scene you want to render. 

    Anyway, things are way too uncertain right now to give anyone advice other than to wait until they settle down.

  • LenioTG Posts: 2,118
    edited October 2020
    marble said:

    I'm in precisely the same boat as the OP, except that my hopes were pinned on a 16GB 3070: the 20GB 3080 would have been a huge challenge for my bank account, while a 3090 is out of the question.

    So I've been trying to glean from other discussions around this forum what my alternatives are. A couple of them take me out of DAZ Studio for rendering. Blender has Eevee and Cycles, which can use AMD cards. The Diffeomorphic bridge seems better than the DAZ option and allows for scene transfer to Blender. I just have not been able to reproduce Iray quality in Blender yet, but that is because I am not yet skilled with the Blender node system. However, anyone who claims that Diffeo material conversion can reproduce Iray quality must have altogether different eyesight from mine, because I just can't see it. Nevertheless, in Blender there's also out-of-core rendering and Render Simplify for Cycles, which help with scenes that would cause Iray to give up and drop to CPU.

    I just don't see the point in replacing my 8GB 1070 with another 8GB (or even 10GB) card when I am already optimising as much as I know how. As it is, I can't have more than 3 characters in a scene or use many of the sets I have bought (and often returned), because they exceed my VRAM even after running through Scene Optimizer. So if anyone has better suggestions for further optimisation, I would be happy to be educated. I've often seen comments such as "I have never had a problem with dropping to CPU - you guys need to learn how to optimise". Well, for those who claim that: speak up and tell us how.

    I'm actually one of those guys, in the sense that I rarely run out of VRAM with 6GB.
    But that's because I made my first 600 renders with less than 2.5GB of available VRAM, so optimizing comes naturally to me: I render with Render Queue, and I hide the characters and heavy objects that aren't being shown, etc.

    Still, that makes my workflow a bit slower.
    And more and more often, standard scenes require more VRAM.

    The fact is that, if I go for an RTX 3080, chances are I won't upgrade it for, let's say, 3 years: with how heavy the new products are becoming, I don't think 10GB will be as useful tomorrow as it is today.

    Thanks for telling me about this Simplify!
    For my workflow, I prefer not to leave Daz unless it's necessary. I can't even consider AMD GPUs if that means going to Blender every time; I couldn't even have a proper Iray viewport that way.
    By the way, just out of curiosity (I don't plan on using it much, and it's fast anyway): does Filament work on AMD GPUs too?

    Lenio, time is money. If you could swing getting a 3090 FE, it's $1500 and, in the US, only available online at Best Buy for now, but it's out of stock. It looks like you're going to have to wait no matter what. I would work on optimizing and compositing; that's the only way to make do with less. But if you could swing the $1500, your time is valuable, and keeping the render from dropping to CPU is well worth it if you render enough or animate.

    Thanks Kevin, I could already buy a 3090 at that price, but I won't, because it's not a wise choice for my online business!
    Especially in these harsh pandemic times: you never know when you'll need that money just to buy groceries.
    It wouldn't give me much more than a 3080, which costs less than half the price.
    If a thousand people subscribed to my Patreon tomorrow, I'd gladly get 4x 3090s... but I don't think that's going to happen! xD

    €700-800 for an RTX 3080 is definitely worth it: in the Iray benchmark it does 12 iter/s versus the 3.8 I'm getting now. And it's almost double the VRAM (though I rarely need more VRAM nowadays anyway).
    But in my particular situation, I can't justify spending €1500 on a 3090.
    I'd rather put the extra money toward a second 3080, going to 24 iter/s, instead of just 14 with a single 3090.
    No doubt it's a great card, but it's not for me.

    As I was saying above, I have an XX60 GPU right now, so going up to an XX80 is already a big step!
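
    To put those rates in perspective, here's what they mean per image at the 1500 iterations I typically render to (rough figures from the benchmark thread; real scenes vary):

        # Per-image render time at the Iray benchmark rates quoted above.
        # 1500 iterations is my usual per-image budget; rough figures only.
        rates = {
            "RTX 2060 (what I have now)": 3.8,   # iterations per second
            "RTX 3080": 12.0,
            "RTX 3090": 14.0,
            "2x RTX 3080": 24.0,
        }
        iterations = 1500
        for gpu, rate in rates.items():
            print(f"{gpu}: ~{iterations / rate / 60:.1f} min per render")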

    On Google's news aggregator, I had no sooner read about the 3070 16GB & 3080 20GB cancellation than I saw another article claiming there will now be a 3070 Super, built from cut-down 3080 & 3090 GPUs, before the new year.

    For me, all those rumours and problems mean one thing really: if you need to pinch pennies and you need an NVIDIA 30X0-series GPU, then you shouldn't be buying any 30X0 right now except the GeForce RTX 3090 24GB.

    I couldn't buy an RTX 3000 now even if I wanted to!
    In Italy, there's basically no stock.

    A 3070 Super would look tasty!
    I've found this article: https://www.techradar.com/news/is-a-new-nvidia-rtx-30-series-card-on-the-way-to-take-down-amds-big-navi
    But they're not saying how much VRAM it'll pack.

    Sevrin said:

    It depends on what kind of scene you want to render. 

    Anyway, things are way too uncertain right now to give anyone advice other than to wait until they settle down.

    I make comics!
    Let's say three fully clothed/haired G8 characters and a modern environment on screen.
    I'd like to be able to do that in three years as well.

    I know how to optimize very well (I made two whole comics with a GTX 1050 back in 2018), but I wouldn't mind not having to do that anymore.

    Post edited by LenioTG on
  • takezo_3001 Posts: 1,997

    The VRAM shortage hasn't been confirmed, and no source in a position to know has come forward. I would just sit tight and not spend if you can't afford it. Adjust your budget to save for a 3090, which will still be cheaper than a Titan. Or really optimize your scenes and composite.

    I did this very thing: I was originally saving up for a 2080 Ti, up until the point I learned about the 24GB 3090. But unless I can snag an FE next month, I'll most likely be waiting for stock to equalize in 2021, hopefully... The good news is that I plan to keep saving until then, so I can get myself a nice little nest egg!

  • marble Posts: 7,500

    marble, have you ever tried rendering the set separately from the characters, or bringing the characters in with the set as a background image? It's similar to what Val at Dreamlight does with his movie maker sets. They render fast without much optimizing, and the results can look much better than Eevee.

    I tried a long time ago (probably back when I was using Reality/LuxRender, which was very slow on CPU only). I found that I couldn't get the knack of positioning the characters against the background image. I recently bought those Movie Maker sets and had exactly the same problem. The promo images he uses are probably the few instances where the character looks correctly positioned.

  • marble Posts: 7,500
    LenioTG said:

    I'm actually one of those guys, in the sense that I rarely run out of VRAM with 6GB.
    But that's because I made my first 600 renders with less than 2.5GB of available VRAM, so optimizing comes naturally to me: I render with Render Queue, and I hide the characters and heavy objects that aren't being shown, etc.

    I just don't get that. As far as I can see, hiding does little to nothing to cut down the VRAM being used; I believe the textures are still loaded whether hidden or not. I use Scene Optimizer and reduce the textures by half. Some textures can stand another reduction step, but I generally find that seams begin to become obvious and artefacts begin to appear. I guess if you are OK with those drawbacks, then further texture size reduction is possible. Most of my scenes have 2 or 3 dressed characters, a room or exterior environment, and a few props such as furniture. I tend to replace as many textures as possible with Iray shaders, although I don't know for sure whether this makes much difference to the VRAM; it often helps with render times, though.

    I prefer to render at a 5:4 aspect ratio because scene composition is easier for me, and I set the resolution to 1600x1280, which is not too demanding. I then enlarge in post, with an AI denoiser if necessary.

    I would like to have scenes with 4 to 6 characters, but I think I would need 16GB of VRAM for that, which is why I was hoping for the 16GB 3070.
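
    As a rough sanity check on why the smaller resolution helps: at a fixed iteration count, render work scales roughly with pixel count (only approximately, but close enough for budgeting):

        # Why render small and enlarge: per-frame work scales roughly with
        # pixel count at a fixed iteration budget (approximate for Iray).
        small = 1600 * 1280          # what I render
        large = 3200 * 2560          # rendering at 2x in each dimension instead
        print(f"Rendering at 3200x2560 directly: ~{large / small:.0f}x the work")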

  • LenioTG Posts: 2,118
    marble said:
    LenioTG said:

    I'm actually one of those guys, in the sense that I rarely run out of VRAM with 6GB.
    But that's because I made my first 600 renders with less than 2.5GB of available VRAM, so optimizing comes naturally to me: I render with Render Queue, and I hide the characters and heavy objects that aren't being shown, etc.

    I just don't get that. As far as I can see, hiding does little to nothing to cut down the VRAM being used; I believe the textures are still loaded whether hidden or not. I use Scene Optimizer and reduce the textures by half. Some textures can stand another reduction step, but I generally find that seams begin to become obvious and artefacts begin to appear. I guess if you are OK with those drawbacks, then further texture size reduction is possible. Most of my scenes have 2 or 3 dressed characters, a room or exterior environment, and a few props such as furniture. I tend to replace as many textures as possible with Iray shaders, although I don't know for sure whether this makes much difference to the VRAM; it often helps with render times, though.

    I prefer to render at a 5:4 aspect ratio because scene composition is easier for me, and I set the resolution to 1600x1280, which is not too demanding. I then enlarge in post, with an AI denoiser if necessary.

    I would like to have scenes with 4 to 6 characters, but I think I would need 16GB of VRAM for that, which is why I was hoping for the 16GB 3070.

    The textures stay loaded once you've worked on a scene in a session, but Render Queue reloads the scene from scratch every time, so only what's actually visible gets loaded.

    You can notice this because unhiding something in a freshly loaded scene takes much longer than unhiding something you've already been working with in that Daz session.

    I render at 1920x1440 with 1500 iterations, then run it through the Intel denoiser.

  • Kevin Sanderson Posts: 1,643
    edited October 2020
    marble said:

    marble, have you ever tried rendering the set separately from the characters, or bringing the characters in with the set as a background image? It's similar to what Val at Dreamlight does with his movie maker sets. They render fast without much optimizing, and the results can look much better than Eevee.

    I tried a long time ago (probably back when I was using Reality/LuxRender, which was very slow on CPU only). I found that I couldn't get the knack of positioning the characters against the background image. I recently bought those Movie Maker sets and had exactly the same problem. The promo images he uses are probably the few instances where the character looks correctly positioned.

    Did you buy his tutorial? Looks like everything is placed well. It's on sale cheap now. https://www.daz3d.com/movie-maker-iray-maestro--video-tutorial

    Val says in one of the promo videos that, for example, a car is loaded into the scene at the 0, 0, 0 position, if that helps. Maybe you have to plan on loading your characters that way, too.

    Post edited by Kevin Sanderson on