Wait for the RTX 3080 20GB / 3090 or go for a Titan RTX?

Comments

  • nicsttnicstt Posts: 11,715

    It's a guess.

    It wouldn't be a good decision, imo. That sort of reactive response to AMD's products would irritate those who had just bought the card. I hope they don't do that.

    It is a guess .. but if Big Navi comes out and gets close to the 3080 (or, god forbid, beats it) and it does come with the rumoured 16GB .. then what else can Nvidia do? They can't just sit there for six months and let AMD have the performance crown while also offering more VRAM.

    AMD needs a win here; as long as they can at least compete with the 3080, then it will be big.

    Why don't you want AMD to beat Nvidia? You know that if that happens, it is likely to have a beneficial effect on card prices.

  • outrider42outrider42 Posts: 3,679

    Kinda holding out for a month or two here... but getting "the latest" isn't quite the same when it comes to laptops.

    I'm partially back on the road again (my team is splattered across half the planet, quite literally), so a mobile solution is my best bet, and lugging a desktop around ain't gonna cut it, so...  

    My 2018-vintage Acer Aspire 7 w/ GTX 1060 (6GB) GPU recently blew out its trackpad (but a little USB mouse fixes that for now - yay?) so I'm sniffing around, and find that I can get low/mid-end 2k-series RTX GPUs for a decent (well, relatively decent) price - around the $2k-$2.5k range. I know they're gonna come out with something more badassed around Black Friday, so I'm kinda stretching this little bugger for just awhile longer.

    The point is kinda simple: If you can hold out a little longer, do it. You'll thank yourself later.

    Given that the desktop 3080 uses around 320 Watts, any laptop versions of these cards would need to be drastically cut back to fit into a laptop's power budget. At that point they may not offer much more than the current Turing laptops do. So it would not make much sense other than the desire to market a new product. They did not release a 2080ti laptop at all, almost certainly because of power limits. With that in mind, I doubt the 3080 will even release for laptops since it uses even more power. Unless, again, they do it for marketing purposes.

    The most laptops might get is a 3060 or 3070 type of card, and again it would be cut back for power. The current 2080 laptops might remain the fastest available.

    But that is only my speculation. Maybe Nvidia has a ray traced rabbit in their hat somewhere.

  • nicsttnicstt Posts: 11,715

    outrider42 said:

    Given that the desktop 3080 uses around 320 Watts, any laptop versions of these cards would need to be drastically cut back to fit into a laptop's power budget. [...]

    Big Navi is supposed to have a very good performance-per-watt level. That will make things interesting.

  • fred9803fred9803 Posts: 1,564

    Given that the desktop 3080 uses around 320 Watts, any laptop versions of these cards would need to be drastically cut back to fit into a laptop's power budget.

    Good points, and also the heat issue. With more watts comes more heat, and with laptops I have no idea how they would deal with that problem apart from reducing the wattage, which would rather defeat the purpose of the 3080, as it needs that power level to work as intended.

  •  

    outrider42 said:

    Given that the desktop 3080 uses around 320 Watts, any laptop versions of these cards would need to be drastically cut back to fit into a laptop's power budget. [...]

     

    No, you have an excellent point there.

    I don't really rely on battery much, and IMHO most gaming/designer-style laptops toss the idea of practical battery life anyway (because they know full well that nobody's gonna rely on one for very long). I'm just happy that it (and its power brick) fits into my carry-on valise, which lets me either set it up standalone in the hotel room or (if they support it) Miracast it to the room's TV. It's an all-in-one thing for me - media center, movie machine, vidphone conference thingy with the missus and kids, occasional gaming machine, something to organize and back up the tourista pix with, and oh yeah - something I can mess around with DS on.

    BUT... the idea behind waiting a couple of months wasn't to magically get a 3k-series GPU in mobile format, but to get lower prices on a 2070 rigged one, or maybe get one with more RAM in it. New models come out, the n -1 models get discounted heavily, etc... 

     

  • GordigGordig Posts: 10,189

    outrider42 said:

    They did not release a 2080ti laptop at all, almost certainly because of power limits. [...]

    There may not have been 2080ti laptops, but there were laptops with Quadro RTX 4000 and 5000.

  • Gordig said:

    There may not have been 2080ti laptops, but there were laptops with Quadro RTX 4000 and 5000.

    The Quadro RTX 4000 and 5000 use roughly the same chip as the 2080 (and the 2070 Super and 2080 Super), the TU104, not the TU102 found in the 2080ti, the Titan RTX, and the Quadro RTX 6000 and 8000.

  • lilweeplilweep Posts: 2,561

    imagine not being able to wait like 2 days

  • outrider42outrider42 Posts: 3,679

    I was curious, so I looked up the 2080 laptop. The desktop 2080 is rated at 215 Watts. They managed to bring that down to 150 Watts for the laptop. I had forgotten that the 2080 Super was added to laptops, but it is nearly the same. The max power on any of these laptops is 200 Watts.

    The "Max-Q" variants of Turing have a TDP of just 80 to 90 Watts; it is pretty incredible that they scale them down that much. But to do so, the chip is clocked under 1000 MHz for 80 Watts; it pokes just above 1000 MHz for 90 Watts.

    Another issue is the physical size of the chip. The 2080 is 545 mm². The 3080 is 628.4 mm². The 2080ti was 754 mm², though; that thing was a beast. So the 3080 is smaller than a 2080ti, in spite of its extra TDP. So maybe there is hope for it showing up in a laptop after all, downclocked to like 1300 MHz or something.

    The laptop versions are often late to the party. This time, I think laptops might be even later because of how much they need to pare things down. It will be interesting to see what AMD does in the laptop space. Given their advantages with TSMC, I think it is fair to expect them to be even more competitive in laptops than on the desktop. Laptop makers are much more strict about what they will allow (usually). So unlike desktops, where power can be cranked up at will, a laptop generally has to fit under a certain power envelope. Watt for Watt, that should favor AMD this time around. Actually, if AMD cannot beat Nvidia in this segment, then something is seriously wrong with their GPU division. Like, for real, some heads need to roll if they can't beat Nvidia given how wide open the door is for them right now. AMD has a golden opportunity here.

  • Low-power Vega is pretty good at this point, and the Ryzen mobile APUs are quantifiably equal or superior to current-gen Intel chips in graphics performance, not that that's saying much. Actual discrete mobile Radeon GPUs aren't very common and I've seen next to no benchmarks; TBH I don't think I've seen any Navi ones (a quick check says the cards exist, but I couldn't find any reviews or benchmarks).

    The RDNA 2 APUs are in both consoles, and that should be a big tell. If they're good, then people will want them in laptops and as the base for cheap gaming desktops. We'll see in the next month to six weeks.

  • nonesuch00nonesuch00 Posts: 18,316

    Oh, those 30X0 GPUs will be showing up in laptops. Nvidia is not going to keep using the 2018 20X0 GPUs for another two years until the 40X0 GPUs come out.

  • nicsttnicstt Posts: 11,715

    Interesting.

  • DripDrip Posts: 1,206

    Oh, those 30X0 GPUs will be showing up on laptops. nVidia are not going to use 2018 20X0 GPUs for another 2 years until the 40X0 GPUs come out.

    Oh, they will probably show up. But whether there's a point to putting such a card in a laptop is a different matter. Even if it's just for bragging rights there will be demand, so chances are some niche manufacturer will make something ridiculously useless and expensive for that niche market. But laptop users who really need the power of these cards are probably better off looking into portable eGPU solutions, server-based renders, or cloud streaming.

  • marble said:
    kyoto kid said:

    ..the 3090, with more than double the cores of the Titan RTX, is priced $1,000 less.

    But the 3080 is quite close in performance and, if it does get 20GB, it will be hard to justify a 3090.

    The 3080 20G version was leaked via Gigabyte's internal roadmap; the release date is still unknown but will likely be 2021. A 20GB-24GB GPU under $2,000 has almost never happened before, so I will get a 3090 if it's available on launch day; it won't kill me to wait another month or two. But doing work on a 6GB GPU is just too time-consuming and always out of capacity.

  • Noah LGPNoah LGP Posts: 2,617
    edited September 2020

    Titan RTX ($2,500) seems to be as good as a RTX 3070 ($500)

    https://www.gpucheck.com/compare/nvidia-geforce-rtx-3070-vs-nvidia-titan-rtx/

    Post edited by Noah LGP on
  • windli3356windli3356 Posts: 239
    edited September 2020
    Noah LGP said:

    Titan RTX ($2,500) seems to be as good as a RTX 3070 ($500)

    https://www.gpucheck.com/compare/nvidia-geforce-rtx-3070-vs-nvidia-titan-rtx/

    As good in gaming performance? Yes, but 8GB of RAM is kind of irrelevant if you do rendering work. The Titan RTX was $1,200 more expensive than the 2080 Ti and had identical or worse gaming performance, but still sold a ton due to its massive VRAM capacity.

    24GB of VRAM for $1,500 is actually a good deal, considering there has never been a GPU with such VRAM capacity sold under $2k.

    Post edited by windli3356 on
  • kyoto kidkyoto kid Posts: 41,249
    Noah LGP said:

    Titan RTX ($2,500) seems to be as good as a RTX 3070 ($500)

    https://www.gpucheck.com/compare/nvidia-geforce-rtx-3070-vs-nvidia-titan-rtx/

    ..."apples and oranges" as the RTX Titan is closer to a pro grade GPU intended for rendering and deep learning research while the 3070 is a consumer grade GPU primarily targeted towards gaming where frame rate is much more important.  Also the RTX Titan is an older generation than the 3070, sort of like comparing a Pascal 1080 Ti to a Maxwell Titan-X. 

    There is also the matter of the RTX Titan having three times the VRAM which is far more important for rendering of complex scenes even though it has about 1,200 fewer CUDA cores.

    Of course the 3090 will surpass both and is only $1,000 more than the 3070.

  • Noah LGPNoah LGP Posts: 2,617
    edited September 2020
    windli3356 said:

    24GB of VRAM for $1,500 is actually a good deal, considering there has never been a GPU with such VRAM capacity sold under $2k. [...]

    For the price of one Titan RTX (24 GB) you can have three RTX 3070s (8 GB).

    I guess 3 GPUs (3 x 8 = 24 GB) is still better than 1 Titan.

    I see no reason to purchase a Titan RTX now; it's outdated (almost two years old, launched on December 18, 2018).

    You can already begin to save $63 every month in order to replace it with the RTX 4090 in 2022.

    Post edited by Noah LGP on
  • nicsttnicstt Posts: 11,715
    Noah LGP said:

    For the price of one Titan RTX (24 GB) you can have three RTX 3070s (8 GB). I guess 3 GPUs (3 x 8 = 24 GB) is still better than 1 Titan. [...]

    3 GPUs: 3 x 8 actually equals 8 here. VRAM can't be shared between cards.
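    For anyone wondering why the numbers work that way: in Iray-style GPU rendering the whole scene has to fit on each card, so the effective capacity is the minimum across cards, not the sum. A quick illustrative sketch (the card list is hypothetical):

    ```python
    # Each GPU must hold the entire scene, so VRAM does not pool across cards.
    cards_gb = [8, 8, 8]  # three hypothetical 8 GB cards

    naive_total = sum(cards_gb)   # the tempting "3 x 8 = 24" arithmetic
    usable = min(cards_gb)        # what a single scene can actually occupy

    print(naive_total, usable)  # 24 8
    ```

    Multiple cards still help, since they render the same scene in parallel, but they don't buy you a bigger scene.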

  • Noah LGPNoah LGP Posts: 2,617
    nicstt said:

    3 GPUs: 3 x 8 actually equals 8 here. VRAM can't be shared between cards.

    Oh ? Okay !

  • windli3356windli3356 Posts: 239
    edited September 2020
    Noah LGP said:

    For the price of one Titan RTX (24 GB) you can have three RTX 3070s (8 GB). I guess 3 GPUs (3 x 8 = 24 GB) is still better than 1 Titan. [...]

    But triple SLI isn't supported in most games, nor can the 8GB of RAM be pooled in DAZ or other 3D tools, and power-draw wise it won't be economical in the long run either. I never suggested the Titan RTX was a great deal; I only used it to reference the price gap between the 2080 Ti and the Titan back then, compared to the RTX 3090 vs 3080 now. The RTX 3090 at $1,499, with its 24GB of VRAM, certainly is a great deal for my hybrid rendering/gaming rig, if you're willing to overlook the gaming-performance part.

    Post edited by windli3356 on
  • Noah LGPNoah LGP Posts: 2,617
    edited September 2020

    There are some issues

    Nvidia’s RTX 3080 is reportedly crashing to desktop for some people
    https://www.pcgamesn.com/nvidia/rtx-3080-issues

    Post edited by Noah LGP on
  • Kinda holding out for a month or two here... but getting "the latest" isn't quite the same when it comes to laptops. [...] I can get low/mid-end 2k-series RTX GPUs for a decent (well, relatively decent) price - around the $2k-$2.5k range. [...]

    My system cost around $1,600 for a Ryzen 7 3700X, 32 gigs of RAM, and an 8 gig RTX 2060 Super, though I think there was a bit of a sale at the time. Newegg has a system builder that allows you to compare parts with prices and all that.

  • nicsttnicstt Posts: 11,715
    edited September 2020
    Noah LGP said:

    There are some issues

    Nvidia’s RTX 3080 is reportedly crashing to desktop for some people
    https://www.pcgamesn.com/nvidia/rtx-3080-issues

    Too much power; folks are not being careful enough, I suspect.

    I keep saying it here: make sure your bloody PSU can handle it. Don't presume your 850, never mind your 750, can.

    Personally, I think it's disgusting the amount of power it takes.

    I wonder how much better than the 2000 series it would be if it used the same amount of power.

    Post edited by nicstt on
  • Noah LGPNoah LGP Posts: 2,617
    nicstt said:
    Too much power; folks are not being careful enough I suspect. [...]

    But there is no issue with Nvidia Founders Edition

  • nicsttnicstt Posts: 11,715
    edited September 2020
    Noah LGP said:

    But there is no issue with Nvidia Founders Edition

    None so far - or none reported so far - and it appears to be built above the minimum spec that Nvidia released to vendors. I do think that as long as folks have a good 750, they are unlikely to have issues, but what if they use a Threadripper? Or have a second card, even an old one?

    Folks have a tendency to believe what is stated as gospel. They don't consider their own circumstances; I've seen posts from folks insisting they will be OK because they have the PSU spec stated by Nvidia.

    I have a 1200W PSU; it runs two 900-series cards and a Threadripper. I'll be making sure it can take a 3090 (presuming I go for one) along with the 980ti. I have no reason to believe it won't cope comfortably, but a few minutes of care is sensible and could save me both cash and the inconvenience of fixing a problem I could easily have avoided.

    Post edited by nicstt on
  • DripDrip Posts: 1,206
    nicstt said:

    I have a 1200W PSU; it has two 900 series cards and a Threadripper; I'll be making sure it can take a 3090 (presuming I go for one) with the 980ti as well. [...]

    And don't forget about other hardware: multiple HDDs (it never hurts to bring the HDD from your old rig to your new one, but keep in mind that it does take electricity), webcam, mouse and keyboard, sometimes loudspeakers, and plenty of people charge their mobile through the USB port of their computer. Most of that is small change in wattage, but it does add up. Then there are the pretty lights that are so popular these days, extra fans, maybe a DVD or Blu-ray drive. Most people will be a hundred Watts closer to their limit without noticing.
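    A back-of-the-envelope way to add this up before buying: total the rated draws and leave margin for transient spikes. Every wattage below is an illustrative guess, not a measured figure:

    ```python
    # Rough PSU sizing sketch; all component wattages are illustrative estimates.
    components_w = {
        "RTX 3080 (rated)": 320,
        "CPU under load": 180,          # e.g. a Threadripper-class chip
        "motherboard + RAM": 50,
        "drives, fans, USB, RGB": 60,
    }

    total_w = sum(components_w.values())
    margin = 1.25  # ~25% headroom for transient power spikes
    suggested_psu_w = total_w * margin

    print(f"estimated draw: {total_w} W, suggested PSU: {suggested_psu_w:.0f} W")
    ```

    With these made-up numbers the estimated draw is 610 W and the suggested supply is about 760 W, so even a nominal 750 is already borderline, which is the point being made above.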

  • nicstt said:

    Make sure your bloody PSU can handle it; don't presume your 850, never mind 750, can. [...]

    I think we're going to be seeing lots of posts about weird issues with the 3090 FEs once they get into the world. A simple boost under a full gaming-type load can exceed 375W, the full rated power delivery for the 12-pin plus the PCIe slot. The cables and the slot can handle that for a few seconds, assuming everything is in good shape, but if it isn't, stuff will overheat. Traces and power delivery on the mobo overheating - something basically no one gives a second thought to - could result in all sorts of issues.

    If I were buying a 3090 I'd wait until the AIB supply got better and get one with 3x 8-pins, which is adequate power delivery for these things. Anyone trying to run these off 2x 8-pins (what the 12-pin really is) is asking for trouble. The very first thing I'd do is undervolt it, but who knows what kind of hoops Nvidia has put up between the card's BIOS and users this time.
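    The 375W figure falls out of the connector ratings: the PCIe slot is specced for 75W and each 8-pin connector for 150W, so a 12-pin adapter fed by two 8-pins tops out at 75 + 2 x 150. A small sketch of that arithmetic:

    ```python
    # Rated continuous power delivery per the PCIe/ATX connector specs.
    PCIE_SLOT_W = 75    # PCIe x16 slot
    EIGHT_PIN_W = 150   # each 8-pin PCIe power connector

    def rated_delivery_w(eight_pin_count: int) -> int:
        """Slot power plus N 8-pin connectors, in Watts."""
        return PCIE_SLOT_W + eight_pin_count * EIGHT_PIN_W

    print(rated_delivery_w(2))  # 375 -- the 12-pin (2x 8-pin) case
    print(rated_delivery_w(3))  # 525 -- the 3x 8-pin AIB case
    ```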

  • nicsttnicstt Posts: 11,715
    I think we're going to be seeing lots of posts about weird issues with the 3090 FE's once they get into the world. [...] The very first thing I'd do is undervolt the thing.

    Agree that the undervolting makes sense. I certainly wouldn't want it boosting.

  • Noah LGPNoah LGP Posts: 2,617
    edited December 2020

    windli3356 said:

    24GB of VRAM for $1,500 is actually a good deal, considering there has never been a GPU with such VRAM capacity sold under $2k.

    I purchased two RTX 3070s; my scenes don't use more than 4 GB of VRAM on each card, so 24 GB would be useless in my case.

    Post edited by Noah LGP on