Nvidia Ampere (2080 Ti, etc. replacements) and other rumors...


Comments

  • Scan.co.uk already listing 30xx partner cards. Very little info. The Gigabyte 3090 Gaming looks to be dual slot. https://www.scan.co.uk/products/gigabyte-nvidia-geforce-rtx-3090-gaming-oc-24gb-gddr6x-ray-tracing-graphics-card
  • nicsttnicstt Posts: 11,715

    I still think the TI is something they can hold in reserve if Big Navi proves to be bigger than they are hoping/expecting. They may have little room at the moment to add extra performance to a TI model, but that doesn't mean it won't be possible as processes improve.

  • RayDAnt said:

    I think the 12 pin is brilliant. The AIBs need to get over themselves; they just don't want to change. Just look at the 12 pin on the 3080: it takes up so little space on the board. The wire in the cable is also supposed to be a heavier gauge.

    I mean, we've had 8 pins for so long now. Maybe it is time for a new standard. Just think if the power supply came with a 12 pin. One 12 pin uses a lot less cabling, so if the 12 pin started from the power supply, it would result in a little less clutter in the build. And two of these would certainly be better than 3x8 pins, which is madness. I am not looking forward to running so many cables.

    Having said that, I will be looking at EVGA. I want that extended warranty, and the customer service is top notch if it is ever needed. These GPUs are on Samsung's 8nm node, which is unproven. Word is that yields are bad, which I spoke about above, and that the chips leak power, which is why Ampere needs so much power. I would bet money that if Ampere were on TSMC it would not need 3x8 pin. There is a reason the A100 is built at TSMC; those are serious parts that need to be reliable and be the best. But gamers are getting the second-string backup here.

    So I do have a bad feeling that the failure rate of Ampere is going to be a little high, and I want peace of mind; EVGA has those awesome 5 and 10 year warranties. For the 3090, the 5 year would be $30 and the 10 year would be $60. On a $1500 GPU, that's nothing. Other places offer extended warranties as well, so anybody in this thread, I highly recommend you look at buying an extended warranty if you plan on keeping your cards longer than 3 years, the standard for manufacturer warranties. I am not even sure if I will keep mine or not. I have two 1080tis that are just 3.5 years old now, just barely past their original warranties, and here I am looking to upgrade, LOL.

    A German website released figures showing the 2080ti with a 5% RMA rate. Now, this may not mean they failed at 5%; they could have been RMA'd for any number of reasons. But that 5% was still higher than any other Turing GPU. I personally expect the 3090 to have a higher rate than that, possibly as high as 10%.

    And one more thing, for those bummed about the 3070 having 8GB, just know that Turing card prices are going to drop like stones. Why keep that 2080ti when a freakin 3070 beats it so easily? The used market is going to be flooded, and there are already reports of 2080tis being sold for as little as $500. I don't know if that is true, but I certainly find it hilarious. I have been telling people to sell their freakin cards before Ampere hit. They just lost a big opportunity. At any rate, their loss could be your gain. If Turing prices do drop like stones, you have the option of snatching them up and using Nvlink on the ones that support it. Of course, then Nvidia might release that 16GB 3070 after you buy two 2070 Supers, LOL.

  • The 12 pin is, to be blunt, crap. The wires are a thinner gauge, which means each one delivers less current, so there are more 12V lines in the bundle; an 8-pin PCIe connector has only three 12V pins, the rest are grounds. Thinner wires and smaller connectors mean lower MTBF. There is a reason the current standard was chosen. Nvidia seems to want their cards to die faster. Hence their presentation yesterday, where they were begging 1080ti owners to upgrade.

    And no, a 12 pin cannot deliver the same power as 3x8; it is equivalent to 2x8, not 3x8. The AIBs producing cards that need 3x8 are doing factory OCs or the like.
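
    To put rough numbers on that (a sketch using the commonly cited PCIe CEM limits of 75W for the slot and 150W per 8-pin; the 300W 12-pin figure is an assumption based on the "roughly two 8-pins" framing, not a published spec):

    ```python
    # Back-of-the-envelope power budgets from the commonly cited PCIe
    # CEM limits. These are connector spec maximums, not measured draw;
    # the 12-pin value assumes "roughly two 8-pins".
    SLOT_W = 75         # PCIe x16 slot
    EIGHT_PIN_W = 150   # one 8-pin PEG connector
    TWELVE_PIN_W = 300  # assumed 12-pin rating (~2x 8-pin)

    def board_budget(n8=0, n12=0):
        """Total spec power available to a card, slot included."""
        return SLOT_W + n8 * EIGHT_PIN_W + n12 * TWELVE_PIN_W

    print(board_budget(n12=1))  # 375 W -- single 12-pin, Founders style
    print(board_budget(n8=2))   # 375 W -- identical budget to 2x 8-pin
    print(board_budget(n8=3))   # 525 W -- the 3x 8-pin AIB boards
    ```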

    That whole secret untested Samsung fab is going to kill this whole thing, IMO. It usually takes a year to work the kinks out of a new process node. If Samsung's yields aren't good, then they are going to be getting a lot of chips destined for the future 3060s etc. and not the 3070s they want to sell today. If the goal was to stomp on Microsoft and Sony before they launch, it won't work if this is a paper-only launch. I'm certainly not enthused about getting those chips in Quadros, not with the power efficiency they seem to have from the numbers we saw yesterday.
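
    On the yield point, the classic back-of-the-envelope is the Poisson die-yield model, Y = exp(-A * D0). The defect densities below are purely hypothetical (Samsung publishes no such figure); the ~628 mm^2 GA102 die area is the commonly reported size:

    ```python
    import math

    # Poisson die-yield model: fraction of defect-free dies.
    # D0 values are illustrative only -- Samsung publishes no such
    # figure -- but they show how a big die amplifies yield pain.
    GA102_AREA_CM2 = 6.28  # ~628 mm^2, commonly reported GA102 size

    def poisson_yield(area_cm2, d0_per_cm2):
        return math.exp(-area_cm2 * d0_per_cm2)

    for d0 in (0.05, 0.10, 0.20):  # hypothetical defects per cm^2
        print(f"D0={d0:.2f}/cm^2 -> {poisson_yield(GA102_AREA_CM2, d0):.0%} good dies")
    # 73%, 53%, 28%: dies that miss full spec get cut down and binned
    # as lower-tier parts (future 3060s etc.) rather than sold as-is.
    ```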

    Personally, I've got a bot watching eBay for pairs of 2070 Supers. If I can get a matched pair for a decent price, under $800, I'm grabbing them. That would be better than anything likely to come out anytime soon.

  • droidy001 Posts: 282
    edited September 2020
    kenshaw, you don't think that sort of money would get you a pair of 2080tis in 2-3 months?
  • droidy001 said:
    kenshaw, you don't think that sort of money would get you a pair of 2080tis in 2-3 months?

    I doubt it. People always claim people are going to dump cards when the new line comes out. It never happens.

    Further, where's the incentive? What new game is coming along that will make someone just have to drop $1500+ on a 3090? Flight Sim 2020? I really can't see it. Cyberpunk 2077? Maybe, but by most accounts it isn't really that hard to run.

    There was a reason Jensen was targeting the 1080ti owners yesterday. I'd like to get the extra VRAM to do crowd scenes but I'm very happy with my render speed on my 1080ti/2070 combo and the 1080ti is a great gaming card. I'm in no hurry to upgrade.

  • droidy001 said:
    kenshaw, you don't think that sort of money would get you a pair of 2080tis in 2-3 months?

    I doubt it. People always claim people are going to dump cards when the new line comes out. It never happens.

    Further, where's the incentive? What new game is coming along that will make someone just have to drop $1500+ on a 3090? Flight Sim 2020? I really can't see it. Cyberpunk 2077? Maybe, but by most accounts it isn't really that hard to run.

    There was a reason Jensen was targeting the 1080ti owners yesterday. I'd like to get the extra VRAM to do crowd scenes but I'm very happy with my render speed on my 1080ti/2070 combo and the 1080ti is a great gaming card. I'm in no hurry to upgrade.

    There may not be a massive rush to dump 2080 Ti cards, but if a 3070 is equal to one, then surely a used 2080 Ti can't sell for more. It just wouldn't make sense to buy a used card for more than a new one of equal performance.
  • Slight correction Outrider - older TITANs (pre-Pascal) were also sold through board partners.

  • Visuimag Posts: 571
    edited September 2020

     

    RayDAnt said:

    No 3rd party ever sold Titans.

    Actually, pre-Pascal, TITANs were also sold through board partners/3rd parties.

    i53570k said:

    The Titans have always been very big cards. The 3090 is not much larger than the RTX Titan.

    TITAN RTX is the same size as the 2080 and 2080 Ti.
  • i53570k Posts: 212
    nicstt said:

    I still think the TI is something they can hold in reserve if Big Navi proves to be bigger than they are hoping/expecting. They may have little room at the moment to add extra performance to a TI model, but that doesn't mean it won't be possible as processes improve.

    Nvidia can easily remove NVLink and halve the VRAM on the 3090 and call it a 3080 Ti to profitably complete the top half of its product stack and squeeze Big Navi. That is, if Big Navi is not competitive against the 3080, or is competitive performance-wise but has less than the 16GB VRAM people assume. If Big Navi can match up against the 3080 and has 16GB, then Nvidia can easily counter with a 20GB 3080, but likely at the cost of some profit margin. The only scenario where Nvidia does not have a ready counter today is if Big Navi can beat the 3080 across the board and comes with 16GB. I really hope AMD can do it.

  • PerttiA Posts: 10,024
    nicstt said:
    PerttiA said:

    Surprisingly cheap, but... If there are no affordable 12-16GB models coming, I'd rather get another 2070 Super + NVLink (=16GB VRAM).

    Speed is no longer an issue, but the bloated models are, and they lead to a lack of VRAM.

    Bloated: in what way?

    Do you mean they are providing high resolution textures you have no need for, but others (myself included) want?

    Or geometry?

    Nvidia officially stated that geometry doesn't have much effect - and personal experience is that it doesn't take up much room until one starts to add shedloads of subD or equivalent.

    I'm installing everything manually, so I can see the size of textures and geometry at the time of installation, and the trend has been going up with no benefit in return.

    I can accept large amounts of data if it's there for a reason, but not something like a bracelet that has 10 times the vertices of G8, or 2GB worth of textures (compressed size) of which only 20% of the ones loaded with the model are actually used for something.
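
    To put a number on the texture side: compressed sizes on disk understate the real cost, since renderers typically hold textures decompressed in memory. A rough sketch (the map counts are hypothetical; Iray's actual in-memory layout may differ):

    ```python
    # Uncompressed cost of textures at render time: an RGBA image is
    # width * height * 4 bytes, regardless of its JPEG/PNG size on disk.
    def texture_mb(width, height, channels=4, bytes_per_channel=1):
        return width * height * channels * bytes_per_channel / 2**20

    one_4k = texture_mb(4096, 4096)
    print(f"one 4K RGBA map: {one_4k:.0f} MB")           # 64 MB

    # A hypothetical figure carrying 30 4K maps across its surfaces:
    print(f"30 such maps: {30 * one_4k / 1024:.1f} GB")  # ~1.9 GB

    # If only 20% of the loaded maps are actually used:
    print(f"wasted: {0.8 * 30 * one_4k / 1024:.1f} GB")  # ~1.5 GB
    ```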

  • tj_1ca9500b Posts: 2,057
    edited September 2020

    Possible wet blanket time...

    https://www.tweaktown.com/news/74915/exclusive-geforce-rtx-30-series-cards-tight-supply-until-end-of-year/index.html?utm_source=dlvr.it&utm_medium=twitter&utm_campaign=tweaktown

    If true, this'll mean grab it while you can, if you are lucky enough to be able to get one. I actually was expecting that something like this might happen, but yeah, huge grain of salt and all that. In any case, a number of us around here may be holding off until we get support for the new cards in Daz Studio. Last time it took about 4-6 months, depending on how you felt about using beta software vs the official release. It could be faster this time, but if the supply dries up quickly it'll be academic for the time being anyways.

  • Possible wet blanket time...

    https://www.tweaktown.com/news/74915/exclusive-geforce-rtx-30-series-cards-tight-supply-until-end-of-year/index.html?utm_source=dlvr.it&utm_medium=twitter&utm_campaign=tweaktown

    If true, this'll mean grab it while you can, if you are lucky enough to be able to get one. I actually was expecting that something like this might happen, but yeah, huge grain of salt and all that. In any case, a number of us around here may be holding off until we get support for the new cards in Daz Studio. Last time it took about 4-6 months, depending on how you felt about using beta software vs the official release. It could be faster this time, but if the supply dries up quickly it'll be academic for the time being anyways.

    Yea, I also think it's great for the hype machine if 'availability is low'.. QUICK QUICK.. don't wait on reviews.. get your pre-orders in now.

  • nicstt Posts: 11,715
    edited September 2020

    Anyone know if the 3090 will work on PCIe 3, or will it be PCIe 4 only?

    I know that components can work in older slot versions, but with reduced performance; I suspect, however, that they are incompatible, but want to be sure.

    Edit: they should work.

  • marble said:

    I am disappointed but not surprised that the 3070 is stuck with the same 8GB of VRAM that I have in my GTX 1070. VRAM is *the* limiting factor of IRay and NVidia continue to ignore this or push us towards the expensive end which is beyond the spending capacity of many hobbyists such as myself. I'm guessing that Optix and some of the other fancy features in the new cards will require even more VRAM so my scenes will soon be down to a couple of characters with texture sizes reduced beyond the point where the seams start to show. High hopes dashed for me but I can see the thrill of those with cash to burn.

    While saving up for the 3090 I had/have to live off $250 a month for 7 months come October, after bills, so I don't have cash to burn; it's just that I want that 24GB of VRAM and the monster performance boost I'll get when I can finally upgrade my 4-year-old 1080! So yeah, I feel ya, as I have to stare at my PC while a 100-frame animation takes nearly all of my VRAM and over 2 hours to render!

    With 24GB, that means I can actually have enough headroom to game/encode/etc. while my scenes/animations are rendering!
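
    (For the curious, the per-frame arithmetic, with the ~2x generational claim taken at face value; a sketch, not a benchmark:)

    ```python
    # Current per-frame render time, and a naive projection if the
    # marketing-claimed ~2x speedup actually holds for Iray.
    frames = 100
    total_hours = 2.0
    per_frame_s = total_hours * 3600 / frames
    print(f"now: {per_frame_s:.0f} s/frame")  # 72 s/frame
    print(f"claimed 2x: {per_frame_s / 2:.0f} s/frame, {total_hours / 2:.1f} h total")
    ```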

  • nicstt Posts: 11,715
    marble said:

    I am disappointed but not surprised that the 3070 is stuck with the same 8GB of VRAM that I have in my GTX 1070. VRAM is *the* limiting factor of IRay and NVidia continue to ignore this or push us towards the expensive end which is beyond the spending capacity of many hobbyists such as myself. I'm guessing that Optix and some of the other fancy features in the new cards will require even more VRAM so my scenes will soon be down to a couple of characters with texture sizes reduced beyond the point where the seams start to show. High hopes dashed for me but I can see the thrill of those with cash to burn.

    While saving up for the 3090 I had/have to live off $250 a month for 7 months come October, after bills, so I don't have cash to burn; it's just that I want that 24GB of VRAM and the monster performance boost I'll get when I can finally upgrade my 4-year-old 1080! So yeah, I feel ya, as I have to stare at my PC while a 100-frame animation takes nearly all of my VRAM and over 2 hours to render!

    With 24GB, that means I can actually have enough headroom to game/encode/etc. while my scenes/animations are rendering!

    I'm in a similar situation with regards to the RAM; performance gains for Blender, which is where I do my rendering, are (according to the marketing hype) double the 2000 series. I have a 980ti and a Threadripper (which is better than my 980ti in Blender), so I'm expecting big gains.

    I'd saved for a Titan, then moved house and needed a lot more cash than that, but I've saved for a Titan again, and Nvidia have reduced the price (now that's a first).

  • nicstt said:

    Anyone know if the 3090 will work on PCIe 3, or will it be PCIe 4 only?

    I know that components can work in older slot versions, but with reduced performance; I suspect, however, that they are incompatible, but want to be sure.

    Edit: they should work.

    PCIe is fully backward compatible. If these cards were coming out and would only work on B550 and X570 motherboards with Ryzen 3000 CPUs, it would be the first thing mentioned in every article on the cards.
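
    The practical difference is just link bandwidth: a PCIe 4.0 card negotiates down to 3.0 speeds in an older slot and otherwise works normally. A sketch with the usual headline per-lane figures (link maxima, not sustained application rates):

    ```python
    # Per-direction bandwidth of a x16 link by PCIe version, after
    # 128b/130b encoding: ~0.985 GB/s per lane at 3.0, doubled at 4.0.
    PER_LANE_GBS = {3: 0.985, 4: 1.969}

    for ver, gbs in PER_LANE_GBS.items():
        print(f"PCIe {ver}.0 x16: ~{16 * gbs:.1f} GB/s per direction")
    # PCIe 3.0 x16: ~15.8 GB/s; PCIe 4.0 x16: ~31.5 GB/s. For rendering,
    # this mostly affects scene upload time, not render speed itself.
    ```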

  • Ghosty12 Posts: 2,068
    nicstt said:

    Anyone know if the 3090 will work on PCIe 3, or will it be PCIe 4 only?

    I know that components can work in older slot versions, but with reduced performance; I suspect, however, that they are incompatible, but want to be sure.

    Edit: they should work.

    Yeah, that is the one thing for those of us out there who have PCI-E 3.0 systems; reading this article https://www.anandtech.com/show/16057/nvidia-announces-the-geforce-rtx-30-series-ampere-for-gaming-starting-with-rtx-3080-rtx-3090 leaves me in a somewhat easier position when I get around to getting one of these cards. I do not need to upgrade my whole system at the moment, as it is still serving me well. Although at the same time I could go for an AMD system, since they are the only ones manufacturing PCI-E 4.0 based components right now, unlike Intel, who it seems has no idea what it wants to do at the moment.

  • outrider42 Posts: 3,679
    droidy001 said:
    kenshaw, you don't think that sort of money would get you a pair of 2080tis in 2-3 months?

    I doubt it. People always claim people are going to dump cards when the new line comes out. It never happens.

    Further, where's the incentive? What new game is coming along that will make someone just have to drop $1500+ on a 3090? Flight Sim 2020? I really can't see it. Cyberpunk 2077? Maybe, but by most accounts it isn't really that hard to run.

    There was a reason Jensen was targeting the 1080ti owners yesterday. I'd like to get the extra VRAM to do crowd scenes but I'm very happy with my render speed on my 1080ti/2070 combo and the 1080ti is a great gaming card. I'm in no hurry to upgrade.

    You've...never really met a hard core gamer before, have you???

    Various forums are crawling with people looking for the 3090. These people want the best, period. They don't care what it costs. They don't care how much electricity it uses.

    Just having 60 fps is not good enough anymore. Just like you refuse to believe that even consoles will be doing 120 fps...when we've already had multiple games state outright that they will in fact do exactly that. There are new 360 fps monitors being released as we speak. Nvidia used super high frame rates as a selling point in their presentation; they know this is a big deal to a core group of gamers.

    And if that is not enough, why don't you just go look at eBay right now and check the completed auctions for the 2080ti. I looked just yesterday and some people were still asking $1000. Well, today that has dropped. I now see $750 and $700 as a common "buy it now" price, or best offer. In just a day these cards dropped $250 in value. You can also find auctions that ended at $500. For a 2080ti, people. These are not too common yet, but they have happened. These cards are coming from the people who want the best at any cost, so some of them don't really care if they sell for that low a price.

    Here are screencaps of several actual auctions that were sold.

    And here is a full 2080ti Nvlink setup, including the adapter. The price is higher, but I thought this was interesting.

    You can find these and more on eBay. One of the current auctions showed that the 2080ti was "trending at $1000" over the last 90 days. Hmmm, I think that trend is about to take a decided downturn.

    So don't say it never happens. It is quite literally happening right now. Go look for yourself. And it has happened many times before...that is why people say it in the first place. I bought my own 1080tis this way! Both of my 1080tis were purchased for less than $500 each, thanks to sellers who were buying 2080tis. You cannot possibly be more wrong about this stuff.

  • It's great marketing from Nvidia. Gamers didn't really go for the Titan, but call it a 3090, show it playing games at 8K 60 fps, and they all want it.
  • Kevin Sanderson Posts: 1,643
    edited September 2020

    Daz Changelog says DS is updated to NVIDIA Iray RTX 2020.1.0 http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log so it seems Daz will be ready when they do another release update.

    Iray developers blog says new cards are working with that version. https://blog.irayrender.com/
  • tj_1ca9500b Posts: 2,057
    edited September 2020

    Daz Changelog says DS is updated to NVIDIA Iray RTX 2020.1.0 http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log

    Iray developers blog says new cards are working with that version. https://blog.irayrender.com/

     

    That looks like potentially good news there, Kevin! I was going to wait for the supply and pricing situation to stabilize anyway, since it's a foregone conclusion that the hardware scalpers are going to hoover up everything they can and sell it at a huge markup. But it's nice to know that those lucky enough to grab a card at launch might have it work in Daz Studio out of the gate, rather than having a useless hunk of silicon laying around for a few months while they wait for DS native functionality.

    Of course, as Jack pointed out, until we see independent benchmarks we won't really know how stable or buggy the cards are, but I'm content to wait a few months to upgrade.  I can make do with my 1080 Ti for now...

    Still, potentially good news.  Thanks for sharing!

  • droidy001 said:
    It's great marketing from Nvidia. Gamers didn't really go for the Titan, but call it a 3090, show it playing games at 8K 60 fps, and they all want it.

    Sure, but the price drop from the $2,495 Titan to the $1,499 3090 doesn't hurt either, lol. I was looking at a $1,299 2080Ti before this.

  • droidy001 said:
    kenshaw, you don't think that sort of money would get you a pair of 2080tis in 2-3 months?

    I doubt it. People always claim people are going to dump cards when the new line comes out. It never happens.

    Further, where's the incentive? What new game is coming along that will make someone just have to drop $1500+ on a 3090? Flight Sim 2020? I really can't see it. Cyberpunk 2077? Maybe, but by most accounts it isn't really that hard to run.

    There was a reason Jensen was targeting the 1080ti owners yesterday. I'd like to get the extra VRAM to do crowd scenes but I'm very happy with my render speed on my 1080ti/2070 combo and the 1080ti is a great gaming card. I'm in no hurry to upgrade.

    You've...never really met a hard core gamer before, have you???

    Various forums are crawling with people looking for the 3090. These people want the best, period. They don't care what it costs. They don't care how much electricity it uses.

    Just having 60 fps is not good enough anymore. Just like you refuse to believe that even consoles will be doing 120 fps...when we've already had multiple games state outright that they will in fact do exactly that. There are new 360 fps monitors being released as we speak. Nvidia used super high frame rates as a selling point in their presentation; they know this is a big deal to a core group of gamers.

    And if that is not enough, why don't you just go look at eBay right now and check the completed auctions for the 2080ti. I looked just yesterday and some people were still asking $1000. Well, today that has dropped. I now see $750 and $700 as a common "buy it now" price, or best offer. In just a day these cards dropped $250 in value. You can also find auctions that ended at $500. For a 2080ti, people. These are not too common yet, but they have happened. These cards are coming from the people who want the best at any cost, so some of them don't really care if they sell for that low a price.

    Here are screencaps of several actual auctions that were sold.

    And here is a full 2080ti Nvlink setup, including the adapter. The price is higher, but I thought this was interesting.

    You can find these and more on eBay. One of the current auctions showed that the 2080ti was "trending at $1000" over the last 90 days. Hmmm, I think that trend is about to take a decided downturn.

    So don't say it never happens. It is quite literally happening right now. Go look for yourself. And it has happened many times before...that is why people say it in the first place. I bought my own 1080tis this way! Both of my 1080tis were purchased for less than $500 each, thanks to sellers who were buying 2080tis. You cannot possibly be more wrong about this stuff.

    LOL. Did you look at those?

    Most are from sketchy AF sellers. See the ones that say parts only. Those guys are only selling disassembled cards. I assume those are Chinese sellers selling parts from the actual factories. That's a lot of labor to get a cheap card. 

    I urge you to rush out and buy those. How are you at soldering SMTs? I haven't done it in 20+ years and would have to buy a soldering station.

  • outrider42 Posts: 3,679

    A little more info on EVGA from their forum mod:

    EVGATech_LeeM
    Also, to answer about half the questions in this thread:

    • Currently no pre-order planned.  Similarly, no pricing is available at this time.
    • Like the US store, we cannot generally comment on availability at this time for the EU store.  Availability, however, will not be any sooner than the dates NVIDIA mentioned for each GPU.
    • HYBRID, HC, and KPE cards will come a bit later than the XC3 and FTW3 cards.  No ETA to give you, which is probably your next question.
    • HYBRIDs will be 240mm, except for the KPE HYBRID, which will be 360mm.  No plans for a 120mm HYBRID at this time.
    • HYBRIDs will be 2 Slot.
    • XC3 cards will all be 2.2 slots.  Length is 11.23in. - 285.37mm / Height is 4.38in. - 111.15 mm.
    • FTW3 cards will be 2.75 slots.  Length is 11.81in. - 300mm / Height is 5.38in. - 136.75mm.
    • There will be an EVGA NVLINK.  No ETA at this time.
    • Since I've seen this mentioned incorrectly, all cards are 3 DisplayPort, 1 HDMI.
    • PowerLink is expected to work with the XC3 models.  For obvious reasons, it will not work with the 3090/3080 FTW3 cards due to the number of PCIe power connectors.  Either way, we will have the compatibility list updated when the cards are available on the website.
    • Step-Up will begin when cards are available.  Products will not be listed prior to general availability.  I would expect one of the XC3 models will be listed for availability, but we will make that decision prior to general availability.
    • I'm not sure why people assume bots are buying cards over regular people.  It was a popular reason for why 20-Series cards were always out of stock, which completely ignored literal supply and demand issues (lots of people wanted them, but there weren't many initially).  Yes, there are per person quantity caps for each card.  Yes, we require captcha to login, which is also required to create a profile.  This should help to comfort some of you.

    So the XC3 is indeed 2.2 slots, while the FTW3 is a hefty 2.75 slots. Still, it looks like all of EVGA's offerings are going to be less than 3 slots. They do not confirm which cards the NVLink will work with.

    In an additional post, they stated that the XC3 for both the 3080 and 3090 is the same physical size, so both are 2.2 slots and use 2x8 pin connectors.

    Here's more:

    I/O bracket for all models is a 2-slot bracket.  This will allow for a slightly wider card to fit in cases that some people had difficulty with on the 20-Series 2.75 slot cards.

    Thickness of the 3090/3080 XC3 models with backplate (XC3 Ultra/XC3) is 1.78in. - 45.1mm.
    Thickness of the 3090/3080 XC3 models without backplate (XC3 Black) is 1.61in. - 40.9mm.
     
    Thickness of the 3090/3080 FTW3 models with backplate (i.e. all of them) is 2.19in. - 55.55mm.
    Thickness of the 3090/3080 FTW3 models without backplate (manually remove) is 2.02in. - 51.35mm.
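
    (Sanity-checking those fractional slot counts against the quoted thicknesses, using the standard 0.8 in / 20.32 mm expansion-slot pitch; my conversion, not EVGA's:)

    ```python
    # A PCI bracket/slot sits on a 0.8 in (20.32 mm) pitch, so an
    # "N-slot" cooler should be roughly N * 20.32 mm thick.
    SLOT_PITCH_MM = 20.32

    for name, slots, quoted_mm in [("XC3", 2.2, 45.1), ("FTW3", 2.75, 55.55)]:
        calc = slots * SLOT_PITCH_MM
        print(f"{name}: {slots} slots -> {calc:.1f} mm (EVGA quotes {quoted_mm} mm)")
    # XC3: ~44.7 mm vs 45.1 quoted; FTW3: ~55.9 mm vs 55.55 quoted --
    # the fractional slot counts line up with the mm specs.
    ```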

     

  • Robinson Posts: 751
    edited September 2020

    Oh, 3070 Ti, 16GB?  Check out the Youtuber.

  • nicstt Posts: 11,715
    edited September 2020
    Ghosty12 said:
    nicstt said:

    Anyone know if the 3090 will work on PCIe 3, or will it be PCIe 4 only?

    I know that components can work in older slot versions, but with reduced performance; I suspect, however, that they are incompatible, but want to be sure.

    Edit: they should work.

    Yeah, that is the one thing for those of us out there who have PCI-E 3.0 systems; reading this article https://www.anandtech.com/show/16057/nvidia-announces-the-geforce-rtx-30-series-ampere-for-gaming-starting-with-rtx-3080-rtx-3090 leaves me in a somewhat easier position when I get around to getting one of these cards. I do not need to upgrade my whole system at the moment, as it is still serving me well. Although at the same time I could go for an AMD system, since they are the only ones manufacturing PCI-E 4.0 based components right now, unlike Intel, who it seems has no idea what it wants to do at the moment.

    I have a Threadripper PCIe 3 system and will be upgrading, but don't want the cost this year.

  • nicstt said:
    PerttiA said:

    Surprisingly cheap, but... If there are no affordable 12-16GB models coming, I'd rather get another 2070 Super + NVLink (=16GB VRAM).

    Speed is no longer an issue, but the bloated models are, and they lead to a lack of VRAM.

    Bloated: in what way?

    Do you mean they are providing high resolution textures you have no need for, but others (myself included) want?

    Or geometry?

    Nvidia officially stated that geometry doesn't have much effect - and personal experience is that it doesn't take up much room until one starts to add shedloads of subD or equivalent.

    I said the same thing in another thread, but I had to stand corrected when someone pointed out... dForce Hair. It's extremely dense geometry, and when subdivided it can take up gigs by itself. I had forgotten because I convert them to Blender hair particle systems, which are orders of magnitude lighter.

    I don't think NVidia knows about second dForce Hair.
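
    A rough sketch of why dense hair hurts: each Catmull-Clark SubD level multiplies the quad count by about 4. The starting count and per-vertex size below are hypothetical, for scale only:

    ```python
    # Growth of a dense hair mesh under subdivision: ~4x quads per
    # SubD level. Numbers are illustrative, not measured from Daz.
    base_quads = 2_000_000   # hypothetical dense strand-like hair mesh
    bytes_per_vertex = 64    # position/normal/UV/tangent, padded (assumed)

    for level in range(3):
        quads = base_quads * 4**level
        verts = quads        # roughly one vertex per quad on a closed mesh
        gb = verts * bytes_per_vertex / 2**30
        print(f"SubD {level}: ~{quads / 1e6:.0f}M quads, ~{gb:.2f} GB")
    # SubD 0: ~0.12 GB; SubD 2: ~1.91 GB -- how one hair eats "gigs".
    ```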

  • kyoto kid Posts: 41,256

    ...so it's official, just saw the Nvidia release this morning.  Still a little hazy, though, on whether the new NVLink bridge will function just as an SLI bridge or have full NVLink capability.  48 GB for $3,000 would be pretty intense, particularly at about half the price of the current Turing Quadro RTX 8000.  Would need a beefy PSU though, as a single 3090 is rated at 350W peak.

    fred9803 said:
    kyoto kid said:

    ..SLI? I thought they moved on to NVLink. So much for pooling memory.

    I think only the 3090s will be NVLink compatible. Personally, I'd never need 48GB of VRAM for any of the stuff I do.

    ...48 GB of VRAM would give me total "peace of mind" that the process would not dump to the CPU even with some of my more "epic" scenes.  However, from what I just read about the new NVLink bridge, it seems to only mention SLI.

    Even so, imagine having almost 21,000 cores on only two cards (of course, it would require a new MB, which effectively means a new system). Crikey, moving around in Nvidia view mode alone would be as fast as wireframe mode, not to mention how quickly even a relatively large scene would render. You would have almost the total core count of 5 RTX Titans or Quadro 6000s in just two cards.
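
    Checking that arithmetic against the announced core counts (noting that CUDA cores are not directly comparable across architectures, so this is counting, not a performance claim):

    ```python
    # CUDA core counts: RTX 3090 from the Ampere launch specs; Titan RTX
    # and Quadro RTX 6000 are existing Turing parts with 4,608 each.
    RTX_3090 = 10_496
    TITAN_RTX = 4_608

    pair = 2 * RTX_3090
    print(f"two 3090s: {pair:,} cores")              # 20,992 -- "almost 21,000"
    print(f"= {pair / TITAN_RTX:.1f}x a Titan RTX")  # ~4.6x, nearly 5 cards' worth
    ```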

    This also makes me wonder if the Titan marque may have reached the end of the line with Turing and the 3090 may be its replacement.

  • kyoto kid Posts: 41,256

    So I will most likely buy the 3070 16GB when and if it comes out, or an AMD Big Navi with 16GB or more RAM, if Big Navi has a close or better number of compute units and all that specialized mumbo jumbo the 3070 has. It looks very good now for quickly using all those models I've accumulated from DAZ 3D. Days of rendering for one 4K image (in DAZ Studio or Blender) are gone!

    ...but AMD doesn't support CUDA, so only rendering in Blender will benefit.

  • nicstt Posts: 11,715
    nicstt said:
    PerttiA said:

    Surprisingly cheap, but... If there are no affordable 12-16GB models coming, I'd rather get another 2070 Super + NVLink (=16GB VRAM).

    Speed is no longer an issue, but the bloated models are, and they lead to a lack of VRAM.

    Bloated: in what way?

    Do you mean they are providing high resolution textures you have no need for, but others (myself included) want?

    Or geometry?

    Nvidia officially stated that geometry doesn't have much effect - and personal experience is that it doesn't take up much room until one starts to add shedloads of subD or equivalent.

    I said the same thing in another thread, but I had to stand corrected when someone pointed out... dForce Hair. It's extremely dense geometry, and when subdivided it can take up gigs by itself. I had forgotten because I convert them to Blender hair particle systems, which are orders of magnitude lighter.

    I don't think NVidia knows about second dForce Hair.

    They are resource intensive - or some of them are; PhilW's, I've noticed, are not.

    How do you convert dForce and strand-based hair in Blender? (I have managed to use them, but it ends up being a ton of geometry.)

    I convert a lot of mesh hairs, and they make for some very nice styles. I even converted an Aiko 3 (yes, you read that right) a couple of weeks ago.
