Comments
I still think the Ti is something they can hold in reserve if Big Navi proves to be bigger than they are hoping/expecting. They may have little room at the moment to add extra performance to a Ti model, but that doesn't mean it won't be possible as processes improve.
The 12-pin is, to be blunt, crap. The wires are a thinner gauge, which means each one delivers less current, so they have to put more 12V lines in the bundle; an 8-pin PCIe connector only has 3 12V pins, the rest are grounds. Thinner wires and smaller connectors mean a lower MTBF. There is a reason the current standard was chosen. Nvidia seems to want their cards to die faster. That would explain their presentation yesterday, where they were begging 1080 Ti owners to upgrade.
And no, a 12-pin cannot deliver the same power as 3x 8-pin; it is the equivalent of 2x 8-pin, not 3x. The AIBs producing cards that need 3x 8-pin are doing factory OCs or the like.
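For what it's worth, here's the back-of-envelope math behind that argument as a rough Python sketch. The 150W-per-connector figure is the usual PCIe spec rating, and treating the 12-pin as fed from a 2x 8-pin adapter is an assumption taken from the post above, not a confirmed detail of the final cards.

```python
# Rough back-of-envelope numbers for the connector argument above.
# The ratings here are the commonly quoted spec figures and are
# assumptions for illustration, not measurements.

PCIE_8PIN_WATTS = 150      # PCIe spec rating per 8-pin connector
PCIE_8PIN_12V_PINS = 3     # only 3 of the 8 pins carry +12 V
VOLTS = 12.0

# Current per 12 V pin on a spec-compliant 8-pin connector
amps_per_pin_8pin = PCIE_8PIN_WATTS / VOLTS / PCIE_8PIN_12V_PINS
print(f"8-pin: ~{amps_per_pin_8pin:.1f} A per 12 V pin")    # ~4.2 A

# If the 12-pin is fed by an adapter from two 8-pin connectors,
# it can't deliver more than the two source connectors allow,
# regardless of how many 12 V pins the new plug has.
print(f"2x 8-pin adapter ceiling: {2 * PCIE_8PIN_WATTS} W")  # 300 W
print(f"3x 8-pin ceiling: {3 * PCIE_8PIN_WATTS} W")          # 450 W
```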
That whole secret, untested Samsung fab is going to kill this whole thing IMO. It usually takes a year to work the kinks out of a new process node. If Samsung's yields aren't good, then they are going to be getting a lot of chips destined for the future 3060s etc. and not the 3070s they want to sell today. If the goal was to stomp on Microsoft and Sony before they launch, then it won't work if this is a paper-only launch. I'm certainly not enthused about getting those chips in Quadros, not with the power efficiency they seem to have from the numbers we saw yesterday.
Personally, I've got a bot watching eBay for pairs of 2070 Supers. If I can get a matched pair for a decent price, under $800, I'm grabbing those. That would be better than anything likely to come out anytime soon.
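For anyone curious, a watcher like that can be pretty simple. This is only a rough Python sketch: the search URL, query parameters, and the price-parsing step are placeholders/assumptions, not a working eBay integration; a real version would go through eBay's official APIs and proper parsing.

```python
# Purely illustrative sketch of a "watch eBay for a cheap pair" bot.
# The URL, params, and parsing are placeholders, not real integrations.
import time
import requests

SEARCH_URL = "https://www.ebay.com/sch/i.html"     # public search page
QUERY = {"_nkw": "RTX 2070 Super"}                  # params are assumptions
MAX_PAIR_PRICE = 800.0

def extract_prices(html: str) -> list:
    """Placeholder: pull listing prices out of the results page.
    Real parsing (or an official API call) would go here."""
    return []

def check_once() -> None:
    resp = requests.get(SEARCH_URL, params=QUERY, timeout=30)
    prices = sorted(extract_prices(resp.text))
    if len(prices) >= 2 and prices[0] + prices[1] <= MAX_PAIR_PRICE:
        print(f"Pair found for ${prices[0] + prices[1]:.2f} total")

while True:
    check_once()
    time.sleep(15 * 60)  # poll every 15 minutes
```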
I doubt it. People always claim people are going to dump cards when the new line comes out. It never happens.
Further, where's the incentive? What new game is coming along that will make someone just have to drop $1500+ on a 3090? Flight Sim 2020? I really can't see it. Cyberpunk 2077? Maybe, but by most accounts it isn't really that hard to run.
There was a reason Jensen was targeting the 1080 Ti owners yesterday. I'd like to get the extra VRAM to do crowd scenes, but I'm very happy with my render speed on my 1080 Ti/2070 combo, and the 1080 Ti is a great gaming card. I'm in no hurry to upgrade.
Slight correction Outrider - older TITANs (pre Pascal) were also sold through board partners.
Actually, pre Pascal, TITANs were also sold through board partners/3rd Party.
TITAN RTX is the same size as the 2080 and 2080 Ti.
Nvidia can easily remove NVLink and halve the VRAM on the 3090 and call it a 3080 Ti to profitably complete its top-half product stack and squeeze Big Navi. That is, if Big Navi is not competitive against the 3080, or is competitive performance-wise but has less than 16GB VRAM as people assumed. If Big Navi can match up against the 3080 and has 16GB, then Nvidia can easily counter with a 20GB 3080, but likely at the cost of some profit margin. The only scenario where Nvidia does not have a ready counter today is if Big Navi can beat the 3080 across the board and comes with 16GB. I really hope AMD can do it.
I'm installing everything manually, so I can see the size of textures and geometry at the time of installation, and the trend has been going up with no benefit in return.
I can accept large amounts of data if it's for a reason, but something like a bracelet that has 10 times the vertices of G8, or 2 GB worth of textures (compressed size) of which only 20% of the ones loaded with the model are actually used for something, is hard to justify.
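To put a number on why that adds up, here's a quick Python back-of-envelope. The 4K map size and the counts are made-up illustrative figures, not measurements of any particular product:

```python
# Quick arithmetic on why texture bloat matters. The sizes and counts
# below are made-up illustrative figures.

def texture_bytes(width, height, channels=4, bytes_per_channel=1, mipmaps=True):
    base = width * height * channels * bytes_per_channel
    # A full mip chain adds roughly one third on top of the base level.
    return int(base * 4 / 3) if mipmaps else base

one_4k_map = texture_bytes(4096, 4096)
print(f"One 4K RGBA map: ~{one_4k_map / 2**20:.0f} MiB")    # ~85 MiB with mips

maps_loaded = 20          # hypothetical maps shipped with the item
maps_actually_used = 4    # the ~20% that the materials actually reference
wasted = (maps_loaded - maps_actually_used) * one_4k_map
print(f"Unused maps: ~{wasted / 2**30:.1f} GiB of potential memory")  # ~1.3 GiB
```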
Possible wet blanket time...
https://www.tweaktown.com/news/74915/exclusive-geforce-rtx-30-series-cards-tight-supply-until-end-of-year/index.html?utm_source=dlvr.it&utm_medium=twitter&utm_campaign=tweaktown
If true, this'll mean grab it while you can, if you are lucky enough to be able to get one. I actually was expecting that something like this might happen, but yeah, huge grain of salt and all that. In any case, a number of us around here may be holding off until we get support for the new cards in Daz Studio. Last time it took about 4-6 months, depending on how you felt about using beta software vs the official release. It could be faster this time, but if the supply dries up quickly it'll be academic for the time being anyways.
Yeah, I also think it's great for the hype machine if "availability is low"... QUICK QUICK... don't wait on reviews... get your pre-orders in now.
Anyone know if the 3090 card may possibly work on PCIe 3 or will it be only 4?
I know that components can work on older slot versions, but with reduced performance; I suspect, however, that they may be incompatible, but I want to be sure.
Edit: they should work.
While saving up for the 3090 I had/have to live off of $250 a month for 7 months come October, after bills, so I don't have cash to burn; it's just that I want that 24GB of VRAM and the monster performance boost I'll get when I can finally upgrade my 4-yr-old 1080! So yeah, I feel ya, as I have to stare at my PC while a render that eats nearly all of my VRAM takes over 2 hours for a 100-frame animation!
With 24GB, that means I can actually have enough headroom to game/encode/etc. while my scenes/animations are rendering!
I'm in a similar situation with regards to the RAM; the performance gains for Blender, which is where I do my rendering, are (according to the marketing hype) double the 2000 series. I have a 980 Ti and a Threadripper (which is better than my 980 Ti in Blender), so I'm expecting big gains.
I'd saved for a Titan, then moved house and needed a lot more cash than that, but I've saved for a Titan again, and Nvidia have reduced the price (now that's a first).
PCIe is fully backward compatible. If these cards were coming out and would only work on B550 and X570 motherboards with Ryzen 3000 CPUs, it would be the first thing mentioned in every article on the cards.
Yeah, that is the one thing for those of us out there who have PCIe 3.0 systems; reading this article https://www.anandtech.com/show/16057/nvidia-announces-the-geforce-rtx-30-series-ampere-for-gaming-starting-with-rtx-3080-rtx-3090 leaves me in a somewhat easier position for when I get around to getting one of these cards, as I do not need to upgrade my whole system at the moment since it is still serving me well. Although at the same time I could go for an AMD system, since they are the only ones manufacturing PCIe 4.0 based components right now, unlike Intel, who it seems have no idea what they want to do at the moment.
You've...never really met a hard core gamer before, have you???
Various forums are crawling with people looking for the 3090. These people want the best, period. They don't care what it costs. They don't care how much electricity it uses.
Just having 60 fps is not good enough anymore. Just like you refuse to believe that even consoles will be doing 120 fps...when we've already had multiple games state outright that they will in fact do exactly that. There are new 360 Hz monitors being released as we speak. Nvidia used super high frame rates as a selling point in their presentation; they know this is a big deal to a core group of gamers.
And if that is not enough, why don't you just go look at eBay right now and look at the completed auctions for the 2080ti. I looked just yesterday and some people were still asking $1000. Well, today that has dropped. I now see $750 and $700 as a common "buy it now" price, or best offer. In just a day these cards dropped $250 in value. You can also find auctions that ended at $500. For a 2080ti, people. These are not too common yet, but they have happened. These cards are coming from the people who want the best at any cost, so some of these people don't really care if they sell for that low of a price.
Here are screencaps of several actual auctions that were sold.
And here is a full 2080ti Nvlink setup, including the adapter. The price is higher, but I thought this was interesting.
You can find these and more on eBay. One of the current auctions showed that the 2080ti was "trending at $1,000" over the last 90 days. Hmmm, I think that trend is about to take a decided downturn.
So don't say it never happens. It is quite literally happening right now. Go look for yourself. And it has happened many times before...that is why people say it in the first place. I bought my own 1080tis this way! Both of my 1080tis were purchased for less than $500 each, thanks to sellers who were buying 2080tis. You cannot possibly be more wrong about this stuff.
The Daz changelog says DS is updated to NVIDIA Iray RTX 2020.1.0 http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log so it seems Daz will be ready when they do another release update.
The Iray developers' blog says the new cards are working with that version. https://blog.irayrender.com/
That looks like potentially good news there, Kevin! I was going to wait for the supply and pricing situation to stabilize anyway, since it's a foregone conclusion that the hardware scalpers are going to hoover up everything they can and sell it at a huge markup, but it's nice to know that those lucky enough to grab a card at launch might have it work in Daz Studio out of the gate, and won't have a useless hunk of silicon lying around for a few months while they wait for DS native functionality.
Of course, as Jack pointed out, until we see independent benchmarks we won't really know how stable or buggy the cards are, but I'm content to wait a few months to upgrade. I can make do with my 1080 Ti for now...
Still, potentially good news. Thanks for sharing!
Sure, but the price drop from the $2,495 Titan to the $1,499 3090 doesn't hurt either, lol. I was looking at a $1,299 2080Ti before this.
LOL. Did you look at those?
Most are from sketchy AF sellers. See the ones that say parts only. Those guys are only selling disassembled cards. I assume those are Chinese sellers selling parts from the actual factories. That's a lot of labor to get a cheap card.
I urge you to rush out and buy those. How are you at soldering SMT's? I haven't done it in 20+ years and would have to buy a soldering station.
A little more info on EVGA from their forum mod:
EVGATech_LeeM
Also, to answer about half the questions in this thread:
So the XC3 is indeed 2.2 slots, while the FTW3 is a hefty 2.75 slots. Still, it looks like all of EVGA's offerings are going to be less than 3 slots. They do not confirm what the Nvlink will work on.
In an additional post, they stated that the XC3 versions of both the 3080 and 3090 are the same physical size, so both are 2.2 slots and use 2x 8-pin connectors.
Here's more:
I/O bracket for all models is a 2-slot bracket. This will allow for a slightly wider card to fit in cases that some people had difficulty with on the 20-Series 2.75 slot cards.
Thickness of the 3090/3080 XC3 models with backplate (XC3 Ultra/XC3) is 1.78in. - 45.1mm.
Thickness of the 3090/3080 XC3 models without backplate (XC3 Black) is 1.61in. - 40.9mm.
Thickness of the 3090/3080 FTW3 models with backplate (i.e. all of them) is 2.19in. - 55.55mm.
Thickness of the 3090/3080 FTW3 models without backplate (manually remove) is 2.02in. - 51.35mm.
Oh, a 3070 Ti with 16GB? Check out the YouTuber.
I have a Threadripper PCIe 3 system and will be upgrading but don't want the cost this year.
I said the same thing in another thread, but I had to stand corrected when someone pointed out... dForce hair. It's extremely dense geometry, and when subdivided it can take up gigs by itself. I had forgotten because I convert them to Blender hair particle systems, which are orders of magnitude lighter.
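A rough Python illustration of why that happens: the base vertex count and bytes-per-vertex are made-up example numbers, but the roughly 4x growth per subdivision level is the general behavior of Catmull-Clark style subdivision.

```python
# Rough illustration of why dense strand/dForce hair balloons when
# subdivided. Base count and per-vertex size are example figures only.

base_vertices = 1_500_000          # hypothetical dense hair mesh
bytes_per_vertex = 48              # position + normal + UV + tangent, roughly

for level in range(4):
    verts = base_vertices * 4 ** level   # ~4x growth per subdivision level
    print(f"SubD {level}: ~{verts:,} verts, "
          f"~{verts * bytes_per_vertex / 2**30:.1f} GiB of geometry data")
# By SubD 2-3 the geometry alone is already in the gigabytes.
```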
I don't think NVidia knows about second dForce Hair.
...so it's official, I just saw the Nvidia release this morning. Still a little hazy, though, on whether the new NVLink bridge will function just as an SLI bridge or have full NVLink capability. 48 GB for $3,000 would be pretty intense, particularly at about half the price of the current Turing Quadro RTX 8000. It would need a beefy PSU though, as a single 3090 is rated at 350W at peak output.
...48 GB of VRAM would give me total "peace of mind" that the process would not dump to the CPU even with some of my more "epic" scenes. However, from what I just read about the new NVLink bridge, it seems to only mention SLI.
Even so, imagine having almost 21,000 cores on only two cards (of course it would require a new MB, which effectively means a new system). Crikey, moving around in Nvidia view mode alone would be as fast as wireframe mode, not to mention how quickly even a relatively large scene would render. You would have almost the total core count of 5 RTX Titans or Quadro 6000s in two cards.
This also makes me wonder if the Titan marque may have reached the end of the line with Turing and the 3090 may be its replacement.
...but AMD doesn't support CUDA so only rendering in Blender will benefit.
They are resource intensive - or some of them are; PhilW's, I've noticed, are not.
How do you convert dForce and strand-based hair in Blender? (I have managed to use them, but it ends up being a ton of geometry.)
I convert a lot of mesh hairs, and they make for some very nice styles. I even converted an Aiko 3 (yes you read it right) a couple of weeks ago.