Comments
Like I said, I don't know of a program that does 100% memory pooling, so the "if" secures my point. If I saw a benchmark of Studio using 100% scaling, then that would be different.
Gotcha! Thank you for the insight; what you're saying makes sense. I looked up the NVIDIA A100, and Engadget says it's "not for me", so I guess I'll just keep waiting as well HAHA.
In all honesty, I picked up that Quadro 8000 hoping it would last me for at least 4 years... even though I know the speeds will lack in comparison, I shouldn't have any need for more VRAM, and since it's designed to work nonstop (which is pretty much what I have it doing), I'm not that concerned about it dying out. I know most people think it's crazy, but to me, if I can get a return of about 800 dollars a year over the spread of 4-5 years, it was worth it. I guess, realistically, I'm not in any rush to see new ones released. I just like looking at shiny things!
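To put rough numbers on that break-even logic, here is a minimal Python sketch; the card price and lifespan are assumptions for illustration, only the $800/year figure comes from the post above.

```python
# Rough break-even sketch for a workstation GPU purchase.
# The card price and lifespan are illustrative assumptions; the $800/year
# return is the figure mentioned in the post above.

card_cost = 4000.00        # assumed purchase price (USD)
return_per_year = 800.00   # expected extra income attributed to the card (USD/year)
lifespan_years = 5         # assumed useful life before replacement

total_return = return_per_year * lifespan_years
annualized_cost = card_cost / lifespan_years

print(f"Total return over {lifespan_years} years: ${total_return:,.0f}")
print(f"Annualized cost of the card:     ${annualized_cost:,.0f}/year")
print("Breaks even" if total_return >= card_cost else "Does not break even")
```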
I think there's an ATI RAGE in the box in my closet. I didn't know what to do with it, but the person who gave it to me was very wise.
I still have a pair of GeForce 2 MXes in two builds. One of those builds still fired up the last time I powered it on...
My system needs have far outstripped those two computers anyway. Athlon Thunderbirds were good processors back in the day, but yeah, no comparison to today's hardware!
There are collectors for that sort of thing. If it's in good shape etc. there are people who might be interested in buying it.
NVLink can't achieve perfect scaling. There has to be some overhead. That's not just in DS. That's in every application that uses it.
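A toy model makes the point concrete: even a small fixed cost per frame for synchronization and memory traffic keeps a second GPU from ever reaching a clean 2x. The 5% overhead below is an assumed number for illustration, not a measured NVLink figure.

```python
# Toy model of multi-GPU render scaling with a fixed per-frame overhead.
# overhead_fraction is an assumed value, not a measured NVLink cost.

def effective_speedup(num_gpus: int, overhead_fraction: float) -> float:
    """Single-GPU render time is normalized to 1.0. Splitting the work across
    N GPUs divides that time by N, but adds a fixed cost for synchronization
    and memory traffic between the cards."""
    single_gpu_time = 1.0
    overhead = overhead_fraction if num_gpus > 1 else 0.0
    multi_gpu_time = single_gpu_time / num_gpus + overhead
    return single_gpu_time / multi_gpu_time

for n in (1, 2, 4):
    print(f"{n} GPU(s): {effective_speedup(n, 0.05):.2f}x (ideal: {n}x)")
```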
Nice AMA from the NVIDIA team here; it answers some of the questions people have had (why only 10GB of RAM, rasterization speeds, CUDA cores, etc.)
https://www.reddit.com/r/nvidia/comments/ilhao8/nvidia_rtx_30series_you_asked_we_answered/
Yes, well, it's already been stated by multiple folks that those prior leaks weren't happening either; but I'm inclined to believe those leaks were nothing more than the typical business-journalistic collusion that attracts readers and hypes a product that actually is in the works. Why doesn't AMD get the same sort of "leak" coverage even though they "leak" too? Because those leaks don't draw in nearly as many readers. I still remember when there were many more Apple leak sites than actual Apple products to leak about.
I finally threw my ATI Rage away along with a 2006 Dell PC and 2008 MacMini. The space was more important than the tech. I still have my 2003 Toshiba Portege M205 Portable Tablet though.
@gerster Note that Blender hair particle systems can, and indeed should, replace dForce hair; they are much, much more efficient than representing the hair as geometry, and hair is perhaps the only obvious reason to care about geometry memory usage next to texture memory usage. At least for Blender, the limitation you correctly point out is really not that big a deal.
But is there anything else in Daz Studio that can be a geometry memory hog?
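As a back-of-envelope illustration of why strand hair stored as real mesh geometry gets heavy so fast, here is a small sketch; every count and size in it is an assumption for illustration, not a number from dForce or Blender.

```python
# Back-of-envelope estimate: strand hair as mesh geometry vs. as a curve/particle
# system. Every count and size here is an illustrative assumption.

strands = 100_000                   # assumed number of hair strands
segments_per_strand = 16            # assumed subdivisions along each strand
verts_per_ring = 4                  # a mesh strand needs a tube/ribbon cross-section
bytes_per_vertex = (3 + 3 + 2) * 4  # position + normal + UV as 32-bit floats

mesh_verts = strands * (segments_per_strand + 1) * verts_per_ring
curve_points = strands * (segments_per_strand + 1)  # a curve stores only control points

mesh_mb = mesh_verts * bytes_per_vertex / 1024**2
curve_mb = curve_points * 3 * 4 / 1024**2            # just positions, roughly

print(f"Mesh hair:  ~{mesh_verts:>10,} vertices, ~{mesh_mb:6.0f} MB")
print(f"Curve hair: ~{curve_points:>10,} points,   ~{curve_mb:6.0f} MB")
```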
I'm aware. I wasn't expecting anyone to show me 100% scaling in DAZ and my very first post made it clear that NVLINK wouldn't scale 100%.
I know it is popular to diss New Jersey, but damn dude, I wouldn't say they are part of China yet. And oh, this one has 100% positive feedback, too. But I guess that sounds so sketchy! I mean, he is in New Jersey. You sure seem to assume a lot of things.
Here are some other 100% working 2080tis selling from the United States. These are just a small sample.
So I didn't check on the parts-only ones on the first quick look. OK, sure. But there are PLENTY more fully functional cards out there in the $500-700 range. Now remember, this was a $1200 card. $1200 vs $550... hmmm. The fact is people are getting rid of their cards. Cards are flooding the market, and when you have a flood of products hitting a market, prices drop. That's Business 101. I am sure you have observed this in your field of work, correct? Some of these are extremes, but it is very easy to find a used 2080ti for $600-700. At $650, that is only about half the original price (or less, as many 2080tis sold for well more than that). Just last week these same cards were almost universally going for $1000 and up on eBay. They have lost a lot of value in just days. Something must have happened a few days ago to cause this wild fluctuation. Maybe I am just speculating?
BTW, many of these 2080tis should still have a warranty on them, depending on brand. Since they launched in Sept 2018 they should still have 1 year of warranty left on them. Though buyers might want to double check that the brand they want allows warranties to transfer. I know that EVGA allows it, from my own experience. I believe MSI does as well, I have not checked other brands. So if you do buy one and it croaks on you, you still have a recourse at least with EVGA. No soldering required! Just postage. I paid around $20 to ship my dead 1080ti across the country to EVGA, and they paid to ship it back. And again, I bought that card used off ebay for $500. I never gave EVGA a dime.
These prices may not hold for long. I do think that once people start getting their new cards, prices will stabilize a bit. But they will never get back close to $1000, not when the 3080 blows the 2080ti away so badly. The 3070 may lack 3GB of VRAM, so the 2080ti still holds that advantage over it. But if the 16GB version of the 3070 is ever released, oh boy. All things considered, I think that if the 3070 16GB is ever released, it will only cost about $100 more. If that happens, and you have a 16GB GPU that is faster than the 2080ti and costs only $600, then 2080ti prices will absolutely plummet. Why would anybody want a 2080ti after that point?
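To make that comparison concrete, here is the arithmetic from the post in one small sketch; the 16GB 3070 entry is purely hypothetical, since no such card had been announced.

```python
# Price-per-GB comparison using the approximate figures from the post above.
# The 3070 16GB entry is purely hypothetical (assumed MSRP + ~$100).

cards = {
    "2080 Ti (launch price)":    (1200, 11),
    "2080 Ti (used, 2020)":      (650, 11),
    "3070 8GB (announced MSRP)": (499, 8),
    "3070 16GB (hypothetical)":  (599, 16),
}

for name, (price_usd, vram_gb) in cards.items():
    print(f"{name:28s} ${price_usd:5d}  {vram_gb:2d} GB  ${price_usd / vram_gb:6.2f}/GB")

launch, used = 1200, 650
print(f"\nA used 2080 Ti at ${used} is about {used / launch:.0%} of its ${launch} launch price.")
```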
At that sort of price, the 2080ti is looking like a viable monitor card, allowing me to retire my 980ti and use a 3090 for rendering.
There may be some cards out there today, and the people who are selling them are, to put it as nicely as possible, chumps. If people believe Nvidia's marketing claims, I just can't help them. That 40-minute presentation: how many different times did Jensen, or the "streamer guy", essentially say "upgrade your 1080ti"? Was there any other point to the whole thing? Sure, they made a few passing nods to the folks, however few of them there are, who upgrade every cycle, but they were clearly pushing hard to get people to buy before the reviews drop. Why? If these things really crush my 1080ti that badly, then a whole new generation of games will come out and I'll have no choice. So again, why the hard sell?
BTW, what are those guys doing for GPUs while waiting for the new cards?
Lenovo leaked a part number for a 3070 Ti with 16GB of video RAM. This might be the way to go. Most 3070s and 3080s will be 8GB, and the 3090 24GB, but at $1500 for a video card that is too overpriced in this economy.
Looking at Ebay here in Australia for 2080Ti's could cause heart palpitations as they are far from cheap.. lol https://www.ebay.com.au/sch/i.html?_from=R40&_trksid=m570.l1313&_nkw=2080ti&_sacat=0
You can say whatever you want about the people doing it, it does not change the facts. The fact is that people are selling these cards and flooding the markets. And the fact is that their value will only continue to fall in the coming months as people get their hands on new cards and sell their old ones. This is only the beginning.
This dude builds computers for a living. He also does a youtube channel. He knows his stuff. He talks about the price drops on these cards, and how they could go even further.
Feel free to debate him if you like!
We do have a hands-on, thanks to Digital Foundry. If DF were lying, their reputation would be destroyed; gamers would turn on them in a heartbeat. It has been seen over 700K times, with 36K likes versus 945 dislikes.
Nvidia has also shown performance capture of Doom Eternal with the 3080 vs the 2080ti. If these numbers are doctored, then that would be false advertising, and grounds for a lawsuit.
Also take notice that this video of Doom has been viewed 1.3 million times in less than 24 hours. It also has 51K likes versus only 983 dislikes, which is an extremely high ratio for youtube. And that is doubly shocking for a gaming themed video, which almost always has fanboys fighting it out.
+1
https://www.scan.co.uk/products/3xs-vengeance-x8-intel-core-i9-10850k-comet-lake-32gb-ddr4-10gb-nvidia-rtx-3080-2tb-m2-ssd-win-10
First system I've seen available for pre-order
Cheapest I saw for sale in the UK was £400 on eBay.
I agree with you, but feel kensshaw has a point: short on details, lots of requests for 1080ti owners to damn well upgrade now.
Saw this...
Q: Content creator here. Will these cards be compatible with GPU renderers like Octane/Arnold/Redshift/etc from launch? I know with previous generations, a new CUDA version coincided with the launch and made the cards inert for rendering until the 3rd-party software patched it in, but I'm wondering if I will be able to use these on launch day using existing CUDA software.
Link
A CUDA update will be needed for some renderers. We have been working closely with the major creative apps on these updates and expect the majority (hopefully all!) to be ready on the day these cards hit the shelves.
SOURCE
https://www.nvidia.com/en-us/geforce/news/rtx-30-series-community-qa/
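For what it's worth, the check involved looks roughly like the sketch below: compare the installed driver against the minimum the renderer's CUDA build needs. The minimum version here is an assumed value for illustration, not Iray's or any other renderer's actual requirement.

```python
# Sketch of the startup check a GPU renderer might do: is the installed NVIDIA
# driver new enough for the CUDA build it ships with? The minimum version is an
# assumed value for illustration only.

import subprocess

MIN_DRIVER = (455, 0)   # assumed minimum driver for an Ampere-aware CUDA build

def installed_driver_version() -> tuple:
    """Ask nvidia-smi for the driver version, e.g. '456.38' -> (456, 38)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        text=True,
    )
    major, minor = out.strip().splitlines()[0].split(".")[:2]
    return (int(major), int(minor))

if __name__ == "__main__":
    driver = installed_driver_version()
    if driver >= MIN_DRIVER:
        print(f"Driver {driver[0]}.{driver[1]} should be new enough for this CUDA build.")
    else:
        print(f"Driver {driver[0]}.{driver[1]} is too old; the new cards won't render "
              "until both the driver and the renderer are updated.")
```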
Personally, I find any comments along the lines of "told you to sell it before the new ones come out" or "imagine being a 2080 TI owner" that you can find on YouTube videos now to have very little merit. I can't just sell my video card and then be without one for months. I need it. Also, this is stuff that kind of goes without saying: new products come out and old ones lose value, sometimes more, sometimes less. As long as you're not doing the obviously dumb stuff, like buying old right before new comes out (unless you literally do not care about losing money, of course), don't fret about it. Surely you got benefits that were worth it, otherwise you probably wouldn't have spent the money at the time.
These comments come from people who opted not to buy a 2080TI and now feel better about it. But the simple fact is they did not have the benefits of a 2080TI either. It's pure self validation and is certainly not going to make anyone go "oh my god had I only listened".
Lucky to see that sort of price here right now, as it seems not everyone is unloading their used GPUs at the moment, going by some of the prices...
RTX looking good. This game has better looking people in it.
That looks good, but it also looks like a cinematic sequence, not gameplay. If the gameplay looks that good, that would be very impressive.
That is gameplay. You should watch the Nvidia rollout. Pretty eye opening.
Wow, I just watched the Doom one; that was pretty crazy looking. I have to admit, I am not much of a gamer anymore. My reflexes aren't what they used to be lol. I smashed my last Xbox 360 controller playing, or more accurately getting my ass handed to me in, the last Dark Souls that was released. I used to love those games lol. The last two games I played were FF14 I think, the MMO one that has the techno guys trying to take over the world, and a modded-beyond-recognition Fallout 4.
I remember reading and watching videos about the 20xx series where people said don't buy it, hold on to your 10xx cards, as this is experimental, and if history is anything to go by, the generation after is going to be the big one. I bought a 2070 just because I wanted to see the RTX performance (I had a GTX 970 before anyway, and usually skip a generation). I was pretty amazed by it compared to the 970. But this is two generations later, so it should be a lot better!
I would wait a few more months before deciding which 30xx card to buy. There will almost certainly be higher-RAM "Super" versions released. At least I hope there are. The amounts of RAM on these cards don't seem quite right to me.
I bought my 2080ti in April, no regrets, and I'm in no hurry to replace it. Next time will most likely be in a few years, along with a new system...
I got $600 for my 1080ti..