
Comments
The whole 8K gaming thing has been debunked. Beyond a select few titles that have had full optimisation for 8K and are DLSS-enabled, 8K performance is horrible on a 3090. Unplayable. The 3080 will fare worse.
It was just a marketing gimmick, and they have copped a lot of flak for it. 8K gaming is not ready yet, so monitor manufacturers are not going to invest in it yet.
Does anyone have any benchmark results for the 3090 vs the 3080 in the Iray engine?
From what I've read of the 3090 vs the 3080's performance in other rendering apps, the 3090 is only 15%-20% faster than the 3080 even with a whopping 14GB more VRAM. I originally wanted a 3090, but the 3080 is now very tempting.
VRAM has zero effect on render speed if your scene fits into the VRAM on the card being used to render.
The only time VRAM comes into play for render speed is when what you are trying to render is too big for the VRAM available, so the render drops back to CPU mode.
That 15-20% speed increase you mentioned is purely down to the additional CUDA/RT cores etc. on the 3090.
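If you want to sanity-check the "does it fit" question before kicking off a render, here's a rough Python sketch using the pynvml bindings to NVML. The scene-size estimate is something you'd have to supply yourself (the 9.5 GB figure is just a made-up example); this isn't anything built into Daz Studio.

```python
# Rough sketch: compare an estimated scene footprint against free VRAM
# before committing to a GPU render. Requires the nvidia-ml-py (pynvml)
# package; the scene-size number is a hand-made estimate, not measured.
import pynvml

ESTIMATED_SCENE_GB = 9.5  # hypothetical: geometry + textures + Iray working set

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # values reported in bytes
free_gb = mem.free / 1024**3

if ESTIMATED_SCENE_GB <= free_gb:
    print(f"Scene (~{ESTIMATED_SCENE_GB} GB) should fit; {free_gb:.1f} GB free.")
else:
    print(f"Scene won't fit ({free_gb:.1f} GB free) - Iray would fall back to CPU.")

pynvml.nvmlShutdown()
```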
VRAM capacity has no direct effect on rendering performance (i.e. speed) in Iray rendering, just on the size/complexity of scenes able to be rendered on the GPU in question. The 3090 has 20% more processing cores than the 3080, which is where the performance uplift comes from. As to actual 3080/3090 Iray benchmarks: Daz has yet to release to the public the first 3000-series-supporting beta of Daz Studio, so no numbers yet (despite having forum users with the cards already).
I know. I was hoping someone with the card would test Iray performance in other apps that use the Iray engine, such as iClone or CC3, which could directly reflect Iray performance in Daz. Scene complexity is also a big thing for me, since I don't enjoy spending time cutting corners on every render I do. Still, I wish to read more test results.
Don't you think, though, that Nvidia would strike a mortal blow if they could release a 3080 with 20GB VRAM and NVLink for $1000 (+/- $100)?
I would have thought it would be about $999, and not less than $950. AMD, however, may have an influence on Nvidia's plans; I can't help but wonder if Nvidia rushed out the cards because of what AMD might be bringing to the table.
Ahh well, competition is great... Roll on the competition.
TBH, claiming it's for content creators, yet not providing the software that allows content creators to change RAM usage to suit their own needs, negates some of those claims. Sure, it is great for them, but they lose some RAM to Windows if they decide not to use it as a display card.
It is about 15% in every engine I've seen (funny how none of the examples use Iray - that tells me something, and is another reason I now use Blender for rendering), which is still worth having. If you're buying a 3090, then you're getting it for the VRAM, not the extra performance - unless, of course, one should simply get the best "'cause".
Whilst true, there is a caveat.
If it doesn't fit on the card, then it can affect performance; it can also increase the workload on the user, as they have to try and get it to fit. That can be quick, just by reducing texture sizes, or it can take much longer and result in stitching together images afterwards.
I'm not sure where I read it, but it was an online article by a technical journalist (probably seen via the Google News page or in the New York Post) claiming that at 8K, in 99% of situations (meaning you haven't got your face within half a foot of the 85" 8K screen), the human eye wouldn't notice the improvement from 4K to 8K. And they were talking about 85" TVs hanging on walls, so a 32" 8K gaming monitor is never going to be a thing as far as the human eye's ability to see the difference goes.
Outrider42 has already stated that he sees hardly any discernible difference between 1440p gaming monitors and 4K gaming monitors. Myself, I'm still using a 14-year-old 1080p 24" monitor, but my TV is a 55" 1080p, and if I'm super close, like within a couple of feet, I notice it's 1080p, but otherwise not really (still going to get a 4K 75" TV next year though, mostly for gaming).
You can look that up yourself. The viewing distance at which 8K is visibly different from 4K on screen sizes that make any sense as a PC monitor (sub-48") is so close you'd have to be Mr. Magoo to care. This is actually a big reason why 4K has been so slow to penetrate. 1080p is visibly better than 720p at normal viewing distances. 1440p is better than 1080p, but not all that much. 4K isn't better than either 1080p or 1440p for the average person with a smallish monitor.
You may see 8K TVs once the price is lower than that of an Italian sports car, but 8K computer monitors will just be oddities and flex items.
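If you want to run the numbers yourself, the usual back-of-envelope check is pixels per degree of visual angle: roughly 60 px/deg is about the limit of 20/20 vision, so anything past that at your seating distance is invisible. Here's a quick sketch; the 32" size, 0.6 m desk distance and 60 px/deg threshold are my own assumptions, not figures from the article above.

```python
# Back-of-envelope check: pixels per degree of visual angle for 4K vs 8K.
# ~60 px/deg is a common rule of thumb for the limit of 20/20 vision; the
# 32" panel size and 0.6 m viewing distance are assumed desk-setup values.
import math

def pixels_per_degree(diag_in, horiz_px, view_m):
    width_m = diag_in * 0.0254 * 16 / math.sqrt(16**2 + 9**2)  # 16:9 panel width
    pixel_pitch = width_m / horiz_px                           # metres per pixel
    deg_per_px = math.degrees(2 * math.atan(pixel_pitch / (2 * view_m)))
    return 1 / deg_per_px

for name, horiz in [("4K", 3840), ("8K", 7680)]:
    print(f'{name} 32" at 0.6 m: {pixels_per_degree(32, horiz, 0.6):.0f} px/deg')
# 4K already lands near the ~60 px/deg limit; 8K roughly doubles it,
# which is detail 20/20 vision can no longer resolve at that distance.
```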
Some 30-40 years ago, people were talking about a hardcore hi-fi enthusiast who had spent more than the price of his house on a system to play records at the best possible quality, and he had just one record to play... a test record...
I have had the pleasure of viewing a Samsung QLED 65" 8K screen. There really is no difference unless you are right on top of the screen. I do not have perfect eyes, but I do have an eye for this stuff since I look at TVs all the time. I was much more impressed by the $15,000 77" LG 4K OLED I saw, which has far less pixel density but still looked superior to me thanks to its perfect black. OLED has better lighting resolution. Maybe 8K might work a little better with OLED, given OLED's ability to control light at a pixel level. Sadly, I have not seen an 8K OLED yet. But 8K on an LCD screen with LED backlighting just isn't spectacular at all, IMO. The light bleed is a serious problem that washes out any supposed gains from 8K sharpness. It is actually dimmer than the 4K QLEDs, too, LOL. So yeah, I am a big fan of OLED. You cannot beat that perfect black. You know what they say about OLED... once you go black you can't go back! I think LG should totally use that in advertising OLED.
8K is only getting hyped by the TV manufacturers. The TV companies want something to push to get people to buy new TVs; that is all there is to it. Dude, even 4K got pushed by the TV companies more than anybody else. Hollywood HATES 4K. Did you know that most special effects are actually mastered in 2K resolution rather than 4K? They do that because of how much longer it takes to render the effects at true 4K. This is why some of you may have wondered why the 4K movie with practical effects looked better than the big blockbuster with CG effects. You are right, though maybe not for the reason you expected. So if Hollywood still is not making effects in 4K, then whoa, it will be a very long time before they do 8K.
There have been several attempts at studying 8k vs 4k, and in every case the only people that even saw a hint of difference had better than 20/20 vision.
Pretty much every gamer recognized the publicity stunt of the 8k claims. The 8K OLED TV that Nvidia gave Linus Tech Tips to test on cost $30,000. The simple fact that the TV cost that much demonstrates the absurdity of the whole thing. Huang introduced the 3090 as a Titan replacement, that was the very first thing they discussed with the card.
And yes, I did say that it is tougher to see the difference between 1440p and 4K at the size of a PC monitor. I can see it, but man, I need to look hard to see it. I do not miss it at all looking at my 1440p monitor. If I had a larger, TV-sized screen, then I would probably like to have 4K. The initial jumps are the most visible. Going from old-school 480 to 720 to 1080 are all big differences. The jump from 1080p to 1440p is immediately obvious to me on almost any size screen. But as you go above that you start to get diminishing returns. Personally, I think 1440p is a great sweet spot. For 4K, it really depends on your specific situation: how far away you are, how large the screen is, and of course, your eyes.
So 8K is a complete non issue here. Of more concern is what is going on with the stability issues in some 3080s.
Another note on possible 20GB 3080s. One thing that might also hold them back is waiting for 2GB GDDR6X chips to come out. Currently they only offer 1GB chips. The 3090 actually has 24 of them. That is also one reason why it uses so much power. The 3090 has a more complex and, more importantly, more expensive design. A 3080 with 20 VRAM chips on it would be ridiculous. The only way a 3080 gets 20GB is when they get the 2GB chips. Then it will be more cost effective. Since the 3080 is a $700 GPU, not a $1500 one, this would make sense. The 3090 at $1500 can afford to be ridiculous.
The only other possibility is if Nvidia switches things up and goes back to regular GDDR6. They wouldn't lose much in doing so, and it would be cheaper. You can get 2GB chips of GDDR6, too. If this is possible, and I really have no idea if it is, then we might see that happen. It seems to me like something is not quite right with GDDR6X. Testing has shown it can get very hot, and that is clearly the reason it is not clocked at its max frequency of 21 Gbps. Instead, the 3080 and 3090 are "only" at 19 and 19.5 Gbps. GDDR6 can get pretty close to that, so in a lot of ways GDDR6X has been a bit of a failure. I get the feeling that Nvidia was not expecting this. There were leaks months ago about Ampere running 21 Gbps VRAM. If Nvidia had just stuck with regular GDDR6, these cards would not be running quite so hot.
So if a 20GB 3080 releases, it could be using normal GDDR6, and in doing so it might actually use the same TDP as, or even less than, the 10GB 3080 with GDDR6X.
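To put rough numbers on the chip-count and bandwidth side of that, here's the simple arithmetic. The bus widths and data rates are the published 3080/3090 specs; the 20GB configurations and the 16 Gbps GDDR6 option are the rumoured/hypothetical variants being discussed here, not anything Nvidia has announced.

```python
# Back-of-envelope memory arithmetic for the chip-count argument above.
# Bus widths and data rates are the published 3080/3090 specs; the 20GB
# configs and 16 Gbps GDDR6 option are rumoured/hypothetical variants.
def chips_needed(capacity_gb, chip_gb):
    return capacity_gb // chip_gb

def bandwidth_gb_s(bus_bits, data_rate_gbps):
    return bus_bits // 8 * data_rate_gbps   # bytes per pin-clock * rate = GB/s

print("3090 24GB on 1GB chips:", chips_needed(24, 1), "chips (clamshell)")
print("Rumoured 3080 20GB on 1GB chips:", chips_needed(20, 1), "chips")
print("Rumoured 3080 20GB on 2GB chips:", chips_needed(20, 2), "chips")

print("3080, 320-bit GDDR6X @ 19 Gbps:", bandwidth_gb_s(320, 19), "GB/s")       # 760
print("3090, 384-bit GDDR6X @ 19.5 Gbps:", bandwidth_gb_s(384, 19.5), "GB/s")   # 936
print("Hypothetical 320-bit GDDR6 @ 16 Gbps:", bandwidth_gb_s(320, 16), "GB/s") # 640
```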
Yeah, don't buy any high-resolution TV unless it's OLED. The competing techs just don't match up.
You're right on GDDR6X. Nvidia said the lower speeds were yield related teething issues at the architecture day presentation. They didn't say how bad the yields were. If the modules are all running that hot then there are pretty flawed crystals in there for the frequencies the chips are running at.
Going to GDDR6 could be an issue in itself, as that means a new memory controller and new voltage regulation circuits. I had assumed the "Super" cards would just be the base reference PCBs with 2GB GDDR6X modules dropped in. The reference memory controller can handle them, and unless Micron has really screwed something up, they should not consume much more power than the 1GB chips. Not that Nvidia designing new reference cards is the end of the world, but it would take some time.
Interesting vid of someone changing the capacitors on a 3090.
I too have a Sony A1E 65" OLED 4K TV, and I just scored the new 12.4" Samsung S7 Plus tablet with Super AMOLED for watching TV when doing cardio at the gym.
That dude is either much richer, or much braver than me. You would never catch me anywhere near such an expensive piece of hardware with a soldering iron lol.
Yeh, it was a tough watch!
Roman is either an EE or has enough experience to be considered the equivalent. Of course no solder job on SMT components is without risk, but he's done stuff like that many hundreds if not thousands of times.
Now I would never spend the time, and potential risk, just for the gain that he got.
Also, people are reporting that the position of the NVLink connector on the 3090s is not standardized. So if you're getting two 3090s with the intention of connecting them, you'd best get two identical ones, right down to make and model.
Is there even a rough estimate as to how long it will be until the Ampere update? Going by past launches, even?
No, since we have no idea what factors will determine when it is ready.
Question for the NVLink gurus: are the RTX 2080 Ti / 2080 Super / 2080 / 2070 Super NVLink bridges standardized, or are each manufacturer's links different? TIA
To the best of my knowledge they are standardized for each bandwidth.
Thanks. I am still debating whether I want to pick up a couple of used 2070 Supers and an NVLink bridge, or wait for a mythical/hinted-at 3070 16GB / 3080 20GB. I keep thinking about an RTX 3090, but I am a hobbyist and cannot justify putting $1600 on the credit card and paying all that interest.
I personally think Nvidia have dropped the ball this generation with their memory configurations. They could end up looking silly. They made a big thing of saying that they'd spoken to game devs who said 10GB is enough for games - and then a system hog like Microsoft Flight Simulator drops, followed by RDNA 2 boasting 16GB in the top boards (probably/maybe) - which everybody and their dog has suspected for months.
A significant number of users will say: 16GB > 10GB = better.
In addition, due to the rumours of RDNA 2 boasting 16GB, it was also suspected that there must be bigger memory configurations of Ampere before launch... and then the leaks came: first from Gigabyte, whose line-up of boards clearly showed a 20GB 3080, a 16GB 3070, etc., and then Galax showing a roadmap slide with a 20GB 3080.
Now that's a problem for Nvidia. A quick straw poll : put your hand up if you were originally thinking about a 3080 10gb but are now holding out waiting for news on the 20gb version ... [raises hand]
There could be quite a significant number of people doing that because of: 20GB > 10GB = better.
The leaks might actually force Nvidia's hand and push them to release earlier than they planned.
If people are holding off waiting on a 20GB version, then that means fewer 10GB versions / 3090s will sell. It also means that if news on a 20GB 3080 drags on for a few months and Big Navi comes close to / equals / beats the 3080, then after waiting those few months some users might go back to the 16GB > 10GB = better equation because they don't want to wait any longer.
Will that upset some people if Nvidia do release a 3080 20gb so soon? Of course. But not that many, realistically.
The vast majority of users looking for an upgrade will be waiting for the 3060 / 3070, so a 20GB 3080 release doesn't affect them. The 3070 got delayed a couple of weeks until after the RDNA 2 announcement (hhhhmmmm...)
Of those who bought a 3080 - many would have been stretching their budget to make the $700 and wouldn't have spent an extra $200/$300 anyway. So they're not bothered.
A number of them would have spent the $700 as it was perfect for their budget. Yes, some of them would have stretched to the extra $200 or $300, but they got what they wanted at the price they were willing to pay.
The problem bunch are those who had the budget for the extra dollars and would have spent it if they had had the chance. They'll be upset.
As would those who reluctantly handed over $1500 for a 3090 but had hoped for something cheaper. They'll be upset too.
Now, there won't be that many of the above who'll be unhappy. Reading this thread, I get a sense of people saying everybody who bought a 3080 10gb WILL be annoyed if a 20gb version is released so soon.
No, just a small percentage of them will be IMHO.
First, Nvidia is right: 10GB is more than enough for every game on the market. Someone tested Flight Sim 2020 with a 3090 and didn't see any better performance than they got with a 3080. The extra VRAM didn't do anything.
In games where performance actually matters, no current game consumes more than roughly 8GB at 4K with all texture settings maxed out (IIRC that's Red Dead 2).
Nvidia and the AIBs have said their entire production capacity is wrapped up in the cards they have already announced. If you think the entire gaming community will not be pissed off if they say "well no, actually we were producing these other cards with twice the VRAM at only slightly higher prices, suckers!" then you do not understand people, and most especially do not understand lawyers.
Honestly, based on what they've done and what is going on, I'd guess sometime next spring they'll launch the high-VRAM cards as creator versions, or something like that, at price points between the 3080 and 3090. That keeps their product stack simple for the holiday season and gives them a launch six months after this one, which is something they like to do.
But this fantasy that there will be a 20GB 3080 launch in November? How would that make any sense? They'd get bad press for having held back the card just in case AMD had a 3080 competitor. Then the stories start flying that this is the reason stock of 3080s and 3090s was so short at launch. Then the bots buy all of them at launch and Nvidia gets another round of that to deal with.
Anyway, if the RDNA2 flagship was competitive we'd be seeing "leaked" benchmarks. Seen any recently?
How upset they are depends on the price difference; $300 is a significant difference, and many folks can't stretch to that extra.
I personally don't mind handing over $1400 for an FE; it has more RAM, more CUDA cores, and the option to be NVLinked. Nothing else has those three features - so far.
When 24GB is not enough...
https://www.nvidia.com/en-us/design-visualization/quadro/rtx-a6000/
48GB, can be NVLinked to pool the memory (96GB), 300W TDP, probably will cost an arm and a leg, coming soon...
Well, at least cards like that offer hope for the future, that such capabilities will eventually be in our hands for less than $1000. Without that, the development of 3D as a hobby will mostly come to a technical standstill.
The Quadro world is confusing, TBH. Sure, the raw compute compared to the RTX 8000 went up, but is that the only Quadro in the stack?
And Nvidia buried a detail about their Ampere racks for some reason: no RT cores, so those things are next to worthless for rendering; they might as well be V100s. I have no idea what is going on inside Nvidia, but this whole thing is fishy AF.
Pretty sure that rendering wasn't what Nvidia had in mind when designing Ampere racks. It wasn't even mentioned at their announcement.