Daz Prayers Answered? 4060 Ti w/ 16GB VRAM!

I am liking this rumor a lot!

Nvidia GeForce RTX 4060 Ti 16GB Rumored to Have a 165W TDP

That would be pretty sweet... especially if it is SUB $500!
 


Comments

  • Torquinox Posts: 3,335

    That sounds pretty cool!

  • nonesuch00 Posts: 18,131

    pretty awesome

  • Richard Haseltine Posts: 100,946

    It is, however, a rumour not a fact - it has already been mentioned in a couple of threads. Rumour seems to have the same attitude to memory size that many celebrity look-alike makers have to cup size.

  • kyoto kid Posts: 41,057

    ...nice, I don't need to drop $1,000 on an RTX A4000 to get 16 GB.  Indeed, it's about time.

    Not as many cores as the A4000, but an improvement over my old Titan X that's been soldiering along for me.  I actually have a 3060 12 GB (EVGA), but my MB's BIOS is so old it won't support it, so it is sitting in the box waiting until I can afford to upgrade the base hardware.

    Too bad EVGA ceased production of GPUs as they had excellent QA and the best warranty (3 years w/full replacement).

    Sub-$500 would make it very tempting.

  • Torquinox Posts: 3,335

    Richard Haseltine said:

    It is, however, a rumour not a fact - it has already been mentioned in a couple of threads. Rumour seems to have the same attitude to memory size that many celebrity look-alike makers have to cup size.

    Duly noted. We shall see the reality in the fullness of time. 

  • Saxa -- SD Posts: 872

    Richard Haseltine said:

    It is, however, a rumour not a fact - it has already been mentioned in a couple of threads. Rumour seems to have the same attitude to memory size that many celebrity look-alike makers have to cup size.

    This gets my vote as funniest comment of the day, and (for my 2 cents, +1 for inflation) it's probably spot-on.  Would be remarkable if Nvidia did that.

  • outrider42 Posts: 3,679

    Indeed, these are rumors. Something that most people are overlooking about this rumor is that while a lot of websites are reporting on it, the rumor primarily comes from ONE source on Twitter. Videocardz.com doesn't even list a source in their article.

    That bothers me. The Twitter source has been correct in the past, but that doesn't mean they will be this time. Of course I hope they are right; having 16gb in the 4060 and/or 4060ti would be a boon for content creators on a budget. If the price is right and the card is decently fast, I'd get one myself to complement my 3090 better than the 3060 does. I do run out of VRAM on my 3060 sometimes, so 16 would be welcome.

    In a funny twist, this isn't about creators, though. VRAM has become a real hot button topic in PC gaming the past year, with 8gb GPUs struggling badly with many new games. 8gb GPUs have been getting blasted by the big benchmark websites and channels. Hardware Unboxed in particular has said that 8gb GPUs should be low end products only, and that 12 and 16 should be standard now for the prices Nvidia is asking. If the 4060 and 4060ti have 8gb models, I guarantee those models are going to get roasted by reviewers across the board. There are many reasons for this, as not long ago people thought 8gb was totally fine, and that 12gb was probably more than enough. Boy were they wrong.

    The PS5 has sold more than twice as many units as Xbox. So publishers logically put more resources into PS5 than Xbox, and by extension PC gets even fewer resources. That leaves PC games in a worse state. The PS5 also has a well designed memory system, where the SSD is like having your entire software library sitting in DDR2 memory. Its custom 12-channel memory controller can load over 2gb of data in a quarter of a second. Stuff loads fast, and that includes being able to load textures quickly into memory as needed. The PS5 also has unified memory: its VRAM is also RAM, and vice versa, so data is not duplicated across VRAM and RAM like it is on PC. So PC has to use brute force to match the PS5, meaning more VRAM and RAM than what the PS5 uses.

  • ColinFrench Posts: 647

    If the rumored 4060 Ti specs are true it leaves the 4070 Ti in an awkward spot -- 25% less VRAM but 76% more CUDA cores. It will be interesting to see how that works out in terms of the relative speed of the two cards.
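
    For a quick sanity check on those percentages, a back-of-the-envelope sketch in Python (the 4,352-core count for the 4060 Ti is part of the same rumor; the 7,680-core count for the 4070 Ti is its published spec):

      # Rough comparison of the rumored 4060 Ti 16GB vs the 4070 Ti.
      vram_4060ti, vram_4070ti = 16, 12          # GB
      cores_4060ti, cores_4070ti = 4352, 7680    # CUDA cores

      vram_deficit = 1 - vram_4070ti / vram_4060ti     # 0.25 -> 25% less VRAM
      core_surplus = cores_4070ti / cores_4060ti - 1   # ~0.76 -> 76% more cores

      print(f"4070 Ti: {vram_deficit:.0%} less VRAM, {core_surplus:.0%} more CUDA cores")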

  • pjwhoopie@yandex.com
    edited May 2023

    Wow.... if I had only put the word "Rumor" in the OP, then it would have been obvious... oh wait.

    And we should only have a week or so to wait, as it is supposed to be dropping this month.

     

    "While Nvidia is expected to announce all the GeForce RTX 4060 family on Wednesday, May 24, the release may be staggered. First along, this month, should be the RTX 4060 Ti with 8GB. However, those looking for a cheaper entry to the RTX 40 series with the RTX 4060 8GB will have to wait until sometime in July. Similarly, the RTX 4060 Ti 16GB and RTX 4060 16GB models are tipped for a July release."

    Post edited by pjwhoopie@yandex.com on
  • PerttiA Posts: 10,024

    ColinFrench said:

    If the rumored 4060 Ti specs are true it leaves the 4070 Ti in an awkward spot -- 25% less VRAM but 76% more CUDA cores. It will be interesting to see how that works out in terms of the relative speed of the two cards.

    Not as awkward as with the RTX 3060 12GB versus the RTX 3060 Ti/3070/3070 Ti, as the faster cards with their 8GB of VRAM have effectively only 4GB available for textures and geometry when rendering in Iray, while the slower and cheaper RTX 3060 12GB has 8GB for them.
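
    A minimal sketch of that arithmetic, treating the overhead as a roughly fixed block (the ~4 GB figure is taken from the post above, not measured here):

      # Illustrating PerttiA's point: if Iray plus Windows overhead eats a
      # roughly fixed chunk of VRAM, smaller cards lose a larger fraction.
      # The ~4 GB overhead is an assumption from the post, not a measurement.
      OVERHEAD_GB = 4

      for card, total_gb in [("RTX 3070 8GB", 8), ("RTX 3060 12GB", 12)]:
          usable = total_gb - OVERHEAD_GB
          print(f"{card}: ~{usable} GB free for textures/geometry "
                f"({usable / total_gb:.0%} of the card)")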

  • Padone Posts: 3,700
    edited May 2023

    It may seem hilarious put this way, but actually there's not much difference between 8 GB and 16 GB. That is, the required VRAM quadruples as the texture size doubles. So if 2K textures require 4 GB of VRAM, then 4K textures require 4*4 = 16 GB. And so on: 8K textures require 16*4 = 64 GB of VRAM, yes you read that right. So you see, more VRAM surely helps, but it is not the solution on its own; you first have to reduce the texture size.

    With the current DAZ trend of going for 8K textures and ultra HD, there's no hardware that can suffice.
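
    A minimal sketch of that scaling, assuming uncompressed textures and taking the 2K-scene-needs-4-GB figure from the post as the baseline:

      # Doubling texture resolution quadruples the pixel count, and with it
      # the (uncompressed) VRAM a texture set needs. The 2K -> 4 GB starting
      # point is the post's assumption, not a measurement.
      base_res_k, base_vram_gb = 2, 4          # 2K textures -> ~4 GB (assumed)

      for res_k in (2, 4, 8):
          scale = (res_k / base_res_k) ** 2    # pixels grow with resolution^2
          print(f"{res_k}K textures: ~{base_vram_gb * scale:.0f} GB VRAM")
      # 2K: ~4 GB, 4K: ~16 GB, 8K: ~64 GB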

    Post edited by Padone on
  • Kitsumo Posts: 1,216

    kyoto kid said:

    ...nice, I don't need to drop $1,000 on an RTX A4000 to get 16 GB.  Indeed, it's about time.

    Not as many cores as the A4000, but an improvement over my old Titan X that's been soldiering along for me.  I actually have a 3060 12 GB (EVGA), but my MB's BIOS is so old it won't support it, so it is sitting in the box waiting until I can afford to upgrade the base hardware.

    Too bad EVGA ceased production of GPUs as they had excellent QA and the best warranty (3 years w/full replacement).

    Sub-$500 would make it very tempting.

    Have you priced any upgrades lately? Now's the time to do it, when AMD, Intel and the RAM makers are struggling to clear out old inventory. Really all you need is a basic motherboard, a 6-core CPU and 32-64GB of RAM to let you use your 3060. Maybe $350 or so (or less) to quadruple your rendering power.

  • kyoto kid Posts: 41,057
    edited May 2023

    ...already have the parts chosen.  Not into building a machine that just serves in a pinch, as that is a waste of money, particularly being on a tight income, but one that is more of a "true" upgrade and has some future potential.

    The MB also has to meet the requirements for W11. 

    AMD Ryzen 9 5900X 3.7 GHz 12-Core Processor:  $313.99
    Asus Prime X570-P ATX AM4 Motherboard:  $248.93
    Corsair Vengeance LPX 64 GB (2 x 32 GB) DDR4-3200 CL16 Memory:  $129.99
    (expandable to 128 GB)
    Be quiet! Dark Rock Pro 4 50.5 CFM CPU Cooler:  $89.90

    Microsoft Windows 11 Pro OEM - DVD 64-bit:  $139.98

    Total:  $922.79

    Already have the drives (2 SSDs 2 HDDs), PSU (850w), GPU (RTX 3060 and a Titan X), case, and displays from my current setup.

    Unfortunately my monthly income is only about $150 more than that, so it will take a while, putting away what little I can each month.

    Post edited by kyoto kid on
  • Kitsumo Posts: 1,216

    kyoto kid, cool. I didn't mean to be nosy, I just remember you talking about that motherboard issue before.

    I guess I lean towards lower end systems since I don't do really big renders.

  • outrider42 Posts: 3,679
    edited May 2023

    Padone said:

    It may seem hilarious put this way, but actually there's not much difference between 8 GB and 16 GB. That is, the required VRAM quadruples as the texture size doubles. So if 2K textures require 4 GB of VRAM, then 4K textures require 4*4 = 16 GB. And so on: 8K textures require 16*4 = 64 GB of VRAM, yes you read that right. So you see, more VRAM surely helps, but it is not the solution on its own; you first have to reduce the texture size.

    With the current DAZ trend of going for 8K textures and ultra HD, there's no hardware that can suffice.

    That is strange...my hardware seems to suffice. Hmm. I'm actually thinking of getting a 16gb 4060ti if the price is right. That would be a nice sweet spot to complement my 3090. Yes I have a 3090, but I don't really use all 24gb, though it is nice to have when I do. I also have a 12gb 3060, and that is a nice helper. These 2 cards together get within range of a 4090 in render speed, though a 4090 is still faster. A 4060ti+3090 should get me past a 4090 in speed. Sometimes I do exceed the 12gb limit on my 3060, not that often actually, so having 16 would be nice. And I do use 8k textures. A lot, in fact. I converted the 8k normal detail maps to Genesis 8, and customized them into 8k spec maps from there. So I have multiple 8k textures in use on multiple characters quite often. These maps are freakin awesome. I also frequently upscale environment textures, when upscaling works on them, because so many Daz products have highly compressed textures. These compression artifacts show up in renders.

    You also say this as if it is impossible to reduce the size of textures or to not use high subdivision. This just blows my mind. There is absolutely nothing stopping anyone from doing either thing, and it is ridiculously easy to do. I can select all the textures I want to resize, right click, pick what size to resize them to, and click accept. It takes literally seconds to resize a whole folder if I want. Boom, I am done. There is also a Daz product designed for this that automates it even more (Scene Optimizer). Is this really that difficult for users to do? If people are doing 3d art, they should have at least this knowledge. I did this all the time when I had a 2gb 670 and later a 4gb 970. I have been there, I know what it is like to have a lesser PC for Daz. I might have even complained once in the forums about it. But then I learned how to deal with it, and deeply regretted making that complaint; it was very foolish.
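
    For anyone who wants to script that same batch downscale outside of PowerToys or Scene Optimizer, a minimal sketch using Pillow (folder names and the target size are placeholders):

      # Batch-downscale every texture in a folder, a scriptable version of
      # the right-click resize workflow described above. Requires Pillow
      # ("pip install Pillow"); folder names and target size are placeholders.
      from pathlib import Path
      from PIL import Image

      SRC = Path("textures_8k")    # hypothetical input folder
      DST = Path("textures_2k")    # hypothetical output folder
      TARGET = 2048                # longest edge after resizing

      DST.mkdir(exist_ok=True)
      for tex in SRC.glob("*.jpg"):
          with Image.open(tex) as img:
              img.thumbnail((TARGET, TARGET))   # resizes in place, keeps aspect
              img.save(DST / tex.name, quality=92)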

    Because here's the thing. This nonsense about VRAM hurts Daz users more than it helps them. Sure, people with lesser hardware have to deal with it, but as I just described, that really isn't something worth complaining about. Having good quality in a product should be the priority, and the user can then decide how to use that product. Doesn't that sound fair? Once you have Scene Optimizer, it makes any complaints about memory entirely pointless. It took PowerToys exactly 1 second to resize ALL of the 8k normal detail maps that G9 has in Starter Essentials; I know because it reports how long it takes. So that's done. I can swap them out at will whenever I want to. This is not hard. PowerToys is free. If I want to I can get Scene Optimizer and do the same thing within Daz, and the script has some nice features that make it easy to use and save the new textures how you want to. With Scene Optimizer, there frankly is no excuse.

    But on the flip side, if a Daz user saves up and buys that big fancy rendering PC they have dreamed about, they will be in for a rude awakening when they find their favorite Daz products actually look like crap when rendered at higher resolutions. And this sucks. It really, really sucks. Because now that user has to try to find a way to improve these crappy textures. Upscalers can be great...if they work. Often the textures are so bad that upscaling them is a hopeless task. In some cases you can apply a shader preset and use its textures, but there are times when the UV maps on a product don't work with them, and you have to deal with that. And there are times when the shader preset ALSO has crappy low quality textures. What is the point of that???

    It is so much harder to fix a bad texture than it is to downsize a good texture. So people who want to create larger renders are directly impacted by poor textures made by PAs afraid to anger the people with potato PCs. I'm sorry, but computer rendering is not for the faint of heart; it never has been. But things are so much better today. Good grief, we had 2gb GPUs when I started this. Iray was straight up brutal on a lot of people. But now we have access to GPUs with a lot more memory and fancy ray tracing, while Daz products have not really increased that dramatically in memory. It is better than it ever has been, why complain about that?

    I seriously do not understand why people complain about this now. Genesis 9 has its issues, but VRAM just is not one of them. We should be demanding better quality, not just middle of the road quality. Video games are getting better textures than Daz products now! There are 64MB+ sized video game textures. You can even get 8k mods for video games. I have some. And as large as they are, you don't actually need a 3090 to use them. They will fit in 16gb, and sometimes 12gb. Depends on the game. Daz needs to at least keep up with video games; it should be embarrassing to have video games with better assets that look better than a lot of Daz Iray renders.

    Post edited by outrider42 on
  • nonesuch00 Posts: 18,131

    It will ultimately be determined by gaming monitors and TVs. If they go to 16K then VRAM needs to be 256GB (because you shouldn't design everything with hitting a boundary in mind).

  • Padone Posts: 3,700

    @outrider42 It may seem odd to you, but I fully agree with what you say, and it is not in any way in contrast with what I am pointing out. On the contrary, it is a confirmation. Game textures are much more optimized and of better quality than DAZ's. Games also support HD a lot better than Iray, because Iray doesn't get adaptive subdivision or dynamic textures (just out of curiosity, 3Delight does).

  • daveso Posts: 7,014

    The question I have: will DAZ 4.21 support all the cool new features of the 4060 Ti... or any 40-series card?

    From Nvidia:

    "DLSS 3 Meets D5 Render
    D5 Render has added support for NVIDIA DLSS 3, bringing a vastly improved new real-time experience to professional 3D artists. DLSS 3 with AI-powered Frame Generation and Super Resolution technologies enhances viewport framerates, making the experience buttery smooth."

    "The GeForce RTX 4060 Family is powered by the ultra-efficient NVIDIA Ada Lovelace architecture with 4th generation Tensor Cores for AI content creation, 3rd generation RT cores and compatibility with DLSS 3 for ultra fast 3D rendering, and the 8th generation NVIDIA Encoder, now with support for AV1. Plus a whole lot more."

  • outrider42 Posts: 3,679

    The use of DLSS 3 is more for gaming; it is generating extra frames in the animation. I don't think Iray can do this, because I think DLSS 3 requires some data from the next frame to work correctly. The renderers in question using DLSS 3 are creating multiple frames per second, and Iray just can't do that. There is also DLSS 2, which is actually different from 3. DLSS 3 is about frame generation, while the previous versions are about upscaling from lower internal resolutions. Maybe for animations this could be possible, but I don't know if animation is a priority. Single pics can just be upscaled outside Daz Studio. Another Tensor use is denoising, and this was Nvidia's original intention with Tensor. Tensor was intended to complement ray tracing with denoising; this is important to be able to do it in real time.

    Tensor can be used by the Iray denoiser, but I honestly do not know how much it helps, if any. I can't really tell the difference between using the denoiser now and using it on my old GTX 1080 Ti. Maybe I can try testing it out sometime, but we can't disable the Tensor cores to see if they do any good. You can do denoising outside of Daz, too, and I usually do that as I find Intel's denoiser to work a little better. Iray could use some work in this regard. I'd like to see the denoiser really overhauled. At any rate, denoising is the only task Tensor cores can do in Iray.

    From the Iray dev team's notes, it does sound like the denoiser is something they are working on improving. But it will not be exclusive to Lovelace; any RTX card that has Tensor cores should be able to make use of it.

    So the Tensor cores are not currently a selling point for Daz users IMO. Maybe in the future, but at this time it is better to focus on raw performance, ray tracing, and power draw. The 4060s look to have fantastic power draw. 

    Which, BTW, they are now official. Except for the 16gb 4060, that one did not come to pass. At least not yet. I think the 16gb 4060 could still happen depending on how the market treats the 8gb model. But we are getting a 16gb 4060ti, as well as the 8gb 4060ti. If the 4060ti 8gb sells badly, and the 4060 8gb isn't doing much better, we might get a 16gb version of the 4060.

    4060 8gb - $300  July  115 Watts

    4060ti 8gb - $400  May 24    160 Watts

    4060ti 16gb - $500  July    165 Watts

    The 4060ti 16gb does not have a solid release date yet besides "July". Using just 165 Watts will also be nice for a lot of people. The 4060ti 16gb would make a great companion card. However $500 is a bit rough IMO. Having a $100 premium for 8gb more VRAM is not very cool. It should have been $450 at most. The 4060 using just 115 Watts is pretty neat, but only having 8gb is a serious deal breaker. Not just for Daz, but even for gaming.

    Another thing to note: the 4060ti models have identical specs aside from the VRAM. So when the 8gb model launches, we will know exactly how fast the 16gb model will be if someone posts a benchmark. It should be around a 3070, depending on how the ray tracing shakes out.

    @Padone  I've talked a lot about the looming threat of gaming engines. Once Nvidia brought dedicated ray tracing cores to gamers with Turing, I personally believe the writing has been on the wall for traditional PBR engines. Every year the gaming engines get better, and Epic's Unreal is going to do some wild stuff. Daz is a content company first now, but even though we can export their content to something like Unreal, the fact is that Genesis works best in Daz Studio. It is not as great in gaming engines. The Iray thing has been fun and successful for Daz, but Daz needs to be thinking about a strategy for something faster that is like a game engine, or even partnering with one. Maybe they are, you know how secretive they are. That doesn't mean they need to ditch Iray. Nvidia Omniverse is very intriguing, and Omniverse has Iray as a render option (they call it RTX Accurate). So here is software that has numerous rendering options along with Iray.

  • kyoto kid Posts: 41,057
    edited May 2023

    ...the other issue is Windows itself, which since W10 "reserves" about 1 GB of VRAM for its WDDM. With W7 and 8.1 the WDDM footprint was almost negligible.

    The only way around that is to have a Pro-grade (formerly Quadro) GPU that offers TCC (Tesla Compute Cluster) mode, which allows one to bypass WDDM.

    I was actually considering an RTX 4000 SFF before seeing this thread, as it has 20 GB of GDDR6 VRAM with ECC, a higher core count (6144 CUDA, 192 Tensor, and 48 RT cores, as well as 96 ROPs), and a 70W TDP, along with the TCC mode option.

    Post edited by kyoto kid on
  • Padone Posts: 3,700

    I use the Ryzen APU for the viewport, so the Nvidia card is reserved for rendering. No Windows overhead: there's zero VRAM allocated until I render. No need for TCC.

  • Torquinox Posts: 3,335

    outrider42 said:

    The 4060ti 16gb does not have a solid release date yet besides "July". Using just 165 Watts will also be nice for a lot of people. The 4060ti 16gb would make a great companion card. However $500 is a bit rough IMO. Having a $100 premium for 8gb more VRAM is not very cool. It should have been $450 at most. The 4060 using just 115 Watts is pretty neat, but only having 8gb is a serious deal breaker. Not just for Daz, but even for gaming.

    After all the fun we had with pricing over the last few years, it is what it is. Still early days for the 4000-series cards. Not feeling the need to hurry up and buy anything, but $500 for a card that is an upgrade in every department over the 3060 12GB is probably OK.

  • daveso Posts: 7,014

    What I find interesting is the 4070 is a bit better than the 3080, but 3080 systems are beginning to drop in price. That's looking more inviting all the time. I'm still running a 2070 Super, which is still decent IMO for what I do, so even a 3000 series would be an upgrade for me. A 4060 Ti would be really nice, but now I'm seeing a 4070 Ti for what I was going to pay for a 3080.

  • kyoto kid Posts: 41,057

    Padone said:

    I use the Ryzen APU for the viewport, so the Nvidia card is reserved for rendering. No Windows overhead: there's zero VRAM allocated until I render. No need for TCC.

    ...a while back, when WDDM was a major topic of discussion, I read (either here or on the Nvidia forums) that dedicating a GTX card to rendering did not sidestep Windows 10's WDDM. Only a Quadro or a Titan card could, by switching it into TCC. Doing so required a second GPU or the CPU's graphics to run the display(s).

  • Padone Posts: 3,700

    Windows 10 only allocates some VRAM on the card connected to the monitor. It doesn't allocate anything if the card is not connected. In my case I connect the monitor to the mobo, so it uses the Ryzen APU. This way the Nvidia card is reserved for rendering, and the allocated VRAM is zero until I render.

    Of course, if the CPU doesn't have graphics, then Windows will take the Nvidia card; you have no choice in that case.
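
    An easy way to verify this is to query the driver for per-GPU memory use; a minimal sketch using the NVML Python bindings (note this only sees Nvidia cards, not the APU):

      # Query how much VRAM is currently allocated on each Nvidia GPU, to
      # verify that the render card really sits near zero until a render
      # starts. Requires the official NVML bindings: pip install nvidia-ml-py
      from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                          nvmlDeviceGetHandleByIndex, nvmlDeviceGetName,
                          nvmlDeviceGetMemoryInfo)

      nvmlInit()
      for i in range(nvmlDeviceGetCount()):
          handle = nvmlDeviceGetHandleByIndex(i)
          mem = nvmlDeviceGetMemoryInfo(handle)
          print(f"GPU {i} ({nvmlDeviceGetName(handle)}): "
                f"{mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GB in use")
      nvmlShutdown()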

  • outrider42 Posts: 3,679

    Torquinox said:

    outrider42 said:

    The 4060ti 16gb does not have a solid release date yet besides "July". Using just 165 Watts will also be nice for a lot of people. The 4060ti 16gb would make a great companion card. However $500 is a bit rough IMO. Having a $100 premium for 8gb more VRAM is not very cool. It should have been $450 at most. The 4060 using just 115 Watts is pretty neat, but only having 8gb is a serious deal breaker. Not just for Daz, but even for gaming.

    After all the fun we had with pricing over the last few years, it is what it is. Still early days for the 4000-series cards. Not feeling the need to hurry up and buy anything, but $500 for a card that is an upgrade in every department over the 3060 12GB is probably OK.

    The standard expectation is for the new generation to improve these areas without a big price increase. While MSRP was not real for a long time, the 3060 was $320 MSRP. The 4060ti is $180 more. So it SHOULD be better at everything, lol. I managed to get my 3060 under $400 during the peak of COVID.

    A big part of the problem with the 4000 series is that price to performance has been terrible compared to previous generations. The 4090 is the only card to really improve this mark. The 4080 is faster and has more VRAM than a 3080, but the difference in price doesn't add up. The 4080 offers far worse price to performance than the 3080. The 4070ti did the same thing vs the 3070ti. The 4070 finally got a little closer, but its $100 increase over the 3070 still hurt its overall value.

    The 4060 is the very first 4000 series card to launch at a price lower than its predecessor. But even here we are only talking $20, and the 4060 has less VRAM. The 4060ti 8gb matches the launch price of the 3060ti, while the 4060ti 16gb has a $100 premium. This is also paired with the fact that these cards are expected to deliver modest performance increases over their predecessors. Making it even worse is that any 8gb GPU released in 2023 is going to get murdered by reviewers unless it is really cheap.

    The 4060ti 16gb at $500 is slightly better...but not terribly exciting. The GPU market has crashed. It has crashed so badly that Nvidia is stopping production of 4070s because they cannot sell them. They do not have a good excuse to charge what they are today. They can reduce the prices and still make a good profit. The world is going into recession (some would argue it's already there), and this is a time period when historically goods shave a little off their margins to keep moving units. But unlike previous eras, some markets refuse to drop. Just because prices got wild during COVID doesn't mean they should stay that way. The shortages are long over for many goods. This has been coined "greedflation". I don't think any of these GPUs will sell that great. They are also competing with outgoing 3000 stock.

     

  • outrider42 Posts: 3,679

    daveso said:

    What I find interesting is the 4070 is a bit better than the 3080, but 3080 systems are beginning to drop in price. That's looking more inviting all the time. I'm still running a 2070 Super, which is still decent IMO for what I do, so even a 3000 series would be an upgrade for me. A 4060 Ti would be really nice, but now I'm seeing a 4070 Ti for what I was going to pay for a 3080.

    Over in the Iray benchmark thread, the 4070 is slower than a 3080 Ti. I am not sure how it compares to a 3080 10gb or 3080 12gb. Considering how close the various 3080s are in performance, it is probably very close between the 3080 and 4070.

    The primary advantage of the 4070 (and the 4000 series in general) is that they use a lot less power than the 3000 series. I don't know what the 4070 uses while rendering Iray, but the 4090 only uses 285 Watts, not even close to its 450 Watt TDP. The 4080 of course uses less than that, and the 4070 logically uses even less. The 4070 has a TDP of only 200 Watts, and given Lovelace's history with Iray, it probably uses a lot less than 200 while rendering. The 4070 might well be using 150+ Watts less than a 3080.

    The power draw of Lovelace is outstanding. So people who have expensive electricity could save money over the 3000 series. If they are close in price or performance, the 4000 series is the way to go. So even if the 3080 is faster than the 4070, the 4070's power draw gives it a big edge.
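
    To put a rough number on that, a sketch of the yearly cost of the draw gap (the daily hours and electricity rate are illustrative assumptions; the ~150 W figure is from the post above):

      # Rough yearly electricity saving from a ~150 W draw gap while rendering.
      # Daily render hours and the $/kWh rate are illustrative assumptions.
      watt_gap = 150          # ~150 W less draw while rendering (from the post)
      hours_per_day = 4       # assumed average daily rendering time
      usd_per_kwh = 0.30      # assumed electricity price

      kwh_per_year = watt_gap / 1000 * hours_per_day * 365
      print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * usd_per_kwh:.0f}/year saved")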

  • Torquinox Posts: 3,335
    edited May 2023

    outrider42 said:

    The standard expectation is for the new generation to improve these areas without a big price increase. While MSRP was not real for a long time, the 3060 was $320 MSRP. The 4060ti is $180 more. So it SHOULD be better at everything, lol. I managed to get my 3060 under $400 during the peak of COVID.

    A big part of the problem with the 4000 series is that price to performance has been terrible compared to previous generations. The 4090 is the only card to really improve this mark. The 4080 is faster and has more VRAM than a 3080, but the difference in price doesn't add up. The 4080 offers far worse price to performance than the 3080. The 4070ti did the same thing vs the 3070ti. The 4070 finally got a little closer, but its $100 increase over the 3070 still hurt its overall value.

    The 4060 is the very first 4000 series card to launch at a price lower than its predecessor. But even here we are only talking $20, and the 4060 has less VRAM. The 4060ti 8gb matches the launch price of the 3060ti, while the 4060ti 16gb has a $100 premium. This is also paired with the fact that these cards are expected to deliver modest performance increases over their predecessors. Making it even worse is that any 8gb GPU released in 2023 is going to get murdered by reviewers unless it is really cheap.

    The 4060ti 16gb at $500 is slightly better...but not terribly exciting. The GPU market has crashed. It has crashed so badly that Nvidia is stopping production of 4070s because they cannot sell them. They do not have a good excuse to charge what they are today. They can reduce the prices and still make a good profit. The world is going into recession (some would argue it's already there), and this is a time period when historically goods shave a little off their margins to keep moving units. But unlike previous eras, some markets refuse to drop. Just because prices got wild during COVID doesn't mean they should stay that way. The shortages are long over for many goods. This has been coined "greedflation". I don't think any of these GPUs will sell that great. They are also competing with outgoing 3000 stock.

    Strongly reasoned response. I understand. Maybe the 4000 series is one to skip. Even so, this is the world we live in. Among 4060s, the 16GB card is the only one I would even consider. What would anyone using DS do with those others? If unit sales are going to be lower, prices may even end up a little higher to compensate. Greedflation is likely part of the price - no reason to doubt it! But other factors could also play a part. Either way, it seems the whole world is in a different place now than it was before Covid. I doubt it's ever really going back. The rules followed in the "before times" may no longer hold true. We shall see.

    Post edited by Torquinox on
  • kyoto kid Posts: 41,057
    edited May 2023

    ...yeah, the RTX 4000 SFF sips power at only 70W TDP.  Nice when also running a Titan X to power the displays.

    Post edited by kyoto kid on
  • nonesuch00 Posts: 18,131

    Padone said:

    Windows 10 only allocates some VRAM on the card connected to the monitor. It doesn't allocate anything if the card is not connected. In my case I connect the monitor to the mobo, so it uses the Ryzen APU. This way the Nvidia card is reserved for rendering, and the allocated VRAM is zero until I render.

    Of course, if the CPU doesn't have graphics, then Windows will take the Nvidia card; you have no choice in that case.

    That is what I do, but I still run out of VRAM (12GB) for some renders.
