16GB RTX 4060ti Releasing in July

IceCrMn Posts: 2,129
edited July 2023 in The Commons

https://www.tomshardware.com/news/nvidia-rtx-4060-ti-16gb-alleged-launch-date-revealed

I've also read the MSRP will be $499 US. That remains to be seen, of course.

Zotac has a placeholder page up for theirs.

https://www.zotac.com/us/product/graphics_card/zotac-gaming-geforce-rtx-4060-ti-16gb-twin-edge

Specs page

https://www.zotac.com/us/product/graphics_card/zotac-gaming-geforce-rtx-4060-ti-16gb-twin-edge#spec

An increase from 3,584 CUDA cores (12GB 3060) to 4,352, plus 4GB more VRAM.
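For quick context, here is that bump in percentage terms (pure spec arithmetic, not a performance claim):

```python
# Spec deltas, 3060 12GB -> 4060ti 16GB. Pure spec arithmetic, not a benchmark.
cores_3060, cores_4060ti = 3584, 4352
vram_3060_gb, vram_4060ti_gb = 12, 16

print(f"CUDA cores: +{(cores_4060ti / cores_3060 - 1) * 100:.0f}%")      # ~+21%
print(f"VRAM:       +{(vram_4060ti_gb / vram_3060_gb - 1) * 100:.0f}%")  # ~+33%
```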

What are your thoughts on this new card?

I like the price to performance ratio. Personally I plan to get one as soon as I can after release.

 

edit: Link to Nvidia official site

https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4060-4060ti/


Comments

  • nakamuram002 Posts: 788

    I want an RTX 4070 with 16GB or 24GB!!

  • outrider42 Posts: 3,679

    Remember you cannot directly compare CUDA cores across generations, so the raw number versus the 3060 doesn't tell you much. You can compare to GPUs that share its architecture, i.e. the other 4000 series cards released so far. Nobody has posted a 4060ti 8GB bench in our benchmark thread yet. I suppose nobody wants the 8GB version, which is understandable.

    The CUDA cores scale very well across the lineup, but ray tracing cores can scale differently depending on mesh density. You can still get an idea by looking at the benchmarks of the 4070, 4080, and 4090. Unfortunately the thread doesn't have the 4000 series on the first page.
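    If you want a rough feel for that, here is a back-of-the-envelope sketch. The core counts are the public specs, but the baseline speed is a made-up placeholder rather than a real Iray result, and this ignores clocks and RT-core differences:

```python
# Naive same-architecture scaling estimate for the Ada Lovelace lineup.
# The baseline figure is a PLACEHOLDER, not a real Iray benchmark; real
# results also depend on clocks, memory, and how RT cores are stressed.
ada_cores = {
    "4060ti": 4352,
    "4070":   5888,
    "4080":   9728,
    "4090":  16384,
}

baseline_card = "4070"
baseline_iter_per_sec = 10.0  # hypothetical benchmark number

for card, cores in ada_cores.items():
    est = baseline_iter_per_sec * cores / ada_cores[baseline_card]
    print(f"{card}: ~{est:.1f} iter/s by naive core-count scaling")
```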

    The 4060ti 16GB has two benefits: the VRAM of course, and its very low power draw (the chip is very small). But the small die size is more like a lower-class card than a proper x60ti level card. That is why the 4060ti 8GB and 4060 reviews have been so rough. The performance gains for gaming have been pathetic, and some games even perform WORSE than on the last-gen model, which is really inexcusable. For us, Iray typically does better, thanks to RT cores being a bigger deal in Iray. I expect the 4060ti to perform above a 3070 in Iray. But it might not in all scenes, depending on how the scene is built. The 4060ti might not be much faster than the 3060ti. It is relying purely on its VRAM capacity, which I admit has me interested.

    GPUs are not selling right now. The 4060ti 8GB has already had multiple discounts in the short time it has been out. It might pay to be patient and see if any deals pop up, because $500 is a bit of a hard sell. Even as gamers clamor for more VRAM, they still want more performance than this card will offer at $500. So I believe that it will be very easy to buy these cards, unless they simply don't supply many. That's the only way they sell out. Only people doing content creation like us will have much interest in the 4060ti 16GB. The 3060 12GB did prove very popular, but that was during the wild crypto rush, which has died completely.

    Also note that there is no Founders Edition card from Nvidia for this 16GB version. All models are going to be 3rd party, and it is hard to say how many will be at MSRP. Expect only the base models from any brand to be at that price.

  • kyoto kid Posts: 41,057

    ....wish EVGA was still in the GPU business..

  • outrider42 Posts: 3,679

    kyoto kid said:

    ....wish EVGA was still in the GPU business..

    Yeah. >.<

    Actually, I think part of the decrease in demand is due to EVGA not being there. EVGA was such a huge part of Nvidia's brand image. Remember, they sold some 40% of Nvidia's GPUs in North America, and North America is the biggest market by far, roughly double the rest of the world put together.

    Right now it is like a perfect storm of events for a GPU crash. Crypto died. People are going back to work. Inflation is out of control, and a recession is looming. And that is on top of what Nvidia has done with pricing, gamers angry about VRAM, and EVGA leaving them. The market has completely flipped. But they are extremely resistant to lowering prices.

    I can't overstate how much VRAM has become an issue for gaming in the past year. Many of the tech channels have been preaching the pitfalls of 8GB in 2022/2023 for modern games. The list of games that spill over 8GB is growing constantly, and I am talking about 1080p, let alone higher resolutions. We have GPUs that are strong enough to play games at higher settings but cannot because of the VRAM capacity. That sucks. Games do not typically crash when running out of VRAM, but the performance tanks and it's just too unpleasant.

    So gamers are starting to feel like we do with Iray, lol.

    However, in spite of all of this, Nvidia is still doing perfectly fine thanks to AI booming so much. They are making more off AI than they ever did with crypto, and unlike crypto, AI is using different silicon than gaming. So Nvidia is shifting production to AI products like the H100. Honestly it is hard to blame them when these things sell for $40,000+ a pop, versus a 4090 selling for a tiny fraction of that (and we think 4090s are pricey). Nvidia is on track for $11 billion this quarter thanks entirely to AI. That is a huge uplift. This is worth pointing out, because Nvidia is not hurt by the slow GPU market. Not one bit.

    Which is why they are so resistant to dropping prices. They are fine with lower sales volume. So even though they finally give us a 16GB 4060ti, they want to charge $500 for it. The die size on the 4060ti is only 190mm²; that is only about 13.8mm on a side. This is a die that might be smaller than your thumbnail. The old 3060ti has a die size of 392mm². The difference in size is astounding. Lovelace is actually incredible, but the problem is how they cut it up and sell it to us. If Nvidia had made the 4060ti the same die size as the 3060ti, it would be a beast.

    You guys want to know just how crazy this whole thing is?

    As I just said the 3060ti chip is 392mm². Ok. Do you realize that the 3060ti chip is BIGGER than the 4080? The 4080 has a die size of 379mm². That's right, the 3060ti is bigger than the 4080. If Nvidia had made the 4060ti that size, well, it would be faster than a 4080. That is how far Nvidia has moved the stack compared to just the last generation. It is why everything seems so much weaker than it should be...because it is! Even the 3050, which was at the very bottom of the Ampere stack, has a 276mm² chip that dwarfs the 4060ti's. So it is remarkable that Nvidia gets the performance they do out of such tiny chips, but they are cutting them too small.
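    To put those die figures side by side (areas are the published numbers; the edge length just assumes a square die, which is how 190mm² works out to roughly 13.8mm per side):

```python
import math

# Published die areas in mm²; edge length assumes a perfectly square die.
dies = {
    "4060ti (AD106)": 190,
    "3050 (GA106)":   276,
    "4080 (AD103)":   379,
    "3060ti (GA104)": 392,
}

for name, area in sorted(dies.items(), key=lambda kv: kv[1]):
    print(f"{name}: {area} mm² (~{math.sqrt(area):.1f} mm per side)")
```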

    This is why reviewers, and gamers in general, are being so harsh.

    And for some reason AMD is barely trying to do anything different. AMD is also trying to hit the jackpot with AI.

    Meanwhile Intel is like a dog chasing its tail.

  • kyoto kid Posts: 41,057

    ...appropriate description of Intel.

    I already have a 12 GB 3060 but it's useless until I can upgrade to a more current board (still running on an X58 MB with a 12 year old BIOS and PCIe 2.0). Looking at about $940 for the project, which includes a new MB, a CPU (Ryzen 5900X), memory (64 GB with the option to upgrade later), a CPU cooler (capable of handling the 5900X), and W11 Pro.

    Been trying to put away funds, but it's very slow going with prices out here for other things like food, utilities, paying off back taxes I didn't know I owed, & such.

  • LeatherGryphon Posts: 11,510

    kyoto kid said:

    ...appropriate description of Intel.

    I already have a 12 GB 3060 but it's useless until I can upgrade to a more current board (still running on an X58 MB with a 12 year old BIOS and PCIe 2.0). Looking at about $940 for the project, which includes a new MB, a CPU (Ryzen 5900X), memory (64 GB with the option to upgrade later), a CPU cooler (capable of handling the 5900X), and W11 Pro.

    Been trying to put away funds, but it's very slow going with prices out here for other things like food, utilities, paying off back taxes I didn't know I owed, & such.

    Which motherboard are you looking at? 

  • Kitsumo Posts: 1,216

    To say that Nvidia is less than enthusiastic about this release would be an understatement. https://videocardz.com/newz/nvidia-not-seeding-geforce-rtx-4060-ti-16gb-for-reviews-aibs-hesitant-to-participate-as-well

    Warning: The comment section is toxic.

  • outrider42 Posts: 3,679

    Kitsumo said:

    To say that Nvidia is less than enthusiastic about this release would be an understatement. https://videocardz.com/newz/nvidia-not-seeding-geforce-rtx-4060-ti-16gb-for-reviews-aibs-hesitant-to-participate-as-well

    Warning: The comment section is toxic.

    That's to be expected from gamers. I can totally understand not being happy about the situation, as I think the card is priced too high for what it offers and really should have been called the 4060 (not the ti). But at the same time this is the cheapest 16GB Nvidia card ever made, and some people are just acting ridiculous.

    This 4060ti isn't going to be the fastest, but it is (less than) half the price of the 4080 16GB. It will be faster than the 3060, which has been popular here. I dropped almost $400 on my 3060 at the time. The 3060 was still going for over $300 right up until the 4060 launched. The 4060ti 16GB is more of a 3060 successor than anything else (again, it really should just be named the 4060).

    At any rate, I don't think this card is going to sell that great. I do think most of its sales will be to people doing content creation, so it will be interesting to see just how well it does. We are heading hard into a deep recession, and that is impacting all sorts of markets. While Nvidia (and AMD) have launched their worst-priced lineups in a very long time, I don't think sales would be much better if GPUs were cheaper. A lot of people are celebrating, acting as if the low sales are proof that gamers are sending a message...no, that is not the case. They are way overestimating themselves, lol. People are just broke.

    I still recommend waiting to see how the launch shakes out and looking for a sale, but it is tough. It is possible that there will not be many 16GB models made, which could complicate things if it does turn out to sell more than expected. So it is hard to make a recommendation because of these unknowns. I certainly would not pay more than $500 for this thing, given some models are priced well above that. This card only uses about 165 Watts at most, so it doesn't need some over-the-top cooler jacking up the price. It probably uses a lot less power for Daz Iray. A simple cooler will be fine as long as it is competent.

  • Kitsumo Posts: 1,216

    I was going to grab one of these, but I think I'm going to pass, for now at least. My 3060 does all I need it to for now. I try to ignore the gamer-bros whining about bus size, die size, core count, "fake frames", etc. At the end of the day, the only question for me is "Is this the card I need/want, and is this the price I'm willing to pay for it?" It is a fine card, I'm just not onboard with the price. I don't begrudge anyone that does buy it though.

    I spend more time in Stable Diffusion than Daz Studio these days, and it's a lot less VRAM-sensitive. In SD, I can do pretty much everything I want with 12GB. It could help with Blender, but the animations I do don't use much VRAM.
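    If anyone wants to see what their SD sessions actually use, a minimal check on a PyTorch-based install looks something like this (it just reads the CUDA device's free/total memory):

```python
import torch

# Report used/total VRAM on the first CUDA device.
# Run this mid-session to see how close a workload gets to the limit.
if torch.cuda.is_available():
    free_b, total_b = torch.cuda.mem_get_info()  # both in bytes
    gib = 1024 ** 3
    print(f"{torch.cuda.get_device_name(0)}: "
          f"{(total_b - free_b) / gib:.1f} GiB used of {total_b / gib:.1f} GiB")
else:
    print("No CUDA device visible to PyTorch")
```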

    Both Nvidia and AMD know this generation won't sell well. The market is flooded with used mining cards, plus there's plenty of new last-gen in stock. All they can do for now is try to keep average selling price high, to keep the shareholders happy. So there definitely won't be many of these made (especially after the price drops on the 4080 and 4070ti). It probably will be a great deal for anyone who doesn't have a 3060 or higher. If I hadn't bought a 3060 last year and was still using a 1080ti, this would be a solid upgrade.

  • outrider42 Posts: 3,679

    Yeah, if you are not really using more than 12GB of VRAM, then there isn't much reason to buy a 16GB card. It will be faster, but it can be skipped if your current GPU is doing well enough.

    Nobody ever has to upgrade, unless they really want to.

  • AgitatedRiot Posts: 4,437

    Bill Gates: 640K ought to be enough for anyone

  • jmtbank Posts: 175
    edited July 2023

    outrider42 said:

     

    As I just said the 3060ti chip is 392mm². Ok. Do you realize that the 3060ti chip is BIGGER than the 4080? The 4080 has a die size of 379mm². That's right, the 3060ti is bigger than the 4080. If Nvidia had made the 4060ti that size, well, it would be faster than a 4080.

     

    There has been a double node shrink. The Samsung so-called 8nm node wasn't much more advanced than the old Pascal 14nm. The 4080 die size is bigger than the old beloved 1080*. In the old tick/tock GPU-generation cadence it's normal for one generation to be much bigger than the next. Alternate generations tend to stay on a node while doing architecture improvements and increasing die size to provide the standard 40% performance increase. It seems that these days gains through architecture are not so easy for raster, so you only have node shrinks to go on.

    We had this back at 28nm, where cards were stuck on a node for way too long; however, Nvidia came out with their revolutionary DX11 architecture improvements on the Maxwell (GTX 9XX) cards, basically giving people a full generational speed increase without a die shrink. So when Pascal gave a big boost at 14nm, the die sizes were tiny, but no one fixated on this. There was also a 33% price increase on the xx70 cards, but we got double the memory capacity!

    I believe I'm right in saying that the 4080 has the biggest performance-per-watt jump there has ever been. Bigger than Pascal, which had a similar performance jump but without a power reduction. The 3080 basically threw power management out of the window with its huge die size. The problem with the 4080 isn't its die size or performance. It's the horrible 50% price increase.

    *I'm including the portion of the die used for the cache. Some people are removing this for their comparisons.

  • outrider42 Posts: 3,679
    edited July 2023

    There actually were some complaints over the Pascal die sizes, as there were with the first Kepler 600 series. After all, Nvidia made their mobile chips use the exact same dies as desktop; there is a reason for that. I still remember people in this forum saying there was no way a 1070 could be using just a single 8-pin connector...I was the one who posted the 1070 rumor. The 680 wasn't the biggest die, but Nvidia was so far ahead of AMD at that point they didn't need their biggest die. The 680 was based on GK104, when the top card is typically on a 102 die.

    But things were different then, as the performance was there in each of these cases, and of course there were the big VRAM bumps with Pascal. So these complaints did not get any traction. But people did know about these things; they were not a secret. To be fair, the PS4 released in 2013 and raised the bar on VRAM (kind of like today, where the PS5 is pushing VRAM requirements up on PC). So VRAM was becoming a sticking point. However, Lovelace has pushed this even further than before; worse still is the lack of performance per tier, and Nvidia has been stingy with VRAM.

    The 4080 is technically a great card. But like you said, the price is the problem. Lovelace has amazing performance per watt across the board. Even the 4060ti, when you examine how much power it uses, is actually an improvement. The issue has long been how things are priced, and how things have been cut down and tiers shifted. There is a big gap between the 4080 and 4090.

    Ampere is not that bad at all. The 2080ti has a larger die than the 3090. The 2080ti die is a whopping 754 mm², with 18.6 billion transistors. The 3090 is 627 mm². That is 17% smaller, while packing 28.3 billion transistors. The 3090 used more power because Nvidia clocked it to the moon; otherwise it should have been around the typical ~260 Watts that top cards had been for several generations.
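    Run the density math on those quoted figures and the node jump is obvious (numbers as quoted above, nothing measured by me):

```python
# Transistor density from the figures quoted above.
chips = {
    "2080ti (TU102)": (18.6e9, 754),  # (transistors, die area in mm²)
    "3090 (GA102)":   (28.3e9, 627),
}

for name, (xtors, area_mm2) in chips.items():
    print(f"{name}: {xtors / area_mm2 / 1e6:.1f}M transistors per mm²")
```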

    If you go by the leaks, the 3080 was never intended to be on the same GA102 as the 3090. The original plan was to use the GA103 die for the 3080. However, Nvidia changed those plans and shifted the 3080 to a cut-down GA102. GA103 was a mystery for a while, because its existence was well known. It finally showed up as the 3080ti laptop, and later there would be a version of the 3060ti built on it (cut down to match the 3060ti, so there is no advantage over the GA104-based 3060ti). Like the 3090, the 3080 was also clocked super high. This is the real reason for the power draw, combined with the larger-than-planned die. Ampere can do quite well with undervolting and low power.

    The point being, comparing to the 3080 isn't really the whole story. The 3080 that launched was a very different product, and since it used such a large die, it wasted a lot of power.

    At any rate, the 3080 is much closer to the 3090 in performance. The 3090 was something new in that it was an "x90". Jensen Huang introduced the 3090 as a GPU for creators. He made a point to say this first. They talked about gaming, sure, but the card was pitched to creators first. So everything has been shifted. We have had x80ti cards for a while, but the x90 name had only been used for dual-chip GPUs way back in the day. Now the 4090 is a mile ahead of the 4080. That is what people look at. Even the GA103 3080, if it had launched, would have been closer to the 3090. The 4090 is in a league all by itself.

    The gaps between each tier have been huge. Lovelace could have been one of the best generations ever; the tech is there. But that hasn't really happened. The 4090 is the best value in the line, which is just really strange.

  • marble Posts: 7,500

    This always happens to me. My 3090 died and I had to buy a new GPU - couldn't afford a 4090, so I opted for a 4070, but it only has half the VRAM of the 3090. I've already had the 4070 drop to CPU, so I'm back to optimizing scenes. A 4060ti with 16GB might have been a better (and cheaper) option for me, although I am very impressed with the 4070's performance, which is not far short of the 3090.

  • outrider42 Posts: 3,679

    Most GPUs have a 3-year warranty. You should still be covered. Even if you lack proof of purchase, the card is less than 3 years old. You should check your brand's website for information on how to RMA.

    Then if you get it repaired, you can run two GPUs at the same time, as long as the scene fits in the 4070's VRAM.

  • kyoto kid Posts: 41,057

    LeatherGryphon said:

    kyoto kid said:

    ...appropriate description of Intel.

    I already have a 12 GB 3060 but it's useless until I can upgrade to a more current board (still running on an X58 MB with a 12 year old BIOS and PCIe 2.0). Looking at about $940 for the project, which includes a new MB, a CPU (Ryzen 5900X), memory (64 GB with the option to upgrade later), a CPU cooler (capable of handling the 5900X), and W11 Pro.

    Been trying to put away funds, but it's very slow going with prices out here for other things like food, utilities, paying off back taxes I didn't know I owed, & such.

    Which motherboard are you looking at? 

    ...an Asus ProArt B550-Creator ATX AM4 

    https://www.newegg.com/asus-proart-b550-creator/p/N82E16813119414

  • PerttiA Posts: 10,024

    So far I haven't found the 16GB version in any of our local stores; usually new GPUs have appeared on the lists about a week before release.

  • Sevrin Posts: 6,307

    AgitatedRiot said:

    Bill Gates: 640K ought to be enough for anyone

    Enough with this.  From 1997

    Did Gates Really Say 640K is Enough For Anyone? | WIRED 

  • PerttiA Posts: 10,024

    Sevrin said:

    AgitatedRiot said:

    Bill Gates: 640K ought to be enough for anyone

    Enough with this.  From 1997

    Did Gates Really Say 640K is Enough For Anyone? | WIRED 

    As if he would admit to saying something like that now...

    https://www.daz3d.com/forums/discussion/comment/6996146/#Comment_6996146
     

  • PerttiA Posts: 10,024

    PerttiA said:

    So far I haven't found the 16GB version in any of our local stores; usually new GPUs have appeared on the lists about a week before release.

    And now they are there... Starting at EUR 570 (24% VAT included) 

  • ArgleSW Posts: 144

    If you can find a used Nvidia 3090 24GB (avoid the Ti version) for about $650 USD and it's within your price range, it's a game changer. I upgraded from a 10GB 3080 and was blown away by what an upgrade it was. From my experience, if you work with large scenes and high resolution, the amount of VRAM should be the number one priority. Even if you get the faster 4000 series with 16GB, the 24GB of VRAM makes a significant difference. I have yet to see the render fall back to CPU, and I'm pushing my scenes WAY beyond what I thought was possible, where 10GB would always fail. If there is one essential upgrade, my personal recommendation is to prioritize the amount of VRAM, then work around that.

  • Ron Knights Posts: 1,785

    I stopped into this thread just because I sometimes fantasize about winning the lottery. And sometimes I think I might actually Make Art and become a Recognized Artist! 

    The truth is that I finally "graduated" to a new PC system with a 6GB card a couple of years ago. Truthfully, I've been able to load everything in my 20+ year DAZ collection and render everything with what I have. And, truthfully, I've been in a Creative Slump for 20+ years.

    The more I read this thread, the more confused I got. Realistically, I have a limited income. In the past, my Dad has been gracious enough to buy some stuff for me. Dad is now 95, and I believe his days are numbered. I am fortunate to have what I have. I will continue to browse this and similar threads.

    Oh, and speaking of "Famous Quotes:" In the early 1990s, I told everyone "I don't need the Internet. I get everything I want from local bulletin boards!" What a laugh.

  • kevinso2001 Posts: 7
    edited July 2023

    If you're coming from a 3060, you'll only buy the 4060 ti 16GB if you can't afford an RTX 4080. But don't forget the RTX 4080 is just as fast as the RTX 3090 (non-Ti).

     

    But the 4060 ti has a smaller memory bus than the 3060. Sure, you can render bigger scenes, but there's a small chance that your render will be slower. Why would you spend money on a brand new GPU that could be a DOWNGRADE in some respects compared to the RTX 3060?

    So if you really want an upgrade in 2023, the best choice at the moment is to find a used RTX 3090 (currently selling for around $826, compared to $2,307 just 2 years ago). Barring that, it's better to keep your existing RTX 3060.

  • marble Posts: 7,500

    outrider42 said:

    Most GPUs have a 3-year warranty. You should still be covered. Even if you lack proof of purchase, the card is less than 3 years old. You should check your brand's website for information on how to RMA.

    Then if you get it repaired, you can run two GPUs at the same time, as long as the scene fits in the 4070's VRAM.

    Oh I did that. I had the receipt and I took it back to the store. They said they would send it back to the manufacturer under a warranty claim, but the whole process can take 2 to 3 months (I am in New Zealand, so the manufacturer is PNY, a US company, I think). Anyhow, I was not willing to wait 2 or 3 months without a GPU, so I bought a 4070. I've had it drop to CPU a couple of times, but that was due to some ridiculous texture map sizes and counts (why would shoes need 49 maps, each 4K???).

  • jd641 Posts: 458

    kevinso2001 said:

    If you're coming from a 3060, you'll only buy the 4060 ti 16GB if you can't afford an RTX 4080. But don't forget the RTX 4080 is just as fast as the RTX 3090 (non-Ti).

     

    But the 4060 ti has a smaller memory bus than the 3060. Sure, you can render bigger scenes, but there's a small chance that your render will be slower. Why would you spend money on a brand new GPU that could be a DOWNGRADE in some respects compared to the RTX 3060?

    So if you really want an upgrade in 2023, the best choice at the moment is to find a used RTX 3090 (currently selling for around $826, compared to $2,307 just 2 years ago). Barring that, it's better to keep your existing RTX 3060.

    Memory bus has nothing to do with rendering speed in Iray; it might take a few seconds more to transfer all the info over to the card, but once that's done, it's all handled on the video card itself. There's no back-and-forth transfer of data once the render has started, which is why this card may be really good for us once prices eventually drop.

  • RL_Media Posts: 339

    Too late for me, I bought a 4090 for the VRAM already. Won't be buying until 6xxx probably, unless they do something nuts like make 48GB cards before then lol.

  • kyoto kid Posts: 41,057

    ...well, they do have 48 GB cards, the RTX A6000 and the RTX 6000 Ada, but those are the professional (formerly Quadro) series cards.

  • billyben_0077a25354 Posts: 771
    edited July 2023

    RL_Media said:

    Too late for me, I bought a 4090 for the VRAM already. Won't be buying until 6xxx probably, unless they do something nuts like make 48GB cards before then lol.

    How hard was it to fit in your case?  It is a BIG card

     

  • kyoto kid Posts: 41,057

    ...yeah, size along with the power draw is why I'm sticking with my 3060 until there is more definitive information on pricing of the 4060 Ti and the RTX 5000 Ada (the latter of which will have 32 GB of VRAM and over 15,000 shader units). Yes, it will cost more than a 4090, but it will consume less than half the power (meaning less heat) and is a dual- rather than triple-slot card. 32 GB of VRAM would mean less time spent optimising scenes, particularly when rendering in large format.

    Unfortunately Nvidia dumped NVLink with the Ada Lovelace generation, so no more VRAM pooling.

  • outrider42 Posts: 3,679

    marble said:

    outrider42 said:

    Most GPUs have a 3-year warranty. You should still be covered. Even if you lack proof of purchase, the card is less than 3 years old. You should check your brand's website for information on how to RMA.

    Then if you get it repaired, you can run two GPUs at the same time, as long as the scene fits in the 4070's VRAM.

    Oh I did that. I had the receipt and I took it back to the store. They said they would send it back to the manufacturer under a warranty claim, but the whole process can take 2 to 3 months (I am in New Zealand, so the manufacturer is PNY, a US company, I think). Anyhow, I was not willing to wait 2 or 3 months without a GPU, so I bought a 4070. I've had it drop to CPU a couple of times, but that was due to some ridiculous texture map sizes and counts (why would shoes need 49 maps, each 4K???).

    2 to 3 months??? Ouch. What are they doing, shipping it on a camel...riding a rowboat over the ocean? 

    Granted, I live in the US, but completely across the country from California. I was out a TOTAL of 2 weeks when my EVGA 1080ti died by lightning. What makes it all even crazier is that I bought the card used off eBay. Though I also had a 2nd 1080ti, so I wasn't truly shut down; I just had my render speed cut in half, lol. But it is wild that I got a faster RMA on a secondhand product than most people who bought new. (Which is also why EVGA leaving GPUs is such a huge blow.)

    But the 4060 ti has a smaller memory bus than the 3060.

    That is meaningless to Daz Studio. The only time this comes into play is when the scene is sent to the GPU after you hit the render button. But this only adds maybe a couple of seconds at most to the loading time. Seriously. Even for animation it is a non-factor. We are still talking about moving data at 288 GB per second. That's gigabytes, and that is still a lot of bytes. For playing video games, this matters more, because video games move data around far more often than most other software, especially Iray. Iray gets the entire scene all in one single shot. But a video game is constantly moving data in and out, and this data has to move in tiny fractions of a second because video games are rendering (ideally) more than 60 frames every single second. So any hiccups in that process can cause a video game to stutter. This difference becomes more noticeable at high resolutions like 4K. But we don't have this issue with Daz Studio.
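    To put a number on that one-time upload (a sketch only: the copy to the card actually runs over PCIe, which is slower than the card's 288 GB/s memory bandwidth, so this uses a conservative hypothetical rate, and the scene sizes are just examples):

```python
# One-time Iray scene upload cost. The host-to-card copy goes over PCIe,
# so use a conservative effective rate, not the 288 GB/s VRAM bandwidth.
pcie_gb_per_s = 8  # hypothetical conservative host-to-device rate

for scene_gb in (4, 8, 12, 16):
    print(f"{scene_gb} GB scene: ~{scene_gb / pcie_gb_per_s:.1f} s one-time upload")
```

    Even a scene that fills the whole card adds only a couple of seconds, which is why bus width is a non-factor for us.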

    So the 4060ti is not a great GPU for playing video games, though it is still faster than the 3060, and the 16GB model is an upgrade in pretty much every way. Its VRAM capacity and cache actually make up for its bandwidth versus the 3060, and of course there is its raw compute. The issue, like with a lot of the 4000 series, is the price. It may be an upgrade over the 3060, but is it worth it? If you already have a 3060, you probably don't need this.

    At the same time, the 4080 costs more than twice as much as the 16GB 4060ti. So I can't say the 4080 is a great upsell, either. If the pricing were better on the 4080, then sure, it would be easier to upsell. But it isn't. In some benchmarks the 4080 is almost exactly twice as fast as the 4060ti, so you would be paying twice as much for twice the performance...I don't know how ideal that is. HOWEVER, this was not the case for all render engines. In Blender OptiX the 4080 is NOT twice as fast as the 4060ti; it falls short of that. So paying twice the price for the 4080 for Blender is not actually worth it. It is also worth mentioning that Iray is based on OptiX, but that doesn't mean the performance gap will be the same. We still do not have any benchmarks specifically for Iray with the 4060ti yet, so we cannot say for sure where it falls. But the odds are the 4080 is not a better deal in terms of performance per dollar.
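    Here is the performance-per-dollar arithmetic I mean, with placeholder numbers (the relative speeds and prices are illustrative, not measured Iray results):

```python
# Perf-per-dollar sketch. Speeds are relative placeholders (4060ti = 1.0)
# standing in for whatever benchmark you trust; prices are rough street prices.
cards = {
    "4060ti 16GB": (1.0,  500),
    "4080 16GB":   (2.0, 1200),  # "almost exactly twice as fast" in some benches
}

for name, (speed, price) in cards.items():
    print(f"{name}: {speed / price * 1000:.2f} relative perf per $1000")
```

    With those numbers the 4060ti comes out ahead per dollar, and if the 4080 falls short of 2x (as in Blender OptiX), the gap only widens.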

    That's the whole point of why people are looking at the 4060ti. Obviously a 4080 would be awesome, but it costs over $1000, and that is just something a lot of people cannot do. The more intriguing option is a used 3090, which we have covered. If I can buy a used 3090 for $600 and the 4060ti is pushing $600, I would just get the 3090. I know there is a fear of the unknown there, but you can minimize your chances of getting a bad one with some research and buyer protection.
