© 2025 Daz Productions Inc. All Rights Reserved.
Comments
...well according to Newegg they have (or should I say "had") both the standard 3060 and Ti model, all of which are already out of stock by this afternoon. Odd that the Ti version only has 8 GB of VRAM instead of 12.
I've gone from the current version to an older version before on the same scene...just did that today actually because 4.15 was giving me grief and I wanted to see if it would behave in an older version.
...when trying to open a scene in an older version that was changed and saved in a newer one, my experience has always been one of having a dialogue box that would pop up which mentioned the scene could not be opened because it was saved in a newer version.
I managed to buy a 3060 earlier in the UK, mostly through picking a model in advance, making sure I was all logged in and validated, and then refreshing the page immediately after five o'clock and bloody-mindedly clicking "Add to Basket", "Checkout" and "Pay" in rapid succession.
However, from the grumpy messages on Twitter, I think I must have been one of the last people to check out before the store ran out of stock.
I'm honestly surprised that more UK stores weren't implementing some kind of waiting list system, given that queuing is our national pastime. (Also, to be honest, much fairer than "I just rushed through so fast I couldn't even compare model pricing", even if it worked in my favour).
Been scratching my head here trying to figure out why some of you are so set on having the newest graphics card and why the rush to get it right now, despite the increase in price with each new release. Does the one (or 2 in some cases) you currently have not work any more?
Some of you say Newegg needs to ban those sellers/scalpers. Have any of you reported them to Newegg so they can be banned?
If Nvidia wants to stop scalpers, all it has to do is stop selling more than 1 or 2 cards to an individual, and sell a set number to legitimate stores that have a business license.
...interesting that the 3060, with an MSRP of $329, has the same amount of VRAM as, and more cores (including RT cores) than, my Maxwell Titan-X, which originally retailed for $1,000.
Crikey, my old 1 GB GTX 460, with 1/10th the number of cores, was only $40 less than the 3060's MSRP.
Nvidia can't stop scalpers like that. It's not like ticket scalpers having to pay people to stand in line for limited allocation concert tickets. Four bots buying one card each are as easy to run as one bot buying 4.
Today's 3060s are already showing up on eBay.
I just might have to give up on Nvidia cards, or any GPU upgrade actually, until PCIe 5 cards start showing up. At that point maybe I can get one that hasn't been used as an overclocking example on some gamer site, or run 24/7 for 2 years in a mining rig.
The problem with hoping Newegg or Nvidia will stop scalpers is that scalpers' money is just the same as everyone else's.
Theoretically, long term, if enough people get upset and the scalpers go away, they might lose money, but that's theoretical future profit, and companies tend to value that waaaaaay less than current profits.
I'm not sure if there's any actual regulation that could effectively fight it either.
I really hope Bitcoin and the like crater. They're stupid, and mining them uses more energy than most countries. And did I mention they're also stupid and pointless?
They work, but some folks are on a never-ending quest for faster render speed, ideally less than a minute per frame. I know some artists who say 30 seconds is too slow for them. ;P
I'm a gamer first and digital artist second. I have a 1080TI. New games coming out these days need better than that when gaming above 1080p. The conversation has also been hashed and rehashed about how Iray is moving more and more towards not fully supporting (or wanting to support) GTX. RTX is the next wave and folks will need to jump on that train if they want to keep up.
They can't stop scalpers from buying them, but they can stop scalpers from using their platform to sell from. Just like when Amazon was asking people to report ads for $12 rolls of toilet paper. If Newegg sees someone selling a 3090 for twice the MSRP, that ad should be tanked and the seller banned.
Yeah, I fought pretty hard to get my current rig built early last year. I'd already delayed a year because I got hospitalized for two days and my husband lost his job at the same time (I couldn't justify replacing a still-working computer under those circumstances, even though it was time to upgrade to better), but it did mean I started with an RTX2060 at least, so I can render with some decent speed - my rig was built to be a render machine first, gaming rig second. The current situation does mean I've had to put off (perhaps indefinitely) the idea of upgrading the video card on the virtual pinball machine downstairs, or at least, that upgrade might well have to morph into replacing the entire computer inside instead of just the video card.
Doesn't mean I'm not still highly annoyed at the constant stripmining sweeps of the cryptominers and the scalpers keeping these things out of the intended market.
The card I'm upgrading from is a GTX 1650, so I don't think you need many guesses as to why I might have been eager to upgrade.
As is, I've been limping on with a particularly underpowered card for some time, as I didn't want to commit to buying a 20XX card relatively late in the first RTX generation - it made sense to wait for the 30XX series. And well, by this point, I was kind of frustrated, and the 3060 was the first of Nvidia's line-up I felt was reasonably well suited to my needs. (I'd've liked the 16GB 3070 that was at one time rumoured, but that doesn't seem likely to be happening any time soon).
The reason something has to be done to prevent this scalping is that it WILL come back to haunt them really hard in the end. When previous crypto booms went bust, they cost AMD and Nvidia a lot of trouble and money (money obviously being far more important). Nvidia was sued by their own investors. AMD was affected by a previous bust so badly that they nearly went bankrupt. Both companies boosted production only to be stuck with unsold stock when the crypto bubble burst. When the bust comes, the market is flooded with cards, and the used market drops like crazy. Customers will remember how they were treated during the boom, and hardware makers will need to earn consumer trust back.
Thus miners are a cancer to the entire industry. They buy up everything during the boom period, and then flood the market with used cards when it busts. This does not happen with normal buyers, who use a GPU for a few years before buying a new one. Some people don't even resell their GPUs. Plus, miners might not buy new cards at all during a bust.
And keep in mind that AMD and Nvidia do not get to see the inflated prices. They might sell a lot of units, but the retailers are the ones making extra markups, and of course the scalpers. Don't think the retailers get off easy on this, either. If retailers jack up prices they risk their reputations with customers. While they might sell to people who desperately want a GPU today, they may be in for a rude wake up call when stock levels return to normal. There will be a lot of businesses that crash and burn after this crypto boom ends.
...before the last cryptomining craze, I was saving up for a GTX 1070; then, almost suddenly, prices went through the roof. $379 was already a pretty big expense on a tight budget, but $800 to even $1,000 was out of the question (PC Gamer even quipped that one would be just as well off shelling out extra for a Titan-XP).
Suddenly I'm feeling relatively satisfied with my little run-of-the-mill 6GB 1660. At least I have something that can finish an Iray render before I grow a beard. But I will pounce on a 3060 if one wanders my way when/if things return to normal. I can't justify even $500. I have to think really hard to justify $329.
...yeah, I already have a 12 GB card; it may not be state of the art, lacking RT and Tensor cores, but it's far better than rendering on the CPU.
If I were going to get anything new it would be a step up from that, like an RTX A5000 when those come out (I may actually be able to get my hands on one, and it might even cost a bit less than a 3090, given the current price gouging by scalpers). The pro workstation cards don't seem to have the same appeal to miners, as they have lower clock speeds, don't use the same drivers, are often sold as OEM parts in full systems, and overclocking them causes instability.
I really do think the 3060 will be a solid card for Iray, mostly for the 12GB. And you guys might get the chance: reviews from gaming sites are mixed at best. Some of this is because reviewers are getting jaded, but for gaming tasks the 3060 is kind of underwhelming. The 3060 Ti is way more powerful, so much so that the naming for the 3060 makes little sense.
However, like I wrote in the benchmark thread, Ampere is a beast for Iray. Even the slower RTX cards are faster than the fastest GTX cards. In that benchmark, my 1080 Ti does 4.6 iterations per second, but the 3060 Ti has a bench doing 10.6. Do the math, people: the 3060 Ti, at least in this specific scene, is well over twice as fast as the 1080 Ti. The 3060 is not going to hit that mark, but it doesn't really have to. If the 3060 is 25% slower, that would still place it at close to double the speed of the 1080 Ti. The 1080 Ti was always the attractive card because of its VRAM and speed for its time, and now a GPU that *might* cost $330 could be almost twice as fast rendering Iray. So the 3060 would be a fantastic upgrade for any Iray user who still has a GTX card, no matter which GTX it is. It makes less sense if you already have an RTX card, though compared to the 2060 it would be a very good upgrade...and double the VRAM.
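For anyone who wants to sanity-check that arithmetic, here's a quick back-of-envelope in Python. The iterations-per-second figures come from that one specific benchmark scene, and the 25% gap is just my hypothetical for where the 3060 might land:

```python
# Benchmark-thread figures (one specific scene; your results will vary)
gtx_1080ti = 4.6   # iterations/sec
rtx_3060ti = 10.6  # iterations/sec

print(round(rtx_3060ti / gtx_1080ti, 2))  # 2.3 -> ~2.3x the 1080 Ti

# Hypothetical: if the plain 3060 lands 25% below the 3060 Ti
rtx_3060_guess = rtx_3060ti * 0.75        # ~7.95 it/s
print(round(rtx_3060_guess / gtx_1080ti, 2))  # 1.73 -> still well over 1.7x
```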
Benchmarks today show the 3060 just matching the 1080 Ti in games; sometimes it is slower. The only games where it is faster involve ray tracing or DLSS (Tensor cores), and those games are still rare today. So gamers are less impressed by the 3060, as a lot of GPUs are faster now.
So if gamers are apprehensive about the 3060, that could leave you an opening to pick one up once these things start to become more readily available.
Oh, and if anybody does get one, if you are willing, it would be really fantastic to see what its benchmark times may be.
https://www.daz3d.com/forums/discussion/341041/daz-studio-iray-rendering-hardware-benchmarking/p1
...yeah, not being a gamer here the standard 3060 would be the better choice.
Yeah. And if the 3060 can hit say, 8 iterations per second (now I have no idea where it will actually land), that would be pretty solid over a 1080ti, and certainly over any GTX around. The 3080 has not been benched in 4.14 or 4.15, which showed a big performance boost over previous versions. However, it did do 12 iterations per second in 4.12. The 3090 gained I think about 4 or 5 iterations, while the 2080 gained about 2. So I would assume the 3080 would be in the middle, gaining around 3, maybe 4 iterations per second over 4.12. That would place it at 15 or 16 iterations per second. So in this purely hypothetical situation, two 3060s would hit close to 16 iterations per second, while offering more VRAM to the user.
If that is the case, then buying two 3060s might make more sense than a single 3080 for Iray users. Of course, that is assuming prices are near MSRP. With how hard the 3080 is to get, it may not matter; the 3060 might be the only one possible to get.
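To make the two-3060s-versus-one-3080 comparison concrete, here's the purely hypothetical arithmetic in Python. The 8 it/s figure and the 4.14/4.15 gain are guesses from the post above, Iray's multi-GPU scaling is close to but not perfectly linear, and VRAM is not pooled (each 3060 still tops out at 12GB per card):

```python
# All numbers hypothetical, as stated above
rtx_3060_guess = 8.0      # it/s, if the 3060 lands here
rtx_3080_in_412 = 12.0    # it/s, measured under Daz Studio 4.12
gain_in_415 = 3.5         # guessed gain moving to 4.14/4.15

rtx_3080_est = rtx_3080_in_412 + gain_in_415  # ~15.5 it/s
dual_3060_est = 2 * rtx_3060_guess            # 16.0 it/s, assuming near-linear scaling

print(rtx_3080_est, dual_3060_est)  # 15.5 16.0
```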
But then Nvidia might just pull that 3080 20GB out of a hat after all, and throw the VRAM thing into disarray.
Nythfall...love that spelling by the way lol. I agree about everyone wanting the faster render.
Considering I have been using a 15-year-old dual quad computer, with an ATI Radeon HD 3400 series graphics card and 4 gigs of RAM, my renders are on average 20 to 30 minutes long, so when someone complains about a render taking 30 seconds, I have an overwhelming urge to hit them on the head with a hammer. I would kill for a 1080 graphics card and would be in heaven with it for quite some time. Fortunately I have most of the parts for a new computer, an i5 CPU and 16 gigs of RAM and other stuff; I just need a graphics card. My daughter is looking for a second-hand 750 for me, hopefully so my OpenGL will be 3.3 instead of 2.2 and I can use the new Blender, have less of a headache using Iray (though it doesn't really interest me), and finally use dForce, which I currently can't.
Now if one is a gamer, which my son is, and I currently know more about gaming than I ever thought I would (can't believe I spent over 5 hours on a Final Fantasy wiki reading character bios last night, and I'm not a gamer), I can certainly understand wanting a 20xx card; with today's games, it's a given. However, I don't think the 30xx series cards are needed at this time, since games requiring them are kinda rare right now.
If one wants to create a game or animation, I can understand wanting something in the high 20xx or even possibly the 30xx series. It makes sense, since technology is moving that fast, with games and animation trying to keep pace with the big production companies. I remember when the first Final Fantasy movie came out (the one not related to the games) and thought the animation was amazing; then FF Advent Children came out and I drooled over its graphics and animation; then my son showed me the trailer for CryEngine4 (or 5, can't remember which) and I thought I was going to die. I think you get the picture...lol I've been in love with CGI since my first ever viewing of it, though after all this time my interests are not in realism, so I don't really need Iray.
Only thing left after what I mentioned is single images, and I can't understand why anyone would get angry if a single image takes more than a few minutes to render. I get wanting to use Iray because of the realism one can achieve; you just don't need the latest card for a single image, or even a series of images if one does a comic or graphic novel and the like. Unless one works for a company that demands these things be completed every week, and I highly doubt that, there's no need for the speed these new cards bring.
....if you are going to go for an older card I'd be careful about early generation cards as some are no longer supported for Iray (like Fermi and I believe more recently, Kepler). Maxwell (900 series and Titan) is still safe as well as Pascal (1000 series).
Someone posted a link about the cards Nvidia will no longer support, so I sent it to my daughter, since she is the one searching for a card for me. She and her husband have a tech shop and basically travel all over the island and mainland doing contract jobs. This summer they will be upgrading all the computers for every post office in Newfoundland and Labrador from Win7 to Win10.
Interesting thought experiment: I am 100% certain that the ultimate value of all cryptocurrency will end up at 0. Consider the relationship between current fiat currency supply and energy availability (especially electricity; how much physical money do you carry anymore?) and then ask yourself what the ultimate value of crypto is in the event of a disruption or shortage of energy availability (those in the US can use the recent ice storms in Texas and the southeast as a starting point). With no electric grid and no internet, all the crypto wallets become worthless. Fiat currency is also impacted but will still maintain some residual value (scrip and coin were a store of value well before electricity was invented). I laugh at people who maintain that crypto will make fiat currency obsolete and that it will continue to increase in value. It follows that any policies or actions that have intended or unintended consequences on energy availability will have the potential to inflict significant economic disruptions.
TLDR version: a crash in cryptocurrencies is coming, it will as always be "unexpected", and it will be breathtaking to watch. The only downside is that I expect it will also spill over into other markets (due to leverage/speculation) and lead to a significant economic crisis. At least GPUs will become more available :D
I think you may be right about it impacting other markets, especially since I am seeing some stores, when I am out shopping, state that they accept Bitcoin as a form of payment.
In a way, Iray and gaming are not so different. When it comes to hardware, it is always about what you want, and how much it is worth to you. If you just want to play at 60 fps at 1080p, then a huge number of available GPUs going back years will do the job. But if you desire high frame rates, then the hardware to push those rates goes way up, even for games that are not that demanding. It also depends on the games, too. Some games are way more demanding than others. If you want to have "Real Time Ray Tracing", well you need RTX to do so.
People who play competitively will want the highest frame rates they can get, and generally stick to 1080p because they can push higher rates with it. Others may want the visual fidelity and high resolutions. These can have slightly different demands. You need a good CPU to crank high frame rates, while you might be able to slide at 4K because the GPU becomes the bottleneck.
Then other things can factor, like RAM speed, SSD, and so on. A gaming PC benefits from being balanced, all the parts kind of need to be on par with each other or you will bottleneck somewhere. The GPU is obviously important, but the other parts matter a lot, too.
But Iray is very different here, because it really doesn't demand that much of those components. Iray pretty much relies on GPU power and little else. So you can get away with some very "unbalanced" computers that have old CPUs and parts, but a shiny new RTX. You really only need a computer that can handle the Daz app itself, and enough RAM to handle what you throw at it. But the speed of the RAM is not important at all. Just that you have some.
Iray compresses data a lot when it sends it to VRAM, but the whole scene remains uncompressed in system RAM. So if you buy a big-VRAM GPU, you run the risk of running out of RAM before you can fill it. Many of my scenes use at least twice as much RAM as VRAM. So for an extreme example, if you got a 3090 with 24GB of VRAM but your system only has 16GB of RAM, then you will never be able to fill that 24GB of VRAM, because you will run out of RAM first. Even with 32GB of RAM, you still run a very strong risk of running out of RAM before getting close to the 3090's limit.
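As a rough rule of thumb based on the 2:1 RAM-to-VRAM ratio I see in my own scenes (both the ratio and the OS/app overhead figure here are my assumptions, not anything official from Nvidia or Daz):

```python
def min_ram_for_vram(vram_gb, ratio=2.0, overhead_gb=8):
    """Rough system-RAM estimate for filling a given amount of VRAM:
    the scene sits uncompressed in RAM (~ratio x its VRAM footprint),
    plus some headroom for the OS and the Daz app itself."""
    return vram_gb * ratio + overhead_gb

print(min_ram_for_vram(24))  # 56.0 -> a fully loaded 3090 wants far more than 32GB
print(min_ram_for_vram(12))  # 32.0 -> a 3060 is comfortable with a 32GB system
```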
As if on cue, there was a post yesterday from a user with exactly that issue. They have a 3090 and 64GB of RAM. They were testing how many Genesis 8s they could cram into a scene with a pool (no optimizing, BTW). They were up to 14 when Daz crashed, even though they were only at 17GB of VRAM. It turns out they were exceeding their 64GB of RAM, and that was causing the crashes.
So they need to either optimize the scene or replace their RAM with a 128GB kit if they wish to keep adding Genesis 8s.
Still, they can get to 17+GB in the first place, and that is something most people cannot do, because most people do not have 3090s or cards with that much VRAM.
Just some food for thought. Some components are not being scalped right now, so it is possible to at least build a PC, just maybe not with a brand new GPU. But you can build the rest.
Which, on that note, I actually have done. I bought a 5800X, an X570 motherboard, and 64GB of RAM for my new build last week. It is up and running, but I haven't installed everything yet, so I have two PCs right now, and the new Ryzen is sitting in my old case with my old power supply and GTX 670. It's sweet, that case is from 2002, LOL. That's just so I can keep running my current system while I install my stuff. I'd still use that case if it wasn't filled with drive bays that take up half the case; it's a very old-school design, but it is the sexiest blue steel case ever (if I do say so myself).
While I'm not too worried about the overall final render - I can put that on while I'm out walking the dogs, or out for the day (at least back in times when "out for the day" was a thing) - on my old 1050 Ti I had some scenes that took more than half an hour to render a test image to a level where I could start to get more than the vaguest idea of whether the lighting was catching certain details right.
When it takes multiple attempts to get some of these things right, that gets really time consuming and frustrating.
Having previously used a 1050 Ti or 1650 for Iray, and even done the odd few renders on a 3rd Gen Core i5 laptop, I know it's possible to get by with less than the latest and greatest hardware, but I'm also not going to begrudge people getting frustrated with that, particularly when many modern assets are starting to demand more processing power and memory.
~~~~~
As far as the question of assets demanding more processing power and memory... well, it depends on how well the 3060 takes off, but it is the card Nvidia is aiming at the ~10% of gamers still using a 1060 (still the most common card in use).
If the 3060 becomes commonplace, mid-range gaming systems having 12GB of VRAM on tap represents a big leap in what the average PC in the Daz market will be able to handle. Could be interesting as far as how adventurous PAs can expect to be and keep their work saleable to a large portion of the audience.
For those frustrated by the challenges in obtaining the latest GPU hardware, I offer a couple of options for consideration. They may allow you to achieve render times VASTLY better than the relatively minor improvements you might see from next-generation GPUs (historically, cutting render times by only 30-50%). Personally, if my renders are taking more than 5 minutes, I figure I have a problem that needs fixin'.
1. "Less is More": I take it as a fun challenge with every scene to figure out how to have the fewest objects in the scene while still conveying the location and mood. Empty backgrounds render VASTLY faster than indoor/enclosed backgrounds or those with lots of stuff. Characters and their interactions are usually the points of interest in a scene, so the surroundings often serve a very minor purpose (or are even an annoying distraction). An example I've used before: if your character is in a bar, what is the one object that could immediately define the location? A neon beer sign on the wall with an empty black background. Or, outside on a farm? A clump of tall grass on the ground and maybe a bale of hay, with a light background. And so on. Not only will that render quickly, it will take a LOT less memory (RAM and VRAM), and it will let you navigate your scene in Iray preview, which even a fast GPU might not manage with a heavy scene. And it will focus attention on the important character, not drag it to irrelevant surroundings.
2. Compositing: What the pros do; it's basically rendering an image in layers. It allows you to put, say, a rendered character on top of a photo background, or have ultimate control over lighting, shadows, depth of field, colors, and tons of other stuff, much of which takes a huge amount of render time to reproduce. And with animations you can do ONE render of a complex background image, and place an animated sequence of renders of only your character in motion on top. Much faster than doing, say, 120 frames of 1-hour renders for a 4-second animation.
3. Scene Management: This is my term for recognizing what situations cause huge render times and trying to fix them. For example, a render inside an enclosed box (building, room, whatever) usually takes tons more time than an unenclosed render. So you manage your scene so as not to use enclosed spaces. And so on...
The list is much longer, but suffice it to say that there are alternatives to expensive GPUs that provide vastly faster renders while giving you far more artistic control and, for many, a much better and more satisfying result. Not for everyone, but something to consider at least.
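To put a number on the compositing point, here's a quick sketch of the animation example above (the 5-minutes-per-character-frame figure is an assumption for illustration; the 1-hour full-frame figure is from the example):

```python
fps = 30
seconds = 4
frames = fps * seconds               # 120 frames for a 4-second animation

# Rendering every frame in full, at 1 hour each:
full_hours = frames * 1.0            # 120 hours

# Compositing: render the complex background ONCE (1 hour), then
# character-only frames at an assumed 5 minutes each:
composited_hours = 1.0 + frames * 5.0 / 60

print(full_hours, composited_hours)  # 120.0 11.0
```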
Well, I'll be benchmarking later (once my new PSU has arrived and I can actually use my 3060), but I'm ballparking the improvement I'm expecting relative to my 1650 at more like 75-80%.
Bear in mind, a lot of people will be jumping two or three generations (while my 16 series was *kind of* last generation, the fact that it's non-RTX means it's a lot closer to the 10 series than the 20 series), and two or three rounds of 30-50% compound.
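The compounding is easy to verify: cutting render time by 30-50% per generation, across two or three skipped generations, multiplies up quickly.

```python
def remaining_time(per_gen_cut, generations):
    """Fraction of the original render time left after cutting it
    by per_gen_cut each generation, compounded."""
    return (1 - per_gen_cut) ** generations

# Three generations of 40% cuts:
print(round(1 - remaining_time(0.40, 3), 3))  # 0.784 -> ~78% overall cut
```

Which lines up with the 75-80% improvement I'm ballparking over the 1650.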
How small are you people rendering?
I know my 1650 isn't the greatest card in the world, but I've had several renders at sizes like 2400x2400 (which is small compared to some people's renders) that, even after carefully optimising them not to crawl, still took five, six, nine hours before they were even clean enough to hand to a denoising tool, and I don't think even some dual-3090 beast could reasonably get those render times down to under ten minutes.