GRAPHICS CARD - VRAM/SYSTEM RAM Conundrum + Myth Busting

edited May 2023 in The Commons

Ok.... 
Here is what I am working with.

GPU: Nvidia RTX 3060 12GB
PROCESSOR: AM4 Ryzen 9 3900X
SYSTEM RAM: 32 GB DDR4 3600 (PC4 28800) CAS 16
OS DRIVE: MSI SPATIUM M480 M.2 2280 2TB PCIe Gen4x4
DAZ Library Drive: Team Group MP34 M.2 2280 4TB PCIe 3.0 

Ok, so what I do with this computer is:
45% Daz/3D graphics programs (Daz/ZBrush/Blender)
45% Internet/e-mail/general office
10% Gaming - World of Tanks/VN kind of stuff
So a typical rainy Sunday afternoon will have me rendering a scene in Daz, with Renpy open along with a Firefox browser showing the Daz store and forum.

I might be good as is for what I am doing, but RAM is getting so cheap that I thought I might go to 64GB.

With today's "modern" computers, I have read conflicting arguments about adding RAM. Some say that you HAVE to buy it all at once and get matched sets, etc. Others say it doesn't matter much: you might take a slight performance hit, but you can add more RAM to what you already have.

So... I could add another two sticks of the exact same memory I already have (32 GB DDR4 3600 (PC4 28800) CAS 16), or, for just a little bit more, get a 64GB (2 x 32GB) Patriot Viper 4 Blackout Series DDR4 3600 CL18 kit.

That would have me running 32GB of CL16 and 64GB of CL18 for 96GB of mismatch, or I could take the 32GB out and go with the new matched 64GB of CL18.
With my use case, is the performance hit going to be that big a deal? Can it even be done? At the end of the day, I am guessing that as long as my scene has less than 12GB (for the sake of argument) worth of assets, it's all going to be rendered on the card... but if I go to 14GB (outside the card's capacity) it drops to the CPU, and will RAM, mismatched or not, matter then?
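One way to sanity-check whether a render actually stayed on the card is to watch VRAM while it runs. A rough sketch, assuming Python 3 and that nvidia-smi is on the PATH; the 5-second polling interval is arbitrary, and this is only an inference, since nvidia-smi does not report Iray's device choice directly:

    # Poll GPU memory use while a render runs; if VRAM stays flat while the
    # render crawls, Iray has likely dropped to the CPU. Ctrl+C to stop.
    import subprocess, time

    def vram_used_mb(gpu_index=0):
        out = subprocess.check_output([
            "nvidia-smi", f"--id={gpu_index}",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader,nounits",
        ], text=True)
        used, total = (int(x) for x in out.strip().split(", "))
        return used, total

    while True:
        used, total = vram_used_mb()
        print(f"VRAM: {used} / {total} MiB")
        time.sleep(5)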

Is there any REAL difference, for my use case, in having CL18 as opposed to the CL16 that's in it already?

Again, for my use case, I am not sure it matters and I may be good as is, but eventually I'd like to get a 4080/4090 and the extra RAM will help. Also, would there be any performance benefit (for what I do) in upgrading to anything in the Zen 3 CPU line (5900X)?

Thanks, in advance

 

Post edited by pjwhoopie@yandex.com on

Comments

  • AgitatedRiot Posts: 4,437

    I know you had to match RAM in the old days, but nowadays, not so much. I still match my RAM modules, though; mixing and matching RAM modules isn't the best for system performance. My two cents.

  • oddbob Posts: 396

    The 5900X looks to be about 20% quicker than a 3900X in single- and multi-core performance; I don't think you'd see much real-world difference.

    The mixed RAM may work, but you'd have to use the timings of the slower kit. I had 3200 CL16 and CL18 in an Intel board, but the BIOS would only report and allow one set of XMP values depending on which slots the RAM was in. It required some swapping about, but it did run OK in the end. It would be safer to buy the 64GB knowing that if you don't get both sets to work together, you still have 64. You won't see any noticeable difference between CL16 and CL18 at the same speed.
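    To put a number on the CL16 vs CL18 part: real-world CAS latency in nanoseconds is the CL count divided by the memory clock (half the transfer rate). A quick back-of-the-envelope sketch, just the arithmetic, not a benchmark:

        # True CAS latency for DDR4-3600 at CL16 vs CL18.
        # latency_ns = CL cycles / clock_MHz * 1000, where clock = transfer rate / 2.
        def cas_latency_ns(cl, transfer_rate_mts):
            clock_mhz = transfer_rate_mts / 2   # DDR does two transfers per clock
            return cl / clock_mhz * 1000        # cycles -> nanoseconds

        for cl in (16, 18):
            print(f"DDR4-3600 CL{cl}: {cas_latency_ns(cl, 3600):.2f} ns")
        # DDR4-3600 CL16: 8.89 ns
        # DDR4-3600 CL18: 10.00 ns

    Roughly a nanosecond per access either way, which is why it doesn't show up in renders.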

    Have you considered waiting and doing a full build later rather than swap parts and possibly end up with annoying bottlenecks?

    I put a 4090 into a 10700K system and it was OK, but it felt like it was being held back. It rendered like a 4090 does, but doing test renders on single figures, the load times were often longer than the render times; upgrading to a current-gen CPU/board/RAM setup made it feel a lot more balanced. For gaming, the 10700K struggles to drive a 4090, even at 4K; I was seeing hard FPS limits where changing graphics settings made little to no difference. I was going to wait until the next round to upgrade, but either the board or the CPU took a dump, which kind of forced my hand.

  • nonesuch00 Posts: 18,131

    RAM doesn't have to be bought all at once or be the same brand. If the RAM speeds don't match, it will run at the speed of the slowest RAM. It doesn't even have to be the same size.

    If I were you, I'd upgrade to the maximum RAM the board supports; if that's higher than 64GB, e.g. 128GB, I'd go for it. After that, the next target should be an upgrade to an RTX 4090, or, if you have to scrimp and save for a long time for such expenses, maybe an RTX 5090.

    Why 128GB or higher? Because in about 5 years I think you will be able to animate and render animations simply by writing a simple script or reciting actions to your computer (OK, in my opinion). And for the results of that work not to absolutely crawl for weeks while being generated, you need the RAM and top-of-the-line graphics. Go for the top-of-the-line graphics card over RAM, though; RAM only if it's really cheap.

    Myself, my DAZ library plus DAZ installers is at about 1.5TB on a 2TB SSD, so after thinking I would do what you did with a 4TB USB SSD, I decided to instead go with a 2TB USB SSD (installed DAZ library) plus a 2TB USB SSD (DAZ DIM install files), since a failure of one SSD won't be as catastrophic. I will be buying matching-size backup SSDs as well, but all in good time given the lower failure rate.

    I also have four 1TB USB SSDs, one each for Unity I/O, UE4 I/O, Blender/MD/DAZ I/O, and general compilation I/O.

    I'm going to have to read up on whether it's possible to install all of those in a "USB on a stick" style, although practically speaking, except for Blender, I will always be using the same desktop for the others.

  • LeatherGryphon Posts: 11,512
    edited May 2023

    Maybe this is a repeat of what somebody else has already said, but my experience over the last three years, with DDR4 RAM on good-quality modern (at the time) motherboards with adequate specifications, has been that mixing memory sizes and speeds can work at default speeds.  BUT, if you push them, and your motherboard has four slots for memory, you may have better luck using just two of them if you want to get the full rated speed from your RAM.  When I used Intel's XMP feature (or AMD's corresponding feature) to unlock the rated speed of the RAM, it sometimes just wouldn't work, and I had to disable XMP and run at the default speed of 2133 (or whatever it is) instead of the 3200 I'd paid for.  So, I played musical memory sticks among my machines for a while, trying to find a set that would give me 64GB in 4 slots at 3200 speed, but never got it to work.  During that period, my habit was to buy the same memory manufacturer and type of RAM (2 sticks of DDR4, 16GB, Corsair Vengeance, 3200, same latency), well within the "should work" category.  I never did buy a factory-matched set of four 16GB sticks; I was afraid I'd get stuck with 4 extra sticks.  So, I finally gave up, bought two same-type but 32GB sticks, left the other two slots empty**, and called it a win.  The two leftover 16GB sticks eventually found a home in another computer.  Wheee... 

    I haven't built a computer in about a year, so I haven't kept up with current problems.  Just sayin'.

    **(Empty slots make me sad.)

     

    Post edited by LeatherGryphon on
  • frank0314 Posts: 14,059

    I tend to get RAM that matches the model type, size, frequency, latency, and manufacturer so I don't have any "possible" conflicts. 

  • PerttiA Posts: 10,024

    frank0314 said:

    I tend to get RAM that matches the model type, size, frequency, latency, and manufacturer so I don't have any "possible" conflicts. 

    Same here 

  • LeatherGryphon Posts: 11,512

    Perhaps my experience was due to defective electrons, or forcing red electrons down blue wires.

  • generalgameplaying Posts: 517
    edited May 2023

    Technically you can mix RAM with different frequencies and timings; in practice it boils down to the best specs that all modules support at the same time. You probably shouldn't try mixing incompatible types (ECC with non-ECC, DDR3 with DDR4 or DDR5, really odd stuff).

     

    Performance-wise, identical specs are best for compatibility, which can also help performance, and making sure dual-channel mode actually works is another benefit. Typically that means using RAM sticks of identical capacity; there may be controllers that do well with A+A and B+B across four slots, but with lots of memory used by one application, as with 3D rendering, having all sticks the same capacity is certainly preferable. The number of channels also depends on the motherboard and CPU specs. Most consumer motherboards with 4 slots are still only dual channel, so it's crucial to check whether your envisioned configuration will run in dual-channel mode at all. Quad channel typically means a workstation or server platform (e.g. Threadripper and up for AMD). So matching capacity may have the largest impact on performance, next to "not working at all" (single-channel mode costs 10-20% in some tests and taxes the CPU more, too).
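    For a rough sense of why channel count matters more than timings: each DDR4 channel is 64 bits (8 bytes) wide, so theoretical peak bandwidth scales with the number of channels. A small sketch of the arithmetic (theoretical peaks, not measured numbers):

        # Theoretical peak DDR4 bandwidth: transfer rate (MT/s) x 8 bytes x channels.
        def peak_bandwidth_gbs(transfer_rate_mts, channels):
            return transfer_rate_mts * 8 * channels / 1000  # MB/s -> GB/s

        for channels in (1, 2):
            print(f"DDR4-3600, {channels} channel(s): {peak_bandwidth_gbs(3600, channels):.1f} GB/s")
        # DDR4-3600, 1 channel(s): 28.8 GB/s
        # DDR4-3600, 2 channel(s): 57.6 GB/s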

     

    Concerning "fastest", also be on guard about the specs of the motherboard and the CPU, as there might be limitations like the fastes clock frequencies only being compatible when using two ram slots, not all four, for instance, which might limit the maximum amount of memory at that frequency, or force you to use a lower clock frequency with 4 slots in use, if all the RAM modules support that.

     

    To be on the safer side, at least the RAM type (DDR4, for instance), capacity, clock frequency and latency should match for the desired configuration. From there, matching the same manufacturer and series, or buying a single kit, is preferable, in that order (last is best). And certainly there are more factors, like heat dissipation, airflow in general and space management inside the case :).
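    If you want to see exactly what is already installed before ordering anything, Windows can list each module's manufacturer, part number, capacity and speed. A minimal sketch, assuming Python on a Windows box where the legacy wmic tool is still present (PowerShell's Get-CimInstance Win32_PhysicalMemory reports the same fields on newer systems):

        # List installed DIMMs so mismatched kits are easy to spot before buying more.
        import subprocess

        out = subprocess.check_output(
            ["wmic", "memorychip", "get",
             "Manufacturer,PartNumber,Capacity,Speed,ConfiguredClockSpeed"],
            text=True,
        )
        print(out)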

    Post edited by generalgameplaying on
  • oddbob Posts: 396

    LeatherGryphon said:

      So, I played musical memory sticks among my machines for a while trying to find a set that worked to give me 64GB in 4 slots at 3200 speed.  But never got it to work. 

    I got 2 x 16GB CL16 and 2 x 32GB CL18 to run at 3200 in an Intel Z490. It took some trial and error as to which sticks needed to be in which slots so the board would read the XMP values from the slower RAM. It wouldn't even boot otherwise.

  • oddbob Posts: 396

    frank0314 said:

    I tend to get RAM that matches the model type, size, frequency, latency, and manufacturer so I don't have any "possible" conflicts. 

    It's a fair approach and it'll give you a fighting chance, but you could still get chips from a different manufacturer or chips with different sub-timings. RAM chips are typically made by Samsung, SK Hynix or Micron, regardless of who assembled the module and put their name on it.

    Another issue is manufacturers releasing products and then downgrading them later with slower and cheaper parts but keeping the model name the same. This has happened lately with SSD drives.

  • Mouser Posts: 675

    One thing you may want to investigate, especially if you have several of these applications running at the same time:

    Make sure you are using 64-bit versions of the software, right down to the simplest (email, etc.).

    Sometimes a 32-bit version of an application can flip a flag somewhere in the system and the OS (logically) starts behaving like a 32-bit OS and no longer accesses all the RAM you (physically) have.

    Apple's iOS dropped 32-bit support in its current OS version due to this very issue.

  • Saxa -- SD Posts: 872

    Story for you guys.

    A previous PC, upgraded from 2 DIMMs to 4. Made sure they were all the same, despite buying them years apart.  Worked perfectly.

    My latest PC build.
    Bought 2 sets of 2 DIMMs for a total of 4.
    At the same time, from the same vendor, and otherwise perfectly matched sets.  Top-reputation memory maker.

    Put them in the PC thinking it would be fine.
    Yes, the base memory clock worked.  But not XMP, which is what memory is usually advertised at.
    If I remember right, the difference between base RAM speed and XMP is about double.

    The vendor pretty quickly showed disinterest despite my having bought a lot from them.  They probably knew it would be involved.
    Contacted the memory maker and got lucky.

    Very long story short, I got verification that the 2 sets of memory had different timings, and a suggestion on what to use.
    This meant in this case that in the advanced BIOS I had to change not just 4 settings, but all 12 to 16 of them (it's been a while since I did this).

    XMP finally worked properly.

    That took a lot of time communicating and doing, and stress.
    Lesson for me: buy 4 DIMMs in a matched set.

     

  • generalgameplaying Posts: 517

    Saxa -- SD said:

    Lesson for me: buy 4 DIMMs in a matched set.

    Wow, though maybe it's similar to what I did previously. So the lesson for those who don't want to, or can't afford to, buy maxed-out kits when upgrading: research. Ask the manufacturer and/or search for the exact batches that are reported to work.

    Previously, using the manufacturer's model/batch name, like "KSM26ES8/8MR" for Kingston, used to work, though I might have used whatever is printed on the RAM sticks in order to find the exact batch. This might be different with some manufacturers, who knows.

  • AgitatedRiot Posts: 4,437

    generalgameplaying said:

    Wow, though maybe it's similar to what I did previously. So the lesson for those who don't want to, or can't afford to, buy maxed-out kits when upgrading: research. Ask the manufacturer and/or search for the exact batches that are reported to work.

    Previously, using the manufacturer's model/batch name, like "KSM26ES8/8MR" for Kingston, used to work, though I might have used whatever is printed on the RAM sticks in order to find the exact batch. This might be different with some manufacturers, who knows.

    I always set the memory timings and voltage to the spec of the memory stick. This is not considered overclocking; you should get the timing and the voltage right. Just don't let your motherboard do training for memory. Then everything should be stable.

  • frank0314 Posts: 14,059

    oddbob said:

    frank0314 said:

    I tend to get RAM that matches the model type, size, frequency, latency, and manufacturer so I don't have any "possible" conflicts. 

    It's a fair approach and it'll give you a fighting chance, but you could still get chips from a different manufacturer or chips with different sub-timings. RAM chips are typically made by Samsung, SK Hynix or Micron, regardless of who assembled the module and put their name on it.

    Another issue is manufacturers releasing products and then downgrading them later with slower and cheaper parts but keeping the model name the same. This has happened lately with SSD drives.

    I can understand that, and I know it is possible to mix and match; I've known many people who did it. But I make this content for a living and just can't take the chance of my system being down because I did something like this and, by some chance, it screwed something up. I won't even buy, say, 4 sticks today and 4 sticks 5 months from now. I try my hardest to buy them all at the same time to hopefully get the same batch. It's silly, but it's kind of a preventative move for me. If I decide to buy RAM today, I determine how much I want, plan for the future, and come up with the total I would need to get me through the next 2 years. At that point, I buy however many sticks that is (I have 8 slots, though, but try to only use 4 in case I do add more later). I didn't know that many people made the chips. Good to know, thanks for the info.

  • Thanks for all the input, guys.

    My computer is not necessarily "long in the tooth" compared to some frequent posters/members here... but I had thought that I would skip the AMD 5xxx series and look at the 7xxx series as the upgrade point.

    Then covid, etc.

    And now, for my use case (posted above), I don't think I am going to see A LOT of improvement in my day-to-day tasks... I am guessing I would see a huge improvement with a 4090, though...

    So, maybe, for now, just get some more memory, save for a 4090 (or 5090), and wait to do a full rebuild with the next gen of processors.

     

  • outrider42 Posts: 3,679

    A GPU upgrade would be by far the biggest upgrade you can make. A 3900X is totally fine and should stay fine for a good number of years. If Daz Studio ever goes all in on multithreading, you'll see a bigger boost from that than you will from a 7000-series upgrade. The biggest bottleneck of a lot of software like Daz Studio is... the software itself. I am just talking about the software performance here, not rendering; it affects how well Daz Studio itself runs. The better your CPU, the more stuff a scene can handle before it starts bogging down. But currently it mostly runs on one core, so only single-core performance can provide a boost. The GPU handles the viewport, so a strong GPU can help the viewport run smoother. That isn't a big deal, though, unless you like using Iray Preview in the viewport a lot, and even there I don't know if Daz is utilizing the GPU enough to make a massive difference. I saw a big difference going from a 1080ti to a 3090, but that is a huge hardware difference; a 3060 to a 3090 (I have both) is a bit less dramatic. Again, that's just viewport performance.

    But in real rendering the 3090 is over twice as fast as the 3060, and you can probably guess a 4090 is even faster. In fact the 3090 can be as much as 2.5 times faster. IMO that's kind of dramatic.

    You have a 12GB VRAM card with 32GB of RAM. It is already possible to exceed 32GB with such a GPU, but maybe in your situation you don't. It depends on how you build scenes and what your texture compression settings are. If you change Iray's compression settings so that textures are barely compressed, your VRAM use will be closer to your RAM use, meaning you would likely max out both at a similar rate. If you use the defaults, you can see some wide gaps between VRAM and RAM, because the default is actually very aggressive: basically every texture over 1024 pixels is compressed a lot, and that's just about every texture in modern Daz products. That is how some people can see their RAM use be 3, 4, or even 5 times higher than their VRAM use. Compression only affects VRAM, though, not RAM; more compression saves VRAM, letting you get the most out of the card. It doesn't really seem to alter rendering performance other than the initial load time: compressing a lot of stuff might take a tiny bit longer to load, and if you compress too much you might get artifacts and bleeding textures.
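    As a rough illustration of why the compression settings dominate VRAM use: an uncompressed texture costs width x height x channels x bytes per channel. A quick sketch of the arithmetic, with a made-up texture count (the forty 4K maps are purely hypothetical):

        # Back-of-the-envelope VRAM cost of uncompressed 8-bit RGBA textures.
        def texture_mb(width, height, channels=4, bytes_per_channel=1):
            return width * height * channels * bytes_per_channel / (1024 ** 2)

        one_map = texture_mb(4096, 4096)
        print(f"One 4K RGBA map: {one_map:.0f} MB")          # 64 MB
        print(f"Forty 4K maps:   {40 * one_map:.0f} MB")     # 2560 MB, uncompressed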

    You can even use a 4090 in your exact current setup with 32GB of RAM if you want; it is not like you MUST get more RAM. If you do not feel constrained by your 32GB of RAM today, then you will probably be fine keeping that same amount with a 4090; you'd be getting a 4090 for speed more than memory. There is no need to spend money on something you don't need, and such a GPU will have more memory if you do need it at some point. It is only if you want to use all 24GB of VRAM that it starts to become an issue. If you run out of RAM, Daz will probably crash. If that becomes a problem, then you can go buy more RAM. If you do, I'd still suggest getting RAM that matches what you have, if possible, to avoid any potential issues, but it isn't the end of the world if you don't. I expanded the RAM in my previous DDR3 machine from 16 to 32, and I did that by just buying another 16 of the same RAM. No issues. This was at least a year or two later, so it certainly was not the same batch of RAM, though it was the same SKU. That computer never exploded, and it ran fine until it simply got too old to run demanding software. And if the new RAM doesn't work and you can't get it to play nice, return it. Don't sweat it.

  • Thanks Folks!

    Outrider... 

    I pulled the GTX 1060 6GB and it is sitting on a shelf. I was considering just giving it away in the Pay it Forward Daz3d Style thread... would it provide much value used alongside the 3060, or, since it is just a 6GB card, not much?

  • outrider42 Posts: 3,679

    With a 3060 I don't think the 1060 will do much. The only value it might have is if you drive the monitor with the 1060, so the 3060 can devote all the resources it can to render. This would allow you to do more on your PC while rendering if by chance your scene is pushing the 3060's VRAM. But I don't think the 1060 will add much to rendering power, and having used a 1080ti with a 3090, I seemed to have some issues when rendering with both. I got a lot more crashes. I don't know what the issue was, maybe GTX and RTX don't mix well since GTX must emulate the ray tracing? Or something was going wrong with my 1080ti, but that 1080ti had zero issues playing demanding video games. When I swapped the 1080ti for a 3060, I have had far fewer crashes with Daz.

    As for other people, I wouldn't think 6GB is much, but that's just me. There are still plenty of people out there who have weaker cards, or less memory than a 1060, and would consider it an upgrade, so I still think it would be much appreciated by someone in that situation. You can do it in that thread, or, if logistics become an issue, let somebody in your local area have it. I gave my old 970 to a friend a couple of years ago and he got a lot out of it. He only games.

    When I started PC gaming, the best I could do was an ATI 5870 that had two case fans strapped to it instead of a proper shroud, but it worked. It was a decent card back in the day. I got it off eBay for $50! Then I found this weird thing called Daz Studio, and I dropped about $100 on a 670 2GB. I would later give that away, too. But I kept the 5870; it looks too awesome to give away, lol. I also keep it as a memento of that time period; maybe I am strange.

    The 1060 was #1 in the Steam hardware survey for years, and only recently dropped to #2, in December 2022. In a sad twist, the card that took its place is a lowly 1650 with 4GB. Not exactly a better product! So even if Daz users don't care for it, the card has life left for budget gamers.

  • nonesuch00 Posts: 18,131

    Once they get that old, you may as well send them to the electronics recycling bin. I tossed a 2008 Mac Mini and a 2008 Radeon (a 720, I believe) after resisting throwing them away for the longest time. The oldest hardware I still have now is a 2003 Toshiba Portege M205 convertible tablet with some ancient discrete Nvidia GPU. It can only run up to Windows 7 and Blender 2.9.x LTS. 4GB of RAM is the maximum. It is heavy too, but it does have an integrated Wacom digitizer.
