GDDR 6 or 5?

RandomRandom Posts: 214

I presently have an Nvidia GeForce GT 730 and am thinking of getting a new card, something GTX 1660-ish. For best Iray performance, which matters: GDDR6 or GDDR5, SC, Ti, Ultra? Are the two-fan models quieter than the one-fan models? I'm thinking of going with the EVGA brand, like the EVGA GeForce GTX 1660 Ti XC SC Ultra Gaming card, but if a lesser 1660 card would do just as well, cheaper would be better. I don't do gaming. Quiet is important, but noise isn't a deal breaker if it isn't too bad. I should add, my motherboard is older, with PCIe 2.0 slots.

Post edited by Random on

Comments

  • PCIE 2 won't matter if you don't game.

    In evaluating cards, the things to consider are primarily CUDA core count and VRAM amount. This basically equates to newer and more expensive being better.

    GDDR5 vs GDDR6 is a matter of RAM speed. GDDR6 will render slightly faster.

    The number of fans really has very little impact on noise. What matters is the quality of the fans and how fast they spin. EVGA cards, IIRC, let you set custom fan curves. You should check and find a card that you can set the curves on. Then you can tune the curve to trade off how much noise the fans make against how hot the card gets.
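    A fan curve is just a mapping from GPU temperature to fan speed, interpolated between a few set points. As a rough sketch in Python (the set points below are made up for illustration, not values from any vendor tool):

```python
# Hypothetical fan curve: map GPU temperature (deg C) to fan duty cycle (%)
# by linear interpolation between user-chosen set points. The set points
# are illustrative only.

CURVE = [(30, 20), (50, 30), (70, 55), (85, 100)]  # (temp_c, fan_pct)

def fan_speed(temp_c, curve=CURVE):
    """Return fan duty cycle (%) for a temperature, clamped at both ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two adjacent set points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(40))  # halfway between the 20% and 30% set points -> 25.0
```

    A flatter curve at low temperatures keeps the card quiet at idle; a steeper ramp near the top protects it under sustained render loads.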

  • RandomRandom Posts: 214

    Thanks, Kenshaw. I intend to go ahead and try the EVGA 1660 then. The fans on that card are supposed to be quiet. We shall see. I've always used fanless cards, but Iray takes so long to render on my middling five-year-old CPU.

  • every decent card today has fans. I sit right beside my system and sleep in the same room, and I can't hear the six fans on my 1080 Ti and 2070 even sitting right next to it; they don't disturb me overnight when I run all my renders.

  • Just for the record... Unlike when gaming, rendering only uses a few specific components within the GPU: the CUDA cores, the Tensor cores, and the RT cores, if they exist.

    For example, when gaming, your card may pull up to 280 watts at peaks and about 230 watts sustained. When rendering, that same card may only hit peaks of 180 watts and pull about 135 watts sustained. Thus, your memory and cores will not heat up as much, since they are only generating about 180 watts of heat, which the stock fan can easily handle at a medium speed.

    In the same respect, when it comes to memory, GDDR5 vs GDDR6 is honestly not going to impact your renders by any significant amount. It is better to have more CUDA cores than faster VRAM; more cores with slower VRAM will win every time. Same with PCIe 1.0 vs 2.0: that only affects potential loading times, and cards don't come close to saturating an x16 link when loading anyway. A faster slot would only be an advantage if you had multiple cards sharing lanes at x4 and one card that still needed x8; that one card could benefit from the lane sharing. With rendering, all cards behave the same and load at speeds closer to x4 at most (nowhere near x16 speeds, and loading is the only time you would notice any speed gain from the PCIe slots; Daz and Iray load items in bursts, with no steady demand on the PCIe lanes at all).
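    The loading-time point above can be sanity-checked with some back-of-the-envelope arithmetic. The per-lane figures are approximate effective rates (PCIe 2.0 ~0.5 GB/s per lane after 8b/10b encoding, PCIe 3.0 ~0.985 GB/s after 128b/130b encoding), and the 4 GB scene size is just an example:

```python
# Rough PCIe transfer times, to show why slot generation rarely matters
# for scene loading. Per-lane bandwidths are approximate effective rates.

PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985}

def transfer_seconds(scene_gb, gen, lanes):
    """Seconds to push scene_gb gigabytes over the given PCIe link."""
    return scene_gb / (PER_LANE_GBPS[gen] * lanes)

for gen in ("2.0", "3.0"):
    for lanes in (16, 8, 4):
        t = transfer_seconds(4, gen, lanes)
        print(f"PCIe {gen} x{lanes}: {t:.2f} s to load a 4 GB scene")
```

    Even at PCIe 2.0 x4, a 4 GB scene transfers in a couple of seconds, which is negligible next to a render that runs for minutes.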

  • RandomRandom Posts: 214

    Thank you, JD and Kenshaw. Very helpful for understanding Iray's demands. Can't wait to try out the new card. I bought it, but didn't realize the DVI connector is now DVI-D, so all my older VGA-to-DVI adapters won't work. Anyway, I did temporarily install it and turned the power on. Running under no load it was QUIET. Wow.

  • Just adding: if you have integrated graphics (the GPU built into your CPU, unless you have a CPU without one), you can use the video output on your motherboard. Windows now treats all your GPUs like one giant video card, streaming through the integrated GPU in the CPU, or the onboard graphics, if it exists.

    For rendering, that can help you out a bit, as the Iray preview is just a video stream from the Iray DLLs into Daz. You would then be using your graphics card as a dedicated rendering device, while the integrated GPU manages Daz. You save some VRAM that way, as the integrated GPU mostly uses system RAM instead.

    Yeah, the new cards have better low-power management than many previous generations. With winter coming, you should stay cool a lot longer too, until the card becomes a personal room space heater. :)

  • nicsttnicstt Posts: 11,715

    ... But when comparing cards from different generations, CUDA core count is largely useless. It only works when they are the same generation.
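    One way to see why: effective throughput is roughly cores x clock x per-core work per clock, and that last factor changes with architecture. The sketch below uses approximate published core counts and clocks for the two cards in this thread, but the per-core factors (1.0 vs 1.5) are purely illustrative placeholders, not measured Iray benchmarks:

```python
# Crude relative-throughput estimate for comparing cards across
# generations. The per-core architecture factors are made-up examples.

def effective_score(cores, boost_ghz, per_core_factor):
    """Cores x clock x per-architecture factor, as a rough relative score."""
    return cores * boost_ghz * per_core_factor

# GT 730 (Kepler) vs GTX 1660 Ti (Turing); clocks in GHz are approximate.
old = effective_score(384, 0.9, 1.0)
new = effective_score(1536, 1.77, 1.5)
print(f"estimated relative speedup: {new / old:.1f}x")
```

    The point is only that raw core counts (384 vs 1536) understate or overstate the real difference unless you also account for clocks and architecture.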

     

  • RandomRandom Posts: 214
    edited November 2019

    I have an Asus Sabertooth rev 2 AMD system, which doesn't have onboard video, so it's all up to the card. My fanless GT 730 heats up about 10°C when rendering. It will be interesting to see if the new 1660 makes my drafty room warmer in the winter.

    Post edited by Random on
  • DVI to VGA? Are you using some ancient monitor? HDMI monitors are cheap now. My triple monitor setup at work invoiced just under $350 and that includes the arm that holds them all.

  • Buy a 4K TV... it's only about $100-$300 for a decent one, about the size of a normal desktop monitor. I use a 65" curved-screen Samsung 4K TV as my computer monitor. Best $700 I ever spent. It's like having six 1080p screens. (Yes, I realize it's only 4x a 1080p screen in resolution... but it's 6x larger than my last monitor and cost just as much.)

  • RandomRandom Posts: 214
    edited November 2019

    I'm afraid my ideas about computer systems are pretty old; I've been doing computers since the late '80s and build my own. I hadn't thought about changing my display, but now that you mention it, are there any particular models/brands you'd recommend for good detail and good color/brightness consistency over the whole area of the screen (I do art and photography too, so I need that consistency)? 65" is kinda large, as I have to sit about 18" from the screen.

    Post edited by Random on
  • If you need a pro-level monitor you have a lot of options. What I'd look for first and foremost is what percentage of the NYSC color gamut it covers: 72% equals about 99% of sRGB, so anything above 70% is good and anything over 75% is amazing. It should also be factory color calibrated, with the calibration report included.

    For brands, ViewSonic makes a number of well-regarded pro monitors, as do some other companies; a Google search for professional monitors turns up plenty of options.

  • prixatprixat Posts: 1,590

    NYSC, the New York Stock Exchange has a colour standard? cheeky

    Regarding monitors, I also decided to go for a tv as a monitor, in my case it's a mid-range 4K, 60Hz samsung with HDR10.

    That gives me 3 of the 4 things you need for HDR: the OS, the software, the monitor and the GPU.

    I still have to get a 10bit GPU.

  • NTSC, spell check doesn't help with acronyms.

  • RandomRandom Posts: 214

    Thanks again, all. I'll really have to consider a new (larger) monitor. Still getting over the cost of the new graphics card. Gave it a spin this a.m. on a scene that rendered to 80% converged in 21 minutes on my old card; now it ran to 100% in 5 1/2 minutes. Wow. I'm sold. And I didn't hear it at all (my old-timer ears).
