Is there a big difference between rendering on laptop vs desktop?

Hi there, hoping to get some perspectives!

I've been rendering on a gaming laptop for about a year now, and am looking into building out a desktop PC. But before I plunk down some serious cash, I was wondering if there's a demonstrable difference between what I have now and what I potentially would get.

Currently, I'm using a ROG Zephyrus G14 gaming laptop with an AMD Ryzen 9 4900HS and a GeForce RTX 2060. Rendering an image about 1000 pixels wide, with 2-3 characters, good lighting, and optimized backgrounds, takes about 5 minutes for fewer than 2000 samples.

The gaming PC I'm looking to build would feature an AMD Ryzen 5 5600X with a GeForce RTX 3070.

In your opinion, would upgrading to a desktop and the newer specs really make that big a difference in render times? I'm just worried that I'll make the investment and then realize it's not that much faster.

Thank you for your input!

Comments

  • KCMustang Posts: 114
    edited January 2022

    I render on a laptop with an RTX 3070 (full power) GPU and it does OK too. A desktop 3070 will of course be a bit faster, but the big thing is the VRAM: the 8 GB I have is not enough, and I don't think even the desktop 3070s come with more. Can you stretch to a 3080 with 16 GB?

  • kyoto kid Posts: 41,213

    ...the issue I have with notebooks is that you are pretty much locked in with what you get, with little room for upgrading or expansion. About the only components that can be upgraded are memory and storage (the latter usually through external drives). You are pretty much stuck with the CPU and GPU the system originally came with, so upgrading either one means getting a new notebook.

    Cooling is another matter, as there really are no aftermarket cooling solutions for notebooks, which depend on hamster-wheel fans to draw in fresh air and vent heat. No CPU cooler with big heat sinks, nor water cooling for the GPU or CPU. On my old Toshiba notebook, the key switches just above the area where the CPU sits eventually burned out from the heat generated by rendering in 3DL.

    Faster rendering will more likely be accomplished with a higher-VRAM GPU and more system memory. A general rule of thumb I've been seeing is to multiply the GPU's VRAM by 3 for the amount of system memory needed to support a scene, so 32 GB would easily cover a 3070. For a 12 GB 3060 you'd be slightly below that threshold (keep in mind the OS and system processes also take up memory, in addition to any other software you have open, like a browser).
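    That rule of thumb is easy to sanity-check with a quick calculation (the function name here is just for illustration):

```python
# Illustrating the "system RAM = GPU VRAM x 3" rule of thumb mentioned above.
def recommended_system_ram_gb(vram_gb: int, multiplier: int = 3) -> int:
    """Suggested system memory (GB) to comfortably support a given GPU."""
    return vram_gb * multiplier

# An 8 GB RTX 3070 suggests 24 GB, so a 32 GB system covers it easily.
print(recommended_system_ram_gb(8))   # 24
# A 12 GB RTX 3060 suggests 36 GB, slightly above a 32 GB system.
print(recommended_system_ram_gb(12))  # 36
```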

    While I usually don't recommend prebuilts, that is about the only way to get a decent GPU card at a reasonable price. Currently, 3070s on their own are going for anywhere from about $900 to well over $1,000 (a couple are listed at more than the MSRP for a 3090). Your best bet is to go with someplace that lets you configure a system they build.

    Two questions: how much memory does your notebook have, and do you intend to game on the system as well, or just work in Daz?

  • Doc Acme Posts: 1,153

    Well, ya. It can be done, but it's a slog.

    I was unexpectedly pulled out of retirement and back to L.A. last year, and had to use a fairly decent, top-of-the-line HP for several months.

    It's a great system, but doing anything like I'm used to was pretty much impossible; it was quite limiting. At least I could set things up so that, when I could, I'd add the other elements, such as multiple characters, let alone with hair.

    But if ya can, and if you are really serious about graphics, I'd highly recommend checking into Boxx systems.

    World’s Fastest Custom Workstations | BOXX

    Even years after the initial sale, these folks really go out of their way on tech support. It might seem spendy, but you'll quickly realize it's worth every penny.

    Just my 2¢...


  • starionwolf Posts: 3,670

    I have a mini pc that is more similar to a laptop than a desktop.  CPU fan runs loud when I render in 3Delight.  The CPU will run hot, like above 165 degrees F.  Expect thermal throttling to occur.  The CPU fan runs fast even when I draw using a USB tablet in Gimp.  Just be aware if you want to use a laptop for 3D work.  I never had an issue with my old desktop computer running too hot with an AMD video card.  Good luck.
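    For anyone thinking in Celsius, that 165 degrees F works out to roughly 74 degrees C:

```python
# Convert the reported CPU temperature from Fahrenheit to Celsius.
def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

print(round(fahrenheit_to_celsius(165), 1))  # 73.9
```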

  • Thank you for your input, everyone!

    My current RAM is 16GB, though I'd love to bump it to 32. I also had 1TB of space, but as you all know, 3D artwork quickly eats up your space, so I'm going to need to figure something out soon.

    But it does seem that, at very least, a desktop will give me some space to upgrade in the future, whereas the laptop I'm using, while adequate, is kind of stuck where it is.

    Thank you!

  • cgidesign Posts: 442

    There is an article on TechSpot regarding mobile vs. desktop GPUs.

    It is about gaming, not rendering, but the general conclusion is that a desktop GPU is way faster than a mobile GPU.

    In some cases an old RTX 2070 desktop is faster than the latest RTX 3070 mobile.

    https://www.techspot.com/review/2206-geforce-rtx-3070-laptop-vs-desktop/

    You plan to go from 2060 mobile to 3070 desktop. I think you will get a serious speed boost by this.

  • kyoto kid Posts: 41,213

    ...for gaming, system memory is not quite as crucial as it is for 3D rendering. That's why many game systems on the market tend to come with 16 GB even when they have an 8 GB GPU.

    @Doc Acme, I looked at Boxx a while back and, as you mention, excellent quality and support, but prepare to dig a bit deep. Unfortunately, I just went there and attempted to configure a system, but on launching the build routine I received an error message that reads:

    Error occurred - Failed to create Session and render User Interface 

    ...the primary cause being:

    Culture is not supported. Parameter name: name en-GB-oxendict is an invalid culture identifier.

    Received the same message in "incognito mode", so it's not ad blockers or other extensions.

  • Doc Acme Posts: 1,153

    kyoto kid said:

    ...for gaming, system memory is not quite as crucial as it is for 3D rendering. That's why many game systems on the market tend to come with 16 GB even when they have an 8 GB GPU.

    @Doc Acme, I looked at Boxx a while back and, as you mention, excellent quality and support, but prepare to dig a bit deep. Unfortunately, I just went there and attempted to configure a system, but on launching the build routine I received an error message that reads:

    Error occurred - Failed to create Session and render User Interface 

    ...the primary cause being:

    Culture is not supported. Parameter name: name en-GB-oxendict is an invalid culture identifier.

    Received the same message in "incognito mode", so it's not ad blockers or other extensions.

    Hmm. I guess try starting at boxx.com directly from your browser. I've never had an issue that way.

  • Doc Acme Posts: 1,153

    I did just that & got directly to the page, but the "Build your System" feature they had just last week isn't there. Too bad, as that was very useful. I'll ask them why, but I suspect it's due to the Damndemic© & supply chain & how volatile availability is at the moment.

    They mentioned then that they were doing a site update when they called back in my quest for add'l RAM. I wanted 64 GB in 16 GB sticks & was having a heck of a time finding them. They're having the same issue. Tech support had ONE. Sales had ONE. And Amazon had ONE in stock. I needed four & they all had to match, and preferably be on ASRock's (the motherboard manufacturer's) approved memory list. I could get them in town, but it wouldn't be until mid-January.

    By the time I got back home a couple hours later, I checked the link Boxx had sent me earlier to copy the info so I could search further. This time though, there were 4 items available. I "uh, buh, uh, buh"ed a bit & ordered them all. They were going to arrive on Tuesday, which is considerably quicker than mid-January.

    Sunday morning though, I get an e-message with a photo of the package at the front door.

    They were exactly what I needed & working purrrfectly.


  • KCMustang Posts: 114
    edited January 2022

    cgidesign said:

    There is an article on TechSpot regarding mobile vs. desktop GPUs.

    It is about gaming, not rendering, but the general conclusion is that a desktop GPU is way faster than a mobile GPU.

    In some cases an old RTX 2070 desktop is faster than the latest RTX 3070 mobile.

    https://www.techspot.com/review/2206-geforce-rtx-3070-laptop-vs-desktop/

    You plan to go from 2060 mobile to 3070 desktop. I think you will get a serious speed boost by this.

    It's not actually that much of a difference on the rendering benchmark test. My 3070 laptop ran the benchmark in 2 minutes 41 seconds, which is about 20 seconds slower than the desktop 3070s and a minute faster than the desktop 3060 systems; it would probably be closer up against a 3060 Ti.

    And I haven't been plagued by the overheating/throttling problems people talk about. In the middle of summer, on a 2+ hour render, the GPU maxes out at about 73-74 degrees C. Modern laptops are pretty good and the only choice for some of us who don't have space for a desktop.

    But I think we're getting side-tracked a bit, as the OP already has a decent laptop and isn't asking whether to buy another; they want to know if a desktop system with a 3070 will be much better than what they have now. The 3070 will certainly be faster, but I think if the OP goes with a 3070 they will be frustrated by the 8 GB of GPU memory. Go with a desktop 3060 and you can get 12 GB, but the rendering speed goes down a lot.

  • cgidesign Posts: 442
    edited January 2022

    @KCMustang

    That's interesting. Do you perhaps have a comparison of renders that took longer (e.g. about 30 mins on the notebook)? And have you had the chance to check the GPU's state with a monitoring tool like HWiNFO64?

    I am curious if there is throttling on the desktop GPU.

    I can't compare myself because the notebook has a 2070 and the desktop a 2080 Ti, which would not be a fair comparison. And the 2080 Ti is water cooled, which keeps it way below any throttling threshold. So I don't know if CUDA rendering would lead to throttling on a desktop GPU that is not water cooled.

    EDIT:

    Here is a comparison of some user benchmarks made with Blender. Each value is from a different user. I copied values of a 2070 Max-Q in notebooks together with a 2070 in desktops. The variance is interesting: some notebooks render quite fast, some are quite slow, and the same goes in the desktop world. But overall, even the slowest desktop is faster than the fastest notebook.

    [image attachment: rendertime.png]
  • cgidesign Posts: 442
    edited January 2022

    And for @stargazersix

    This is a 2060 notebook vs. a 3070 desktop in Blender. Iray is not the same as Blender's Cycles, but I assume the speed difference will be significant as well.

    [image attachment: rendertime3.png]
  • cgidesign Posts: 442
    edited January 2022

    A last one:

    Notebook 3070 vs. desktop 3070. Again interesting: there are notebooks that are quite fast (the fastest one even catches the slowest desktop), but there are others that are quite slow.


    [image attachment: rendertime4.png]
  • cgidesign Posts: 442

    In case somebody wants to check other Blender benchmark runs:

    https://opendata.blender.org/benchmarks/query/

  • KCMustang Posts: 114
    edited January 2022

    cgidesign said:

    @KCMustang

    That's interesting. Do you perhaps have a comparison of renders that took longer (e.g. about 30 mins on the notebook)? And have you had the chance to check the GPU's state with a monitoring tool like HWiNFO64?

    I am curious if there is throttling on the desktop GPU.

    I can't compare myself because the notebook has a 2070 and the desktop a 2080 Ti, which would not be a fair comparison. And the 2080 Ti is water cooled, which keeps it way below any throttling threshold. So I don't know if CUDA rendering would lead to throttling on a desktop GPU that is not water cooled.

    EDIT:

    Here is a comparison of some user benchmarks made with Blender. Each value is from a different user. I copied values of a 2070 Max-Q in notebooks together with a 2070 in desktops. The variance is interesting: some notebooks render quite fast, some are quite slow, and the same goes in the desktop world. But overall, even the slowest desktop is faster than the fastest notebook.


    I had forgotten all about HWiNFO. I used to use it back in the day when I was really into computers, but that was in the early 2000s.

    I just ran a 35-minute render monitored by HWiNFO and there is no GPU thermal throttling at all. The CPU, however, is thermal throttling during renders, so I stand corrected. In the reviews, the Infinity W5 shows the ability to undervolt the CPU in the BIOS, so I went to give that a try, but for some reason it is not enabled on my system and is locked against third-party apps. I've contacted the company to see if I can get a fix for that.

    Ran the Blender Classroom benchmark in CUDA and it rendered in 130.74 secs. I suspect the fastest RTX 3070 laptops have been tweaked. Interestingly, the same scene rendered in 60.6 secs using OptiX.
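    Using those two times from the same scene, the OptiX speedup works out to a bit over 2x:

```python
# Relative speedup of OptiX over CUDA from the Classroom benchmark times above.
cuda_seconds = 130.74
optix_seconds = 60.6
speedup = cuda_seconds / optix_seconds
print(f"OptiX was about {speedup:.2f}x faster than CUDA")  # about 2.16x
```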

    I'd love to see what the new full-power Ti versions of the laptop GPUs can do, especially the 3080 Ti, which has 16 GB of VRAM and 7424 CUDA cores.

    XMG is now offering a liquid cooling option for some of their laptops, although you'd still have to find somewhere to put it.

    https://www.xmg.gg/en/news-xmg-oasis/


  • cgidesign Posts: 442

    Yes, I have seen the news about the XMG as well. That could be a very interesting solution, but I fear the price will be very high. The news says final pricing and preorders will be available at the end of January.
