Carrara not using all of my render nodes! Solved!


Comments

  • DartanbeckDartanbeck Posts: 21,551

    @MistyMIst Or just get this: $259 Buy Now, free US shipping... 5 available, 7 days to go:

    http://www.ebay.com/itm/291869781570?_trksid=p2055119.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT

    My God!

    If I had the funds available, I'd click "Buy Now" right now! I would!

    After reading your article on this thing, I know that I'd just love owning one of my own!

  • DartanbeckDartanbeck Posts: 21,551
    edited October 2016

    These machines were really highly respected when they came out. Fine, fine machine!

    Drool!

    Get this, add it to my own workstation as another Render Node, then buy a higher end laptop - also via a great deal of a yester-year machine, and Bam... I'd be rocking!

    Post edited by Dartanbeck on
  • MistaraMistara Posts: 38,675

    @MistyMIst Or just get this: $259 Buy Now, free US shipping... 5 available, 7 days to go:

    http://www.ebay.com/itm/291869781570?_trksid=p2055119.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT

    My God!

    If I had the funds available, I'd click "Buy Now" right now! I would!

After reading your article on this thing, I know that I'd just love owning one of my own!

    i hafta wait after holidays >.<

  • JonstarkJonstark Posts: 2,738

     

    @MistyMIst Or just get this: $259 Buy Now, free US shipping... 5 available, 7 days to go:

    http://www.ebay.com/itm/291869781570?_trksid=p2055119.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT

Almost precisely the same specs as mine, which I'm very happy with (though I keep looking at upgrading to X5670 hex-core CPUs instead, but I can't make it make financial sense).

     

    MistyMist said:

    @MistyMIst Or just get this: $259 Buy Now, free US shipping... 5 available, 7 days to go:

    http://www.ebay.com/itm/291869781570?_trksid=p2055119.m1438.l2649&ssPageName=STRK%3AMEBIDX%3AIT

    My God!

    If I had the funds available, I'd click "Buy Now" right now! I would!

After reading your article on this thing, I know that I'd just love owning one of my own!

    i hafta wait after holidays >.<

     

    My best guess is that prices will only go downwards as time passes, so after the holidays might be even better timing  smiley

  • JonstarkJonstark Posts: 2,738

    By the way, I just want to thank Jay for turning me onto the idea of implementing Remote Desktop for my nodes.

All my computers are Windows, and after a little research, I discovered there's a built-in Remote Desktop feature I could use. After watching a few quick 'how-to' vids on YouTube, setting it all up was a breeze.

Up til now, I only had one monitor to share between all of my render nodes (obviously the laptops have their own monitors). You can imagine what a pain in the neck it was if something went wonky on any of the render nodes, or if I needed to change a setting: connect each of them up to a mouse/keyboard and monitor to get in there, restart, check, change, save, etc.

Now with the Remote Desktop setup, I can get into any of the nodes from my main PC at any time with no hassle at all. Also, because of the monitor situation and not wanting to go to the effort of connecting everything up each time I wanted to shut down a render node, I was originally just leaving all of my machines running on idle all the time.

My new (better) plan is simply to turn on all the render nodes when I need them (they're all in the same room, within easy reach of the power buttons on each machine), and now, with Remote Desktop, whenever I'm done I just remote in and tell the node to shut down (in a Remote Desktop session, the way to do this is Ctrl+Alt+End, which gives access to the shutdown functions of the node you are remoted into). Easy peasy, and no more dragging out my wired mouse and keyboard and pulling out the machine to reconnect a monitor cable. Very neat feature. Thanks again, Jay!  smiley
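A hypothetical sketch of the same idea in script form (not from the original post): Windows' built-in shutdown command can also power a node off over the network, so the whole set of nodes can be shut down from the main PC in one go. The hostnames below are placeholders, and the nodes' firewall and account permissions have to allow remote shutdown.

```python
# Hypothetical sketch: shut down several Windows render nodes over the LAN
# with the built-in "shutdown" command, rather than remoting into each one
# and pressing Ctrl+Alt+End. Assumes the account running this has admin
# rights on each node and that remote shutdown is allowed by each node's
# firewall/policy settings.
import subprocess

RENDER_NODES = ["NODE-8300", "NODE-8200", "NODE-Z600"]  # placeholder hostnames

for node in RENDER_NODES:
    # /s = shut down, /m \\host = target machine, /t 0 = no delay, /f = close apps
    subprocess.run(["shutdown", "/s", "/m", rf"\\{node}", "/t", "0", "/f"], check=False)
```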

  • DartanbeckDartanbeck Posts: 21,551

    Cool... so you're using the Windows built-in Remote Desktop? I saw that myself.

I really want to set that up on mine too. Perhaps today will be the day? This will definitely be the week!

Okay, Jonstark, time to get a good overall count. What exactly does your Render Room consist of, machine- and core-count-wise?

    Are you still collecting? I know you said you've calmed down... but....

  • MistaraMistara Posts: 38,675
    edited October 2016

    the plan smiley

    upgrade i5 to i7 first -  
Intel Core i7 6700K 4.00 GHz - based on the 1-core light sampling calculation, ghz important, hopefully by december
    then the z600 dual hex xeon - hopefully by feb - if the ebays offers are good mebbe 2 of em as render nodes
    then i want a mac for final cut pro

    there lil problem i dont know how to remove the heatsink monster off the i5

    some of this- 

     

    and some time next year one of these to study for ccna exam
    Cisco CCNA CCENT CCNP CCIE Massive Lab Kit CCNA2.5 Free Rack 200-101 100-101

     

    they goin to mpls circuits, totally a dinosaur nows >.<

     

laugh  3Dconnexion 3DX-700028 SpaceNavigator 3D Mouse

    Post edited by Mistara on
  • JonstarkJonstark Posts: 2,738

    Cool... so you're using the Windows built-in Remote Desktop? I saw that myself.

I really want to set that up on mine too. Perhaps today will be the day? This will definitely be the week!

Okay, Jonstark, time to get a good overall count. What exactly does your Render Room consist of, machine- and core-count-wise?

    Are you still collecting? I know you said you've calmed down... but....

Yeah, the built-in Remote Desktop app works great (I have variously Win7/8/10 on my devices, but Remote Desktop works the same way on all of them and has no problem with them talking to one another). I did watch a couple of YouTube videos on how to set it up, but once I followed the step-by-step, which didn't take long, I was up and running without a hitch. Well, no, there was one hitch: the z600 came with Win7 Pro already installed, and they had 'cleverly' named the User for that device 'User', which didn't seem to count as a real username, so I had to change the username to something else. Once I did that, no hitches in connecting via Remote Desktop.  smiley

     

    My little network consists of:

    My primary desktop which is an i7-6700 (8 rendercores)

    1 HP 8300 i7-3770 node (8 rendercores)

    1 HP 8200 i7-2600 node (8 rendercores)

    1 Asus laptop i7-2670QM (8 rendercores)

    1 Asus laptop i7-4700MQ (8 rendercores)

    1 HP z600 dual Xeon E5620 (16 rendercores)

I also purchased a Dell Inspiron laptop with an i7-2670QM (8 rendercores). The laptop has arrived but did not come with a power charger, which is also in the mail and should get here sometime this week, but I have not yet integrated it into the network.

    and lastly I have also purchased a Lenovo D20 with 2 xeons (16 rendercores) that should arrive later this week.

All told, I should wind up with 80 rendercores; at the moment, though, I'm only up to a mere 56. I plan on 'stopping' at 80 rendercores until at least the end of the year. I also, just over the weekend, bought an Asus T100 Transformer 2-in-1 (it's a laptop with a detachable tablet screen), but I don't intend to use that as part of the network (except that I could use it to remote into the network when I'm away on trips or whatever). I've never been interested in tablets before or understood the appeal; I sort of considered them a very weak excuse for a laptop, but the price was so phenomenally low I couldn't resist, and I thought I'd give it a chance to see what it's like and what the appeal of a tablet might be.
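For reference, a throwaway tally of those 'rendercores' (the counts above are the logical threads each machine reports) - this little sketch isn't from the post, just the arithmetic written out:

```python
# Quick tally of the render threads ("rendercores") listed above.
online_now = {
    "i7-6700 desktop": 8,
    "HP 8300 i7-3770": 8,
    "HP 8200 i7-2600": 8,
    "Asus i7-2670QM laptop": 8,
    "Asus i7-4700MQ laptop": 8,
    "HP z600 dual Xeon E5620": 16,
}
on_the_way = {
    "Dell Inspiron i7-2670QM": 8,
    "Lenovo D20 dual Xeon": 16,
}

print(sum(online_now.values()))                             # 56 running today
print(sum(online_now.values()) + sum(on_the_way.values()))  # 80 once everything arrives
```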

Once everything comes in and I get it all set up, I'll definitely throw up a screenshot of the 80 cores  smiley

  • JonstarkJonstark Posts: 2,738
    MistyMist said:

    the plan smiley

    upgrade i5 to i7 first -  
Intel Core i7 6700K 4.00 GHz - based on the 1-core light sampling calculation, ghz important, hopefully by december
    then the z600 dual hex xeon - hopefully by feb - if the ebays offers are good mebbe 2 of em as render nodes
    then i want a mac for final cut pro

    there lil problem i dont know how to remove the heatsink monster off the i5

     

That pic on the prior page looks very similar to a standard Intel heatsink with a fan on top; I don't think it's going to be too difficult to pull off (you just turn the plastic mounting thingies in the opposite direction of the arrows and pull it straight out).

  • MistaraMistara Posts: 38,675

    opposite direction of the arrows??!!  dohhh 

    lol  thanks 

  • DartanbeckDartanbeck Posts: 21,551
    MistyMist said:

    there lil problem i dont know how to remove the heatsink monster off the i5

Do a search on YouTube. It needs to be worked free, as the grease is now behaving like glue. It may even just lift off.

However, keep in mind that many of the people who do this stuff a lot always recommend getting a new cooler when you get a new CPU. This is not, of course, mandatory. I would, though... personally. You might even opt for the CPU option that includes a heatsink and fan.

    The CPU/Heatsink will come off your motherboard in one shot. So you'll want to find out how to unlatch the cpu pins from the motherboard, and check how to unmount the fan, which also has a small power cord running to the motherboard nearby.

BE CAREFUL! My son wanted to 'clean' his heatsink fan without knowing what he was doing. He pulled the thing off the motherboard, bending all of the CPU pins in the process - the whole thing is stuck together at this point. There was no way I was going to attempt bending all of those pins back! No way!

    It is recommended that you 'ground' yourself before playing around inside your computer. If any of your actions cause a spark or some smoke to rise, you'll likely want to start over with a new computer.

    I have some nice little blue gloves I use when working on mine - my brother has a ground strap that attaches his wrist to something grounded nearby. Whichever method you use, make sure you don't bring any static electricity into your computer.

    Also, and I'm not sure why I'm saying this last, but...

I recommend doing some research on building a computer from scratch before going too far. If not that, at least do research into what it is you're doing before just pushing forward with the questions you think you might have, like "how do I get the fan off my CPU?"

You should be aware of how your fan connects (physically and electrically) to the motherboard, how the CPU latch works, etc.

My favorite source of information for this particular stuff is the manual for the motherboard. A pre-built computer most often will not come with one, but if you know which board it is, the manufacturer offers the manual as a free download. I always download my motherboard manual before the board arrives, so I know what I'm doing by the time it gets to me.

    Best to not wreck a computer trying to speed it up. A slightly slower cpu is much faster than one that doesn't work at all!

  • DartanbeckDartanbeck Posts: 21,551
    Jonstark said:

    Cool... so you're using the Windows built-in Remote Desktop? I saw that myself.

I really want to set that up on mine too. Perhaps today will be the day? This will definitely be the week!

Okay, Jonstark, time to get a good overall count. What exactly does your Render Room consist of, machine- and core-count-wise?

    Are you still collecting? I know you said you've calmed down... but....

Yeah, the built-in Remote Desktop app works great (I have variously Win7/8/10 on my devices, but Remote Desktop works the same way on all of them and has no problem with them talking to one another). I did watch a couple of YouTube videos on how to set it up, but once I followed the step-by-step, which didn't take long, I was up and running without a hitch. Well, no, there was one hitch: the z600 came with Win7 Pro already installed, and they had 'cleverly' named the User for that device 'User', which didn't seem to count as a real username, so I had to change the username to something else. Once I did that, no hitches in connecting via Remote Desktop.  smiley

     

    My little network consists of:

    My primary desktop which is an i7-6700 (8 rendercores)

    1 HP 8300 i7-3770 node (8 rendercores)

    1 HP 8200 i7-2600 node (8 rendercores)

    1 Asus laptop i7-2670QM (8 rendercores)

    1 Asus laptop i7-4700MQ (8 rendercores)

    1 HP z600 dual Xeon E5620 (16 rendercores)

I also purchased a Dell Inspiron laptop with an i7-2670QM (8 rendercores). The laptop has arrived but did not come with a power charger, which is also in the mail and should get here sometime this week, but I have not yet integrated it into the network.

    and lastly I have also purchased a Lenovo D20 with 2 xeons (16 rendercores) that should arrive later this week.

All told, I should wind up with 80 rendercores; at the moment, though, I'm only up to a mere 56. I plan on 'stopping' at 80 rendercores until at least the end of the year. I also, just over the weekend, bought an Asus T100 Transformer 2-in-1 (it's a laptop with a detachable tablet screen), but I don't intend to use that as part of the network (except that I could use it to remote into the network when I'm away on trips or whatever). I've never been interested in tablets before or understood the appeal; I sort of considered them a very weak excuse for a laptop, but the price was so phenomenally low I couldn't resist, and I thought I'd give it a chance to see what it's like and what the appeal of a tablet might be.

Once everything comes in and I get it all set up, I'll definitely throw up a screenshot of the 80 cores  smiley

    WOW!!!

  • DartanbeckDartanbeck Posts: 21,551

One thought I have about your setup compared to my getting a Z600:

I bet most of those i7s you have, whether in laptop form or otherwise, are much faster than the 2.4 GHz of the Z600. That would make a big difference in rendering speeds.

    I'll look into those Z600s a bit more - as with some other possible options.

I'm lucky in that I'm not in any kind of hurry, nor am I overly anxious about this whole thing - but I will be pushing forward into this, for sure! Especially the Remote Desktop stuff!

  • DartanbeckDartanbeck Posts: 21,551

Oh... and thanks, you guys! You're very generous with all of this info!

  • JonstarkJonstark Posts: 2,738
    edited October 2016
    MistyMist said:

    opposite direction of the arrows??!!  dohhh 

    lol  thanks 

Careful, I don't want to steer you wrong, but yeah, that looks like the standard stock Intel cooler mechanism. I recommend doing a quick Google search on 'removing stock intel cooler' and you should see lots of vids covering that particular plastic mounting/locking mechanism. I'm just going from memory, but I believe you rotate in the opposite direction from where the arrow is pointing to unlock it for removal. It's not a big twist either, something like a half rotation.

Also, ditto on grounding yourself. I have a sword that I keep tip-down on the ground and ground myself to it, using a wrist grounding strap, and I use blue gloves as well (yes, overkill, but I'm paranoid).

    Post edited by Jonstark on
  • JonstarkJonstark Posts: 2,738
Dartanbeck said:

     

One thought I have about your setup compared to my getting a Z600:

I bet most of those i7s you have, whether in laptop form or otherwise, are much faster than the 2.4 GHz of the Z600. That would make a big difference in rendering speeds.

    I'll look into those Z600s a bit more - as with some other possible options.

I'm lucky in that I'm not in any kind of hurry, nor am I overly anxious about this whole thing - but I will be pushing forward into this, for sure! Especially the Remote Desktop stuff!

Believe it or not, it's the opposite: I'm seeing faster render speeds on the z600 than on even higher-GHz processors that are more modern. There was a similar topic in the Commons where Dustrider offered the opinion that Xeons would render faster than more modern i7s for Carrara, because of the way they are structured - Carrara's rendering method matches up better with the kinds of tasks Xeons are designed for. That doesn't mean I'm not going to upgrade my Xeons to higher-GHz hex-cores at some point, but I'm very happy with the render speeds this old z600 is giving  smiley

  • MistaraMistara Posts: 38,675

    thanks. smiley

    and i'm about to find out if my new graphic card will boost my preview window experience.  dunno if gpu does anything for opengl

  • DartanbeckDartanbeck Posts: 21,551
Jonstark said:
Dartanbeck said:

     

One thought I have about your setup compared to my getting a Z600:

I bet most of those i7s you have, whether in laptop form or otherwise, are much faster than the 2.4 GHz of the Z600. That would make a big difference in rendering speeds.

    I'll look into those Z600s a bit more - as with some other possible options.

I'm lucky in that I'm not in any kind of hurry, nor am I overly anxious about this whole thing - but I will be pushing forward into this, for sure! Especially the Remote Desktop stuff!

Believe it or not, it's the opposite: I'm seeing faster render speeds on the z600 than on even higher-GHz processors that are more modern. There was a similar topic in the Commons where Dustrider offered the opinion that Xeons would render faster than more modern i7s for Carrara, because of the way they are structured - Carrara's rendering method matches up better with the kinds of tasks Xeons are designed for. That doesn't mean I'm not going to upgrade my Xeons to higher-GHz hex-cores at some point, but I'm very happy with the render speeds this old z600 is giving  smiley

I'm glad that I speak my mind! This is exactly what I was hoping you (and/or Jay) would come back and say!

    BTW - Sorry... I didn't see that you'd replied already to Misty's question. Magento sure has been messing with me today. I saw your response to 'me', but not to Misty! Strange!

Okay, so actually I'm very relieved to hear the news about the Xeons. I really want a Z600 or two. I just love the design and their intent. Like Dustrider says - it's like they were designed specifically for Carrara - and I like that. I'll have to ask my Singer if he can front me five hundred bucks to improve my 3D studio - just hit Quantity 2, "Buy Now", and be done with it.

Later, if they need some work done to them, I'll be happy to do it. I really kinda get a charge out of doing all the stuff Jay did on his blog - minus the Windows analysis stuff. I'm no PC geek, nor do I have any programming in my soul - though I can often work my way through code that has already been written - so I'm not a total dunce.

Like you, I'd consider replacing the Xeons with hex-cores down the road - but no hurry.

    Sword to ground yourself? I'm SO Jealous!!! :D  Rosie... JonStark has a sword that he....

  • DartanbeckDartanbeck Posts: 21,551
    edited October 2016
    MistyMist said:

    thanks. smiley

    and i'm about to find out if my new graphic card will boost my preview window experience.  dunno if gpu does anything for opengl

    Be careful not to spend too much for naught.

    Carrara working view relies upon OpenGL. Nothing else in your G-Card will mean a darned thing.

    I also wanted to tell you how much I enjoy working on/building my own PCs.

    I hope I didn't sound like I was scolding earlier. This stuff is a total blast once you get into it - unless you dislike tinkering. I just wanted to let you know that dangers DO exist!

My brother always told me: "It's all in the configuration of the smoke. Let the smoke out of a chip and it's just impossible to get it back in, reconfigured to a working state." LOL

    In other words... never make a chip smoke!

It was when I was training a young college student to be my right-hand landscaper that this all came together for me. He is a total gaming nerd and loves D&D, like me. (I notice you like Geralt of Rivia! I'm a Witcher fan, though I almost NEVER game!)

Anyways, during our breaks he would go over his 'designs' for his new gaming PC. I was excited to find out that he was building his own PC!

He gave me some tips regarding how to do the research. It's a LOT easier than one might think! Here's what "I" do when designing a new machine (which is Way fun):

I start by picking out which CPU I want. This determines which socket type I need in a motherboard.

Next I pick out at least one favorite motherboard. This takes a bit of energy, because I'll have certain things that I need to decide upon:

• CPU socket type - first absolute requirement
• Maximum RAM allowed
• Expansion slots - like which graphics cards it will accept, and such
• Other specs and expansion possibilities, compared with other boards

    Next on "My" list is RAM which is recommended by the motherboard. Once I'm fairly certain which motherboard I want, I check the manufacturer's site and find out what I need to know about it for RAM and Power (mainly connector types and such) needs. I often download the manual and read the whole thing.

Then I check out graphics cards. I used to go for the highest-end nVidia card I could get without getting too expensive. Now I usually go for the cheapest nVidia cards I can find - and compare all of them within a very low price bracket. But that's my love for Carrara's native engine speaking to me. I'm actually somewhat interested in getting a higher-end GTX now, but no hurry.

Around this same time in my research, I'll also start looking at which tower or box (I like towers) I want to build this in. I always (ALWAYS) look for a tower that includes filtered intakes for the fans. My first was an amazing (but now very outdated) Thermaltake Xaser V tower, which came with a whole bunch of what are called SilentFans - each intake (fans blowing IN to the case) has a remarkable, washable filter. Now I have an Antec 300 Two, which has one big filter in the front, which I like very much. These fans are bigger, so I don't need as many.

My brother says this is where I should skimp - the case. I've disagreed ever since I bought that Thermaltake. Now the case is a major consideration for me - though I still stay within a very tight budget - which can become quite a challenge once the comparisons begin, because I'll always think that I can spend just a bit more for a better case - and can easily get carried away.

    I've never had the courage to go liquid-cooled yet. Something about liquid next to my electronics (remember the configured smoke thing?) just gives me the heebie jeebies!

So once I have a motherboard, CPU, and graphics card picked out - as well as an idea of whether I might want to add a second (or even third) graphics card in the future - I start looking at power supplies. The graphics card and CPU will often have suggestions for the minimum wattage of your PSU (power supply unit), but they'll also list (in their specs) which connectors they use/need, as will the motherboard.

Make sure to check every piece of hardware you're getting for which power connectors and how much minimum power it needs. This makes the difference between being able to boot up your computer right away - or having to file a return request, ship the part back, wait for processing, then wait for the replacement. So it's good to check through everything the first time through - and that, my friend, is why I prefer to read the user manual for the motherboard right away, early on.
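As a rough illustration of that wattage check (the part names and figures below are made-up placeholders, not recommendations - always use the numbers and connector types from the actual spec sheets):

```python
# Hypothetical PSU sizing sketch; part names and wattages are placeholders.
# Substitute the figures published on your own components' spec pages.
parts_watts = {
    "CPU": 95,              # e.g. the TDP from the CPU's spec page
    "Graphics card": 180,   # board power from the card's spec page
    "Motherboard + RAM": 60,
    "Drives, fans, USB": 50,
}

total = sum(parts_watts.values())              # 385 W estimated draw
headroom = 1.4                                 # ~40% margin so the PSU never runs near its limit
suggested = round(total * headroom / 50) * 50  # nudge to a common PSU size (550 W here)

print(f"Estimated draw: {total} W, suggested PSU rating: at least {suggested} W")
```

The connector check matters just as much as the wattage - the 24-pin ATX plug, the CPU power plug, and any PCIe power leads the graphics card wants all have to be there.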

It should be known that I love doing all of this through Newegg.com, because they carry everything I need to build the whole computer, all shipped (more or less) from the same place at the same time - saving on shipping and headaches. But the biggest reason is their wonderful saveable wishlist system! Newegg lets me save various wishlists under different names - so I can make several PC 'builds' and compare them.

So I continue on, picking out my new hard drive(s) and optical (DVD) drive(s). Do I want an aftermarket cooler for the CPU? (Yes, usually I do - but not always - not last time.) Do I want/need extra case fans? Any special cables? (We used to have to include a 3.5" floppy drive - I'm glad that's done!)

I try to NEVER build a new computer relying on hard drives that I already own as the main C drive. Hard drives (all of them) have finite life spans - they all roll over and die on us at some unexpected time. Building a new machine? Give it a new C drive at the very least! RAM is cheap now - at least the new stuff for new motherboards is. A 4 MB stick for my old 8-color laptop runs around $125!!! But my 16 GB for my new workstation cost less than that!

    Do I need a new mouse and/or keyboard? Monitor?

    I have always been a manual laborer on a fairly low income, so I need to try and pinch wherever I can. Sometimes I might need something, but if I can keep it off of the actual New PC order, I'll be able to order, build, and use the computer faster. And faster is better for me when I really NEED a new computer. I'm lucky in that mine is really nice, so I can take my time right now.

Reading Jay's blog about his new (old) Z600, it becomes clear how much easier it can be to design and build a new machine than it is to buy an older machine and try to find compatible parts for it.

However, using the same procedure above, we could just as easily look at the specs of an HP Z600 and find out what works with it. It's still a little more challenging, since they use custom motherboards, so we can't just look them up to see everything we could do with them on that front - though perhaps I'm wrong and we can find all of that out through HP?

It can also be a little challenging, as his article points out, making sure that new components will even fit inside the custom-made case!

    Still... it's all very fun!  I'm hoping to get my hands on at least one of those HP Z600s for sure! But I really do have the luxury of being able to really take my time with it - even if it doesn't work out of the box.

    Anyways... if you have any questions along your process... well you know... just ask! ;)

    Post edited by Dartanbeck on
  • JonstarkJonstark Posts: 2,738
    edited October 2016
Dartanbeck said:
    Sword to ground yourself? I'm SO Jealous!!! :D  Rosie... JonStark has a sword that he....

     

    Lol!  I know it sounds absurd, but my new desktop chassis is painted white, and I've read elsewhere that if the metal is painted it's not good to try to ground yourself on it.  So I looked around the room for something else metal, and came up with Aragorn's sword.  I got it a couple of years ago from a girl I work with, who was moving and wanted to get rid of it.  When she said 'hey, do you want to buy a sword?' I wasn't sure I heard her right, but as soon as my mind could grasp the concept, I was immediately all-in.  I love my firearms, but I think every home needs a fantasy sword  smiley   Someday I'd like to get Jon Snow's 'Longclaw' too...

    Of course, as soon as you mentioned it, I had to go look it up on ebay to see what swords like this are going for.  Found it for $80, not bad!  And I keep humming that Weird Al song...

    Post edited by Jonstark on
  • DartanbeckDartanbeck Posts: 21,551

    LOL!

    No! It's not absurd at all... it's totally freaking AWESOME!!!

    I've always wanted a nice fantasy sword like that. I have some incredible pieces of stone outside. Here's the story:

    I'm always doing custom stone chiseling for people. But my stuff is all nature-based, outdoor, very realistic natural formations like cliffs, miniature cliffs, waterfalls, stone walls, patios, walkways, ponds, etc.,

I was doing my thing at this house that was being completely redone inside at the same time. These custom stone interior fellows were wrestling this giant countertop into the house. Of course, I offered to assist if needed. "Nah... we got this" (grunt)

That guy came back outside, minutes later, with chunks of the now-broken, very beautifully cut granite countertop. Some of the pieces were perfectly straight - about 3/4" thick, 4" wide, and four or five feet long.

    I commented that even broken, it's some beautiful stone. "Want it?" "Oh yeah!"

So now I have several very heavy slices of stone that I want to, one day, fashion into a stone sword or two... maybe three or four! I've already made a really nice limestone sword for a friend of mine - meant as a decoration for his garden. "FOTM" it says on its blade - because one morning, he came out of the shop with one of our great big stone saws in his hand, held over his head, and he said: "I am F**K... Of... The Mountain!". From that day forward, my partner and I have always called Dave "FOTM" (pronounced: FOE-tum).

    Needless to say, he loves his custom sword! But there's no way we could use our stone swords as a grounding rod! 

  • DartanbeckDartanbeck Posts: 21,551

    Rosie says that, one day... I'll own a really nice fantasy sword of my own!

In 2008 or '09, she said that, one day, I'd own Carrara. At the beginning of 2010, she came to me as I was trying to let the pain calm down from a long week of overtime chiseling stone and said: "I've done it."

    I look up at her - her face glowing as it does... all beautiful and wild, she continues: "order that software you've been wanting"

    "Software? What software?"

    "You know... that 3d thing you want."

    "Carrara?"

    "That's it. Order it. There's enough to get that club thing (Yearly Platinum Club - then $99), Carrara, and some cool stuff to go with it", she calmly finished.

    "OMG!!! Really?!!!"

    ...and I've been a Carraraist ever since! But you've heard all of that before! ;)

    Point being... she says I can, one day, get something... she sees to it that I, one day, get it! :D

    I do the same for her heart

  • MistaraMistara Posts: 38,675

    my whole system load rendering an old english village

    my preview window still sluggish, but render took only a few minutes

    gpu temp seem high?

[Attachment: during carrara render.JPG (1040 x 644)]
  • DartanbeckDartanbeck Posts: 21,551

    Yes. Old English Village is very intense on OpenGL, no matter what card you're using (to my knowledge). 

    Fantasy village is the same way.

What we need to do in order to increase workflow ease is look at which weighty parts of the scene we can afford to remove from the 3D view.

In Fantasy Village it's really easy: find all of the replicated thatch for the roofs and deselect "Show Object in 3D View". Just that quick fix will immediately bring the speed of the working view up to acceptable levels.

In Old English Village, look for things like replicated grass and hand-placed objects that you don't need to "see" while working - things that are in a different area from where you're working - and do the same thing: deselect "Show Object in 3D View".

So if you're setting the camera to do a tour through the whole village, for example, look for details that you won't need to see - and also keep in mind what you definitely DO want to see.

    Just know that these things will all still show up in the render - as long as they're in the camera's view. We just need to disable the working view camera from rendering them in OpenGL.

You could spend thousands (literally) of dollars on a new card and still bring it to its knees in this type of performance! Just saying!

BTW... lovely scenes, these! They're just very detailed, which makes them 'heavy' in the working view. Add a figure and the whole screen locks up good. So please try to disable these things by deselecting "Show Object in 3D View" before adding more elements to the scene and locking up your computer.

    Don't worry - big-time production studios have to work like this too. They often keep only the absolute minimum visible at any one time. The cool part is that, using this method, everything still renders. It just doesn't kill our working view.

  • DartanbeckDartanbeck Posts: 21,551
    edited October 2016

    Notice how your card is barely working, as far as using its main power of rendering, yet it's about to blow up from overheating.

    This is because it's working very heavily on OpenGL - which has a lot of limitations compared to how this card will render game visuals.

Perhaps workstation-class cards (like the Quadros), which are designed to excel at rendering OpenGL, might make a huge difference - I don't know. I'd love to find out, though!

    Post edited by Dartanbeck on
  • kakmankakman Posts: 225

    MistyMist,

    At FIRST glance I thought that your card was running way too hot, but then I noticed that your card temperature was in Fahrenheit NOT Celsius (which is the norm).

91.4 Fahrenheit is ONLY 33 Celsius – which is a very low temperature for that card.
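For anyone who wants to double-check, the conversion is just C = (F − 32) × 5/9, which comes out to 33 °C exactly - a quick throwaway check:

```python
# Fahrenheit-to-Celsius check for the temperature in the screenshot.
def f_to_c(f):
    return (f - 32) * 5 / 9

print(round(f_to_c(91.4), 1))  # 33.0 - nowhere near hot for a graphics card
```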

  • JonstarkJonstark Posts: 2,738

    Good catch kakman, my eye skipped right over that.  That's not hot at all  smiley

  • kakmankakman Posts: 225
    Jonstark said:

    Good catch kakman, my eye skipped right over that.  That's not hot at all  smiley

    Jonstark,

    I was in the process of writing a diatribe on how there seemed to be a lack of any fan control management software governing her card – the fan speed was only at 30% in the screen grab – and decided to double check the screen grab (as it did not make sense that the fan speed was so low when the card was so hot).  It was only then that I noticed the temperature was in Fahrenheit.  I am just so used to computer component temperatures being reported in Celsius; it had not originally occurred to me that her temperature was in Fahrenheit.  This reminded me of how I always chide people to "read the screen!" and here I was failing to do just that.

  • MistaraMistara Posts: 38,675

it was Celsius by default, changed it to Fahs, my mind can't gauge a Celsius reference, or kilometers per hour.

working on it though, changed my weather page to Celsius

    knots per hour are windier smiley
