OT: I actually bought a server
In the old forums I had a short discussion going with the subject "OT: I almost bought a server," but for the life of me I cannot find that thread. It's irritating not being able to find and link to your own posts. Google search can't find it, but I know it's there. Must not have been visited enough.
Anyway, I did finally buy an old server VERY cheap. What I ended up with is a 3rd-generation Dell PowerEdge 1950 with dual quad-core CPUs and 8 GB of memory. It was up for auction and I got it for $80, but with buyer's premium and tax it was $100 out the door. No hard drives, no OS. Even with slower CPUs and slower memory than what's available today, it's still about 6x faster than my current lump and has 4x the memory. It's 64-bit capable and has 8 GB of memory vs. the max 2 GB on my old desktop. I just didn't want to go out and spend $1500 on a gaming rig for this hobby. Priorities, priorities.
Now the question is, what OS to put on it? I would dearly love to run Linux on it, which is probably what it had on it before, and is what I use at work. I can do that at no cost other than a new HDD, which I need anyway. Problem is, I can't use it as a network render box for Bryce if I put Linux on it. I could run Luxrender on Linux via Reality, which I do have, but Bryce Lightning insists on the slaves running the same OS as the master, and it doesn't support Linux at all. Sure, I could install one OS as base and run the other as a VM, but there's no real point doing that for what I need it for. I may also want to run DS4 on it directly, as my current system is 32-bit, and the server will handle 64-bit Windows quite nicely. That means buying a copy of Windows 7 Professional. Maybe cheaper with the OEM/system builder edition, but still, it's Windows, with all the attendant problems and nuisance. Linux would be easy and stable, but the apps I want to run on it aren't supported. Opinions?
The chief problem running a 1U server as a home computer is the noise level. Running full tilt with all 8 cores cranking, the noise level is about the level of a shop vacuum. I'll need industrial strength hearing protection... :grrr:
Comments
Decent price, especially considering that you have ECC RAM in there. Keep in mind that ECC RAM is slower than normal RAM, but the likelihood of crashes from bad RAM is close to zero. Your biggest problem is going to be your display. That server has NO 3D capability at all. It has an ATI ES1000, which is an integrated version of the old "Rage XL" chips that won't even begin to boot a 3D program. This means that you are going to have to find a video card that will fit into your 1U case. *THAT* will be your challenge.
After you jump that hurdle, then you can worry about OS. Until then, your 3D hopes are nil.
EDIT: Check to see if your server shipped with a riser or not. If it did, you'll need to determine if it is PCI-X or PCI-e before you start shopping for video cards.
Kendall
Aha, very good input. I knew about the substandard graphics, very common for servers. This one does have riser cards that will accept a full-height, half-length PCIe card. That means I can't put a monster card in it, but a 1 GB card that handles OpenGL might be no problem. Since I don't intend to do GPU renders, a modest card should suffice. From the Dell system specs:
2 expansion slots on 2 different riser options:
Riser 1 Option –PCIe
2 PCI Express slots (two x8) full height, half-length
Riser 2 Option – PCI-X
2 64-bit/133MHz PCI-X full-height, half-length
Ummm.... At 1U you have NO space for a video card with a fan or large heatsink. Please open your case and look at your MB clearance; that will dictate your path. I fight this all the time. I avoid 1U servers for this very reason. 2U servers are much more forgiving.
EDIT: Also, those are riser *options* and they cost extra... lots extra. So many buyers didn't order them. In many cases, PCI-X is preferred in servers due to better performance for the work that servers are tasked with. So, even if you do have a riser, it is likely that it may be PCI-X and not PCI-e. *continued in another post*
Kendall
OK. Issues you need to watch for in 1U cases.
1) Clearance. Assuming that there are no barrel caps or chipset heatsinks protruding into the expansion space, you will have a maximum of 1 inch of space from motherboard to case. The physical space necessary for the slot hardware will remove upwards of 1/2 inch of usable space from that, leaving at most 1/2 to 3/4 inch of usable clearance.
2) Cooling. Servers are designed to cool the CPUs and RAM, not the expansion slots. Most 1U server designs assume that any expansion will be in the form of network cards or SCSI/SAS interfaces, which need minimal cooling.
3) Power. The power supplies in 1U servers have NO excess capacity beyond providing for the CPUs and hard drives. There will be no options for GPU power. Keep this in mind. A GPU that needs its own power connector is NOT a good candidate for a 1U enclosure.
4) Back panel access. Most 1U cases have NO clearance for a DVI or VGA style connector. The cases were designed for minimum space usage, and the expansion was designed for RJ-45 or optical plugs. HDMI or ultra-low-profile DVI plug extensions are your best hope.
Kendall
No, the risers are there to mount add-in cards horizontally. It specifically says you can have a full height, but half length card. There's about 15mm between the slot and the junk on the MB. Yes, a honker card will not fit, but a small card with a low profile fan should fit. I will be checking measurements...
And yes, it does have the risers, and the rear panel does have a pop-out for the card plugged into the riser that is basically the same as on a desktop. The PSU is rated at 670W. The system is rated to have 4x SAS HDDs spinning at 15,000 RPM. I plan to put one SATA drive in it. You're right about no extra power connectors.
It's good that the risers are there. Verify their type: X vs. e. Also, remember that "full height" refers to the height from the card pins to the upper card edge. The width of the card is going to be your issue, specifically the heatsinks. Even the lower-end nVidia 520s have heatsinks that are too tall for 1U. Been there, done that.
One thing to consider is that it may be less expensive to transplant that Dell into a 2U or 4U case than to spend the extra cash for a video card that will fit/operate in 1U.
If you are contemplating Linux, then you can remove ATI from your consideration. Wine + ATI + OpenGL + DAZ or Poser = no go. Under Linux right now nVidia or Intel chipsets are your only viable choices.
Kendall
At 15mm you have no room for air flow with any size fan. You'll actually be adding heat to the system if you try. You'll need to go with a fanless card with a low-profile heatsink.
Kendall
Yes, I'm looking at fanless cards. Apparently, PCIe x16 cards will down-negotiate themselves if physically plugged into an open-ended x8 slot. There is plenty of air, so passive is the way to go. The real deal killer here may be the BIOS. Some people report that some video cards will bypass the onboard graphics, which can't be disabled; others will not. :down: Also, some report that Windoze gets mighty confused having 2 video cards in a non-SLI configuration. Linux allows you to designate one as primary, ignoring the other, but Windows does not.
I have considered re-casing the machine, possibly to get large diameter quieter fans in it. That's a major project.
Just to follow up, I found a card that seems to work, sort of. At least it shows graphics during boot. I got a PNY GeForce GT 520 at the absurdly low price of $24.95 less 5% and with a $10 mail-in rebate. :coolsmile: It's a 1GB card, too. No external power required either.
The cards listed as low-profile mostly seem to have minimal heat sink and/or fan setups. In this installation, it's actually not the clearance to the stuff on the mobo that's the issue, it's the clearance to the lid. The card goes in component side up, so the sink is on top. The one I got fits quite comfortably. There are 2 PCIe slots, one behind the chipset, and the other behind the RAM. The one behind the chipset runs cooler. The one behind the RAM is more like a convection oven... There is room to hook up a VGA cable in the back, I just have to remember to disconnect it BEFORE sliding the lid back to open it up. :red:
I need to update the BIOS, but I'm not entirely sure that will fix all problems. The BIOS does IRQ sharing, which is common, but the PCIe slots are on the same IRQ as the RAID controller. One post in another forum said that at least for Linux, you suffer a longer boot time as a result of having the other video card in it, but once booted, it worked fine. I do not have a full OS installed yet. Windows is an unknown still, as to whether or not it can be coaxed into working in this oddball config. What told me this one might work is that the system requirements said, "PCIe or PCIe 2.0." Installation required slotting the end of the PCIe x8 socket on the riser card, which was pretty easy.
At idle, the noise isn't too bad, but at full tilt, it is about like having a shop vacuum in the room...
Sounds like a good choice. As for the position above the RAM: Check the airflow through that area. As I said before, most 1U's are optimized for airflow past the CPU's and RAM. If the Videocard won't occlude the airflow overly much and can take advantage, then it may be worth putting it above the RAM if the flow is greater than over the expansion area.
As for the VGA/DVI connectors... I purchased some ultra-thin 3 inch extensions for my 1U's. This allows me to offset the honkin big ones far enough away so that they don't cause problems. I paid about $6 each.
Kendall
The main system fans are in banks of 2 going the full width of the case. The fans are behind the HDD bays, pulling air past the HDDs and pushing it into the rest of the system. There are basically 3 airstreams. The far left goes past the RAID controller and part of one CPU cooler, then through the dual power supplies and out the back. The second goes past parts of both CPU coolers and then the chipset heatsink, then through one of the PCIe bays. That's where I put the video. The far right stream goes past part of one cpu cooler and then through the 8 DIMMs of RAM, then through the other PCIe bay and out the back. The far right PCIe bay was occupied by a Remote Access Controller (RAC), which I removed. Nothing in this system goes "on top" of anything else, as it is 1U, which is only about 1 5/8 inches tall. :gulp: The only issue I see with airflow to the video card is that most of the air is passing between the video card and the mobo, because that's where the holes in the back of the case are. The video card mounting bracket actually blocks airflow across the heatsink side of the card.
BTW, this is not my first time doing goofy things with server hardware. At a previous company, we had a computer affectionately known as "The Football" after the briefcase that follows the US President around. It was a portable computer that weighed in at 50 lbs! I rebuilt it a couple of years ago to keep it in "flyable storage" as we say...
The Football on photobucket
This was basically 2 server motherboards stuffed into a briefcase. It won't win any speed or beauty contests, but it should easily take the 10 pounds of s**t in a 5 pound bag award. :coolsmirk:
This is fairly common. A 4mm drill bit will resolve that problem nicely. :-) In this 1U I would highly recommend drilling a few ventilation holes in that plate.
The layout you're describing is fairly common for generic 1U cases; it's hard to tell when Dell is going to go custom on their cases. My ASUS 1U units have the same layout. Those turbo fans in there will deafen you if you get several servers kicking in simultaneously. :-) That's why I tend to load my 2U and 4U units with work before I allocate the 1Us.
Kendall
*NEWS FLASH*
Bryce Lightning runs under Wine!
See post below
http://www.daz3d.com/forums/discussion/4132/#64026
Interesting thread!
Update: The Dell 1950 server is happily running Ubuntu 12.04, with Luxrender v1.0RC3 running native and Bryce Lightning under Wine (see above). I got Reality working on my old DS3 desktop (XP 32), so the next step is to export a scene from Reality and render on the 1950. I ran a test render standalone on the 1950 in Lux. With all 8 cores cranking away, I'm getting about 25K samples/second. At that rate, I can Luxrender a scene at 1200x900 to 1000 samples/pixel in 12 hrs. Not too shabby, IMO. I set up a Samba shared directory, so *in theory* all I need to do is export the scene from Reality in DS, save it to the shared dir with exported textures, then fire up Luxrender on the server. That way, I can still work on the desktop while the server does the heavy lifting. I can use VNC to open a desktop on the server without actually turning around and logging into it. That was the evil plan all along. The graphics card seems not to be an issue, as Lux with pure CPU rendering doesn't seem to care.
The fans actually never went full blast during the test render, but it was at night and the air was fairly cool.
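For anyone checking the arithmetic on that render-time estimate: the 12-hour figure follows directly from the resolution, the target samples per pixel, and the measured sample rate quoted above. A minimal sketch (all numbers are the ones from the post; `render_hours` is just an illustrative helper, not anything from Luxrender):

```python
def render_hours(width, height, samples_per_pixel, samples_per_sec):
    """Estimated wall-clock hours to hit the target sample count."""
    total_samples = width * height * samples_per_pixel
    return total_samples / samples_per_sec / 3600  # seconds -> hours

# 1200x900 at 1000 samples/pixel, ~25K samples/s on 8 cores:
hours = render_hours(1200, 900, 1000, 25_000)
print(hours)  # 12.0
```

So the estimate is exact, not rounded: 1.08 billion samples at 25,000/s is 43,200 seconds on the nose.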
On a similar note, I actually have an IBM eServer xSeries 346 (Type 8840), but it's not in very good condition and some of the memory is bad. I don't know if it's worth investing money into it to get it running again. It can take up to 16 GB of memory and it has an old Windows Server 2003 installed on it.
I am also not really sure if it's good for rendering or not.
http://www-947.ibm.com/support/entry/portal/docdisplay?lndocid=migr-58373
Personally, I wouldn't bother with that machine. It only offers 2-way SMP (dual core, no hyperthreading). The memory is only 400 MHz, and it's not ordinary memory either: it needs PC2-3200 DDR2 400 MHz registered ECC memory. The memory in the machine I got isn't much faster, but I got 8 GB and 8 cores (dual CPU, 4 cores each).
http://www.dell.com/us/dfb/p/poweredge-1950/pd
I have a completely ad-hoc metric for computer performance. Since CPU speeds have kind of leveled out, what's more important is the number of threads and the memory bandwidth. My metric is simply # threads x memory speed in MHz. Your system would rate a score of 800, same as the old desktop I'm using for creating Daz Studio and Bryce scenes. My Dell 1950 "render box" gets a 4264. The HP Z800 I have at work (not available for rendering) rates 21328! :gulp: The HP has dual 4-core CPUs with hyperthreading, and DDR3 1333 memory.
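The metric is trivial to compute. A minimal sketch in Python; note that the 533 MHz figure for the 1950 is backed out from the quoted 4264 score (4264 / 8 threads), since the post doesn't state that machine's memory speed explicitly:

```python
def perf_score(threads, mem_mhz):
    """Ad-hoc score: hardware threads x memory speed in MHz."""
    return threads * mem_mhz

# Figures quoted in the thread:
print(perf_score(8, 533))    # Dell 1950: 8 cores, no HT -> 4264
print(perf_score(16, 1333))  # HP Z800: 8 cores + HT, DDR3-1333 -> 21328
```

It's crude, since it ignores per-core IPC and clock speed entirely, but for embarrassingly parallel CPU rendering it tracks throughput reasonably well.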
By that metric this little thing here only rates a 63984.
Kendall
24 HT CPUs? Must be nice. The HP Z800 starts around $2K, and as configured with 48 GB of memory, came in at $11K. It's marketed for engineering use. With 192 GB of RAM, the list price is > $100K, but aftermarket memory can be had for much less. I spent less than $200 on the Dell. So, how much did you spend on that screamin' demon? :coolsmile:
A bunch more than I should've spent! The Motherboard alone was $2500.00. I'm glad I didn't have to buy it out of pocket...
Kendall
Just a follow-up... you called this thing a "screamin' demon." While it is true that it has a crapload of cores and memory, it really isn't that fast. The individual cores are only in the 2.3 GHz range, and they are throttled up or down depending on use. Take into account that software written for a machine like this is not all that common, and the cores rarely get used to their full extent. That's why I mentioned the "rating" the way I did. The machine spends the majority of its efforts running Oracle and, when the need arises, VMware sessions. As for being a rendering machine, there are better, and cheaper, ways to get more rendering performance.
Kendall
Heh, I've got something similar, that I'd already given up on expanding. S5000VSA
Intel has drivers for SUSE and Red Hat, wondering if it'd run on the free version of either... (Don't need much of a beast, for the first Linux app I'm itching to try).
At least, with its 4 Gigs, and the XP drivers, it can replace my elderly XP, which may be showing a first symptom of "motherboard rot".
I need a strong XP for at least one app... similarly I still need W98b, temporarily, for at least a codec.
Wicked case, though, assuming it'll accept a standard PSU upgrade. And two PCIe slots, if I wanna go the GPU rendering route.
Meanwhile, I've grown to hate working on my PC hardware.
Was that an in-person auction, or online? You have my envy. (Low on fundage).
@Kendall Sears, As I thought, this was "capital equipment," not just another "toy" like my Dell. :-) We have MUCH larger machines in the farm at work. The HP Z800 is intended to be portable enough to drag out to customer sites if needed. A customer sent me some info for review on one of their runs. The machine at the customer site has 40 CPUs (+40 virtual) and 1 TB of DRAM. Yes, that's one terabyte of RAM. :gulp: We do have software that will max that machine out with 80 threads...
@T Jaiman, your motherboard specs are suspiciously similar to the Dell PE 1950. :)
Somehow the old thread I had in the old forums escaped the Google machinery. I just can't find it anymore. It was not in the members-only section, as I am not a PC member. It was in The Commons, just like this one. The auction where I got the server was online, but I picked it up in person, as it was nearby. I only go to nearby auctions, as shipping would blow the economics of the deal. I think I got lucky on the system config, as the descriptions in the auction catalog were rather vague and some of the lots may have had less. If you scrounge surplus stores, you can also find comparable systems for around $125-200 without hard drives. I also got lucky on this one, as it happens to support standard 2.5" SATA laptop drives. Having to buy SAS drives would have driven up the cost. Lastly, I avoided the Windows tax by going with Ubuntu Linux, which is free, provided you can download a DVD's worth of data in a reasonable time. The deal with Windows is that multiple cores will work on W7 Home, but if you have 2 or more PHYSICAL CPUs, you need Windows Professional or Ultimate.
The motherboard allows for something pretty decent. But luck wasn't with me: my metric is a paltry 2668. Two Xeon 5110s (no hyperthreading) more than cancel out the 667 MHz. If they'd been 5300s, I'd have 5336. But 4 gigs of memory with it, Vista maximum, uncertain Linux potential. (Further hampered by the fact that I still haven't tackled Linux.) Kinda limits my upgrading options. Looks like it will take ordinary SATA, if I don't use the slow software 5.1 RAID (hardware boot, software driven). 6 full-size drive trays, if I want to go nuts.
I'm glad you started this thread, I've started to look around, this might be the way to go.
Yes, I lucked out in that mine came with the 5300 quad-core CPUs and 8 GB. I don't have hyperthreading either, but I have 8 cores. Ubuntu will run on just about anything, and is a very popular free distro. In retrospect, I kind of don't like the Ubuntu desktop, because it looks like Mac OS. :lol: Ganging up an "army" of cheap Linux boxes is one way to go on render time. The GPU and CPU/GPU hybrid rendering is still somewhat iffy. Luxrender can definitely network, and now Bryce Lightning can run under Wine, so I'm happy. Now I have to get used to Reality and the lighting in Lux. *Sigh* yet another learning curve.