Last time, I gave an overall look at what my server setup at home looks like. I talked broad strokes of what I was trying to accomplish, and the software at the top of the stack that I thought would get me there.
This time, I want to talk about the hardware that this is all going to be built on. I wanted a set of hardware that wouldn’t be too expensive, relatively speaking, but I also wasn’t going to chase down thrift stores trying to find an old Dell to house this all on.
You can definitely hit up Labgopher to try and find a good deal on someone’s old server setup. There are a lot of businesses that routinely upgrade their server hardware, and when they put the old gear on the market to recoup some of their costs, they’re not too worried about making a ton of money back. It’s generally pretty easy to find something that’s a few years old for dirt-cheap.
I chose not to take this route, mostly because many server systems are designed to be in a huge data center, where some of the cooling can be taken care of externally and where nobody cares if the thing is super loud. I don’t plan to have my server cooled externally (just the fans in the case and on the CPU), and I’m definitely concerned with noise, since this is going to be living in my house. Plus, you’re going to be working with years-old hardware, which may not perform as well as you’d like.
So, I decided to go with consumer-grade hardware for my server. I’m not going to be too demanding on it 24/7, so I’m not super worried about power consumption or the hardiness of the hardware to be able to last years at 100% load. The most intensive thing this server will be doing is probably live transcoding of video from the Plex server to one, maybe two clients at the same time.
The important thing to consider when putting together hardware for your home server is not only what you want to do now, but also what you might want to do in The Future(tm). Don’t go too overboard at the start, but keep in mind the ways that you might want to expand your hobby (and this does become a hobby) and plan for that. Give yourself some wiggle room where you can; you’ll thank yourself later.
The Hardware
TL;DR:
- CPU: AMD Ryzen 5 2400G
- 4 cores/8 threads
- 3.9 GHz boost clock
- 65 W TDP
- Radeon RX Vega 11 Graphics
- Pretty badass as far as integrated graphics go, but probably not enough to make a satisfying gaming machine.
- More rationale explained below.
- RAM: 32GB Corsair Vengeance LPX
- Two 16GB sticks (gotta get that dual channel goodness)
- 2666 MHz
- I went with 32GB because virtual machines are going to be very hungry for RAM, especially if I want to play around with Windows Server evaluations and other things that bring their own GUI.
- Motherboard: Gigabyte B450M DS3H
- Built-in LAN and HDMI
- Honestly, nothing too special here. Supports the rest of my hardware.
- Has enough expansion slots so that I don’t feel trapped if I wanted to add TV tuner cards or additional graphics cards.
- Storage: Mushkin Pilot 250GB M.2 SSD
- Again, nothing too special here. I wanted an SSD to run the OS and VM images from, and that’s what I’ve got. It’s fairly cheap but still many times faster than an old mechanical hard drive.
- I’ve also got a standard 1.5TB mechanical hard drive in there, for media storage. This will likely be upgraded in the future to something more parallel and redundant.
- Power Supply: 550 W Corsair
- Enough power for the system. Probably overkill, but will scale if/when I decide to put a graphics card in this machine for whatever reason.
- Case: iStarUSA D Value D-313SE
- I could have gotten away with a 1U case, but I wanted to have tons of space to throw extra hard drives in here, as well as to fit a 2-slot graphics card, should the need arise. Plus it looks neat in the rack I have it in.
When it comes to price/performance ratio, AMD really can’t be beat these days. I don’t need a 16 core/32 thread Threadripper, but since I want to be able to support a couple of transcodes at once, I’ll want a fair number of cores. This is also important if you plan to run virtual machines for any reason, since there is quite a bit of overhead involved for each machine that you’re running.
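As a quick sanity check when sizing for transcodes and VMs, you can confirm how many cores and hardware threads a Linux box actually exposes. These are standard coreutils/util-linux tools, nothing exotic:

```shell
# How many hardware threads the scheduler can use
nproc

# Same idea, read straight from the kernel's CPU listing
grep -c ^processor /proc/cpuinfo

# Cores vs. threads-per-core breakdown (SMT doubles the thread count)
lscpu | grep -E 'Thread|Core|^CPU\(s\)'
```

On a 2400G you’d expect 8 threads across 4 cores, thanks to SMT.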
I settled on an AMD Ryzen 5 2400G. I went with this because it has 4 cores (with simultaneous multithreading, so it supports 8 threads at once), and because it has some level of built-in graphics. I’m not sure what I’d be using those graphics for, but I figured in the event that this system doesn’t work out and I want to repurpose it, those onboard graphics might come in handy. Maybe I can mine Ethereum or Bitcoin with it ;-).
This particular model comes with a cooler too, which is guaranteed to work well, and, in my experience, is pretty quiet. This thing is stuffed away in a closet, but sometimes people stay in my guest room, so I’d rather not have them be bothered by blaring fans in their closet.
It’s also fairly power-efficient, with a maximum TDP of 65 watts. I’m not super concerned about power efficiency (though power in Southern California is not… the cheapest), mostly because this is a hobby and I expect it to cost a little bit of money. That said, I’ll take the wins where I can, and I think this is a nice feature of the 2400G.
It also came with a free copy of Tom Clancy’s The Division 2, which I planned to buy anyway, so that’s a nice bonus!
Everything is mounted in this rack, with a 1U power strip/surge protector. I wouldn’t recommend it, however, since it’s very short and it’s tough to find equipment that fits in it. The server itself is mounted solely by the front mounting points, which seems precarious to me but also hasn’t caused any problems yet. I plan to get a sliding drawer for a laptop to put in here, but haven’t been able to find anything that would fit this rack. Eventually, when I own my own house and have the entire thing wired up with Cat 6 cable, this will also hold a patch panel and switch. For now, though, it’s content just holding Ikora herself.
That’s the hardware, in a nutshell. It has proven able to support everything I’ve tried so far without skipping a beat, including tens of Docker containers and a couple of different VMs running all at the same time.
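For the curious, spinning up one of those containers is roughly a one-liner. Here’s a sketch of how a Plex container might be launched using the official plexinc/pms-docker image; the paths and timezone are placeholders for illustration, not my actual setup:

```shell
# Hypothetical example -- adjust paths and timezone to your own setup
docker run -d \
  --name plex \
  -p 32400:32400/tcp \
  -e TZ="America/Los_Angeles" \
  -v /path/to/plex/config:/config \
  -v /path/to/media:/data \
  plexinc/pms-docker
```

Port 32400 is Plex’s standard web/API port, and the `/config` and `/data` mounts keep your library metadata and media on the host.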
Speaking of which, next time I’ll detail the software setup. I’m still finalizing and tweaking it for now (and likely will be for the foreseeable future), but once it’s in a place where I’m comfortable sharing everything I’ve done and all that I’ve learned, I’ll do another post about that.