Servers: Living Room Data Center
WTF
I decided years ago that I needed to build My Own Private Cloud, for education and my own use, but the feat was beyond me while I had a job and an otherwise overextended life. My work is more "infrastructure" than "application", and this project would require many layers of physical infrastructure, in redundancy, just to get started.
In contrast, a code mock-up for a web service could be complete with all components running on a single VM. My work is everything underneath the real-world version of that, all the layers between the power company and the cloud services. It's impossible to mock that up without actually building every layer.
Eventually I quit my job and upgraded to enterprise gear so that I could pull this off; it only took several months after that. Now I'm there, and my electricity bill is outrageous - but it's considerably cheaper than an actual datacenter. It's April 2021, and it's getting much too warm for this - I may move my cabinet and blades out onto the patio for the summer. Really.
Update 2024: I did move my cabinet onto the balcony, and the blades alone still kept the house outrageously hot. The electric bill ran about $700 for each of the last few months the stack was up. I was hired back at LinkedIn in September 2021 and shut the whole stack down very shortly after that, well before its first birthday; it's been dark since. Worth it.
My Private Cloud
My cloud is OpenStack. Legit, real, enterprise-grade OpenStack: an HA control plane, on a Ceph storage cluster, with real distributed load balancing, isolated multi-NIC networking, and all the other service-grade bells and whistles. This stack could easily support the internal needs of a software engineering org. With some hardening and more modern hardware it could serve production workloads.
Nitty-gritty installation details, and how to do it yourself: Kolla-Ansible Openstack Deployment
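To give a flavor of what that deployment looks like, here's a minimal sketch of a Kolla-Ansible globals.yml for this kind of stack. The release, VIP address, and interface names are illustrative placeholders, not my actual values, and it assumes Ceph is deployed separately and consumed as an external cluster:

```yaml
# /etc/kolla/globals.yml - illustrative fragment, placeholder values only
kolla_base_distro: "centos"
openstack_release: "victoria"

# HA control plane: HAProxy + keepalived float this internal VIP across the controllers
enable_haproxy: "yes"
kolla_internal_vip_address: "192.168.10.100"

# Isolated multi-NIC networking: API/tunnel traffic on one interface,
# provider/external traffic on another
network_interface: "eno1"
neutron_external_interface: "eno2"

# Image, block, and ephemeral storage all backed by the (external) Ceph cluster
glance_backend_ceph: "yes"
cinder_backend_ceph: "yes"
nova_backend_ceph: "yes"
```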
Cloud Physical
The cloudy part of my cloud lives on physical servers in my living room. I've pirated power from my oven's circuit, because that one carries about 5x the power of the circuit behind most of the outlets in my house. For redundancy, my servers and network gear also draw power from two other circuits. I've chosen the lowest-power CPUs I could, so the oven circuit actually has lots of headroom all by itself.
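Rough numbers behind that 5x claim, assuming a typical 40 A / 240 V electric range circuit versus a standard 15 A / 120 V outlet circuit (your panel may differ):

```latex
P_{\text{outlet}} \approx 120\,\mathrm{V} \times 15\,\mathrm{A} = 1.8\,\mathrm{kW}
\qquad
P_{\text{range}} \approx 240\,\mathrm{V} \times 40\,\mathrm{A} = 9.6\,\mathrm{kW} \approx 5.3\,P_{\text{outlet}}
```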
Server Cabinets
The cabinet is sealed, with front fans for positive pressure. Exhaust is vented with a barrel fan for improved cooling, allowing the servers to run more quietly.
Blade Enclosure
This thing is amazing for reducing cable madness and unifying management, at the cost of noise and storage capacity. The baseline networking, in both blades and enclosure, is 10Gbps.
Most amazing is how cheap the hardware is on eBay: DIY builders don't look for these, so when enterprise users upgrade their hardware, there's very little demand for the used gear - unlike the rackmount servers I populated my cabinet with, which are very popular in garage stacks.
OpenStack Servers
I'm using HP Enterprise for everything, from servers through network gear:
- ProLiant DL380p G8 2U servers (Ivy Bridge CPU)
- HP BLC7000 Blade Enclosure
- BL460c G6 blades (Westmere-EP CPU)
- BL460c G8 blades (Ivy Bridge CPU)
- ProCurve 2910AL Switches with 10Gbps trunk modules
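To connect the hardware back to the OpenStack roles above, here's a rough, hypothetical mapping onto the top-level groups of Kolla-Ansible's stock multinode inventory. The hostnames are made up, and the real inventory file defines many more service sub-groups below these:

```ini
# Illustrative Kolla-Ansible multinode inventory fragment - placeholder hostnames
[control]
# Three controllers so the database/message-queue quorum survives a node failure
dl380p-01
dl380p-02
dl380p-03

[network]
# Neutron agents (and, by default, the HAProxy/keepalived VIP) live here
dl380p-01
dl380p-02
dl380p-03

[compute]
# The blades carry the hypervisor load
bl460c-g8-01
bl460c-g8-02
bl460c-g6-01
bl460c-g6-02

[monitoring]
dl380p-01

[storage]
# Block-storage services (cinder-volume) - Ceph-backed in this setup
dl380p-01
dl380p-02
dl380p-03
```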