Servers: Blade Enclosure - feralcoder/shared GitHub Wiki


Related

My Private Cloud · Living Room Data Center · Server Cabinets · OpenStack Servers

Blades? In a living room?

This monster: HP BladeSystem c7000 Blade Enclosure

Logistics

Power

The enclosure takes 6 power supplies, with several possible redundancy configurations. I'm using all 6: 4 are powered off a single 240V circuit, tapped from my kitchen stove, and the remaining 2 are each powered by their own 120V circuit, one from the kitchen and one from the general house outlet circuit. Any one circuit can fail, or both of the 120V circuits at once, without the enclosure going down.
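The failure tolerance above is easy to sanity-check by enumerating circuit failures. This is a minimal sketch; the circuit names and the 2-supply minimum are assumptions for illustration (the real threshold depends on the enclosure's configured power mode and load), not measured values.

```python
# Hypothetical layout from the text: 6 PSUs spread across 3 circuits.
CIRCUITS = {
    "240V-stove":   4,  # 4 PSUs on the 240V kitchen-stove circuit
    "120V-kitchen": 1,  # 1 PSU on a 120V kitchen circuit
    "120V-house":   1,  # 1 PSU on the general house outlet circuit
}
MIN_PSUS = 2  # assumed minimum supplies to keep the enclosure up

def survives(failed_circuits):
    """True if enough PSUs remain powered after these circuits fail."""
    remaining = sum(n for c, n in CIRCUITS.items() if c not in failed_circuits)
    return remaining >= MIN_PSUS

# Any single circuit failing leaves the enclosure up:
for circuit in CIRCUITS:
    print(f"{circuit} fails:", "OK" if survives({circuit}) else "DOWN")

# Both 120V circuits failing together is also survivable:
print("both 120V fail:", "OK" if survives({"120V-kitchen", "120V-house"}) else "DOWN")
```

Under these assumptions, losing the 240V circuit still leaves 2 supplies online, and losing both 120V circuits leaves 4, matching the claim in the text.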

One way I've kept the power load manageable is by choosing the lowest-power servers I could find. My gen8 ProLiant blades run at 65W per CPU, and the gen6 blades at 40W per CPU; all of them draw considerably less when idle.

Network

This is where the enclosure really shines. It can be populated with up to 8 network modules, each serving all the contained blades. Every ProLiant blade has 2 on-board Virtual Connect NICs, each of which presents 4 physical network ports to the OS. Additionally, every blade can be expanded with 2 mezzanine cards, each providing 2 more Virtual Connect NICs, allowing up to 24 network ports per server. All of this is done without any cables, automatically available via the enclosure's midplane connections.
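The port arithmetic above works out as follows. The counts mirror the text (2 on-board NICs, 2 mezzanine cards with 2 NICs each, 4 ports per NIC); this is just a quick sanity check, not a statement of every supported configuration.

```python
# Per-blade port math for a fully expanded ProLiant blade.
ONBOARD_NICS = 2     # on-board Virtual Connect NICs per blade
MEZZ_CARDS = 2       # mezzanine expansion slots per blade
NICS_PER_MEZZ = 2    # Virtual Connect NICs per mezzanine card
PORTS_PER_NIC = 4    # ports each physical NIC presents to the OS

nics = ONBOARD_NICS + MEZZ_CARDS * NICS_PER_MEZZ
ports = nics * PORTS_PER_NIC
print(f"{nics} physical NICs, {ports} ports per blade")  # 6 NICs, 24 ports
```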

Compare this photo with network photos on the Server Cabinets page...

Each NIC (up to 6 per blade) provides a 10Gbps connection to the network modules, and the 4 'virtual' ports on each (up to 24 total) can be configured with rates up to 10Gbps. These ports aren't really virtual, either: each has an independent network pipeline in hardware, so the OS sees up to 24 physical ports with no abstraction layer.
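One way to picture the rate configuration: each physical 10Gbps NIC is carved into up to 4 ports whose configured rates have to fit within that NIC's pipe. The validation rule below is an assumption for illustration, not HP's exact firmware behavior.

```python
# Sketch of partitioning one 10Gbps NIC into up to 4 configured ports.
NIC_RATE_GBPS = 10
MAX_PORTS_PER_NIC = 4

def validate_partition(rates_gbps):
    """Check a proposed per-port rate assignment against one NIC's capacity."""
    if len(rates_gbps) > MAX_PORTS_PER_NIC:
        raise ValueError("each NIC presents at most 4 ports")
    if sum(rates_gbps) > NIC_RATE_GBPS:
        raise ValueError("configured rates exceed the NIC's 10Gbps pipe")
    return rates_gbps

print(validate_partition([4, 4, 1, 1]))  # fits: 10Gbps total across 4 ports
print(validate_partition([10]))          # one port can take the whole pipe
```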

There are some limitations to the Virtual Connect technology: some traditional advanced network features aren't available, such as trunk ports to servers with native untagged VLANs. For those cases, Cisco modules are available for the enclosure, and a combination of traditional and Virtual Connect ports can be used in each blade.

Price

There is very little home-and-garage demand for this kind of hardware, so when enterprise owners upgrade their cages and sell these on eBay, the prices stay ridiculously low.

Here's a rough price breakdown:

For comparison, equivalent networking on traditional servers would cost at least $50 per card, and thousands of dollars for the switch.

The most expensive parts, by far, are CPUs and RAM, because they're the same parts used in traditional pizza-box servers, which see much higher DIY demand.