Building a Raspberry Pi 3 cluster for under £100 (£250 including five RPi3s)
Inspired by all the great Raspberry Pi projects out there, I thought I’d try designing & building something simple myself. The launch of the Raspberry Pi 3 in March 2016 got me enthusiastic about building my very own cluster of Pis (a “bramble”). Along the way I got to play with a 40W laser cutter, and spent more than a few hours learning basic design with the SketchUp and Inkscape applications:
The completed cluster measures 141 (w) x 150 (h) x 210mm (d) and weighs 1.4kg. (5.6 x 5.9 x 8.3″, 50oz)
My design is fairly similar to that of the commercial Pico Cluster (USA), who sell a fully-finished 5-node cluster of Raspberry Pi 2s for US$499. Once you add on international shipping, import tax, VAT, etc. it becomes a rather expensive £488.30 here in the UK…
3D design in SketchUp
Originally I made my design in 3D using the free version of SketchUp, and built rough templates of the Pis, network switch, USB hub, sockets, etc. However, exporting this to a 2D design in SVG/DXF format, ready for laser cutting, proved very messy. I tried using Simon Beard’s SVG plugin for SketchUp, but my SketchUp file had all sorts of rubbish in it which caused problems, and the resulting SVG files had to be laboriously cleaned up by hand in Inkscape.
I might have been better off designing in 2D with Inkscape, and then using SketchUp to make a 3D model to check component spacing, etc.?
2D design in Inkscape
I used the free Inkscape application for 2D design, ready for exporting to the laser cutter. Each colour is a different pass of the laser, at different power/speed levels, so the green lines are cut first to make holes for ports/screws/ventilation, orange is text/lines that are etched, yellow is text that is solid engraved and finally blue cuts the outside of each panel.
Download files for laser cutting on two 600x400x3mm sheets (the largest I could cut):
Prototypes laser-cut in 3mm MDF
Using exactly the same design files in the laser cutter, I made a couple of prototype cases in 3mm MDF (which is cheaper than acrylic). The first version (left) fitted together and worked, but cable routing was very tight and there was no ventilation at all. The final design (right) is narrower and shorter, but much deeper. Each case panel has etched annotations to make it easier to join them in the correct order, and plenty of vents were added. The external ports were moved, internal cable routing was much improved, and the case lid now has half the number of elastic clips.
Each case took two 600x400x3mm sheets, as unfortunately the design doesn’t quite fit on a single sheet.
Laser cutting was done at Access Space in Sheffield, with their 40W laser. The final case was cut in 3mm extruded acrylic (perspex), and took about 30 minutes; the design could be much optimised for a more efficient cutting order. Extruded (rather than cast) acrylic has a fine thickness tolerance, which is required for the elastic clips to work.
If you are in Sheffield, Hindleys is a good supplier for a wide range of acrylic sheets.
Case clipping system
There are various solutions for joining case panels together (glue, screws, etc) but I was particularly impressed with this ‘elastic clip’ designed by Patrick Fenner of Deferred Procrastination in Liverpool. It enables a remarkably strong case to be made without any extra parts, which somehow seems more ‘elegant’.
Full details of his clip design, including downloadable SVG files, are at: Laser-Cut Elastic-Clips.
My prototype case originally had 8 elastic clips holding the lid on, which was completely over-the-top (as well as needing at least 3 hands to fit!). I’ve replaced 4 of the clips with simple tabs instead, which works well.
Power, temperature & cooling
At idle, the entire system of five RPi3s, network switch & USB hub sips a mere 9W, and at 100% load it still only uses 22W in total. Power could probably be reduced further by disabling WiFi, Bluetooth and HDMI.
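As a rough sketch (I haven’t tried this on the cluster): HDMI can be powered off at runtime with the tvservice tool, and newer Raspbian releases provide device tree overlays to disable the RPi3’s radios – the overlay names vary with firmware version, so check /boot/overlays/README on your image.

# Power off the HDMI output until the next reboot
tvservice --off

# In /boot/config.txt, on a sufficiently recent Raspbian:
# dtoverlay=pi3-disable-wifi
# dtoverlay=pi3-disable-bt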
The USB hub can supply up to 60W (2.4A per Pi), which is more than enough for a couple of power-hungry external devices to be plugged into the USB ports. Using:
vcgencmd measure_temp
to measure the SoC core temperature, the cluster idles at 45°C (113°F) with passive cooling.
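To read all five nodes in one go, a quick loop over SSH does the job – pi1 to pi5 are hypothetical hostnames, so substitute your own:

# Report the SoC core temperature of every node (hostnames are assumptions)
for node in pi1 pi2 pi3 pi4 pi5; do
  echo -n "$node: "
  ssh pi@$node vcgencmd measure_temp
done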
At 100% load, using:
sysbench --test=cpu --cpu-max-prime=200000 --num-threads=4 run &
the SoC core temperatures reach 80°C (176°F) after 7 minutes. At this point the SoCs automatically throttle down their clock speed to avoid overheating. They can safely run long-term at this temperature, but you don’t get maximum performance.
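To push all five nodes at once and watch for throttling, the same SSH loop works (again with the hypothetical pi1–pi5 hostnames, and sysbench installed on each node). The ARM clock dropping below the RPi3’s stock 1200MHz shows throttling has kicked in:

# Start the CPU burn on every node in parallel
for node in pi1 pi2 pi3 pi4 pi5; do
  ssh pi@$node 'sysbench --test=cpu --cpu-max-prime=200000 --num-threads=4 run' &
done

# Watch one node's temperature and current ARM clock (reported in Hz)
watch -n 5 "ssh pi@pi1 'vcgencmd measure_temp; vcgencmd measure_clock arm'"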
What about active cooling? You can optionally strap on a 92mm fan if you are going to run the cluster at high load for extended periods.
Exactly the same case design should work with Raspberry Pi 2s, which run much cooler than the model 3 and so shouldn’t need any active cooling.
Optional 92mm case fan
If you want to run the cluster at 100% for extended periods, I’ve designed the case lid for externally mounting a suitable 92mm fan, which should be able to run from the 5V power and ground pins of the GPIO header. A neat solution would be to control the fan with the PWM pin, so it only spins up at high loads, although a quality modern fan should be nigh-on silent anyway.
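I haven’t wired this up, but as a sketch of the PWM idea: with the pigpio daemon (pigpiod) running, and the fan switched through a small transistor (a GPIO pin can’t supply enough current to drive a fan directly), a simple loop could tie the fan to the core temperature. GPIO18 and the 65°C threshold are my assumptions, not part of this build:

#!/bin/sh
# Sketch only: fan driven via a transistor from GPIO18, pigpiod running
while true; do
    # vcgencmd prints e.g. temp=61.2'C - keep just the integer part
    temp=$(vcgencmd measure_temp | sed "s/temp=\([0-9]*\).*/\1/")
    if [ "$temp" -gt 65 ]; then
        pigs p 18 255    # full speed above 65 degrees C
    else
        pigs p 18 0      # fan off
    fi
    sleep 10
done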
So far I’ve tried a Nanoxia Deep Silence 92mm Ultra-Quiet (£7.82), running at its slowest speed (5V) powered by one of the Pis. At idle the cluster reaches 37°C (an 8°C improvement). At 100% load the cluster averages 74°C, although two of the Pis get very close to throttling at 79-80°C. I have to get my ear within 50-75mm (2-3″) to hear even the slightest whisper from the fan, and the supplied rubber grommets definitely do a good job of isolating the case from any small vibrations.
I may try a more powerful 5V fan, or use a boost converter circuit to power the fan at 7V/9V so that it spins much faster.
Design choices
- RPi3 vs RPi2 – the RPi3 is shiny and new (and much faster), but for a cluster there probably isn’t much advantage: a Pi cluster isn’t built for high performance, you probably won’t use WiFi/Bluetooth on the RPi3, and an RPi2 runs cooler & uses less power.
- Horizontal vs Vertical mount – every other cluster I’ve seen stacks the Pis vertically on top of each other, but mounting them horizontally on rails should give much better airflow across the SoCs (for passive cooling); I had hoped I could design something without any fans. The horizontal rails are quite fiddly though, both to set up and when swapping out a faulty Pi.
- Gigabit vs 100Mbit switch – the Pi network ports are only 100Mbps, but using a 1000Mbps switch means there is no bottleneck if all the Pis simultaneously saturate their links talking to an outside network, e.g. a NAS.
- Switch 5V vs 12V – a network switch can be powered from a spare 5V USB instead of needing a separate 12V supply.
- Eco switch vs Standard – reduces power used when network ports aren’t connected, or aren’t currently active.
- Beefy USB hub vs Standard – many USB hubs don’t provide enough current to reliably power an RPi3. Each Pi can draw up to a maximum of 2.5A, but will actually draw much less in this cluster, without extra USB and GPIO accessories.
- External LAN x2 vs x1 – with 2 sockets you can chain multiple clusters together, and still have 1 socket to plug into a WAN, NAS, etc.
- Elastic case clips vs Screws – more elegant? A bit quicker to build, and very slightly cheaper.
- Glue hub vs Screw – unfortunately this USB hub has no mounting holes, and I didn’t want to drill into the metal heatsink, so a few drops of superglue to attach brass standoffs seemed like the easiest solution for attaching it to a case panel.
- Hub underneath Pis? – not ideal, as the hub itself generates some heat that has to go somewhere; this was a compromise because I wanted short, neat USB cables.
- Pi Heatsink vs None – should help dissipate the heat, as it is the large SoC chip that generates most of it.
- Case Fan vs None – the design allows for either: passive cooling through the case lid vents, or strap on a 92mm fan if you are going to run the cluster at high load for extended periods.
- Right-angled HDMI vs Standard – HDMI cables are too thick to bend easily, and a standard straight connector would require a taller case to fit.
- Transparent case vs Opaque – so you can see all the Pi-goodness inside. Obvs.
- Rainbow cables vs Black – just because!
Building the Pi Cluster
- Remove the network switch case (2 small screws)
- Remove the USB hub case (tricky – it needs to be carefully prised open, as there are no screws)
- Cut the C7 plug power cable to a ~35cm length of flex, then solder it onto the C8 screw-mount socket (it doesn’t matter which wire goes to which pin). I couldn’t source a ready-made cable with the right-angled C7 plug that I wanted
- Attach the network switch to the case base, using 4x 6mm brass spacers + bolts + nuts (this only fits one way around)
- Screw the 2 external LAN ports to the inside of the case back (which has “AC100-240V” etched on the outside)
- Bolt the C8 socket to the case back, and secure it with 2 nuts
- Clip the case back to the case base (marked C+D)
- Plug the external LAN ports into network switch ports 1+3 (there is no room to use 1+2)
- Plug the Pi LAN cables into network switch ports 4-8 (be very careful if removing these – it is easy to break the tiny plastic clips on the switch ports)
- Fit 4x 6mm brass standoffs to the underside of the case shelf, using screws
- Superglue the metal top of the USB hub to the brass standoffs
- Plug the USB cables and the C7 power plug into the USB hub (the power port on the hub is fragile, so support it with one hand while plugging in/out)
- Plug the USB hub into the network switch DC power
- Clip the case shelf to the case back (L+K)
Read the full article at http://climbers.net/diy-raspberry-pi-3-cluster/
Author: Nick Smith, Climbers.net UK
Climbing websites & Photography http://climbers.net/