Each machine needs to be connected to the switch that we mentioned earlier using a standard Ethernet cable. You also need to connect the switch to your Internet or wireless router. A router is needed to assign a network address to the machines in the cluster; this should happen automatically thanks to the DHCP server running on the router.
If you don't want to use a router (or connect your cluster to the Internet), you can run a DHCP server on your master machine instead. This will assign an IP address to each machine on the switch.
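As a rough sketch of what that involves (assuming you use the dnsmasq package from the Ubuntu repositories; the interface name and address range below are just examples for a typical home network), you would install a lightweight DHCP server on the master and point it at the interface that faces the switch:

sudo apt-get install dnsmasq
# then add two lines to /etc/dnsmasq.conf, for example:
#   interface=eth0
#   dhcp-range=192.168.1.50,192.168.1.150,12h
sudo /etc/init.d/dnsmasq restart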
Installing the software
With the hard stuff out of the way, it's time to install the software. We used the latest version of the Ubuntu Linux distribution, Intrepid Ibex.
Ubuntu is our first choice because there is such a wide variety of packages available, and these can be installed through the default package manager without any further configuration.
This makes installing apps like Blender across your cluster as easy as typing a single command on each machine, and it also means that you can install support utilities such as the DrQueue render queue manager just as easily.
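For example, putting Blender on a node really is a one-liner from the terminal (the blender package name is standard in Ubuntu; check your repositories for the exact name of the DrQueue package before relying on it):

sudo apt-get install blender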
We opted for the Desktop rather than the Server edition. We still needed a desktop on each node to set up our various tests, and although we could have managed the same with the Server edition (which has no desktop by default), configuration is easier with a desktop to hand.
Installing Ubuntu is easy but a little tedious, as you need to go through the following routine for each machine. After burning the ISO image of the distro to a CD, boot each machine in turn with the disc in the optical drive. If it doesn't boot, check the boot order in your BIOS. Next, choose the 'Install Ubuntu' option from the boot menu and answer the resulting questions.
On the Partition Options page, use the entire disk for the installation (unless you're sharing the machine with a Windows OS), and create the same user account on each machine. This makes setting up the shared storage device easier. You should also give each computer within the cluster its own name (on the Who Are You page).
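If you want to check or change a node's name after installation, a minimal sketch for this version of Ubuntu (the name 'node02' is just an example) looks like this:

hostname                   # show the node's current name
sudo nano /etc/hostname    # change it, e.g. to node02
sudo nano /etc/hosts       # update the matching 127.0.1.1 entry to the same name
sudo hostname node02       # apply the new name without rebooting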
Installation can take up to 45 minutes, depending on the speed of each node. When finished, your machine will restart and you'll need to log in to your new desktop. It's likely that you will also have to install a few security updates; a small yellow balloon message will open if these are necessary. The next step is to install the control software.
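If you'd rather pull the updates in from a terminal than wait for the balloon message, the standard commands are:

sudo apt-get update     # refresh the package lists
sudo apt-get upgrade    # install any pending updates, including security fixes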
New software can be installed through the Synaptic package manager, which you'll find in the 'System | Administration' menu. First, search for a package called 'openssh-server'.
In Synaptic, click on the package to mark it for installation, followed by 'Apply' to download and install it. Do the same for a package called 'tightvncserver'. Both of these tools are used for remote administration, so you won't need them if you plan to keep a keyboard, mouse and screen attached to your various nodes.
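If you prefer the terminal to Synaptic, the same two packages can be installed on each node in one go:

sudo apt-get install openssh-server tightvncserver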
To connect to each machine from a Windows PC on the network or from one of the Linux machines, you need an SSH client. The most popular application for Windows is the freely available PuTTY, while Linux users can simply type 'ssh -l username IP_address' at the command line.
To find the IP address of each node, either use your router's web interface to list connected devices (if it has one) or right-click on the network icon on the Ubuntu desktop and select 'Connection Information'. If you're using the command line, type 'ifconfig' (Linux's equivalent of Windows' 'ipconfig').
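Putting the two steps together, a session might look like this (the interface name, username and address are placeholders for whatever your network actually uses):

ifconfig eth0 | grep "inet addr"   # prints something like: inet addr:192.168.1.101
ssh -l mike 192.168.1.101          # then connect to that node from another machine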