The Graphics Processing Unit (GPU) workstation: a photographic tutorial.

The parts list for this workstation was originally written by Ross C. Walker and is available at ambermd.org/gpu. I substituted a few components to keep the workstation significantly cooler. The build took about an hour.

First, we have our empty case. The case must be compatible with the size of the motherboard, which is the heart of the GPU workstation. For this specific build, the case has a few nice features: an external HD dock to ease data backup; a side-panel 200mm fan to cool the GPUs; five internal quick-change HD bays; and space for two 200mm fans on top, two 120mm fans in front, one 200mm fan on the right side, and one 120mm fan in back.

Figure 1. Empty case.

Next we install the motherboard, CPU, and power supply.

Figure 2. Motherboard, power supply unit (PSU), and CPU. Before installing the PSU into the case, I attached all the power cables for the workstation, which eased installation and organization of the system build. The cooling fan of the PSU faces down (left in the figure), and the case has a grille to allow airflow to the PSU. Counterclockwise from the top-left of the motherboard: audio peripheral to case, USB peripheral to case, power-switch/reset-switch/HD-status to case, SATA to internal HD, SATA to external HD dock on case, main motherboard power (and external USB to case above), CPU fan power, CPU and CPU cooler, and 12V power to motherboard.

Now the GPUs. I used reference cards (EVGA GTX 780) because of their superior cooling design, which is critical to a robust, stable workstation (this one has been running 24/7 for over a year).

Figure 3.1. Four GTX 780 GPUs. Cooling design: air is drawn in through the inlet on the left of the card and exhausted to the right. Importantly, cool air is drawn from inside the case (as opposed to the very hot air coming off adjacent cards in a standard twin-fan design), and the exhaust exits through the rear of the case. Thus the heat generated by the GPUs is expelled outside the case.

Figure 3.2. GTX 780 GPU air inlet.

The GPUs are installed easily. First, remove the expansion-slot covers on the rear of the case next to the PCIe slots (all of them, for four GPUs) with a screwdriver. Seat each GPU by pressing it into the PCIe slot until you hear a click; this may take a good strong push (make sure the retention clip on the motherboard slot is in the open position, as seating the GPU into the slot will close this clip, producing the "click"). To make seating a bit easier, align the video connectors on the GPU's bracket with the slot opening in the case, and track the opening until the GPU is fully seated. Remember: seating is complete only after you hear the "click". Finally, secure the GPU to the case with the screws that held the slot covers. Trick: do not fully tighten these screws until all four GPUs are seated, so you can make small adjustments to keep the GPUs from touching each other before securing them.

Figure 4.1. Seated GPU.

Figure 4.2. Seated GPU, close-up of the PCIe connection.

Repeat the seating procedure outlined above to install all four GPUs. With all four GPUs properly seated, secure the installation by tightening the screws between the GPUs and the case.

Figure 5. All four GPUs are seated and installed. The screws that secure the GPUs to the case (top) can now be tightened, keeping adjacent GPUs from touching.

Finally, we plug the power cables from the PSU into the GPUs. With all the components now installed on the motherboard, attaching power cables at the PSU end would be much more difficult; if you haven't already done so in your own build, attach all the power cables to the PSU before installing it. In this build, notice how the cables are routed so they naturally bend away from the GPUs, keeping airflow in the case smooth.

Figure 6. All done. Clean wiring and methodical installation make for a simple and stable GPU workstation.

The build is now complete. Secure the right panel (which carries a 200mm fan, plugged into one of the three system-fan headers on the motherboard), attach the AC cable to the PSU, power up, and let the system post.

After the system posts, you can install the OS of your choice. For a small cluster or a single workstation, a free Linux OS like Ubuntu works well. Once your OS is installed, download the NVIDIA driver and toolkit from the NVIDIA website (specific to your architecture, OS, and GPU version). If you are using a recent version of Ubuntu (say 14.04), you will need to perform a few steps to get your NVIDIA system running. Have a second computer ready so you can follow the multi-step instructions while the workstation's display manager is down. This process took me a few hours to go from post to fully installed NVIDIA driver and toolkit, capable of running CUDA code like pmemd.cuda in AMBER14 (see ambermd.org/gpu).
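Before downloading anything, it helps to confirm what the OS can see. Here is a minimal sanity check (assuming a stock Ubuntu install; both commands ship with the base system):

    # Confirm the architecture before choosing a driver/toolkit download
    uname -m            # expect x86_64
    # Confirm the kernel sees all four GPUs on the PCIe bus
    lspci | grep -i nvidia

If all four cards do not appear in the lspci output, reseat the missing GPU before going any further.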

Note that you will want to make sure the NVIDIA driver and toolkit are *both* compatible with your GPU (GTX 7-series), architecture (x86-64), and OS (Ubuntu 14.04). After you download the appropriate driver and toolkit, you will need to disable nouveau (blacklist it) and shut down Xorg (sudo service lightdm stop). Then log in at the text console with your username and password, run the NVIDIA installer, and follow the prompts.
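Concretely, the sequence looks roughly like the following. This is a sketch assuming Ubuntu 14.04 and the .run installers; the installer filenames are placeholders for whichever versions you downloaded:

    # Blacklist nouveau so it cannot claim the GPUs at boot
    printf 'blacklist nouveau\noptions nouveau modeset=0\n' | \
        sudo tee /etc/modprobe.d/blacklist-nouveau.conf
    sudo update-initramfs -u
    sudo reboot

    # After the reboot, switch to a text console (Ctrl+Alt+F1),
    # log in, and stop the display manager
    sudo service lightdm stop

    # Run the installers and follow the prompts (filenames are placeholders)
    sudo sh NVIDIA-Linux-x86_64-<version>.run
    sudo sh cuda_<version>_linux.run

    # Verify: nvidia-smi should list all four GTX 780s
    nvidia-smi

The CUDA toolkit installs under /usr/local/cuda by default, so add /usr/local/cuda/bin to your PATH (and /usr/local/cuda/lib64 to LD_LIBRARY_PATH) before building or running CUDA code such as pmemd.cuda.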

Installation of the NVIDIA driver and toolkit can be complicated by system-specific setup, gcc version, and the like. In my case, the OS would not load properly after installation; the culprit turned out to be the numlock greeter setting, which I had to disable. The lesson: be patient, and use online forums to help you debug peculiar errors during installation.