Alex and I recently upgraded the server that runs our tools applications. We figured some people may be interested in what powers our applications, so we decided to make a post showcasing the upgrade. As everyone knows, our tools are GPU-powered, but we have never talked much about the actual guts of the machine.

I got into GPU computing a few years ago when it was discovered that GPUs could dramatically speed up cracking password hashes. One of the founders of the Backtrack-Linux CD, Muts, asked me to look into getting some of the security tools which utilized GPU processing working on Backtrack. At the time I knew very little about the subject. My first box of GPUs had three 8800 GT cards and crunched WPA passwords at a whopping 15,000 PMK/s. I built the first set of packages for Backtrack, which included pyrit and a few other tools. Pretty soon this setup was just not good enough for me and I decided I needed a dedicated server, mainly because the X server was almost unusable while tools like pyrit were running, since pyrit uses CPU power in its computations as well. I started to look into my options and discovered that the Nvidia 295 GTX was the current "top dog" when it came to Nvidia cards, so I decided to get some. I saved up the cash and purchased three of these cards, which set me back about $1,500. The problem was I didn't leave much money for the rest of the components, so the first version of Kracker was built on a budget. I got the cheapest board I could find with three slots, which was a Foxconn Destroyer board, a used Intel Q6600 quad core, and 8 gigs of cheap G.Skill RAM. The case was a $100 4U rackmount case from some no-name company. The only other thing that cost a good bit was a 1200 watt power supply. All in all I had spent about $2,600 on the server.
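For anyone wondering what a PMK actually is: WPA/WPA2-PSK derives the Pairwise Master Key from the passphrase and the network's SSID using PBKDF2-HMAC-SHA1 with 4096 iterations, which is exactly why cracking it is so slow on a CPU and why GPUs make such a huge difference. Here is a minimal Python sketch of that single computation, just for illustration (the SSID and passphrase are made-up examples, not from our setup); that PMK/s number is basically how many times per second a tool like pyrit can do this:

```python
import hashlib

# WPA/WPA2-PSK derives the Pairwise Master Key (PMK) from the passphrase
# and SSID with PBKDF2-HMAC-SHA1, 4096 iterations, 256-bit output.
# The SSID and passphrase below are made-up examples.
ssid = b"ExampleNet"
passphrase = b"correct horse battery staple"

pmk = hashlib.pbkdf2_hmac("sha1", passphrase, ssid, 4096, dklen=32)
print(pmk.hex())
```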

That was about a year ago, and this winter I decided an upgrade was in order. We wanted to offer our users a faster crack and to be able to crunch through bigger lists. I was finally able to assemble all the parts I wanted for rebuilding the box. In the following article I will detail the rebuild.

The hardest part about this type of build is finding a motherboard which will hold and support four double-width video cards. There are a few out there, but not many. I finally settled on the ASRock X58 SuperComputer. It got very good reviews and had the four slots I needed.

Here is a picture of the board:

The only problem with buying this board is that neither my CPU nor my current RAM was going to work. The original reason for the upgrade was to go from three video cards to four for some increased performance; however, as I soon found out, that meant upgrading everything.
Because I was on a budget I decided to go with the Intel i7 920. I am a pretty good overclocker and I figured I could get it up to 3.6 GHz or so without too much issue.
Here is a pic of the CPU I used:

The next big choice was RAM. I decided to go all out and get a 12 GB kit of Corsair Dominator RAM.
I probably could have used cheaper RAM, but it looked so cool:

The first thing to do was gut the entire case:

In order to make four 295 GTX cards fit in a case like this, some modifications needed to be made. We had to completely remove the entire back panel spacing section and both of the rear fans. This will be easier to see at the end when I show the back section. Another thing that gave me trouble is that the new 295 GTX cards have a different design, so they are a little bit fatter than the old cards, and I had to account for that. As you can see in the picture, I removed the CD drive bay and the hard drive bay as well. The reason for this is that I will be using two power supplies and I need the extra room for ventilation. As you can imagine, four 295 GTX cards are going to get a little hot, so we need every ounce of airflow we can get.
The next step is to assemble the motherboard:

We added some Arctic 5 thermal paste to the CPU, which in my opinion is the best thermal compound around. I went ahead and put the RAM in first since the CPU cooler I bought was a little bigger than I had planned on and I wasn't sure I would be able to get it all in there.
The CPU cooler:

Here is what it looked like installed:

The cooler is a little taller than my 4U rack case, so I do not recommend getting this one unless you are willing to cut a hole in the top of your case. It actually looks pretty mean and nasty sticking out of the top of my case like a real V8.

Once we got that done, we mounted the board in the case and added our four 295 GTX video cards.
It was a tight fit but we made it work:

The next step was the power supplies, and here is the problem: a 295 GTX video card under full load pulls around 300 watts, so our four cards alone are going to draw about 1200 watts, which is all my current power supply is rated for. So in order to get this thing the juice it needs, we had to add a second power supply.
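To put some rough numbers on it (the per-component figures below are my own ballpark estimates, not measurements), the back-of-the-envelope math looks something like this:

```python
# Rough power budget for the build (ballpark estimates, not measurements).
gpu_watts = 300             # approximate draw of one 295 GTX under full load
gpu_count = 4
rest_of_system_watts = 250  # assumed allowance for CPU, board, RAM, fans, SSD

total_watts = gpu_watts * gpu_count + rest_of_system_watts
print(f"Estimated full load: {total_watts} W")  # well past a single 1200 W supply
```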
The original power supply I had was a 1200 watt Thermaltake and was still in good shape so we went ahead and added that one in the standard location:

I chose an 850 watt Thermaltake power supply for my second one. I have always had great luck with their power supplies, so I always try to use them.
Now, in order to use two power supplies, we needed to find a way to make them both come on at the same time; basically, you need a small adapter that lets the primary supply trigger the second supply's power-on (PS_ON) signal. In the past I have used a more expensive, fancier part; however, after helping my friend ReL1k build a similar system, he had some issues with the part I suggested, so after a little research I found the part we needed at Frozen CPU.
It would actually be easy to make this, but for $14.99 it seemed much easier to just buy the damn thing:

Once the two power supplies were placed, it became sort of a mess and hard to take pictures; however, I tried to highlight the locations in this one:

As you can see I routed the power cord along the top edge and out the back:

You can also see in this picture how much of the back area I had to remove in order to make all 4 cards fit properly.

The last thing to get going was the hard drive. I recently got a new laptop with a 250 gig SSD and I love it. The speed of SSDs is absolutely amazing, so we decided to shell out the cash for an SSD. We ended up going with a 128 gig Corsair laptop drive. I went with the laptop drive to save on space. I took an old drive cage from a junk laptop and used a Dremel to modify it to suit my needs. I then used a few pieces of heavy-duty Velcro and mounted it to the side of the case where it is out of the way.
Here it is mounted in place:

You can also see the routing path I used for the power cable in that picture.
Okay, so at this point everything is in place, but there is still one huge problem: cooling. As you may well imagine, this guy is going to get HOT!
I had to remove the back two fans to make room for the CPU cooler, so that left me with only one fan in the front. I did replace this fan with a high-end Cooler Master fan, but this was not enough.
In my old server I had created a bar with three 120 mm fans mounted to it which spanned the width of the case. I tried to make this work again, but due to the second power supply it wouldn't fit. I broke out the sheet metal and the saw and started to create a new one. As I was working, I started looking around through old parts and found an old Penguin Computing server which had a sort of fan wall just like the one I was trying to build. It had four 80 mm fans in it and was perfect. I removed the crappy old fans and added new Cooler Master fans.
Here is the fan bar I added to the box:

This created a huge amount of airflow in the case. It is now running cooler than ever.
The RAM also has its own fan assembly:

After getting all this going, I did a bit of cleanup on the cables, although there is not a ton of room to work with. Even using modular power supplies, there is a lot of cable.
Here is the finished product:

The last thing we had to do was cut a hole for the CPU cooler. This is of course not needed if you buy a CPU cooler that fits, but the V8 one was just too awesome and we had to get it.

Alex and I had an awesome time building this beast and managed to do it all with only one day of downtime. Not bad for a couple of Kentucky boys.
We have a few more upgrades planned for the future: we would like to upgrade the CPU to an i7 965 Extreme Edition, and we would like to get a few more SSDs and put them in RAID, but for now we are happy with our creation.

5 Responses to “The Tools Server gets a Fat Upgrade”
  1. vidkun says:

    Are you guys still running the quad 295s or have you upgraded to the 480s? I'd love to see some numbers on how many PMKs you guys are getting these days and the current box's hardware.

    BTW, nice presentation at the ISSA conf.

  2. purehate says:

    Hey Vidkun,
    On the production box we are currently still running the four 295 GTXs. We do have access to a box with eight 480 GTX cards, which is what we are doing all our testing and demos with. Over the winter we will be building a new box which will most likely contain 500 series cards. We plan to keep this project alive and keep making it better and better.

  3. vidkun says:

    Eight 480s!? That's plain insane!! I love it! I'd love to see some details/numbers on that one. How did you manage to get 8 cards in a single box, or how do the logistics of that work out? Two separate boards running quad cards each and then using something like pyrit with network distribution?

  4. compaq says:

    What motherboard has the ability to handle 8 single GPUs? So far all I have seen is one that can handle 7, and those are single-slot ones.

  5. jukan says:

    I think you should use ATI 6990 or ATI 5970 because ATI > NVIDIA for this type of computation ???
