This is the fifth post in our series on cloud server performance benchmarking. In this post, we'll look at encoding and encryption performance using a compilation of 7 different benchmarks.
Benchmark Setup
All benchmarked cloud servers were configured almost identically using CentOS 64-bit (or 32-bit in the case of EC2 m1.small, c1.medium, Gandi, and IBM cloud servers).
Benchmark Methodology
Individual benchmark scores are calculated using the Phoronix Test Suite. To improve statistical accuracy, Phoronix runs each test at least 3 times, repeating until the standard deviation across runs falls below 3.5%.
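The repeat-until-stable policy can be sketched as follows. This is an illustrative Python sketch of the idea, not Phoronix's actual implementation; the function names and the cap on total runs are assumptions.

```python
import statistics
import time

def run_until_stable(test_fn, min_runs=3, max_runs=15, rel_std_threshold=0.035):
    """Repeat a timed test until the relative standard deviation of the
    run times falls below the threshold (3.5% here, as in the post).
    max_runs is an illustrative safety cap, not a Phoronix setting."""
    times = []
    while len(times) < max_runs:
        start = time.perf_counter()
        test_fn()
        times.append(time.perf_counter() - start)
        if len(times) >= min_runs:
            rel_std = statistics.stdev(times) / statistics.mean(times)
            if rel_std < rel_std_threshold:
                break
    return times

# Dummy CPU-bound workload standing in for an encode test:
runs = run_until_stable(lambda: sum(i * i for i in range(100_000)))
print(len(runs) >= 3)  # True -- at least 3 runs are always taken
```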
We chose a dedicated, bare-metal cloud server as the performance baseline for this post, since it provides a more readily comparable reference for non-cloud configurations. The baseline is the NewServers Jumbo server, configured with dual quad-core Intel E5504 2.00 GHz processors and 48GB of DDR3 ECC RAM. We chose NewServers because it is the only IaaS cloud in this comparison that does not use a virtualization layer that could adversely affect the benchmark results: all NewServers servers run directly on physical hardware. The baseline server is assigned a score of 100, and every other server receives a score proportional to its performance, where greater than 100 represents better results and less than 100 represents poorer results. For example, a server with a score of 50 performed 50% worse than the baseline server overall, while a server with a score of 125 performed 25% better.
To compute the score, the results from each of the 7 benchmarks on the baseline server are compared with the same benchmark results for a cloud server. The baseline server's result represents 100% for each benchmark. If a cloud server scores higher than the baseline, it receives a score above 100% (proportional to how much higher), and vice versa for a lower score.
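Since all 7 benchmarks here report elapsed time (lower is better), the scoring described above can be sketched as below. The exact aggregation method is not stated in the post; this sketch assumes a weighted arithmetic mean using the [weight=100] values listed with each benchmark, and the timings are made-up illustrative numbers.

```python
def relative_scores(baseline_times, server_times):
    """Per-benchmark scores relative to the baseline (100 = baseline).
    These tests are time-based (lower is better), so
    score = baseline_time / server_time * 100."""
    return {name: baseline_times[name] / server_times[name] * 100
            for name in baseline_times}

def aggregate_score(scores, weights=None):
    """Weighted mean of per-benchmark scores (aggregation method assumed).
    Every benchmark in this post carries weight 100, so this
    reduces to a simple average."""
    if weights is None:
        weights = {name: 100 for name in scores}
    total_weight = sum(weights.values())
    return sum(scores[n] * weights[n] for n in scores) / total_weight

baseline = {"mp3": 20.0, "flac": 10.0, "gnupg": 40.0}  # seconds (made up)
server   = {"mp3": 40.0, "flac": 20.0, "gnupg": 80.0}  # twice as slow
scores = relative_scores(baseline, server)
print(aggregate_score(scores))  # 50.0 -- half the baseline's performance
```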
Benchmarks
The following benchmarks were used to calculate the aggregate encoding performance (Encode) score displayed in the results tables below.
Monkey Audio Encoding [weight=100]: This test times how long it takes to encode a sample WAV file to APE format.
WAV To FLAC [weight=100]: This test times how long it takes to encode a sample WAV file to FLAC format.
WAV To MP3 [weight=100]: LAME is an MP3 encoder licensed under the LGPL. This test measures the time required to encode a WAV file to MP3 format.
WAV To Ogg [weight=100]: This test times how long it takes to encode a sample WAV file to Ogg format.
WAV To WavPack [weight=100]: This test times how long it takes to encode a sample WAV file to WavPack format.
FFmpeg AVI to NTSC VCD [weight=100]: This test uses FFmpeg to measure the system's audio/video encoding performance.
GnuPG [weight=100]: This test times how long it takes to encrypt a 2GB file using GnuPG.
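All of the tests above follow the same pattern: wall-clock timing of a single encoder or encryption command. A minimal timing harness might look like the following; the commented-out encoder invocations are illustrative examples, not the exact commands or flags Phoronix uses, and the demo run times a harmless no-op Python process instead.

```python
import subprocess
import sys
import time

def time_command(argv):
    """Wall-clock time an external command, as the encode tests do."""
    start = time.perf_counter()
    subprocess.run(argv, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return time.perf_counter() - start

# Illustrative invocations (flags are assumptions, not Phoronix's exact ones):
#   time_command(["lame", "sample.wav", "sample.mp3"])        # WAV -> MP3
#   time_command(["flac", "-f", "sample.wav"])                # WAV -> FLAC
#   time_command(["gpg", "-c", "--batch", "sample.bin"])      # GnuPG encrypt

# Demo: time a no-op subprocess instead of a real encoder.
elapsed = time_command([sys.executable, "-c", "pass"])
print(elapsed > 0)  # True
```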
Results
The results are divided into tables by provider. If a provider has more than one data center location, a separate table is included for each location. Each table shows the server identifier, CPU architecture, memory (GB), and the aggregate baseline-relative score (as described above), linked to the complete Phoronix results. On the high end, there was less performance variation between providers in this post than in the previous 4 performance posts. The results also show that these benchmarks appear to be influenced mostly by CPU model and clock speed rather than by the number of CPUs/cores. The top performers in this post are NewServers, GoGrid and Bluelock.