Memcache vs Memcached PHP benchmark

We are talking about the two PHP memcache clients: the "Memcache" and "Memcached" extensions.

Should you use memcache or memcached? Some people have written about this, for example:

Memcached 16% faster than Memcache?

Nobody seemed to have benchmarked the two clients against each other, so I did just that. I tested both clients on the same machine, running Ubuntu 12.04, with default settings for both the clients and the server. This is the benchmark script I used:

<?php
// Initialize values: 10000 keys of 20 bytes with 40 bytes of data each
$c = 10000;
$values = array();
for ($i = 0; $i < $c; $i++) $values[sprintf('%020s', $i)] = sha1($i);
echo "memcache vs memcached: $c keys\n";
// Memcached (the libmemcached-based client)
$m = new Memcached();
$m->addServer('localhost', 11211);
$start = microtime(true);
foreach ($values as $k => $v) $m->set($k, $v, 3600); // TTL of 1 hour
$time = microtime(true) - $start;
echo "memcached set: $time\n";
$start = microtime(true);
foreach ($values as $k => $v) $m->get($k);
$time = microtime(true) - $start;
echo "memcached get: $time\n";
// Memcache (the older client; note the extra "flags" argument to set)
$m = new Memcache();
$m->addServer('localhost', 11211);
$start = microtime(true);
foreach ($values as $k => $v) $m->set($k, $v, 0, 3600); // flags = 0, TTL of 1 hour
$time = microtime(true) - $start;
echo "memcache set: $time\n";
$start = microtime(true);
foreach ($values as $k => $v) $m->get($k);
$time = microtime(true) - $start;
echo "memcache get: $time\n";

And this is the output:


maurits@maurits-Aspire-X3960:~$ php memcache.php
memcache vs memcached: 10000 keys
memcached set: 0.91661500930786
memcached get: 0.86234307289124
memcache set: 1.0546097755432
memcache get: 1.0519700050354

We clearly see that memcached is faster than memcache (roughly 13% faster on sets and 18% faster on gets in this run), but to find out where the time is spent we have to use Xdebug profiling (install with "sudo apt-get install php5-xdebug"). We enabled profiling by setting "xdebug.profiler_enable = 1" in "/etc/php5/conf.d/xdebug.ini". After running the PHP script, a "cachegrind.out" file is created in the "/tmp" directory. KCachegrind (install with "sudo apt-get install kcachegrind") can analyze that file and make pretty graphs:
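For reference, this is roughly what the Xdebug configuration looks like on a default Ubuntu 12.04 install (the output directory line is optional, since /tmp is the default):

```ini
; /etc/php5/conf.d/xdebug.ini
xdebug.profiler_enable = 1
; optional: directory for the cachegrind.out.* files (defaults to /tmp)
xdebug.profiler_output_dir = /tmp
```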

[picture 1: KCachegrind call graph]

[picture 2: KCachegrind callee map]

[picture 3: KCachegrind flat profile]

Conclusion

The difference is really small, so raw speed alone is probably not the best reason to choose the memcached client over the memcache client 😉


Memcache bundle for Symfony2


Installation instructions can be found in the LswMemcacheBundle readme on GitHub.

When building high-traffic websites you have probably heard about "Memcache". If you want to optimize your web application for high load and/or fast page loads, it is an indispensable tool. The memcache website describes the software as a:

“Free & open source, high-performance, distributed memory object caching system, generic in nature, but intended for use in speeding up dynamic web applications by alleviating database load.

Memcached is an in-memory key-value store for small chunks of arbitrary data (strings, objects) from results of database calls, API calls, or page rendering.” — memcached.org

One of the ways you can apply Memcache is as a session storage solution. It will manage your session data without doing disk I/O on web or database servers. You can also run it as a central object storage for your website. In this role it is used for caching expensive API calls or database queries.
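As an illustration (a sketch, assuming the "Memcached" module is the installed client), pointing PHP's session handler at a memcache server takes two php.ini settings; with the older "Memcache" module the handler name is "memcache" and the save path uses a "tcp://" prefix instead:

```ini
; php.ini — keep session data in memcache instead of on local disk
session.save_handler = memcached
session.save_path = "localhost:11211"
```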

PHP 5 has good support for Memcache through the "Memcache" and "Memcached" modules (note the one-letter difference). The first module talks to the Memcache daemon directly, while the second one uses "libmemcached" to communicate. You can see the difference between them in terms of features in a matrix on the Memcached wiki. We chose the "Memcached" module because it offers more features, is newer, and is faster thanks to its support for the binary protocol.
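To give one concrete example of the feature difference: the binary protocol is only supported by the "Memcached" module. A minimal sketch of enabling it (assuming a memcache server is running on localhost):

```php
<?php
// Sketch: the binary protocol is a Memcached-module-only feature;
// the Memcache module always speaks the ASCII protocol.
$m = new Memcached();
$m->setOption(Memcached::OPT_BINARY_PROTOCOL, true);
$m->addServer('localhost', 11211);
$m->set('greeting', 'hello', 3600); // key, value, TTL in seconds
var_dump($m->get('greeting'));
```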

At LeaseWeb we have created a Symfony2 bundle with Web Debug Toolbar integration to help you optimize your web application performance. It is called "LswMemcacheBundle" and can be found on GitHub and Packagist.
