Redis-python setting multiple key/values in one operation

alonisser · Mar 5, 2014 · Viewed 20.5k times

Currently I use the basic mset feature to store one key/value pair per call:

from common.redis_client import get_redis_client

cache = get_redis_client()
for k, v in some_dict.items():
    kw = {k: v}
    cache.mset(kw)

# later:
cache.get('some_key')

I store each key/value pair separately (not in one JSON blob, for example), since storing the whole dict would turn it into a single string and would require me to serialize/deserialize on every store and retrieve, and I really need access to the separate key/values.

My question: is there a way I can mset multiple key/values at once, instead of multiple writes to the Redis db? And vice versa, can I do multiple reads (get) in one access? (And yes, I have a lot of Redis activity going on, under heavy load, so I do care about this.)

Answer

Pascal Le Merrer · Mar 5, 2014

Updated after Agis's comment.

If you use redis-py, which is currently the recommended Redis client for Python, you can use pipelining, which does exactly what you want. Here is a simple example:

>>> import redis
>>> r = redis.Redis(...)
>>> r.set('bing', 'baz')
>>> # Use the pipeline() method to create a pipeline instance
>>> pipe = r.pipeline()
>>> # The following SET commands are buffered
>>> pipe.set('foo', 'bar')
>>> pipe.get('bing')
>>> # the EXECUTE call sends all buffered commands to the server, returning
>>> # a list of responses, one for each command.
>>> pipe.execute()
[True, 'baz']
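Note that for the plain batched reads and writes the question asks about, redis-py also exposes MSET and MGET directly: `mset` takes a whole mapping in one round trip, and `mget` takes a list of keys and returns a list of values (`None` for missing keys). Against a live server the calls are simply `r.mset(some_dict)` and `r.mget(keys)`; the sketch below substitutes a tiny in-memory stand-in (a hypothetical `FakeRedis`, just so it runs without a server) while keeping redis-py's call shapes:

```python
class FakeRedis:
    """In-memory stand-in mimicking redis-py's mset/mget call shapes."""

    def __init__(self):
        self._data = {}

    def mset(self, mapping):
        # redis-py: Redis.mset(mapping) -> True; one MSET command, one round trip
        self._data.update(mapping)
        return True

    def mget(self, keys):
        # redis-py: Redis.mget(keys) -> list of values, None for missing keys
        return [self._data.get(k) for k in keys]


cache = FakeRedis()          # with a real server: cache = redis.Redis(...)
some_dict = {'a': '1', 'b': '2', 'c': '3'}

cache.mset(some_dict)        # all writes in one operation
values = cache.mget(['a', 'c', 'missing'])
print(values)                # ['1', '3', None]
```

The trade-off versus a pipeline is that MSET/MGET are single commands (and MSET is atomic), while a pipeline can batch arbitrary mixed commands.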

I don't know which Redis client you are using, but either it supports pipelining, or you should consider switching to redis-py.

Have a look at the Redis documentation on pipelining; it explains that you can expect roughly a 5x performance boost, but also that you should not make individual batches too large (around 10,000 operations per execute() call is a reasonable batch size).
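Given that batch-size limit, one way to write a very large mapping is to split it into fixed-size chunks and send each chunk as one MSET. A minimal sketch; `chunked` and `mset_in_chunks` are hypothetical helper names, and `client` can be any redis-py `Redis` instance:

```python
from itertools import islice


def chunked(mapping, size):
    """Yield successive dicts of at most `size` items from `mapping`."""
    it = iter(mapping.items())
    while chunk := dict(islice(it, size)):
        yield chunk


def mset_in_chunks(client, mapping, size=10_000):
    """Write `mapping` via MSET in batches of at most `size` keys each."""
    for chunk in chunked(mapping, size):
        client.mset(chunk)  # one MSET command (one round trip) per chunk
```

With a real server this would be called as `mset_in_chunks(redis.Redis(...), big_dict)`; the same chunking idea applies to buffering commands into a pipeline and calling `execute()` per chunk.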