Requirement: Python objects with 2-3 levels of nesting, containing basic data types like integers, strings, lists, and dicts (no dates, etc.), need to be stored as JSON in Redis against a key. What are the best methods available for compressing the JSON into a compact string to keep the memory footprint low? The target objects are not very large, with about 1000 small elements on average, or roughly 15000 characters when converted to JSON.
e.g.
>>> my_dict
{'details': {'1': {'age': 13, 'name': 'dhruv'}, '2': {'age': 15, 'name': 'Matt'}}, 'members': ['1', '2']}
>>> json.dumps(my_dict)
'{"details": {"1": {"age": 13, "name": "dhruv"}, "2": {"age": 15, "name": "Matt"}}, "members": ["1", "2"]}'
### SOME BASIC COMPACTION ###
>>> json.dumps(my_dict, separators=(',',':'))
'{"details":{"1":{"age":13,"name":"dhruv"},"2":{"age":15,"name":"Matt"}},"members":["1","2"]}'
1/ Are there any better ways to compress the JSON to save memory in Redis (while also keeping decoding lightweight afterwards)?
2/ How good a candidate would msgpack [http://msgpack.org/] be?
3/ Should I consider options like pickle as well?
We just use gzip as a compressor.
import gzip
import cStringIO
def decompressStringToFile(value, outputFile):
    """
    decompress the given string value (which must be valid compressed gzip
    data) and write the result in the given open file.
    """
    stream = cStringIO.StringIO(value)
    decompressor = gzip.GzipFile(fileobj=stream, mode='r')
    while True:  # until EOF
        chunk = decompressor.read(8192)
        if not chunk:
            decompressor.close()
            outputFile.close()
            return
        outputFile.write(chunk)

def compressFileToString(inputFile):
    """
    read the given open file, compress the data and return it as string.
    """
    stream = cStringIO.StringIO()
    compressor = gzip.GzipFile(fileobj=stream, mode='w')
    while True:  # until EOF
        chunk = inputFile.read(8192)
        if not chunk:  # EOF?
            compressor.close()
            return stream.getvalue()
        compressor.write(chunk)
In our use case we store the result as files, as you can imagine. To work purely with in-memory strings, you can use a cStringIO.StringIO() object as a replacement for the file as well.
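If the target is a Redis value rather than a file, a minimal in-memory sketch of the same gzip approach could look like this (Python 2, assuming the redis-py client is installed and a local Redis server is reachable; compress_json, decompress_json, the connection parameters and the key 'my_key' are only illustrative names, not part of the code above):

import gzip
import json
import cStringIO

import redis  # assumption: redis-py is available

# The example dict from the question.
my_dict = {'details': {'1': {'age': 13, 'name': 'dhruv'},
                       '2': {'age': 15, 'name': 'Matt'}},
           'members': ['1', '2']}

def compress_json(obj):
    """Dump obj as compact JSON and gzip it entirely in memory."""
    buf = cStringIO.StringIO()
    compressor = gzip.GzipFile(fileobj=buf, mode='w')
    compressor.write(json.dumps(obj, separators=(',', ':')))
    compressor.close()  # flushes the gzip trailer before we read the buffer
    return buf.getvalue()

def decompress_json(blob):
    """Reverse of compress_json: gunzip the string and parse the JSON."""
    decompressor = gzip.GzipFile(fileobj=cStringIO.StringIO(blob), mode='r')
    return json.loads(decompressor.read())

r = redis.StrictRedis(host='localhost', port=6379, db=0)  # placeholder connection
r.set('my_key', compress_json(my_dict))       # store the compressed JSON against a key
restored = decompress_json(r.get('my_key'))   # lightweight decoding on the way back
assert restored == my_dict

Since JSON with many repeated key names tends to compress well, gzip usually shrinks a payload of this size to a small fraction of the original; comparing len(compress_json(obj)) against len(json.dumps(obj)) on your real data is the quickest way to check whether the saving justifies the extra CPU on decode.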