I'm trying to serialize a list of Python objects to JSON (using simplejson) and am getting the error that the object "is not JSON serializable".
The class is a simple class whose fields are only integers, strings, and floats, and it inherits similar fields from one parent superclass, e.g.:
class ParentClass:
    def __init__(self, foo):
        self.foo = foo

class ChildClass(ParentClass):
    def __init__(self, foo, bar):
        ParentClass.__init__(self, foo)
        self.bar = bar

bar1 = ChildClass(my_foo, my_bar)
bar2 = ChildClass(my_foo, my_bar)
my_list_of_objects = [bar1, bar2]

simplejson.dump(my_list_of_objects, my_filename)
where foo, bar are simple types like I mentioned above. The only tricky thing is that ChildClass sometimes has a field that refers to another object (of a type that is not ParentClass or ChildClass).
What is the easiest way to serialize this as a JSON object with simplejson? Is it sufficient to make it serializable as a dictionary? Is the best way to simply write a dict method for ChildClass? Finally, does having the field that refers to another object significantly complicate things? If so, I can rewrite my code to only have simple fields in classes (like strings/floats etc.)
Thank you.
I've used this strategy in the past and been pretty happy with it: encode your custom objects as JSON object literals (like Python dicts) with the following structure:
{ '__ClassName__': { ... } }
That's essentially a one-item dict whose single key is a special string that specifies what kind of object is encoded, and whose value is a dict of the instance's attributes. If that makes sense.
A very simple implementation of an encoder and a decoder (simplified from code I've actually used) is like so:
import json

TYPES = {'ParentClass': ParentClass,
         'ChildClass': ChildClass}

class CustomTypeEncoder(json.JSONEncoder):
    """A custom JSONEncoder class that knows how to encode core custom
    objects.

    Custom objects are encoded as JSON object literals (ie, dicts) with
    one key, '__TypeName__', where 'TypeName' is the actual name of the
    type to which the object belongs. That single key maps to another
    object literal which is just the __dict__ of the object encoded."""

    def default(self, obj):
        # isinstance() needs a class or tuple of classes, so wrap the
        # dict's values in tuple().
        if isinstance(obj, tuple(TYPES.values())):
            key = '__%s__' % obj.__class__.__name__
            return {key: obj.__dict__}
        return json.JSONEncoder.default(self, obj)

def CustomTypeDecoder(dct):
    if len(dct) == 1:
        # next(iter(...)) instead of .items()[0] so this works on
        # Python 3 as well as Python 2.
        type_name, value = next(iter(dct.items()))
        type_name = type_name.strip('_')
        if type_name in TYPES:
            return TYPES[type_name].from_dict(value)
    return dct
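To plug these in, pass the encoder class to json.dumps()/json.dump() via cls= and the decoder function to json.loads()/json.load() via object_hook=. A minimal sketch (the variable names are just for illustration, and decoding relies on the from_dict() method described below):

# Serialize: default() is called for each ChildClass instance.
data = [ChildClass(my_foo, my_bar), ChildClass(my_foo, my_bar)]
encoded = json.dumps(data, cls=CustomTypeEncoder, indent=2)
# Deserialize: object_hook is called for every decoded JSON object.
decoded = json.loads(encoded, object_hook=CustomTypeDecoder)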
This implementation assumes that the objects you're encoding will have a from_dict() class method that knows how to recreate an instance from a dict decoded from JSON.
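For example, a from_dict() class method on the classes from the question might look like this (just a sketch of one possible implementation; from_dict() is a convention of this answer, not part of json or simplejson):

class ParentClass(object):
    def __init__(self, foo):
        self.foo = foo

    @classmethod
    def from_dict(cls, dct):
        # Create an empty instance and copy the decoded attributes in,
        # bypassing __init__.
        obj = cls.__new__(cls)
        obj.__dict__.update(dct)
        return obj

class ChildClass(ParentClass):
    def __init__(self, foo, bar):
        ParentClass.__init__(self, foo)
        self.bar = bar

    # from_dict() is inherited from ParentClass and works unchanged,
    # since it just copies the decoded dict into __dict__.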
It's easy to expand the encoder and decoder to support custom types (e.g. datetime objects).
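As a sketch of what that might look like for datetime (the '__datetime__' key and the Extended* names are just conventions chosen here, and fromisoformat() needs Python 3.7+):

import datetime

class ExtendedTypeEncoder(CustomTypeEncoder):
    """Adds datetime support on top of CustomTypeEncoder."""
    def default(self, obj):
        if isinstance(obj, datetime.datetime):
            # Wrap datetimes the same way as the custom classes:
            # a one-key dict naming the type.
            return {'__datetime__': obj.isoformat()}
        return CustomTypeEncoder.default(self, obj)

def ExtendedTypeDecoder(dct):
    if len(dct) == 1 and '__datetime__' in dct:
        return datetime.datetime.fromisoformat(dct['__datetime__'])
    return CustomTypeDecoder(dct)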
EDIT, to answer your edit: The nice thing about an implementation like this is that it will automatically encode and decode instances of any object found in the TYPES mapping. That means it will automatically handle a ChildClass like so:
class ChildClass(object):
    def __init__(self):
        self.foo = 'foo'
        self.bar = 1.1
        self.parent = ParentClass(1)
That should result in JSON something like the following:
{"__ChildClass__": {
    "bar": 1.1,
    "foo": "foo",
    "parent": {
        "__ParentClass__": {
            "foo": 1
        }
    }
}}