I am trying to parse a JSON object into a Python dict; I've never done this before. When I googled this particular error (what is wrong with the first char?), other posts said that the string being loaded is not actually a JSON string. I'm pretty sure mine is, though.
In this case, eval() works fine, but I'm wondering if there is a more appropriate way.
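Roughly what I mean by eval() working (a minimal sketch, using an abbreviated version of the string):

>>> line = "{u'verified': False, u'listed_count': 0}"
>>> data = eval(line)   # evaluates the string as Python code - works, but feels wrong
>>> data['listed_count']
0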
Note: This string comes directly from Twitter, via ptt tools.
>>> import json
>>> line = '{u\'follow_request_sent\': False, u\'profile_use_background_image\': True, u\'default_profile_image\': False, u\'verified\': False, u\'profile_sidebar_fill_color\': u\'DDEEF6\', u\'profile_text_color\': u\'333333\', u\'listed_count\': 0}'
>>> json.loads(line)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 326, in loads
return _default_decoder.decode(s)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 366, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 382, in raw_decode
obj, end = self.scan_once(s, idx)
ValueError: Expecting property name: line 1 column 1 (char 1)
That's definitely not JSON - not as printed above, anyhow. It's already been parsed into a Python object: JSON would have false, not False, and wouldn't prefix strings with u for unicode (all JSON strings are unicode). Are you sure your JSON isn't already being turned into a Python object for free somewhere earlier in the chain, so that passing its repr to json.loads() fails simply because it isn't a JSON string at all?
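For what it's worth, if you really are handed that repr string and just need the dict back, a safer option than eval() would be ast.literal_eval, which only evaluates Python literals - a rough sketch, using an abbreviated version of the string:

>>> import ast, json
>>> line = "{u'follow_request_sent': False, u'verified': False, u'listed_count': 0}"
>>> data = ast.literal_eval(line)   # parses Python literals only, unlike eval()
>>> data['verified']
False
>>> json.dumps(data, sort_keys=True)   # this is what actual JSON looks like
'{"follow_request_sent": false, "listed_count": 0, "verified": false}'

The better fix, though, is upstream: get hold of the original object (or real JSON produced by json.dumps) instead of a repr string.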