I have many GB of data stored in a PostgreSQL database, and I need them imported into MongoDB. I did this using a CSV export and mongoimport.
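The export and import were done roughly like this (database, table, collection, and file names below are just placeholders):

# Dump the table to CSV with a header row
psql yourpgdb -c "\copy yourtable TO 'yourtable.csv' CSV HEADER"

# Load the CSV into MongoDB, using the header row for field names
mongoimport --db yourdbname --collection yourcollectionname --type csv --headerline --file yourtable.csv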
There are date columns with values like '2011-06-25' in that CSV, and they were imported as strings rather than as MongoDate, so I cannot search by date effectively.
I've found this: http://www.mongodb.org/display/DOCS/Import+Export+Tools#ImportExportTools-Example%3AImportingInterestingTypes but the example says I need to use a JSON structure for the file. Do I really need to export a JSON file from PostgreSQL?
If I do, how?
If I don't, how do I export a "MongoDate" through CSV?
Converting the strings in place after the import is actually pretty fast, even with huge amounts of data. Here is an example using the mongo console:
/usr/bin/mongo yourdbname --eval "
  db.yourcollectionname.find().forEach(function(doc) {
    doc.yourdatefield = new ISODate(doc.yourdatefield);
    db.yourcollectionname.save(doc);
  });
"
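Once the field holds real dates, you can index it and run range queries in the mongo console. A quick sketch, reusing the placeholder collection and field names from the example above:

// Index the converted field (ensureIndex is the legacy shell name; newer shells use createIndex)
db.yourcollectionname.ensureIndex({ yourdatefield: 1 });

// Example range query: all documents from June 2011
db.yourcollectionname.find({ yourdatefield: { $gte: ISODate("2011-06-01"), $lt: ISODate("2011-07-01") } });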