How to import from sql dump to MongoDB?

Jaffer Wilson · Feb 8, 2017 · Viewed 14.2k times

I am trying to import data from a MySQL dump .sql file into MongoDB, but I could not find any mechanism for RDBMS to NoSQL data migration.
I have tried converting the data to JSON and CSV, but it is not giving me the desired output in MongoDB.
I thought about trying Apache Sqoop, but it is mostly for moving data between SQL or NoSQL systems and Hadoop.
I do not understand how data can be migrated from MySQL to MongoDB.
Is there any other approach apart from what I have tried so far?
Hoping to hear a better and faster solution for this type of migration.

Answer

McGrady · Feb 8, 2017

I suggest you dump the MySQL data to a CSV file. You can also try other file formats, but make sure the format is one that can be imported into MongoDB easily; both MongoDB and MySQL support CSV very well.

You can use mysqldump or a SELECT ... INTO OUTFILE statement to export the MySQL data. Running mysqldump may take a long time on a large database, so have a look at How can I optimize a mysqldump of a large database?.
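For example, a single table could be exported to CSV with something like the following. The database "mydb", table "users", and the column list are only placeholders for your own schema, and the OUTFILE path must be writable by the MySQL server (see the secure_file_priv setting):

    # Export one table to CSV; replace the database, table, and columns
    # with your own schema.
    mysql -u root -p mydb -e "
      SELECT id, name, email
        INTO OUTFILE '/tmp/users.csv'
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\n'
        FROM users;"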

Then use the mongoimport tool to import the data.
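A minimal import command could look like this; the database and collection names are assumptions for the example. Since INTO OUTFILE does not write a header row, the field names are passed with --fields (if your CSV has a header line, use --headerline instead):

    # Import the CSV into a local mongod; adjust db/collection/field names.
    mongoimport --db mydb --collection users \
      --type csv --fields id,name,email \
      --file /tmp/users.csv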

As far as I know, there are three ways to speed up this import:

  • mongoimport --numInsertionWorkers N starts several insertion workers; N can be the number of cores.

  • mongod --nojournal Most of the continuous disk usage comes from the journal, so disabling journaling during the import might be a good optimization.

  • Split up your file and run parallel import jobs (see the sketch after this list).
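As a rough sketch of the last point, you can split the CSV with the standard split utility and run one mongoimport per chunk in the background. The chunk size, worker count, and db/collection names below are only placeholders:

    # Split the CSV into ~1M-line chunks, then import the chunks in parallel.
    split -l 1000000 /tmp/users.csv /tmp/users_part_

    for part in /tmp/users_part_*; do
      mongoimport --db mydb --collection users \
        --type csv --fields id,name,email \
        --numInsertionWorkers 4 \
        --file "$part" &
    done
    wait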

Actually, in my opinion, importing and exporting the data is not the hard part. It sounds like your dataset is large, so if you do not design your document structure, the resulting database will still be slow. Doing an automatic one-to-one migration from a relational database to MongoDB is not recommended; the database performance might not be good.

So it's worth designing your data structure first; you can check out Data models.
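To give a toy example of what such a redesign might look like (the schema here is invented purely for illustration): instead of importing an orders table and an order_items table as two flat collections that mirror the SQL tables, the line items can be embedded inside each order document, so a typical read touches one document instead of needing a join:

    # One JSON document per line, with the child rows embedded as an array.
    echo '{ "orderId": 1, "customer": "Alice", "items": [ { "sku": "A1", "qty": 2 }, { "sku": "B4", "qty": 1 } ] }' > /tmp/orders.json
    mongoimport --db mydb --collection orders --file /tmp/orders.json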

Hope this helps.