How to do a bulk insert with node-postgres

Fabrizio Mazzoni · Feb 26, 2017 · Viewed 17.6k times

I am importing an Excel file into a PostgreSQL database with Express and node-pg.

Currently I am looping through the Excel rows and executing an insert for every row, but I feel this is not the right way:

workbook.xlsx.readFile(excel_file).then(function () {
    // Get the first worksheet
    var worksheet = workbook.getWorksheet(1);
    // Parameter array for the current row's insert
    var arrSQLParams = [];
    // Loop through all rows
    worksheet.eachRow(function (row, rowNumber) {
        // Commit to DB only from row 2 onwards, to exclude the Excel header row
        if (rowNumber > 1) {
            // Loop through all cells and build the array to pass to the DB function
            row.eachCell(function (cell, colNumber) {
                arrSQLParams.push(cell.value);
            });

            // Add the user id from the session to the array
            arrSQLParams.push(user);

            // Insert into DB
            db.query(strSQL, arrSQLParams, function (err, result) {
                if (err) {
                    console.log(err);
                    ret = false;
                }
            });

            // Empty the array for the next query
            arrSQLParams = [];
        }
    });
});

Is there a better way to do this that improves performance?

Answer

vitaly-t · Mar 1, 2017

Following the clarification provided by the author — inserting up to 1000 records at a time — the solution suggested in Multi-row insert with pg-promise is exactly what is needed here, in terms of both performance and flexibility.
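For reference, the core idea of a multi-row insert is one statement carrying many value tuples, rather than one statement per row. The sketch below builds such a statement by hand so the mechanics are visible; pg-promise's `helpers.insert` with a `ColumnSet` generates (and safely escapes) this for you, so prefer it in real code. The table and column names here are illustrative, not from the question:

```javascript
// Build a single parameterized multi-row INSERT statement.
// rows: array of arrays, each inner array holding one row's values.
function buildMultiInsert(table, columns, rows) {
    let i = 1;
    // One "($1, $2, ...)" tuple per row, numbering placeholders consecutively
    const tuples = rows.map(
        row => '(' + row.map(() => '$' + i++).join(', ') + ')'
    );
    const sql = 'INSERT INTO ' + table +
        ' (' + columns.join(', ') + ') VALUES ' + tuples.join(', ');
    // Flatten the rows into the single parameter list the driver expects
    return { sql, params: rows.flat() };
}

const { sql, params } = buildMultiInsert(
    'users', ['name', 'email'],
    [['Ann', 'ann@example.com'], ['Bob', 'bob@example.com']]
);
// sql:    INSERT INTO users (name, email) VALUES ($1, $2), ($3, $4)
// params: ['Ann', 'ann@example.com', 'Bob', 'bob@example.com']
```

The resulting statement inserts all rows in one round trip, which is where the performance win over a per-row loop comes from.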

UPDATE

A must-read article: Data Imports.