MySQL LOAD DATA LOCAL INFILE syntax issues with SET fields

sadmicrowave · Sep 3, 2013

I'm trying to use mysql's LOAD DATA LOCAL INFILE syntax to load a .csv file into an existing table. Here is one record from my .csv file (with headers):

 PROD,       PLANT,PORD,  REVN,A_CPN,     A_CREV,BRDI,       DTE,     LTME
 100100128144,12T1,2070000,04,3DB18194ACAA,05_01,ALA13320004,20130807,171442

The issue is that I want 3 extra things done during import:

  1. A RECORDID INT NOT NULL AUTO_INCREMENT PRIMARY KEY field should be incremented automatically as each row is inserted (this column and structure already exist in the MySQL table)

  2. DTE and LTME should be concatenated, converted to MySQL DATETIME format, and inserted into an existing column named TRANS_OCR

  3. A CREATED TIMESTAMP field should be set to the current timestamp as each row is inserted (this column and structure also already exist in the MySQL table; see the schema sketch below)
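
For context, here is a plausible sketch of the existing seriallog table implied by these three items. Apart from RECORDID, TRANS_OCR, and CREATED, the column names are taken from the LOAD DATA statement below, and every type is an assumption:

 -- Hypothetical reconstruction of the existing table; all types are guesses.
 CREATE TABLE seriallog (
     RECORDID   INT NOT NULL AUTO_INCREMENT PRIMARY KEY,  -- item 1: assigned per row
     FLEX_PN    VARCHAR(32),  -- PROD
     FLEX_PLANT VARCHAR(8),   -- PLANT
     FLEX_ORDID VARCHAR(16),  -- PORD
     FLEX_REV   VARCHAR(8),   -- REVN
     CUST_PN    VARCHAR(32),  -- A_CPN
     CUST_REV   VARCHAR(8),   -- A_CREV
     SERIALID   VARCHAR(32),  -- BRDI
     TRANS_OCR  DATETIME,     -- item 2: DTE + LTME combined
     CREATED    TIMESTAMP     -- item 3: set at insert time
 );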

I'm trying to import this data into the mysql table with the following command:

 LOAD DATA LOCAL INFILE 'myfile.csv' INTO TABLE seriallog
 FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
 LINES TERMINATED BY '\n'
 IGNORE 1 LINES
 (FLEX_PN, FLEX_PLANT, FLEX_ORDID, FLEX_REV, CUST_PN, CUST_REV, SERIALID)
 SET CREATED = CURRENT_TIMESTAMP;

I think I have the CREATED column set properly, but the others are causing MySQL warnings:

Warning: Out of range value for column 'FLEX_PN' at row 1
Warning: Row 1 was truncated; it contained more data than there were input columns

Can someone help me with the syntax? The LOAD DATA LOCAL INFILE statement is confusing to me...

Answer

sadmicrowave · Sep 4, 2013

Figured out the proper syntax to make this work. The key is reading the trailing DTE and LTME fields into the user variables @DTE and @LTME, so the column list covers all nine fields in the file (which cures the truncation warning), and then building TRANS_OCR from them in the SET clause:

# %s is a MySQLdb parameter placeholder, filled in from params below
sql = """LOAD DATA LOCAL INFILE %s INTO TABLE seriallog_dev
         FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
         LINES TERMINATED BY '\\n'
         IGNORE 1 LINES
         (FLEX_PN, FLEX_PLANT, FLEX_ORDID, FLEX_REV, CUST_PN, CUST_REV, SERIALID, @DTE, @LTME)
         SET RECORDID = NULL,  -- NULL lets AUTO_INCREMENT assign the next id
             TRANS_OCR = STR_TO_DATE(CONCAT(@DTE,'',@LTME), "%%Y%%m%%d%%H%%i%%s"),
             CREATED = CURRENT_TIMESTAMP;"""

params = (file,)  # path to the .csv (note: `file` shadows a Python builtin)
self.db.query(sql, params)  # self.db wraps a MySQLdb connection

Mind you, this is done with Python's MySQLdb module; the doubled %% in the date format string are there because MySQLdb performs %-style parameter substitution, so %% reaches the server as a single %.
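
As a quick sanity check of that format string, here is the sample row's DTE and LTME values pushed through the same conversion at a mysql prompt (single % here because no driver substitution is involved):

 SELECT STR_TO_DATE(CONCAT('20130807', '171442'), '%Y%m%d%H%i%s');
 -- returns: 2013-08-07 17:14:42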

CAVEAT

The only issue with this solution is that, for some reason, my bulk insert only loads the first 217 rows of data from my file. My total file size is only 19KB, so I can't imagine it is too large for the MySQL buffers... so what gives?

MORE INFO

Also, I just tried this syntax directly within the mysql CLI and it works for all 255 records. So, obviously it is some problem with Python, the Python MySQLdb module, or the MySQL connection that the MySQLdb module makes...
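
For reference, the CLI version is the same statement with each %% collapsed to a single % and a literal file name (a stand-in here) in place of the %s placeholder:

 LOAD DATA LOCAL INFILE 'myfile.csv' INTO TABLE seriallog_dev
 FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
 LINES TERMINATED BY '\n'
 IGNORE 1 LINES
 (FLEX_PN, FLEX_PLANT, FLEX_ORDID, FLEX_REV, CUST_PN, CUST_REV, SERIALID, @DTE, @LTME)
 SET RECORDID = NULL,
     TRANS_OCR = STR_TO_DATE(CONCAT(@DTE,'',@LTME), '%Y%m%d%H%i%s'),
     CREATED = CURRENT_TIMESTAMP;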

DONE

I JUST figured out the problem, and it had nothing to do with the LOAD DATA LOCAL INFILE command. It was the method I was using to convert my original .dbf file into the .csv before attempting the import. For some reason the MySQL import was running on the .csv before the .dbf-to-.csv conversion had finished, so only a partial data set was present in the .csv file when it was imported... sorry to waste everyone's time!