I am using the Oracle SQL*Loader utility from a Linux shell to load CSV data into an Oracle DB. I have noticed that if the source CSV file's line endings are '\r\n' (Windows format), sqlldr fails to load the data for the last column.
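(The Windows line endings are easy to confirm from the shell; with GNU cat, for instance, they show up as ^M before the end-of-line marker. test.data here is just the data file from my sqlldr command below.)

# show non-printing characters; Windows line endings appear as ^M$ at the end of each line
cat -A ./test.data | head -3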
For example, if the last column is of FLOAT type (defined in the ctl file as 'FLOAT EXTERNAL'), sqlldr fails with 'ORA-01722: invalid number':
sqlldr ctl file:
OPTIONS(silent=(HEADER))
load data
replace
into table fp_basic_bd
fields terminated by "|" optionally enclosed by '"'
TRAILING NULLCOLS
(
FS_PERM_SEC_ID CHAR(20),
"DATE" DATE "YYYY-MM-DD",
ADJDATE DATE "YYYY-MM-DD",
CURRENCY CHAR(3),
P_PRICE FLOAT EXTERNAL,
P_PRICE_OPEN FLOAT EXTERNAL,
P_PRICE_HIGH FLOAT EXTERNAL,
P_PRICE_LOW FLOAT EXTERNAL,
P_VOLUME FLOAT EXTERNAL
)
sqlldr execution command:
sqlldr -userid XXX -data ./test.data -log ./test.log -bad ./test.errors -control test.ctl -errors 3 -skip_unusable_indexes -skip_index_maintenance
sqlldr error log:
Column Name                    Position   Len   Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
FS_PERM_SEC_ID                 FIRST      20    |    O(") CHARACTER
"DATE"                         NEXT       *     |    O(") DATE YYYY-MM-DD
ADJDATE                        NEXT       *     |    O(") DATE YYYY-MM-DD
CURRENCY                       NEXT       3     |    O(") CHARACTER
P_PRICE                        NEXT       *     |    O(") CHARACTER
P_PRICE_OPEN                   NEXT       *     |    O(") CHARACTER
P_PRICE_HIGH                   NEXT       *     |    O(") CHARACTER
P_PRICE_LOW                    NEXT       *     |    O(") CHARACTER
P_VOLUME                       NEXT       *     |    O(") CHARACTER
value used for ROWS parameter changed from 300000 to 65534
Record 1: Rejected - Error on table FP_BASIC_BD, column P_VOLUME.
ORA-01722: invalid number
Record 2: Rejected - Error on table FP_BASIC_BD, column P_VOLUME.
ORA-01722: invalid number
When I replaced the Windows line endings with Unix ones, all the errors were gone and all the data loaded correctly.
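(The conversion step itself is trivial from the shell, something along these lines with the test.data file from the command above, but this extra pass over the data is exactly what I would like to avoid:)

# strip carriage returns from the data file before loading (dos2unix would do the same)
tr -d '\r' < ./test.data > ./test.data.unix && mv ./test.data.unix ./test.data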
My question is: how can I specify the line terminator character in the sqlldr control file while still keeping the source file name in the shell command?
I've seen some examples of how to do that with the stream record format (http://docs.oracle.com/cd/E11882_01/server.112/e16536/ldr_control_file.htm#SUTIL1087), but those examples are not applicable in my case, as I need to keep the name of the data file in the shell command and not inside the ctl file.
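(For reference, the stream record format examples in that document put the record terminator, together with the file name, on the INFILE line of the ctl file, roughly like the sketch below; 'example.dat' is just a placeholder name:)

-- record terminator specified with the INFILE clause; X'0D0A' is the hex form of \r\n
INFILE 'example.dat' "str X'0D0A'"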
I recently encountered the same issue while loading data into my table via a CSV file. My control file looked like this:
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER EXTERNAL
)
And as you mentioned, I kept getting the same 'invalid number' error. It turns out this usually occurs:
- when your column datatype is NUMBER but the value coming from your CSV file is a string that the Oracle loader cannot convert to a number, or
- when the field in your CSV file is followed by extra delimiter characters, say spaces, tabs, etc.
This is how I altered my ctl file:
LOAD DATA
infile '/ipoapplication/utl_file/LBR_HE_Mar16.csv'
REPLACE
INTO TABLE LOAN_BALANCE_MASTER_INT
fields terminated by ',' optionally enclosed by '"'
(
ACCOUNT_NO,
CUSTOMER_NAME,
LIMIT,
REGION,
TERM_AGREEMENT INTEGER EXTERNAL TERMINATED BY WHITESPACE
)
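I haven't tried this against the control file from the question, but transposing the same change there would presumably mean terminating the last column by whitespace as well, something like:

P_VOLUME FLOAT EXTERNAL TERMINATED BY WHITESPACE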