Sqoop Hive table import: table data type doesn't match the database

anu · Jan 24, 2014 · Viewed 18.6k times

Using Sqoop to import data from Oracle into Hive works fine, but it creates the Hive table with only two data types, String and Double. I want to use Timestamp as the data type for some columns. How can I do that?

bin/sqoop import --table TEST_TABLE --connect jdbc:oracle:thin:@HOST:PORT:orcl --username USER1 --password password --hive-import --hive-home /user/lib/Hive/

Answer

deba · Apr 19, 2016

In addition to the above answers, it also helps to observe when the error occurs.

In my case, two kinds of columns caused errors: JSON and binary.

For the JSON columns, the error came during code generation (the ORM class writer), at the very beginning of the import process:

16/04/19 09:37:58 ERROR orm.ClassWriter: Cannot resolve SQL type

For the binary column, the error was thrown while loading the data into the Hive table (after the data had already been imported into HDFS files):

16/04/19 09:51:22 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive does not support the SQL type for column featured_binary

To get rid of these two errors, I had to provide the following options:

--map-column-java column1_json=String,column2_json=String,featured_binary=String --map-column-hive column1_json=STRING,column2_json=STRING,featured_binary=STRING
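
For context, here is a minimal sketch of how these options fit into a complete import command. The connection string, username, and table name are carried over from the question and are placeholders for your own environment:

sqoop import \
  --connect jdbc:oracle:thin:@HOST:PORT:orcl \
  --username USER1 --password password \
  --table TEST_TABLE \
  --hive-import \
  --map-column-java column1_json=String,column2_json=String,featured_binary=String \
  --map-column-hive column1_json=STRING,column2_json=STRING,featured_binary=STRING

--map-column-java overrides the Java type Sqoop uses in the generated record class, while --map-column-hive overrides the column type in the CREATE TABLE statement Sqoop issues to Hive.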

In summary, we may have to provide

--map-column-java

or

--map-column-hive

(or both), depending on where the failure occurs: --map-column-java addresses the code-generation error, while --map-column-hive addresses the error raised when creating the Hive table.
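
Applied back to the original question, the same mechanism should let you get a TIMESTAMP column in Hive instead of the default String mapping. A minimal sketch, assuming a hypothetical date/timestamp column named CREATED_AT in TEST_TABLE:

sqoop import \
  --connect jdbc:oracle:thin:@HOST:PORT:orcl \
  --username USER1 --password password \
  --table TEST_TABLE \
  --hive-import \
  --map-column-hive CREATED_AT=TIMESTAMP

Sqoop maps date and timestamp SQL types to STRING in Hive by default, which is why the question sees only String and Double; the override above only works if your Hive version supports the TIMESTAMP type (Hive 0.8 and later).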