I'm trying to create an internal (managed) table in Hive that can store my incremental log data. The table is created like this:
CREATE TABLE logs (foo INT, bar STRING, created_date TIMESTAMP)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '<=>'
STORED AS TEXTFILE;
I need to load data into this table periodically.
LOAD DATA INPATH '/user/foo/data/logs' INTO TABLE logs;
But the data is not getting inserted into the table properly. There might be some problem with the delimiter, but I can't find why.
Example log line:
120<=>abcdefg<=>2016-01-01 12:14:11
On running select * from logs; I get:
120 =>abcdefg NULL
The first attribute is fine. The second contains part of the delimiter, but since the column is a STRING the value still gets inserted. The third is NULL because the leftover delimiter characters make it an invalid timestamp.
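My guess (unconfirmed) is that Hive is only honoring the first character of the delimiter, so each line gets split on '<' alone. A rough sketch of how that would map onto the columns:

-- raw line: 120<=>abcdefg<=>2016-01-01 12:14:11
-- split on '<' only (assumed behavior, not verified):
--   foo          -> 120
--   bar          -> =>abcdefg
--   created_date -> =>2016-01-01 12:14:11   (invalid TIMESTAMP, stored as NULL)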
Can anyone please help on how to provide a custom multi-character delimiter and load the data successfully?
By default, Hive only allows a single character as the field delimiter. Although there's RegexSerDe for specifying a multiple-character delimiter, it can be daunting to use, especially for beginners.
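Just to illustrate why: a RegexSerDe version of your table might look roughly like the sketch below. The table name and the regex are my own assumptions for your '<=>'-separated lines, and as far as I know RegexSerDe expects every column to be declared as STRING, so foo and created_date would have to be cast in every query.

-- rough sketch only; logs_regex and the regex pattern are assumptions
CREATE TABLE logs_regex (foo STRING, bar STRING, created_date STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.RegexSerDe'
WITH SERDEPROPERTIES ("input.regex" = "(.*?)<=>(.*?)<=>(.*)")
STORED AS TEXTFILE;

That extra casting is exactly the friction MultiDelimitSerDe avoids.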
The patch (HIVE-5871) adds a new SerDe named MultiDelimitSerDe. With MultiDelimitSerDe, users can specify a multi-character field delimiter when creating tables, in a way very similar to a typical table creation.
hive> CREATE TABLE logs (foo INT, bar STRING, created_date TIMESTAMP)
> ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe'
> WITH SERDEPROPERTIES ("field.delim"="<=>")
> STORED AS TEXTFILE;
hive> dfs -put /home/user1/multi_char.txt /user/hive/warehouse/logs/. ;
hive> select * from logs;
OK
120 abcdefg 2016-01-01 12:14:11
Time taken: 1.657 seconds, Fetched: 1 row(s)
hive>
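If you want to keep the periodic LOAD DATA workflow from your question, the same statement should work against this table too, assuming the incremental files under /user/foo/data/logs are already '<=>'-delimited. Depending on your Hive version/distribution, you may also need to put the hive-contrib jar on the classpath first (the jar path below is a placeholder).

-- only needed on some setups; the jar path is a placeholder
ADD JAR /path/to/hive-contrib.jar;

-- load the incremental files (LOAD DATA INPATH moves them from the source directory into the table)
LOAD DATA INPATH '/user/foo/data/logs' INTO TABLE logs;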