Delimiter not found error - AWS Redshift Load from s3 using Kinesis Firehose

Master of none · Mar 18, 2017

I am using Kinesis Firehose to transfer data to Redshift via S3. I have a very simple CSV file that looks like the samples below. Firehose delivers it to S3, but the Redshift load fails with a "Delimiter not found" error. I have looked at practically every post related to this error, and I made sure the delimiter is included.

File

GOOG,2017-03-16T16:00:01Z,2017-03-17 06:23:56.986397,848.78
GOOG,2017-03-16T16:00:01Z,2017-03-17 06:24:02.061263,848.78
GOOG,2017-03-16T16:00:01Z,2017-03-17 06:24:07.143044,848.78
GOOG,2017-03-16T16:00:01Z,2017-03-17 06:24:12.217930,848.78

OR

"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:48:59.993260","852.12"
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:49:07.034945","852.12"
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:49:12.306484","852.12"
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:49:18.020833","852.12"
"GOOG","2017-03-17T16:00:02Z","2017-03-18 05:49:24.203464","852.12"

Redshift Table

CREATE TABLE stockvalue
( symbol                   VARCHAR(4),
  streamdate               VARCHAR(20),
  writedate                VARCHAR(26),
  stockprice               VARCHAR(6)
);
  • Here's the error: [Error screenshot]

  • Just in case, here's what my Kinesis stream looks like: [Firehose screenshot]

Can someone point out what may be wrong with the file? I added a comma between the fields. All columns in the destination table are VARCHAR, so there should be no datatype error, and the column lengths match exactly between the file and the Redshift table. I have tried both with and without double quotes around the fields.

Answer

Jon Ekiz · Mar 19, 2017

Can you post the full COPY command? It's cut off in the screenshot.

My guess is that your COPY command is missing DELIMITER ','. Try adding that option.
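For reference, here is a sketch of what the COPY might look like for each file variant. The bucket path and IAM role ARN are placeholders; substitute your own:

```sql
-- Unquoted file: declare the comma delimiter explicitly
-- (Redshift's default delimiter is a pipe, not a comma).
COPY stockvalue
FROM 's3://your-bucket/your-prefix/'
IAM_ROLE 'arn:aws:iam::123456789012:role/your-redshift-role'
DELIMITER ',';

-- Double-quoted file: the CSV option implies a comma delimiter
-- and strips the surrounding quotes automatically.
COPY stockvalue
FROM 's3://your-bucket/your-prefix/'
IAM_ROLE 'arn:aws:iam::123456789012:role/your-redshift-role'
CSV;
```

Note that when the load goes through Firehose, you don't write the full command yourself: Firehose generates the COPY with the table name and S3 location, and you supply only the options portion (e.g. `DELIMITER ','` or `CSV`) in the delivery stream's Redshift "COPY options" field.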