How to merge multiple Parquet files into a single Parquet file using Linux or HDFS commands?

Shankar · Jul 27, 2016 · Viewed 28k times

I have multiple small Parquet files generated as the output of a Hive QL job, and I would like to merge the output files into a single Parquet file.

What is the best way to do this using HDFS or Linux commands?

We used to merge text files with the cat command, but will that work for Parquet as well? Can we do it in HiveQL itself when writing the output files, the way the repartition or coalesce methods do in Spark?
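
A minimal sketch of the Spark approach mentioned above (assuming Spark 2.x in spark-shell; the paths are placeholders, not from the original post):

    // Read the Hive job's small output files and rewrite them as a single
    // Parquet file by collapsing the data to one partition.
    val df = spark.read.parquet("/path/to/hive/output")   // placeholder input path

    // coalesce(1) avoids a full shuffle; repartition(1) would also work.
    // The result is one part-*.parquet file inside the output directory.
    df.coalesce(1)
      .write
      .mode("overwrite")
      .parquet("/path/to/merged/output")                  // placeholder output path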

Answer

giaosudau · Oct 7, 2016

According to https://issues.apache.org/jira/browse/PARQUET-460, you can now download the source code and compile parquet-tools, which has a built-in merge command:

java -jar ./target/parquet-tools-1.8.2-SNAPSHOT.jar merge /input_directory/ /output_dir/file_name
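
(Note: as I understand it, parquet-tools merge concatenates the row groups of the input files rather than rewriting them, so the merged file keeps the original small row groups. If read performance is the motivation, rewriting the data, for example with Spark's coalesce as sketched above, may give better results.)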

Or use a tool like https://github.com/stripe/herringbone.