Apache Spark: Get number of records per partition

nilesh1212 · Sep 4, 2017 · Viewed 15.9k times

I want to get information about each partition, such as the total number of records per partition, on the driver side when a Spark job is submitted in yarn-cluster deploy mode, so that I can log it or print it to the console.

Answer

Alper t. Turker · Sep 4, 2017

I'd use the built-in spark_partition_id function. It should be as efficient as it gets:

import org.apache.spark.sql.functions.spark_partition_id

// Group by the ID of the partition each row lives in, then count.
df.groupBy(spark_partition_id()).count()
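To actually surface these counts on the driver (as the question asks), the grouped result is small, so it can be safely collected and printed. A minimal self-contained sketch, assuming a local SparkSession and a hypothetical DataFrame of 100 rows for illustration (in yarn-cluster mode the session would come from spark-submit, and the println output lands in the driver's container log):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.spark_partition_id

// Assumption: local mode, purely for a runnable example.
val spark = SparkSession.builder()
  .appName("PartitionCounts")
  .master("local[*]")
  .getOrCreate()

import spark.implicits._

// Hypothetical example data: 100 rows spread over 4 partitions.
val df = (1 to 100).toDF("value").repartition(4)

// One row per partition: (partition_id, count). This result has at
// most one row per partition, so collecting it to the driver is cheap.
val counts = df
  .groupBy(spark_partition_id().alias("partition_id"))
  .count()
  .collect()

// Log/print on the driver side.
counts.foreach { row =>
  println(s"partition ${row.getInt(0)}: ${row.getLong(1)} records")
}

spark.stop()
```

In yarn-cluster mode, remember that this output goes to the driver container's stdout (visible via `yarn logs -applicationId <appId>`), not to the terminal that ran spark-submit.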