How can I convert a pyspark.sql.dataframe.DataFrame back to a sql table in databricks notebook

Semihcan Doken · Aug 20, 2016 · Viewed 9.5k times

I created a DataFrame of type pyspark.sql.dataframe.DataFrame by executing the following line:

dataframe = sqlContext.sql("select * from my_data_table")

How can I convert this back to a Spark SQL table that I can run SQL queries on?

Answer

Alberto Bonsanto · Aug 20, 2016

You can create a temporary view from your DataFrame by using createOrReplaceTempView. In your case it would be:

dataframe.createOrReplaceTempView("mytable")

After this you can query mytable using SQL.
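
For example, a quick check might look like this (a minimal sketch, assuming the same sqlContext from the question; in Spark 2.x you could equally use spark.sql):

# Run a SQL query against the temporary view and show the result
result = sqlContext.sql("SELECT * FROM mytable LIMIT 10")
result.show()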

If your Spark version is ≤ 1.6.2, you can use registerTempTable instead, as sketched below.
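
A minimal sketch for the older API, assuming the same dataframe and sqlContext as above:

# Register the DataFrame as a temporary table (Spark <= 1.6.x API)
dataframe.registerTempTable("mytable")

# Query it with SQL as before
sqlContext.sql("SELECT count(*) FROM mytable").show()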