Spark: converting GMT timestamps to Eastern time, taking daylight saving time into account

Bob Swain · Aug 18, 2017 · Viewed 10.4k times

I'm trying to convert a column of GMT timestamp strings into a column of timestamps in the Eastern time zone, taking daylight saving time into account.

My column of timestamp strings looks like this:

'2017-02-01T10:15:21+00:00'

I figured out how to convert the string column into a timestamp in EST:

from pyspark.sql import functions as F

# Cast the ISO 8601 strings to Spark timestamps, then shift out of UTC
df2 = df1.withColumn('datetimeGMT', df1.myTimeColumnInGMT.cast('timestamp'))
df3 = df2.withColumn('datetimeEST', F.from_utc_timestamp(df2.datetimeGMT, "EST"))

But the times don't change with daylight saving time. Is there another function, or some other approach, that accounts for daylight saving time when converting the timestamps?
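For anyone reproducing this, here's a minimal self-contained sketch (column names are illustrative) of the problem: with "EST", a summer timestamp gets the same five-hour shift as a winter one, because "EST" is treated as a fixed UTC-5 offset rather than a region:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# One winter and one summer UTC timestamp
df = spark.createDataFrame(
    [('2017-02-01T10:15:21+00:00',), ('2017-07-01T10:15:21+00:00',)],
    ['gmtString'])
df = df.withColumn('datetimeGMT', df.gmtString.cast('timestamp'))

# "EST" is a fixed UTC-5 offset: both rows shift by exactly 5 hours,
# even though the July date falls within daylight saving time
df.withColumn('datetimeEST',
              F.from_utc_timestamp(df.datetimeGMT, 'EST')).show(truncate=False)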

EDIT: I think I figured it out. In the from_utc_timestamp call above, I needed to use "America/New_York" instead of "EST":

df3 = df2.withColumn('datetimeET', F.from_utc_timestamp(df2.datetimeGMT, "America/New_York"))

Answer

Bob Swain · Aug 21, 2017

I ended up figuring out the answer, so I figured I would add it here. I also think this question/answer is worthwhile because, while searching for this issue before posting the question, I couldn't find anything about daylight saving time for Spark. I probably should have realized that I needed to search for the underlying Java functions.

The answer to the question ended up being to use the string "America/New_York" instead of "EST". This correctly applies daylight saving time.

from pyspark.sql import functions as F

# Region-based IDs like "America/New_York" apply the correct EST/EDT offset per date
df3 = df2.withColumn('datetimeET', F.from_utc_timestamp(df2.datetimeGMT, "America/New_York"))
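Applying this to the two-row sketch in the question above makes the difference visible: the February row shifts by five hours (EST) while the July row shifts by four (EDT).

# Continuing the earlier sketch: February lands at 05:15:21 (UTC-5),
# July at 06:15:21 (UTC-4)
df.withColumn('datetimeET',
              F.from_utc_timestamp(df.datetimeGMT, 'America/New_York')) \
  .show(truncate=False)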

EDIT:

This link shows a list of available time zone strings that can be used in this way: https://garygregory.wordpress.com/2013/06/18/what-are-the-java-timezone-ids/
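If you'd rather enumerate the IDs programmatically, one option (a sketch assuming Python 3.9+; the IANA zone names returned by zoneinfo overlap heavily with Java's region-based time zone IDs, since both derive from the same tz database) is:

# List IANA zone names locally; these largely match the Java time zone
# IDs that from_utc_timestamp accepts (requires Python 3.9+)
from zoneinfo import available_timezones

print(sorted(z for z in available_timezones() if z.startswith('America/')))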