r - changing the JVM timezone in sparklyr

I am desperately trying to change the timezone of my JVM in sparklyr (using Spark 2.1.0). I want GMT everywhere.

I am setting:

config$`driver.extraJavaOptions` <-"Duser.timezone=GMT"

in my spark_config(), but unfortunately the Spark UI still shows (under System Properties) that user.timezone is set to America/New_York.

Any ideas? Thanks!

1 Answer

A few things:

  • The name of the property is spark.driver.extraJavaOptions.
  • The value is missing a leading -; it should be -Duser.timezone=GMT.
  • For consistency you need both spark.driver.extraJavaOptions and spark.executor.extraJavaOptions (see the config sketch after this list).
  • In the general case, spark.driver.extraJavaOptions and similar properties should be set outside the application, as explained in the official documentation:

    In client mode, this config must not be set through the SparkConf directly in your application, because the driver JVM has already started at that point. Instead, please set this through the --driver-java-options command line option or in your default properties file.

    On the driver, calling the corresponding Java methods should work:

    # sc is a spark_shell_connection / spark_connection
    # get the GMT TimeZone object and make it the JVM default on the driver
    sparklyr::invoke_static(sc, "java.util.TimeZone", "getTimeZone", "GMT") %>%
      sparklyr::invoke_static(sc, "java.util.TimeZone", "setDefault", .)
    

    but might not be reflected in the UI, and you'll still need spark.executor.extraJavaOptions.
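
Putting the first three points together, a minimal sketch of the corrected sparklyr configuration might look like this (master = "local" is just a placeholder, and whether the driver option takes effect this way depends on the deploy mode, as the last point above explains, so spark-defaults.conf remains the safer route):

library(sparklyr)

# corrected property names and the leading "-" on the JVM flag
conf <- spark_config()
conf$spark.driver.extraJavaOptions   <- "-Duser.timezone=GMT"
conf$spark.executor.extraJavaOptions <- "-Duser.timezone=GMT"

sc <- spark_connect(master = "local", config = conf)  # master is a placeholder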

In the general case you should edit spark-defaults.conf in the configuration directory to include:

spark.driver.extraJavaOptions -Duser.timezone=GMT
spark.executor.extraJavaOptions -Duser.timezone=GMT

If you cannot modify the main configuration, you can create an application-specific configuration directory and point to it using the SPARK_CONF_DIR environment variable.
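
For example, a sketch of that approach from R, assuming a hypothetical directory ~/my-spark-conf containing the spark-defaults.conf shown above:

# ~/my-spark-conf is a hypothetical path; it must be set before spark_connect()
# so that spark-submit picks up the application-specific configuration
Sys.setenv(SPARK_CONF_DIR = path.expand("~/my-spark-conf"))
sc <- sparklyr::spark_connect(master = "local")  # master is a placeholder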

In recent versions you can also set spark.sql.session.timeZone in the application itself (note that it is different from the corresponding JVM options and affects only Spark SQL queries).
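
As a sketch, assuming a Spark version that supports this property (2.2+), it can be supplied through the sparklyr config, or changed on an existing session with a SQL SET statement via the DBI interface:

conf <- sparklyr::spark_config()
conf$spark.sql.session.timeZone <- "GMT"  # affects Spark SQL only, not the JVM default
sc <- sparklyr::spark_connect(master = "local", config = conf)  # master is a placeholder

# or adjust it on an already running session
DBI::dbGetQuery(sc, "SET spark.sql.session.timeZone=GMT")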

