
I am running spark-shell with Scala and I want to set an environment variable so I can load data into Google BigQuery. The environment variable is GOOGLE_APPLICATION_CREDENTIALS and it should contain /path/to/service/account.json

In a Python environment I can easily do this:

    import os
    os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = "path/to/service/account.json"

However, I cannot do this in Scala. I can print out the system environment variables using,

scala> sys.env 

or

scala> System.getenv() 

both of which return a map of String key/value pairs. However,
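Both calls only read the environment. A minimal sketch of the two lookup styles (the fallback string is my own illustration):

```scala
// sys.env is an immutable scala.collection.Map[String, String];
// a missing key comes back as None rather than null.
val fromScala: Option[String] = sys.env.get("GOOGLE_APPLICATION_CREDENTIALS")

// System.getenv(name) is the Java API; it returns null when unset.
val fromJava: String = System.getenv("GOOGLE_APPLICATION_CREDENTIALS")

println(fromScala.getOrElse("<not set>"))
```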

scala> System.getenv("GOOGLE_APPLICATION_CREDENTIALS") = "path/to/service/account.json" 

fails with an error:

<console>:26: error: value update is not a member of java.util.Map[String,String] 
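The error is expected: `System.getenv()` returns an unmodifiable `java.util.Map`, so there is no `update` method, and even a direct `put` would throw. A short sketch demonstrating the read-only behaviour:

```scala
// System.getenv() hands back an unmodifiable view of the process
// environment; mutating calls throw UnsupportedOperationException.
val env: java.util.Map[String, String] = System.getenv()
try {
  env.put("GOOGLE_APPLICATION_CREDENTIALS", "path/to/service/account.json")
} catch {
  case _: UnsupportedOperationException =>
    println("the process environment map is read-only")
}
```

This is why the variable has to be set before the JVM starts, rather than from inside the running shell.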
  • have you found a solution ? Commented Jul 30, 2018 at 20:02
  • yes, I have found a work around for this. Commented Jul 30, 2018 at 20:04
  • can you answer your question with the solution ? Commented Jul 30, 2018 at 20:42
  • hey, i answered the question. let me know if it works. Commented Jul 30, 2018 at 21:02
  • Possible duplicate of Scala: Unable to set environment variable Commented Jul 31, 2018 at 2:35

1 Answer


I found a workaround for this problem, though I don't think it's best practice. Here is the two-step solution:

  1. From the terminal/cmd, first export the environment variable:

    export GOOGLE_APPLICATION_CREDENTIALS=path/to/service/account.json

  2. From the same terminal window, open spark-shell and confirm the variable is visible to the JVM:

    System.getenv("GOOGLE_APPLICATION_CREDENTIALS")
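This works because the JVM copies the exported environment when spark-shell starts, so the variable is visible for the life of the session. A defensive read inside the shell might look like this (the fallback string is my own illustration):

```scala
// Read the inherited variable; getOrElse avoids a null from System.getenv.
val creds = sys.env.getOrElse("GOOGLE_APPLICATION_CREDENTIALS", "<not set>")
println(s"GOOGLE_APPLICATION_CREDENTIALS = $creds")
```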
