
I have an SQL script which creates temp tables that are valid only for that session. After running the script, I am trying to read data from those tables through Spark and then process it. Below is the code I have for the Spark read:

sparkSession.read().format("jdbc")
        .option("url", jdbcURL)
        .option("dbtable", tableOrQuery)
        .option("user", userName)
        .option("password", password)
        .option("driver", driverName)
        .load();

Now I need to pass the JDBC connection I created so that Spark can read the data in the same session. Is this possible?

  • Incidentally, I answered a similar question not long ago. TL;DR: there can be no such option whatsoever. Commented Feb 12, 2019 at 18:50

1 Answer


No, you cannot pass a JDBC connection to Spark; it manages its JDBC connections by itself. You can see where the connection is created in the Spark source:

  • JdbcRelationProvider: create connection
  • JdbcUtils: connect
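
Because Spark opens fresh connections for the read, any session-scoped temp tables created outside Spark are gone by the time it connects. One workaround worth sketching (not part of the answer above): Spark 2.3+ supports a sessionInitStatement JDBC option that executes a custom SQL statement on each session Spark opens, so the temp-table setup can be replayed inside Spark's own session. A minimal sketch, assuming Spark 2.3+, that the setup fits in a statement the driver accepts, and reusing the question's variable names; the class, method, table name, and initSql are hypothetical:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TempTableRead {
    // Sketch: replay the temp-table setup in the session Spark opens,
    // then read the temp table from that same session.
    public static Dataset<Row> readTempTable(SparkSession spark, String jdbcURL,
            String userName, String password, String driverName) {
        // Hypothetical setup statement; substitute the real script here.
        String initSql = "CREATE TEMP TABLE my_temp AS SELECT ...";

        return spark.read().format("jdbc")
                .option("url", jdbcURL)
                .option("user", userName)
                .option("password", password)
                .option("driver", driverName)
                // Runs after each database session is opened, before reading.
                .option("sessionInitStatement", initSql)
                .option("dbtable", "my_temp")
                .load();
    }
}

Note that with multiple read partitions the init statement runs once per connection, so the setup must be safe to execute repeatedly.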


2 Comments

Thanks. Do you have any suggestions for what I can do instead?
You can get the connection metadata and build the Spark JDBC URL from it; a solution is in stackoverflow.com/questions/5718952/…
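
A minimal sketch of that suggestion, assuming you hold the existing java.sql.Connection; the class and method names below (beyond the standard JDBC and Spark APIs) are hypothetical. DatabaseMetaData exposes the URL and user but not the password, and Spark still opens its own sessions, so session-scoped temp tables remain invisible to the read:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.SQLException;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ReadViaMetadata {
    // Derive Spark's JDBC options from an existing connection's metadata.
    // Spark will not reuse this connection; it opens its own.
    public static Dataset<Row> read(SparkSession spark, Connection existing,
            String password, String tableOrQuery) throws SQLException {
        DatabaseMetaData meta = existing.getMetaData();
        String jdbcURL = meta.getURL();        // URL of the DBMS behind the connection
        String userName = meta.getUserName();  // user of the existing connection

        return spark.read().format("jdbc")
                .option("url", jdbcURL)
                .option("user", userName)
                .option("password", password)  // metadata does not expose the password
                .option("dbtable", tableOrQuery)
                .load();
    }
}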
