I am using Spark 1.3.0. Say I have a DataFrame in Spark that I need to store to a Postgres DB (postgresql-9.2.18-1-linux-x64) on a 64-bit Ubuntu machine. I use postgresql9.2jdbc41.jar as the JDBC driver to connect to Postgres.
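For completeness, I launch the shell with the driver jar on the classpath, along these lines (the jar path is illustrative, just where it sits on my machine):

    # spark-shell launch -- jar path is illustrative
    bin/spark-shell \
      --driver-class-path /path/to/postgresql9.2jdbc41.jar \
      --jars /path/to/postgresql9.2jdbc41.jar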
I was able to read data from the Postgres DB using the commands below:
    import org.postgresql.Driver

    val url = "jdbc:postgresql://localhost/postgres?user=user&password=pwd"
    val driver = "org.postgresql.Driver"

    // Partitioned JDBC read: splits cdimemployee into 50 partitions
    // over intempdimkey in the range [0, 500]
    val users = sqlContext.load("jdbc", Map(
      "url" -> url,
      "driver" -> driver,
      "dbtable" -> "cdimemployee",
      "partitionColumn" -> "intempdimkey",
      "lowerBound" -> "0",
      "upperBound" -> "500",
      "numPartitions" -> "50"
    ))

    val get_all_emp = users.select("*")
    val empDF = get_all_emp.toDF
    get_all_emp.foreach(println)

I want to write this DataFrame back to Postgres after some processing. Is the code below right?
    empDF.write.jdbc("jdbc:postgresql://localhost/postgres", "test", Map("user" -> "user", "password" -> "pwd"))

Any pointers (Scala) would be helpful.
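Since I'm on 1.3.0, I suspect `df.write` may not even exist there (I believe the `DataFrameWriter` API only arrived in 1.4). From the 1.3 DataFrame docs I put together the sketch below instead; `test` is just a scratch table name and I haven't verified this end to end:

    // Untested sketch against the Spark 1.3.0 DataFrame API.
    // Credentials go in the URL because these methods take no properties map.
    val writeUrl = "jdbc:postgresql://localhost/postgres?user=user&password=pwd"

    // Create the table "test" and insert the rows; with allowExisting = false
    // this assumes the table does not exist yet.
    empDF.createJDBCTable(writeUrl, "test", false)

    // Or, if "test" already exists with a compatible schema:
    // overwrite = false appends, overwrite = true truncates first.
    empDF.insertIntoJDBC(writeUrl, "test", false)

Does that look like the right direction for 1.3?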