
I am running PySpark, but it can be unstable at times. A couple of times it has crashed at this command:

spark_conf = SparkConf() 

with the following error message

 File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/pyspark/conf.py", line 106, in __init__ self._jconf = _jvm.SparkConf(loadDefaults) File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 772, in __getattr__ raise Py4JError('{0} does not exist in the JVM'.format(name)) Py4JError: SparkConf does not exist in the JVM 

Any idea what the problem is? Thank you for your help!

1 Answer


SparkConf is not available in the PySpark namespace by default; try:

from pyspark import SparkConf 

in the pyspark console or code.
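For instance, a minimal sketch of importing SparkConf and wiring it into a SparkContext (the app name and master URL here are placeholders, not taken from the question):

from pyspark import SparkConf, SparkContext

# Build a configuration; the setAppName/setMaster values are illustrative
conf = SparkConf().setAppName("my_app").setMaster("local[*]")

# Hand the configuration to the context
sc = SparkContext(conf=conf)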


4 Comments

Thanks, I found the problem. After closing a SparkContext, I get the above error message when I try to call SparkConf() and initialize a new SparkContext again.
What do you mean? I have the same error when using from pyspark import SparkContext and then sc = SparkContext()
@Michael how do you close a Spark session?
@JatinPatel-JP Use spark.stop() to stop a SparkSession. Here spark is the name of the SparkSession object.
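To illustrate the stop/restart pattern the comments describe, a minimal sketch (app names are illustrative): only one SparkContext may be active per JVM, so stop the old one before creating another.

from pyspark import SparkConf, SparkContext

sc = SparkContext(conf=SparkConf().setAppName("first"))
# ... do some work ...
sc.stop()  # stop the active context before starting another

sc = SparkContext(conf=SparkConf().setAppName("second"))

# In Spark 2.x+ the equivalent for a SparkSession:
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.appName("example").getOrCreate()
# spark.stop()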
