Hi,
I have run Spark multiple times from the Spyder IDE. Today I got this error, even though the code is the same:
    import os
    import sys

    from py4j.java_gateway import JavaGateway
    gateway = JavaGateway()

    os.environ['SPARK_HOME'] = "C:/Apache/spark-1.6.0"
    os.environ['JAVA_HOME'] = "C:/Program Files/Java/jre1.8.0_71"
    os.environ['HADOOP_HOME'] = "C:/Apache/spark-1.6.0/winutils/"
    sys.path.append("C:/Apache/spark-1.6.0/python/")

    from pyspark import SparkContext
    from pyspark import SparkConf
    conf = SparkConf()

This is the output:

    The system cannot find the path specified.
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "C:\Apache\spark-1.6.0\python\pyspark\conf.py", line 104, in __init__
        SparkContext._ensure_initialized()
      File "C:\Apache\spark-1.6.0\python\pyspark\context.py", line 245, in _ensure_initialized
        SparkContext._gateway = gateway or launch_gateway()
      File "C:\Apache\spark-1.6.0\python\pyspark\java_gateway.py", line 94, in launch_gateway
        raise Exception("Java gateway process exited before sending the driver its port number")
    Exception: Java gateway process exited before sending the driver its port number

What went wrong? Thanks for your time.
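
Edit: since the output starts with "The system cannot find the path specified", here is a small sanity check I can run (standard library only, the paths simply mirror the ones in my script) to confirm the configured directories actually exist:

    import os

    # Verify each directory used in the script actually exists on disk.
    for name, path in [
        ("SPARK_HOME", "C:/Apache/spark-1.6.0"),
        ("JAVA_HOME", "C:/Program Files/Java/jre1.8.0_71"),
        ("HADOOP_HOME", "C:/Apache/spark-1.6.0/winutils/"),
    ]:
        print(name, path, "exists" if os.path.isdir(path) else "MISSING")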