My pip install pyspark worked: when I launch pyspark from the command prompt, I get the message "SparkSession available as 'spark'".
However, when I run:

from pyspark import SparkContext

it gives me:

ModuleNotFoundError: No module named 'pyspark'

What's the problem, and how do I fix it?
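In case it helps, here is a small check I ran to see which interpreter the import is using. My guess (unconfirmed) is that pip installed pyspark into a different Python than the one running my code:

```python
import sys

# Record which Python interpreter is actually executing this script.
interpreter = sys.executable

# Check whether pyspark is importable from THIS interpreter's environment.
try:
    import pyspark
    has_pyspark = True
except ModuleNotFoundError:
    has_pyspark = False

print("Interpreter:", interpreter)
print("pyspark importable:", has_pyspark)
```

If has_pyspark prints False here but "pip show pyspark" succeeds in the terminal, that would suggest pip and my script are using different Python installations.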