Add the export line below to your .bashrc file, and your modules should then be found correctly:
# Add the PySpark classes to the Python path:
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
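After adding the line, reload your shell configuration (for example, run source ~/.bashrc) so the export takes effect. Note that some Spark distributions also need the bundled py4j archive on PYTHONPATH (typically $SPARK_HOME/python/lib/py4j-*-src.zip). A minimal check from Python, assuming SPARK_HOME already points at a valid Spark installation:
import os
print(os.environ.get("SPARK_HOME"))   # your Spark installation directory
print(os.environ.get("PYTHONPATH"))   # should now include $SPARK_HOME/python/
import pyspark                        # succeeds once the path is correct
print(pyspark.__version__)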
There is one more method.
Use findspark
1. Install findspark with pip from your terminal, then open your Python shell and initialize it:
pip install findspark
import findspark
findspark.init()
2. Import the necessary modules:
from pyspark import SparkContext
from pyspark import SparkConf
Now the Spark modules should import without errors.
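Putting the steps together, here is a minimal end-to-end sketch (assuming a local Spark installation; the app name "test" and master "local[*]" are just example values):
import findspark
findspark.init()  # locates Spark via SPARK_HOME and adds it to sys.path

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("test").setMaster("local[*]")
sc = SparkContext(conf=conf)
print(sc.parallelize(range(10)).sum())  # quick sanity check: prints 45
sc.stop()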