Running ./bin/pyspark interactively automatically creates a SparkContext and binds it to the variable `sc`; you will see this announced on screen when the shell starts.
So you can either stop the existing context and create a new one (note the method is `sc.stop()`, not `stop.sc()`, and that `del sc` only removes the Python name without stopping the underlying context):

    sc.stop()
    sc = SparkContext.getOrCreate()
or just carry on and use `sc` as automatically defined.
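
For completeness, here is a minimal sketch of replacing the shell's context with one that uses a custom configuration, typed directly into the pyspark shell (the master URL and app name below are placeholders, not values the shell requires):

    from pyspark import SparkConf, SparkContext

    # `sc` already exists inside the pyspark shell; stop it first, since
    # Spark allows only one active SparkContext per JVM.
    sc.stop()

    # Placeholder configuration -- adjust master/app name to your setup.
    conf = SparkConf().setMaster("local[2]").setAppName("my-new-context")
    sc = SparkContext.getOrCreate(conf)

    print(sc.master)  # e.g. local[2]

Once the new context exists, subsequent calls to `SparkContext.getOrCreate()` return it rather than creating another one.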