Looking at your code, I don't think you are overwriting anything.
You can see this for yourself: just type this command as soon as you start the pyspark shell:
sc.getConf().getAll()
This will give you all of the current config settings. Then run your code and check again; you will see that nothing has changed.
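For example, to spot-check a single setting rather than scanning the whole list (a minimal sketch; sc is the SparkContext the pyspark shell creates for you, and the key is just one of the ones you are trying to set):

# get() accepts a default value for keys that were never set
sc.getConf().get('spark.executor.memory', 'not set')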
Instead of your current approach, I would suggest creating a new configuration and using it to create a new SparkContext:
import pyspark

conf = pyspark.SparkConf().setAll([('spark.executor.memory', '8g'), ('spark.executor.cores', '3'),
                                   ('spark.cores.max', '3'), ('spark.driver.memory', '8g')])
sc.stop()  # stop the shell's existing SparkContext before creating a new one
sc = pyspark.SparkContext(conf=conf)
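If you are on Spark 2.x, you can also pass the same SparkConf through a SparkSession builder instead; this is a sketch that assumes you run it after sc.stop(), since getOrCreate() would otherwise hand back the existing session unchanged:

from pyspark.sql import SparkSession

spark = SparkSession.builder.config(conf=conf).getOrCreate()  # new session built with our conf
sc = spark.sparkContext  # the underlying SparkContext, usable as before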
Then you can verify it yourself, just as above, with:
sc.getConf().getAll()
This should reflect the configuration you wanted.
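As a final sanity check, you could print only the keys you set instead of the full list; this is a small sketch reusing the same key names as above:

# print just the settings we changed, to confirm they took effect
for key in ('spark.executor.memory', 'spark.executor.cores', 'spark.cores.max', 'spark.driver.memory'):
    print(key, '=', sc.getConf().get(key, 'not set'))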