Simply open the PySpark shell and check the current settings:
sc.getConf().getAll()
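getAll() returns the configuration as a list of (key, value) tuples. As a small convenience, you could turn it into a dict and look up a single property; the key used here is just an example:

# Convert the (key, value) pairs into a dict and read one property
conf_dict = dict(sc.getConf().getAll())
print(conf_dict.get('spark.executor.memory'))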
Now you can execute the code below and then check the settings of the PySpark shell again. You first have to create a SparkConf object, and then you can create the SparkContext using that configuration object.
import pyspark

# Build a new configuration with the desired resource settings
config = pyspark.SparkConf().setAll([('spark.executor.memory', '8g'), ('spark.executor.cores', '3'), ('spark.cores.max', '3'), ('spark.driver.memory', '8g')])

# Stop the context the shell created, then start a new one with this config
sc.stop()
sc = pyspark.SparkContext(conf=config)
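To confirm that the new values took effect, you can read the configuration back from the fresh context, for example:

# Re-check all settings on the new context
sc.getConf().getAll()

# Or look up a single key directly
sc.getConf().get('spark.executor.memory')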
I hope this answer helps you!