I'm trying to build a recommender using Spark and just ran out of memory:
```
Exception in thread "dag-scheduler-event-loop" java.lang.OutOfMemoryError: Java heap space
```
I'd like to increase the memory available to Spark by setting the `spark.executor.memory` property at runtime, from PySpark.
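Here's roughly what I had in mind (just a sketch of the idea; the `"4g"` value and the app name are placeholders, and I don't know whether `spark.executor.memory` even takes effect when set this way):

```python
from pyspark import SparkConf, SparkContext

# Placeholder config: raise executor memory before creating the context
conf = (SparkConf()
        .setAppName("recommender")              # hypothetical app name
        .set("spark.executor.memory", "4g"))    # value I'd like to set at runtime

sc = SparkContext(conf=conf)
```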
Is that possible? If so, how?