How can I kill a running job in the Spark shell on my local OS X machine without exiting?
For example, if I just do a simple .count() on an RDD, it can take a while and sometimes I want to kill it.
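For context, here is a minimal sketch of the kind of long-running action I mean (the RDD contents and size are just illustrative):

```scala
// In spark-shell, `sc` is the pre-created SparkContext.
// An action like count() over a large RDD can run for a long time,
// and the only interrupt I know of is Ctrl-C, which exits the shell.
val bigRdd = sc.parallelize(1 to 100000000).map(_ * 2)
bigRdd.count()  // the long-running job I'd like to be able to cancel
```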
However, pressing Ctrl-C kills the whole shell.
Is there a way to kill just the running job, but not the shell?