I'm not sure why you chose the answer above, since it requires restarting your shell and launching with a different command! Although that works and is useful, there is an in-line solution, which is what was actually being asked for. This is essentially what @zero323 referenced in the comments above, but that link leads to a post describing a Scala implementation. Below is a working implementation specifically for PySpark.
Note: the settings must be applied before the SparkContext is started. If it is already running, you will need to stop it, modify the settings, and create a new context.
from pyspark import SparkContext

# Must be set before the SparkContext is created
SparkContext.setSystemProperty('spark.executor.memory', '2g')
sc = SparkContext("local", "App Name")
Source: https://spark.apache.org/docs/0.8.1/python-programming-guide.html
p.s. if you need to stop the SparkContext, just use:

sc.stop()
and to double-check the settings currently in effect, you can use:
sc._conf.getAll()