Why does Spark fail with java.lang.OutOfMemoryError: GC overhead limit exceeded?

Add the following JVM arg when you launch spark-shell or spark-submit:

-Dspark.executor.memory=6g
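With current spark-submit the same setting is more commonly passed through its own options; --executor-memory and --conf are standard spark-submit flags, and the class and jar names below are placeholders:

spark-submit --class MyApp --executor-memory 6g my-app.jar

which is equivalent to:

spark-submit --class MyApp --conf spark.executor.memory=6g my-app.jar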

You may also consider explicitly setting the number of workers when you create an instance of SparkContext.
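In local mode, the number of worker threads is set directly in the master URL; a minimal sketch, where local[4] asks for four threads (the 4 is an arbitrary example):

val sc = new SparkContext("local[4]", "MyApp")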

Distributed Cluster

List the slave hostnames in conf/slaves, then point the SparkContext at the master URL (here assuming a standalone master running on host master at the default port 7077):

val sc = new SparkContext("spark://master:7077", "MyApp")
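If you prefer to keep the memory setting in code rather than on the command line, the same property can be set through SparkConf; this uses standard Spark API, sketched with the same example values as above:

import org.apache.spark.{SparkConf, SparkContext}

// Build a configuration with an explicit master, app name, and executor memory.
val conf = new SparkConf()
  .setMaster("spark://master:7077") // assumes a standalone master on host "master"
  .setAppName("MyApp")
  .set("spark.executor.memory", "6g") // example value; size to your cluster

val sc = new SparkContext(conf)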
