How do I stop a Spark Streaming job?

In cluster mode you can stop the streaming context by running the following command, without sending a SIGTERM to the driver. This stops the streaming context without requiring you to stop it explicitly from a shutdown hook in your code.

$SPARK_HOME/bin/spark-submit --master $MASTER_REST_URL --kill $DRIVER_ID

- $MASTER_REST_URL is the REST URL of the Spark master, e.g. spark://localhost:6066

- $DRIVER_ID is the driver's ID, e.g. driver-20150915145601-0000 (visible in the Master web UI)
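Filled in with the example values above (both are placeholders; substitute your own master URL and driver ID), the kill invocation can be sketched as:

```shell
# Example values from above; replace with your own master REST URL and driver ID.
MASTER_REST_URL="spark://localhost:6066"
DRIVER_ID="driver-20150915145601-0000"

# Build and show the kill command ($SPARK_HOME must point at your Spark install).
KILL_CMD="$SPARK_HOME/bin/spark-submit --master $MASTER_REST_URL --kill $DRIVER_ID"
echo "$KILL_CMD"
```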

If you want Spark to stop your app gracefully, you can try setting the following configuration property when your Spark app is initially submitted (see http://spark.apache.org/docs/latest/submitting-applications.html for how to set Spark configuration properties).

spark.streaming.stopGracefullyOnShutdown=true
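Passed at submit time via `--conf`, that might look like the following sketch (the main class and jar name are placeholders, not from the original post; only the `--conf` flag is the point here):

```shell
# Placeholder class and jar names; the --conf flag enables graceful shutdown.
$SPARK_HOME/bin/spark-submit \
  --conf spark.streaming.stopGracefullyOnShutdown=true \
  --class com.example.MyStreamingApp \
  my-streaming-app.jar
```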

This is not officially documented; I gathered it from reading the 1.4 source code. The flag is honored in standalone mode, but I haven't tested it in cluster mode yet.

I am working with Spark 1.4.*.

