How to add third-party Java JAR files for use in PySpark
You can add the path to the JAR file via the Spark configuration at runtime. Here is an example:

```python
from pyspark import SparkConf, SparkContext

conf = SparkConf().set("spark.jars", "/path-to-jar/spark-streaming-kafka-0-8-assembly_2.11-2.2.1.jar")
sc = SparkContext(conf=conf)
```

Refer to the Spark configuration documentation for more information.
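Alternatively, you can supply the JAR when launching the application instead of in code. A minimal sketch using `spark-submit` (the script name `my_app.py` is a placeholder; the JAR path is the same one used above):

```shell
# Pass the JAR to the cluster at submit time with --jars.
# Multiple JARs can be given as a comma-separated list.
spark-submit \
  --jars /path-to-jar/spark-streaming-kafka-0-8-assembly_2.11-2.2.1.jar \
  my_app.py
```

This keeps the dependency out of the application code, which is convenient when the same script runs against different environments.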