When I run Spark, the following exception is thrown:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: java.lang.NullPointerException

It turns out this is due to class conflicts.

I have some libraries bundled in my uber jar that Spark also ships with, for example:

com.google.protobuf
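A quick way to confirm the overlap is to compare what the uber jar bundles against what Spark ships in its own jars directory (the jar name below is a placeholder):

# protobuf classes bundled inside the uber jar
jar tf my-uber.jar | grep 'com/google/protobuf' | head

# protobuf jars shipped with the Spark distribution
ls "$SPARK_HOME"/jars | grep -i protobuf

If both commands return results, the same classes exist on the classpath twice, and which copy wins depends on classloader ordering.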

Hence I asked Spark to prefer my own libraries over the ones bundled with Spark, by passing:

--conf spark.driver.userClassPathFirst=true --conf spark.executor.userClassPathFirst=true
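In context, the full spark-submit invocation looks something like this (the main class and jar name are placeholders):

spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.Main \
  my-uber.jar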

This works as intended and points Spark to the right protobuf classes. However, it seems there are more libraries duplicated between the uber jar and Spark, so the flags just move the conflict elsewhere.

So instead of using the options above, I switched to relocating the conflicting packages with the Maven Shade plugin, which sorted out the issue.
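Below is a minimal sketch of the relevant pom.xml configuration, assuming the conflict is limited to com.google.protobuf; the shaded package prefix and the plugin version are placeholders to adapt to your build:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.4.1</version> <!-- placeholder; use the version your build standardizes on -->
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- rewrite the bundled protobuf classes, and all bytecode references
               to them, under a new package so they cannot clash with Spark's copy -->
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>shaded.com.google.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>

Unlike userClassPathFirst, relocation resolves the conflict at build time: the application code ends up calling its own renamed copy of the library, while Spark keeps loading the original package name untouched.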
