FAQ: Spark ClassNotFoundException: org.apache.log4j.spi.Filter (padogrid/padogrid GitHub Wiki)
I'm getting "java.lang.ClassNotFoundException: org.apache.log4j.spi.Filter" when I run Spark in PadoGrid.
When I run `start_cluster`, I get the following exception.
```
starting org.apache.spark.deploy.worker.Worker, logging to /Users/dpark/Padogrid/workspaces/myrwe/myws/clusters/myspark/log/spark--org.apache.spark.deploy.worker.Worker-1-padomac.local.out
failed to launch: nice -n 0 /Users/dpark/Padogrid/products/spark-3.2.1-bin-without-hadoop/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8581 --port 60000 spark://localhost:7077
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:650)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:632)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.spi.Filter
        at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        ... 7 more
```
This exception occurs if you are running a "without-hadoop" Spark distribution. That Spark build expects the older log4j 1.x on the classpath, but PadoGrid includes the newer log4j 2.x, which no longer provides `org.apache.log4j.spi.Filter`.
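You can confirm the mismatch by listing the log4j jars that Spark sees. This is a quick diagnostic sketch, assuming `SPARK_HOME` points at your Spark installation (the path below is illustrative):

```shell
# Assumed for illustration; set this to your actual Spark directory.
SPARK_HOME=/Users/dpark/Padogrid/products/spark-3.2.1-bin-without-hadoop

# A "without-hadoop" build ships no log4j 1.x jar, so the log4j 1.x
# class org.apache.log4j.spi.Filter cannot be found at launch time.
ls "$SPARK_HOME/jars" | grep -i log4j
```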
The following FAQ explains how to include Hadoop in Spark.
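In the meantime, if you already have a Hadoop installation available, Spark's own documentation for "Hadoop free" builds describes pointing Spark at Hadoop's jars (which supply the missing log4j dependency) via `SPARK_DIST_CLASSPATH`. A minimal sketch, assuming the `hadoop` command is on your `PATH`:

```shell
# In $SPARK_HOME/conf/spark-env.sh:
# let Spark pick up Hadoop's jars, including its log4j dependency,
# before starting masters/workers with start_cluster.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

This only changes the classpath of the Spark daemons; it does not alter the PadoGrid cluster configuration itself.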