ClassNotFoundException when running a Java program with Apache Spark

Time: 2015-08-04 06:35:11

Tags: java hadoop apache-spark pyspark

I have compiled a Java program and am trying to run it with Spark, but it throws a ClassNotFoundException even though the class file exists. (The original post included a screenshot of the error output.)

    package org.apache.spark.examples;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public final class JavaHelloWorld
    {
        public static void main(String[] args) throws Exception
        {
            // Configure and start a Spark context for this application
            SparkConf sparkConf = new SparkConf().setAppName("JavaSparkPi");
            JavaSparkContext jsc = new JavaSparkContext(sparkConf);

            System.out.println("Hello World... Niyat From Apache Spark");

            // Shut the context down cleanly before exiting
            jsc.stop();
        }
    }
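The post does not show the exact launch command, but judging from the answer below, an invocation along these lines, with the class name miscapitalized, would reproduce the error; the jar name here is only a placeholder:

    # Hypothetical command that triggers the ClassNotFoundException:
    # 'javaHelloWorld' (lowercase j) does not match the declared class name
    spark-submit --class org.apache.spark.examples.javaHelloWorld hello-world.jar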

1 answer:

Answer 0 (score: 2)

You have to write the exact name of the class. Class names are case-sensitive, and your command starts it with a lowercase j; it must match the declared name JavaHelloWorld exactly.
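A minimal sketch of the corrected invocation, assuming the program was packaged into a jar (the jar name and master URL are placeholders):

    # Fully qualified class name now matches the declaration, uppercase J
    spark-submit --class org.apache.spark.examples.JavaHelloWorld \
        --master local[2] \
        hello-world.jar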
