Apache Zeppelin 0.7.2 and spark-2.1.1-bin-hadoop2.7 on a Windows 10 machine

Date: 2017-07-09 20:28:52

Tags: apache-spark apache-zeppelin

I am new to Zeppelin and am trying to configure Apache Zeppelin to connect to a standalone Spark instance on my local machine. I set the following in the zeppelin-env.cmd file, and changed the port number to 8082 in the zeppelin-site.xml file.

    set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_101
    set SPARK_HOME=C:\tmp\spark\spark-2.1.1-bin-hadoop2.7
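The port change mentioned above would live in conf\zeppelin-site.xml. A minimal sketch of that entry (the property name comes from Zeppelin's default configuration file, the value from the post):

    <property>
      <name>zeppelin.server.port</name>
      <value>8082</value>
    </property>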

Started the Spark master and worker as follows:

    spark-class2.cmd org.apache.spark.deploy.master.Master
    spark-class2.cmd org.apache.spark.deploy.worker.Worker spark://192.168.99.1:7077
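Not part of the original post, but worth noting: the spark:// URL passed to the worker (and later to the Zeppelin interpreter) must parse into exactly the host and port the master is listening on. A small Python sketch checking the URL from the post:

    from urllib.parse import urlparse

    # Master URL used for both the worker and the Zeppelin interpreter
    # setting; host and port are the values from the post.
    master_url = "spark://192.168.99.1:7077"

    parts = urlparse(master_url)
    assert parts.scheme == "spark"            # standalone-cluster scheme
    assert parts.hostname == "192.168.99.1"   # master host
    assert parts.port == 7077                 # default standalone master port
    print("master URL OK:", master_url)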

Started Zeppelin with zeppelin.cmd and changed the Spark interpreter settings as follows: changed master to point to the local Spark master, spark://192.168.99.1:7077, and changed useHiveContext to false.
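Written out as the two Spark-interpreter properties they correspond to (property names from Zeppelin's Spark interpreter settings page; values from the post), those changes would be roughly:

    master                         spark://192.168.99.1:7077
    zeppelin.spark.useHiveContext  false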

When I try to run the default out-of-the-box notebook, I get the following error.


    org.apache.zeppelin.interpreter.InterpreterException: The filename, directory name, or volume label syntax is incorrect.
      at org.apache.zeppelin.interpreter.remote.RemoteInterpreterManagedProcess.start(RemoteInterpreterManagedProcess.java:143)
      at org.apache.zeppelin.interpreter.remote.RemoteInterpreterProcess.reference(RemoteInterpreterProcess.java:73)
      at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.open(RemoteInterpreter.java:265)
      at org.apache.zeppelin.interpreter.remote.RemoteInterpreter.getFormType(RemoteInterpreter.java:430)
      at org.apache.zeppelin.interpreter.LazyOpenInterpreter.getFormType(LazyOpenInterpreter.java:111)
      at org.apache.zeppelin.notebook.Paragraph.jobRun(Paragraph.java:387)
      at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
      at org.apache.zeppelin.scheduler.RemoteScheduler$JobRunner.run(RemoteScheduler.java:329)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
      at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)

0 Answers:

No answers