SparkConf not found when running the spark neo4j connector example

Asked: 2017-08-18 20:17:02

Tags: scala apache-spark neo4j

I run the spark neo4j example code like this:

spark-shell --conf spark.neo4j.bolt.password=TestNeo4j --packages neo4j-contrib:neo4j-spark-connector:1.0.0-RC1,graphframes:graphframes:0.1.0-spark1.6 -i neo4jspark.scala 

My Scala file:

import org.neo4j.spark._
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.SparkConf



val conf = new SparkConf.setMaster("local").setAppName("neo4jspark")
val sc = new SparkContext(conf)   
val neo = Neo4j(sc)



val rdd = neo.cypher("MATCH (p:POINT) RETURN p").loadRowRdd
rdd.count

The error:

Loading neo4jspark.scala...
import org.neo4j.spark._
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.SparkConf
<console>:38: error: not found: value SparkConf
       val conf = new SparkConf.setMaster("local").setAppName("neo4jspark")
                      ^
<console>:38: error: not found: value conf
       val sc = new SparkContext(conf)
                                 ^
<console>:39: error: not found: value Neo4j
       val neo = Neo4j(sc)
                 ^
<console>:38: error: not found: value neo
       val rdd = neo.cypher("MATCH (p:POINT) RETURN p").loadRowRdd
                 ^
<console>:39: error: object count is not a member of package org.apache.spark.streaming.rdd
       rdd.count
           ^

I am importing SparkConf, so I don't understand why it says the value is not found. What am I missing (something simple, I hope)?
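For what it's worth, the first error most likely comes from the missing parentheses in `new SparkConf.setMaster(...)`: Scala parses `new SparkConf.setMaster` as an attempt to instantiate a type named `setMaster` nested inside a value `SparkConf`, hence `not found: value SparkConf`, and every later error cascades from that one. A corrected sketch of the script (assuming it is run via `spark-shell -i`, which already provides a `SparkContext` named `sc`):

```scala
// neo4jspark.scala — corrected sketch for use with spark-shell
import org.neo4j.spark._
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

// `new SparkConf()` needs the parentheses before chaining methods;
// without them the compiler misparses the expression.
val conf = new SparkConf().setMaster("local").setAppName("neo4jspark")

// spark-shell has already created a SparkContext as `sc`, so reuse it
// instead of constructing a second one (which would fail at runtime).
val neo = Neo4j(sc)

val rdd = neo.cypher("MATCH (p:POINT) RETURN p").loadRowRdd
rdd.count
```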

Edit: this seems to be a version mismatch.

I now launch it with:

spark-shell --conf spark.neo4j.bolt.password=TestNeo4j --packages neo4j-contrib:neo4j-spark-connector:2.0.0-M2,graphframes:graphframes:0.2.0-spark2.0-s_2.11 -i neo4jspark.scala

I still get the conf error, but the script does run. I just need to figure out why what I'm doing with the returned RDD breaks. :D This is on a Mac; I tested the exact same versions of everything on my Windows machine, and there it fails as shown below because it cannot resolve `import org.neo4j.spark._`. Here is the error:

<console>:23: error: object neo4j is not a member of package org
       import org.neo4j.spark._
                  ^

No idea what is different about the Windows machine compared to the Mac I'm using :/

0 Answers:

No answers yet