How can I include the MySQL connector jar?

Time: 2015-09-17 15:13:45

Tags: mysql apache-spark

tong@tong-VirtualBox:/usr/local/spark$ bin/sparkling-shell --jars /home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar
bash: bin/sparkling-shell: No such file or directory
tong@tong-VirtualBox:/usr/local/spark$ bin/spark-shell --jars /home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/09/17 11:19:27 INFO SecurityManager: Changing view acls to: tong
15/09/17 11:19:27 INFO SecurityManager: Changing modify acls to: tong
15/09/17 11:19:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tong); users with modify permissions: Set(tong)
15/09/17 11:19:29 INFO HttpServer: Starting HTTP Server
15/09/17 11:19:29 INFO Utils: Successfully started service 'HTTP class server' on port 43333.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.1
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_80)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/17 11:19:43 WARN Utils: Your hostname, tong-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.23.36.82 instead (on interface eth0)
15/09/17 11:19:43 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/09/17 11:19:43 INFO SparkContext: Running Spark version 1.4.1
15/09/17 11:19:43 INFO SecurityManager: Changing view acls to: tong
15/09/17 11:19:43 INFO SecurityManager: Changing modify acls to: tong
15/09/17 11:19:43 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tong); users with modify permissions: Set(tong)
15/09/17 11:19:45 INFO Slf4jLogger: Slf4jLogger started
15/09/17 11:19:45 INFO Remoting: Starting remoting
15/09/17 11:19:46 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.23.36.82:35469]
15/09/17 11:19:46 INFO Utils: Successfully started service 'sparkDriver' on port 35469.
15/09/17 11:19:46 INFO SparkEnv: Registering MapOutputTracker
15/09/17 11:19:47 INFO SparkEnv: Registering BlockManagerMaster
15/09/17 11:19:47 INFO DiskBlockManager: Created local directory at /tmp/spark-f8f4de26-e607-416f-9fed-f37440bd3878/blockmgr-ed450a0c-5719-4721-98b7-fd6e4664a7d4
15/09/17 11:19:47 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
15/09/17 11:19:47 INFO HttpFileServer: HTTP File server directory is /tmp/spark-f8f4de26-e607-416f-9fed-f37440bd3878/httpd-99f7d1e3-e6d8-4a06-8ae0-65d0fb76a038
15/09/17 11:19:47 INFO HttpServer: Starting HTTP Server
15/09/17 11:19:47 INFO Utils: Successfully started service 'HTTP file server' on port 51511.
15/09/17 11:19:47 INFO SparkEnv: Registering OutputCommitCoordinator
15/09/17 11:19:48 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/09/17 11:19:48 INFO SparkUI: Started SparkUI at http://10.23.36.82:4040
15/09/17 11:19:48 INFO SparkContext: Added JAR file:/home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar at http://10.23.36.82:51511/jars/mysql-connector-java-5.1.36-bin.jar with timestamp 1442506788779
15/09/17 11:19:49 INFO Executor: Starting executor ID driver on host localhost
15/09/17 11:19:49 INFO Executor: Using REPL class URI: http://10.23.36.82:43333
15/09/17 11:19:50 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34826.
15/09/17 11:19:50 INFO NettyBlockTransferService: Server created on 34826
15/09/17 11:19:50 INFO BlockManagerMaster: Trying to register BlockManager
15/09/17 11:19:50 INFO BlockManagerMasterEndpoint: Registering block manager localhost:34826 with 267.3 MB RAM, BlockManagerId(driver, localhost, 34826)
15/09/17 11:19:50 INFO BlockManagerMaster: Registered BlockManager
15/09/17 11:19:51 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/09/17 11:19:54 INFO HiveContext: Initializing execution hive, version 0.13.1
15/09/17 11:19:55 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/09/17 11:19:55 INFO ObjectStore: ObjectStore, initialize called
15/09/17 11:19:56 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/09/17 11:19:56 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/09/17 11:19:56 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/17 11:19:57 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/17 11:20:04 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/09/17 11:20:04 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5.  Encountered: "@" (64), after : "".
15/09/17 11:20:06 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/09/17 11:20:06 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/09/17 11:20:13 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/09/17 11:20:13 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/09/17 11:20:14 INFO ObjectStore: Initialized ObjectStore
15/09/17 11:20:15 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
15/09/17 11:20:16 INFO HiveMetaStore: Added admin role in metastore
15/09/17 11:20:16 INFO HiveMetaStore: Added public role in metastore
15/09/17 11:20:17 INFO HiveMetaStore: No user is added in admin role, since config is empty
15/09/17 11:20:18 INFO SessionState: No Tez session required at this point. hive.execution.engine=mr.
15/09/17 11:20:18 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.

I built a database in MySQL, and now I want to connect to it from Spark. I ran bin/sparkling-shell --jars mysql:mysql-connector-java:5.1.36 and got the warning: Skip remote jar mysql:mysql-connector-java:5.1.36.

I downloaded mysql-connector-java-5.1.36.tar.gz and put it in /home/tong/sparkling-water, but it still does not work.

How do I include the JDBC jar file? I am using Spark 1.4.1. Is there another way to connect MySQL and Spark?

1 Answer:

Answer 0 (score: 0)

bin/spark-shell --driver-class-path /home/tg/sparkling-water/mysql-connector-java-5.1.36-bin.jar
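The reason --jars alone was not enough in the question is that --jars distributes the jar to the executors, but a JDBC driver must also be visible to the driver JVM, which is what --driver-class-path provides. A common pattern is to pass both flags with a local file path (the path below is the one from the question, not a maven coordinate):

```shell
# Put the MySQL driver on both the driver classpath and the executor classpaths.
bin/spark-shell \
  --driver-class-path /home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar \
  --jars /home/tong/sparkling-water/mysql-connector-java-5.1.36-bin.jar
```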

val jdbcDF = sqlContext.load("jdbc", Map("url" -> "jdbc:mysql://localhost:3306/employee?user=tg&password=*******", "dbtable" -> "employee"))
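It can help to build the JDBC options as a plain Scala map first, so the URL, user, and table name are easy to check before handing them to the SQLContext. A minimal sketch (the host, port, database, table, and credentials here are placeholders matching the answer above):

```scala
// Options for the Spark JDBC data source (Spark 1.4.x sqlContext.load API).
// All values below are placeholders; substitute your own MySQL details.
val jdbcOptions = Map(
  "url"     -> "jdbc:mysql://localhost:3306/employee?user=tg&password=secret",
  "dbtable" -> "employee"
)

// Inside spark-shell, this map is passed straight to the SQLContext:
//   val jdbcDF = sqlContext.load("jdbc", jdbcOptions)
//   jdbcDF.show()
println(jdbcOptions("dbtable"))
```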
