Failed while connecting to sparklyr on port (8880) for sessionid

Date: 2016-11-03 02:41:02

Tags: sparkr

Error in force(code): Failed while connecting to sparklyr on port (8880) for sessionid (2044): Gateway in port (8880) did not respond.
    Path: C:\Users\user1\AppData\Local\rstudio\spark\Cache\spark-1.6.2-bin-hadoop2.6\bin\spark-submit2.cmd
    Parameters: --class, sparklyr.Backend, --packages, "com.databricks:spark-csv_2.11:1.3.0", "D:\Users\user1\R\R-3.3.1\library\sparklyr\java\sparklyr-1.6-2.10.jar", 8880, 2044
    Traceback:
      shell_connection(master = master, spark_home = spark_home, app_name = app_name, version = version, hadoop_version = hadoop_version, shell_args = shell_args, config = config, service = FALSE, extensions = extensions)
      start_shell(master = master, spark_home = spark_home, spark_version = version, app_name = app_name, config = config, jars = spark_config_value(config, "spark.jars.default", list()), packages = spark_config_value(config, "sparklyr.defaultPackages"), extensions = extensions, environment = environment, shell_args = shell_args, service = service)
      tryCatch({
          gatewayInfo <- spark_connect_gateway(gatewayAddress, gatewayPort, sessionId, config = config, isStarting = TRUE)
      }, error = function(e) {
          abort_shell(paste("Failed while connecting to sparklyr to port (", gatewayPort, ") for sessionid (", sessionId, "): ", e$message, sep = ""), spark_submit_path, shell_args, output_file, error_file)
      })
      tryCatchList(expr, classes, parentenv, handlers)
      tryCatchOne(expr, names, parentenv, handlers[[1]])
      value[3]
      abort_shell(paste("Failed while connecting to sparklyr to port (", gatewayPort, ") for sessionid (", sessionId, "): ", e$message, sep = ""), spark_submit_path, shell_args, output_file, error_file)

---- Output Log ----
The system cannot find the path specified.

---- Error Log ----

1 answer:

Answer 0 (score: 0):

Resolved the above error by changing JAVA_HOME. Spark does not recognize JAVA_HOME when \bin is appended to its end, so I removed \bin from JAVA_HOME and it started working.
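As a minimal sketch of the fix described above (the JDK path shown is a hypothetical placeholder, not taken from the question), JAVA_HOME can be set from within R to the JDK root rather than its bin subdirectory before connecting:

```r
library(sparklyr)

# JAVA_HOME must point at the JDK root directory, NOT the bin subdirectory.
# Wrong:  C:\Program Files\Java\jdk1.8.0_101\bin
# Right:  C:\Program Files\Java\jdk1.8.0_101
Sys.setenv(JAVA_HOME = "C:\\Program Files\\Java\\jdk1.8.0_101")  # hypothetical path

# Connect with the Spark version from the error message above
sc <- spark_connect(master = "local", version = "1.6.2")
```

Setting the variable via Sys.setenv() only affects the current R session; changing it in the Windows system environment settings makes the fix permanent.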