Sqoop import fails with FileNotFoundException

Date: 2016-07-02 08:06:01

Tags: hadoop sqoop

I am new to the Hadoop ecosystem and installed the components by following guides found through web searches. I installed Hadoop, Sqoop, and Hive on my local Ubuntu machine (not a VM), each in its own directory:

  • /usr/local/hadoop
  • /usr/local/sqoop
  • /usr/local/hive

Looking at the error below, I tried to resolve it by copying my Sqoop folder (local machine /usr/local/sqoop) into the HDFS directory (hdfs://localhost:54310/usr/local/sqoop). That fixed the problem. I would like to understand a few things:

  • Was my installation correct before I copied the Sqoop directory into HDFS?
  • Is it necessary to copy the Sqoop directory from the ext filesystem into the HDFS filesystem?
  

16/07/02 13:22:15 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://localhost:54310/usr/local/sqoop/lib/avro-mapred-1.7.5-hadoop2.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:269)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:390)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:483)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

2 Answers:

Answer 0 (score: 2)

Your installation is fine. There is no need to copy everything from the Sqoop directory; only the Sqoop library jars need to be copied into HDFS.

Answer 1 (score: 0)

  • Create the same directory structure in HDFS as $SQOOP_HOME/lib.

    Example: hdfs dfs -mkdir -p /usr/lib/sqoop/lib

  • Copy all the Sqoop library files from $SQOOP_HOME/lib into that HDFS lib directory.

    Example: hdfs dfs -put /usr/lib/sqoop/lib/* /usr/lib/sqoop/lib
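The two steps above can be combined into one small script. This is a minimal sketch, assuming Sqoop is installed at /usr/local/sqoop and the jars should land at the HDFS path from the error message; the DRY_RUN guard is my addition so the commands can be previewed on a machine without a running cluster:

```shell
#!/bin/sh
# Mirror the local Sqoop lib directory into HDFS so the MapReduce
# job submitter can find the jars it references.
# DRY_RUN=1 (the default here) only prints each command.
DRY_RUN=${DRY_RUN:-1}

SQOOP_HOME=/usr/local/sqoop      # local Sqoop install, as in the question
HDFS_LIB=/usr/local/sqoop/lib    # target directory inside HDFS

run() {
    if [ "$DRY_RUN" = 1 ]; then
        echo "$@"                # preview mode: print the command
    else
        "$@"                     # real mode: execute it
    fi
}

# 1. Recreate the lib directory structure in HDFS (-p creates parents).
run hdfs dfs -mkdir -p "$HDFS_LIB"

# 2. Copy the Sqoop jars from the local filesystem into HDFS.
run hdfs dfs -put "$SQOOP_HOME/lib"/*.jar "$HDFS_LIB"
```

Run it with DRY_RUN=0 on a machine where the HDFS daemons are up; afterwards `hdfs dfs -ls /usr/local/sqoop/lib` should show avro-mapred-1.7.5-hadoop2.jar among the copied jars.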
