Sqoop to Teradata - column length issue

Time: 2016-10-21 07:22:16

Tags: teradata sqoop

I am trying to Sqoop a table's data from Hive into Teradata and am getting this error:

Error: com.teradata.connector.common.exception.ConnectorException: java.sql.SQLException: [Teradata JDBC Driver] [TeraJDBC 15.00.00.20] [Error 1186] [SQLState HY000] Parameter 8 length is 67618 bytes, which is greater than the maximum 64000 bytes that can be set.

Can anyone suggest what change I should make here? In the Hive table the 8th column is a very long string, which is why I defined the corresponding Teradata data type as VARCHAR(50000), but it still fails, as the sketch and full stack trace below show.
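For context, here is a minimal sketch of the target definition as described above; the database, table, and column names are hypothetical placeholders, not values from the question:

    -- Teradata target as described in the question: the 8th column was widened
    -- to VARCHAR(50000), yet the incoming value is 67,618 bytes, so the write fails.
    CREATE TABLE mydb.target_table (
        id   INTEGER,
        col8 VARCHAR(50000)
    ) PRIMARY INDEX (id);

The full error and stack trace: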

Error: com.teradata.connector.common.exception.ConnectorException: java.sql.SQLException: [Teradata JDBC Driver] [TeraJDBC 15.00.00.20] [Error 1186] [SQLState HY000] Parameter 8 length is 67618 bytes, which is greater than the maximum 64000 bytes that can be set.
    at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDriverJDBCException(ErrorFactory.java:94)
    at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDriverJDBCException(ErrorFactory.java:74)
    at com.teradata.jdbc.jdbc_4.TDPreparedStatement.internalSetString(TDPreparedStatement.java:1121)
    at com.teradata.jdbc.jdbc_4.TDPreparedStatement.setString(TDPreparedStatement.java:1095)
    at com.teradata.jdbc.jdbc_4.TDPreparedStatement.setObject(TDPreparedStatement.java:1631)
    at com.teradata.connector.teradata.TeradataObjectArrayWritable.write(TeradataObjectArrayWritable.java:232)
    at com.teradata.connector.teradata.TeradataBatchInsertOutputFormat$TeradataRecordWriter.write(TeradataBatchInsertOutputFormat.java:142)
    at com.teradata.connector.teradata.TeradataBatchInsertOutputFormat$TeradataRecordWriter.write(TeradataBatchInsertOutputFormat.java:114)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:107)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:65)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:129)
    at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:117)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

    at com.teradata.connector.teradata.TeradataBatchInsertOutputFormat$TeradataRecordWriter.write(TeradataBatchInsertOutputFormat.java:151)
    at com.teradata.connector.teradata.TeradataBatchInsertOutputFormat$TeradataRecordWriter.write(TeradataBatchInsertOutputFormat.java:114)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:107)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:65)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:129)
    at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:117)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

1 Answer:

Answer 0 (score: 0)

The string column in Hive holds 67,618 bytes, but you mapped it to VARCHAR(50000) in Teradata; on top of that, the Teradata JDBC driver rejects any string parameter larger than 64,000 bytes, so even the widest VARCHAR could not take this value. The error is therefore expected.

You should use CLOB(70000) instead.
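A minimal DDL sketch of that fix; the database, table, and column names are again hypothetical placeholders:

    -- Hypothetical Teradata target; CLOB(70000) accepts values past the
    -- 64,000-byte ceiling that the VARCHAR type and JDBC setString enforce.
    CREATE TABLE mydb.target_table (
        id   INTEGER,
        col8 CLOB(70000)
    ) PRIMARY INDEX (id);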

Sqoop export should work with that.
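For reference, a sketch of the kind of Sqoop export invocation this applies to; the host, database, credentials, table name, HDFS path, and delimiter are all placeholder assumptions, not values from the question:

    sqoop export \
        --connect jdbc:teradata://tdhost/DATABASE=mydb \
        --username dbuser \
        --password dbpass \
        --table target_table \
        --export-dir /apps/hive/warehouse/source_table \
        --input-fields-terminated-by '\001'

The '\001' (Ctrl-A) delimiter is Hive's default field separator. Once column 8 on the Teradata side is a CLOB(70000), the same export should load the 67,618-byte values without hitting error 1186.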