
Fixing java.lang.ClassNotFoundException: org.apache.atlas.sqoop.hook.SqoopHook

Posted at 2019-02-17

Overview

Running a Sqoop import with the Atlas Sqoop hook enabled on HDP 3.1.0 completes, but publishing the import metadata to Atlas fails with:

java.lang.ClassNotFoundException: org.apache.atlas.sqoop.hook.SqoopHook

Full output of the failing run:

[centos@zzeng-hdp-2 ~/sandbox/crosscomponent_demo/crosscomponent_scripts/sqoop-demo]$ ./002-run-sqoop-import.sh
Warning: /usr/hdp/3.1.0.0-78/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/phoenix/phoenix-5.0.0.3.1.0.0-78-server.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/02/17 16:13:13 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7.3.1.0.0-78
Enter password:
19/02/17 16:13:15 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/02/17 16:13:15 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/02/17 16:13:15 INFO manager.MySQLManager: Argument '--fetch-size 1' will probably get ignored by MySQL JDBC driver.
19/02/17 16:13:15 INFO tool.CodeGenTool: Beginning code generation
19/02/17 16:13:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test_table_sqoop1` AS t LIMIT 1
19/02/17 16:13:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test_table_sqoop1` AS t LIMIT 1
19/02/17 16:13:16 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.1.0.0-78/hadoop-mapreduce
19/02/17 16:13:19 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-centos/compile/a7256e4672ed1609e26b7a72679c935d/test_table_sqoop1.java to /home/centos/sandbox/crosscomponent_demo/crosscomponent_scripts/sqoop-demo/./test_table_sqoop1.java. Error: Destination '/home/centos/sandbox/crosscomponent_demo/crosscomponent_scripts/sqoop-demo/./test_table_sqoop1.java' already exists
19/02/17 16:13:19 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-centos/compile/a7256e4672ed1609e26b7a72679c935d/test_table_sqoop1.jar
19/02/17 16:13:19 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/02/17 16:13:19 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/02/17 16:13:19 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/02/17 16:13:19 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/02/17 16:13:19 INFO mapreduce.ImportJobBase: Beginning import of test_table_sqoop1
19/02/17 16:13:20 INFO client.RMProxy: Connecting to ResourceManager at zzeng-hdp-1.field.hortonworks.com/172.26.249.75:8050
19/02/17 16:13:20 INFO client.AHSProxy: Connecting to Application History server at zzeng-hdp-2.field.hortonworks.com/172.26.249.76:10200
19/02/17 16:13:21 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/centos/.staging/job_1550042495287_0019
19/02/17 16:13:24 INFO db.DBInputFormat: Using read commited transaction isolation
19/02/17 16:13:24 INFO mapreduce.JobSubmitter: number of splits:1
19/02/17 16:13:24 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1550042495287_0019
19/02/17 16:13:24 INFO mapreduce.JobSubmitter: Executing with tokens: []
19/02/17 16:13:24 INFO conf.Configuration: found resource resource-types.xml at file:/etc/hadoop/3.1.0.0-78/0/resource-types.xml
19/02/17 16:13:24 INFO impl.YarnClientImpl: Submitted application application_1550042495287_0019
19/02/17 16:13:25 INFO mapreduce.Job: The url to track the job: http://zzeng-hdp-1.field.hortonworks.com:8088/proxy/application_1550042495287_0019/
19/02/17 16:13:25 INFO mapreduce.Job: Running job: job_1550042495287_0019
19/02/17 16:13:36 INFO mapreduce.Job: Job job_1550042495287_0019 running in uber mode : false
19/02/17 16:13:36 INFO mapreduce.Job:  map 0% reduce 0%
19/02/17 16:13:42 INFO mapreduce.Job: Task Id : attempt_1550042495287_0019_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at org.apache.sqoop.mapreduce.db.DBInputFormat.setDbConf(DBInputFormat.java:167)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:158)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Caused by: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.setDbConf(DBInputFormat.java:165)
    ... 10 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
    at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1121)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:357)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2484)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2521)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2306)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:839)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:49)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:421)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:350)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:247)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:300)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
    ... 11 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at java.net.Socket.connect(Socket.java:538)
    at java.net.Socket.<init>(Socket.java:434)
    at java.net.Socket.<init>(Socket.java:244)
    at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:259)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:307)
    ... 27 more

19/02/17 16:13:48 INFO mapreduce.Job:  map 100% reduce 0%
19/02/17 16:13:49 INFO mapreduce.Job: Job job_1550042495287_0019 completed successfully
19/02/17 16:13:49 INFO mapreduce.Job: Counters: 33
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=241708
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=85
        HDFS: Number of bytes written=48
        HDFS: Number of read operations=6
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Failed map tasks=1
        Launched map tasks=2
        Other local map tasks=2
        Total time spent by all maps in occupied slots (ms)=33780
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=8445
        Total vcore-milliseconds taken by all map tasks=8445
        Total megabyte-milliseconds taken by all map tasks=34590720
    Map-Reduce Framework
        Map input records=2
        Map output records=2
        Input split bytes=85
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=83
        CPU time spent (ms)=1530
        Physical memory (bytes) snapshot=222658560
        Virtual memory (bytes) snapshot=5501313024
        Total committed heap usage (bytes)=182452224
        Peak Map Physical memory (bytes)=222658560
        Peak Map Virtual memory (bytes)=5501313024
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=48
19/02/17 16:13:49 INFO mapreduce.ImportJobBase: Transferred 48 bytes in 29.2481 seconds (1.6411 bytes/sec)
19/02/17 16:13:49 INFO mapreduce.ImportJobBase: Retrieved 2 records.
19/02/17 16:13:49 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test_table_sqoop1` AS t LIMIT 1
19/02/17 16:13:49 INFO hive.HiveImport: Loading uploaded data into Hive
19/02/17 16:13:51 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
19/02/17 16:13:51 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/02/17 16:13:51 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/02/17 16:13:51 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
19/02/17 16:13:51 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
19/02/17 16:13:54 INFO hive.HiveImport: Connecting to jdbc:hive2://zzeng-hdp-1.field.hortonworks.com:2181,zzeng-hdp-2.field.hortonworks.com:2181,zzeng-hdp-3.field.hortonworks.com:2181/default;password=centos;serviceDiscoveryMode=zooKeeper;user=centos;zooKeeperNamespace=hiveserver2
19/02/17 16:13:54 INFO hive.HiveImport: 19/02/17 16:13:54 [main]: INFO jdbc.HiveConnection: Connected to zzeng-hdp-2.field.hortonworks.com:10000
19/02/17 16:13:54 INFO hive.HiveImport: Connected to: Apache Hive (version 3.1.0.3.1.0.0-78)
19/02/17 16:13:54 INFO hive.HiveImport: Driver: Hive JDBC (version 3.1.0.3.1.0.0-78)
19/02/17 16:13:54 INFO hive.HiveImport: Transaction isolation: TRANSACTION_REPEATABLE_READ
19/02/17 16:13:54 INFO hive.HiveImport: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks> CREATE TABLE IF NOT EXISTS `test_hive_table1` ( `name` STRING, `location` STRING) COMMENT 'Imported by sqoop on 2019/02/
19/02/17 16:13:54 INFO hive.HiveImport: 17 16:13:49' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
19/02/17 16:13:55 INFO hive.HiveImport: INFO  : Compiling command(queryId=hive_20190217161355_912abe3d-27d2-40d5-9cc8-e176306a01e1): CREATE TABLE IF NOT EXISTS `test_hive_table1` ( `name` STRING, `location` STRING) COMMENT 'Imported by sqoop on 2019/02/17 16:13:49' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
19/02/17 16:13:55 INFO hive.HiveImport: INFO  : Semantic Analysis Completed (retrial = false)
19/02/17 16:13:55 INFO hive.HiveImport: INFO  : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
19/02/17 16:13:55 INFO hive.HiveImport: INFO  : Completed compiling command(queryId=hive_20190217161355_912abe3d-27d2-40d5-9cc8-e176306a01e1); Time taken: 0.11 seconds
19/02/17 16:13:55 INFO hive.HiveImport: INFO  : Executing command(queryId=hive_20190217161355_912abe3d-27d2-40d5-9cc8-e176306a01e1): CREATE TABLE IF NOT EXISTS `test_hive_table1` ( `name` STRING, `location` STRING) COMMENT 'Imported by sqoop on 2019/02/17 16:13:49' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
19/02/17 16:13:55 INFO hive.HiveImport: INFO  : Completed executing command(queryId=hive_20190217161355_912abe3d-27d2-40d5-9cc8-e176306a01e1); Time taken: 0.017 seconds
19/02/17 16:13:55 INFO hive.HiveImport: INFO  : OK
19/02/17 16:13:55 INFO hive.HiveImport: No rows affected (0.314 seconds)
19/02/17 16:13:55 INFO hive.HiveImport: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks> LOAD DATA INPATH 'hdfs://zzeng-hdp-1.field.hortonworks.com:8020/user/centos/test_table_sqoop1' INTO TABLE `test_hive_tab
19/02/17 16:13:55 INFO hive.HiveImport: le1`;
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Compiling command(queryId=hive_20190217161355_199abc25-f012-4143-966a-ade99cab8637): LOAD DATA INPATH 'hdfs://zzeng-hdp-1.field.hortonworks.com:8020/user/centos/test_table_sqoop1' INTO TABLE `test_hive_table1`
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Semantic Analysis Completed (retrial = false)
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Completed compiling command(queryId=hive_20190217161355_199abc25-f012-4143-966a-ade99cab8637); Time taken: 0.101 seconds
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Executing command(queryId=hive_20190217161355_199abc25-f012-4143-966a-ade99cab8637): LOAD DATA INPATH 'hdfs://zzeng-hdp-1.field.hortonworks.com:8020/user/centos/test_table_sqoop1' INTO TABLE `test_hive_table1`
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Starting task [Stage-0:MOVE] in serial mode
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Loading data to table default.test_hive_table1 from hdfs://zzeng-hdp-1.field.hortonworks.com:8020/user/centos/test_table_sqoop1
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Starting task [Stage-1:STATS] in serial mode
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : Completed executing command(queryId=hive_20190217161355_199abc25-f012-4143-966a-ade99cab8637); Time taken: 0.492 seconds
19/02/17 16:13:56 INFO hive.HiveImport: INFO  : OK
19/02/17 16:13:56 INFO hive.HiveImport: No rows affected (0.684 seconds)
19/02/17 16:13:56 INFO hive.HiveImport: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks>
19/02/17 16:13:56 INFO hive.HiveImport: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks> Closing: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks.com:2181,zzeng-hdp-2.field.hortonworks.com:2181,zzeng-hdp-3.field.hortonworks.com:2181/default;password=centos;serviceDiscoveryMode=zooKeeper;user=centos;zooKeeperNamespace=hiveserver2
19/02/17 16:13:56 INFO hive.HiveImport: Hive import complete.
19/02/17 16:13:56 INFO hive.HiveImport: Export directory is contains the _SUCCESS file only, removing the directory.
19/02/17 16:13:56 INFO tool.ImportTool: Publishing Hive/Hcat import job data to Listeners for table test_table_sqoop1
19/02/17 16:13:56 WARN mapreduce.PublishJobData: Unable to publish import data to publisher org.apache.atlas.sqoop.hook.SqoopHook
java.lang.ClassNotFoundException: org.apache.atlas.sqoop.hook.SqoopHook
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.sqoop.mapreduce.PublishJobData.publishJobData(PublishJobData.java:45)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:566)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:656)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:150)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:186)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:240)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:249)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:258)

Error message:

19/02/17 16:13:56 WARN mapreduce.PublishJobData: Unable to publish import data to publisher org.apache.atlas.sqoop.hook.SqoopHook
java.lang.ClassNotFoundException: org.apache.atlas.sqoop.hook.SqoopHook
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.sqoop.mapreduce.PublishJobData.publishJobData(PublishJobData.java:45)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:566)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:656)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:150)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:186)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:240)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:249)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:258)
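Before applying the fix, two quick checks (a sketch, assuming the standard HDP 3.1.0.0-78 paths that also appear in the fix below; the conf path is an assumption) show what is going on: Sqoop is configured to publish job data through the Atlas hook class, but no Atlas jars are visible on Sqoop's classpath.

# Which publisher class is Sqoop configured to load? (conf path is an assumption)
grep -A 1 'sqoop.job.data.publish.class' /etc/sqoop/conf/sqoop-site.xml

# No Atlas jars under Sqoop's lib directories -> ClassNotFoundException
ls /usr/hdp/current/sqoop-server/lib /usr/hdp/current/sqoop-client/lib | grep -i atlas

# The hook jars themselves do exist under the Atlas installation
ls /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/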

Checking the Apache Atlas documentation,
https://atlas.apache.org/Hook-Sqoop.html
it turns out that creating soft links to the Atlas Sqoop hook jar files is enough.

Fix


sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-client-common-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-atlas-client-common-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-client-v1-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-atlas-client-v1-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-client-v2-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-atlas-client-v2-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-common-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-atlas-common-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-intg-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-atlas-intg-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-notification-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-atlas-notification-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/hdfs-model-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-hdfs-model-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/hive-bridge-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-hive-bridge-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/jersey-json-1.19.jar /usr/hdp/current/sqoop-server/lib/zz-jersey-json-1.19.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/jsr311-api-1.1.jar /usr/hdp/current/sqoop-server/lib/zz-jsr311-api-1.1.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/kafka-clients-2.0.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-kafka-clients-2.0.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/kafka_2.11-2.0.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-kafka_2.11-2.0.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/sqoop-bridge-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-server/lib/zz-sqoop-bridge-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-client-common-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-atlas-client-common-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-client-v1-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-atlas-client-v1-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-client-v2-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-atlas-client-v2-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-common-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-atlas-common-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-intg-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-atlas-intg-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/atlas-notification-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-atlas-notification-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/hdfs-model-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-hdfs-model-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/hive-bridge-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-hive-bridge-1.1.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/jersey-json-1.19.jar /usr/hdp/current/sqoop-client/lib/zz-jersey-json-1.19.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/jsr311-api-1.1.jar /usr/hdp/current/sqoop-client/lib/zz-jsr311-api-1.1.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/kafka-clients-2.0.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-kafka-clients-2.0.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/kafka_2.11-2.0.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-kafka_2.11-2.0.0.3.1.0.0-78.jar
sudo ln -s /usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl/sqoop-bridge-1.1.0.3.1.0.0-78.jar /usr/hdp/current/sqoop-client/lib/zz-sqoop-bridge-1.1.0.3.1.0.0-78.jar
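
The same links can also be created with a small loop instead of listing each jar by hand (a sketch under the same paths; note that this links every jar in the hook directory, not only the subset listed above):

HOOK_DIR=/usr/hdp/current/atlas-server/hook/sqoop/atlas-sqoop-plugin-impl
for LIB in /usr/hdp/current/sqoop-server/lib /usr/hdp/current/sqoop-client/lib; do
  for JAR in "$HOOK_DIR"/*.jar; do
    # zz- prefix keeps the links sorted after Sqoop's own jars, as in the commands above
    sudo ln -sf "$JAR" "$LIB/zz-$(basename "$JAR")"
  done
done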

Problem solved

Result of running the import again:


[centos@zzeng-hdp-2 ~/sandbox/crosscomponent_demo/crosscomponent_scripts/sqoop-demo]$ ./002-run-sqoop-import.sh
Warning: /usr/hdp/3.1.0.0-78/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/phoenix/phoenix-5.0.0.3.1.0.0-78-server.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/02/17 16:22:34 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7.3.1.0.0-78
Enter password:
19/02/17 16:22:36 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/02/17 16:22:36 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/02/17 16:22:36 INFO manager.MySQLManager: Argument '--fetch-size 1' will probably get ignored by MySQL JDBC driver.
19/02/17 16:22:36 INFO tool.CodeGenTool: Beginning code generation
19/02/17 16:22:37 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test_table_sqoop1` AS t LIMIT 1
19/02/17 16:22:37 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test_table_sqoop1` AS t LIMIT 1
19/02/17 16:22:37 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/3.1.0.0-78/hadoop-mapreduce
19/02/17 16:22:39 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-centos/compile/3a0011318880684ddc36dfa828e3d30a/test_table_sqoop1.java to /home/centos/sandbox/crosscomponent_demo/crosscomponent_scripts/sqoop-demo/./test_table_sqoop1.java. Error: Destination '/home/centos/sandbox/crosscomponent_demo/crosscomponent_scripts/sqoop-demo/./test_table_sqoop1.java' already exists
19/02/17 16:22:39 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-centos/compile/3a0011318880684ddc36dfa828e3d30a/test_table_sqoop1.jar
19/02/17 16:22:39 WARN manager.MySQLManager: It looks like you are importing from mysql.
19/02/17 16:22:39 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
19/02/17 16:22:39 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
19/02/17 16:22:39 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
19/02/17 16:22:39 INFO mapreduce.ImportJobBase: Beginning import of test_table_sqoop1
19/02/17 16:22:40 INFO client.RMProxy: Connecting to ResourceManager at zzeng-hdp-1.field.hortonworks.com/172.26.249.75:8050
19/02/17 16:22:41 INFO client.AHSProxy: Connecting to Application History server at zzeng-hdp-2.field.hortonworks.com/172.26.249.76:10200
19/02/17 16:22:41 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/centos/.staging/job_1550042495287_0020
19/02/17 16:22:44 INFO db.DBInputFormat: Using read commited transaction isolation
19/02/17 16:22:44 INFO mapreduce.JobSubmitter: number of splits:1
19/02/17 16:22:44 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1550042495287_0020
19/02/17 16:22:44 INFO mapreduce.JobSubmitter: Executing with tokens: []
19/02/17 16:22:45 INFO conf.Configuration: found resource resource-types.xml at file:/etc/hadoop/3.1.0.0-78/0/resource-types.xml
19/02/17 16:22:45 INFO impl.YarnClientImpl: Submitted application application_1550042495287_0020
19/02/17 16:22:45 INFO mapreduce.Job: The url to track the job: http://zzeng-hdp-1.field.hortonworks.com:8088/proxy/application_1550042495287_0020/
19/02/17 16:22:45 INFO mapreduce.Job: Running job: job_1550042495287_0020
19/02/17 16:22:56 INFO mapreduce.Job: Job job_1550042495287_0020 running in uber mode : false
19/02/17 16:22:56 INFO mapreduce.Job:  map 0% reduce 0%
19/02/17 16:23:04 INFO mapreduce.Job:  map 100% reduce 0%
19/02/17 16:23:05 INFO mapreduce.Job: Job job_1550042495287_0020 completed successfully
19/02/17 16:23:05 INFO mapreduce.Job: Counters: 32
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=243800
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=85
        HDFS: Number of bytes written=48
        HDFS: Number of read operations=6
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=18672
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=4668
        Total vcore-milliseconds taken by all map tasks=4668
        Total megabyte-milliseconds taken by all map tasks=19120128
    Map-Reduce Framework
        Map input records=2
        Map output records=2
        Input split bytes=85
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=115
        CPU time spent (ms)=1330
        Physical memory (bytes) snapshot=224505856
        Virtual memory (bytes) snapshot=5501239296
        Total committed heap usage (bytes)=183500800
        Peak Map Physical memory (bytes)=224505856
        Peak Map Virtual memory (bytes)=5501239296
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=48
19/02/17 16:23:05 INFO mapreduce.ImportJobBase: Transferred 48 bytes in 24.512 seconds (1.9582 bytes/sec)
19/02/17 16:23:05 INFO mapreduce.ImportJobBase: Retrieved 2 records.
19/02/17 16:23:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test_table_sqoop1` AS t LIMIT 1
19/02/17 16:23:05 INFO hive.HiveImport: Loading uploaded data into Hive
19/02/17 16:23:07 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.
19/02/17 16:23:07 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/02/17 16:23:07 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
19/02/17 16:23:07 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
19/02/17 16:23:07 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
19/02/17 16:23:10 INFO hive.HiveImport: Connecting to jdbc:hive2://zzeng-hdp-1.field.hortonworks.com:2181,zzeng-hdp-2.field.hortonworks.com:2181,zzeng-hdp-3.field.hortonworks.com:2181/default;password=centos;serviceDiscoveryMode=zooKeeper;user=centos;zooKeeperNamespace=hiveserver2
19/02/17 16:23:10 INFO hive.HiveImport: 19/02/17 16:23:10 [main]: INFO jdbc.HiveConnection: Connected to zzeng-hdp-2.field.hortonworks.com:10000
19/02/17 16:23:10 INFO hive.HiveImport: Connected to: Apache Hive (version 3.1.0.3.1.0.0-78)
19/02/17 16:23:10 INFO hive.HiveImport: Driver: Hive JDBC (version 3.1.0.3.1.0.0-78)
19/02/17 16:23:10 INFO hive.HiveImport: Transaction isolation: TRANSACTION_REPEATABLE_READ
19/02/17 16:23:10 INFO hive.HiveImport: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks> CREATE TABLE IF NOT EXISTS `test_hive_table1` ( `name` STRING, `location` STRING) COMMENT 'Imported by sqoop on 2019/02/
19/02/17 16:23:10 INFO hive.HiveImport: 17 16:23:05' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Compiling command(queryId=hive_20190217162310_3c6d2171-8c2d-4306-86c2-61688ac2943d): CREATE TABLE IF NOT EXISTS `test_hive_table1` ( `name` STRING, `location` STRING) COMMENT 'Imported by sqoop on 2019/02/17 16:23:05' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Semantic Analysis Completed (retrial = false)
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Completed compiling command(queryId=hive_20190217162310_3c6d2171-8c2d-4306-86c2-61688ac2943d); Time taken: 0.081 seconds
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Executing command(queryId=hive_20190217162310_3c6d2171-8c2d-4306-86c2-61688ac2943d): CREATE TABLE IF NOT EXISTS `test_hive_table1` ( `name` STRING, `location` STRING) COMMENT 'Imported by sqoop on 2019/02/17 16:23:05' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Completed executing command(queryId=hive_20190217162310_3c6d2171-8c2d-4306-86c2-61688ac2943d); Time taken: 0.015 seconds
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : OK
19/02/17 16:23:11 INFO hive.HiveImport: No rows affected (0.255 seconds)
19/02/17 16:23:11 INFO hive.HiveImport: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks> LOAD DATA INPATH 'hdfs://zzeng-hdp-1.field.hortonworks.com:8020/user/centos/test_table_sqoop1' INTO TABLE `test_hive_tab
19/02/17 16:23:11 INFO hive.HiveImport: le1`;
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Compiling command(queryId=hive_20190217162311_da16daf0-1137-4a3d-97c9-4921758db2eb): LOAD DATA INPATH 'hdfs://zzeng-hdp-1.field.hortonworks.com:8020/user/centos/test_table_sqoop1' INTO TABLE `test_hive_table1`
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Semantic Analysis Completed (retrial = false)
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Completed compiling command(queryId=hive_20190217162311_da16daf0-1137-4a3d-97c9-4921758db2eb); Time taken: 0.086 seconds
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Executing command(queryId=hive_20190217162311_da16daf0-1137-4a3d-97c9-4921758db2eb): LOAD DATA INPATH 'hdfs://zzeng-hdp-1.field.hortonworks.com:8020/user/centos/test_table_sqoop1' INTO TABLE `test_hive_table1`
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Starting task [Stage-0:MOVE] in serial mode
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Loading data to table default.test_hive_table1 from hdfs://zzeng-hdp-1.field.hortonworks.com:8020/user/centos/test_table_sqoop1
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Starting task [Stage-1:STATS] in serial mode
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : Completed executing command(queryId=hive_20190217162311_da16daf0-1137-4a3d-97c9-4921758db2eb); Time taken: 0.458 seconds
19/02/17 16:23:11 INFO hive.HiveImport: INFO  : OK
19/02/17 16:23:11 INFO hive.HiveImport: No rows affected (0.588 seconds)
19/02/17 16:23:11 INFO hive.HiveImport: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks>
19/02/17 16:23:11 INFO hive.HiveImport: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks> Closing: 0: jdbc:hive2://zzeng-hdp-1.field.hortonworks.com:2181,zzeng-hdp-2.field.hortonworks.com:2181,zzeng-hdp-3.field.hortonworks.com:2181/default;password=centos;serviceDiscoveryMode=zooKeeper;user=centos;zooKeeperNamespace=hiveserver2
19/02/17 16:23:11 INFO hive.HiveImport: Hive import complete.
19/02/17 16:23:11 INFO hive.HiveImport: Export directory is contains the _SUCCESS file only, removing the directory.
19/02/17 16:23:11 INFO tool.ImportTool: Publishing Hive/Hcat import job data to Listeners for table test_table_sqoop1
19/02/17 16:23:11 INFO atlas.ApplicationProperties: Looking for atlas-application.properties in classpath
19/02/17 16:23:11 INFO atlas.ApplicationProperties: Loading atlas-application.properties from file:/etc/sqoop/3.1.0.0-78/0/atlas-application.properties
19/02/17 16:23:11 INFO atlas.ApplicationProperties: No graphdb backend specified. Will use 'janus'
19/02/17 16:23:11 INFO atlas.ApplicationProperties: Using storage backend 'hbase2'
19/02/17 16:23:11 INFO atlas.ApplicationProperties: Using index backend 'solr'
19/02/17 16:23:11 INFO atlas.ApplicationProperties: Setting solr-wait-searcher property 'true'
19/02/17 16:23:11 INFO atlas.ApplicationProperties: Setting index.search.map-name property 'false'
19/02/17 16:23:12 INFO atlas.ApplicationProperties: Property (set to default) atlas.graph.cache.db-cache = true
19/02/17 16:23:12 INFO atlas.ApplicationProperties: Property (set to default) atlas.graph.cache.db-cache-clean-wait = 20
19/02/17 16:23:12 INFO atlas.ApplicationProperties: Property (set to default) atlas.graph.cache.db-cache-size = 0.5
19/02/17 16:23:12 INFO atlas.ApplicationProperties: Property (set to default) atlas.graph.cache.tx-cache-size = 15000
19/02/17 16:23:12 INFO atlas.ApplicationProperties: Property (set to default) atlas.graph.cache.tx-dirty-size = 120
19/02/17 16:23:12 ERROR security.InMemoryJAASConfiguration: Unable to add JAAS configuration for client [KafkaClient] as it is missing param [atlas.jaas.KafkaClient.loginModuleName]. Skipping JAAS config for [KafkaClient]
19/02/17 16:23:12 INFO kafka.KafkaNotification: ==> KafkaNotification()
19/02/17 16:23:12 INFO kafka.KafkaNotification: <== KafkaNotification()
19/02/17 16:23:12 INFO hook.AtlasHook: Created Atlas Hook
19/02/17 16:23:12 INFO mapreduce.PublishJobData: Published data is Operation=import, Url=jdbc:mysql://localhost/test?zeroDateTimeBehavior=convertToNull, User=centos, StoreType=mysql, StoreTable=test_table_sqoop1, StoreQuery=null, HiveDB=default, HiveTable=test_hive_table1, StartTime=1550420559485, EndTime=1550420591914, CmdLineArgs={reset.onemapper=false, codegen.output.delimiters.enclose=0, sqlconnection.metadata.transaction.isolation.level=2, codegen.input.delimiters.escape=0, codegen.auto.compile.dir=true, accumulo.batch.size=10240000, codegen.input.delimiters.field=0, accumulo.create.table=false, mainframe.input.dataset.type=p, enable.compression=false, skip.dist.cache=false, hive.compute.stats.table=false, hive.table.name=test_hive_table1, accumulo.max.latency=5000, db.username=root, sqoop.throwOnError=false, db.clear.staging.table=false, codegen.input.delimiters.enclose=0, hdfs.append.dir=false, import.direct.split.size=0, hcatalog.drop.and.create.table=false, codegen.output.delimiters.record=10, codegen.output.delimiters.field=1, hbase.bulk.load.enabled=false, mapreduce.num.mappers=1, export.new.update=UpdateOnly, db.require.password=true, hive.import=true, customtool.options.jsonmap={}, hdfs.delete-target.dir=false, codegen.output.delimiters.enclose.required=false, direct.import=false, codegen.output.dir=., hdfs.file.format=TextFile, hive.drop.delims=false, codegen.input.delimiters.record=0, db.batch=false, codegen.delete.compile.dir=false, split.limit=null, hcatalog.create.table=false, hive.fail.table.exists=false, hive.overwrite.table=false, incremental.mode=None, temporary.dirRoot=_sqoop, verbose=false, hbase.null.incremental.mode=Ignore, import.max.inline.lob.size=16777216, import.fetch.size=1, codegen.input.delimiters.enclose.required=false, relaxed.isolation=false, sqoop.oracle.escaping.disabled=true, db.table=test_table_sqoop1, hbase.create.table=false, codegen.compile.dir=/tmp/sqoop-centos/compile/3a0011318880684ddc36dfa828e3d30a, codegen.output.delimiters.escape=0, db.connect.string=jdbc:mysql://localhost/test?zeroDateTimeBehavior=convertToNull}
19/02/17 16:23:12 INFO hook.AtlasHook: ==> Shutdown of Atlas Hook
19/02/17 16:23:12 INFO kafka.KafkaNotification: ==> KafkaNotification.createProducer()
19/02/17 16:23:12 INFO producer.ProducerConfig: ProducerConfig values:
    compression.type = none
    metric.reporters = []
    metadata.max.age.ms = 300000
    metadata.fetch.timeout.ms = 60000
    acks = 1
    batch.size = 16384
    reconnect.backoff.ms = 10
    bootstrap.servers = [zzeng-hdp-3.field.hortonworks.com:6667, zzeng-hdp-1.field.hortonworks.com:6667, zzeng-hdp-2.field.hortonworks.com:6667]
    receive.buffer.bytes = 32768
    retry.backoff.ms = 100
    buffer.memory = 33554432
    timeout.ms = 30000
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    retries = 0
    max.request.size = 1048576
    block.on.buffer.full = true
    value.serializer = class org.apache.kafka.common.serialization.StringSerializer
    metrics.sample.window.ms = 30000
    send.buffer.bytes = 131072
    max.in.flight.requests.per.connection = 5
    metrics.num.samples = 2
    linger.ms = 0
    client.id =

19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration key.deserializer = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration value.deserializer = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration hook.group.id = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration zookeeper.connection.timeout.ms = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration zookeeper.session.timeout.ms = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration enable.auto.commit = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration zookeeper.connect = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration zookeeper.sync.time.ms = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration session.timeout.ms = null was supplied but isn't a known config.
19/02/17 16:23:12 WARN producer.ProducerConfig: The configuration auto.offset.reset = null was supplied but isn't a known config.
19/02/17 16:23:12 INFO kafka.KafkaNotification: <== KafkaNotification.createProducer()
19/02/17 16:23:12 INFO hook.AtlasHook: <== Shutdown of Atlas Hook

Verification in the Atlas UI:
(Atlas UI screenshots)
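
Besides the UI, the new lineage can also be checked from the command line with Atlas' basic-search REST API (a sketch; the host, port 21000, and admin:admin credentials are assumptions for a default setup):

# List sqoop_process entities known to Atlas (host/port/credentials are assumptions)
curl -u admin:admin \
  'http://zzeng-hdp-1.field.hortonworks.com:21000/api/atlas/v2/search/basic?typeName=sqoop_process'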
