I. Preparation
1. Install the Hadoop, Hive, and HBase clusters, then add the following environment variables (e.g. in /etc/profile or ~/.bashrc):
HADOOP_HOME=/soft/hadoop/hadoop-2.9.2
HBASE_HOME=/soft/hbase/hbase-2.1.6
HIVE_HOME=/soft/hive/apache-hive-2.3.6-bin
SQOOP_HOME=/soft/sqoop/sqoop-1.99.7-bin-hadoop200
JAVA_HOME=/soft/jdk/jdk1.8.0_211
export HADOOP_COMMON_HOME=$HADOOP_HOME/share/hadoop/common
export HADOOP_HDFS_HOME=$HADOOP_HOME/share/hadoop/hdfs
export HADOOP_MAPRED_HOME=$HADOOP_HOME/share/hadoop/mapreduce
export HADOOP_YARN_HOME=$HADOOP_HOME/share/hadoop/yarn
PATH=$PATH:$HOME/bin:$HADOOP_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin:$JAVA_HOME/bin
export SQOOP_SERVER_EXTRA_LIB=$SQOOP_HOME/extra
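The four HADOOP_*_HOME exports above are how the Sqoop2 server locates the Hadoop jars. A minimal sketch for sanity-checking them (the /soft/... root matches this article; adjust it to your installation):

```shell
# Minimal sketch: derive the Hadoop sub-paths Sqoop2 needs and check them.
# The /soft/... root is the path used in this article; adjust to your install.
HADOOP_HOME=/soft/hadoop/hadoop-2.9.2

export HADOOP_COMMON_HOME=$HADOOP_HOME/share/hadoop/common
export HADOOP_HDFS_HOME=$HADOOP_HOME/share/hadoop/hdfs
export HADOOP_MAPRED_HOME=$HADOOP_HOME/share/hadoop/mapreduce
export HADOOP_YARN_HOME=$HADOOP_HOME/share/hadoop/yarn

# On a real cluster each directory must exist, otherwise the server
# cannot find the Hadoop jars at startup.
for d in "$HADOOP_COMMON_HOME" "$HADOOP_HDFS_HOME" \
         "$HADOOP_MAPRED_HOME" "$HADOOP_YARN_HOME"; do
  [ -d "$d" ] || echo "missing: $d"
done
```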
2. Add the following to Hadoop's core-site.xml (replace $SERVER_USER with the OS user that runs the Sqoop2 server), then restart Hadoop or refresh the proxy-user configuration for the change to take effect:
<property>
  <name>hadoop.proxyuser.$SERVER_USER.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.$SERVER_USER.groups</name>
  <value>*</value>
</property>
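As a concrete illustration (the user name sqoop2 below is a placeholder, not from the original article): if the Sqoop2 server process runs as the user sqoop2, the two properties would read:

```xml
<!-- Allow the (hypothetical) user "sqoop2" to impersonate other users
     from any host (*) and on behalf of any group (*) -->
<property>
  <name>hadoop.proxyuser.sqoop2.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.sqoop2.groups</name>
  <value>*</value>
</property>
```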
3. Copy the MySQL JDBC driver jar (e.g. mysql-connector-java-x.x.x.jar) into the $SQOOP_HOME/extra directory, i.e. the directory pointed to by SQOOP_SERVER_EXTRA_LIB.
4. Configure sqoop_bootstrap.properties and sqoop.properties (both under $SQOOP_HOME/conf)
In sqoop_bootstrap.properties:
sqoop.config.provider=org.apache.sqoop.core.PropertiesConfigurationProvider
In sqoop.properties:
org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/soft/hadoop/hadoop-2.9.2/etc/hadoop
org.apache.sqoop.security.authentication.type=SIMPLE
org.apache.sqoop.security.authentication.handler=org.apache.sqoop.security.authentication.SimpleAuthenticationHandler
org.apache.sqoop.security.authentication.anonymous=true
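The mapreduce.configuration.directory key must point at a real Hadoop configuration directory. A small sketch for extracting and checking that value (a sample file is written here for illustration; on a real install, point CONF at $SQOOP_HOME/conf/sqoop.properties instead):

```shell
# Sketch: read the MapReduce configuration directory out of sqoop.properties
# and warn if it does not exist. The sample file below mirrors this article;
# on a real install, set CONF=$SQOOP_HOME/conf/sqoop.properties.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/soft/hadoop/hadoop-2.9.2/etc/hadoop
EOF

# Take everything after the '=' of the matching property line
dir=$(grep '^org.apache.sqoop.submission.engine.mapreduce.configuration.directory=' "$CONF" | cut -d= -f2)
echo "MapReduce conf dir: $dir"
[ -d "$dir" ] || echo "warning: $dir does not exist on this machine"
```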
5. Verify the configuration
sqoop2-tool verify
6. Start the server
sqoop2-server start
Original article: https://www.cnblogs.com/duanzexun/p/11859199.html