Compiling Flink against Hadoop 3.1.2 (failed; the issue is moot, closed)
Date: 2020-06-05  Category: cross-site data testing
Environment:

| Component | Version |
| --- | --- |
| zookeeper | 3.6.0 |
| hadoop | 3.1.2 |
| spark | 3.0.0-preview2-bin-hadoop3.2 |
| hbase | 2.2.4 |
| hive | 3.0.0 |
| flink | 1.10.1 |
https://mirrors.tuna.tsinghua.edu.cn/apache/flink/flink-1.10.1/flink-1.10.1-src.tgz
Lines 93-159 of pom.xml hold the version properties that need attention. The approach taken here: if the version in pom.xml is already newer than the corresponding jar shipped with the other big-data components, leave it alone; if it is older, update it to match.
| pom.xml property | Default in pom.xml | Target change | Reason: matching jar/directory in the cluster |
| --- | --- | --- | --- |
| hadoop.version | 2.4.1 | 3.1.2 | hadoop-3.1.2 |
| guava.version | 18.0 | | |
| akka.version | 2.5.21 | | |
| slf4j.version | 1.7.15 | 1.7.25 | $TEZ_HOME/lib/slf4j-api-1.7.25.jar |
| scala.version | 2.11.12 | unchanged | unlikely to matter much, so left as-is |
| scala.binary.version | 2.11 | unchanged | unlikely to matter much, so left as-is |
| zookeeper.version | 3.4.10 | 3.6.0 | apache-zookeeper-3.6.0-bin |
| curator.version | 2.12.0 | 2.13.0 | apache-tez-0.9.2-src/tez-dist/target/tez-0.9.2/lib/curator-client-2.13.0.jar |
| jackson.version | 2.10.1 | 2.7.8 (not actually changed) | $HADOOP/share/hadoop/yarn/lib/jackson-jaxrs-json-provider-2.7.8.jar |
| hive.version | 2.3.4 | 3.0.0 | apache-hive-3.0.0-bin |
| hivemetastore.hadoop.version | 2.7.5 | 3.0.0 | checked in MySQL |
| hadoop.version (MapR) | 2.7.0-mapr-1607 | unchanged | MapR is a big-data vendor; no change needed unless you run their platform [1] |
| zookeeper.version (MapR) | 3.4.5-mapr-1604 | unchanged | MapR is a big-data vendor; no change needed unless you run their platform [1] |
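The "update only when the pom default is older" rule above can be sketched as a version-aware comparison in shell. This is illustrative only: `needs_update` is a hypothetical helper, and `sort -V` requires GNU coreutils.

```shell
# Decide whether a pom.xml property needs bumping: update only when the
# pom default is older than the version shipped by the cluster component.
needs_update() {
  pom_ver="$1"; cluster_ver="$2"
  # sort -V compares version strings field by field; the newest sorts last
  newest=$(printf '%s\n%s\n' "$pom_ver" "$cluster_ver" | sort -V | tail -n1)
  [ "$newest" = "$cluster_ver" ] && [ "$pom_ver" != "$cluster_ver" ]
}

needs_update 2.4.1 3.1.2 && echo "hadoop.version: update to 3.1.2"
needs_update 2.10.1 2.7.8 || echo "jackson.version: keep 2.10.1"
```

This matches the jackson.version row: the pom default 2.10.1 is newer than Hadoop's bundled 2.7.8, so it stays.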
In addition, change the hbase version in flink-connectors/flink-hbase/pom.xml to 2.2.4.
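These property edits can also be scripted. A minimal sketch with GNU `sed`, demonstrated on a small stand-in fragment rather than the real Flink pom (the versions-maven-plugin's `versions:set-property` goal is a sturdier alternative):

```shell
# Stand-in <properties> fragment for demonstration purposes.
cat > pom-fragment.xml <<'EOF'
<properties>
  <hadoop.version>2.4.1</hadoop.version>
  <zookeeper.version>3.4.10</zookeeper.version>
</properties>
EOF

# Rewrite one <prop>...</prop> entry in place (GNU sed -i).
set_pom_property() {
  prop="$1"; new="$2"; file="$3"
  sed -i "s|<$prop>[^<]*</$prop>|<$prop>$new</$prop>|" "$file"
}

set_pom_property hadoop.version 3.1.2 pom-fragment.xml
set_pom_property zookeeper.version 3.6.0 pom-fragment.xml
grep -E 'hadoop|zookeeper' pom-fragment.xml
```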
The Hive metastore schema version was confirmed in MySQL:

mysql> use hive;
mysql> select * from VERSION;
+--------+----------------+----------------------------+
| VER_ID | SCHEMA_VERSION | VERSION_COMMENT |
+--------+----------------+----------------------------+
| 1 | 3.0.0 | Hive release version 3.0.0 |
+--------+----------------+----------------------------+
1 row in set (0.00 sec)
mvn clean package -DskipTests -Dhadoop.version=3.1.2
The build failed:
[ERROR] Failed to execute goal on project flink-hadoop-fs: Could not resolve dependencies for project org.apache.flink:flink-hadoop-fs:jar:1.10.1: Could not find artifact org.apache.flink:flink-shaded-hadoop-2:jar:3.1.2-9.0 in alimaven (http://maven.aliyun.com/nexus/content/groups/public/) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :flink-hadoop-fs
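This is a dependency-resolution failure, not a compile error: passing -Dhadoop.version=3.1.2 makes the build request a pre-shaded artifact flink-shaded-hadoop-2:3.1.2-9.0, which Apache does not publish for Hadoop 3.1.2. A coordinate maps to a fixed path under any Maven repository root, which is what the resolver looked for; a small sketch of that mapping (`coord_to_path` is a hypothetical helper):

```shell
# Convert Maven coordinates (group, artifact, version) into the path the
# resolver requests under a repository root, to see what the build wanted.
coord_to_path() {
  group="$1"; artifact="$2"; version="$3"
  # groupId dots become directory separators
  echo "$(echo "$group" | tr . /)/$artifact/$version/$artifact-$version.jar"
}

coord_to_path org.apache.flink flink-shaded-hadoop-2 3.1.2-9.0
# -> org/apache/flink/flink-shaded-hadoop-2/3.1.2-9.0/flink-shaded-hadoop-2-3.1.2-9.0.jar
```

A commonly reported workaround is to build the apache/flink-shaded project locally against the desired Hadoop version and install it into the local repository; that route was not tried here, given the conclusion below.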
Since Flink 1.12 had already been built successfully a while back, and it is well compatible with Hadoop 3.x, there is little point in going out of the way to build this release against Hadoop 3.x.
Reference: