SmartCat's Blog

So happy to code my life!


Compiling Flink 1.10.1 against Hadoop 3.1.2 (failed; the exercise turned out to be pointless, so the issue is closed)

Environment:

Component   Version
zookeeper   3.6.0
hadoop      3.1.2
spark       3.0.0-preview2-bin-hadoop3.2
hbase       2.2.4
hive        3.0.0
flink       1.10.1

https://mirrors.tuna.tsinghua.edu.cn/apache/flink/flink-1.10.1/flink-1.10.1-src.tgz

Lines 93-159 of pom.xml hold the version properties that matter here. The rule of thumb used: if the version declared in pom.xml is already newer than the jar shipped with the other big-data components, leave it alone; if it is older, update it to match.

pom.xml property                  Default in pom.xml   Changed to         Reason / matching jar or folder in the Hadoop stack
hadoop.version                    2.4.1                3.1.2              hadoop-3.1.2
guava.version                     18.0                 (unchanged)
akka.version                      2.5.21               (unchanged)
slf4j.version                     1.7.15               1.7.25             $TEZ_HOME/lib/slf4j-api-1.7.25.jar
scala.version                     2.11.12              (unchanged)        unlikely to matter much, so left as-is
scala.binary.version              2.11                 (unchanged)        unlikely to matter much, so left as-is
zookeeper.version                 3.4.10               3.6.0              apache-zookeeper-3.6.0-bin
curator.version                   2.12.0               2.13.0             apache-tez-0.9.2-src/tez-dist/target/tez-0.9.2/lib/curator-client-2.13.0.jar
jackson.version                   2.10.1               2.7.8 (not actually changed)   $HADOOP/share/hadoop/yarn/lib/jackson-jaxrs-json-provider-2.7.8.jar
hive.version                      2.3.4                3.0.0              apache-hive-3.0.0-bin
hivemetastore.hadoop.version      2.7.5                3.0.0              checked in MySQL (see below)
hadoop.version (MapR profile)     2.7.0-mapr-1607      (unchanged)        MapR is a big-data vendor; no need to touch this unless you run their platform [1]
zookeeper.version (MapR profile)  3.4.5-mapr-1604      (unchanged)        MapR is a big-data vendor; no need to touch this unless you run their platform [1]

At the same time, change the hbase version in flink-connectors/flink-hbase/pom.xml to 2.2.4.
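For reference, that edit amounts to bumping a version property in the connector's pom. This is a sketch; the exact property name and its 1.x default in flink-hbase/pom.xml are assumed, not copied from the 1.10.1 source:

```xml
<!-- flink-connectors/flink-hbase/pom.xml (property name assumed) -->
<properties>
    <!-- default is an HBase 1.x version in Flink 1.10; raise it to match the cluster -->
    <hbase.version>2.2.4</hbase.version>
</properties>
```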

mysql> use hive;

mysql> select * from VERSION;
+--------+----------------+----------------------------+
| VER_ID | SCHEMA_VERSION | VERSION_COMMENT            |
+--------+----------------+----------------------------+
|      1 | 3.0.0          | Hive release version 3.0.0 |
+--------+----------------+----------------------------+
1 row in set (0.00 sec)


mvn clean package -DskipTests -Dhadoop.version=3.1.2

The build fails:

[ERROR] Failed to execute goal on project flink-hadoop-fs: Could not resolve dependencies for project org.apache.flink:flink-hadoop-fs:jar:1.10.1: Could not find artifact org.apache.flink:flink-shaded-hadoop-2:jar:3.1.2-9.0 in alimaven (http://maven.aliyun.com/nexus/content/groups/public/) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :flink-hadoop-fs
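Maven cannot find flink-shaded-hadoop-2 for Hadoop 3.1.2 because the Flink project never published that artifact for 3.x Hadoop versions. A commonly suggested workaround, not attempted in this post, is to build flink-shaded locally against the target Hadoop version so the artifact lands in the local repository; the release-9.0 branch name below is an assumption inferred from the "-9.0" suffix in the error message:

```shell
# Hypothetical workaround: install flink-shaded-hadoop-2:3.1.2-9.0 locally,
# then resume the Flink build from the failed module.
git clone https://github.com/apache/flink-shaded.git
cd flink-shaded
git checkout release-9.0                              # matches the "-9.0" in the error
mvn clean install -DskipTests -Dhadoop.version=3.1.2
cd ../flink-1.10.1
mvn clean package -DskipTests -Dhadoop.version=3.1.2 -rf :flink-hadoop-fs
```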

Since Flink 1.12 had already been compiled successfully a while earlier, and it works well with Hadoop 3.x, going out of the way to build this 1.10.1 release against Hadoop 3.x is not really worth the effort.

Reference:

[1]
