Compiling Flink 1.13.5: Pitfalls and Fixes

Tags: flink

Preface

Problem: with the official Flink 1.13.5 binaries and Hadoop 2.7.2, jars could not be submitted to the cluster in either standalone or YARN deployment mode.
Solution: compile Flink from source.


I. Getting the source

Download the source from the official site or from GitHub.

II. Editing the POM

In the POM, change the Hadoop and Hive versions to match the versions running on your cluster.
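The edit can be scripted. A minimal sketch, assuming the version is declared as a `<hadoop.version>` property (grep your checkout first to confirm the property name, and repeat the same pattern for the Hive property used by flink-connector-hive):

```shell
# Sketch: rewrite the <hadoop.version> property in a POM file.
# The property name is an assumption -- verify with:
#   grep -rn "<hadoop.version>" pom.xml
set_hadoop_version() {
  local pom="$1" version="$2"
  # -i.bak keeps a backup of the original POM next to it
  sed -i.bak \
    "s|<hadoop.version>[^<]*</hadoop.version>|<hadoop.version>${version}</hadoop.version>|" \
    "$pom"
}
```

For example, `set_hadoop_version pom.xml 2.7.2` from the source root.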

III. Compiling

1. Build failure

Build command:
mvn clean install -DskipTests

[INFO] > fsevents@1.2.7 install /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/fsevents
[INFO] > node install
[INFO] 
[INFO] 
[INFO] > node-sass@4.11.0 install /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass
[INFO] > node scripts/install.js
[INFO] 
[ERROR] Unable to save binary /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/vendor/linux-x64-64 : {
     Error: EACCES: permission denied, mkdir '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/vendor'
[ERROR]     at Object.mkdirSync (fs.js:729:3)
[ERROR]     at sync (/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/mkdirp/index.js:71:13)
[ERROR]     at Function.sync (/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/mkdirp/index.js:77:24)
[ERROR]     at checkAndDownloadBinary (/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/scripts/install.js:114:11)
[ERROR]     at Object.<anonymous> (/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/scripts/install.js:157:1)
[ERROR]     at Module._compile (internal/modules/cjs/loader.js:689:30)
[ERROR]     at Object.Module._extensions..js (internal/modules/cjs/loader.js:700:10)
[ERROR]     at Module.load (internal/modules/cjs/loader.js:599:32)
[ERROR]     at tryModuleLoad (internal/modules/cjs/loader.js:538:12)
[ERROR]     at Function.Module._load (internal/modules/cjs/loader.js:530:3)
[ERROR]   errno: -13,
[ERROR]   syscall: 'mkdir',
[ERROR]   code: 'EACCES',
[ERROR]   path:
[ERROR]    '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/vendor' }
[INFO] 
[INFO] > node-sass@4.11.0 postinstall /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass
[INFO] > node scripts/build.js
[INFO] 
[INFO] Building: /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node/node /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-gyp/bin/node-gyp.js rebuild --verbose --libsass_ext= --libsass_cflags= --libsass_ldflags= --libsass_library=
[ERROR] gyp info it worked if it ends with ok
[ERROR] gyp verb cli [ '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node/node',
[ERROR] gyp verb cli   '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-gyp/bin/node-gyp.js',
[ERROR] gyp verb cli   'rebuild',
[ERROR] gyp verb cli   '--verbose',
[ERROR] gyp verb cli   '--libsass_ext=',
[ERROR] gyp verb cli   '--libsass_cflags=',
[ERROR] gyp verb cli   '--libsass_ldflags=',
[ERROR] gyp verb cli   '--libsass_library=' ]
[ERROR] gyp info using node-gyp@3.8.0
[ERROR] gyp info using node@10.9.0 | linux | x64
[ERROR] gyp verb command rebuild []
[ERROR] gyp verb command clean []
[ERROR] gyp verb clean removing "build" directory
[ERROR] gyp verb command configure []
[ERROR] gyp verb check python checking for Python executable "python2" in the PATH
[ERROR] gyp verb `which` succeeded python2 /usr/bin/python2
[ERROR] gyp verb check python version `/usr/bin/python2 -c "import sys; print "2.7.5
[ERROR] gyp verb check python version .%s.%s" % sys.version_info[:3];"` returned: %j
[ERROR] gyp verb get node dir no --target version specified, falling back to host node version: 10.9.0
[ERROR] gyp verb command install [ '10.9.0' ]
[ERROR] gyp verb install input version string "10.9.0"
[ERROR] gyp verb install installing version: 10.9.0
[ERROR] gyp verb install --ensure was passed, so won't reinstall if already installed
[ERROR] gyp verb install version not already installed, continuing with install 10.9.0
[ERROR] gyp verb ensuring nodedir is created /root/.node-gyp/10.9.0
[ERROR] gyp WARN EACCES user "root" does not have permission to access the dev dir "/root/.node-gyp/10.9.0"
[ERROR] gyp WARN EACCES attempting to reinstall using temporary dev dir "/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp"
[ERROR] gyp verb tmpdir == cwd automatically will remove dev files after to save disk space
[ERROR] gyp verb command install [ '--node_gyp_internal_noretry', '10.9.0' ]
[ERROR] gyp verb install input version string "10.9.0"
[ERROR] gyp verb install installing version: 10.9.0
[ERROR] gyp verb install --ensure was passed, so won't reinstall if already installed
[ERROR] gyp verb install version not already installed, continuing with install 10.9.0
[ERROR] gyp verb ensuring nodedir is created /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp/10.9.0
[ERROR] gyp WARN install got an error, rolling back install
[ERROR] gyp verb command remove [ '10.9.0' ]
[ERROR] gyp verb remove using node-gyp dir: /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp
[ERROR] gyp verb remove removing target version: 10.9.0
[ERROR] gyp verb remove removing development files for version: 10.9.0
[ERROR] gyp WARN install got an error, rolling back install
[ERROR] gyp verb command remove [ '10.9.0' ]
[ERROR] gyp verb remove using node-gyp dir: /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp
[ERROR] gyp verb remove removing target version: 10.9.0
[ERROR] gyp verb remove removing development files for version: 10.9.0
[ERROR] gyp ERR! configure error 
[ERROR] gyp ERR! stack Error: EACCES: permission denied, mkdir '/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass/.node-gyp'
[ERROR] gyp ERR! System Linux 3.10.0-1160.49.1.el7.x86_64
[ERROR] gyp ERR! command "/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node/node" "/root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-gyp/bin/node-gyp.js" "rebuild" "--verbose" "--libsass_ext=" "--libsass_cflags=" "--libsass_ldflags=" "--libsass_library="
[ERROR] gyp ERR! cwd /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/node-sass
[ERROR] gyp ERR! node -v v10.9.0
[ERROR] gyp ERR! node-gyp -v v3.8.0
[ERROR] gyp ERR! not ok 
[ERROR] Build failed with error code: 1
[INFO] 
[INFO] > husky@1.3.1 install /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard/node_modules/husky
[INFO] > node husky install
[INFO] 
[INFO] husky > setting up git hooks
[INFO] HUSKY_SKIP_INSTALL environment variable is set to 'true', skipping Git hooks installation.
[ERROR] added 1250 packages in 16.206s
[INFO] 
[INFO] --- frontend-maven-plugin:1.6:npm (npm run build) @ flink-runtime-web_2.11 ---
[INFO] Running 'npm run build' in /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard
[INFO] 
[INFO] > flink-dashboard@2.0.0 build /root/opensource/flink-1.13.5/flink-runtime-web/web-dashboard
[INFO] > ng build --prod --base-href ./
[INFO] 
[ERROR] Browserslist: caniuse-lite is outdated. Please run next command `npm update`
Killed
[root@hadoop103 flink-1.13.5]# npm update caniuse-lite browserslist
[root@hadoop103 flink-1.13.5]# npm update caniuse-lite browserslist
[root@hadoop103 flink-1.13.5]# npm i caniuse-lite browserslist -S
npm ERR! fetch failed https://registry.npmjs.org/browserslist/-/browserslist-4.20.2.tgz
npm WARN retry will retry, error on last attempt: Error: Parse Error
npm ERR! fetch failed https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001325.tgz
npm WARN retry will retry, error on last attempt: Error: Parse Error
npm ERR! fetch failed https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001325.tgz
npm WARN retry will retry, error on last attempt: Error: Parse Error
npm ERR! fetch failed https://registry.npmjs.org/browserslist/-/browserslist-4.20.2.tgz
npm WARN retry will retry, error on last attempt: Error: Parse Error
npm ERR! fetch failed https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001325.tgz
npm ERR! fetch failed https://registry.npmjs.org/browserslist/-/browserslist-4.20.2.tgz
npm ERR! Linux 3.10.0-1160.49.1.el7.x86_64
npm ERR! argv "/usr/bin/node" "/usr/bin/npm" "i" "caniuse-lite" "browserslist" "-S"
npm ERR! node v6.14.2
npm ERR! npm  v3.10.10
npm ERR! code HPE_UNEXPECTED_CONTENT_LENGTH

npm ERR! Parse Error
npm ERR! 
npm ERR! If you need help, you may report this error at:
npm ERR!     <https://github.com/npm/npm/issues>

npm ERR! Please include the following file with any support request:
npm ERR!     /root/opensource/flink-1.13.5/npm-debug.log

2. Filling the pits

2.1 Pit 1: slow npm downloads in flink-runtime-web

cd xxxx/flink
mvn clean install -DskipTests -Dfast -T 4 -Dmaven.compile.fork=true  -Dscala-2.11

Breakdown of the flags:
  -DskipTests               skip the test phase
  -Dfast                    skip doc checks, QA plugins and other non-essential steps
  -T 4                      build with 4 threads on multi-core machines to speed things up; recommended with Maven 3.3+
  -Dmaven.compile.fork=true compile in forked JVMs, enabling parallel compilation; recommended with Maven 3.3+

(Note: this command uses -Dscala-2.11 while the resume command in section 2.3 uses -Dscala-2.12; pick one Scala profile and keep it consistent across runs.)

(1) The flink-runtime-web module has to download npm packages from the public registry, which is slow or unreachable; before running the build command, edit that module's pom.xml: search for "ci --cache-max=0 --no-save" and replace it with "install -registry=https://registry.npm.taobao.org --cache-max=0 --no-save".

Author: FishMAN__
Source: https://www.jianshu.com/p/66a3cf379042
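The same replacement as a one-liner, as a sketch (the sed pattern is exactly the string quoted above):

```shell
# Sketch: swap flink-runtime-web's `npm ci` arguments for an `npm install`
# that goes through the taobao mirror, as described above.
patch_npm_registry() {
  local pom="$1"
  # `|` is the sed delimiter so the URL's slashes need no escaping
  sed -i.bak \
    's|ci --cache-max=0 --no-save|install -registry=https://registry.npm.taobao.org --cache-max=0 --no-save|' \
    "$pom"
}
```

For example, `patch_npm_registry flink-runtime-web/pom.xml` from the source root.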

Recompile:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Flink : Table : Planner 1.13.5:
[INFO] 
[INFO] Flink : Table : Planner ............................ SUCCESS [02:49 min]
[INFO] Flink : Formats : .................................. SUCCESS [  4.462 s]
[INFO] Flink : Format : Common ............................ SUCCESS [  0.279 s]
[INFO] Flink : Table : SQL Parser Hive .................... SUCCESS [ 16.250 s]
[INFO] Flink : Table : Runtime Blink ...................... SUCCESS [ 21.968 s]
[INFO] Flink : Table : Planner Blink ...................... FAILURE [ 58.828 s]
[INFO] Flink : Formats : Json ............................. SKIPPED
[INFO] Flink : Connectors : Elasticsearch base ............ SKIPPED
[INFO] Flink : Connectors : Elasticsearch 5 ............... SKIPPED
[INFO] Flink : Connectors : Elasticsearch 6 ............... SKIPPED
[INFO] Flink : Connectors : Elasticsearch 7 ............... SKIPPED
[INFO] Flink : Connectors : HBase base .................... SUCCESS [  7.488 s]
[INFO] Flink : Connectors : HBase 1.4 ..................... SKIPPED
[INFO] Flink : Connectors : HBase 2.2 ..................... SKIPPED
[INFO] Flink : Formats : Hadoop bulk ...................... SUCCESS [  1.868 s]
[INFO] Flink : Formats : Orc .............................. SKIPPED
[INFO] Flink : Formats : Orc nohive ....................... SKIPPED
[INFO] Flink : Formats : Avro ............................. SKIPPED
[INFO] Flink : Formats : Parquet .......................... SKIPPED
[INFO] Flink : Formats : Csv .............................. SKIPPED
[INFO] Flink : Connectors : Hive .......................... SKIPPED
[INFO] Flink : Connectors : JDBC .......................... SKIPPED
[INFO] Flink : Connectors : RabbitMQ ...................... SUCCESS [  2.197 s]
[INFO] Flink : Connectors : Twitter ....................... SUCCESS [  6.866 s]
[INFO] Flink : Connectors : Nifi .......................... SUCCESS [  2.562 s]
[INFO] Flink : Connectors : Cassandra ..................... SKIPPED
[INFO] Flink : Metrics : JMX .............................. SUCCESS [  1.353 s]
[INFO] Flink : Formats : Avro confluent registry .......... SKIPPED
[INFO] Flink : Connectors : Kafka ......................... SKIPPED
[INFO] Flink : Connectors : Google PubSub ................. SUCCESS [  4.299 s]
[INFO] Flink : Connectors : Kinesis ....................... SKIPPED
[INFO] Flink : Connectors : SQL : Elasticsearch 6 ......... SKIPPED
[INFO] Flink : Connectors : SQL : Elasticsearch 7 ......... SKIPPED
[INFO] Flink : Connectors : SQL : HBase 1.4 ............... SKIPPED
[INFO] Flink : Connectors : SQL : HBase 2.2 ............... SKIPPED
[INFO] Flink : Connectors : SQL : Hive 1.2.2 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 2.2.0 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 2.3.6 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Hive 3.1.2 .............. SKIPPED
[INFO] Flink : Connectors : SQL : Kafka ................... SKIPPED
[INFO] Flink : Connectors : SQL : Kinesis ................. SKIPPED
[INFO] Flink : Formats : Sequence file .................... SUCCESS [  1.608 s]
[INFO] Flink : Formats : Compress ......................... SUCCESS [  1.434 s]
[INFO] Flink : Formats : Avro AWS Glue Schema Registry .... SKIPPED
[INFO] Flink : Formats : SQL Orc .......................... SKIPPED
[INFO] Flink : Formats : SQL Parquet ...................... SKIPPED
[INFO] Flink : Formats : SQL Avro ......................... SKIPPED
[INFO] Flink : Formats : SQL Avro Confluent Registry ...... SKIPPED
[INFO] Flink : Examples : Streaming ....................... SKIPPED
[INFO] Flink : Examples : Table ........................... SKIPPED
[INFO] Flink : Examples : Build Helper : .................. SUCCESS [  0.401 s]
[INFO] Flink : Examples : Build Helper : Streaming Twitter  SKIPPED
[INFO] Flink : Examples : Build Helper : Streaming State machine SKIPPED
[INFO] Flink : Examples : Build Helper : Streaming Google PubSub SKIPPED
[INFO] Flink : Container .................................. SUCCESS [  1.244 s]
[INFO] Flink : Queryable state : Runtime .................. SUCCESS [  1.906 s]
[INFO] Flink : Mesos ...................................... SUCCESS [ 57.883 s]
[INFO] Flink : Kubernetes ................................. SUCCESS [ 19.016 s]
[INFO] Flink : Yarn ....................................... SUCCESS [  6.343 s]
[INFO] Flink : Libraries : Gelly .......................... SUCCESS [  8.045 s]
[INFO] Flink : Libraries : Gelly scala .................... SUCCESS [ 42.796 s]
[INFO] Flink : Libraries : Gelly Examples ................. SKIPPED
[INFO] Flink : External resources : ....................... SUCCESS [  0.458 s]
[INFO] Flink : External resources : GPU ................... SUCCESS [  0.462 s]
[INFO] Flink : Metrics : Dropwizard ....................... SUCCESS [  1.061 s]
[INFO] Flink : Metrics : Graphite ......................... SUCCESS [  0.773 s]
[INFO] Flink : Metrics : InfluxDB ......................... SUCCESS [  3.366 s]
[INFO] Flink : Metrics : Prometheus ....................... SUCCESS [  1.748 s]
[INFO] Flink : Metrics : StatsD ........................... SUCCESS [  0.825 s]
[INFO] Flink : Metrics : Datadog .......................... SUCCESS [  1.147 s]
[INFO] Flink : Metrics : Slf4j ............................ SUCCESS [  0.882 s]
[INFO] Flink : Libraries : CEP Scala ...................... SUCCESS [ 34.466 s]
[INFO] Flink : Table : Uber ............................... SKIPPED
[INFO] Flink : Table : Uber Blink ......................... SKIPPED
[INFO] Flink : Python ..................................... SKIPPED
[INFO] Flink : Table : SQL Client ......................... SKIPPED
[INFO] Flink : Libraries : State processor API ............ SUCCESS [  3.106 s]
[INFO] Flink : Dist ....................................... SKIPPED
[INFO] Flink : Yarn Tests ................................. SKIPPED
[INFO] Flink : E2E Tests : ................................ SKIPPED
[INFO] Flink : E2E Tests : CLI ............................ SKIPPED
[INFO] Flink : E2E Tests : Parent Child classloading program SKIPPED
[INFO] Flink : E2E Tests : Parent Child classloading lib-package SKIPPED
[INFO] Flink : E2E Tests : Dataset allround ............... SKIPPED
[INFO] Flink : E2E Tests : Dataset Fine-grained recovery .. SKIPPED
[INFO] Flink : E2E Tests : Datastream allround ............ SKIPPED
[INFO] Flink : E2E Tests : Batch SQL ...................... SKIPPED
[INFO] Flink : E2E Tests : Stream SQL ..................... SKIPPED
[INFO] Flink : E2E Tests : Distributed cache via blob ..... SKIPPED
[INFO] Flink : E2E Tests : High parallelism iterations .... SKIPPED
[INFO] Flink : E2E Tests : Stream stateful job upgrade .... SKIPPED
[INFO] Flink : E2E Tests : Queryable state ................ SKIPPED
[INFO] Flink : E2E Tests : Local recovery and allocation .. SKIPPED
[INFO] Flink : E2E Tests : Elasticsearch 5 ................ SKIPPED
[INFO] Flink : E2E Tests : Elasticsearch 6 ................ SKIPPED
[INFO] Flink : Quickstart : ............................... SUCCESS [  1.541 s]
[INFO] Flink : Quickstart : Java .......................... SUCCESS [  2.873 s]
[INFO] Flink : Quickstart : Scala ......................... SUCCESS [  0.258 s]
[INFO] Flink : E2E Tests : Quickstart ..................... SKIPPED
[INFO] Flink : E2E Tests : Confluent schema registry ...... SKIPPED
[INFO] Flink : E2E Tests : Stream state TTL ............... SKIPPED
[INFO] Flink : E2E Tests : SQL client ..................... SKIPPED
[INFO] Flink : E2E Tests : File sink ...................... SKIPPED
[INFO] Flink : E2E Tests : State evolution ................ SKIPPED
[INFO] Flink : E2E Tests : RocksDB state memory control ... SKIPPED
[INFO] Flink : E2E Tests : Common ......................... SKIPPED
[INFO] Flink : E2E Tests : Metrics availability ........... SKIPPED
[INFO] Flink : E2E Tests : Metrics reporter prometheus .... SKIPPED
[INFO] Flink : E2E Tests : Heavy deployment ............... SKIPPED
[INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kafka ................ SKIPPED
[INFO] Flink : E2E Tests : Plugins : ...................... SKIPPED
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SKIPPED
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SKIPPED
[INFO] Flink : E2E Tests : TPCH ........................... SKIPPED
[INFO] Flink : E2E Tests : Streaming Kinesis .............. SKIPPED
[INFO] Flink : E2E Tests : Elasticsearch 7 ................ SKIPPED
[INFO] Flink : E2E Tests : Common Kafka ................... SKIPPED
[INFO] Flink : E2E Tests : TPCDS .......................... SKIPPED
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SKIPPED
[INFO] Flink : E2E Tests : Python ......................... SKIPPED
[INFO] Flink : E2E Tests : HBase .......................... SKIPPED
[INFO] Flink : E2E Tests : AWS Glue Schema Registry ....... SKIPPED
[INFO] Flink : State backends : Heap spillable ............ SUCCESS [  1.198 s]
[INFO] Flink : Contrib : .................................. SUCCESS [  0.504 s]
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [  1.566 s]
[INFO] Flink : FileSystems : Tests ........................ SKIPPED
[INFO] Flink : Docs ....................................... SKIPPED
[INFO] Flink : Walkthrough : .............................. SUCCESS [  0.476 s]
[INFO] Flink : Walkthrough : Common ....................... SUCCESS [  1.539 s]
[INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [  0.261 s]
[INFO] Flink : Walkthrough : Datastream Scala ............. SUCCESS [  0.295 s]
[INFO] Flink : Tools : CI : Java .......................... SUCCESS [  1.627 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  02:52 min (Wall Clock)
[INFO] Finished at: 2022-04-07T11:19:11+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project flink-table-planner-blink_2.12: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 137 (Exit value: 137) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project flink-table-planner-blink_2.12: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 137 (Exit value: 137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:186)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
Caused by: org.apache.maven.plugin.MojoExecutionException: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 137 (Exit value: 137)
    at scala_maven.ScalaMojoSupport.execute (ScalaMojoSupport.java:490)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:186)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
Caused by: org.apache.commons.exec.ExecuteException: Process exited with an error: 137 (Exit value: 137)
    at org.apache.commons.exec.DefaultExecutor.executeInternal (DefaultExecutor.java:377)
    at org.apache.commons.exec.DefaultExecutor.execute (DefaultExecutor.java:160)
    at org.apache.commons.exec.DefaultExecutor.execute (DefaultExecutor.java:147)
    at scala_maven_executions.JavaMainCallerByFork.run (JavaMainCallerByFork.java:100)
    at scala_maven.ScalaCompilerSupport.compile (ScalaCompilerSupport.java:161)
    at scala_maven.ScalaCompilerSupport.doExecute (ScalaCompilerSupport.java:99)
    at scala_maven.ScalaMojoSupport.execute (ScalaMojoSupport.java:482)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:196)
    at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call (MultiThreadedBuilder.java:186)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call (Executors.java:511)
    at java.util.concurrent.FutureTask.run (FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :flink-table-planner-blink_2.12

2.2 Pit 2: the repository has no confluent version newer than 5.3.0

[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  28.374 s (Wall Clock)
[INFO] Finished at: 2022-04-07T11:34:06+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project flink-avro-confluent-registry: Could not resolve dependencies for project org.apache.flink:flink-avro-confluent-registry:jar:1.13.5: io.confluent:kafka-schema-registry-client:jar:5.5.2 was not found in http://maven.aliyun.com/nexus/content/groups/public during a previous attempt. This failure was cached in the local repository and resolution is not reattempted until the update interval of nexus-aliyun has elapsed or updates are forced -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:

Cause: searching the Maven repository shows that only version 5.3.0 is available in the mirror.
Fix: change the version in the POM.

[root@hadoop103 flink-formats]# vim flink-avro-confluent-registry/pom.xml
<confluent.version>5.3.0</confluent.version>
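An optional sanity check before rebuilding is to ask Maven to resolve the pinned artifact directly. `mvn dependency:get` is a standard maven-dependency-plugin goal; the repository URL below is the aliyun mirror from the error message above:

```shell
# Build the resolution command for the pinned confluent artifact;
# running it from the source root fails fast if the version is still missing.
coord="io.confluent:kafka-schema-registry-client:5.3.0"
check_cmd="mvn dependency:get -Dartifact=${coord} -DremoteRepositories=http://maven.aliyun.com/nexus/content/groups/public"
echo "$check_cmd"
```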

2.3 Pit 3: a build error mid-way, and how to resume from where it failed

Exit code 137 (128 + 9) means the forked compiler process was killed with SIGKILL, usually by the Linux OOM killer; free up memory or lower the -T thread count before retrying. Maven's -rf flag names the failed module and resumes the reactor from there instead of rebuilding everything.
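For reference, a Unix exit code above 128 can be decoded mechanically (code minus 128 is the signal number); this helper is an illustration, not part of the build:

```shell
# Map an exit code to a human-readable cause. 137 -> signal 9 (SIGKILL),
# which during heavy Scala compilation usually points at the OOM killer;
# `dmesg | grep -i kill` on the build host can confirm.
explain_exit_code() {
  local code="$1"
  if [ "$code" -gt 128 ]; then
    echo "killed by signal $((code - 128))"
  else
    echo "exited with status $code"
  fi
}
```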

Build command:

[root@hadoop103 flink-1.13.5]# mvn clean install -DskipTests -Dfast -T 4 -Dmaven.compile.fork=true -Dscala-2.12 -rf :flink-avro-confluent-registry

[INFO] Flink : E2E Tests : Connectors : Google PubSub ..... SUCCESS [ 21.321 s]
[INFO] Flink : E2E Tests : Streaming Kafka base ........... SUCCESS [  0.210 s]
[INFO] Flink : E2E Tests : Streaming Kafka ................ SUCCESS [  7.944 s]
[INFO] Flink : E2E Tests : Plugins : ...................... SUCCESS [  0.113 s]
[INFO] Flink : E2E Tests : Plugins : Dummy fs ............. SUCCESS [  0.095 s]
[INFO] Flink : E2E Tests : Plugins : Another dummy fs ..... SUCCESS [  0.148 s]
[INFO] Flink : E2E Tests : TPCH ........................... SUCCESS [  1.719 s]
[INFO] Flink : E2E Tests : Streaming Kinesis .............. SUCCESS [ 19.961 s]
[INFO] Flink : E2E Tests : Elasticsearch 7 ................ SUCCESS [  4.154 s]
[INFO] Flink : E2E Tests : Common Kafka ................... SUCCESS [ 56.202 s]
[INFO] Flink : E2E Tests : TPCDS .......................... SUCCESS [  1.079 s]
[INFO] Flink : E2E Tests : Netty shuffle memory control ... SUCCESS [  0.152 s]
[INFO] Flink : E2E Tests : Python ......................... SUCCESS [  8.451 s]
[INFO] Flink : E2E Tests : HBase .......................... SUCCESS [  2.208 s]
[INFO] Flink : E2E Tests : AWS Glue Schema Registry ....... SUCCESS [ 23.854 s]
[INFO] Flink : State backends : Heap spillable ............ SUCCESS [  0.517 s]
[INFO] Flink : Contrib : .................................. SUCCESS [  0.108 s]
[INFO] Flink : Contrib : Connectors : Wikiedits ........... SUCCESS [  0.858 s]
[INFO] Flink : FileSystems : Tests ........................ SUCCESS [  2.662 s]
[INFO] Flink : Docs ....................................... SUCCESS [  3.249 s]
[INFO] Flink : Walkthrough : .............................. SUCCESS [  0.423 s]
[INFO] Flink : Walkthrough : Common ....................... SUCCESS [  0.433 s]
[INFO] Flink : Walkthrough : Datastream Java .............. SUCCESS [  0.133 s]
[INFO] Flink : Walkthrough : Datastream Scala ............. SUCCESS [  0.222 s]
[INFO] Flink : Tools : CI : Java .......................... SUCCESS [  2.507 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  07:46 min (Wall Clock)
[INFO] Finished at: 2022-04-07T11:59:03+08:00
[INFO] ------------------------------------------

The final distribution ends up in the build-target directory:

/root/opensource/flink-1.13.5/build-target
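A quick sanity check of the produced distribution, assuming the standard Flink binary layout (bin/, lib/, conf/):

```shell
# Verify the build output looks like a complete Flink distribution
# before trying to deploy it; layout assumed from a standard Flink dist.
check_flink_dist() {
  local dist="$1"
  for f in bin/flink bin/start-cluster.sh lib conf; do
    [ -e "$dist/$f" ] || { echo "missing: $f"; return 1; }
  done
  echo "dist looks complete"
}
```

For example, `check_flink_dist /root/opensource/flink-1.13.5/build-target`, then `./bin/start-cluster.sh` from that directory to bring up a standalone cluster.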

Summary

The build succeeded; after testing, the server comes up and accepts jobs submitted from the client.

Copyright notice: this is an original post by the author, released under the CC 4.0 BY-SA license; please include the original source link and this notice when reposting.
Original: https://blog.csdn.net/spark_dev/article/details/124008534
