<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-runtime-web_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-statebackend-rocksdb_2.12</artifactId>
    <version>1.13.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-blink_2.12</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-common</artifactId>
    <version>${flink.version}</version>
</dependency>
<!-- Kafka dependencies -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.8.1</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-tools</artifactId>
    <version>2.8.1</version>
</dependency>
<dependency>
    <groupId>com.ververica</groupId>
    <artifactId>flink-connector-oracle-cdc</artifactId>
    <version>2.2.0</version>
</dependency>
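With these dependencies on the classpath, a job of the shape sketched below is what drives the failing code path (the call into DebeziumSourceFunction.run at the bottom of the stack trace). This is only a minimal sketch: the builder calls follow the flink-connector-oracle-cdc 2.2 documented example, the hostname, credentials, schema and table names are placeholders, and the MySQL sink used in the real pipeline is replaced with print().

import com.ververica.cdc.connectors.oracle.OracleSource;
import com.ververica.cdc.debezium.JsonDebeziumDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;

public class OracleCdcJob {
    public static void main(String[] args) throws Exception {
        // Oracle CDC source backed by the Debezium embedded engine,
        // which is where kafka-clients classes are needed at runtime.
        SourceFunction<String> source = OracleSource.<String>builder()
                .hostname("localhost")          // placeholder connection details
                .port(1521)
                .database("XE")
                .schemaList("FLINKUSER")        // placeholder schema
                .tableList("FLINKUSER.ORDERS")  // placeholder table
                .username("flinkuser")
                .password("flinkpw")
                .deserializer(new JsonDebeziumDeserializationSchema())
                .build();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.enableCheckpointing(3000);          // checkpointing lets the CDC source track offsets
        env.addSource(source).print().setParallelism(1);
        env.execute("oracle-cdc-demo");
    }
}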
Execution fails with the following error:
org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.runtime.minicluster.MiniClusterJobClient.lambda$getJobExecutionResult$3(MiniClusterJobClient.java:137)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$0(AkkaInvocationHandler.java:237)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:1081)
at akka.dispatch.OnComplete.internal(Future.scala:264)
at akka.dispatch.OnComplete.internal(Future.scala:261)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:191)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:188)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:73)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:284)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:573)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:22)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:21)
at scala.concurrent.Future.$anonfun$andThen$1(Future.scala:532)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:207)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:197)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:188)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:677)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:435)
at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
... 4 more
Caused by: java.lang.NoSuchMethodError: org.apache.kafka.common.utils.Utils.isBlank(Ljava/lang/String;)Z
at org.apache.kafka.connect.runtime.WorkerConfig$ListenersValidator.ensureValid(WorkerConfig.java:429)
at org.apache.kafka.common.config.ConfigDef$ConfigKey.<init>(ConfigDef.java:1131)
at org.apache.kafka.common.config.ConfigDef.define(ConfigDef.java:152)
at org.apache.kafka.common.config.ConfigDef.define(ConfigDef.java:172)
at org.apache.kafka.common.config.ConfigDef.define(ConfigDef.java:211)
at org.apache.kafka.common.config.ConfigDef.define(ConfigDef.java:373)
at org.apache.kafka.connect.runtime.WorkerConfig.baseConfigDef(WorkerConfig.java:252)
at io.debezium.embedded.EmbeddedEngine$EmbeddedConfig.<clinit>(EmbeddedEngine.java:1101)
at io.debezium.embedded.EmbeddedEngine.<init>(EmbeddedEngine.java:600)
at io.debezium.embedded.EmbeddedEngine.<init>(EmbeddedEngine.java:81)
at io.debezium.embedded.EmbeddedEngine$BuilderImpl.build(EmbeddedEngine.java:302)
at io.debezium.embedded.EmbeddedEngine$BuilderImpl.build(EmbeddedEngine.java:218)
at io.debezium.embedded.ConvertingEngineBuilder.build(ConvertingEngineBuilder.java:151)
at com.ververica.cdc.debezium.DebeziumSourceFunction.run(DebeziumSourceFunction.java:422)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:269)
Solution:
The root-cause frame shows the Debezium embedded engine (brought in by flink-connector-oracle-cdc) calling org.apache.kafka.common.utils.Utils.isBlank, while the pom pins kafka-clients to 2.8.1. Inspecting the kafka-clients source confirms that isBlank() does not exist in 2.8.1; comparing releases shows the method was only added in 3.0, so upgrading to 3.1.2 resolves the problem.
Adjust the jar versions:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>3.1.2</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-tools</artifactId>
    <version>3.1.2</version>
</dependency>
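To confirm which kafka-clients jar actually wins on the runtime classpath (before and after the version bump), a small check like the sketch below can help. The class name is made up for illustration; it only uses standard reflection APIs.

import org.apache.kafka.common.utils.Utils;

public class KafkaClientsCheck {
    public static void main(String[] args) {
        // Print the jar that Utils was loaded from, i.e. which version "won" the conflict.
        System.out.println("Utils loaded from: "
                + Utils.class.getProtectionDomain().getCodeSource().getLocation());
        try {
            // isBlank(String) is the method the Debezium embedded engine's
            // WorkerConfig expects; per the analysis above it only exists in kafka-clients 3.0+.
            Utils.class.getMethod("isBlank", String.class);
            System.out.println("isBlank(String) found -> kafka-clients is new enough");
        } catch (NoSuchMethodException e) {
            System.out.println("isBlank(String) missing -> expect NoSuchMethodError at runtime");
        }
    }
}

The same information is available at build time from mvn dependency:tree -Dincludes=org.apache.kafka, which shows where each org.apache.kafka artifact and version comes from.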