flink-be-god's Introduction
flink-be-god's Issues
Flink custom SQL connector: class-not-found problem (flink-sql-connector-customized)
Hello,
my problems are as follows:
1. Running the code in flink-sql-connector-customized as-is fails with a class-not-found error. Could you provide a runnable demo?
2. Comparing with the Elasticsearch sink source code, the Factory and TableFactory implementations differ slightly: one implements DynamicTableSinkFactory, the other StreamTableSinkFactory. Your article says StreamTableSinkFactory is the legacy API, so why does the master branch point to the Dynamic implementation everywhere?
Test code (created directly under flink-sql-connector-customized):
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.RichParallelSourceFunction;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

import static org.apache.flink.table.api.Expressions.$;

public class UserDefineSqlConnector {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(1);
        EnvironmentSettings bsSettings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
        StreamTableEnvironment bsTableEnv = StreamTableEnvironment.create(env, bsSettings);

        DataStream<String> temp = env.addSource(new RichParallelSourceFunction<String>() {
            boolean label = true;
            int loop = 10;

            @Override
            public void run(SourceContext<String> ctx) throws Exception {
                while (label && loop > 0) {
                    loop--;
                    ctx.collect("message");
                }
            }

            @Override
            public void cancel() {
                label = false;
            }
        });

        bsTableEnv.createTemporaryView("temp", temp, $("source"));
        bsTableEnv.executeSql("select * from temp").print();
        bsTableEnv.executeSql("CREATE TABLE user_behavior_sink (\n" +
                " user_id String\n" +
                ") WITH (\n" +
                " 'connector' = 'customized',\n" +
                " 'job' = 'test',\n" +
                " 'metrics' = 'aaa',\n" +
                " 'address' = 'bbb'\n" +
                ")");
        bsTableEnv.executeSql("insert into user_behavior_sink select * from temp");
        bsTableEnv.execute("");
    }
}
Error message:
Exception in thread "main" org.apache.flink.table.api.ValidationException: Unable to create a sink for writing table 'default_catalog.default_database.user_behavior_sink'.
Table options are:
'address'='bbb'
'connector'='customized'
'job'='test'
'metrics'='aaa'
at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:164)
at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:344)
at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:204)
at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:163)
at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:163)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:163)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1264)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:700)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:787)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:690)
at UserDefineSqlConnector.main(UserDefineSqlConnector.java:46)
Caused by: org.apache.flink.table.api.ValidationException: Cannot discover a connector using option ''connector'='customized''.
at org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:329)
at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:157)
... 18 more
Caused by: org.apache.flink.table.api.ValidationException: Could not find any factory for identifier 'customized' that implements 'org.apache.flink.table.factories.DynamicTableSinkFactory' in the classpath.
Available factory identifiers are:
blackhole
elasticsearch-6
kafka
print
at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:240)
at org.apache.flink.table.factories.FactoryUtil.getDynamicTableFactory(FactoryUtil.java:326)
... 19 more
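The "Could not find any factory for identifier 'customized'" error above generally means the factory was never registered through Java SPI: FactoryUtil only discovers factories listed in a `META-INF/services` service file on the classpath, and the available identifiers it prints (blackhole, elasticsearch-6, kafka, print) are exactly the registered ones. A minimal sketch of the registration, assuming the factory class is named com.example.CustomizedDynamicTableSinkFactory (a hypothetical name; use the real fully-qualified class from flink-sql-connector-customized):

```
# Service file, on the classpath of the connector module
# (e.g. under src/main/resources):
#
#   META-INF/services/org.apache.flink.table.factories.Factory
#
# Its content is one fully-qualified factory class per line
# (hypothetical class name shown):
com.example.CustomizedDynamicTableSinkFactory
```

This likely also answers question 2: DynamicTableSinkFactory is the new factory interface introduced with FLIP-95 in Flink 1.11, which is why master points at the Dynamic implementations, while StreamTableSinkFactory belongs to the legacy table-sink stack.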
Async I/O: Vert.x cannot connect to MySQL and keeps reporting a timeout
The database configuration is correct, but the connection times out, and there is no error message explaining why it cannot connect.
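A persistent connect timeout from an async client usually points to a network-level problem (firewall, wrong host or port, MySQL bind-address) rather than a driver setting. A JDK-only reachability probe, run on the same machine as the TaskManager, can separate the two cases; the host and port below are placeholders, not values from the repository:

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    public static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Placeholder MySQL endpoint; substitute the real host and port.
        System.out.println(reachable("127.0.0.1", 3306, 2000));
    }
}
```

If the probe fails, Vert.x cannot connect either, no matter how the client is configured.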
Job succeeds, but the log contains an error
My code is almost identical to yours, and the job finishes as SUCCEEDED on YARN, but the log shows this error:
2021-11-25 19:12:00,145 ERROR org.apache.flink.runtime.rest.handler.job.coordination.ClientCoordinationHandler [] - Unhandled exception.
org.apache.flink.runtime.messages.FlinkJobNotFoundException: Could not find Flink job (33f61437cffb74ebf8b67266e257eee0)
at org.apache.flink.runtime.dispatcher.Dispatcher.getJobMasterGateway(Dispatcher.java:799) ~[realtime-ods.jar:3.3.2]
at org.apache.flink.runtime.dispatcher.Dispatcher.performOperationOnJobMasterGateway(Dispatcher.java:809) ~[realtime-ods.jar:3.3.2]
at org.apache.flink.runtime.dispatcher.Dispatcher.deliverCoordinationRequestToCoordinator(Dispatcher.java:655) ~[realtime-ods.jar:3.3.2]
at sun.reflect.GeneratedMethodAccessor13.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_202]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_202]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:286) ~[realtime-ods.jar:3.3.2]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:201) ~[realtime-ods.jar:3.3.2]
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74) ~[realtime-ods.jar:3.3.2]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:154) ~[realtime-ods.jar:3.3.2]
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26) ~[realtime-ods.jar:3.3.2]
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21) ~[realtime-ods.jar:3.3.2]
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123) ~[realtime-ods.jar:3.3.2]
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21) ~[realtime-ods.jar:3.3.2]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170) ~[realtime-ods.jar:3.3.2]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) ~[realtime-ods.jar:3.3.2]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) ~[realtime-ods.jar:3.3.2]
at akka.actor.Actor$class.aroundReceive(Actor.scala:517) ~[realtime-ods.jar:3.3.2]
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225) ~[realtime-ods.jar:3.3.2]
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592) ~[realtime-ods.jar:3.3.2]
at akka.actor.ActorCell.invoke(ActorCell.scala:561) ~[realtime-ods.jar:3.3.2]
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258) ~[realtime-ods.jar:3.3.2]
at akka.dispatch.Mailbox.run(Mailbox.scala:225) ~[realtime-ods.jar:3.3.2]
at akka.dispatch.Mailbox.exec(Mailbox.scala:235) ~[realtime-ods.jar:3.3.2]
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [realtime-ods.jar:3.3.2]
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [realtime-ods.jar:3.3.2]
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [realtime-ods.jar:3.3.2]
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [realtime-ods.jar:3.3.2]
2021-11-25 19:12:00,277 INFO org.apache.flink.runtime.dispatcher.MiniDispatcher [] - Shutting down cluster because someone retrieved the job result.
2021-11-25 19:12:00,277 INFO org.apache.flink.runtime.entrypoint.ClusterEntrypoint [] - Shutting YarnJobClusterEntrypoint down with application status SUCCEEDED. Diagnostics null.
2021-11-25 19:12:00,277 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint [] - Shutting down rest endpoint.
2021-11-25 19:12:00,306 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint [] - Removing cache directory /tmp/flink-web-fbb7bb3f-2a1e-4d4f-8c2e-65abbaa54c3e/flink-web-ui
2021-11-25 19:12:00,306 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint [] - http://ip:3584 lost leadership
2021-11-25 19:12:00,307 INFO org.apache.flink.runtime.jobmaster.MiniDispatcherRestEndpoint [] - Shut down complete.
2021-11-25 19:12:00,307 INFO org.apache.flink.runtime.resourcemanager.active.ActiveResourceManager [] - Shut down cluster because application is in SUCCEEDED, diagnostics null.
2021-11-25 19:12:00,308 INFO org.apache.flink.yarn.YarnResourceManagerDriver [] - Unregister application from the YARN Resource Manager with final status SUCCEEDED.
2021-11-25 19:12:00,314 INFO org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl [] - Waiting for application to be successfully unregistered.
2021-11-25 19:12:00,415 INFO org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl [] - Waiting for application to be successfully unregistered.
2021-11-25 19:12:00,902 INFO org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent [] - Closing components.
2021-11-25 19:12:00,903 INFO org.apache.flink.runtime.dispatcher.runner.JobDispatcherLeaderProcess [] - Stopping JobDispatcherLeaderProcess.
2021-11-25 19:12:00,904 INFO org.apache.flink.runtime.dispatcher.MiniDispatcher [] - Stopping dispatcher akka.tcp://[email protected]:8600/user/rpc/dispatcher_1.
2021-11-25 19:12:00,904 INFO org.apache.flink.runtime.dispatcher.MiniDispatcher [] - Stopping all currently running jobs of dispatcher akka.tcp://[email protected]:8600/user/rpc/dispatcher_1.
2021-11-25 19:12:00,904 INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.BackPressureRequestCoordinator [] - Shutting down back pressure request coordinator.
2021-11-25 19:12:00,905 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy [] - Opening proxy : uatbigdata01.ip:45454
2021-11-25 19:12:00,904 INFO org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl [] - Interrupted while waiting for queue
java.lang.InterruptedException: null
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014) ~[?:1.8.0_202]
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048) ~[?:1.8.0_202]
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442) ~[?:1.8.0_202]
at org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl$CallbackHandlerThread.run(AMRMClientAsyncImpl.java:274) [flink
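The FlinkJobNotFoundException in the log above is typically a benign shutdown race in per-job mode: as the surrounding INFO lines show, the MiniDispatcher shuts the cluster down as soon as the client retrieves the job result, and a late REST request for the already-removed job then fails. The final YARN status is the authoritative signal; it can be checked with the standard YARN CLI (the application id is a placeholder):

```
yarn application -status <applicationId>
```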