
datalinkdc / dinky

2.9K stars · 38 watchers · 1.1K forks · 32.64 MB

Dinky is a real-time data development platform based on Apache Flink, enabling agile data development, deployment and operation.

Home Page: http://www.dinky.org.cn

License: Apache License 2.0

Java 63.25% Shell 0.11% JavaScript 0.14% TypeScript 23.58% Less 0.17% Batchfile 0.01% Dockerfile 0.02% CSS 0.01% Python 0.05% PLpgSQL 12.59% FreeMarker 0.04% Smarty 0.02%
flink flinksql real-time-computing-platform flinkcdc olap sql datalake datawarehouse

dinky's People

Contributors

18216499322, aiwenmo, boolean-dev, chengchuen, codertomato, forus0322, gaogao110, gaoyan1998, hxp0618, javaht, jinyanhui2008, jpengcheng, leechor, leeoo, lewnn, mydq, pandas886, suxinshuo, walkhan, wmtbnbo, wuzhenhua01, xiaolin84250, xiebanggui777, yangzehan, yqwoe, zackyoungh, zhangyongtian, zhuangchong, ziqiang-wang, zzm0809


dinky's Issues

[Optimization] After a successful build of the main branch, the extracted package has no plugins folder or flink/lib link; they must be created by hand. Please create them automatically in the directory tree.

Many users in the community group do not read the tutorial, do not know where plugins lives, and do not know that the flink/lib jars must be placed inside it, so startup often fails with missing dependencies. To avoid this confusion, the plugins folder and the Flink dependency link should be created automatically in the extracted directory tree, for example by adding logic like the following to the extracted build's auto.sh:

if [ ! -d "./plugins" ];then
echo '创建plugins'
mkdir plugins
cd plugins
if [ ! -d ${FLINK_HOME} ];then
echo '没有FLINK_HOME环境变量,请指定FLINK_HOME环境变量来引用Flink/lib'
else
ln -s ${FLINK_HOME}/lib
fi
fi
This would cut down on support questions and let the dependencies be referenced automatically at startup.

Error when writing data out to MySQL

CREATE TABLE log (
`database` string,
`table` string,
type string,
ts bigint,
data ROW<id bigint,insure_num string,case_code string,created_at string,applicant_phone string>
) WITH (
'connector.type' = 'kafka',
'connector.version' = 'universal',
'connector.topic' = 'xx',
'connector.startup-mode' = 'earliest-offset',
'connector.properties.0.key' = 'group.id',
'connector.properties.0.value' = 'xxx',
'connector.properties.1.key' = 'bootstrap.servers',
'connector.properties.1.value' = '192.168.0.0:9092',
'update-mode' = 'append',
'format.type' = 'json',
'format.derive-schema' = 'true'
);

CREATE TABLE sink_table (
ts bigint,
id varchar(100),
nick varchar(100),
mobile varchar(100),
create_time varchar(100)
) WITH (
'connector.type' = 'jdbc',
'connector.driver' = 'com.mysql.cj.jdbc.Driver',
'connector.url' = 'jdbc:mysql://192.168.0.0:0/xxx',
'connector.table' = 'xxx',
'connector.database' = 'xx',
'connector.username' = 'xx',
'connector.password' = 'xxx',
'connector.write.flush.max-rows' = '5',
'connector.write.flush.interval' = '5s'
);

use userportrait;
INSERT INTO userportrait.sink_table
select ts,data.id,data.case_code,data.applicant_phone,data.created_at from log;

Execution configuration (screenshot omitted)

Error message:

2022-02-17T18:07:12.607:Exception in executing FlinkSQL:
use userportrait
Error message: A database with name [userportrait] does not exist in the catalog: [default_catalog].

PrintStackTrace <<<
org.apache.flink.table.catalog.CatalogManager.setCurrentDatabase(CatalogManager.java:282)
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:1009)
org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:665)
com.dlink.executor.Executor.executeSql(Executor.java:187)
com.dlink.job.JobManager.executeSql(JobManager.java:241)
com.dlink.service.impl.StudioServiceImpl.executeFlinkSql(StudioServiceImpl.java:97)
com.dlink.service.impl.StudioServiceImpl.executeSql(StudioServiceImpl.java:81)
com.dlink.controller.StudioController.executeSql(StudioController.java:38)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:150)
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:117)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895)
org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808)
org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87)
org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1067)
org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:963)
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006)
org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:909)
javax.servlet.http.HttpServlet.service(HttpServlet.java:681)
org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883)
javax.servlet.http.HttpServlet.service(HttpServlet.java:764)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:227)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
com.alibaba.druid.support.http.WebStatFilter.doFilter(WebStatFilter.java:124)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201)
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162)
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:197)
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97)
org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:540)
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:135)
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78)
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357)
org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:382)
org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65)
org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:895)
org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1732)
org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191)
org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659)
org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
java.lang.Thread.run(Thread.java:748)
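
The tables above are declared in the in-memory default_catalog's default_database, so `use userportrait` (and the userportrait. prefix) refers to a database that was never created. A minimal sketch of two possible fixes, assuming no persistent catalog is registered:

-- Option 1: drop the USE statement and the qualifier; the tables live in default_database
INSERT INTO sink_table
SELECT ts, data.id, data.case_code, data.applicant_phone, data.created_at FROM log;

-- Option 2: create the database first, then declare the sink table inside it
CREATE DATABASE IF NOT EXISTS userportrait;
USE userportrait;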

[Optimization] After registering a data source, allow quick-referencing its tables in the editor for quick DDL filling

If a registered data source can generate table DDL and persist it, the editor could quick-fill a table's DDL after the data source name is typed.

(screenshot)
For example, as in the screenshot above: typing a data source name in the editor would suggest the corresponding databases and tables, and selecting one with the mouse would quickly fill that table's DDL into the editor.

mysql-cdc dependency conflict

dlink supports most Flink plugins and connectors, but introducing CDC connectors such as mysql-cdc prevents dlink from starting normally.

SQL whose execution graph cannot be viewed: the UI should be more user-friendly

CREATE TABLE Orders (
    order_number BIGINT,
    price        DECIMAL(32,2),
    buyer        ROW<first_name STRING, last_name STRING>,
    order_time   TIMESTAMP(3)
) WITH (
  'connector' = 'datagen',
  'rows-per-second' = '1'
);
select order_number,price,order_time from Orders

(screenshots: current error display)

The UI needs a friendlier message here.

Job move function in the directory menu

To move a job into a directory, currently the job must be deleted and then recreated there. A job move function could be added.

Suggested parameter items for the cluster configuration creation form

The new cluster configuration feature in Dlink 0.4.0 involves Yarn and Flink configuration files and specific options, especially Flink cluster tuning parameters when submitting per-job and application jobs. Which options do you consider the most common? Accepted proposals will be added to the creation form; a few candidate entries are sketched below.
Format: (title: config-key=default)
Example: (parallelism: parallelism.default=1)
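
A few commonly tuned Flink options that could fit this form, in the requested format (suggestions only; the values are illustrative examples, not necessarily the defaults):

(Task slots: taskmanager.numberOfTaskSlots=1)
(TaskManager memory: taskmanager.memory.process.size=1728m)
(JobManager memory: jobmanager.memory.process.size=1600m)
(Checkpoint interval: execution.checkpointing.interval=60000)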

[Feature] Some feature requests for the directory tree

1. For migrating jobs from test to production, migration can currently only be done through metadata; later this could be replaced by project-level import and export (imports should not overwrite existing jobs).
2. For reusable SQL script jobs, support copying and pasting a job into a directory, avoiding a lot of repetitive work.

Hello author, is StarRocks supported?

If possible, please support it too!
Because of heavy LEFT JOIN report workloads, ClickHouse's concurrency is too poor to be a good fit.

Also, do you have Alipay? I would like to contribute a little to support this project!

Who is using Dinky& FlinkSql?

Who is using Dinky & FlinkSql?

We sincerely thank everyone who keeps using and supporting Dinky. We will do our best to make Dinky better and the community and ecosystem more prosperous.

The original intention of this issue:
We want to listen to the community's suggestions and make Dinky more professional in FlinkSQL development and application.
Mutual promotion also helps attract more contributors.

What we expect from you:
Please post a comment on this issue including the following information:
Your company/school/organization name and logo.
The business scenario in which you use Dinky.


Please use the following format:

en:

  • Logo:
  • CompanyName:
  • CompanyOfficialAddress:
  • ContactInformation:
  • Purpose:
  • Using:
  • Scenario:

zh:

  • Logo:
  • 公司名称:
  • 公司官网地址:
  • 联系方式:
  • 用途:
  • 使用场景:

Thank you.

Project fails to start after placing mysql-cdc.jar under plugins

The build output is dlink-release-0.6.0-SNAPSHOT.
I created a plugins folder under that directory, containing only the CDC jar (flink-sql-connector-mysql-cdc-1.2.0.jar). The startup log looks fine, but the front-end page returns a 500 error.
Console log:
[dlink] 2022-01-27 14:39:12.429 ERROR org.apache.juli.logging.DirectJDKLog 175 log - Exception Processing ErrorPage[errorCode=0, location=/error] java.lang.NoSuchMethodError: javax.servlet.http.HttpServletRequest.getHttpServletMapping()Ljavax/servlet/http/HttpServletMapping;
at org.apache.catalina.core.ApplicationHttpRequest.setRequest(ApplicationHttpRequest.java:714) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationHttpRequest.<init>(ApplicationHttpRequest.java:113) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationDispatcher.wrapRequest(ApplicationDispatcher.java:920) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:359) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:313) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardHostValve.custom(StandardHostValve.java:403) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardHostValve.status(StandardHostValve.java:249) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardHostValve.throwable(StandardHostValve.java:344) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:382) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:895) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1732) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-9.0.56.jar!/:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_312]

Jar conflict

When I put flink-sql-connector-oracle-cdc-2.1-SNAPSHOT.jar into the plugin directory and start, this error is reported.

ERROR org.apache.juli.logging.DirectJDKLog 175 log - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler processing failed; nested exception is java.lang.NoSuchMethodError: javax.servlet.http.HttpServletRequest.getHttpServletMapping()Ljavax/servlet/http/HttpServletMapping;] with root cause java.lang.NoSuchMethodError: javax.servlet.http.HttpServletRequest.getHttpServletMapping()Ljavax/servlet/http/HttpServletMapping;

Cluster instance: YarnSession-mode JobManager HA probing does not account for Per-Job dedicated clusters, so a destroyed YarnSession instance still shows a normal status

As the title says:
Within one cluster, Flink's YarnSession-mode and Per-Job-mode startups both use the configuration files under conf. After a YarnSession cluster is destroyed in Yarn, the REST API of a Per-Job dedicated cluster still responds, so the YarnSession instance status still appears normal. When probing the JobManager REST service in this mode, the cluster type should be distinguished; otherwise tasks cannot be submitted correctly in YarnSession mode.

Error when adding a MySQL data source

UI (screenshot omitted)

Dependencies (screenshots omitted)

Error:
[dlink] 2022-02-16 15:52:14.580 ERROR org.apache.juli.logging.DirectJDKLog 175 log - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.NoClassDefFoundError: sun/misc/Service] with root cause java.lang.NoClassDefFoundError: sun/misc/Service
at com.dlink.metadata.driver.Driver.get(Driver.java:27) ~[dlink-metadata-base-0.5.1.jar!/:?]
at com.dlink.metadata.driver.Driver.build(Driver.java:38) ~[dlink-metadata-base-0.5.1.jar!/:?]
at com.dlink.service.impl.DataBaseServiceImpl.testConnect(DataBaseServiceImpl.java:30) ~[classes!/:?]
at com.dlink.service.impl.DataBaseServiceImpl$$FastClassBySpringCGLIB$$51132ca2.invoke(<generated>) ~[classes!/:?]
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.3.15.jar!/:5.3.15]
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:689) ~[spring-aop-5.3.15.jar!/:5.3.15]
at com.dlink.service.impl.DataBaseServiceImpl$$EnhancerBySpringCGLIB$$d4c50160.testConnect(<generated>) ~[classes!/:?]
at com.dlink.controller.DataBaseController.testConnect(DataBaseController.java:98) ~[classes!/:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:150) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:117) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1067) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:963) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:909) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:681) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:764) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:227) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53) ~[tomcat-embed-websocket-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at com.alibaba.druid.support.http.WebStatFilter.doFilter(WebStatFilter.java:124) ~[druid-1.2.8.jar!/:1.2.8]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:197) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:540) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:135) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:382) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:895) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1732) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-9.0.56.jar!/:?]
at java.lang.Thread.run(Thread.java:829) [?:?]

[Bug] MySQL execution error


config.getType() returns "Mysql",

but DbType.of(dbType) in SQLUtils only recognizes the lowercase "mysql".

This can be worked around with config.getType().toLowerCase().

Issues with UDFs and connectors

Could UDF usability be improved? At present, every UDF update or addition requires restarting the service and also adding the jar under the Flink cluster's lib directory, which is cumbersome; and when several people share the platform, conflicts between their jars also have to be considered.

Add comment information when generating Flink DDL

Add comment information when generating Flink DDL.
Current state: (screenshot omitted)

Expected:
Fetch each table and column comment from the data source and include it in the generated Flink DDL, as a basis for later syncing metadata downstream.

Operations center layout adjustments

The operations center page could be adjusted:
1. The run mode could be a drop-down selector.
2. When the job name / job ID search box gains focus, show all job names or IDs in a drop-down, then locate entries by fuzzy search.

v0.5.1 cannot run a demo on the latest Flink version

[dlink] 2022-02-10 18:01:28 CST ERROR org.apache.juli.logging.DirectJDKLog 175 log - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.NoSuchMethodError: 'java.util.Map org.apache.flink.table.api.EnvironmentSettings.toExecutorProperties()'] with root cause java.lang.NoSuchMethodError: 'java.util.Map org.apache.flink.table.api.EnvironmentSettings.toExecutorProperties()'

Flink 1.14.3 removed this method: EnvironmentSettings.toExecutorProperties.

Error when saving a Flink job

The error in the log is as follows:

SQL: INSERT INTO dlink_task ( alias, type, check_point, save_point_strategy, save_point_path, parallelism, fragment, statement_set, cluster_id, cluster_configuration_id, config_json, create_time, update_time ) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ? )

Cause: java.sql.SQLException: Field 'name' doesn't have a default value

; Field 'name' doesn't have a default value; nested exception is java.sql.SQLException: Field 'name' doesn't have a default value] with root cause java.sql.SQLException: Field 'name' doesn't have a default value

Some screenshots are posted on Yuque:
https://www.yuque.com/docs/share/7075a314-bcb2-42a1-9d55-17e7ddfe59c0?#
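
A plausible reading of this error: the generated INSERT omits the name column while dlink_task.name is defined as NOT NULL without a default, so MySQL rejects the row. As a stopgap (an assumption, not the official fix; the column type here is guessed from similar tables), the column can be relaxed manually:

ALTER TABLE dlink_task MODIFY COLUMN name varchar(255) NULL DEFAULT NULL;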

Dependency conflicts between Spring Boot and Hadoop

dlink is a React project with a Spring Boot backend. Submitting a JobGraph to a cluster for execution depends on Flink's source code; when Hadoop-related dependencies come in, conflicts appear around servlet and similar dependencies, and internal errors then produce additional servlet-related errors.
(To be resolved; open for claiming.)

Build error!

Hello, I'm building on Windows 10 with the documented environment; when the build reaches dlink-web it fails with the following error:
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:exec (exec-npm-install) on project dlink-web: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:exec (exec-npm-install) on project dlink-web: Command execution failed.

Please reply when you have time, thanks.

Error when running FlinkSQL that consumes from Kafka

Executed SQL:
CREATE TABLE log (
`database` string,
`table` string,
type string,
ts bigint,
data ROW<id STRING,nick VARCHAR,sex int,mobile STRING,create_time STRING>
) WITH (
'connector.type' = 'kafka',
'connector.version' = 'universal',
'connector.topic' = 'user_unite_ods_db_xsb',
'connector.startup-mode' = 'earliest-offset',
'connector.properties.0.key' = 'group.id',
'connector.properties.0.value' = 'FlinkSQLtoMySQL_test02',
'connector.properties.1.key' = 'bootstrap.servers',
'connector.properties.1.value' = '192.168.0.0:9092',
'update-mode' = 'append',
'format.type' = 'json',
'format.derive-schema' = 'true'
);

select `database`, `table` from log

Error message:
[dlink] 2022-02-15 16:43:46.951 ERROR org.apache.juli.logging.DirectJDKLog 175 log - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.NoClassDefFoundError: org/apache/flink/configuration/ReadableConfig] with root cause java.lang.ClassNotFoundException: org.apache.flink.configuration.ReadableConfig
at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_212]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_212]
at org.springframework.boot.loader.LaunchedURLClassLoader.loadClass(LaunchedURLClassLoader.java:151) ~[dlink-admin-0.5.1.jar:?]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_212]
at com.dlink.job.JobManager.createExecutor(JobManager.java:161) ~[dlink-core-0.5.1.jar!/:?]
at com.dlink.job.JobManager.createExecutorWithSession(JobManager.java:182) ~[dlink-core-0.5.1.jar!/:?]
at com.dlink.job.JobManager.init(JobManager.java:207) ~[dlink-core-0.5.1.jar!/:?]
at com.dlink.job.JobManager.build(JobManager.java:131) ~[dlink-core-0.5.1.jar!/:?]
at com.dlink.service.impl.StudioServiceImpl.executeFlinkSql(StudioServiceImpl.java:96) ~[classes!/:?]
at com.dlink.service.impl.StudioServiceImpl.executeSql(StudioServiceImpl.java:81) ~[classes!/:?]
at com.dlink.controller.StudioController.executeSql(StudioController.java:38) ~[classes!/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_212]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_212]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_212]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_212]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:150) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:117) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1067) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:963) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:909) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:681) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) ~[spring-webmvc-5.3.15.jar!/:5.3.15]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:764) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:227) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53) ~[tomcat-embed-websocket-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at com.alibaba.druid.support.http.WebStatFilter.doFilter(WebStatFilter.java:124) ~[druid-1.2.8.jar!/:1.2.8]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:117) ~[spring-web-5.3.15.jar!/:5.3.15]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:189) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:162) ~[tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:197) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:540) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:135) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:357) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:382) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:895) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1732) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659) [tomcat-embed-core-9.0.56.jar!/:?]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-9.0.56.jar!/:?]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]

Dependencies under /opt/dlink/lib:
aggs-matrix-stats-client-6.3.1.jar j2objc-annotations-1.3.jar
akka-actor_2.11-2.5.21.jar jackson-annotations-2.7.0.jar
akka-protobuf_2.11-2.5.21.jar jackson-annotations-2.9.7.jar
akka-slf4j_2.11-2.5.21.jar jackson-core-2.11.2.jar
akka-stream_2.11-2.5.21.jar jackson-core-asl-1.9.11.jar
aopalliance-1.0.jar jackson-databind-2.7.3.jar
avatica-core-1.17.0.jar jackson-databind-2.8.4.jar
checker-qual-3.5.0.jar jackson-dataformat-cbor-2.8.10.jar
chill_2.11-0.7.6.jar jackson-dataformat-smile-2.8.10.jar
chill-java-0.7.6.jar jackson-dataformat-yaml-2.8.10.jar
clickhouse4j-1.4.4.jar jackson-mapper-asl-1.9.11.jar
clickhouse-jdbc-0.2.2.jar jackson-module-paranamer-2.9.7.jar
commons-beanutils-1.9.3.jar jackson-module-scala_2.11-2.9.7.jar
commons-cli-1.3.1.jar janino-3.0.11.jar
commons-codec-1.9.jar javassist-3.24.0-GA.jar
commons-collections-3.2.2.jar javax.el-3.0.1-b12.jar
commons-collections4-4.1.jar jaxb-api-2.3.0.jar
commons-compiler-3.0.11.jar jedis-3.2.0.jar
commons-compress-1.20.jar jna-4.5.1.jar
commons-io-2.7.jar joda-time-2.9.9.jar
commons-lang3-3.3.2.jar jopt-simple-5.0.2.jar
commons-logging-1.2.jar json4s-ast_2.11-3.5.3.jar
commons-math3-3.5.jar json4s-ast_2.11-3.6.7.jar
commons-pool-1.5.4.jar json4s-core_2.11-3.5.3.jar
commons-pool2-2.6.2.jar json4s-core_2.11-3.6.7.jar
config-1.3.3.jar json4s-jackson_2.11-3.5.3.jar
curvesapi-1.04.jar json4s-native_2.11-3.6.7.jar
data-flow-1.0-SNAPSHOT.jar json4s-scalap_2.11-3.5.3.jar
dlink-client-1.13-0.5.1.jar json4s-scalap_2.11-3.6.7.jar
dlink-connector-jdbc-1.13-0.5.1.jar jsr305-1.3.9.jar
dlink-function-0.5.1.jar junit-4.12.jar
dlink-metadata-clickhouse-0.5.1.jar kafka-clients-2.4.1.jar
dlink-metadata-mysql-0.5.1.jar kryo-2.24.0.jar
dlink-metadata-oracle-0.5.1.jar listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar
dlink-metadata-postgresql-0.5.1.jar log4j-1.2.17.jar
easypoi-annotation-3.2.0.jar log4j-1.2-api-2.12.1.jar
easypoi-base-3.2.0.jar log4j-api-2.10.0.jar
easypoi-web-3.2.0.jar log4j-api-2.12.1.jar
elasticsearch-6.3.1.jar log4j-api-2.14.0.jar
elasticsearch-cli-6.3.1.jar log4j-api-scala_2.11-11.0.jar
elasticsearch-core-6.3.1.jar log4j-to-slf4j-2.14.0.jar
elasticsearch-rest-client-6.3.1.jar lombok-1.18.16.jar
elasticsearch-rest-high-level-client-6.3.1.jar lucene-analyzers-common-7.3.1.jar
elasticsearch-secure-sm-6.3.1.jar lucene-backward-codecs-7.3.1.jar
elasticsearch-x-content-6.3.1.jar lucene-core-7.3.1.jar
error_prone_annotations-2.3.4.jar lucene-grouping-7.3.1.jar
failureaccess-1.0.1.jar lucene-highlighter-7.3.1.jar
fastjson-1.2.58.jar lucene-join-7.3.1.jar
fastjson-1.2.75.jar lucene-memory-7.3.1.jar
fastutil-6.5.7.jar lucene-misc-7.3.1.jar
flink-annotations-1.12.0.jar lucene-queries-7.3.1.jar
flink-cep_2.11-1.12.0.jar lucene-queryparser-7.3.1.jar
flink-clients_2.11-1.12.0.jar lucene-sandbox-7.3.1.jar
flink-connector-base-1.12.0.jar lucene-spatial3d-7.3.1.jar
flink-connector-elasticsearch6_2.11-1.13.2.jar lucene-spatial-7.3.1.jar
flink-connector-elasticsearch-base_2.11-1.13.2.jar lucene-spatial-extras-7.3.1.jar
flink-connector-files-1.12.0.jar lucene-suggest-7.3.1.jar
flink-connector-jdbc_2.11-1.12.0.jar lz4-1.3.0.jar
flink-connector-kafka_2.11-1.12.0.jar lz4-java-1.6.0.jar
flink-connector-mysql-cdc-1.1.1.jar minlog-1.2.jar
flink-connector-redis_2.11-1.1.5.jar mysql-connector-java-5.1.47.jar
flink-core-1.12.0.jar mysql-connector-java-8.0.22.jar
flink-csv-1.12.0.jar objenesis-2.1.jar
flink-dist_2.11-1.12.0.jar paranamer-2.8.jar
flink-file-sink-common-1.12.0.jar parent-join-client-6.3.1.jar
flink-hadoop-fs-1.12.0.jar parquet-avro-1.9.0.jar
flink-java-1.12.0.jar parquet-column-1.9.0.jar
flink-json-1.12.0.jar parquet-common-1.9.0.jar
flink-metrics-core-1.12.0.jar parquet-encoding-1.9.0.jar
flink-optimizer_2.11-1.12.0.jar parquet-format-2.3.1.jar
flink-queryable-state-client-java-1.12.0.jar parquet-hadoop-1.9.0.jar
flink-runtime_2.11-1.12.0.jar parquet-jackson-1.9.0.jar
flink-runtime-web_2.11-1.12.0.jar platform-common-1.0-SNAPSHOT.jar
flink-scala_2.11-1.12.0.jar poi-3.15.jar
flink-shaded-asm-7-7.1-12.0.jar poi-ooxml-3.15.jar
flink-shaded-guava-18.0-12.0.jar poi-ooxml-schemas-3.15.jar
flink-shaded-jackson-2.10.1-12.0.jar protobuf-java-2.5.0.jar
flink-shaded-netty-4.1.49.Final-12.0.jar rank-eval-client-6.3.1.jar
flink-shaded-zookeeper-3-3.4.14-12.0.jar reactive-streams-1.0.2.jar
flink-shaded-zookeeper-3.4.14.jar reflections-0.9.10.jar
flink-sql-connector-kafka_2.11-1.12.0.jar scalacheck_2.11-1.15.2.jar
flink-streaming-java_2.11-1.12.0.jar scala-compiler-2.11.12.jar
flink-streaming-scala_2.11-1.12.0.jar scalactic_2.11-3.2.0-SNAP10.jar
flink-table_2.11-1.12.0.jar scala-java8-compat_2.11-0.7.0.jar
flink-table-api-java-bridge_2.11-1.12.0.jar scala-library-2.11.12.jar
flink-table-api-scala-bridge_2.11-1.12.0.jar scala-parser-combinators_2.11-1.0.4.jar
flink-table-blink_2.11-1.12.0.jar scala-reflect-2.11.12.jar
flink-table-common-1.12.0.jar scalatest_2.11-3.2.0-SNAP10.jar
flink-table-planner-blink_2.11-1.12.0.jar scala-xml_2.11-1.0.5.jar
flink-table-runtime-blink_2.11-1.12.0.jar scopt_2.11-3.5.0.jar
force-shading-1.12.0.jar slf4j-api-1.7.30.jar
grizzled-slf4j_2.11-1.3.2.jar slf4j-log4j12-1.7.30.jar
guava-30.1-jre.jar snakeyaml-1.17.jar
hamcrest-core-1.3.jar snappy-java-1.1.4.jar
HdrHistogram-2.1.9.jar ssl-config-core_2.11-0.3.7.jar
hppc-0.7.1.jar stax-api-1.0.1.jar
httpasyncclient-4.1.2.jar t-digest-3.2.jar
httpclient-4.5.jar test-interface-1.0.jar
httpcore-4.4.1.jar validation-api-1.1.0.Final.jar
httpcore-nio-4.4.5.jar xmlbeans-2.6.0.jar
httpmime-4.5.2.jar zstd-jni-1.4.3-1.jar

[Bugs and Suggestions]

Too many bugs:
1. Cannot save SQL to a job.
2. If I use a simple SQL based on the flink-faker connector to test the SQL graph, the application returns nothing; I think the SQL graph should not analyze the connector properties (connector='faker', ...).
3. For column lineage, Calcite may be a better way; Calcite provides metadata to find the origin of a column.
4. Cluster management should consider the cluster's free resources and show them in the UI.
5. The project's logging is poor; when an error occurred, I did not see any logs.

Share your tips and completion grammar

Dlink's SQL auto-suggestion and completion can significantly improve writing efficiency and the interactive experience; used well, this feature lowers a team's development cost. Please share examples you find practical here; accepted contributions will be added to Dlink's default documentation for everyone to use.
Format: (keyword: completion content)
Example: (hivesql: SET table.sql-dialect=hive;)
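
Two more entries in the same format, offered as suggestions (plain Flink SQL SET statements, not existing Dlink defaults):

(batch: SET execution.runtime-mode = batch;)
(checkpoint: SET execution.checkpointing.interval = 10s;)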

0.3.0 main branch: the dlink_cluster table in the SQL script is missing the version column

Dlink version: 0.3.1

ERROR org.apache.juli.logging.DirectJDKLog 175 log - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.springframework.jdbc.BadSqlGrammarException:

Error querying database. Cause: java.sql.SQLSyntaxErrorException: Unknown column 'version' in 'field list'

The error may exist in com/dlink/mapper/ClusterMapper.java (best guess)

The error may involve defaultParameterMap

The error occurred while setting parameters

SQL: SELECT id,alias,type,hosts,job_manager_host,version,status,note,name,enabled,create_time,update_time FROM dlink_cluster WHERE (enabled = ?)

Cause: java.sql.SQLSyntaxErrorException: Unknown column 'version' in 'field list'

CREATE TABLE dlink_cluster (
id int(11) NOT NULL AUTO_INCREMENT COMMENT 'ID',
name varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL COMMENT '名称',
alias varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci NULL DEFAULT NULL COMMENT '别名',
type varchar(50) CHARACTER SET utf8 COLLATE utf8_general_ci NULL DEFAULT NULL COMMENT '类型',
hosts text CHARACTER SET utf8 COLLATE utf8_general_ci NULL COMMENT 'HOSTS',
job_manager_host varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci NULL DEFAULT NULL COMMENT 'JMhost',
status int(1) NULL DEFAULT NULL COMMENT '状态',
note varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci NULL DEFAULT NULL COMMENT '注释',
enabled tinyint(1) NOT NULL DEFAULT 1 COMMENT '是否启用',
create_time datetime(0) NULL DEFAULT NULL COMMENT '创建时间',
update_time datetime(0) NULL DEFAULT NULL COMMENT '更新时间',
PRIMARY KEY (id) USING BTREE,
UNIQUE INDEX idx_name(name) USING BTREE
) ENGINE = InnoDB CHARACTER SET = utf8 COLLATE = utf8_general_ci COMMENT = '集群' ROW_FORMAT = Dynamic;

After manually adding the version column to the table, it works.
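
For reference, a manual fix along these lines restores the query (the column type and position are assumptions; match them to what ClusterMapper expects):

ALTER TABLE dlink_cluster ADD COLUMN version varchar(20) NULL DEFAULT NULL COMMENT 'version' AFTER job_manager_host;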

[Bug] Formatting issue in the SQL editor

Dlink Version: 0.5.0-release

Problem description:
Column names in MySQL that clash with keywords are escaped when the SQL is written, e.g. `status`; in that form syntax validation passes.
After applying the platform's Format SQL feature, it becomes e.g. ` status `: spaces are inserted inside the backticks, and syntax validation fails.

As shown in the screenshots: before formatting, the SQL validates; after formatting, syntax validation fails.
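
A minimal illustration of the described breakage (hypothetical table and column names):

-- Before formatting: `status` is a quoted identifier and validates
SELECT `status` FROM orders;
-- After formatting: the spaces inside the backticks change the identifier,
-- so it no longer matches the column and validation fails
SELECT ` status ` FROM orders;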
