webankfintech / dataspherestudio

DataSphereStudio is a one-stop data application development & management portal, covering scenarios including data exchange, desensitization/cleansing, analysis/mining, quality measurement, visualization, and task scheduling.

Home Page: https://github.com/WeBankFinTech/DataSphereStudio-Doc

License: Apache License 2.0


dataspherestudio's Introduction

English | 中文

Introduction

       DataSphere Studio (DSS for short) is a one-stop data application development and management portal developed by WeBank, and a core part of WeDataSphere.

       With its pluggable integration framework design and Linkis, a computing middleware, DSS can easily integrate various upper-layer data application systems, making data development simple and easy to use.

       DataSphere Studio is positioned as a data application development portal whose closed loop covers the entire process of data application development. With a unified UI, its workflow-style graphical drag-and-drop development experience covers the whole lifecycle of data application development: data import, desensitization and cleansing, data analysis, data mining, quality inspection, visualization, scheduling, and data output to applications.

       With the connection, reusability, and simplification capabilities of Linkis, DSS is born with financial-grade capabilities of high concurrency, high availability, multi-tenant isolation, and resource management.

UI preview

       Please be patient; the GIFs may take some time to load.

DSS-V1.0 GIF

Core features

1. One-stop, full-process application development management UI

       DSS is highly integrated. The currently integrated components include:

       1. Data Development IDE Tool - Scriptis

       2. Data Visualization Tool - Visualis (Based on the open source project Davinci contributed by CreditEase)

       3. Data Quality Management Tool - Qualitis

       4. Workflow scheduling tool - Schedulis

       5. Data Exchange Tool - Exchangis

       6. Data API Service - DataApiService

       7. Streaming Application Development Management Tool - Streamis

       8. One-stop Machine Learning Platform - Prophecis

       9. Workflow Task Scheduling Tool - DolphinScheduler (In Code Merging)

       10. Help documentation and beginner's guide - UserGuide (In Code Merging)

       11. Data Model Center - DataModelCenter (In development)

       For the DSS version compatibility of the above components, please visit the Compatibility list of integrated components.

       With a pluggable framework architecture, DSS is designed to allow users to quickly integrate new data application tools, or replace various tools that DSS has integrated. For example, replace Scriptis with Zeppelin, and replace Schedulis with DolphinScheduler...

DSS one-stop video

2. AppConn, a unique design concept based on Linkis

       AppConn is the core concept that enables DSS to easily and quickly integrate various upper-layer web systems.

       AppConn, the application connector, defines a set of unified three-level front-end and back-end integration protocols that allow external data application systems to easily and quickly become part of DSS data application development.

       The three-level specifications of AppConn are: the first-level SSO specification, the second-level organizational structure specification, and the third-level development process specification.

       DSS arranges multiple AppConns in series to form a workflow that supports real-time execution and scheduled execution. Users can complete the entire process development of data applications with simple drag and drop operations.

       Since AppConn is integrated with Linkis, the external data application system shares Linkis's capabilities of resource management, concurrency limiting, and high performance. AppConn also enables context sharing across systems, freeing external data applications from application silos.
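To make the three-level protocol concrete, here is a purely illustrative Java sketch. The names (SsoSpec, OrganizationSpec, DevelopmentSpec, DemoBiAppConn) are hypothetical and do not match the real DSS SDK; the sketch only models the idea that an integrated application implements one capability per specification level.

```java
// Hypothetical sketch, for illustration only -- these names do NOT match the
// real DSS SDK. It models AppConn's three-level protocol as three capability
// interfaces that an integrated application implements.

// Level 1: SSO specification -- let DSS log the user into the external app.
interface SsoSpec {
    String buildRedirectUrl(String dssUser);
}

// Level 2: organizational structure specification -- mirror DSS projects.
interface OrganizationSpec {
    void createProject(String projectName);
}

// Level 3: development process specification -- run a workflow node.
interface DevelopmentSpec {
    String executeNode(String nodeContent);
}

// An "AppConn" for a fictional BI tool implementing all three levels.
class DemoBiAppConn implements SsoSpec, OrganizationSpec, DevelopmentSpec {
    private final java.util.Set<String> projects = new java.util.HashSet<>();

    public String buildRedirectUrl(String dssUser) {
        return "https://demo-bi.example.com/sso?user=" + dssUser;
    }

    public void createProject(String projectName) {
        projects.add(projectName);
    }

    public String executeNode(String nodeContent) {
        // A real AppConn would delegate execution to the external system.
        return "executed: " + nodeContent;
    }

    public boolean hasProject(String name) {
        return projects.contains(name);
    }
}
```

In this picture, DSS drives each level in order: first SSO, then project synchronization, then node execution when a workflow runs.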

3. Workspace, as the management unit

       DSS uses the Workspace as its management unit: it organizes and manages the business applications of the various data application systems, defines a set of common standards for collaborative development across data application systems, and provides user role management capabilities.

4. Integrated data application components

       DSS has integrated a variety of upper-layer data application systems by implementing multiple AppConns, which can basically meet the data development needs of users.

       If desired, new data application systems can also be easily integrated to replace or enrich DSS's data application development process. Click me to learn how to quickly integrate new application systems.

| Component | Description | DSS0.X compatible version (DSS0.9.1 recommended) | DSS1.0 compatible version (DSS1.1.0 recommended) |
| --- | --- | --- | --- |
| Linkis | Computing middleware Apache Linkis. By providing standard interfaces such as REST/WebSocket/JDBC/SDK, it lets upper-layer applications easily connect to and access underlying engines such as MySQL/Spark/Hive/Presto/Flink. | Linkis0.11.0 recommended (Released) | >= Linkis1.1.1 (Released) |
| DataApiService | (Third-party application tool built into DSS) Data API service. SQL scripts can be quickly published as RESTful interfaces, providing REST access capability to the outside world. | Not supported | DSS1.1.0 recommended (Released) |
| Scriptis | (Third-party application tool built into DSS) Web tool for data analysis that supports online writing of SQL, PySpark, HiveQL, and other scripts, which are submitted to [Linkis](https://github.com/WeBankFinTech/Linkis). | DSS0.9.1 recommended (Released) | DSS1.1.0 recommended (Released) |
| Schedulis | Workflow task scheduling system based on secondary development of Azkaban, with financial-grade features such as high performance, high availability, and multi-tenant resource isolation. | Schedulis0.6.1 recommended (Released) | >= Schedulis0.7.0 (Released) |
| EventCheck | (Third-party application tool built into DSS) Provides signal communication capabilities across business, project, and workflow. | DSS0.9.1 recommended (Released) | DSS1.1.0 recommended (Released) |
| SendEmail | (Third-party application tool built into DSS) Provides the ability to send data; the result sets of all other workflow nodes can be sent by email. | DSS0.9.1 recommended (Released) | DSS1.1.0 recommended (Released) |
| Qualitis | Data quality verification tool, providing data verification capabilities such as data integrity and correctness. | Qualitis0.8.0 recommended (Released) | >= Qualitis0.9.2 (Released) |
| Streamis | Streaming application development management tool. Supports publishing Flink Jar and Flink SQL, and provides development, debugging, and production management capabilities for streaming applications, such as start/stop, status monitoring, and checkpointing. | Not supported | >= Streamis0.2.0 (Released) |
| Prophecis | One-stop machine learning platform that integrates multiple open-source machine learning frameworks. Prophecis' MLFlow can be connected to DSS workflows through AppConn. | Not supported | >= Prophecis 0.3.2 (Released) |
| Exchangis | Data exchange platform supporting data transmission between structured and unstructured heterogeneous data sources; Exchangis1.0 works with DSS workflows. | Not supported | >= Exchangis1.0.0 (Released) |
| Visualis | Data visualization BI tool based on secondary development of Davinci, an open-source project contributed by CreditEase, providing users with financial-grade data visualization capabilities with data security in mind. | Visualis0.5.0 recommended | >= Visualis1.0.0 (Released) |
| DolphinScheduler | Apache DolphinScheduler, a distributed and easily scalable visual workflow task scheduling platform; supports one-click publishing of DSS workflows to DolphinScheduler. | Not supported | DolphinScheduler1.3.X (Released) |
| UserGuide | (Third-party application tool to be built into DSS) Contains help documentation, a beginner's guide, dark-mode skinning, etc. | Not supported | >= DSS1.1.0 (Released) |
| DataModelCenter | (Third-party application tool to be built into DSS) Mainly provides data warehouse planning, data model development, and data asset management capabilities. Data warehouse planning covers subject domains, warehouse layers, modifiers, etc.; data model development covers indicators, dimensions, measures, wizard-based table creation, etc.; data assets connect to Apache Atlas to provide data lineage capabilities. | Not supported | Planned in DSS1.2.0 (under development) |
| UserManager | (Third-party application tool built into DSS) Automatically initializes everything a new DSS user needs, including creating the Linux user, various user paths, and directory authorization. | DSS0.9.1 recommended (Released) | In planning |
| Airflow | Supports publishing DSS workflows to Apache Airflow for scheduled execution. | PR not yet merged | Not supported |

Demo Trial environment

       The script execution feature of DataSphere Studio carries high security risks, and the isolation of the WeDataSphere demo environment has not yet been completed. Since many users have been asking about the demo environment, we decided to first issue invitation codes to the community and accept trial applications from enterprises and organizations.

       If you want to try out the Demo environment, please join the DataSphere Studio community user group (Please refer to the end of the document), and contact WeDataSphere Group Robot to get an invitation code.

       DataSphereStudio Demo environment login page: click me to enter

Download

       Please go to the DSS Releases Page to download a compiled version or a source code package of DSS.

Compile and deploy

       Please follow Compile Guide to compile DSS from source code.

       Please refer to Deployment Documents to do the deployment.

Examples and Guidance

       You can find examples and guidance for how to use DSS in User Manual.

Documents

       For a complete list of documents for DSS1.0, see DSS-Doc

       Installation guides for the DSS-related AppConn plugins can also be found in DSS-Doc.

Architecture

DSS Architecture

Usage Scenarios

      DataSphere Studio is suitable for the following scenarios:

      1. Scenarios in which big data platform capability is being prepared or initialized but no data application tools are available.

      2. Scenarios in which users already have big data foundation platform capabilities but with only a few data application tools.

      3. Scenarios in which users have big data foundation platform capabilities and comprehensive data application tools, but suffer from strong isolation and high learning costs because the tools have not been integrated.

      4. Scenarios in which users have big data foundation platform capabilities and comprehensive data application tools, with some tools already integrated, but lacking unified and standardized specifications.

Contributing

       Contributions are always welcome; we need more contributors to build DSS together, whether through code, documentation, or other support that helps the community.

       For code and documentation contributions, please follow the contribution guide.

Communication

       For any questions or suggestions, please kindly submit an issue.

       You can scan the QR code below to join our WeChat group for a faster response.

communication

Who is using DSS

       We opened an issue for users to give feedback and record who is using DSS.

       Since its first release in 2019, DSS has accumulated more than 700 trial companies and 1000+ sandbox trial users, covering diverse industries from finance, banking, and telecommunications to manufacturing, Internet companies, and more.

License

       DSS is under the Apache 2.0 license. See the License file for details.

dataspherestudio's People

Contributors

5herhom, adamyuanyuan, chaogefeng, demonray, dependabot[bot], det101, elishazhang, firefoxahri, hantang1, hmhwz, htyredc, jackchen0810, jackxu2011, jinyangrao, liuyou2, luban08, mingfengwang, peacewong, sargentti, schumiyi, selfimpr001, wanap, wushengyeyouya, wxyn, yangzhiyue, yhdup, yuankang134, yuchenyao, zqburde, zwx-master


dataspherestudio's Issues

Unable to access the database after login

The log of the linkis-metadata component shows the following error:
ERROR(Druid-ConnectionPool-Create) ERROR DruidDataSource - create connection error
java.sql.SQLSyntaxErrorException:Syntax error:Encountered "" at line 1 ,column 8.

Error when adding a project

Adding a project fails with the error below. I deployed the standard version of Linkis on one machine and the simple version of Studio on another. Eureka only shows one Studio service, dss-server; I have restarted Studio several times, but the other Studio services never appear.
(screenshot omitted)

error code(错误码): 90002, error message(错误信息): add scheduler project failederrCode: 90019 ,desc: errCode: 90020 ,desc: user token is empty (用户token为空) ,ip: cancer-03.hdp.com ,port: 9004 ,serviceKind: dss-server ,ip: cancer-03.hdp.com ,port: 9004 ,serviceKind: dss-server.

Screenshots of the failed request:
(screenshots omitted)

Problems during startup

With the simple version, installing Linkis (0.9.2) went mostly fine: all services started normally except the JDBC service.
Installing the simple version of DSS (0.6.0) produced no obvious error logs, and I don't know how to resolve this:
<-------------------------------->
Begin to start dss-server
INFO: + End to start dss-server
<-------------------------------->
<-------------------------------->
Begin to start dss-flow-execution-entrance
INFO: + End to start dss-flow-execution-entrance
<-------------------------------->
<-------------------------------->
Begin to start linkis-appjoint-entrance
INFO: + End to start linkis-appjoint-entrance
<-------------------------------->
<-------------------------------->
Begin to start visualis-server
INFO: + End to start visualis-server
<-------------------------------->

Front-end iframe embedding by domain name is unfriendly to containers

Embedding the front end and back end through iframes is not particularly friendly to container deployments.

Inside a container, services are usually accessed by a service name, but the browser cannot resolve such names.

It would be best to use a reverse proxy: the embedded page keeps a fixed path on its own service, and the proxy forwards requests to the corresponding service.

Azkaban login failed: no such user (azkaban登录失败:无此用户)

When I create a project in DataSphereStudio Web, it shows an Exception below:

2019-12-05 18:27:42.597 ERROR [qtp370279024-91] com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP 83 apply - operation failed(操作失败)s java.lang.IllegalAccessError: azkaban登录
失败:无此用户
        at com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.service.AzkabanSecurityService.lambda$getSession$1(AzkabanSecurityService.java:107) ~[dss-azkaban-scheduler-appjoint-0.5.0.jar:?]

This is my config in azkaban-user.xml in azkaban:

<azkaban-users>
  <user groups="azkaban" password="azkaban" roles="admin" username="azkaban"/>
  <user groups="jiaorenyu" password="jiaorenyu" roles="admin" username="jiaorenyu"/>
  <user password="metrics" roles="metrics" username="metrics"/>

  <role name="admin" permissions="ADMIN"/>
  <role name="metrics" permissions="METRICS"/>
</azkaban-users>

This is my config in token.properties in dss:
jiaorenyu=jiaorenyu

I can log in to both systems (Azkaban and DSS) using user=jiaorenyu, password=jiaorenyu.

Workflow node Qualitis execution fails and throws a NullPointerException

2019-12-12 23:04:41.706 ERROR [Engine-Scheduler-ThreadPool-2] com.webank.wedatasphere.appjoint.QualitisAppJoint 106 submit - Error! Can not submit job java.lang.NullPointerException: null
        at com.webank.wedatasphere.appjoint.QualitisNodeExecution.submit(QualitisNodeExecution.java:67) [dss-qualitis-appjoint-0.6.0.jar:?]
        at com.webank.wedatasphere.dss.appjoint.execution.core.LongTermNodeExecution.execute(LongTermNodeExecution.java:65) [dss-appjoint-core-0.6.0.jar:?]
        at com.webank.wedatasphere.dss.linkis.appjoint.entrance.execute.AppJointEntranceEngine.execute(AppJointEntranceEngine.scala:169) [linkis-appjoint-entrance-0.6.0.jar:?]
        at com.webank.wedatasphere.dss.linkis.appjoint.entrance.job.AppJointEntranceJob$$anonfun$1.apply(AppJointEntranceJob.scala:75) [linkis-appjoint-entrance-0.6.0.jar:?]
        at com.webank.wedatasphere.dss.linkis.appjoint.entrance.job.AppJointEntranceJob$$anonfun$1.apply(AppJointEntranceJob.scala:75) [linkis-appjoint-entrance-0.6.0.jar:?]
        at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.1.jar:?]
        at com.webank.wedatasphere.dss.linkis.appjoint.entrance.job.AppJointEntranceJob.run(AppJointEntranceJob.scala:75) [linkis-appjoint-entrance-0.6.0.jar:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_141]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_141]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_141]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_141]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_141]

Failed to connect database

Problem description: unable to connect to the database after login.

Attempted fix:

Added the Hive meta information to ${LINKIS_HOME}/linkis-metadata/conf/linkis.properties, as follows:
##datasource
wds.linkis.server.login.use.default=false
hive.meta.url=jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8
hive.meta.user=xxxx
hive.meta.password=xxxx

Log output from ${LINKIS_HOME}/linkis-metadata/logs/linkis-metada.out:

ERROR (main) ERROR DruidDataSource - mysql should not use 'PoolPreparedStatements'

DataSphere Studio installation problems and solutions

Problem 1: Executing SQL on the Visualis page
Error: the front end keeps spinning.
(screenshot omitted)
Back-end error:
(screenshot omitted)
Solution: delete jsr311-api-1.1.1.jar from the visualis-server/lib directory (the failure is caused by a JSR package conflict).
Verification: works normally now, as shown below.
(screenshot omitted)

Problem 2: Saving a workflow node in a project fails
Error:
(screenshot omitted)
Solution: add the gateway key-values to the linkis.properties of linkis-publish:
wds.linkis.gateway.ip=192.168.201.85
wds.linkis.gateway.port=9011
Then restart the linkis-publish microservice.

Problem 3: Workflow execution fails
Error:
(screenshot omitted)
Solution:
Open the linkis-gateway/conf/linkis.properties file:
(screenshot omitted)
Remove /api/rest_j/v1/entrance from wds.linkis.gateway.conf.url.pass.auth, leaving only /dws; the workflow then executes normally.
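A minimal sketch of this change; the "before" value is a hypothetical example, since the exact original value is not shown in the issue:

```properties
# Before (hypothetical example value):
wds.linkis.gateway.conf.url.pass.auth=/dws,/api/rest_j/v1/entrance
# After (remove /api/rest_j/v1/entrance, keep only /dws):
wds.linkis.gateway.conf.url.pass.auth=/dws
```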

Problem 4: A workflow published to Azkaban fails to execute, although executing the same workflow in DSS alone succeeds
Error:
(screenshot omitted)
Cause: the Hadoop/Hive jars on the server where Azkaban is installed include a low-version Netty package, causing a compatibility problem.
Solution: not yet resolved.

Problem 5: A workflow node executes successfully, but saving it fails
Error:
(screenshot omitted)

Problem 6: Qualitis task initialization succeeds but task execution fails
Error screenshot below. Precondition: Scriptis and Visualis both execute Spark tasks successfully, and fetching the queue shows no anomaly.
(screenshot omitted)
Solution: with Qualitis deployed standalone, the problem was located: the queue of Qualitis's default user admin differs from the queue of the DSS user hive. Using the admin user, I created the hive user and granted it admin permission, which resolves the problem.
Data quality has now been integrated into DSS.

How does dss integrate with azkaban?

Hello, DSS community. I would like to know:

How is a DSS workflow translated into Azkaban's runtime format?

How does Azkaban communicate with DSS?

I am reading the code but am still confused. I would very much appreciate any hints.

One-click deployment optimization

  1. The execution directory is separated from the installation directory.
  2. The default path of the visualis-server yml file's front end is wrong.
  3. The simple version does not install Qualitis and Azkaban by default.

wrong user name or password

(screenshot omitted)
I followed the Installation guide, but:
(screenshot omitted)
There is no user in the Linkis or DSS database, and the linkis-gateway error log shows:

2019-12-12 19:13:54.007 WARN  [reactor-http-nio-3]
com.webank.wedatasphere.linkis.gateway.security.LDAPUserRestful 55 warn - 
wrong user name or password
javax.naming.ConfigurationException: 
java.naming.provider.url property does not contain a URL

Help! Thanks.

Failed to create workflow in a project

The error message was:

"
operation failed(操作失败)s!the reason(原因):HttpClientResultException: errCode: 10905 ,desc: URL http://127.0.0.1:9001/api/rest_j/v1/bml/upload request failed! ResponseBody is {"method":null,"status":1,"message":"error code(错误码): 50073, error message(错误信息): 提交上传资源任务失败:\n### Error updating database. Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist\n### The error may involve com.webank.wedatasphere.linkis.bml.dao.TaskDao.insert-Inline\n### The error occurred while setting parameters\n### SQL: INSERT INTO linkis_resources_task( resource_id,version,operation,state, submit_user,system,instance, client_ip,err_msg,start_time,end_time,last_update_time, extra_params ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\n### Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist\n; bad SQL grammar []; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist.","data":{"errorMsg":{"serviceKind":"bml-server","level":2,"port":9999,"errCode":50073,"ip":"DataSphere-web1","desc":"提交上传资源任务失败:\n### Error updating database. Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist\n### The error may involve com.webank.wedatasphere.linkis.bml.dao.TaskDao.insert-Inline\n### The error occurred while setting parameters\n### SQL: INSERT INTO linkis_resources_task( resource_id,version,operation,state, submit_user,system,instance, client_ip,err_msg,start_time,end_time,last_update_time, extra_params ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\n### Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist\n; bad SQL grammar []; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'linkis.linkis_resources_task' doesn't exist"}}}. 
,ip: DataSphere-web1 ,port: 9004 ,serviceKind: dss-server
"

According to the page https://github.com/WeBankFinTech/DataSphereStudio/blob/master/docs/zh_CN/ch1/DSS%E5%AE%89%E8%A3%85%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98%E5%88%97%E8%A1%A8.md#6-dss%E5%88%9B%E5%BB%BA%E5%B7%A5%E7%A8%8B%E6%8A%A5%E8%A1%A8linkislinkis_resources_task%E4%B8%8D%E5%AD%98%E5%9C%A8, I needed to run db/moudle/linkis-bml.sql manually.

Tried this SQL file and found that only the linkis_resources_task table was missing; ran that query manually in the mysql session. The above error was gone, but a new one appeared.

"
operation failed(操作失败)s!the reason(原因):HttpClientResultException: errCode: 10905 ,desc: URL http://127.0.0.1:9001/api/rest_j/v1/bml/upload request failed! ResponseBody is {"method":null,"status":1,"message":"error code(错误码): 50073, error message(错误信息): 提交上传资源任务失败:\n### Error updating database. Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null\n### The error may involve com.webank.wedatasphere.linkis.bml.dao.TaskDao.insert-Inline\n### The error occurred while setting parameters\n### SQL: INSERT INTO linkis_resources_task( resource_id,version,operation,state, submit_user,system,instance, client_ip,err_msg,start_time,end_time,last_update_time, extra_params ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\n### Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null\n; SQL []; Column 'system' cannot be null; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null.","data":{"errorMsg":{"serviceKind":"bml-server","level":2,"port":9999,"errCode":50073,"ip":"DataSphere-web1","desc":"提交上传资源任务失败:\n### Error updating database. Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null\n### The error may involve com.webank.wedatasphere.linkis.bml.dao.TaskDao.insert-Inline\n### The error occurred while setting parameters\n### SQL: INSERT INTO linkis_resources_task( resource_id,version,operation,state, submit_user,system,instance, client_ip,err_msg,start_time,end_time,last_update_time, extra_params ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)\n### Cause: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null\n; SQL []; Column 'system' cannot be null; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Column 'system' cannot be null"}}}. 
,ip: DataSphere-web1 ,port: 9004 ,serviceKind: dss-server
"

mysql> show create table linkis_resources_task \G
*************************** 1. row ***************************
       Table: linkis_resources_task
Create Table: CREATE TABLE `linkis_resources_task` (
  `id` bigint(20) NOT NULL AUTO_INCREMENT,
  `resource_id` varchar(50) DEFAULT NULL COMMENT '资源id,资源的uuid',
  `version` varchar(20) DEFAULT NULL COMMENT '当前操作的资源版本号',
  `operation` varchar(20) NOT NULL COMMENT '操作类型.upload = 0, update = 1',
  `state` varchar(20) NOT NULL DEFAULT 'Schduled' COMMENT '任务当前状态:Schduled, Running, Succeed, Failed,Cancelled',
  `submit_user` varchar(20) NOT NULL DEFAULT '' COMMENT '任务提交用户名',
  `system` varchar(20) NOT NULL DEFAULT '' COMMENT '子系统名 wtss',
  `instance` varchar(50) NOT NULL COMMENT '物料库实例',
  `client_ip` varchar(50) DEFAULT NULL COMMENT '请求IP',
  `extra_params` text COMMENT '额外关键信息.如批量删除的资源IDs及versions,删除资源下的所有versions',
  `err_msg` varchar(2000) DEFAULT NULL COMMENT '任务失败信息.e.getMessage',
  `start_time` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT '开始时间',
  `end_time` datetime DEFAULT NULL COMMENT '结束时间',
  `last_update_time` datetime NOT NULL COMMENT '最后更新时间',
  PRIMARY KEY (`id`),
  UNIQUE KEY `resource_id_version` (`resource_id`,`version`,`operation`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4
1 row in set (0.00 sec)

Automated installation enhancements

  1. Basic environmental inspection before installation
  2. Check service status after startup
  3. Simplified simple version installation

Who is using DataSphere Studio?

We’d like to thank everyone in this community for your constant support of DataSphere Studio. We’re confident that, with our effort and your support, this community could grow more prosperous and serve a greater number of users.

Our Intentions

  1. DataSphere Studio cannot grow without voices from the community.
  2. DataSphere Studio needs more partners to contribute.
  3. DataSphere Studio provides a series of enterprise-level features as a one-stop data application development management portal. We hope to get closer to practical scenarios so we can plan the roadmap for future releases of DataSphere Studio.

Our Expectation

We would appreciate it if you could leave us a comment with the following information:

  • Your company, college, or other organization
  • Your city and country
  • Your contact information: Weibo, email, or WeChat
  • Your practical business scenarios

Sample:

  • Organization: WeBank
  • Location: Shenzhen, China
  • Contact information: [email protected]
  • Business scenario: As a one-stop data application development and management portal, DSS integrates all functional components of the big data platform and provides them to users.

Thanks again!!!
Your support is the biggest motivation for the continued open-source progress of DataSphere Studio!

Yours sincerely,
DataSphere Studio Team




List of problems integrating DSS 0.5 with Azkaban 2.5

  1. Installing the Linkis jobtypes
    Following the official documentation for automated installation, the last step of sh install.sh reported: {"error":"Missing required parameter 'execid'."}. The documented success output "{"status":"success"}" never appeared, but the installed linkis job plugin was visible in Azkaban's /plugins/jobtypes directory. Investigation showed that the last step of the install script calls "curl http://azkaban_ip:executor_port/executor?action=reloadJobTypePlugins" to refresh the plugins. After restarting the Azkaban executor, its log showed the plugin had been loaded: INFO [JobTypeManager][Azkaban] Loaded jobtype linkis com.webank.wedatasphere.dss.plugins.azkaban.linkis.jobtype.AzkabanDssJobType. We could not locate the cause at the time and skipped it; after later publishing a Linkis task to Azkaban and running it successfully, we revisited the problem and concluded the error message was a false alarm.

  2. Publishing a project from DSS to Azkaban

    Problem description: the log reports that the current user does not exist in Azkaban.

    Investigation: the user reported as nonexistent can access Azkaban normally, and since the exception stack was swallowed, there were few logs to go on. Remote debugging showed that httpClient.execute(httpPost, context) in AzkabanSecurityService#getSession fails immediately. Our Azkaban has HTTPS enabled, and the login endpoint used here does not support HTTPS; the temporary workaround was to disable HTTPS on Azkaban.

  3. Follow-up of problem 2

    After solving the problem above, publishing still failed, but response = httpClient.execute(httpPost, context);

now returned "incorrect login". Investigation found that the password field of the Azkaban login request had been written as userpwd; after fixing it and repackaging, verification passed.

  4. Task published successfully but execution failed

    Problem description:

     - azkaban.flow.start.month=12
    05-12-2019 21:15:42 CST sql INFO - azkaban.flow.start.hour=21
    05-12-2019 21:15:42 CST sql INFO - azkaban.flow.uuid=bbdc0985-4a2c-4dab-94ff-ab38e2bbde24
    05-12-2019 21:15:42 CST sql INFO - azkaban.flow.flowid=publish_demo_xc_
    05-12-2019 21:15:42 CST sql INFO - azkaban.flow.start.day=05
    05-12-2019 21:15:42 CST sql INFO - azkaban.job.metadata.file=_job.653.sql.meta
    05-12-2019 21:15:42 CST sql INFO - azkaban.flow.start.timestamp=2019-12-05T21:15:42.829+08:00
    05-12-2019 21:15:42 CST sql INFO - linkistype=linkis.spark.sql
    05-12-2019 21:15:42 CST sql INFO - ****** End Job properties  ******
    05-12-2019 21:15:42 CST sql ERROR - Job run failed!
    05-12-2019 21:15:42 CST sql ERROR - nullnull
    05-12-2019 21:15:42 CST sql INFO - Finishing job sql at 1575551742852 with status FAILED


    Investigation found that the submit user was not obtained. The submit user is read from

    azkaban.flow.submituser


    Comparing Azkaban 3.8 with 2.5 shows that 2.5 does not have this parameter.
    Solution: the temporary workaround is to hardcode azkaban.flow.submituser and user.to.proxy and repackage, for flow testing only.
    Another option is to compile Azkaban 3.8 and reinstall and redeploy Azkaban.
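A minimal sketch of the temporary hardcoding workaround described above; the user name shown is a hypothetical placeholder:

```properties
# Hypothetical values used only for flow testing on Azkaban 2.5,
# which lacks azkaban.flow.submituser:
azkaban.flow.submituser=some_dss_user
user.to.proxy=some_dss_user
```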

User permission question

Creating a workflow in DSS reports:
om.webank.wedatasphere.linkis.bml.common.HdfsResourceHelper 74 upload - hadoop write to hdfs:///tmp/linkis/hadoop/20191212/09282afb-bb3c-4b9b-9cb8-c78ece3a2428.json failed, reason is, IOException: java.io.IOException: You have not permission to access path /tmp/linkis/hadoop/20191212/09282afb-bb3c-4b9b-9cb8-c78ece3a2428.json

My permissions are:
drwxrwxrwx - hadoop hadoop 0 2019-12-03 16:52 /tmp
which looks fine. I need help, thank you!

Integrate Visualis Widget as an appjoint node of DSS

After integrating Dashboard/Display as DSS nodes, we should further enhance the Visualis integration by adding a Widget node, which accepts a Spark-SQL node as its content and then automatically parses the SQL statement and creates a View with its metadata ready. This will allow users to focus solely on the workflow editing page, with no need to jump to the Visualis page for View-level construction.

Solution for the NoClassDefFoundError reported when saving a workflow

Problem description: an individual workflow node in the project executes without error, but saving the workflow reports: NoClassDefFoundError: Could not initialize class dispatch.Http$

Cause: the linkis-publish microservice is missing the netty-3.6.2.Final.jar upgrade package.

Solution: upload the upgrade package and restart the linkis-publish microservice.
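Before restarting, you can verify whether a suspect class is actually on the service classpath with a quick `Class.forName` probe (a generic diagnostic sketch, not DSS code):

```java
// Diagnostic sketch: NoClassDefFoundError on "dispatch.Http$" usually means
// the jar is missing from the service classpath. Class.forName with
// initialize=false lets you check presence without running static initializers.
public class ClasspathProbe {
    public static boolean present(String className) {
        try {
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // On a correctly patched linkis-publish classpath this should be true:
        System.out.println(present("dispatch.Http$"));
    }
}
```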

Qualitis module integration fails when DSS is deployed on port 80

When the DSS service is deployed on port 80, the Qualitis module fails to connect.

Error message:

ERROR [qtp1152298548-27] controller.RedirectController: Failed to redirect to other page, caused by: java.net.ConnectException: http://null/api/rest_j/v1/user/userInfo
java.util.concurrent.ExecutionException: java.net.ConnectException: http://null/api/rest_j/v1/user/userInfo
	at com.ning.http.client.providers.netty.NettyResponseFuture.abort(NettyResponseFuture.java:342) ~[async-http-client-1.8.10.jar:?]
	at com.ning.http.client.providers.netty.NettyConnectListener.operationComplete(NettyConnectListener.java:107) ~[async-http-client-1.8.10.jar:?]
	at org.jboss.netty.channel.DefaultChannelFuture.notifyListener(DefaultChannelFuture.java:427) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.channel.DefaultChannelFuture.addListener(DefaultChannelFuture.java:145) ~[netty-3.9.2.Final.jar:?]
	at com.ning.http.client.providers.netty.NettyAsyncHttpProvider.doConnect(NettyAsyncHttpProvider.java:1138) ~[async-http-client-1.8.10.jar:?]
	at com.ning.http.client.providers.netty.NettyAsyncHttpProvider.execute(NettyAsyncHttpProvider.java:935) ~[async-http-client-1.8.10.jar:?]
	at com.ning.http.client.AsyncHttpClient.executeRequest(AsyncHttpClient.java:499) ~[async-http-client-1.8.10.jar:?]
	at dispatch.HttpExecutor$class.apply(execution.scala:47) ~[dispatch-core_2.11-0.11.2.jar:0.11.2]
	at dispatch.Http.apply(execution.scala:12) ~[dispatch-core_2.11-0.11.2.jar:0.11.2]
	at dispatch.HttpExecutor$class.apply(execution.scala:42) ~[dispatch-core_2.11-0.11.2.jar:0.11.2]
	at dispatch.Http.apply(execution.scala:12) ~[dispatch-core_2.11-0.11.2.jar:0.11.2]
	at com.webank.wedatasphere.linkis.httpclient.AbstractHttpClient.executeAsyncRequest(AbstractHttpClient.scala:199) ~[linkis-httpclient-0.9.0.jar:?]
	at com.webank.wedatasphere.linkis.httpclient.AbstractHttpClient.executeRequest(AbstractHttpClient.scala:195) ~[linkis-httpclient-0.9.0.jar:?]
	at com.webank.wedatasphere.linkis.httpclient.AbstractHttpClient.execute(AbstractHttpClient.scala:94) ~[linkis-httpclient-0.9.0.jar:?]
	at com.webank.wedatasphere.linkis.httpclient.AbstractHttpClient.execute(AbstractHttpClient.scala:87) ~[linkis-httpclient-0.9.0.jar:?]
	at com.webank.wedatasphere.dss.appjoint.auth.impl.AppJointAuthImpl.getRedirectMsg(AppJointAuthImpl.scala:57) ~[dss-appjoint-auth-0.5.0.jar:?]
	at com.webank.wedatasphere.qualitis.controller.RedirectController.redirectToCoordinatePage(RedirectController.java:70) ~[web_app-0.7.0.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_231]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_231]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_231]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_231]
	at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:76) ~[jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:148) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:191) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:243) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:103) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:493) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:415) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:104) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:277) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:272) [jersey-common-2.26.jar:?]
	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:268) [jersey-common-2.26.jar:?]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:316) [jersey-common-2.26.jar:?]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:298) [jersey-common-2.26.jar:?]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:268) [jersey-common-2.26.jar:?]
	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:289) [jersey-common-2.26.jar:?]
	at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:256) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:703) [jersey-server-2.26.jar:?]
	at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:416) [jersey-container-servlet-core-2.26.jar:?]
	at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:370) [jersey-container-servlet-core-2.26.jar:?]
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:389) [jersey-container-servlet-core-2.26.jar:?]
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:342) [jersey-container-servlet-core-2.26.jar:?]
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:229) [jersey-container-servlet-core-2.26.jar:?]
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:865) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1655) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.websocket.server.WebSocketUpgradeFilter.doFilter(WebSocketUpgradeFilter.java:215) [websocket-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at com.webank.wedatasphere.qualitis.filter.Filter1AuthorizationFilter.doFilter(Filter1AuthorizationFilter.java:111) [web_user-0.7.0.jar:?]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) [spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:109) [spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93) [spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) [spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) [jetty-security-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1317) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1219) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.Server.handle(Server.java:531) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:352) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:281) [jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:102) [jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118) [jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:762) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:680) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_231]
Caused by: java.net.ConnectException: http://null/api/rest_j/v1/user/userInfo
	at com.ning.http.client.providers.netty.NettyConnectListener.operationComplete(NettyConnectListener.java:103) ~[async-http-client-1.8.10.jar:?]
	... 88 more
Caused by: java.nio.channels.UnresolvedAddressException
	at sun.nio.ch.Net.checkAddress(Net.java:101) ~[?:1.8.0_231]
	at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622) ~[?:1.8.0_231]
	at org.jboss.netty.channel.socket.nio.NioClientSocketPipelineSink.connect(NioClientSocketPipelineSink.java:108) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.channel.socket.nio.NioClientSocketPipelineSink.eventSunk(NioClientSocketPipelineSink.java:70) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.handler.codec.oneone.OneToOneEncoder.handleDownstream(OneToOneEncoder.java:54) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.handler.codec.http.HttpClientCodec.handleDownstream(HttpClientCodec.java:97) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.handler.stream.ChunkedWriteHandler.handleDownstream(ChunkedWriteHandler.java:109) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.channel.Channels.connect(Channels.java:634) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.channel.AbstractChannel.connect(AbstractChannel.java:207) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.bootstrap.ClientBootstrap.connect(ClientBootstrap.java:229) ~[netty-3.9.2.Final.jar:?]
	at org.jboss.netty.bootstrap.ClientBootstrap.connect(ClientBootstrap.java:182) ~[netty-3.9.2.Final.jar:?]
	at com.ning.http.client.providers.netty.NettyAsyncHttpProvider.doConnect(NettyAsyncHttpProvider.java:1099) ~[async-http-client-1.8.10.jar:?]
	... 85 more

After changing the DSS service port to 8080, the Qualitis module connected successfully.
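The tell-tale `http://null/...` in the stack trace means the gateway host was never resolved when DSS ran on port 80, so the client built a URL with a literal null host and netty failed with UnresolvedAddressException. A defensive check before issuing the request would surface the misconfiguration immediately; a minimal sketch (class and method names are ours):

```java
import java.net.URI;
import java.net.URISyntaxException;

// Sketch: guard against the "http://null/..." URLs seen above. If the
// configured gateway host is missing or unresolved, fail fast with a
// clear message instead of letting the connect attempt blow up later.
public class GatewayUrlGuard {
    public static String buildUserInfoUrl(String host, int port) {
        if (host == null || host.isEmpty() || "null".equals(host)) {
            throw new IllegalArgumentException(
                "DSS gateway host is not configured (got: " + host + ")");
        }
        try {
            // Omit the explicit port only when it is the HTTP default, 80.
            String authority = (port == 80) ? host : host + ":" + port;
            return new URI("http", authority,
                           "/api/rest_j/v1/user/userInfo", null).toString();
        } catch (URISyntaxException e) {
            throw new IllegalArgumentException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(buildUserInfoUrl("dss-server", 8080));
        // → http://dss-server:8080/api/rest_j/v1/user/userInfo
    }
}
```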

Visualis has a bug here

The data source is MySQL, not Hive:

1. View_List can query data. (screenshot 1)
2. Widget List shows no data. (screenshot 2)

No matching data in the database-name dropdown of the table-creation wizard

  1. When creating a Hive table in Scripts, there is no database to select.
    (screenshot)

  2. Console log:

vue.esm.js:1897 TypeError: Cannot read property 'value' of undefined
    at setDbList (firstStep.vue:306)
    at s.<anonymous> (firstStep.vue:292)
    at Array.<anonymous> (vue.esm.js:1989)
    at dt (vue.esm.js:1915)
  3. The api/rest_j/v1/datasource/dbs endpoint returns normally.

Failed to create projects

The error message is:

error code(错误码): 90002, error message(错误信息): add scheduler project failed. errCode: 90019, desc: Connection reset, ip: DataSphere-web1, port: 9004, serviceKind: dss-server.

Checked dss server log, found the following error messages:

2019-12-26 11:11:32.099 [ERROR] [qtp585718112-25                         ] c.w.w.d.a.s.a.s.AzkabanSecurityService (80) [login] - Failed to get session: java.net.SocketException: Connection reset
        at java.net.SocketInputStream.read(SocketInputStream.java:210) ~[?:1.8.0_232]
        at java.net.SocketInputStream.read(SocketInputStream.java:141) ~[?:1.8.0_232]
        at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137) ~[httpcore-4.4.7.jar:4.4.7]
        at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153) ~[httpcore-4.4.7.jar:4.4.7]
        at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:282) ~[httpcore-4.4.7.jar:4.4.7]
        at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138) ~[httpclient-4.5.4.jar:4.5.4]
        at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56) ~[httpclient-4.5.4.jar:4.5.4]
        at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259) ~[httpcore-4.4.7.jar:4.4.7]
2019-12-26 11:11:32.106 [ERROR] [qtp585718112-25                         ] c.w.w.d.s.s.i.DWSProjectServiceImpl (166) [createSchedulerProject] - add scheduler project failed, com.webank.wedatasphere.dss.appjoint.exception.AppJointErrorException: errCode: 90019 ,desc: Connection reset ,ip: DataSphere-web1 ,port: 9004 ,serviceKind: dss-server
        at com.webank.wedatasphere.dss.appjoint.scheduler.azkaban.service.AzkabanSecurityService.login(AzkabanSecurityService.java:81) ~[?:?]
        at com.webank.wedatasphere.dss.server.function.FunctionInvoker.projectServiceAddFunction(FunctionInvoker.java:76) ~[dss-server-0.6.0.jar:?]
        at com.webank.wedatasphere.dss.server.service.impl.DWSProjectServiceImpl.createSchedulerProject(DWSProjectServiceImpl.java:161) [dss-server-0.6.0.jar:?]
        at com.webank.wedatasphere.dss.server.service.impl.DWSProjectServiceImpl.addProject(DWSProjectServiceImpl.java:109) [dss-server-0.6.0.jar:?]
        at com.webank.wedatasphere.dss.server.service.impl.DWSProjectServiceImpl$$FastClassBySpringCGLIB$$fe55cc96.invoke(<generated>) [dss-server-0.6.0.jar:?]
        at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204) [spring-core-5.0.7.RELEASE.jar:5.0.7.RELEASE]
        at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:746) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]

Azkaban is running.

tcp        0      0 0.0.0.0:12321           0.0.0.0:*               LISTEN      4789/java
tcp        0      0 0.0.0.0:5005            0.0.0.0:*               LISTEN      4789/java
tcp        0      0 0.0.0.0:8081            0.0.0.0:*               LISTEN      4789/java
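Since the Azkaban ports are listening locally, the next thing to verify is that the host and port configured in dss-server are actually reachable from the DSS machine: "Connection reset" often points at a firewall, a proxy, or a wrong port. A quick TCP reachability probe (self-contained sketch; the local ServerSocket in main merely stands in for Azkaban so the snippet runs anywhere):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

// Sketch: probe whether a TCP endpoint (e.g. the Azkaban address configured
// in dss-server) accepts connections within a timeout.
public class PortProbe {
    public static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(0)) { // stand-in "Azkaban"
            int port = server.getLocalPort();
            System.out.println(reachable("127.0.0.1", port, 1000));
        }
    }
}
```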

Exception when getting Yarn queue information

Yarn queue problem. The spark-enginemanager log file reports the following error:

 ERROR [qtp1670196451-332] com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful 72 apply - error code(错误码): 111006, error message(错误信息): Get the Yarn queue information exception.(获取Yarn队列信息异常). com.webank.wedatasphere.linkis.common.exception.ErrorException: errCode: 111006 ,desc: Get the Yarn queue information exception.(获取Yarn队列信息异常) ,ip: localhost ,port: 9003 ,serviceKind: ResourceManager
	at com.webank.wedatasphere.linkis.common.exception.ExceptionManager.generateException(ExceptionManager.java:52) ~[linkis-common-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.transform.RPCConsumer$$anon$1.toObject(RPCConsumer.scala:67) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.BaseRPCSender$$anonfun$ask$1.apply(BaseRPCSender.scala:87) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.BaseRPCSender.execute(BaseRPCSender.scala:80) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.BaseRPCSender.ask(BaseRPCSender.scala:83) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.resourcemanager.client.ResourceManagerClient.requestResource(ResourceManagerClient.scala:56) ~[linkis-resourcemanager-client-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.enginemanager.impl.ResourceRequesterImpl.request(ResourceRequesterImpl.scala:32) ~[linkis-ujes-enginemanager-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.enginemanager.AbstractEngineManager.requestEngine(AbstractEngineManager.scala:64) ~[linkis-ujes-enginemanager-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.enginemanager.AbstractEngineManager.requestEngine(AbstractEngineManager.scala:42) ~[linkis-ujes-enginemanager-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.enginemanager.EngineManagerReceiver.receiveAndReply(EngineManagerReceiver.scala:104) ~[linkis-ujes-enginemanager-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.enginemanager.EngineManagerReceiver.receiveAndReply(EngineManagerReceiver.scala:97) ~[linkis-ujes-enginemanager-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$anonfun$receiveAndReply$1.apply(RPCReceiveRestful.scala:139) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$anonfun$receiveAndReply$1.apply(RPCReceiveRestful.scala:139) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$anonfun$com$webank$wedatasphere$linkis$rpc$RPCReceiveRestful$$receiveAndReply$1$$anonfun$apply$1.apply(RPCReceiveRestful.scala:134) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$anonfun$com$webank$wedatasphere$linkis$rpc$RPCReceiveRestful$$receiveAndReply$1$$anonfun$apply$1.apply(RPCReceiveRestful.scala:134) ~[linkis-cloudRPC-0.9.1.jar:?]
	at scala.Option.map(Option.scala:146) ~[scala-library-2.11.8.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$anonfun$com$webank$wedatasphere$linkis$rpc$RPCReceiveRestful$$receiveAndReply$1.apply(RPCReceiveRestful.scala:134) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$anonfun$com$webank$wedatasphere$linkis$rpc$RPCReceiveRestful$$receiveAndReply$1.apply(RPCReceiveRestful.scala:130) ~[linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.server.package$.catchMsg(package.scala:57) [linkis-module-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.server.package$.catchIt(package.scala:89) [linkis-module-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful.com$webank$wedatasphere$linkis$rpc$RPCReceiveRestful$$receiveAndReply(RPCReceiveRestful.scala:130) [linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful.receiveAndReply(RPCReceiveRestful.scala:139) [linkis-cloudRPC-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$FastClassBySpringCGLIB$$6973d04a.invoke(<generated>) [linkis-cloudRPC-0.9.1.jar:?]
	at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204) [spring-core-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:746) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.aspectj.MethodInvocationProceedingJoinPoint.proceed(MethodInvocationProceedingJoinPoint.java:88) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP$$anonfun$dealMessageRestful$1.apply(RestfulCatchAOP.scala:39) [linkis-module-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP$$anonfun$dealMessageRestful$1.apply(RestfulCatchAOP.scala:39) [linkis-module-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.server.package$.catchMsg(package.scala:57) [linkis-module-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.server.package$.catchIt(package.scala:89) [linkis-module-0.9.1.jar:?]
	at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP.dealMessageRestful(RestfulCatchAOP.scala:38) [linkis-module-0.9.1.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethodWithGivenArgs(AbstractAspectJAdvice.java:644) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethod(AbstractAspectJAdvice.java:633) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.aspectj.AspectJAroundAdvice.invoke(AspectJAroundAdvice.java:70) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:174) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:688) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at com.webank.wedatasphere.linkis.rpc.RPCReceiveRestful$$EnhancerBySpringCGLIB$$fda29307.receiveAndReply(<generated>) [linkis-cloudRPC-0.9.1.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
	at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:205) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:309) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:315) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:297) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.internal.Errors.process(Errors.java:267) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:292) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1139) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:460) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:386) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:334) [jaxrs-ri-2.21.jar:2.21.]
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:221) [jaxrs-ri-2.21.jar:2.21.]
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:865) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1655) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.websocket.server.WebSocketUpgradeFilter.doFilter(WebSocketUpgradeFilter.java:215) [websocket-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at com.webank.wedatasphere.linkis.server.security.SecurityFilter.doFilter(SecurityFilter.scala:100) [linkis-module-0.9.1.jar:?]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:158) [spring-boot-actuator-2.0.3.RELEASE.jar:2.0.3.RELEASE]
	at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:126) [spring-boot-actuator-2.0.3.RELEASE.jar:2.0.3.RELEASE]
	at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:111) [spring-boot-actuator-2.0.3.RELEASE.jar:2.0.3.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.boot.actuate.web.trace.servlet.HttpTraceFilter.doFilterInternal(HttpTraceFilter.java:90) [spring-boot-actuator-2.0.3.RELEASE.jar:2.0.3.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:109) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) [jetty-security-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1317) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473) [jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1219) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.Server.handle(Server.java:531) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:352) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260) [jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:281) [jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:102) [jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118) [jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:762) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:680) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]

How can this be solved?

shareNodeFlowTuning bug

After the node referenced by a sendemail node's sending item is deleted, publishing the project reports an error: `NoSuchElementException: No value present`.
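The message `No value present` is exactly what `java.util.Optional.get()` throws on an empty `Optional`, which suggests the publish step looks up the referenced node without checking that it still exists. The sketch below is a minimal, hypothetical illustration of that failure mode and a safer alternative; `findNode`, the node ids, and the `List<String>` node model are all invented for the example and are not DSS's actual API.

```java
import java.util.List;
import java.util.NoSuchElementException;
import java.util.Optional;

public class ShareNodeLookup {

    // Hypothetical helper: find a node by id in the flow's node list.
    static Optional<String> findNode(List<String> nodes, String id) {
        return nodes.stream().filter(n -> n.equals(id)).findFirst();
    }

    public static void main(String[] args) {
        // "sendemail_3" still references "shell_2"... but suppose the
        // referenced node was deleted from the flow before publishing:
        List<String> nodes = List.of("sql_1", "shell_2");

        try {
            // Calling get() on an empty Optional throws
            // NoSuchElementException with the message "No value present".
            String node = findNode(nodes, "deleted_node").get();
            System.out.println("found: " + node);
        } catch (NoSuchElementException e) {
            System.out.println(e.getMessage());
        }

        // Safer: substitute a descriptive value (or fail with a clear
        // message via orElseThrow) instead of calling get() blindly.
        String safe = findNode(nodes, "deleted_node")
                .orElse("<missing node: deleted_node>");
        System.out.println(safe);
    }
}
```

A fix along these lines would validate that every node referenced by a sendemail sending item still exists before publishing, and report the missing node id rather than letting the bare `NoSuchElementException` surface.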
