lensesio / kafka-connect-ui

Web tool for Kafka Connect

Home Page: http://lenses.io/product/features

License: Other

JavaScript 61.00% HTML 26.11% CSS 10.29% Shell 1.93% Dockerfile 0.66%
kafka kafka-connect elasticsearch cassandra s3 documentdb redis jms mqtt hdfs

kafka-connect-ui's Introduction

kafka-connect-ui

Join the chat at https://gitter.im/Landoop/support

This is a web tool for Kafka Connect, for setting up and managing connectors across multiple Connect clusters.

Live Demo

kafka-connect-ui.demo.lenses.io

Run standalone with docker

docker run --rm -it -p 8000:8000 \
           -e "CONNECT_URL=http://connect.distributed.url" \
           landoop/kafka-connect-ui

The CONNECT_URL can be a comma-separated list of Connect worker endpoints. E.g: CONNECT_URL=http://connect.1.url,http://connect.2.url

Additionally, you can assign custom names to your Connect clusters by appending a semicolon and the cluster name after each endpoint URL. E.g:

"CONNECT_URL=http://connect.1.url;dev cluster,http://connect.2.url;production cluster"
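For illustration, such a value could be parsed into cluster entries along these lines (a hypothetical sketch, not the UI's actual parsing code; the fallback naming for unnamed clusters is an assumption):

```javascript
// Hypothetical sketch: split a CONNECT_URL value into { url, name } pairs.
// Entries are comma-separated; an optional cluster name follows the URL
// after a semicolon. The "cluster-N" fallback name is an assumption.
function parseConnectUrl(value) {
  return value.split(',').map(function (entry, i) {
    var parts = entry.trim().split(';');
    return {
      url: parts[0],
      name: parts[1] || 'cluster-' + (i + 1)
    };
  });
}

var clusters = parseConnectUrl(
  'http://connect.1.url;dev cluster,http://connect.2.url;production cluster'
);
// clusters[0] -> { url: 'http://connect.1.url', name: 'dev cluster' }
```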

The web UI will be available at localhost:8000

Build from source

git clone https://github.com/Landoop/kafka-connect-ui.git
cd kafka-connect-ui
npm install -g bower http-server
npm install
http-server -p 8080 .

The web UI will be available at localhost:8080

Nginx config

If you use nginx to serve this UI, let Angular manage routing with:

    location / {
        try_files $uri $uri/ /index.html =404;
        root /folder-with-kafka-connect-ui/;
    }

Setup connect clusters

Use multiple Kafka Connect clusters in env.js:

var clusters = [
   {
     NAME: "prod", // unique name is required
     KAFKA_CONNECT: "http://kafka-connect.prod.url", // required
     KAFKA_TOPICS_UI: "http://kafka-topics-ui.url", // optional
     KAFKA_TOPICS_UI_ENABLED: true, // optional
     COLOR: "#141414" // optional
   },
   {
     NAME: "dev",
     KAFKA_CONNECT: "http://kafka-connect.dev.url",
     KAFKA_TOPICS_UI_ENABLED: false
   },
   {
     NAME: "local",
     KAFKA_CONNECT: "http://kafka-connect.local.url"
   }
]

  • Use KAFKA_TOPICS_UI and KAFKA_TOPICS_UI_ENABLED to navigate to the relevant topic when you have kafka-topics-ui installed.
  • Use COLOR to set a different header color for each configured cluster.

Supported Connectors

For our 25+ stream-reactor Kafka connectors we ship a metadata template in supported-connectors.js. In any case, the UI will show all connectors present in your classpath, along with the required fields to set them up.
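As a rough illustration, an entry in such a metadata template might pair a connector class with display information. The field names below are assumptions for illustration only, not the actual schema of supported-connectors.js:

```javascript
// Illustrative only: the field names here ("class", "name", "sink") are
// assumptions about the metadata shape, not the real supported-connectors.js
// schema. The connector class is a real stream-reactor class.
var supportedConnectors = [
  {
    "class": "com.datamountaineer.streamreactor.connect.redis.sink.RedisSinkConnector",
    name: "Redis",  // display name shown in the UI
    sink: true      // sink vs. source connector
  }
];
```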

Changelog

Here

License

The project is licensed under the BSL license.

Relevant Projects

  • schema-registry-ui, View, create, evolve and manage your Avro Schemas on your Kafka cluster
  • kafka-topics-ui, UI to browse Kafka data and work with Kafka Topics
  • fast-data-dev, Docker for Kafka developers (schema-registry,kafka-rest,zoo,brokers,landoop)
  • Landoop-On-Cloudera, Install and manage your Kafka streaming platform on your Cloudera CDH cluster

www.landoop.com

kafka-connect-ui's People

Contributors

andmarios, antwnis, asdf2014, cameronbraid, ccl0326, chdask, efstathiadisd, hyunsanghan, isopropylcyanide, jglambed, jmwilli25, jocelyndrean, lucatiozzo91, marvin-roesch, michaelandrepearce, robvadai


kafka-connect-ui's Issues

Tag version in docker hub

Hi,

This is feedback for all your repos: it seems you only use the latest tag for all images, whereas the product evolves between versions. It would be good to tag each release.

Thanks,
Stephane

Support editable JSON on connector creation

Currently, a properties-based format is converted to JSON on connector creation. The lack of JSON support is also inconsistent with the edit functionality, which uses JSON.
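The conversion being described roughly amounts to the following (a minimal sketch, assuming a simple key=value parser; not the UI's actual implementation):

```javascript
// Minimal sketch of turning key=value properties text into the JSON payload
// that Kafka Connect's POST /connectors endpoint expects. Not the UI's
// actual code; blank lines and '#' comments are skipped as an assumption.
function propertiesToConnectorJson(text) {
  var config = {};
  text.split('\n').forEach(function (line) {
    line = line.trim();
    if (!line || line.charAt(0) === '#') return; // skip blanks and comments
    var eq = line.indexOf('=');
    if (eq === -1) return;
    config[line.slice(0, eq).trim()] = line.slice(eq + 1).trim();
  });
  return { name: config.name, config: config };
}
```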

Source plugin and its connectors showing as sink

kafka-connect-ui version: 0.9.3
kafka-connect version: 1.0.0

I am using a mysql connect plugin (io.debezium.connector.mysql.MySqlConnector) which registers as the 'source' type in connect, as shown in the output of /connector-plugins:

[
  {
    "class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "type": "sink",
    "version": "3.2.0"
  },
  {
    "class": "io.debezium.connector.mysql.MySqlConnector",
    "type": "source",
    "version": "0.7.0"
  },
  {
    "class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
    "type": "sink",
    "version": "1.0.0"
  },
  {
    "class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "type": "source",
    "version": "1.0.0"
  }
]

However, in the UI it is displayed as a 'sink' plugin, and its connectors are displayed as 'sink' connectors:

[screenshots: plugin and connectors shown as sinks]

This is also despite the stored 'type' of the connector(s) being 'source':

{
  "name": "sensors",
  "config": {
    ...
  },
  "tasks": [
    {
      "connector": "sensors",
      "task": 0
    }
  ],
  "type": "source"
}

Not a huge issue, but I thought I'd pass it along. Let me know if you need more info. Thanks!

Connect-UI doesn't seem to be working

I am running Kafka, Connect and ZK on Mac OSX. I tried running the below:

docker run --rm -it -p 8000:8000 \
>            -e "CONNECT_URL=http://connect.distributed.url" \
>            landoop/kafka-connect-ui
Unable to find image 'landoop/kafka-connect-ui:latest' locally
latest: Pulling from landoop/kafka-connect-ui

3690ec4760f9: Already exists
7650bdbef93c: Pull complete
fb0f52d6cb77: Pull complete
04974ce5c127: Pull complete
450859f253ef: Pull complete
ff9e1d072f6d: Pull complete
Digest: sha256:fc74fd346133f58155f5362f0b2c75d6fe176895c7fb05021f0d6178be3243c4
Status: Downloaded newer image for landoop/kafka-connect-ui:latest

Enabling proxy because Connect doesn't send CORS headers yet.

Activating privacy features... done.
http://0.0.0.0:8000

I visited http://0.0.0.0:8000 and this is what I saw. What am I missing? Please advise.

[screenshot]

JDBC source connector error

I'm trying to get data from an Oracle DB into Kafka topics using the JDBC source connector.

1> I've selected the JDBC source connector from the available connectors through connect UI (Landoop)
Here are my properties :
name=JDBC-source-connector-demo
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.password= oracle12
connection.url= jdbc:oracle:thin:@1.1.1.1:33333/ABCDE
connection.user= ORACLE_LA_USR
table.types=VIEW
query=select * from STUDENTS123
mode=timestamp
timestamp.column.name=SYSTEM_N
topic.prefix=JDBC_DB_TOPIC_DEMO
batch.max.rows= 100

transforms=createKey,extract
transforms.createKey.type=org.apache.kafka.connect.transforms.ValueToKey
transforms.createKey.fields=ROLL_N
transforms.extract.type=org.apache.kafka.connect.transforms.ExtractField$Key
transforms.extract.field=ROLL_N

2> After pasting the above properties, I see the following errors in the connect UI

Invalid value java.sql.SQLException: No suitable driver found for jdbc:oracle:thin:@1.1.1.1:33333/ABCDE for configuration Couldn't open connection to jdbc:oracle:thin:@1.1.1.1:33333/ABCDE
Invalid value java.sql.SQLException: No suitable driver found for jdbc:oracle:thin:@1.1.1.1:33333/ABCDE for configuration Couldn't open connection to jdbc:oracle:thin:@1.1.1.1:33333/ABCDE

3> Do I need to place ojdbc.jar in some path? If yes, could you please let me know where?
I believe ojdbc.jar is already packaged with Confluent (which in turn is packaged as part of Landoop, I believe).

My Landoop kafka connect UI version -

Kafka Connect : /api/kafka-connect
Kafka Connect Version : 0.11.0.1-cp1
Kafka Connect UI Version : 0.9.4

4> And I'm not able to see the connectors in the connect UI even after starting the kafka cluster.

Redis - 'topics.regex' property required when 'topics' is set

Kafka-connect version is 1.1.1-cp1.
I run kafka-connect-ui from source. This is the Redis connector configuration:

name=RedisSinkConnector
connector.class=com.datamountaineer.streamreactor.connect.redis.sink.RedisSinkConnector
topics=TopicName_RedisSinkConnector
tasks.max=1
connect.redis.host=redis-1.m1.example.cloud
connect.redis.port=6379
connect.redis.kcql=INSERT INTO TABLE1 SELECT * FROM redis-topic

The UI requires topics.regex to be set, even though topics and topics.regex are mutually exclusive options.

[screenshot]

But it works fine when kafka-connect version is 1.0.0-cp1

Correct loading sources, incorrect loading sinks

With the latest kafka-connect-ui docker (2018-03-21):

loading the sources is OK (when creating a new Kafka Connect Source)

[screenshot: source loads OK]

but results in an empty (class) when creating a new sink.

This also prevents the "CREATE" button from activating when putting a new configuration in the input field (the validation of the input is broken).

[screenshot: empty sink class]

I also get the following HTTP 500 error on /api/cluster/connector-plugins/com.datamountaineer.streamreactor.connect.mqtt.sink.MqttSinkConnector/config/validate (probably due to the empty properties template):
{error_code: 500, message: "Must configure one of topics or topics.regex"} error_code : 500 message : "Must configure one of topics or topics.regex"

[Improvement] Reload Caddy Server on Config Change

We use the connect ui in a Kubernetes setup where a sidecar of the connect-ui is notified when a new connect cluster joins. This sidecar updates then the caddy server configuration (mostly proxy settings).

Unfortunately, the caddy server does not restart automatically in case of config changes.

Proposal: use a tool like inotifywait that listens for changes to the Caddy config and reloads the server using "pkill -USR1 caddy".

If you consider this useful as well, I could create a PR for this.

Cant integrate the kafka-connect-ui

I have a docker-compose file with the services: zookeeper, broker, schema_registry, schema_registry_ui, rest_proxy, topics_ui, connect.
If I use docker compose up -d, all services launch and the schema-registry-ui and kafka-topics-ui work fine.
Then I run the command:

docker run --rm -d -it -p 8002:8000 -e "CONNECT_URL=http://<my-remote-ip>:8083" landoop/kafka-connect-ui

to launch the connect-UI. The command works fine and the connect-UI is reachable. Now I want to integrate this command into the docker-compose.yaml file. I have tried this:

  connect_ui:
    image: landoop/kafka-connect-ui
    hostname: connect_ui
    ports:
      - "8002:8000"
    environment:
      CONNECT_URL: "http://kafka_connect:8083"
    depends_on:
      - connect

where the connect service is the one using the confluentinc/cp-kafka-connect:3.2.0 image.
kafka_connect is the hostname of the connect service. Here is the code for the connect service:

  connect:
    image: confluentinc/cp-kafka-connect:3.2.0
    container_name: kafka-connect
    hostname: kafka_connect
    depends_on:
      - zookeeper
      - broker
      - schema_registry
    ports:
      - "8083:8083"
    restart: always
    environment:
      CONNECT_BOOTSTRAP_SERVERS: 'broker:9092'
      CONNECT_REST_ADVERTISED_HOST_NAME: connect
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: compose-connect-group
      CONNECT_CONFIG_STORAGE_TOPIC: docker-connect-configs
      CONNECT_OFFSET_STORAGE_TOPIC: docker-connect-offsets
      CONNECT_STATUS_STORAGE_TOPIC: docker-connect-status
      CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: http://schema_registry:8081
      CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: http://schema_registry:8081
      CONNECT_INTERNAL_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_INTERNAL_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_ZOOKEEPER_CONNECT: "zookeeper:2181"
However, it is not working properly and a connectivity error appears. Any suggestion is welcome.

Overwrite the CORS header with a docker-compose file

Having the following docker-compose file:

  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    restart: unless-stopped
    ports:
      - "2181:2181"
    environment:
      - ZOOKEEPER_CLIENT_PORT=2181
      - ZOOKEEPER_TICK_TIME=2000

  kafka:
    image: confluentinc/cp-kafka:latest
    ports:
      - "59092:9092"
    environment:
      - KAFKA_BROKER_ID= 1
      - KAFKA_ZOOKEEPER_CONNECT= zookeeper:2181
      - KAFKA_ADVERTISED_LISTENERS= PLAINTEXT://kafka:9092
      - KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR= 1
    depends_on:
      - zookeeper

  schema_registry:
    image: confluentinc/cp-schema-registry:latest
    ports:
      - '58081:8081'
    restart: unless-stopped
    environment:
      - SCHEMA_REGISTRY_LISTENERS= http://127.0.0.1:8081
      - SCHEMA_REGISTRY_HOST_NAME= schema
      - SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL= zookeeper:2181
    depends_on:
      - kafka
      - zookeeper

  proxy:
    image: confluentinc/cp-kafka-rest:latest
    ports:
      - '58082:8082'
    environment:
      - KAFKA_REST_ZOOKEEPER_CONNECT= zookeeper:2181
      - KAFKA_REST_SCHEMA_REGISTRY_URL= schema:8081
      - KAFKA_REST_HOST_NAME= proxy
      - KAFKA_REST_LISTENERS= http://127.0.0.1:8082
    depends_on:
      - schema_registry
      - kafka
      - zookeeper

  kafka-ui:
    image: landoop/kafka-topics-ui:latest
    ports:
      - 58000:8000
    environment:
      - KAFKA_REST_PROXY_URL=http://127.0.0.1:8082
      - PROXY=True
    depends_on:
      - proxy
      - kafka

I am getting the well-known CONNECTIVITY ERROR problem.
I am trying to figure out how to overwrite the kafka.rest.conf settings.

Any help?

UI doesn't connect to the connect rest when building from source

I'm trying to start contributing to the project, and to do so I'm trying to run the ui from the source against
a kafka-connect that I run as a docker image on my PC.

These are the connect params:

  docker run -d   --name=kafka-connect   --net=host 
 -e CONNECT_BOOTSTRAP_SERVERS=localhost:29092  
 -e CONNECT_REST_PORT=28082   -e CONNECT_GROUP_ID="quickstart" 
 -e CONNECT_CONFIG_STORAGE_TOPIC="quickstart-config"
 -e CONNECT_OFFSET_STORAGE_TOPIC="quickstart-offsets" 
 -e CONNECT_STATUS_STORAGE_TOPIC="quickstart-status"  
 -e CONNECT_KEY_CONVERTER="org.apache.kafka.connect.json.JsonConverter"   
 -e CONNECT_VALUE_CONVERTER="org.apache.kafka.connect.json.JsonConverter"  
 -e CONNECT_INTERNAL_KEY_CONVERTER="org.apache.kafka.connect.json.JsonConverter" 
 -e CONNECT_INTERNAL_VALUE_CONVERTER="org.apache.kafka.connect.json.JsonConverter"
 -e CONNECT_REST_ADVERTISED_HOST_NAME="localhost" 
 confluentinc/cp-kafka-connect:3.2.1

I also have a Kafka instance running locally as a docker image.
When browsing to http://localhost:28082 it shows the version and commit as expected.

this is the env.js file:

//Change the URLs of the endpoints here
var clusters = [
   {
     NAME:"prod",
     KAFKA_CONNECT: "http://localhost:28082",
     COLOR: "#141414"
   },
   {
     NAME:"dev1",
     KAFKA_CONNECT: "http://kafka-connect.dev.url",
     KAFKA_TOPICS_UI_ENABLED: false
   }
]

Then I start the server with http-server, but it still shows the cluster as not responding.

Of course, I only want the first cluster to be available.

Number of workers on UI

Hi

Can you please add another box that shows how many workers are live, and how the tasks are distributed between them?

Thanks, I think this will be very helpful and improve the app's visibility.

D.

Carriage return in properties should not be interpreted

In some cases, when creating a new connector from the editor and copy/pasting from Windows Notepad (or any tool that encodes new lines as \r\n), the curl command will contain \r in property values.

For example, by just copy/pasting the default file configuration, we get the following config

name=file-connector2
topic=mobile.services.catalog.enteruniverse
tasks.max=1
file=/var/log/ansible-confluent/connect.log
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector

which converts to

cat << EOF > file-connector.json
{
  "name": "file-connector\r",
  "config": {
    "topic": "kafka-connect-logs\r",
    "tasks.max": "1\r",
    "file": "/var/log/ansible-confluent/connect.log\r",
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector"
  }
}
EOF
curl -X POST -H "Content-Type: application/json" -H "Accept: application/json" -d @file-connector.json /api/kafka-connect/connectors

Which will create an exception on kafka-connect

[2017-03-01 17:33:06,858] ERROR Failed to reconfigure connector's tasks, retrying after backoff: (org.apache.kafka.connect.runtime.distributed.DistributedHerder)
org.apache.kafka.connect.runtime.rest.errors.ConnectRestException: IO Error trying to forward REST request: No content to map due to end-of-input
 at [Source: UNKNOWN; line: 1, column: 1]
        at org.apache.kafka.connect.runtime.rest.RestServer.httpRequest(RestServer.java:242)
        at org.apache.kafka.connect.runtime.distributed.DistributedHerder$12.run(DistributedHerder.java:864)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: No content to map due to end-of-input
 at [Source: UNKNOWN; line: 1, column: 1]
        at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
        at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3609)
        at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3549)
        at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2650)

Then I assume there is a Kafka Connect bug, because the connector is created (in the connect-configs topic) but not visible from the API, and if I then create a connector without carriage returns, the connector is duplicated in the REST API (I will try to get a consistently reproducible case to submit to the Kafka guys).

Kafka Connect bug aside, the editor should automatically remove \r (or propose to remove it).
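The suggested cleanup could be as simple as normalizing line endings before parsing (a sketch of the idea, not the actual editor code):

```javascript
// Sketch of the suggested fix: normalize Windows line endings before
// parsing pasted properties, so values never carry a trailing \r.
function stripCarriageReturns(text) {
  return text.replace(/\r\n/g, '\n').replace(/\r/g, '\n');
}
```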

Consistency between create and edits

Creating a connector uses a properties format (key=val), while editing a connector uses a JSON format.
It's a bit of a pain for copy and paste; maybe add a toggle to easily switch between the two?

502 Bad Gateway and config issues

Using the latest Confluent Kafka, Schema Registry, etc. versions (Confluent 3.1.2), I'm getting HTTP 502.

This is the same for the Kafka Connect UI and the Schema Registry UI as well.

Snippet of my docker compose file:

schema-registry:
    image: confluentinc/cp-schema-registry
    hostname: schema-registry
    depends_on:
      - zookeeper
      - broker
    ports:
      - '8081:8081'
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL: 'zookeeper:2181'
      SCHEMA_REGISTRY_ACCESS_CONTROL_ALLOW_METHODS: GET,POST,OPTIONS
      SCHEMA_REGISTRY_ACCESS_CONTROL_ALLOW_ORIGIN: '*'

  connect-ui:
    image: landoop/kafka-connect-ui
    hostname: connect-ui
    depends_on:
      - connect
    ports:
      - "8000:8000"
    environment:
      CONNECT_URL: 'http://connect:8083'

  schema-registry-ui:
    image: landoop/schema-registry-ui
    hostname: schema-registry-ui
    depends_on:
      - schema-registry
    ports:
      - "8001:8000"
    environment:
      SCHEMAREGISTRY_URL: 'http://schema-registry:8081'
      PROXY: 'true'

Don't get the kafka-connect-ui running

Hello,
I'm trying to get the kafka-connect-ui running.
So I started the Confluent services (running properly) and called:

sudo docker run --rm -it -p 8000:8000 -e "CONNECT_URL=http://127.0.0.1:8083" landoop/kafka-connect-ui

Then it is stating:

Enabling proxy because Connect doesn't send CORS headers yet and setting up clusters.

Activating privacy features... done.
http://0.0.0.0:8000

When I now look it up in Firefox at 127.0.0.1:8000, it states:

Kafka Connect : /api/kafka-connect-1
Kafka Connect UI Version : 0.9.3
CONNECTIVITY ERROR

Could you maybe give me some help getting it running?
Thanks, Tobias

Click on title resolves without path/basecontext

Hi,

in index.html there is the following code:

<!-- <h5><a href="{{cluster ? '#/cluster/'+cluster.NAME : '#/'}}">KAFKA CONNECT</a></h5> -->
<h5><a href="/#/" style="color:#fff;font-weight: normal;font-size: 14px;">KAFKA CONNECT</a></h5>

This breaks the function, as we use the following URL (with a path/base context) to access kafka-connect-ui:

https://xyz/path

and now it resolves to

https://xyz/#/

instead of

https://xyz/path/#

In kafka-topics-ui it works fine.

Version: 0.9.5

Connect topology shows wrong topics

Using:

  • Kafka Connect Version : 0.11.0.0-cp1
  • Kafka Connect UI Version : 0.9.2
  • ElasticSearch (confluent) Worker

Configuration like this:

{
  "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
  "type.name": "action",
  "flush.timeout.ms": "10000",
  "topics": "action-log",
  "tasks.max": "1",
  "batch.size": "1000",
  "topic.key.ignore": "true",
  "topic.index.map": "action-log:log",
  "name": "my_logger",
  "connection.url": "http://somedomain:9200",
  "linger.ms": "10",
  "topic.schema.ignore": "false"
}

The following topics are recognized:

  • action-log
  • true
  • action-log:log
  • false

Incorrect link to documentation html

In the Kafka Connect Docker (latest build, 2018-03-21), the browser tries to retrieve documentation at:

/src/documentation///lenses.stream/connectors/source/mqtt.html
or
/src/documentation/undefined

while in fact the documentation is still located at:

/src/documentation/com.datamountaineer.streamreactor.connect.mqtt.source.MqttSourceConnector.html

D3 v3 is going to be deleted

Hi, I am using landoop/kafka-connect-ui version 0.9.7 and I am getting the following alert

D3 v3 is going to be deleted. Please migrate to v4 ASAP.
See go/d3v4 for details and resources on migration

Besides, I've seen that there is a new docker image on Docker Hub, built 2 months ago, with the tag latest. It would be good to have a version number; following the sequence it would be 0.9.8.

Thanks!

[docker] Don't overwrite /kafka-connect-ui/env.js if volume mounted

In Compose, I did this

    volumes:
      - ./kafka-connect-env.js:/kafka-connect-ui/env.js

But the file was unexpectedly overwritten.

Putting the file in read-only mode lets the container start, but it seems it's not able to connect to any of the clusters. Just giving a single CONNECT_URL via an environment variable with one of the URLs from the JS file works fine.

Restarting a failed task

The restart button in the UI attempts to restart the connector.

"POST /connectors/ES-sink/restart HTTP/1.1" 204 - 18 (org.apache.kafka.connect.runtime.rest.RestServer:60)

Is it possible to restart an individual task from the UI?

/connectors/ES-sink/tasks/0/restart

can not run

The env.js file exists, but the server returns the following error:

[screenshot]

UI shows sensitive authentication credentials

Hi,

first, thanks for this nice Interface.

While playing around with your UI and the MongoDB sink connector, we had to establish authentication against the database. Once the settings were defined, the UI always reveals the password in plaintext.

I would request that any connector property whose name ends with 'password' be rendered as asterisk characters.

If you like this feature, I would like to submit a pull request in the next couple of days.
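A sketch of the requested masking (illustrative only; the property-name convention and the asterisk mask are assumptions, not existing UI code):

```javascript
// Sketch of the requested behavior: mask the value of any property whose
// key ends with "password" before rendering it in the UI.
function maskSensitiveValues(config) {
  var masked = {};
  Object.keys(config).forEach(function (key) {
    masked[key] = /password$/i.test(key) ? '********' : config[key];
  });
  return masked;
}
```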

Kind regards
Cedric

Deployed docker image doesn't work

I deployed the docker image (latest) in AWS and I am getting the following error:
[screenshot]
My CONNECT_URL env variable is http://kafka-connect-dev:8083/

If I look into the docker container, I got the following:

/kafka-connect-ui # cat env.js 
var clusters = [
   {
     NAME: "default",
     KAFKA_CONNECT: "/api/kafka-connect"
   }
]
# cat /caddy/Caddyfile 
0.0.0.0:8000
tls off

root /kafka-connect-ui
log /access.log
proxy /api/kafka-connect http://kafka-connect-dev:8083 {
    without /api/kafka-connect
}

And connectivity exists from the box to the service:

$ nc -z kafka-connect-dev 8083
Connection to kafka-connect-dev 8083 port [tcp/us-srv] succeeded!

I also don't know where to look to see the app logs.

Create new connector UI fails with sink in Chrome

I can create sources in Chrome, but trying to create a sink fails (it works in Safari).
The console gives me the following errors:

Ace for create-connector loaded
angular.js:12185 PUT http://localhost:8003/api/kafka-connect-1/connector-plugins/io.confluent.connect.jdbc.JdbcSinkConnector/config/validate 500 (Internal Server Error)
angular.js:12185 GET http://localhost:8003/src/documentation/undefined 404 (Not Found)

If I try to type something, I get this error:

TypeError: Cannot read property 'split' of undefined
    at validateConnectorFn (combined.js?rel=182e7e8d7b:2091)
    at combined.js?rel=182e7e8d7b:2069
    at m.$digest (angular.js:17755)
    at m.$apply (angular.js:18021)
    at angular.js:19869
    at e (angular.js:6035)
    at angular.js:6314

I'm running Kafka Connect UI 0.9.6 Docker image (the UI says I'm running 0.9.4).

Any clues?
For now my solution is just using Safari.

Connect UI does not work with a comma-separated list of worker endpoints

Hi,
I've just installed and configured kafka-connect-ui, and successfully ran it with a single worker endpoint in env.js:

var clusters = [ { NAME:"worker-adreply", KAFKA_CONNECT: "http://domain.com:9995", KAFKA_TOPICS_UI: "http://kafka-topics-ui.url", KAFKA_TOPICS_UI_ENABLED: true , COLOR: "#141414" } ]

But the UI gets stuck if I try to configure multiple worker endpoints as a comma-separated array:

[screenshot]

Here is the env.js:
var clusters = [ { NAME:"worker-adreply", KAFKA_CONNECT: "http://domain.com:9995,http://domain.com:9996", KAFKA_TOPICS_UI: "http://kafka-topics-ui.url", KAFKA_TOPICS_UI_ENABLED: true , COLOR: "#141414" } ]

Is there a bug, or am I missing something?

Unable to enter an equal sign into a property value.

I would like to enter the following property:

key=set value=1

There is currently no way to include an equal sign in a value using the property syntax, due to the logic at: https://github.com/Landoop/kafka-connect-ui/blob/master/src/kafka-connect/configuration/editor/configuration-editor-properties.component.js#L72-L76

Could you please add this ability? I don't care what the syntax looks like, I just need the ability to enter an equal sign into the value :)

Thanks for your time!
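The requested behavior amounts to splitting each line on the first '=' only, for example (a sketch of the idea, not the component's actual code):

```javascript
// Sketch of the requested parsing: split each property line on the FIRST
// '=' only, so values may themselves contain equal signs.
function parsePropertyLine(line) {
  var eq = line.indexOf('=');
  if (eq === -1) return null; // not a key=value line
  return { key: line.slice(0, eq).trim(), value: line.slice(eq + 1).trim() };
}
```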
