
confluentinc / cp-demo


Confluent Platform Demo including Apache Kafka, ksqlDB, Control Center, Schema Registry, Security, Schema Linking, and Cluster Linking

License: Apache License 2.0

Shell 83.46% Makefile 0.97% Java 14.12% JSONiq 0.23% Dockerfile 1.21%
kafka demo ksql security ssl sasl connect confluent confluent-platform ksqldb

cp-demo's Introduction

Kafka Event Streaming Applications

This example and accompanying tutorial show users how to deploy an Apache Kafka® event streaming application using ksqlDB and Kafka Streams for stream processing. All the components in the Confluent Platform have security enabled end-to-end. Run the example with the tutorial.


Overview

The use case is a Kafka event streaming application that processes real-time edits to real Wikipedia pages. Wikimedia's EventStreams publishes a continuous stream of real-time edits happening to real wiki pages. Using Kafka Connect, a Kafka source connector kafka-connect-sse streams the raw messages from the server-sent events (SSE), a custom Kafka Connect transform kafka-connect-json-schema transforms these messages, and the transformed messages are written to a Kafka cluster. This example uses ksqlDB and a Kafka Streams application for stream processing. A Kafka sink connector kafka-connect-elasticsearch then streams the data out of Kafka, and it is materialized into Elasticsearch for analysis by Kibana. All data uses Confluent Schema Registry and Avro, and Confluent Control Center manages and monitors the deployment.


Documentation

You can find the documentation for running this example and its accompanying tutorial at https://docs.confluent.io/platform/current/tutorials/cp-demo/docs/index.html.

Additional Examples

For additional examples that showcase streaming applications within an event streaming platform, please refer to the examples GitHub repository.

cp-demo's People

Contributors

andrewegel, awalther28, brianstrauch, cchristous, chuck-confluent, cjmatta, confluentjenkins, confluentsemaphore, confluenttools, dnozay, elismaga, gracechensd, javabrett, jeqo, jimgalasyn, joel-hamill, jssnipes, kelvinl3, maxzheng, patrick-premont, rmoff, robcowart, rspurgeon, sdandu-gh, theturtle32, tobydrake7, vdesabou, xiangxin72, xli1996, ybyzek


cp-demo's Issues

ERROR: Service 'connect' failed to build: invalid reference format

Building connect
Step 1/12 : ARG CP_VERSION
Step 2/12 : FROM confluentinc/cp-server-connect-base:$CP_VERSION
ERROR: Service 'connect' failed to build: invalid reference format

image: localbuild/connect:${CONFLUENT_DOCKER_TAG}-${CONNECTOR_VERSION}
I don't know what CONNECTOR_VERSION should be, and I couldn't find it in the .env file. REPOSITORY was also unset; after adding REPOSITORY=confluentinc it works fine. I also searched for localbuild/connect on hub.docker.com and found no matches.
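The image tag is built from environment variables, so the failure can be reproduced (and worked around) by exporting them by hand. This is a sketch; the version values below are illustrative assumptions, and these variables are normally set by the demo's start scripts rather than manually:

```shell
# REPOSITORY, CONFLUENT_DOCKER_TAG and CONNECTOR_VERSION are normally provided
# by the demo's start scripts; the values below are examples only.
export REPOSITORY=confluentinc
export CONFLUENT_DOCKER_TAG=7.1.1
export CONNECTOR_VERSION=1.0.0
# With the variables unset, the reference collapses to "localbuild/connect:-",
# which Docker rejects as an "invalid reference format".
echo "localbuild/connect:${CONFLUENT_DOCKER_TAG}-${CONNECTOR_VERSION}"
# -> localbuild/connect:7.1.1-1.0.0
```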

Elasticsearch connector fails to start - timing issue?

cp-demo 5.5.1
Startup from scratch using ./scripts/start.sh

elasticsearch-ksqldb connector fails to start. Worker log shows

HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/192.168.128.11] failed: Connection refused (Connection refused)

Elasticsearch is running, and if I restart the connector it comes up fine, so I guess this is a timing error.

docker-compose exec connect curl -X POST --cert /etc/kafka/secrets/connect.certificate.pem --key /etc/kafka/secrets/connect.key --tlsv1.2 --cacert /etc/kafka/secrets/snakeoil-ca-1.crt -u superUser:superUser https://connect:8083/connectors/elasticsearch-ksqldb/tasks/0/restart

After the restart it comes up fine:

docker-compose exec connect curl -X GET --cert /etc/kafka/secrets/connect.certificate.pem --key /etc/kafka/secrets/connect.key --tlsv1.2 --cacert /etc/kafka/secrets/snakeoil-ca-1.crt -u superUser:superUser https://connect:8083/connectors/elasticsearch-ksqldb/status | jq -r ".connector.state, .tasks[].state"
RUNNING
RUNNING

To avoid this problem, the startup script could poll Elasticsearch for readiness before submitting the connector.
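Sketched as a small bash helper, such a readiness poll could look like this (the function name, timeout, and usage are assumptions, not the demo's actual code):

```shell
# Poll a URL until it answers, up to a maximum number of attempts.
wait_for_url() {
  local url=$1 max_attempts=${2:-30} attempt=0
  until curl -fs --max-time 2 -o /dev/null "$url"; do
    attempt=$((attempt + 1))
    if [ "$attempt" -ge "$max_attempts" ]; then
      echo "gave up waiting for $url after $max_attempts attempts" >&2
      return 1
    fi
    sleep 1
  done
}
# usage (hypothetical): wait_for_url http://elasticsearch:9200 60 and only
# then submit the elasticsearch-ksqldb connector
```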

INFO [SocketServer brokerId=0] Failed authentication with /10.244.0.1 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)

Hello,

We use confluentinc/cp-kafka and the Kafka version is 5.4.0-ccs.

My use case is inter-broker communication via PLAINTEXT and producer/consumer traffic via SASL/OAUTHBEARER, so I have the following configuration in my Helm chart values:

  "zookeeper.sasl.enabled": false
  # Disable hostname verification, default is https.
  "ssl.endpoint.identification.algorithm":
  "inter.broker.listener.name": PLAINTEXT
  "listener.name.external.sasl.enabled.mechanisms": OAUTHBEARER
  "listener.name.external.oauthbearer.sasl.login.callback.handler.class": oracle.insight.common.kafka.security.OAuthBearerSignedLoginCallbackHandler
  "listener.name.external.oauthbearer.sasl.server.callback.handler.class": oracle.insight.common.kafka.security.OAuthBearerSignedValidatorCallbackHandler
  "listener.security.protocol.map": PLAINTEXT:PLAINTEXT,EXTERNAL:SASL_PLAINTEXT
  "listener.name.external.oauthbearer.sasl.jaas.config": org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required signedLoginStringClaim_ocid=insightAdmin signedLoginKeyServiceClass=oracle.insight.common.security.SMSKeyService signedValidatorKeyServiceClass=oracle.insight.common.security.SMSKeyService;
  "advertised.listeners": EXTERNAL://kafka-$((${KAFKA_BROKER_ID})).mydomain:$((${KAFKA_OUTSIDE_PORT} + ${KAFKA_BROKER_ID}))

With this configuration, when the Kafka broker is provisioned in Kubernetes, I continuously get the log lines below. This is critical for our release in a few days. Please help.

[2020-01-30 17:23:55,228] INFO [SocketServer brokerId=0] Failed authentication with /10.244.0.1 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)
[2020-01-30 17:23:55,633] INFO [SocketServer brokerId=0] Failed authentication with /10.244.0.1 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)
[2020-01-30 17:23:55,989] INFO [SocketServer brokerId=0] Failed authentication with /10.244.0.1 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)

Thank you.

Add validation for docker version

There was a breaking change in Docker that means some of the docker-compose exec ... | <cmd> invocations won't work with older versions of Docker. We should add a validation check in scripts/helper.functions.sh to verify that the user has a new enough version of the Docker engine.
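A possible shape for such a check, sketched here (the function names and the minimum version are placeholder assumptions, not researched values):

```shell
# Compare dotted version strings using sort -V; returns 0 if $1 >= $2.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Hypothetical guard for scripts/helper.functions.sh: fail early if the
# local Docker client is older than some required minimum.
check_docker_version() {
  local min=$1 current
  current=$(docker version --format '{{.Client.Version}}') || return 1
  if ! version_ge "$current" "$min"; then
    echo "ERROR: Docker $current is older than the required $min" >&2
    return 1
  fi
}
```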

cp-demo - elasticsearch bootstrap checks failed

I got this error on my Ubuntu VM: elasticsearch_1 | ERROR: [1] bootstrap checks failed, so Elasticsearch and Kibana were not healthy.

I resolved it by adding one line to docker-compose.yml (under elasticsearch/environment):
discovery.type: "single-node"

Now everything is OK at startup (checked with docker ps -a).

HTH
phil

Customer is trying to run cp-demo from behind a proxy and it fails.

The problem is that the new confluent CLI tries to check for updates, and this breaks the scripts:

[WARN ] unable to get available versions: Get "https://s3-us-west-2.amazonaws.com/confluent.cloud?prefix=confluent-cli/binaries/": dial tcp 52.218.205.24:443: i/o timeout

Kind Regards,

David

Java Version Check before Generating Java Key Stores

Description

My "default" Java was Java 11/16 (something NOT Java 8). This generates Java keystores with an algorithm that the Java 8 inside the containers doesn't understand. This is the exception I saw the Kafka brokers throwing:

kafka1            | Caused by: org.apache.zookeeper.common.X509Exception$KeyManagerException: java.io.IOException: Integrity check failed: java.security.NoSuchAlgorithmException: Algorithm HmacPBESHA256 not available
kafka1            | 	at org.apache.zookeeper.common.X509Util.createKeyManager(X509Util.java:417)
kafka1            | 	at org.apache.zookeeper.common.X509Util.createSSLContextAndOptions(X509Util.java:334)
kafka1            | 	... 22 more
kafka1            | Caused by: java.io.IOException: Integrity check failed: java.security.NoSuchAlgorithmException: Algorithm HmacPBESHA256 not available
kafka1            | 	at java.base/sun.security.pkcs12.PKCS12KeyStore.engineLoad(PKCS12KeyStore.java:2167)
kafka1            | 	at java.base/sun.security.util.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:222)
kafka1            | 	at java.base/java.security.KeyStore.load(KeyStore.java:1479)
kafka1            | 	at org.apache.zookeeper.common.StandardTypeFileKeyStoreLoader.loadKeyStore(StandardTypeFileKeyStoreLoader.java:48)
kafka1            | 	at org.apache.zookeeper.common.X509Util.createKeyManager(X509Util.java:406)
kafka1            | 	... 23 more
kafka1            | Caused by: java.security.NoSuchAlgorithmException: Algorithm HmacPBESHA256 not available
kafka1            | 	at java.base/javax.crypto.Mac.getInstance(Mac.java:191)
kafka1            | 	at java.base/sun.security.pkcs12.PKCS12KeyStore.engineLoad(PKCS12KeyStore.java:2145)
kafka1            | 	... 27 more

If I tear down, destroy the certs & keystores created, then set JAVA_HOME=/path/to/a/java/8/Home, and re-run the start.sh script, then things work. A couple of suggested improvements:

  • Do a Java version check and print at least a warning that using a non-Java-8 JVM isn't advisable.
  • Enforce an HMAC algorithm that Java 8 supports when creating the Java keystores.
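A minimal sketch of the first suggestion, assuming a bash helper that extracts the major version from a `java -version` style string (the function name and usage are illustrative, not the demo's actual code):

```shell
# Map version strings to a Java major version:
#   "1.8.0_292" -> 8, "11.0.15" -> 11, "16" -> 16
java_major() {
  local v=$1
  case $v in
    1.*) v=${v#1.}; echo "${v%%[._]*}" ;;
    *)   echo "${v%%.*}" ;;
  esac
}
# hypothetical guard before generating keystores:
# [ "$(java_major "$version_string")" -eq 8 ] ||
#   echo "WARNING: keystores generated by a non-Java-8 keytool may use HmacPBESHA256" >&2
```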

Troubleshooting
Validate every step in the troubleshooting section: https://docs.confluent.io/platform/current/tutorials/cp-demo/docs/index.html#troubleshooting => Nothing about java.security.NoSuchAlgorithmException

Identify any existing issues that seem related: https://github.com/confluentinc/cp-demo/issues?q=is%3Aissue => Searched for NoSuchAlgorithmException or HmacPBESHA256 found nothing like this.

If applicable, please include the output of:

  • docker-compose logs <container name> => see above
  • any other relevant commands => None

Environment

  • GitHub branch: all branches - I've seen this occur as far back as CP 5.2
  • Operating System: MacOS
  • Version of Docker: irrelevant.
  • Version of Docker Compose: irrelevant

After CP 5.2.* changes startup is broken

After the 5.2.0-SNAPSHOT bump following the instructions to do

git clone https://github.com/confluentinc/cp-demo

Leads to a system that doesn't start properly because the referenced Docker images don't exist.

Bringing up Docker Compose
Creating network "cp-demo_default" with the default driver
Creating volume "cp-demo_mi3" with default driver
Pulling zookeeper (confluentinc/cp-zookeeper:5.2.0-SNAPSHOT)...
ERROR: manifest for confluentinc/cp-zookeeper:5.2.0-SNAPSHOT not found

It would be better to note that you should explicitly clone a branch (not master) and state which is the most recent "known good" branch.

Huge ass time icons

I cloned this repo (529e469), ran ./scripts/start.sh, opened the control center, and look what I was greeted by:

Screenshot

Time picker

The <svg> element has width="1.3rem" and height="1.3rem" attributes, but relative values don't work in some browsers (Firefox in my case; Chrome works). The issue is described in more detail here:
https://wiki.mozilla.org/SVG:Sizing

I did not find a repository for the control center itself, that's why I'm posting it here.

MDS fails to start when running ./scripts/start.sh (#274): fails even after 120 seconds wait time

I tried what @robcowart did and increased the wait time to 120 seconds for my instance. Still the same result.
My MacBook$ CLEAN=true ./scripts/start.sh
./scripts/start.sh: line 5: /Users/IcyRoad/cp-demo/scripts/env.sh: No such file or directory

WARNING: Did you remember to increase the memory available to Docker to at least 8GB (default is 2GB)? Demo may otherwise not work properly.

Stopping tools ... done
Stopping zookeeper ... done
Stopping openldap ... done
Removing tools ... done
Removing kafka2 ... done
Removing kafka1 ... done
Removing zookeeper ... done
Removing openldap ... done
Removing network cp-demo_default
./scripts/start.sh: line 24: clean_demo_env: command not found

Environment parameters
REPOSITORY=
CONNECTOR_VERSION=
C3_KSQLDB_HTTPS=
CLEAN=true

./scripts/start.sh: line 38: create_certificates: command not found
Creating network "cp-demo_default" with the default driver
Creating openldap ... done
openldap is up-to-date
Creating zookeeper ... done
Creating kafka1 ... done
Creating kafka2 ... done
Creating tools ... done
Waiting up to 120 seconds for MDS to start
.....................
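The cascade of errors above ("clean_demo_env: command not found", "create_certificates: command not found") starts because scripts/env.sh could not be sourced, e.g. when the checkout is on the wrong branch. A hypothetical fail-fast guard could make that obvious immediately (the function name is an assumption):

```shell
# Fail fast if the repository layout doesn't look like a cp-demo release branch.
check_repo_layout() {
  if [ ! -f "$1/scripts/env.sh" ]; then
    echo "scripts/env.sh not found under $1 - check out a release (*-post) branch" >&2
    return 1
  fi
}
# usage (hypothetical): check_repo_layout "$(pwd)" || exit 1
```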

MDS fails to start when running ./scripts/start.sh

Using 6.0.0-post branch results in the following error when attempting to start with ./scripts/start.sh

ws2:cp-demo rob$ ./scripts/start.sh 
Generate keys and certificates used for SSL (see /Users/rob/src/github.com/confluentinc/cp-demo/scripts/security)
Generating a 2048 bit RSA private key
........................................................+++
....+++
writing new private key to 'snakeoil-ca-1.key'
-----
Creating certificates
Created certificates for controlcenter
Created certificates for appSA
Created certificates for client
Created certificates for schemaregistry
Created certificates for connect
Created certificates for restproxy
Created certificates for kafka2
Created certificates for badapp
Created certificates for ksqlDBUser
Created certificates for mds
Created certificates for ksqlDBserver
Created certificates for connectorSA
Created certificates for kafka1
Created certificates for zookeeper
Created certificates for clientListen
Creating certificates completed
Generating public and private keys for token signing
Generating RSA private key, 2048 bit long modulus
.....+++
.............+++
e is 65537 (0x10001)
writing RSA key
Setting insecure permissions on some files in /Users/rob/src/github.com/confluentinc/cp-demo/scripts/security for demo purposes

Creating network "cp-demo_default" with the default driver
Creating openldap ... done
openldap is up-to-date
Creating zookeeper ... done
Creating kafka1    ... done
Creating kafka2    ... done
Creating tools     ... done
Waiting up to 90 seconds for MDS to start
..................ERROR: Failed after 90 seconds. Please troubleshoot and run again. For troubleshooting instructions see https://docs.confluent.io/current/tutorials/cp-demo/docs/index.html#troubleshooting

Problem with network_mode: "host" in docker-compose

Hi,

I'm working with this demo and it is working fine.
However, I'm now creating a new source connector that accesses an API over the internet.
The docker-compose file does not have network_mode: "host" configured for any of the containers defined in it.

I have tried to set network_mode: "host" only on the connect container config, as below:

connect:
    image: confluentinc/cp-kafka-connect:5.3.1
    container_name: connect
    cpus: 0.6
    restart: always
    ports:
      - "8083:8083"
    network_mode: "host"
    depends_on:
      - zookeeper
      - kafka1
      - kafka2
      - schemaregistry
...

Unfortunately, when I run ./scripts/start.sh and check docker logs connect --follow, I get the error below:

[main] ERROR io.confluent.admin.utils.cli.KafkaReadyCommand - Error while running kafka-ready.
org.apache.kafka.common.KafkaException: Failed to create new KafkaAdminClient
	at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:407)
	at org.apache.kafka.clients.admin.AdminClient.create(AdminClient.java:65)
	at io.confluent.admin.utils.ClusterStatus.isKafkaReady(ClusterStatus.java:138)
	at io.confluent.admin.utils.cli.KafkaReadyCommand.main(KafkaReadyCommand.java:150)
Caused by: org.apache.kafka.common.config.ConfigException: No resolvable bootstrap urls given in bootstrap.servers
	at org.apache.kafka.clients.ClientUtils.parseAndValidateAddresses(ClientUtils.java:88)
	at org.apache.kafka.clients.ClientUtils.parseAndValidateAddresses(ClientUtils.java:47)
	at org.apache.kafka.clients.admin.KafkaAdminClient.createInternal(KafkaAdminClient.java:367)
	... 3 more

Can anyone help me configure docker-compose so my source connector can access the internet?

Thank you in advance.

Best regards,

Flávio Oliva

Bootstrap checks failed

My colleague and I are both walking through this demo setup. We are using Ubuntu 18.04 running on a Windows 10 VM. We encountered two issues.

  1. The mounting of the $PWD/scripts/security:/etc/kafka/secrets (line 28) for the zookeeper service in docker-compose.yml was not working until I added myself to the Linux docker group using usermod -aG docker [username]. The container would continuously restart, and the docker logs showed repeated errors when it could not locate zookeeper_jaas.conf.

  2. The elasticsearch service would start and then stop 30 seconds later. We traced that to the bootstrap checks Elasticsearch does at startup. It reported we only had 4096 file descriptors available while it required a minimum of over 200k. The Elasticsearch documentation for this specific version describes adding an environment variable (discovery.type) to indicate running in development mode, which bypasses those bootstrap checks. Added the following:

    environment:
      xpack.security.enabled: "false"
      ES_JAVA_OPTS: "-Xms1g -Xmx1g"
      discovery.type: "single-node"     # <-- added this

and everything is working now. I found that #11 mentioned this before, and it seems that setting was added back in Dec 2017 but is no longer present. I'm not sure why; it could have saved us a couple of hours.
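A pre-flight check along these lines could catch the descriptor limit before Elasticsearch fails its bootstrap checks (the helper and the default threshold are assumptions; recent Elasticsearch versions ask for 65536):

```shell
# Verify the open-file-descriptor limit before starting Elasticsearch.
check_nofile() {
  local required=${1:-65536} current
  current=$(ulimit -n)
  [ "$current" = "unlimited" ] && return 0
  [ "$current" -ge "$required" ]
}
# usage (hypothetical):
# check_nofile 65536 || echo "WARNING: raise ulimit -n before starting Elasticsearch" >&2
```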

Failed to start MDS

Tried to run ./scripts/start.sh to start cp-demo but encountered the following error. Please advise. Thanks.
Waiting up to 120 seconds for MDS to start
........................ERROR: Failed after 120 seconds. Please troubleshoot and run again. For troubleshooting instructions see https://docs.confluent.io/current/tutorials/cp-demo/docs/index.html#troubleshooting
% docker-compose logs kafka1
Attaching to kafka1
kafka1 | ERROR: Did not find SSL certificates in /etc/kafka/secrets/ (did you remember to run ./scripts/start.sh instead of docker-compose up -d?)
% docker-compose ps
Name Command State Ports

kafka1 bash -c if [ ! -f /etc/kaf ... Exit 1
kafka2 bash -c if [ ! -f /etc/kaf ... Exit 1
openldap /container/tool/run --copy ... Up 0.0.0.0:389->389/tcp, 636/tcp
tools /bin/bash Up
zookeeper /etc/confluent/docker/run Up 0.0.0.0:2181->2181/tcp, 0.0.0.0:2182->2182/tcp, 2888/tcp, 3888/tcp

create-certs.sh fails with syntax error

The scripts/start.sh fails, as the certs are not generated, see below related error

 ./scripts/start.sh
WARNING: No swap limit support

Usage:
 kill [options] <pid> [...]

Options:
 <pid> [...]            send signal to every <pid> listed
 -<signal>, -s, --signal <signal>
                        specify the <signal> to be sent
 -l, --list=[<signal>]  list all signal names, or convert one to a name
 -L, --table            list all signal names in a nice table

 -h, --help     display this help and exit
 -V, --version  output version information and exit

For more details see kill(1).
Removing network cpdemo_default
cpdemo_mi3
Generate keys and certificates used for SSL
Generating a 2048 bit RSA private key
................................+++
..................+++
writing new private key to 'snakeoil-ca-1.key'
-----
req: Skipping unknown attribute "S"
./certs-create.sh: line 33: unexpected EOF while looking for matching `)'
./certs-create.sh: line 74: syntax error: unexpected end of file
Bringing up Docker Compose

I will open a PR for the fix

connector problem regarding irc connector

Hello, continuing the demo setup I hit a problem with the IRC connector (when ./scripts_ksql/submit_wikipedia_irc_config.sh is executed).
The docker-compose ps output is OK, and
curl -X GET "http://localhost:8083/connectors"
returns:
[]
But when the script is run it returns "error_code":500; see below:
curl -X POST -H "Content-Type: application/json" --data "{
"name": "wikipedia-irc",
"config": {
"connector.class": "com.github.cjmatta.kafka.connect.irc.IrcSourceConnector",
"transforms": "WikiEditTransformation",
"transforms.WikiEditTransformation.type": "com.github.cjmatta.kafka.connect.transform.wikiedit.WikiEditTransformation",
"transforms.wikiEditTransformation.save.unparseable.messages": true,
"transforms.wikiEditTransformation.dead.letter.topic": "wikipedia.failed",
"irc.channels": "#en.wikipedia,#fr.wikipedia,#es.wikipedia,#ru.wikipedia,#en.wiktionary,#de.wikipedia,#zh.wikipedia,#sd.wikipedia,#it.wikipedia,#mediawiki.wikipedia,#commons.wikimedia,#eu.wikipedia,#vo.wikipedia,#eo.wikipedia,#uk.wikipedia",
"irc.server": "irc.wikimedia.org",
"kafka.topic": "wikipedia.parsed",
"producer.interceptor.classes": "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor",
"tasks.max": "1"
}
}" http://localhost:8083/connectors
{"error_code":500,"message":"Failed to find any class that implements Connector and which name matches com.github.cjmatta.kafka.connect.irc.IrcSourceConnector, available connectors are: PluginDesc{klass=class io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, name='io.confluent.connect.elasticsearch.ElasticsearchSinkConnector', version='3.3.0', encodedVersion=3.3.0, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class io.confluent.connect.hdfs.HdfsSinkConnector, name='io.confluent.connect.hdfs.HdfsSinkConnector', version='3.3.0', encodedVersion=3.3.0, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class io.confluent.connect.hdfs.tools.SchemaSourceConnector, name='io.confluent.connect.hdfs.tools.SchemaSourceConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class io.confluent.connect.jdbc.JdbcSinkConnector, name='io.confluent.connect.jdbc.JdbcSinkConnector', version='3.3.0', encodedVersion=3.3.0, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class io.confluent.connect.jdbc.JdbcSourceConnector, name='io.confluent.connect.jdbc.JdbcSourceConnector', version='3.3.0', encodedVersion=3.3.0, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class io.confluent.connect.s3.S3SinkConnector, name='io.confluent.connect.s3.S3SinkConnector', version='3.3.0', encodedVersion=3.3.0, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class io.confluent.connect.storage.tools.SchemaSourceConnector, name='io.confluent.connect.storage.tools.SchemaSourceConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=sink, typeName='sink', 
location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='0.11.0.0-cp1', encodedVersion=0.11.0.0-cp1, type=source, typeName='source', location='classpath'}"}


best regards
phil
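The 500 response lists the connectors the worker did load, which shows the IRC connector jar is simply not on the worker's plugin path. A quick way to inspect what is loaded is the Connect REST API's /connector-plugins endpoint; the small parsing helper below is a sketch (jq would be cleaner if available):

```shell
# Extract the "class" values from the /connector-plugins JSON response.
connector_classes() {
  grep -o '"class":"[^"]*"' | cut -d'"' -f4
}
# usage:
# curl -s http://localhost:8083/connector-plugins | connector_classes
```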

missing JAAS file properties

#KSQL_PRODUCER_CONFLUENT_MONITORING_INTERCEPTOR_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required \

We're missing the following continuation lines for the interceptors:

              username=\"client\" \
              password=\"client-secret\";"
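Reconstructed from the fragments above, the complete environment entry would look like this (exact placement in the compose file is an assumption):

```yaml
KSQL_PRODUCER_CONFLUENT_MONITORING_INTERCEPTOR_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required \
    username=\"client\" \
    password=\"client-secret\";"
```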

Demo video seems entirely wrong

In the demo video git clone --recursive is used to clone this repository. It shows sub-modules being installed. Then in the cp-demo folder "make clean all" is executed. My experience disagrees completely with this.

  1. When I clone I see no sub-modules.
  2. There is no Makefile in the cp-demo folder.

Kafka Connect: Uncaught exception in REST call to /permissions

Attempting to spin up cp-demo on a Win 10 WSL 2 Ubuntu 20.04 distro. Everything seems to work except Kafka Connect. Launching Control Center in the browser (localhost:9021) while watching the Connect container log shows repeated uncaught exceptions in REST call to /permissions:

[2022-05-25 23:48:57,010] ERROR Uncaught exception in REST call to /permissions (org.apache.kafka.connect.runtime.rest.errors.ConnectExceptionMapper)
javax.ws.rs.NotFoundException: HTTP 404 Not Found
	at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:252)
	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248)
	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244)
	at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
	at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
	at org.glassfish.jersey.internal.Errors.process(Errors.java:244)
	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265)
	at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:234)
	at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680)
	at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394)
	at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346)
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:366)
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:319)
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:550)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:560)
	at io.confluent.common.security.jetty.initializer.ConnectConstraintSecurityHandler.handle(ConnectConstraintSecurityHandler.java:35)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1434)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1349)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234)
	at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:179)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:516)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:400)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:645)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:392)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ssl.SslConnection$DecryptedEndPoint.onFillable(SslConnection.java:555)
	at org.eclipse.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:410)
	at org.eclipse.jetty.io.ssl.SslConnection$2.succeeded(SslConnection.java:164)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
	at java.base/java.lang.Thread.run(Thread.java:829)

Control Center shows 0 active Connect clusters:

image
image
image

Have tried:

  • reinstalling Docker
  • deleting Ubuntu distro and reinstalling
  • tried 3 separate PCs

Same result every time.

Have gone through every step of the troubleshooting doc.

Environment

  • GitHub branch: 7.1.1-post
  • Operating System: Windows 10 21H2 (OS Build 19044.1708) with WSL 2 Ubuntu-20.04
  • Version of Docker: 4.8.2 (79419)
  • Version of Docker Compose: 2.5.1
  • OpenJDK 11.0.15

Network issue

I started the demo on a macOS machine with 4 GB of RAM allocated to Docker.

I got the following error.
ERROR: The logs in control-center container do not show 'Started NetworkTrafficServerConnector' after 300 seconds. Please troubleshoot with 'docker-compose ps' and 'docker-compose logs'.

This assumes a fresh install where the user ran the start.sh script.

A clean maven repository fails

Hi,

This demo is suited for new kafka/confluent users, and that's great! Good job!

However, new kafka/confluent users will not have any Confluent dependencies in their local Maven cache. Thus they will not be able to resolve connect-api-0.11.0.0-cp1.jar, because https://github.com/cjmatta/kafka-connect-transform-wikiedit does not define the Confluent repository.

I proposed cjmatta/kafka-connect-transform-wikiedit#1 to fix that.
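Until that fix lands, one workaround is to add Confluent's Maven repository to the transform project's pom.xml. A sketch; the repository id is arbitrary, and the URL should be checked against the current Confluent packaging docs:

```xml
<!-- Assumption: adding Confluent's Maven repository lets Maven resolve
     the *-cp1 artifacts that are not on Maven Central. -->
<repositories>
  <repository>
    <id>confluent</id>
    <url>https://packages.confluent.io/maven/</url>
  </repository>
</repositories>
```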

Meanwhile, we can do a few things.

To reproduce the problem:

$ echo faking empty maven repository
$ mv ~/.m2/repository ~/.m2/repository-temp
$ make clean all
...
[INFO] BUILD FAILURE
...
[ERROR] Failed to execute goal on project WikiEditTransformation: Could not resolve dependencies for project com.github.cjmatta.kafka.connect.transform.wikiedit:WikiEditTransformation:jar:3.3.0: Could not find artifact org.apache.kafka:connect-api:jar:0.11.0.0-cp1 in central (https://repo.maven.apache.org/maven2) -> [Help 1]
...
$ echo $?
2
$ echo setting back repository
$ rm -rf ~/.m2/repository && mv ~/.m2/repository-temp ~/.m2/repository

Cheers !

Ignored Properties

CONNECT_PRODUCER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor"

When I run docker-compose up, the interceptor doesn't seem to be applied.
[2017-12-13 04:42:40,806] INFO ProducerConfig values:
...
interceptor.classes = null

EOF failing for running KSQL script

@cjmatta is reporting that while this works on mac, it does not work on linux:

docker-compose exec -T ksql-cli ksql http://localhost:8088 <<EOF
run script '/tmp/ksqlcommands';
exit ;
EOF

It appears to just sit at the KSQL CLI prompt ksql>

Note: cp-demo advertises that it has been validated on mac, not linux. However, we should still get this addressed:

  1. Short-term: cp-demo: right way to do this in linux

  2. Long-term: ksql: reinstate the exec functionality confluentinc/ksql#1017

Next step: @cjmatta to provide syntax that does work on linux

running "./scripts/start.sh" fails

Building custom Docker image with Connect version 5.5.0 and connector version 5.5.0
docker build --build-arg CP_VERSION=5.5.0 --build-arg CONNECTOR_VERSION=5.5.0 -t localbuild/connect:5.5.0-5.5.0 -f /Users/binyamingreenberg/my-workspace/cp-demo/scripts/../Dockerfile-confluenthub .
Sending build context to Docker daemon  115.7MB
Step 1/6 : ARG CP_VERSION
Step 2/6 : FROM confluentinc/cp-server-connect-base:$CP_VERSION
 ---> b70f0c2cba1b
Step 3/6 : ARG CONNECTOR_VERSION=5.4.1
 ---> Using cache
 ---> bc103f022980
Step 4/6 : ENV CONNECT_PLUGIN_PATH: "/usr/share/java,/connect-plugins,/usr/share/confluent-hub-components"
 ---> Using cache
 ---> 4f4c12a420f2
Step 5/6 : RUN confluent-hub install --no-prompt confluentinc/kafka-connect-replicator:${CONNECTOR_VERSION}
 ---> Using cache
 ---> f916f60c0621
Step 6/6 : RUN confluent-hub install --no-prompt confluentinc/kafka-connect-elasticsearch:${CONNECTOR_VERSION}
 ---> Running in b27033e7fa97
OCI runtime create failed: container_linux.go:344: starting container process caused "exec: \"/bin/sh\": stat /bin/sh: no such file or directory": unknown
ERROR: Docker image build failed. Please troubleshoot and try again.

No makefile

I ran `make all clean` and it says there is no Makefile. Wondering if there are files missing from the repo.

ERROR Error starting the schema registry (io.confluent.kafka.schemaregistry.rest.SchemaRegistryRestApplication)

I have followed this demo and get the following error.

https://github.com/confluentinc/cp-demo
https://github.com/confluentinc/cp-demo/blob/7.0.1-post/docker-compose.yml

I replaced KSQL_BOOTSTRAP_SERVERS with my own Kafka server and got the following error. What could be the cause of this issue?

[2022-02-08 11:03:09,095] INFO Logging initialized @838ms to org.eclipse.jetty.util.log.Slf4jLog (org.eclipse.jetty.util.log)
[2022-02-08 11:03:09,130] INFO Initial capacity 128, increased by 64, maximum capacity 2147483647. (io.confluent.rest.ApplicationServer)
[2022-02-08 11:03:09,204] INFO Adding listener with HTTP/2: https://0.0.0.0:8085 (io.confluent.rest.ApplicationServer)
[2022-02-08 11:03:09,491] ERROR Error starting the schema registry (io.confluent.kafka.schemaregistry.rest.SchemaRegistryRestApplication)
io.confluent.kafka.schemaregistry.exceptions.SchemaRegistryException:  No listener configured with requested scheme http
  at io.confluent.kafka.schemaregistry.storage.KafkaSchemaRegistry.getSchemeAndPortForIdentity(KafkaSchemaRegistry.java:303)
  at io.confluent.kafka.schemaregistry.storage.KafkaSchemaRegistry.<init>(KafkaSchemaRegistry.java:148)
  at io.confluent.kafka.schemaregistry.rest.SchemaRegistryRestApplication.initSchemaRegistry(SchemaRegistryRestApplication.java:71)
  at io.confluent.kafka.schemaregistry.rest.SchemaRegistryRestApplication.configureBaseApplication(SchemaRegistryRestApplication.java:90)
  at io.confluent.rest.Application.configureHandler(Application.java:271)
  at io.confluent.rest.ApplicationServer.doStart(ApplicationServer.java:245)
  at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
  at io.confluent.kafka.schemaregistry.rest.SchemaRegistryMain.main(SchemaRegistryMain.java:44)
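The error "No listener configured with requested scheme http" suggests the instance expects an http scheme while only an https listener is configured. A common fix (an assumption based on the error message, not verified against this exact setup) is to align the inter-instance protocol with the configured listener:

```yaml
# docker-compose.yml, schemaregistry service — hypothetical fix:
# SCHEMA_REGISTRY_ env vars map to Schema Registry properties
SCHEMA_REGISTRY_LISTENERS: https://0.0.0.0:8085
SCHEMA_REGISTRY_INTER_INSTANCE_PROTOCOL: https
```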

tools token expires and there is no clear path on how to request a new one

Following demo docs security and schema registry steps:

If the components are kept alive for a while (haven't measured exactly how long, but probably an hour or so), the following message is returned:

docker-compose exec tools bash -c "confluent iam rolebinding create \
    --principal User:appSA \
    --role ResourceOwner \
    --resource Subject:users-value \
    --kafka-cluster-id $KAFKA_CLUSTER_ID \
    --schema-registry-cluster-id schema-registry"
Your token has expired. You are now logged out.
Error: You must log in to run that command.

Is there any guidance on how to request a new token on the tools service?

docker-compose fails on elasticsearch/kibana images

Hit this when setting up afresh (i.e., no prior docker images) and isolated it to the docker-compose command in the start script. Specifically, the ES and Kibana versions seem to be problematic?

$ docker-compose up -d
Pulling kibana (docker.elastic.co/kibana/kibana:5.5.2)...
5.5.2: Pulling from kibana/kibana
364f9b7c969a: Verifying Checksum
e49bda868a31: Download complete
19392fbddf76: Download complete
7a6a24f9cca6: Download complete
945295f0f086: Download complete
ad5548b19b09: Download complete
52c4777d1729: Download complete
670929eeaf05: Download complete
568cf9e954fe: Download complete
ERROR: filesystem layer verification failed for digest sha256:364f9b7c969aed6ff05b7ced04b1b53b824386288dffb1e6aec41d5a706dd51e

I bumped both their versions up and it seems to work. Is this a real issue?

HTTP Basic authentication not working

Hello, I've added the environment variables below to the docker-compose yml:

CONTROL_CENTER_REST_AUTHENTICATION_METHOD: 'BASIC'
CONTROL_CENTER_REST_REALM: 'c3'
CONTROL_CENTER_REST_AUTHENTICATION_ROLES: 'Administrators,Restricted'
CONTROL_CENTER_REST_RESTRICTED_ROLES: 'Restricted'
CONTROL_CENTER_OPTS: -Djava.security.auth.login.config=/etc/confluent-control-center/propertyfile.conf

and then when I run the command given at https://docs.confluent.io/current/security/basic-auth.html#kconnect-rest-api it gives this error:
"/tmp/confluent/login.properties: No such file or directory"
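The error indicates the realm file referenced by the JAAS config does not exist inside the container. A hedged sketch of what the two files could look like — the realm name c3 and paths are taken from the config above; the module class and file format are assumptions based on Jetty's PropertyFileLoginModule:

```
# /etc/confluent-control-center/propertyfile.conf
c3 {
  org.eclipse.jetty.jaas.spi.PropertyFileLoginModule required
  file="/tmp/confluent/login.properties";
};

# /tmp/confluent/login.properties  (format: user: password,role[,role...])
admin: admin-secret,Administrators
```

Both files must be mounted into the control-center container for the login config to resolve.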

It would be appreciated if an HTTP basic authentication configuration were provided in the cp-demo GitHub repo as well.

Thanks.

start.sh stops at subject wikipedia.parsed-value (for topic wikipedia.parsed) to be registered in Schema Registry

Hi,

When I run ./scripts/start.sh, it stops at the step below:

Verify wikipedia.parsed topic is populated and schema is registered

MAX_WAIT=120
echo
echo -e "\nWaiting up to $MAX_WAIT seconds for subject wikipedia.parsed-value (for topic wikipedia.parsed) to be registered in Schema Registry"
retry $MAX_WAIT host_check_schema_registered || exit 1

Waiting up to 120 seconds for subject wikipedia.parsed-value (for topic wikipedia.parsed) to be registered in Schema Registry
........................ERROR: Failed after 120 seconds. Please troubleshoot and run again. For troubleshooting instructions see https://docs.confluent.io/current/tutorials/cp-demo/docs/index.html#troubleshooting
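For context, the retry helper called above polls a check function once per second until it succeeds or the timeout elapses. A self-contained sketch of that pattern (an illustration, not the actual code from scripts/helper/functions.sh):

```shell
# retry MAX_WAIT_SECONDS command [args...]
# Polls the command once per second; fails once MAX_WAIT_SECONDS elapse.
retry() {
  local max_wait=$1; shift
  local waited=0
  until "$@"; do
    sleep 1
    waited=$((waited + 1))
    if [ "$waited" -ge "$max_wait" ]; then
      echo "ERROR: Failed after $max_wait seconds." >&2
      return 1
    fi
    printf '.'
  done
  return 0
}
```

In start.sh, `host_check_schema_registered` plays the role of the polled command, so a persistent failure in that check (not the timeout itself) is what needs troubleshooting.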

please refer to attached for detail log.

My environment is Linux CentOS; the JDK and Docker versions are sufficient for the install.
log.txt

thanks.
regards
qx

Too many users in KafkaDevelopers LDAP Group

Here is where users are added to groups: https://github.com/confluentinc/cp-demo/blob/6.2.1-post/scripts/security/ldap_users/20_group_add.ldif

We need to use groups to showcase the power of RBAC. I would suggest two different groups, KafkaDevelopersA and KafkaDevelopersB, with alice and barnie in A and alice and charlie in B. This would make it possible to illustrate the benefits of RBAC a bit better. For example, alice would receive permissions associated with both groups.

I don't see any reason to add the rest of the users to the kafka developers group. Those are all confluent system users who don't need to be in group of developers.
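A sketch of what the split could look like in LDIF — the base DN and attribute names are assumed to follow the repo's existing posixGroup entries, and the gidNumbers are placeholders:

```
dn: cn=KafkaDevelopersA,ou=groups,dc=confluentdemo,dc=io
objectClass: top
objectClass: posixGroup
cn: KafkaDevelopersA
gidNumber: 5001
memberUid: alice
memberUid: barnie

dn: cn=KafkaDevelopersB,ou=groups,dc=confluentdemo,dc=io
objectClass: top
objectClass: posixGroup
cn: KafkaDevelopersB
gidNumber: 5002
memberUid: alice
memberUid: charlie
```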

CP-DEMO: unable to resolve docker endpoint: parse "tcp://Content-Type: application/json": invalid port ": application" after host

Hello,
I used to run cp-demo successfully on my local mac. We are in the process of setting up Confluent Replicator, and I want to use the cp-demo Replicator instance to see some basic (real-time) flows.

I have tried twice, using ./scripts/start.sh. I have enough memory for my docker instance (>8GB)

Output: the listing below shows the containers are running fine, but the script then fails:

Waiting up to 240 seconds for Connect to start
..✔

[+] Running 9/9
⠿ Container zookeeper Running 0.0s
⠿ Container openldap Running 0.0s
⠿ Container kafka2 Running 0.0s
⠿ Container kafka1 Running 0.0s
⠿ Container schemaregistry Running 0.0s
⠿ Container restproxy Started 2.2s
⠿ Container connect Running 0.0s
⠿ Container ksqldb-server Started 2.7s
⠿ Container ksqldb-cli Started 3.9s
...

Start streaming from the Wikipedia SSE source connector:
unable to resolve docker endpoint: parse "tcp://Content-Type: application/json": invalid port ": application" after host

Use listener scoped sasl mechanism rather than a fallback KafkaServer section

Description
This issue refers the following discussion
https://confluent.slack.com/archives/C60B5SJ1E/p1622746349449000

TL;DR
The current setup works because it provides a fallback KafkaServer section in the broker_jaas.conf file where it should only use the listener-scope properties.

When providing a listener-scoped JAAS config (like listener.name.internal.plain.sasl.jaas.config, for instance), Kafka will first search for a listener.name.internal.sasl.enabled.mechanisms. If the listener-scoped mechanism is not found, then Kafka will try to load the information from the java.auth.file (here broker_jaas.conf), looking first for an internal.KafkaServer section and then falling back to KafkaServer.
Combining both a KafkaServer section and a listener-scoped JAAS config is kind of fishy... For instance, the credentials defined for the user_ principals are not used at all in cp-demo (overridden by the ones in the inlined JAAS config property).

Win_b64 Support

Hello

First, thanks for this really fun and nice example demonstrating the power of Kafka & Confluent with Elasticsearch!

However, I had some trouble installing it on a Windows environment.
Your ./scripts/start.sh fails with MINGW64.

Indeed, the script certs-create.sh fails because of the command that generates the CA key:

# Generate CA key
openssl req -new -x509 -keyout snakeoil-ca-1.key -out snakeoil-ca-1.crt -days 365 -subj '/CN=ca1.test.confluent.io/OU=TEST/O=CONFLUENT/L=PaloAlto/S=Ca/C=US' -passin pass:confluent -passout pass:confluent

The error is:
Subject does not start with '/'.

This is a common error when using the Git Bash in windows:
https://stackoverflow.com/questions/31506158/running-openssl-from-a-bash-script-on-windows-subject-does-not-start-with

A fix that works for me:

# Generate CA key
openssl req -new -x509 -keyout snakeoil-ca-1.key -out snakeoil-ca-1.crt -days 365 -subj '//CN=ca1.test.confluent.io\OU=TEST\O=CONFLUENT\L=PaloAlto\S=Ca\C=US' -passin pass:confluent -passout pass:confluent
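Another workaround sometimes used for this Git Bash path mangling — an assumption; verify that your Git for Windows build honors the variable — is to disable MSYS path conversion for the call instead of doubling the slashes:

```shell
# Hypothetical alternative: MSYS_NO_PATHCONV=1 tells Git Bash not to
# rewrite /-prefixed arguments as Windows paths (harmless elsewhere).
# Note: ST= is used for the state field; the script's /S= is non-standard.
MSYS_NO_PATHCONV=1 openssl req -new -x509 -newkey rsa:2048 \
  -keyout snakeoil-ca-1.key -out snakeoil-ca-1.crt -days 365 \
  -subj '/CN=ca1.test.confluent.io/OU=TEST/O=CONFLUENT/L=PaloAlto/ST=Ca/C=US' \
  -passout pass:confluent
```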

Ciao

Error when running ./scripts/start.sh. Related to SSL certificates verification behind VPN

Description
The error occurs on the first installation of a connector with the confluent-hub command during construction of the Docker connect image.

Troubleshooting
Validate every step in the troubleshooting section: https://docs.confluent.io/platform/current/tutorials/cp-demo/docs/index.html#troubleshooting
-> Done
Identify any existing issues that seem related: https://github.com/confluentinc/cp-demo/issues?q=is%3Aissue
Nothing found
Similar issue found in another repo - Unable to install kafka-connect-datagen:0.1.0

If applicable, please include the output of:

  • Error log `docker build --build-arg CP_VERSION=6.2.0 --build-arg REPOSITORY=confluentinc -t localbuild/connect:6.2.0-6.2.0 -f /mnt/c/Source/Git/cp-demo/scripts/helper/../../Dockerfile /mnt/c/Source/Git/cp-demo/scripts/helper/../../.
    [+] Building 11.0s (6/10)
    => [internal] load build definition from Dockerfile 0.0s
    => => transferring dockerfile: 38B 0.0s
    => [internal] load .dockerignore 0.0s
    => => transferring context: 34B 0.0s
    => [internal] load metadata for docker.io/confluentinc/cp-enterprise-replicator:6.2.0 8.8s
    => CACHED [1/6] FROM docker.io/confluentinc/cp-enterprise-replicator:6.2.0@sha256:b0d548c64f8eb33f269f3d268af015 0.0s
    => [internal] load build context 0.2s
    => => transferring context: 1.42kB 0.1s
    => ERROR [2/6] RUN confluent-hub install --no-prompt cjmatta/kafka-connect-sse:1.0 --verbose 2.1s

[2/6] RUN confluent-hub install --no-prompt cjmatta/kafka-connect-sse:1.0 --verbose:
#5 0.500 Running in a verbose mode
#5 0.500 Running in a "--no-prompt" mode
#5 0.567 Client's installation type is: PACKAGE
#5 1.719 Implicit acceptance of the license below:
#5 1.719 The Apache License, Version 2.0
#5 1.719 https://www.apache.org/licenses/LICENSE-2.0
#5 1.719 Implicit confirmation of the question: You are about to install 'kafka-connect-sse' from Christopher Matta, as published on Confluent Hub.
#5 1.719 Downloading component Kafka Connect SSE 1.0, provided by Christopher Matta from Confluent Hub and installing into /usr/share/confluent-hub-components
#5 1.720 Scanning path /usr/share/confluent-hub-components
#5 1.721 Installation state for the kafka-connect-sse in the /usr/share/confluent-hub-components is NOT_INSTALLED
#5 1.721 Creating temporary directory
#5 1.722 Created temporary directory: /tmp/confluent-hub-tmp3870261092370935366
#5 1.970 javax.net.ssl.SSLHandshakeException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
#5 1.970
#5 1.970 Error: Unknown error


executor failed running [/bin/sh -c confluent-hub install --no-prompt cjmatta/kafka-connect-sse:1.0 --verbose]: exit code: 7
ERROR: Docker image build failed.`

Environment

  • GitHub branch: 6.2.0-post
  • Operating System: Windows 10, WSL 2
  • Version of Docker desktop : 3.5.2
  • Version of Docker Engine : 20.10.7
  • Version of Docker Compose: 1.29.2

Probably caused by VPN software: Zscaler Client Connector. Each outgoing request is inspected and, if not whitelisted, the traffic is re-encrypted with the Zscaler certificate chain. The URL has to be added explicitly.
More output of common commands to help with troubleshooting

`curl -iv https://api.hub.confluent.io/api/plugins

  • Trying 52.8.154.80:443...
  • TCP_NODELAY set
  • Connected to api.hub.confluent.io (52.8.154.80) port 443 (#0)

  • ALPN, offering h2
  • ALPN, offering http/1.1
  • successfully set certificate verify locations:
  • CAfile: /etc/ssl/certs/ca-certificates.crt
    CApath: /etc/ssl/certs
  • TLSv1.3 (OUT), TLS handshake, Client hello (1):
  • TLSv1.3 (IN), TLS handshake, Server hello (2):
  • TLSv1.2 (IN), TLS handshake, Certificate (11):
  • TLSv1.2 (IN), TLS handshake, Server key exchange (12):
  • TLSv1.2 (IN), TLS handshake, Server finished (14):
  • TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
  • TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1):
  • TLSv1.2 (OUT), TLS handshake, Finished (20):
  • TLSv1.2 (IN), TLS handshake, Finished (20):
  • SSL connection using TLSv1.2 / ECDHE-RSA-AES128-GCM-SHA256
  • ALPN, server did not agree to a protocol
  • Server certificate:
  • subject: C=US; ST=CA; L=Mountain View; O=Confluent, Inc.; OU=Information Technology; CN=*.confluent.io
  • start date: Nov 19 00:00:00 2019 GMT
  • expire date: Jan 6 12:00:00 2022 GMT
  • subjectAltName: host "api.hub.confluent.io" matched cert's "api.hub.confluent.io"
  • issuer: C=US; O=DigiCert Inc; CN=DigiCert SHA2 Secure Server CA
  • SSL certificate verify ok.

GET /api/plugins HTTP/1.1
Host: api.hub.confluent.io
User-Agent: curl/7.68.0
Accept: */*

<
[{....}]`

`confluent-hub install cjmatta/kafka-connect-sse:1.0 --verbose --component-dir . --no-prompt
Running in a verbose mode
Running in a "--no-prompt" mode
Implicit acceptance of the license below:
The Apache License, Version 2.0
https://www.apache.org/licenses/LICENSE-2.0
Implicit confirmation of the question: You are about to install 'kafka-connect-sse' from Christopher Matta, as published on Confluent Hub.
Downloading component Kafka Connect SSE 1.0, provided by Christopher Matta from Confluent Hub and installing into .
Scanning path .
Installation state for the kafka-connect-sse in the . is IMMEDIATE_DIRECTORY
Implicit confirmation of the question: Do you want to uninstall existing version 1.0?
Uninstalling kafka-connect-sse from .
Deleting ./cjmatta-kafka-connect-sse
Creating temporary directory
Created temporary directory: /tmp/confluent-hub-tmp3005423504225809653
Copying data from input stream to file /tmp/confluent-hub-tmp3005423504225809653/cjmatta-kafka-connect-sse-1.0.zip
6375414 bytes saved to disk
Creating directory recursively: .
Unzipping file /tmp/confluent-hub-tmp3005423504225809653/cjmatta-kafka-connect-sse-1.0.zip to ./cjmatta-kafka-connect-sse
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/doc
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/etc
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/etc
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/etc
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/assets
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Creating directory recursively: ./cjmatta-kafka-connect-sse/lib
Deleting /tmp/confluent-hub-tmp3005423504225809653
Client's installation type is: ARCHIVE
Unable to detect Confluent Platform installation. Specify --component-dir and --worker-configs explicitly.

Error: Invalid options or arguments`

`openssl s_client -connect api.hub.confluent.io:443 -showcerts
CONNECTED(00000003)
depth=2 C = US, O = DigiCert Inc, OU = www.digicert.com, CN = DigiCert Global Root CA
verify return:1
depth=1 C = US, O = DigiCert Inc, CN = DigiCert SHA2 Secure Server CA
verify return:1
depth=0 C = US, ST = CA, L = Mountain View, O = "Confluent, Inc.", OU = Information Technology, CN = *.confluent.io
verify return:1

Certificate chain
0 s:C = US, ST = CA, L = Mountain View, O = "Confluent, Inc.", OU = Information Technology, CN = *.confluent.io
i:C = US, O = DigiCert Inc, CN = DigiCert SHA2 Secure Server CA
-----BEGIN CERTIFICATE-----
MIIG2TCCBcGgAwIBAgIQBSEG2pQlpLzV9DvuRT3amzANBgkqhkiG9w0BAQsFADBN
MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMScwJQYDVQQDEx5E
aWdpQ2VydCBTSEEyIFNlY3VyZSBTZXJ2ZXIgQ0EwHhcNMTkxMTE5MDAwMDAwWhcN
MjIwMTA2MTIwMDAwWjCBhjELMAkGA1UEBhMCVVMxCzAJBgNVBAgTAkNBMRYwFAYD
VQQHEw1Nb3VudGFpbiBWaWV3MRgwFgYDVQQKEw9Db25mbHVlbnQsIEluYy4xHzAd
BgNVBAsTFkluZm9ybWF0aW9uIFRlY2hub2xvZ3kxFzAVBgNVBAMMDiouY29uZmx1
ZW50LmlvMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAv/CCPPyJVrUe
uGFVkd5YOjha5mZXmBNr9XVeRLlZI6GSW3xenw3dEanrZTNLbWcznJI1i2NuevUz
tfIJ6yY9BsRGwYZqRyjPnjtpEjsWKFOgeA8BUZ9SyHz46oV45HJJoXheDE0S8GIt
7Rli93O8XRwe2oevMfuLuLEUIul5nAFD3DklNEVf/8oe26hjJDE+9M4cBbMOCSZc
/OIZOPWM9k7zSEUe0RGDvaY4VmZfI6rVR/j9NCZMbPHBD2QUomqsEux/eBKfn2oO
W/PRsX1ZA7ZiL6ecmM4d3TV9E0zRyQJ2Ng3ruAdFKNSISYfL9WU6oWWlvzahOrpQ
TNP0XS0qqQIDAQABo4IDeTCCA3UwHwYDVR0jBBgwFoAUD4BhHIIxYdUvKOeNRji0
LOHG2eIwHQYDVR0OBBYEFO6NGIuEXcnM1Bmg/WmJsw8wSjbPMD0GA1UdEQQ2MDSC
DiouY29uZmx1ZW50Lmlvggxjb25mbHVlbnQuaW+CFGFwaS5odWIuY29uZmx1ZW50
LmlvMA4GA1UdDwEB/wQEAwIFoDAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUH
AwIwawYDVR0fBGQwYjAvoC2gK4YpaHR0cDovL2NybDMuZGlnaWNlcnQuY29tL3Nz
Y2Etc2hhMi1nNi5jcmwwL6AtoCuGKWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9z
c2NhLXNoYTItZzYuY3JsMEwGA1UdIARFMEMwNwYJYIZIAYb9bAEBMCowKAYIKwYB
BQUHAgEWHGh0dHBzOi8vd3d3LmRpZ2ljZXJ0LmNvbS9DUFMwCAYGZ4EMAQICMHwG
CCsGAQUFBwEBBHAwbjAkBggrBgEFBQcwAYYYaHR0cDovL29jc3AuZGlnaWNlcnQu
Y29tMEYGCCsGAQUFBzAChjpodHRwOi8vY2FjZXJ0cy5kaWdpY2VydC5jb20vRGln
aUNlcnRTSEEyU2VjdXJlU2VydmVyQ0EuY3J0MAwGA1UdEwEB/wQCMAAwggF8Bgor
BgEEAdZ5AgQCBIIBbASCAWgBZgB1AKS5CZC0GFgUh7sTosxncAo8NZgE+RvfuON3
zQ7IDdwQAAABboVvpCAAAAQDAEYwRAIgCcwDssIv93l6tCL2RrMXseuwYIkA9AgY
+MYnVaRuoaICIFNTiZwRunQ4VDqBQu353jMtzir0oOtx5uv051NiueFyAHUAh3W/
51l8+IxDmV+9827/Vo1HVjb/SrVgwbTq/16ggw8AAAFuhW+kaQAABAMARjBEAiBo
jotH17OVshZfwNfBEuLgnJLKfPxou24Ve47vkF3R7gIgIWfgAu/m9seQAiOU9YZg
ej4ZqNG0KRcxTivX5pi+8IAAdgBWFAaaL9fC7NP14b1Esj7HRna5vJkRXMDvlJhV
1onQ3QAAAW6Fb6RpAAAEAwBHMEUCIA2bUHo7qSwMLX5jm23AvXk/gWZkkLnfdBaX
vR7HQ8c/AiEA/3Z2nNMiDMubFkR3+0hrbujQywf7iUESCqKchBJdJOIwDQYJKoZI
hvcNAQELBQADggEBAHr3ANPL9GLNawc6wz2mZwVJ5XpPcUTrSbU83ikY9ZcLUBFp
oaj+DyiZpiHCNt4u/sXB4Z8jM1ruYW3J/ADm15EY7HZhWQcdxA2ZL0sSdyHwqxbf
biAfSd5/MaIAMy/sH5Oqtz/XxYVnpR52ZSA0x6RgcTa6DfRK95j3+ae7C6R3Ro2a
bUppURcTp7vtrNIIhblDJfQDZKJlkM4LoUfRmdEdrs+yBgAOMPHEFxXwvQJn4xAR
3hYubfOSVU11bhZYGnL8g2sgPexzkregRakGsmAuaZ6VlvBTE9EVDkHD8EObnUO3
p/c0DhhTXJ9Xjs0U6VVYdi2H0eePYqTC1aDj5vU=
-----END CERTIFICATE-----
1 s:C = US, O = DigiCert Inc, CN = DigiCert SHA2 Secure Server CA
i:C = US, O = DigiCert Inc, OU = www.digicert.com, CN = DigiCert Global Root CA
-----BEGIN CERTIFICATE-----
MIIElDCCA3ygAwIBAgIQAf2j627KdciIQ4tyS8+8kTANBgkqhkiG9w0BAQsFADBh
MQswCQYDVQQGEwJVUzEVMBMGA1UEChMMRGlnaUNlcnQgSW5jMRkwFwYDVQQLExB3
d3cuZGlnaWNlcnQuY29tMSAwHgYDVQQDExdEaWdpQ2VydCBHbG9iYWwgUm9vdCBD
QTAeFw0xMzAzMDgxMjAwMDBaFw0yMzAzMDgxMjAwMDBaME0xCzAJBgNVBAYTAlVT
MRUwEwYDVQQKEwxEaWdpQ2VydCBJbmMxJzAlBgNVBAMTHkRpZ2lDZXJ0IFNIQTIg
U2VjdXJlIFNlcnZlciBDQTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEB
ANyuWJBNwcQwFZA1W248ghX1LFy949v/cUP6ZCWA1O4Yok3wZtAKc24RmDYXZK83
nf36QYSvx6+M/hpzTc8zl5CilodTgyu5pnVILR1WN3vaMTIa16yrBvSqXUu3R0bd
KpPDkC55gIDvEwRqFDu1m5K+wgdlTvza/P96rtxcflUxDOg5B6TXvi/TC2rSsd9f
/ld0Uzs1gN2ujkSYs58O09rg1/RrKatEp0tYhG2SS4HD2nOLEpdIkARFdRrdNzGX
kujNVA075ME/OV4uuPNcfhCOhkEAjUVmR7ChZc6gqikJTvOX6+guqw9ypzAO+sf0
/RR3w6RbKFfCs/mC/bdFWJsCAwEAAaOCAVowggFWMBIGA1UdEwEB/wQIMAYBAf8C
AQAwDgYDVR0PAQH/BAQDAgGGMDQGCCsGAQUFBwEBBCgwJjAkBggrBgEFBQcwAYYY
aHR0cDovL29jc3AuZGlnaWNlcnQuY29tMHsGA1UdHwR0MHIwN6A1oDOGMWh0dHA6
Ly9jcmwzLmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RDQS5jcmwwN6A1
oDOGMWh0dHA6Ly9jcmw0LmRpZ2ljZXJ0LmNvbS9EaWdpQ2VydEdsb2JhbFJvb3RD
QS5jcmwwPQYDVR0gBDYwNDAyBgRVHSAAMCowKAYIKwYBBQUHAgEWHGh0dHBzOi8v
d3d3LmRpZ2ljZXJ0LmNvbS9DUFMwHQYDVR0OBBYEFA+AYRyCMWHVLyjnjUY4tCzh
xtniMB8GA1UdIwQYMBaAFAPeUDVW0Uy7ZvCj4hsbw5eyPdFVMA0GCSqGSIb3DQEB
CwUAA4IBAQAjPt9L0jFCpbZ+QlwaRMxp0Wi0XUvgBCFsS+JtzLHgl4+mUwnNqipl
5TlPHoOlblyYoiQm5vuh7ZPHLgLGTUq/sELfeNqzqPlt/yGFUzZgTHbO7Djc1lGA
8MXW5dRNJ2Srm8c+cftIl7gzbckTB+6WohsYFfZcTEDts8Ls/3HB40f/1LkAtDdC
2iDJ6m6K7hQGrn2iWZiIqBtvLfTyyRRfJs8sjX7tN8Cp1Tm5gr8ZDOo0rwAhaPit
c+LJMto4JQtV05od8GiG7S5BNO98pVAdvzr508EIDObtHopYJeS4d60tbvVS3bR0
j6tJLp07kzQoH3jOlOrHvdPJbRzeXDLz
-----END CERTIFICATE-----

Server certificate
subject=C = US, ST = CA, L = Mountain View, O = "Confluent, Inc.", OU = Information Technology, CN = *.confluent.io

issuer=C = US, O = DigiCert Inc, CN = DigiCert SHA2 Secure Server CA


No client certificate CA names sent
Peer signing digest: SHA512
Peer signature type: RSA
Server Temp Key: ECDH, P-256, 256 bits

SSL handshake has read 3443 bytes and written 438 bytes
Verification: OK

New, TLSv1.2, Cipher is ECDHE-RSA-AES128-GCM-SHA256
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
SSL-Session:
Protocol : TLSv1.2
Cipher : ECDHE-RSA-AES128-GCM-SHA256
Session-ID: B54041465020CE1A7660B18AA43C0691ACD073C37A504A74F82DA967CBD24FDC
Session-ID-ctx:
Master-Key: 81F206C6D5F1524DEB271391C9F71CCAC35A100390D716F320A47FF74E9D867F89CE2EA9E0E55BC631C747D7D032FEF3
PSK identity: None
PSK identity hint: None
SRP username: None
Start Time: 1626108409
Timeout : 7200 (sec)
Verify return code: 0 (ok)
Extended master secret: no
---`

Any idea what could possibly be done to see which command, to which URL, is breaking things up?

ksqlDB Websocket connection fails unless accessing localhost

If you run everything on your local machine and access it from your browser using localhost as the hostname, it works fine, as expected.

However, I have a separate beefy server machine that I use to run virtual machines, and I ran the example on an Ubuntu 20.04 VM and connected to Control Center from my laptop using the VM's IP address. I noticed that it was nonsensically trying to make a Websocket connection to localhost to get the results of ksqlDB queries, and so I went digging.

Turns out, it's hard-coded to use localhost in the docker-compose.yml file. Line 658-659 (at the time of this writing) contains:

# Access ksqlDB server from a browser on your localhost
CONTROL_CENTER_KSQL_WIKIPEDIA_ADVERTISED_URL: http://localhost:8088

This really should be either dynamically resolved at runtime, or at the very least, this configuration variable should be called out in the quick start instructions.
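For example, pointing the advertised URL at an address the browser can actually reach might look like this (192.0.2.10 is a placeholder for the VM's IP):

```yaml
# docker-compose.yml, control-center service — placeholder address
CONTROL_CENTER_KSQL_WIKIPEDIA_ADVERTISED_URL: "http://192.0.2.10:8088"
```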

non-descriptive start.sh error message if docker daemon isn't running

Description
If the docker daemon is not running, start.sh will fail with
panic: reflect: indirection through nil pointer to embedded struct [recovered] panic: reflect: indirection through nil pointer to embedded struct

Troubleshooting
Check whether the Docker daemon is running; if it is, stop it.
Execute scripts/start.sh
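A more descriptive failure could come from a small guard run before anything else. A sketch of a hypothetical helper (not the repo's actual code; start.sh would pass `docker info` as the probe command):

```shell
# check_docker_daemon PROBE_COMMAND [args...]
# Runs the probe silently and prints a clear error if it fails.
check_docker_daemon() {
  if ! "$@" >/dev/null 2>&1; then
    echo "ERROR: Docker daemon is not running or not reachable." >&2
    return 1
  fi
}

# Usage in start.sh would be:
#   check_docker_daemon docker info || exit 1
```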

Environment

  • GitHub branch: 6.2.0-post
  • Operating System: MacOS Catalina
  • Version of Docker: Docker Engine Community 20.10.5
  • Version of Docker Compose: 1.29.0, build 07737305

Docker commands in helper functions.sh need "-T" to properly pipe to jq

Description

The check_connector_status_running function isn't properly parsing whether the connector is running, causing the start script to time out. Same problem with host_check_schema_registered. The solution is to add the -T option to the docker-compose exec commands to disable pseudo-TTY allocation so the output is properly sent as stdout to the jq command. See this stackoverflow comment.

Here is an example where start.sh times out while retrying:

Start streaming from the Wikipedia SSE source connector:
...
{"name":"wikipedia-sse","connector":{"state":"RUNNING","worker_id":"connect:8083"},"tasks":[{"id":0,"state":"RUNNING","worker_id":"connect:8083"}],"type":"source"}

Troubleshooting
After adding the -T flag to the various docker-compose exec commands in functions.sh fixed the issue and CP Demo was able to start normally. PR incoming.

Environment

  • GitHub branch: [e.g. 7.0.1-post]
  • Operating System: Ubuntu 20.04
  • Version of Docker: 20.10.12
  • Version of Docker Compose: 2.2.3

cp_demo

Hello,

While installing cp-demo (on a VM with 24 GB RAM, CentOS 7) following these instructions
[https://docs.confluent.io/current/tutorials/cp-demo/docs/index.html],
I've run into 2 issues:

  1. At #2: In the advanced Docker preferences settings, increase the memory available to Docker to at least 8 GB (default is 2 GB).
    I was not able to find this option. It would be helpful to have this command documented here.

  2. At #3: Verify the status of the Docker containers.
    I receive Exit 78 from elasticsearch and Exit 1 from kafka_client after a short time (see attached txt).
    Don't know what's going wrong. I found no log files under /var/log.
    Also attached the output from "docker-compose logs -f control-center".

Any help would be appreciated.

Klaus
cp_demo.txt
cc.txt

MDS fails to start due to wrong Java version

Description

Start script failure:

Waiting up to 120 seconds for MDS to start
........................ERROR: Failed after 120 seconds. Please troubleshoot and run again. 

Docker logs:

[2021-04-14 20:35:00,087] WARN Failed to initialize a channel. Closing: [id: 0x67e8d0a5] (io.netty.channel.ChannelInitializer)
kafka2      | org.apache.zookeeper.common.X509Exception$SSLContextException: Failed to create KeyManager
kafka2      | 	at org.apache.zookeeper.common.X509Util.createSSLContextAndOptions(X509Util.java:336)
kafka2      | 	at org.apache.zookeeper.common.X509Util.createSSLContext(X509Util.java:268)
kafka2      | 	at org.apache.zookeeper.ClientCnxnSocketNetty$ZKClientPipelineFactory.initSSL(ClientCnxnSocketNetty.java:454)
kafka2      | 	at org.apache.zookeeper.ClientCnxnSocketNetty$ZKClientPipelineFactory.initChannel(ClientCnxnSocketNetty.java:444)
kafka2      | 	at org.apache.zookeeper.ClientCnxnSocketNetty$ZKClientPipelineFactory.initChannel(ClientCnxnSocketNetty.java:429)
kafka2      | 	at io.netty.channel.ChannelInitializer.initChannel(ChannelInitializer.java:129)
kafka2      | 	at io.netty.channel.ChannelInitializer.handlerAdded(ChannelInitializer.java:112)
kafka2      | 	at io.netty.channel.AbstractChannelHandlerContext.callHandlerAdded(AbstractChannelHandlerContext.java:938)
kafka2      | 	at io.netty.channel.DefaultChannelPipeline.callHandlerAdded0(DefaultChannelPipeline.java:609)
kafka2      | 	at io.netty.channel.DefaultChannelPipeline.access$100(DefaultChannelPipeline.java:46)
kafka2      | 	at io.netty.channel.DefaultChannelPipeline$PendingHandlerAddedTask.execute(DefaultChannelPipeline.java:1463)
kafka2      | 	at io.netty.channel.DefaultChannelPipeline.callHandlerAddedForAllHandlers(DefaultChannelPipeline.java:1115)
kafka2      | 	at io.netty.channel.DefaultChannelPipeline.invokeHandlerAddedIfNeeded(DefaultChannelPipeline.java:650)
kafka2      | 	at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:502)
kafka2      | 	at io.netty.channel.AbstractChannel$AbstractUnsafe.access$200(AbstractChannel.java:417)
kafka2      | 	at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:474)
kafka2      | 	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
kafka2      | 	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
kafka2      | 	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
kafka2      | 	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
kafka2      | 	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
kafka2      | 	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
kafka2      | 	at java.base/java.lang.Thread.run(Thread.java:834)

Environment

Based on user troubleshooting, it fails due to the Java version:

/Library/Java/JavaVirtualMachines/jdk-16.jdk/Contents/Home/bin/java --version
java 16 2021-03-16
Java(TM) SE Runtime Environment (build 16+36-2231)
Java HotSpot(TM) 64-Bit Server VM (build 16+36-2231, mixed mode, sharing)

The demo should not have a Java dependency (except for building the Kafka Streams app), but maybe the certificate generation is tied to the local keytool?
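If the cert-generation scripts do shell out to the local keytool, a version guard in the start script would make this failure explicit. A sketch of parsing the JDK major version (the link between a newer local JDK and the ZooKeeper KeyManager failure is an inference from this issue, not a confirmed root cause):

```shell
# Parse the major version from a `java -version` banner line; handles both
# the old "1.8.0_292" scheme and the new "11.0.2" / "16" scheme:
parse_java_major() {
  echo "$1" | awk -F'"' '{ split($2, v, "."); print ((v[1] == "1") ? v[2] : v[1]) }'
}

parse_java_major 'java version "16" 2021-03-16'   # prints 16
parse_java_major 'java version "1.8.0_292"'       # prints 8

# A start-script guard could then warn before generating certificates:
#   major=$(parse_java_major "$(java -version 2>&1 | head -n 1)")
#   [ "$major" -gt 11 ] && echo "WARNING: JDK $major untested; keystores may be unreadable"
```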

windows error while running "./scripts/start.sh"

Running on Windows 10 with Git Bash.

MINGW64 ~/secureConfluentKAFKA/cp-demo (5.5.1-post)
$ ./scripts/start.sh
The REPOSITORY variable is not set. Defaulting to a blank string.
The CONFLUENT_DOCKER_TAG variable is not set. Defaulting to a blank string.
The CONNECTOR_VERSION variable is not set. Defaulting to a blank string.
Removing network cp-demo_default
Network cp-demo_default not found.
Generate keys and certificates used for SSL
Generating a RSA private key
.................+++++
...............................+++++
writing new private key to 'snakeoil-ca-1.key'
-----
name is expected to be in the format /type0=value0/type1=value1/type2=... where characters may be escaped by \. This name is not in that format: 'C:/Program Files/Git/CN=ca1.test.confluentdemo.io/OU=TEST/O=CONFLUENT/L=PaloAlto/ST=Ca/C=US'
problems making Certificate Request
-------------------------------
kafka1
-------------------------------
Can't open /proc/17632/fd/63 for reading, No such file or directory
8876:error:02001003:system library:fopen:No such process:../openssl-1.1.1a/crypto/bio/bss_file.c:72:fopen('/proc/17632/fd/63','r')
8876:error:2006D080:BIO routines:BIO_new_file:no such file:../openssl-1.1.1a/crypto/bio/bss_file.c:79:
keytool error: java.io.FileNotFoundException: .\snakeoil-ca-1.crt (The system cannot find the file specified)
keytool error: java.io.FileNotFoundException: kafka1-ca1-signed.crt (The system cannot find the file specified)
keytool error: java.io.FileNotFoundException: .\snakeoil-ca-1.crt (The system cannot find the file specified)
Certificate stored in file
Importing keystore kafka.kafka1.keystore.jks to kafka1.keystore.p12...
Entry for alias kafka1 successfully imported.
Import command completed: 1 entries successfully imported, 0 entries failed or cancelled
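The mangled subject ('C:/Program Files/Git/CN=...') is Git Bash's MSYS path conversion rewriting the leading '/' of the openssl -subj argument into a Windows path, which then cascades into the missing-certificate errors. Two commonly suggested workarounds, sketched below (neither is an official cp-demo fix):

```shell
# Symptom check: a distinguished name that got rewritten into a Windows path
# starts with a drive letter instead of '/CN=':
subj='C:/Program Files/Git/CN=ca1.test.confluentdemo.io/OU=TEST/O=CONFLUENT/L=PaloAlto/ST=Ca/C=US'
case "$subj" in
  [A-Za-z]:/*) echo "MSYS path conversion mangled the -subj argument" ;;
  /CN=*)       echo "-subj argument looks intact" ;;
esac

# Workaround 1: disable MSYS path conversion for the whole run:
#   MSYS_NO_PATHCONV=1 ./scripts/start.sh
# Workaround 2: double the leading slash so MSYS leaves the argument alone:
#   openssl req ... -subj "//CN=ca1.test.confluentdemo.io/OU=TEST/..."
```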

cp-demo: Multiple warnings : WARN[0000] On multiple settings

Hello,
I have used cp-demo successfully the last couple of times, but on 6.2 it gives me the list of warnings below, due to which some of the services are not working.
The first time I saw these warnings:

  1. Control Center was not showing the replicator.
  2. I shut down cp-demo and removed the Docker containers cleanly.
  3. I re-installed cp-demo.

The second time, after re-installing and restarting cp-demo, I saw the same warnings, and this time Control Center does not even launch at localhost:9021.
Am I missing anything?

Liz's MacBook$ docker-compose ps
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "CONNECTOR_VERSION" variable is not set. Defaulting to a blank string.
WARN[0000] The "CONNECTOR_VERSION" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "REPOSITORY" variable is not set. Defaulting to a blank string.
WARN[0000] The "REPOSITORY" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "REPOSITORY" variable is not set. Defaulting to a blank string.
WARN[0000] The "REPOSITORY" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "CONTROL_CENTER_KSQL_WIKIPEDIA_ADVERTISED_URL" variable is not set. Defaulting to a blank string.
WARN[0000] The "CONTROL_CENTER_KSQL_WIKIPEDIA_URL" variable is not set. Defaulting to a blank string.
WARN[0000] The "REPOSITORY" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "REPOSITORY" variable is not set. Defaulting to a blank string.
WARN[0000] The "REPOSITORY" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "SSL_CIPHER_SUITES" variable is not set. Defaulting to a blank string.
WARN[0000] The "REPOSITORY" variable is not set. Defaulting to a blank string.
NAME COMMAND SERVICE STATUS PORTS
connect "/etc/confluent/dock…" connect running 0.0.0.0:8083->8083/tcp, :::8083->8083/tcp, 9092/tcp
control-center "/etc/confluent/dock…" control-center running 0.0.0.0:9021->9021/tcp, :::9021->9021/tcp, 0.0.0.0:9022->9022/tcp, :::9022->9022/tcp
kafka1 "bash -c 'if [ ! -f …" kafka1 running 0.0.0.0:8091->8091/tcp, :::8091->8091/tcp, 0.0.0.0:9091->9091/tcp, :::9091->9091/tcp, 9092/tcp, 0.0.0.0:10091->10091/tcp, :::10091->10091/tcp, 0.0.0.0:11091->11091/tcp, :::11091->11091/tcp, :::12091->12091/tcp, 0.0.0.0:12091->12091/tcp
kafka2 "bash -c 'if [ ! -f …" kafka2 running 0.0.0.0:8092->8092/tcp, :::8092->8092/tcp, 0.0.0.0:9092->9092/tcp, :::9092->9092/tcp, 0.0.0.0:10092->10092/tcp, :::10092->10092/tcp, 0.0.0.0:11092->11092/tcp, :::11092->11092/tcp, 0.0.0.0:12092->12092/tcp, :::12092->12092/tcp
openldap "/container/tool/run…" openldap running 0.0.0.0:389->389/tcp, :::389->389/tcp, 636/tcp
schemaregistry "/etc/confluent/dock…" schemaregistry running 8081/tcp, 0.0.0.0:8085->8085/tcp, :::8085->8085/tcp
tools "/bin/bash" tools running
zookeeper "/etc/confluent/dock…" zookeeper running 0.0.0.0:2181->2181/tcp, :::2181->2181/tcp, 0.0.0.0:2182->2182/tcp, :::2182->2182/tcp, 2888/tcp, 3888/tcp
Liz's MacBook$ date
Sat Sep 25 16:38:08 PDT 2021
Liz's MacBook$

Thanks,
Liz.
