
kafka-connect-tdengine's People

Contributors

dingbo8128 avatar ecnuhail avatar gccgdb1234 avatar huolibo avatar sangshuduo avatar

Stargazers

(15 stargazers)

Watchers

(24 watchers)

Forkers

ecnuhail sq-q

kafka-connect-tdengine's Issues

Calling a method that runs multiple queries fails with "Unable to establish connection" on the second call

When a single method executes multiple queries:
[screenshot: 微信图片_20240104110004]
[screenshot: 微信图片_20240104110146]
XML configuration file:

<!-- TDengine configuration -->
<bean id="druidDataSource" class="com.alibaba.druid.pool.DruidDataSource">
    <property name="driverClassName" value="com.taosdata.jdbc.TSDBDriver"></property>
    <property name="url" value="jdbc:TAOS://taos.gnss.Singapore:****/gnssdbxjp"></property>
    <property name="username" value="****"></property>
    <property name="password" value="***********"></property>
    <property name="initialSize" value="3"></property>
    <property name="minIdle" value="3"></property>
    <property name="maxActive" value="20"></property>
    <property name="maxWait" value="60000"></property>
    <property name="validationQuery" value="select server_status()"></property>
</bean>

Error log:
[screenshot: QQ截图20240104110556]

When a method executes multiple queries, the second call to that method fails with "Unable to establish connection". The version is 3.2.0.
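One common cause of this symptom (not confirmable from the screenshots alone) is reusing a single pooled connection across several queries and closing it partway through the method, so the second invocation is handed an already-closed connection. The safer pattern is to borrow a fresh connection from the pool for each query and release it with try-with-resources. A minimal sketch, using a stub in place of `DruidDataSource` since the real pool and a TDengine server are not available here:

```java
// Sketch: each query borrows its own connection and releases it when done.
// StubConnection stands in for a pooled java.sql.Connection; with a real
// DruidDataSource you would call dataSource.getConnection() instead.
public class MultiQuerySketch {
    static class StubConnection implements AutoCloseable {
        boolean closed = false;
        String query(String sql) { return "rows for: " + sql; }
        @Override public void close() { closed = true; } // returns the connection to the pool
    }

    // Hypothetical pool handle; a real pool hands out a live connection on every call.
    static StubConnection getConnection() { return new StubConnection(); }

    // The method that runs multiple queries: one borrowed connection per query,
    // so a second invocation never touches a connection the first call closed.
    static String runTwoQueries() {
        StringBuilder out = new StringBuilder();
        try (StubConnection c1 = getConnection()) {
            out.append(c1.query("select * from t1")).append('\n');
        }
        try (StubConnection c2 = getConnection()) {
            out.append(c2.query("select * from t2"));
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // Calling the method twice is safe because no connection outlives its query.
        System.out.println(runTwoQueries());
        System.out.println(runTwoQueries());
    }
}
```

If the error persists even with per-query connections, the pool's `validationQuery` is worth checking, since `select server_status()` was a 2.x-era construct.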

connect exception

Following the technical documentation on the official website (https://www.taosdata.com/chinese/12592.html), I attempted a sink test. sink-test.properties contains:
name=TDengineSinkConnector
connector.class=com.taosdata.kafka.connect.sink.TDengineSinkConnector
tasks.max=1
topics=meters
connection.url=jdbc:TAOS://127.0.0.1:6030
connection.user=root
connection.password=taosdata
connection.database=power
db.schemaless=line
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
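For context, that document runs the connector through the Kafka Connect standalone worker. Assuming the Kafka installation layout visible in the log below (paths are illustrative), the invocation looks like:

```shell
# Launch a standalone Connect worker with the TDengine sink config.
# Adjust the paths to your own Kafka installation and properties files.
cd /home/xj/kafka_2.13-3.3.1
bin/connect-standalone.sh config/connect-standalone.properties sink-test.properties
```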

With tasks.max=1, writing data to the database fails; after raising it to 2 or more, the data is written to TDengine successfully. With tasks.max=1, the error output is as follows:
[2022-12-12 21:19:29,728] INFO Kafka Connect standalone worker initializing ... (org.apache.kafka.connect.cli.ConnectStandalone:68)
[2022-12-12 21:19:29,736] INFO WorkerInfo values:
jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -XX:MaxInlineLevel=15, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/home/xj/kafka_2.13-3.3.1/bin/../logs, -Dlog4j.configuration=file:./../config/connect-log4j.properties
jvm.spec = Oracle Corporation, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_191, 25.191-b12
jvm.classpath = /home/xj/kafka_2.13-3.3.1/bin/../libs/activation-1.1.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/aopalliance-repackaged-2.6.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/argparse4j-0.7.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/audience-annotations-0.5.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/commons-cli-1.4.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/commons-lang3-3.12.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/commons-lang3-3.8.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/connect-api-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/connect-basic-auth-extension-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/connect-json-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/connect-mirror-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/connect-mirror-client-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/connect-runtime-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/connect-transforms-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/hk2-api-2.6.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/hk2-locator-2.6.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/hk2-utils-2.6.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-annotations-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-core-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-databind-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-dataformat-csv-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-datatype-jdk8-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-jaxrs-base-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-jaxrs-json-provider-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-module-jaxb-annotations-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jackson-module-scala_2.13-2.13.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jakarta.activation-api-1.2.2.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jakarta.annotation-api-1.3.5.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jakarta.inject-2.6.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jakarta.validation-api-
2.0.2.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jakarta.ws.rs-api-2.1.6.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jakarta.xml.bind-api-2.3.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/javassist-3.27.0-GA.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/javax.servlet-api-3.1.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/javax.ws.rs-api-2.1.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jaxb-api-2.3.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jersey-client-2.34.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jersey-common-2.34.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jersey-container-servlet-2.34.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jersey-container-servlet-core-2.34.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jersey-hk2-2.34.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jersey-server-2.34.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-client-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-continuation-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-http-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-io-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-security-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-server-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-servlet-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-servlets-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-util-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jetty-util-ajax-9.4.48.v20220622.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jline-3.21.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jopt-simple-5.0.4.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/jose4j-0.7.9.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka_2.13-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-clients-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-log4j-appender-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-metadata-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-raft-3.3.1.jar:/home/xj
/kafka_2.13-3.3.1/bin/../libs/kafka-server-common-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-shell-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-storage-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-storage-api-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-streams-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-streams-examples-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-streams-scala_2.13-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-streams-test-utils-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/kafka-tools-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/lz4-java-1.8.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/maven-artifact-3.8.4.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/metrics-core-2.2.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/metrics-core-4.1.12.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-buffer-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-codec-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-common-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-handler-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-resolver-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-transport-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-transport-classes-epoll-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-transport-native-epoll-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/netty-transport-native-unix-common-4.1.78.Final.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/osgi-resource-locator-1.0.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/paranamer-2.8.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/plexus-utils-3.3.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/reflections-0.9.12.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/reload4j-1.2.19.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/rocksdbjni-6.29.4.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/scala-collection-compat_2.13-2.6.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/scala-java8-c
ompat_2.13-1.0.2.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/scala-library-2.13.8.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/scala-logging_2.13-3.9.4.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/scala-reflect-2.13.8.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/slf4j-api-1.7.36.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/slf4j-reload4j-1.7.36.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/snappy-java-1.1.8.4.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/swagger-annotations-2.2.0.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/trogdor-3.3.1.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/zookeeper-3.6.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/zookeeper-jute-3.6.3.jar:/home/xj/kafka_2.13-3.3.1/bin/../libs/zstd-jni-1.5.2-1.jar
os.spec = Linux, amd64, 4.9.0-8-linx-security-amd64
os.vcpus = 4
(org.apache.kafka.connect.runtime.WorkerInfo:71)
[2022-12-12 21:19:29,742] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectStandalone:77)
[2022-12-12 21:19:29,775] INFO Loading plugin from: /home/xj/connectors/taosdata-kafka-connect-tdengine-1.0.2 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:277)
[2022-12-12 21:19:30,302] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/xj/connectors/taosdata-kafka-connect-tdengine-1.0.2/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:299)
[2022-12-12 21:19:30,303] INFO Added plugin 'com.taosdata.kafka.connect.sink.TDengineSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:30,303] INFO Added plugin 'com.taosdata.kafka.connect.source.TDengineSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:30,303] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:30,303] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:30,303] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,159] INFO Registered loader: sun.misc.Launcher$AppClassLoader@764c12b6 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:299)
[2022-12-12 21:19:32,160] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,160] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,160] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,160] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,160] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,161] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,161] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,161] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,161] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,161] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,161] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,161] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,162] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,162] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,162] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,162] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,162] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,162] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,162] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.Filter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.HeaderFrom$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,163] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,164] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,164] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,164] INFO Added plugin 'org.apache.kafka.connect.transforms.DropHeaders' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,164] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,164] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,164] INFO Added plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,164] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,164] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,165] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertHeader' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,165] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,165] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,165] INFO Added plugin 'org.apache.kafka.connect.transforms.HeaderFrom$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,165] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,165] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,165] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,166] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,166] INFO Added plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,166] INFO Added plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,166] INFO Added plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,166] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,166] INFO Added plugin 'org.apache.kafka.common.config.provider.DirectoryConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,167] INFO Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:230)
[2022-12-12 21:19:32,167] INFO Added aliases 'TDengineSinkConnector' and 'TDengineSink' to plugin 'com.taosdata.kafka.connect.sink.TDengineSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,168] INFO Added aliases 'TDengineSourceConnector' and 'TDengineSource' to plugin 'com.taosdata.kafka.connect.source.TDengineSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,168] INFO Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,169] INFO Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,169] INFO Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,170] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,170] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,170] INFO Added aliases 'SchemaSourceConnector' and 'SchemaSource' to plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,170] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,172] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,172] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,172] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,173] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,173] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,173] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,173] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,174] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,174] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,174] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,174] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,175] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,175] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,175] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,175] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,176] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,176] INFO Added aliases 'SimpleHeaderConverter' and 'Simple' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,176] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,177] INFO Added aliases 'PredicatedTransformation' and 'Predicated' to plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,177] INFO Added alias 'DropHeaders' to plugin 'org.apache.kafka.connect.transforms.DropHeaders' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,178] INFO Added alias 'Filter' to plugin 'org.apache.kafka.connect.transforms.Filter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,178] INFO Added alias 'InsertHeader' to plugin 'org.apache.kafka.connect.transforms.InsertHeader' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,179] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,179] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,180] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,180] INFO Added alias 'HasHeaderKey' to plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,180] INFO Added alias 'RecordIsTombstone' to plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,180] INFO Added alias 'TopicNameMatches' to plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,181] INFO Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:473)
[2022-12-12 21:19:32,181] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,181] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,181] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:476)
[2022-12-12 21:19:32,208] INFO StandaloneConfig values:
access.control.allow.methods =
access.control.allow.origin =
admin.listeners = null
bootstrap.servers = [localhost:9092]
client.dns.lookup = use_all_dns_ips
config.providers = []
connector.client.config.override.policy = All
header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
key.converter = class org.apache.kafka.connect.json.JsonConverter
listeners = [http://:8083]
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
offset.flush.interval.ms = 10000
offset.flush.timeout.ms = 5000
offset.storage.file.filename = /tmp/connect.offsets
plugin.path = [/home/xj/connectors]
response.http.headers.config =
rest.advertised.host.name = null
rest.advertised.listener = null
rest.advertised.port = null
rest.extension.classes = []
ssl.cipher.suites = null
ssl.client.auth = none
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
task.shutdown.graceful.timeout.ms = 5000
topic.creation.enable = true
topic.tracking.allow.reset = true
topic.tracking.enable = true
value.converter = class org.apache.kafka.connect.json.JsonConverter
(org.apache.kafka.connect.runtime.standalone.StandaloneConfig:376)
[2022-12-12 21:19:32,210] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:56)
[2022-12-12 21:19:32,213] INFO AdminClientConfig values:
bootstrap.servers = [localhost:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2022-12-12 21:19:32,284] WARN These configurations '[offset.flush.interval.ms, key.converter.schemas.enable, offset.storage.file.filename, value.converter.schemas.enable, plugin.path, value.converter, key.converter]' were supplied but are not used yet. (org.apache.kafka.clients.admin.AdminClientConfig:385)
[2022-12-12 21:19:32,285] INFO Kafka version: 3.3.1 (org.apache.kafka.common.utils.AppInfoParser:119)
[2022-12-12 21:19:32,285] INFO Kafka commitId: e23c59d00e687ff5 (org.apache.kafka.common.utils.AppInfoParser:120)
[2022-12-12 21:19:32,285] INFO Kafka startTimeMs: 1670851172284 (org.apache.kafka.common.utils.AppInfoParser:121)
[2022-12-12 21:19:32,699] INFO Kafka cluster ID: _f6TFm5iSCueIwtCxMsCcQ (org.apache.kafka.connect.util.ConnectUtils:72)
[2022-12-12 21:19:32,702] INFO App info kafka.admin.client for adminclient-1 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2022-12-12 21:19:32,709] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:693)
[2022-12-12 21:19:32,709] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:697)
[2022-12-12 21:19:32,709] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:703)
[2022-12-12 21:19:32,721] INFO Logging initialized @3391ms to org.eclipse.jetty.util.log.Slf4jLog (org.eclipse.jetty.util.log:170)
[2022-12-12 21:19:32,778] INFO Added connector for http://:8083 (org.apache.kafka.connect.runtime.rest.RestServer:120)
[2022-12-12 21:19:32,778] INFO Initializing REST server (org.apache.kafka.connect.runtime.rest.RestServer:191)
[2022-12-12 21:19:32,789] INFO jetty-9.4.48.v20220622; built: 2022-06-21T20:42:25.880Z; git: 6b67c5719d1f4371b33655ff2d047d24e171e49a; jvm 1.8.0_191-b12 (org.eclipse.jetty.server.Server:375)
[2022-12-12 21:19:32,824] INFO Started http_8083@293bb8a5{HTTP/1.1, (http/1.1)}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:333)
[2022-12-12 21:19:32,825] INFO Started @3495ms (org.eclipse.jetty.server.Server:415)
[2022-12-12 21:19:32,855] INFO Advertised URI: http://172.16.2.5:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:364)
[2022-12-12 21:19:32,855] INFO REST server listening at http://172.16.2.5:8083/, advertising URL http://172.16.2.5:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:206)
[2022-12-12 21:19:32,856] INFO Advertised URI: http://172.16.2.5:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:364)
[2022-12-12 21:19:32,856] INFO REST admin endpoints at http://172.16.2.5:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:207)
[2022-12-12 21:19:32,856] INFO Advertised URI: http://172.16.2.5:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:364)
[2022-12-12 21:19:32,856] INFO Setting up All Policy for ConnectorClientConfigOverride. This will allow all client configurations to be overridden (org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy:44)
[2022-12-12 21:19:32,868] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:56)
[2022-12-12 21:19:32,869] INFO AdminClientConfig values:
bootstrap.servers = [localhost:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2022-12-12 21:19:32,874] WARN These configurations '[offset.flush.interval.ms, key.converter.schemas.enable, offset.storage.file.filename, value.converter.schemas.enable, plugin.path, value.converter, key.converter]' were supplied but are not used yet. (org.apache.kafka.clients.admin.AdminClientConfig:385)
[2022-12-12 21:19:32,874] INFO Kafka version: 3.3.1 (org.apache.kafka.common.utils.AppInfoParser:119)
[2022-12-12 21:19:32,874] INFO Kafka commitId: e23c59d00e687ff5 (org.apache.kafka.common.utils.AppInfoParser:120)
[2022-12-12 21:19:32,875] INFO Kafka startTimeMs: 1670851172874 (org.apache.kafka.common.utils.AppInfoParser:121)
[2022-12-12 21:19:32,897] INFO Kafka cluster ID: _f6TFm5iSCueIwtCxMsCcQ (org.apache.kafka.connect.util.ConnectUtils:72)
[2022-12-12 21:19:32,898] INFO App info kafka.admin.client for adminclient-2 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2022-12-12 21:19:32,903] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:693)
[2022-12-12 21:19:32,904] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:697)
[2022-12-12 21:19:32,904] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:703)
[2022-12-12 21:19:32,908] INFO Kafka version: 3.3.1 (org.apache.kafka.common.utils.AppInfoParser:119)
[2022-12-12 21:19:32,908] INFO Kafka commitId: e23c59d00e687ff5 (org.apache.kafka.common.utils.AppInfoParser:120)
[2022-12-12 21:19:32,908] INFO Kafka startTimeMs: 1670851172908 (org.apache.kafka.common.utils.AppInfoParser:121)
[2022-12-12 21:19:33,033] INFO JsonConverterConfig values:
converter.type = key
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = false
(org.apache.kafka.connect.json.JsonConverterConfig:376)
[2022-12-12 21:19:33,034] INFO JsonConverterConfig values:
converter.type = value
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = false
(org.apache.kafka.connect.json.JsonConverterConfig:376)
[2022-12-12 21:19:33,042] INFO Kafka Connect standalone worker initialization took 3311ms (org.apache.kafka.connect.cli.ConnectStandalone:99)
[2022-12-12 21:19:33,042] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:50)
[2022-12-12 21:19:33,043] INFO Herder starting (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:98)
[2022-12-12 21:19:33,043] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:203)
[2022-12-12 21:19:33,044] INFO Starting FileOffsetBackingStore with file /tmp/connect.offsets (org.apache.kafka.connect.storage.FileOffsetBackingStore:58)
[2022-12-12 21:19:33,047] INFO Worker started (org.apache.kafka.connect.runtime.Worker:213)
[2022-12-12 21:19:33,047] INFO Herder started (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:101)
[2022-12-12 21:19:33,048] INFO Initializing REST resources (org.apache.kafka.connect.runtime.rest.RestServer:211)
[2022-12-12 21:19:33,088] INFO Adding admin resources to main listener (org.apache.kafka.connect.runtime.rest.RestServer:230)
[2022-12-12 21:19:33,162] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:334)
[2022-12-12 21:19:33,162] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:339)
[2022-12-12 21:19:33,164] INFO node0 Scavenging every 600000ms (org.eclipse.jetty.server.session:132)
[2022-12-12 21:19:33,737] INFO Started o.e.j.s.ServletContextHandler@2228db21{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:921)
[2022-12-12 21:19:33,737] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:312)
[2022-12-12 21:19:33,738] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:56)
[2022-12-12 21:19:33,757] INFO AbstractConfig values:
(org.apache.kafka.common.config.AbstractConfig:376)
[2022-12-12 21:19:33,769] INFO [TDengineSinkConnector|worker] Creating connector TDengineSinkConnector of type com.taosdata.kafka.connect.sink.TDengineSinkConnector (org.apache.kafka.connect.runtime.Worker:300)
[2022-12-12 21:19:33,770] INFO [TDengineSinkConnector|worker] SinkConnectorConfig values:
config.action.reload = restart
connector.class = com.taosdata.kafka.connect.sink.TDengineSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = TDengineSinkConnector
predicates = []
tasks.max = 1
topics = [meters]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2022-12-12 21:19:33,773] INFO [TDengineSinkConnector|worker] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = com.taosdata.kafka.connect.sink.TDengineSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = TDengineSinkConnector
predicates = []
tasks.max = 1
topics = [meters]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2022-12-12 21:19:33,779] INFO [TDengineSinkConnector|worker] Instantiated connector TDengineSinkConnector with version 1.0.2 of type class com.taosdata.kafka.connect.sink.TDengineSinkConnector (org.apache.kafka.connect.runtime.Worker:322)
[2022-12-12 21:19:33,780] INFO [TDengineSinkConnector|worker] Finished creating connector TDengineSinkConnector (org.apache.kafka.connect.runtime.Worker:347)
[2022-12-12 21:19:33,781] INFO [TDengineSinkConnector|worker] Starting Sink Connector (com.taosdata.kafka.connect.sink.TDengineSinkConnector:26)
[2022-12-12 21:19:33,783] INFO SinkConnectorConfig values:
config.action.reload = restart
connector.class = com.taosdata.kafka.connect.sink.TDengineSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = TDengineSinkConnector
predicates = []
tasks.max = 1
topics = [meters]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2022-12-12 21:19:33,786] INFO EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = com.taosdata.kafka.connect.sink.TDengineSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = TDengineSinkConnector
predicates = []
tasks.max = 1
topics = [meters]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2022-12-12 21:19:33,788] INFO [TDengineSinkConnector|worker] Setting task configurations for 1 workers. (com.taosdata.kafka.connect.sink.TDengineSinkConnector:37)
[2022-12-12 21:19:33,800] INFO [TDengineSinkConnector|task-0] Creating task TDengineSinkConnector-0 (org.apache.kafka.connect.runtime.Worker:619)
[2022-12-12 21:19:33,802] INFO [TDengineSinkConnector|task-0] ConnectorConfig values:
config.action.reload = restart
connector.class = com.taosdata.kafka.connect.sink.TDengineSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = TDengineSinkConnector
predicates = []
tasks.max = 1
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig:376)
[2022-12-12 21:19:33,804] INFO [TDengineSinkConnector|task-0] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = com.taosdata.kafka.connect.sink.TDengineSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = TDengineSinkConnector
predicates = []
tasks.max = 1
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2022-12-12 21:19:33,807] INFO [TDengineSinkConnector|task-0] TaskConfig values:
task.class = class com.taosdata.kafka.connect.sink.TDengineSinkTask
(org.apache.kafka.connect.runtime.TaskConfig:376)
[2022-12-12 21:19:33,807] INFO [TDengineSinkConnector|task-0] Instantiated task TDengineSinkConnector-0 with version 1.0.2 of type com.taosdata.kafka.connect.sink.TDengineSinkTask (org.apache.kafka.connect.runtime.Worker:634)
[2022-12-12 21:19:33,809] INFO [TDengineSinkConnector|task-0] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = key
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2022-12-12 21:19:33,810] INFO [TDengineSinkConnector|task-0] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = value
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2022-12-12 21:19:33,811] INFO [TDengineSinkConnector|task-0] Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task TDengineSinkConnector-0 using the connector config (org.apache.kafka.connect.runtime.Worker:649)
[2022-12-12 21:19:33,811] INFO [TDengineSinkConnector|task-0] Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task TDengineSinkConnector-0 using the connector config (org.apache.kafka.connect.runtime.Worker:655)
[2022-12-12 21:19:33,812] INFO [TDengineSinkConnector|task-0] Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task TDengineSinkConnector-0 using the worker config (org.apache.kafka.connect.runtime.Worker:660)
[2022-12-12 21:19:33,818] INFO [TDengineSinkConnector|task-0] Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:1300)
[2022-12-12 21:19:33,818] INFO [TDengineSinkConnector|task-0] SinkConnectorConfig values:
config.action.reload = restart
connector.class = com.taosdata.kafka.connect.sink.TDengineSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = TDengineSinkConnector
predicates = []
tasks.max = 1
topics = [meters]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2022-12-12 21:19:33,819] INFO [TDengineSinkConnector|task-0] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = com.taosdata.kafka.connect.sink.TDengineSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = TDengineSinkConnector
predicates = []
tasks.max = 1
topics = [meters]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2022-12-12 21:19:33,828] INFO [TDengineSinkConnector|task-0] ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [localhost:9092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = connector-consumer-TDengineSinkConnector-0
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = connect-TDengineSinkConnector
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2022-12-12 21:19:33,869] WARN [TDengineSinkConnector|task-0] These configurations '[metrics.context.connect.kafka.cluster.id]' were supplied but are not used yet. (org.apache.kafka.clients.consumer.ConsumerConfig:385)
[2022-12-12 21:19:33,869] INFO [TDengineSinkConnector|task-0] Kafka version: 3.3.1 (org.apache.kafka.common.utils.AppInfoParser:119)
[2022-12-12 21:19:33,870] INFO [TDengineSinkConnector|task-0] Kafka commitId: e23c59d00e687ff5 (org.apache.kafka.common.utils.AppInfoParser:120)
[2022-12-12 21:19:33,870] INFO [TDengineSinkConnector|task-0] Kafka startTimeMs: 1670851173869 (org.apache.kafka.common.utils.AppInfoParser:121)
[2022-12-12 21:19:33,880] INFO Created connector TDengineSinkConnector (org.apache.kafka.connect.cli.ConnectStandalone:109)
[2022-12-12 21:19:33,883] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Subscribed to topic(s): meters (org.apache.kafka.clients.consumer.KafkaConsumer:973)
[2022-12-12 21:19:33,884] INFO [TDengineSinkConnector|task-0] Starting TDengine Sink task... (com.taosdata.kafka.connect.sink.TDengineSinkTask:35)
[2022-12-12 21:19:33,885] INFO [TDengineSinkConnector|task-0] SinkConfig values:
batch.size = 3000
connection.attempts = 3
connection.backoff.ms = 5000
connection.database = power
connection.database.prefix =
connection.password = taosdata
connection.timezone = UTC
connection.url = jdbc:TAOS://127.0.0.1:6030
connection.user = root
data.precision =
db.charset = UTF-8
db.schemaless = line
max.retries = 3
retry.backoff.ms = 3000
(com.taosdata.kafka.connect.sink.SinkConfig:376)
[2022-12-12 21:19:33,887] DEBUG [TDengineSinkConnector|task-0] Started TDengine sink task (com.taosdata.kafka.connect.sink.TDengineSinkTask:45)
[2022-12-12 21:19:33,887] INFO [TDengineSinkConnector|task-0] WorkerSinkTask{id=TDengineSinkConnector-0} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:313)
[2022-12-12 21:19:33,888] INFO [TDengineSinkConnector|task-0] WorkerSinkTask{id=TDengineSinkConnector-0} Executing sink task (org.apache.kafka.connect.runtime.WorkerSinkTask:198)
[2022-12-12 21:19:33,904] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Resetting the last seen epoch of partition meters-0 to 3 since the associated topicId changed from null to aAzPEb-5R7Gh3aL0i2Oeqw (org.apache.kafka.clients.Metadata:402)
[2022-12-12 21:19:33,908] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Cluster ID: _f6TFm5iSCueIwtCxMsCcQ (org.apache.kafka.clients.Metadata:287)
[2022-12-12 21:19:33,909] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Discovered group coordinator 172.16.2.5:9092 (id: 2147483646 rack: null) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:900)
[2022-12-12 21:19:33,912] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:566)
[2022-12-12 21:19:33,929] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Request joining group due to: need to re-join with the given member-id: connector-consumer-TDengineSinkConnector-0-a59c45ea-eaa9-4445-82f3-e52aea65a2ea (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1066)
[2022-12-12 21:19:33,930] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' (MemberIdRequiredException) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1066)
[2022-12-12 21:19:33,930] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:566)
[2022-12-12 21:19:36,934] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Successfully joined group with generation Generation{generationId=1, memberId='connector-consumer-TDengineSinkConnector-0-a59c45ea-eaa9-4445-82f3-e52aea65a2ea', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:627)
[2022-12-12 21:19:36,938] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Finished assignment for group at generation 1: {connector-consumer-TDengineSinkConnector-0-a59c45ea-eaa9-4445-82f3-e52aea65a2ea=Assignment(partitions=[meters-0])} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:705)
[2022-12-12 21:19:36,947] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Successfully synced group in generation Generation{generationId=1, memberId='connector-consumer-TDengineSinkConnector-0-a59c45ea-eaa9-4445-82f3-e52aea65a2ea', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:802)
[2022-12-12 21:19:36,947] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Notifying assignor about the new Assignment(partitions=[meters-0]) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:300)
[2022-12-12 21:19:36,950] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Adding newly assigned partitions: meters-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:312)
[2022-12-12 21:19:36,960] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Found no committed offset for partition meters-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1538)
[2022-12-12 21:19:36,971] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Resetting offset for partition meters-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[172.16.2.5:9092 (id: 1 rack: null)], epoch=3}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:399)
[2022-12-12 21:19:37,052] INFO [TDengineSinkConnector|task-0] create TDengine Connection, Attempt 0 of 3 (com.taosdata.kafka.connect.db.TSDBConnectionProvider:34)
[2022-12-12 21:19:37,110] ERROR [TDengineSinkConnector|task-0] WorkerSinkTask{id=TDengineSinkConnector-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. Error: commitCallbackHandler (org.apache.kafka.connect.runtime.WorkerSinkTask:609)
java.lang.NoSuchMethodError: commitCallbackHandler
at com.taosdata.jdbc.TSDBJNIConnector.initImp(Native Method)
at com.taosdata.jdbc.TSDBJNIConnector.init(TSDBJNIConnector.java:43)
at com.taosdata.jdbc.TSDBDriver.connect(TSDBDriver.java:162)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at com.taosdata.kafka.connect.db.TSDBConnectionProvider.getConnection(TSDBConnectionProvider.java:35)
at com.taosdata.kafka.connect.db.CacheProcessor.getConnection(CacheProcessor.java:40)
at com.taosdata.kafka.connect.db.CacheProcessor.execute(CacheProcessor.java:66)
at com.taosdata.kafka.connect.db.CacheProcessor.initDB(CacheProcessor.java:55)
at com.taosdata.kafka.connect.db.CacheProcessor.setDbName(CacheProcessor.java:33)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:94)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:189)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:244)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
[2022-12-12 21:19:37,117] ERROR [TDengineSinkConnector|task-0] WorkerSinkTask{id=TDengineSinkConnector-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:196)
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:611)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:189)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:244)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoSuchMethodError: commitCallbackHandler
at com.taosdata.jdbc.TSDBJNIConnector.initImp(Native Method)
at com.taosdata.jdbc.TSDBJNIConnector.init(TSDBJNIConnector.java:43)
at com.taosdata.jdbc.TSDBDriver.connect(TSDBDriver.java:162)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at com.taosdata.kafka.connect.db.TSDBConnectionProvider.getConnection(TSDBConnectionProvider.java:35)
at com.taosdata.kafka.connect.db.CacheProcessor.getConnection(CacheProcessor.java:40)
at com.taosdata.kafka.connect.db.CacheProcessor.execute(CacheProcessor.java:66)
at com.taosdata.kafka.connect.db.CacheProcessor.initDB(CacheProcessor.java:55)
at com.taosdata.kafka.connect.db.CacheProcessor.setDbName(CacheProcessor.java:33)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:94)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
... 10 more
[2022-12-12 21:19:37,117] INFO [TDengineSinkConnector|task-0] Stopping TDengine sink task (com.taosdata.kafka.connect.sink.TDengineSinkTask:160)
[2022-12-12 21:19:37,118] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Revoke previously assigned partitions meters-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:331)
[2022-12-12 21:19:37,118] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Member connector-consumer-TDengineSinkConnector-0-a59c45ea-eaa9-4445-82f3-e52aea65a2ea sending LeaveGroup request to coordinator 172.16.2.5:9092 (id: 2147483646 rack: null) due to the consumer is being closed (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1127)
[2022-12-12 21:19:37,120] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Resetting generation and member id due to: consumer pro-actively leaving the group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1019)
[2022-12-12 21:19:37,120] INFO [TDengineSinkConnector|task-0] [Consumer clientId=connector-consumer-TDengineSinkConnector-0, groupId=connect-TDengineSinkConnector] Request joining group due to: consumer pro-actively leaving the group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1066)
[2022-12-12 21:19:37,122] INFO [TDengineSinkConnector|task-0] Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:693)
[2022-12-12 21:19:37,122] INFO [TDengineSinkConnector|task-0] Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:697)
[2022-12-12 21:19:37,123] INFO [TDengineSinkConnector|task-0] Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:703)
[2022-12-12 21:19:37,129] INFO [TDengineSinkConnector|task-0] App info kafka.consumer for connector-consumer-TDengineSinkConnector-0 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)

Connector enters an error state after creation

java.lang.NoClassDefFoundError: Could not initialize class com.taosdata.jdbc.TSDBJNIConnector
	at com.taosdata.jdbc.TSDBDriver.connect(TSDBDriver.java:159)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:208)
	at com.taosdata.kafka.connect.db.TSDBConnectionProvider.getConnection(TSDBConnectionProvider.java:35)
	at com.taosdata.kafka.connect.source.MonitorThread.init(MonitorThread.java:50)
	at com.taosdata.kafka.connect.source.MonitorThread.<init>(MonitorThread.java:40)
	at com.taosdata.kafka.connect.source.TDengineSourceConnector.start(TDengineSourceConnector.java:30)
	at org.apache.kafka.connect.runtime.WorkerConnector.doStart(WorkerConnector.java:190)
	at org.apache.kafka.connect.runtime.WorkerConnector.start(WorkerConnector.java:215)
	at org.apache.kafka.connect.runtime.WorkerConnector.doTransitionTo(WorkerConnector.java:360)
	at org.apache.kafka.connect.runtime.WorkerConnector.doTransitionTo(WorkerConnector.java:343)
	at org.apache.kafka.connect.runtime.WorkerConnector.doRun(WorkerConnector.java:143)
	at org.apache.kafka.connect.runtime.WorkerConnector.run(WorkerConnector.java:121)
	at org.apache.kafka.connect.runtime.isolation.Plugins.lambda$withClassLoader$1(Plugins.java:177)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

Start-time problem and repeated data pulls

With "timestamp.initial":"2023-06-18 23:00:00", the start time can only be set to at least 12 hours in the past; the task keeps pulling the same data repeatedly, and only data older than 12 hours is ever pulled. Setting a more recent value is rejected by the validator:

Connector configuration is invalid and contains the following 1 error(s):
Invalid value 2023-06-18 23:00:00 for configuration timestamp.initial: timestamp initial value must be before now
You can also find the above list of errors at the endpoint /connector-plugins/{connectorType}/config/validate
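A fixed shift like the 12 hours reported above is consistent with a timezone mismatch: timestamp.initial is a naive wall-clock string, so the connector and the server may interpret it under different zones. A minimal sketch of the effect (the UTC+8 zone below is a hypothetical example, not taken from the report):

```python
from datetime import datetime, timezone, timedelta

raw = "2023-06-18 23:00:00"  # same shape as the timestamp.initial value above
naive = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")

# The same wall-clock string, interpreted under two different zones:
as_utc = naive.replace(tzinfo=timezone.utc)
as_east8 = naive.replace(tzinfo=timezone(timedelta(hours=8)))  # hypothetical zone

shift_hours = (as_utc.timestamp() - as_east8.timestamp()) / 3600
print(shift_hours)  # 8.0 -- a zone mismatch shifts the effective start time
```

If the shift matches the offset between the JVM's zone and the server's, aligning the two (or configuring the connector's timezone option) is worth trying before anything else.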

mvn package build failure

The full build output follows.
Environment: Ubuntu 20.04 (virtual machine)
Java version: Java 11
Maven version: 3.6.3

[INFO] Scanning for projects...
[INFO]
[INFO] ----------------< com.taosdata:kafka-connect-tdengine >-----------------
[INFO] Building kafka-connect-tdengine 1.0.2
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ kafka-connect-tdengine ---
[INFO] Deleting /home/software/kafka-connect-tdengine/target
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ kafka-connect-tdengine ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (default) @ kafka-connect-tdengine ---
[WARNING] Parameter tasks is deprecated, use target instead
[INFO] Executing tasks

main:
[copy] Copying 1 file to /home/software/kafka-connect-tdengine
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ kafka-connect-tdengine ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 30 source files to /home/software/kafka-connect-tdengine/target/classes
[WARNING] /home/software/kafka-connect-tdengine/src/main/java/com/taosdata/kafka/connect/config/CharsetValidator.java: Some input files use unchecked or unsafe operations.
[WARNING] /home/software/kafka-connect-tdengine/src/main/java/com/taosdata/kafka/connect/config/CharsetValidator.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ kafka-connect-tdengine ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/software/kafka-connect-tdengine/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ kafka-connect-tdengine ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 7 source files to /home/software/kafka-connect-tdengine/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.22.2:test (default-test) @ kafka-connect-tdengine ---
[INFO]
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running com.taosdata.kafka.connect.source.SourceInsertTest
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.291 s <<< FAILURE! - in com.taosdata.kafka.connect.source.SourceInsertTest
[ERROR] prepareData Time elapsed: 0.262 s <<< ERROR!
java.lang.NoSuchMethodError: commitCallbackHandler
at com.taosdata.kafka.connect.source.SourceInsertTest.createConnection(SourceInsertTest.java:27)
at com.taosdata.kafka.connect.source.SourceInsertTest.prepareData(SourceInsertTest.java:21)

[INFO] Running com.taosdata.kafka.connect.source.TimeStampOffsetTest
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.064 s - in com.taosdata.kafka.connect.source.TimeStampOffsetTest
[INFO] Running com.taosdata.kafka.connect.config.TimeZoneValidatorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.012 s - in com.taosdata.kafka.connect.config.TimeZoneValidatorTest
[INFO] Running com.taosdata.kafka.connect.config.SchemalessValidatorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 s - in com.taosdata.kafka.connect.config.SchemalessValidatorTest
[INFO] Running com.taosdata.kafka.connect.TDengineTaskTest
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.015 s - in com.taosdata.kafka.connect.TDengineTaskTest
[INFO] Running com.taosdata.kafka.connect.sink.TDengineSinkConnectorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.045 s - in com.taosdata.kafka.connect.sink.TDengineSinkConnectorTest
[INFO] Running com.taosdata.kafka.connect.util.VersionUtilsTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.014 s - in com.taosdata.kafka.connect.util.VersionUtilsTest
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] SourceInsertTest.prepareData:21->createConnection:27 » NoSuchMethod commitCall...
[INFO]
[ERROR] Tests run: 8, Failures: 0, Errors: 1, Skipped: 0
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13.159 s
[INFO] Finished at: 2022-12-26T03:58:31-05:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.22.2:test (default-test) on project kafka-connect-tdengine: There are test failures.
[ERROR]
[ERROR] Please refer to /home/software/kafka-connect-tdengine/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

mvn clean package fails while installing kafka-connect-tdengine

Hello:
While installing kafka-connect-tdengine following the steps in "How to sync Kafka data into TDengine?", mvn clean package fails with the error shown below. Please take a look, thanks.
My versions: Java 1.8
Maven version: 3.8.6
TDengine version: 3.0.2.4, with the taosd and taosadapter processes running

The exact error output:
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] TDengineTaskTest.before:121 » NoSuchMethod commitCallbackHandler
[INFO]
[ERROR] Tests run: 7, Failures: 0, Errors: 1, Skipped: 0
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:18 min
[INFO] Finished at: 2023-02-03T14:35:01+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.22.2:test (default-test) on project kafka-connect-tdengine: There are test failures.
[ERROR]
[ERROR] Please refer to /home/kafka/kafka-connect-tdengine/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

Hello, using this connector I get the error TDengine ERROR (80000216): Syntax error in SQL. How can I fix it?

[2022-08-09 16:18:18,956] WARN [TDengineSinkConnector|task-0] Write of 500 records failed, remainingRetries=1 (com.taosdata.kafka.connect.sink.TDengineSinkTask:110)
java.sql.SQLException: TDengine ERROR (80000216): Syntax error in SQL
	at com.taosdata.jdbc.TSDBError.createSQLException(TSDBError.java:76)
	at com.taosdata.jdbc.TSDBJNIConnector.insertLines(TSDBJNIConnector.java:376)
	at com.taosdata.jdbc.SchemalessWriter.write(SchemalessWriter.java:35)
	at com.taosdata.kafka.connect.db.CacheProcessor.schemalessInsert(CacheProcessor.java:86)
	at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:108)
	at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[2022-08-09 16:18:18,956] INFO [TDengineSinkConnector|task-0] Try closing connection jdbc:TAOS://localhost:6030 (com.taosdata.kafka.connect.db.CacheProcessor:94)
[2022-08-09 16:18:19,457] ERROR [TDengineSinkConnector|task-0] WorkerSinkTask{id=TDengineSinkConnector-0} RetriableException from SinkTask: (org.apache.kafka.connect.runtime.WorkerSinkTask:600)
org.apache.kafka.connect.errors.RetriableException: java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000216): Syntax error in SQL
	at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:121)
	at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.sql.SQLException: Exception chain:
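One common cause of a schemaless "Syntax error in SQL" is unescaped delimiter characters (comma, equals sign, space) inside tag or field values. A minimal sketch of the escaping rule — this follows the InfluxDB line-protocol convention, not a TDengine-specific API, so verify it against your TDengine version:

```python
def escape_tag_value(value: str) -> str:
    """Backslash-escape the characters that delimit line protocol."""
    for ch in (",", "=", " "):
        value = value.replace(ch, "\\" + ch)
    return value

# A tag value with an embedded space and comma would otherwise break parsing:
tag = escape_tag_value("Los Angeles,CA")
print(f"meters,location={tag} current=11.8 1648432611249000000")
```

Checking the raw messages on the topic for values like this is a quick first step before blaming the connector.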

Consumption throws away the entire batch when encountering an error

Bug Description
When the Kafka connector encounters non-conforming data, it discards the entire batch instead of inserting the valid records and dropping only the bad ones.

To Reproduce
Follow the official example at
https://docs.taosdata.com/third-party/kafka
and change the data file test-data.txt to:

meters,location=California.LosAngeles,groupid=2 current=11.8,voltage=221,phase=0.28 1648432611249000000
meters,location=California.LosAngeles,groupid=2 current=13.4,voltage=223,phase=0.29 1648432611250000000
errorline
meters,location=California.LosAngeles,groupid=3 current=10.8,voltage=223,phase=0.29 1648432611249000000
meters,location=California.LosAngeles,groupid=3 current=11.3,voltage=221,phase=0.35 1648432611250000000

This reproduces the issue.
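A rough shape check makes it easy to see which of the test lines above would be rejected. The regex below is a simplified sketch of the InfluxDB line-protocol grammar (measurement, optional tags, fields, optional timestamp), not TDengine's actual parser:

```python
import re

# measurement[,tag=val...] field=val[,field=val...] [timestamp]
LINE = re.compile(r"^\w+(,\w+=[^,\s]+)*\s+\w+=[^,\s]+(,\w+=[^,\s]+)*(\s+\d+)?$")

def is_valid_line(line: str) -> bool:
    """Cheap pre-check for line-protocol shape before handing data to the sink."""
    return bool(LINE.match(line))

lines = [
    "meters,location=California.LosAngeles,groupid=2 current=11.8,voltage=221,phase=0.28 1648432611249000000",
    "errorline",
]
print([is_valid_line(l) for l in lines])  # [True, False]
```

Filtering with a check like this on the producer side keeps malformed records from poisoning a sink batch in the first place.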

Environment (please complete the following information):
TDengine Version TDengine-server-2.6.0.8-Linux-x64
taosdata-kafka-connect-TDengine Version 1.0.1

Additional Context
After the whole-batch retries are exhausted, the task falls back to iterating over every record in the batch. By that point the connection has been closed and the database name has been lost, so even well-formed records can no longer be inserted and the entire batch is discarded. See the repeated "Database not specified or available" errors toward the end of the log.
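The behavior the report asks for can be sketched as a writer that, after the whole-batch insert fails, re-establishes the database context before each per-record retry and drops only the records that still fail. Everything below (FakeWriter, ensure_database) is a hypothetical stand-in for the connector's CacheProcessor, not its real API:

```python
class FakeWriter:
    """Hypothetical stand-in for the connector's CacheProcessor."""
    def __init__(self):
        self.inserted = []
        self.db_selected = False

    def ensure_database(self):
        self.db_selected = True  # i.e. re-issue "USE <db>" after a reconnect

    def insert(self, records):
        if not self.db_selected:
            raise RuntimeError("Database not specified or available")
        if any(r == "errorline" for r in records):
            raise ValueError("Syntax error in SQL")
        self.inserted.extend(records)

def write_batch(writer, records):
    """Insert a batch; on failure retry record-by-record, dropping only bad rows."""
    writer.ensure_database()
    try:
        writer.insert(records)       # fast path: whole batch at once
        return []
    except ValueError:
        pass
    dropped = []
    for rec in records:
        writer.ensure_database()     # keep the db context across reconnects
        try:
            writer.insert([rec])
        except ValueError:
            dropped.append(rec)      # discard only the malformed record
    return dropped

w = FakeWriter()
bad = write_batch(w, ["m,t=1 v=1", "errorline", "m,t=2 v=2"])
print(bad, w.inserted)  # ['errorline'] ['m,t=1 v=1', 'm,t=2 v=2']
```

The key difference from the logged behavior is calling ensure_database inside the per-record loop, so a reconnect cannot strand the retries without a database.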

The log:
[2022-07-21 14:45:29,154] WARN [TDengineSinkConnector|task-0] Write of 5 records failed, remainingRetries=2 (com.taosdata.kafka.connect.sink.TDengineSinkTask:110)
java.sql.SQLException: TDengine ERROR (80000362): Table does not exist
at com.taosdata.jdbc.TSDBError.createSQLException(TSDBError.java:76)
at com.taosdata.jdbc.TSDBJNIConnector.insertLines(TSDBJNIConnector.java:376)
at com.taosdata.jdbc.SchemalessWriter.write(SchemalessWriter.java:35)
at com.taosdata.kafka.connect.db.CacheProcessor.schemalessInsert(CacheProcessor.java:86)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:108)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
[2022-07-21 14:45:29,156] INFO [TDengineSinkConnector|task-0] Try closing connection jdbc:TAOS://127.0.0.1:6030 (com.taosdata.kafka.connect.db.CacheProcessor:94)
[2022-07-21 14:45:29,557] ERROR [TDengineSinkConnector|task-0] WorkerSinkTask{id=TDengineSinkConnector-0} RetriableException from SinkTask: (org.apache.kafka.connect.runtime.WorkerSinkTask:600)
org.apache.kafka.connect.errors.RetriableException: java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000362): Table does not exist

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:121)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

Caused by: java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000362): Table does not exist

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.getAllMessagesException(TDengineSinkTask.java:141)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:116)
    ... 12 more

[2022-07-21 14:45:32,558] INFO [TDengineSinkConnector|task-0] create TDengine Connection, Attempt 0 of 3 (com.taosdata.kafka.connect.db.TSDBConnectionProvider:34)
[2022-07-21 14:45:32,559] DEBUG [TDengineSinkConnector|task-0] Received 5 records. First record kafka coordinates:(meters-0-48594). Writing them to the database... (com.taosdata.kafka.connect.sink.TDengineSinkTask:101)
[2022-07-21 14:45:32,560] WARN [TDengineSinkConnector|task-0] Write of 5 records failed, remainingRetries=1 (com.taosdata.kafka.connect.sink.TDengineSinkTask:110)
java.sql.SQLException: TDengine ERROR (80000362): Table does not exist
at com.taosdata.jdbc.TSDBError.createSQLException(TSDBError.java:76)
at com.taosdata.jdbc.TSDBJNIConnector.insertLines(TSDBJNIConnector.java:376)
at com.taosdata.jdbc.SchemalessWriter.write(SchemalessWriter.java:35)
at com.taosdata.kafka.connect.db.CacheProcessor.schemalessInsert(CacheProcessor.java:86)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:108)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
[2022-07-21 14:45:32,560] INFO [TDengineSinkConnector|task-0] Try closing connection jdbc:TAOS://127.0.0.1:6030 (com.taosdata.kafka.connect.db.CacheProcessor:94)
[2022-07-21 14:45:33,011] ERROR [TDengineSinkConnector|task-0] WorkerSinkTask{id=TDengineSinkConnector-0} RetriableException from SinkTask: (org.apache.kafka.connect.runtime.WorkerSinkTask:600)
org.apache.kafka.connect.errors.RetriableException: java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000362): Table does not exist

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:121)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

Caused by: java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000362): Table does not exist

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.getAllMessagesException(TDengineSinkTask.java:141)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:116)
    ... 12 more

[2022-07-21 14:45:34,917] INFO [TDengineSinkConnector|task-0] create TDengine Connection, Attempt 0 of 3 (com.taosdata.kafka.connect.db.TSDBConnectionProvider:34)
[2022-07-21 14:45:34,919] DEBUG [TDengineSinkConnector|task-0] Received 5 records. First record kafka coordinates:(meters-0-48594). Writing them to the database... (com.taosdata.kafka.connect.sink.TDengineSinkTask:101)
[2022-07-21 14:45:34,920] WARN [TDengineSinkConnector|task-0] Write of 5 records failed, remainingRetries=0 (com.taosdata.kafka.connect.sink.TDengineSinkTask:110)
java.sql.SQLException: TDengine ERROR (80000362): Table does not exist
at com.taosdata.jdbc.TSDBError.createSQLException(TSDBError.java:76)
at com.taosdata.jdbc.TSDBJNIConnector.insertLines(TSDBJNIConnector.java:376)
at com.taosdata.jdbc.SchemalessWriter.write(SchemalessWriter.java:35)
at com.taosdata.kafka.connect.db.CacheProcessor.schemalessInsert(CacheProcessor.java:86)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:108)
at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
[2022-07-21 14:45:34,921] INFO [TDengineSinkConnector|task-0] Try closing connection jdbc:TAOS://127.0.0.1:6030 (com.taosdata.kafka.connect.db.CacheProcessor:94)
[2022-07-21 14:45:35,021] INFO [TDengineSinkConnector|task-0] create TDengine Connection, Attempt 0 of 3 (com.taosdata.kafka.connect.db.TSDBConnectionProvider:34)
[2022-07-21 14:45:35,024] ERROR [TDengineSinkConnector|task-0] Error encountered in task TDengineSinkConnector-0. Executing stage 'TASK_PUT' with class 'org.apache.kafka.connect.sink.SinkTask'. (org.apache.kafka.connect.runtime.errors.LogReporter:66)
java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000217): Database not specified or available

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.getAllMessagesException(TDengineSinkTask.java:141)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.unrollAndRetry(TDengineSinkTask.java:153)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:124)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

[2022-07-21 14:45:35,042] INFO [TDengineSinkConnector|task-0] [Producer clientId=connector-dlq-producer-TDengineSinkConnector-0] Resetting the last seen epoch of partition my-connector-errors-0 to 0 since the associated topicId changed from null to OpZ4KkyLQm6cesUeJh8GlA (org.apache.kafka.clients.Metadata:402)
[2022-07-21 14:45:35,062] INFO [TDengineSinkConnector|task-0] Try closing connection jdbc:TAOS://127.0.0.1:6030 (com.taosdata.kafka.connect.db.CacheProcessor:94)
[2022-07-21 14:45:35,531] INFO [TDengineSinkConnector|task-0] create TDengine Connection, Attempt 0 of 3 (com.taosdata.kafka.connect.db.TSDBConnectionProvider:34)
[2022-07-21 14:45:35,532] ERROR [TDengineSinkConnector|task-0] Error encountered in task TDengineSinkConnector-0. Executing stage 'TASK_PUT' with class 'org.apache.kafka.connect.sink.SinkTask'. (org.apache.kafka.connect.runtime.errors.LogReporter:66)
java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000217): Database not specified or available

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.getAllMessagesException(TDengineSinkTask.java:141)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.unrollAndRetry(TDengineSinkTask.java:153)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:124)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

[2022-07-21 14:45:35,533] INFO [TDengineSinkConnector|task-0] Try closing connection jdbc:TAOS://127.0.0.1:6030 (com.taosdata.kafka.connect.db.CacheProcessor:94)
[2022-07-21 14:45:36,034] INFO [TDengineSinkConnector|task-0] create TDengine Connection, Attempt 0 of 3 (com.taosdata.kafka.connect.db.TSDBConnectionProvider:34)
[2022-07-21 14:45:36,035] ERROR [TDengineSinkConnector|task-0] Error encountered in task TDengineSinkConnector-0. Executing stage 'TASK_PUT' with class 'org.apache.kafka.connect.sink.SinkTask'. (org.apache.kafka.connect.runtime.errors.LogReporter:66)
java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000217): Database not specified or available

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.getAllMessagesException(TDengineSinkTask.java:141)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.unrollAndRetry(TDengineSinkTask.java:153)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:124)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

[2022-07-21 14:45:36,037] INFO [TDengineSinkConnector|task-0] Try closing connection jdbc:TAOS://127.0.0.1:6030 (com.taosdata.kafka.connect.db.CacheProcessor:94)
[2022-07-21 14:45:36,538] INFO [TDengineSinkConnector|task-0] create TDengine Connection, Attempt 0 of 3 (com.taosdata.kafka.connect.db.TSDBConnectionProvider:34)
[2022-07-21 14:45:36,539] ERROR [TDengineSinkConnector|task-0] Error encountered in task TDengineSinkConnector-0. Executing stage 'TASK_PUT' with class 'org.apache.kafka.connect.sink.SinkTask'. (org.apache.kafka.connect.runtime.errors.LogReporter:66)
java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000217): Database not specified or available

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.getAllMessagesException(TDengineSinkTask.java:141)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.unrollAndRetry(TDengineSinkTask.java:153)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:124)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

[2022-07-21 14:45:36,542] INFO [TDengineSinkConnector|task-0] Try closing connection jdbc:TAOS://127.0.0.1:6030 (com.taosdata.kafka.connect.db.CacheProcessor:94)
[2022-07-21 14:45:37,023] INFO [TDengineSinkConnector|task-0] create TDengine Connection, Attempt 0 of 3 (com.taosdata.kafka.connect.db.TSDBConnectionProvider:34)
[2022-07-21 14:45:37,024] ERROR [TDengineSinkConnector|task-0] Error encountered in task TDengineSinkConnector-0. Executing stage 'TASK_PUT' with class 'org.apache.kafka.connect.sink.SinkTask'. (org.apache.kafka.connect.runtime.errors.LogReporter:66)
java.sql.SQLException: Exception chain:
java.sql.SQLException: TDengine ERROR (80000217): Database not specified or available

    at com.taosdata.kafka.connect.sink.TDengineSinkTask.getAllMessagesException(TDengineSinkTask.java:141)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.unrollAndRetry(TDengineSinkTask.java:153)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:124)
    at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)

Error "user is required" when using a REST connection

The connect.log output is as follows:

java.sql.SQLException: ERROR (0x2319): user is required
	at com.taosdata.jdbc.TSDBError.createSQLException(TSDBError.java:87)
	at com.taosdata.jdbc.TSDBError.createSQLException(TSDBError.java:74)
	at com.taosdata.jdbc.rs.ConnectionParam.getParam(ConnectionParam.java:146)
	at com.taosdata.jdbc.SchemalessWriter.init(SchemalessWriter.java:109)
	at com.taosdata.jdbc.SchemalessWriter.<init>(SchemalessWriter.java:51)
	at com.taosdata.kafka.connect.db.CacheProcessor.schemalessInsert(CacheProcessor.java:80)
	at com.taosdata.kafka.connect.sink.TDengineSinkTask.bulkWriteBatch(TDengineSinkTask.java:108)
	at com.taosdata.kafka.connect.sink.TDengineSinkTask.put(TDengineSinkTask.java:85)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:601)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:350)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:250)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:219)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:204)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:259)
	at org.apache.kafka.connect.runtime.isolation.Plugins.lambda$withClassLoader$1(Plugins.java:236)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)

sink-demo.json:

{
  "name": "TDengineSinkConnector",
  "config": {
    "connector.class":"com.taosdata.kafka.connect.sink.TDengineSinkConnector",
    "tasks.max": "1",
    "topics": "meters",
    "connection.url": "jdbc:TAOS-RS://127.0.0.1:6041",
    "connection.user": "root",
    "connection.password": "taosdata",
    "connection.database": "power",
    "db.schemaless": "line",
    "data.precision": "ns",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    "errors.tolerance": "all",
    "errors.deadletterqueue.topic.name": "dead_letter_topic",
    "errors.deadletterqueue.topic.replication.factor": 1
  }
}

After changing "connection.url": "jdbc:TAOS://127.0.0.1:6030" in sink-demo.json to "connection.url": "jdbc:TAOS-RS://127.0.0.1:6041?user=root&password=taosdata", the problem no longer occurs. It looks like the connection.user option is ignored when a REST connection is used, so I had to append the credentials to the URL directly.

Please verify and fix this issue.
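The workaround above (embedding the credentials in the URL) can be expressed as a small helper. This is only a sketch of the reporter's workaround; the class and method names (JdbcUrl, buildRestUrl) are illustrative and not part of the connector:

```java
public class JdbcUrl {
    // Hypothetical helper: appends user/password as query parameters,
    // mirroring the workaround for the REST (TAOS-RS) driver described above.
    static String buildRestUrl(String host, int port, String user, String password) {
        return "jdbc:TAOS-RS://" + host + ":" + port
                + "?user=" + user + "&password=" + password;
    }

    public static void main(String[] args) {
        // Reproduces the URL the reporter used instead of connection.user/password:
        System.out.println(buildRestUrl("127.0.0.1", 6041, "root", "taosdata"));
    }
}
```

Note that this puts the password in plain text in the URL, so it is only a stopgap until connection.user works for REST connections.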

mvn clean package fails

[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running com.taosdata.kafka.connect.config.SchemalessValidatorTest
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.031 s - in com.taosdata.kafka.connect.config.SchemalessValidatorTest
[INFO] Running com.taosdata.kafka.connect.config.TimeZoneValidatorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 s - in com.taosdata.kafka.connect.config.TimeZoneValidatorTest
[INFO] Running com.taosdata.kafka.connect.util.VersionUtilsTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 s - in com.taosdata.kafka.connect.util.VersionUtilsTest
[INFO] Running com.taosdata.kafka.connect.TDengineTaskTest
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.036 s <<< FAILURE! - in com.taosdata.kafka.connect.TDengineTaskTest
[ERROR] com.taosdata.kafka.connect.TDengineTaskTest Time elapsed: 0.036 s <<< ERROR!
java.lang.NoClassDefFoundError: com/alibaba/fastjson/JSONObject
at com.taosdata.kafka.connect.TDengineTaskTest.before(TDengineTaskTest.java:121)
Caused by: java.lang.ClassNotFoundException: com.alibaba.fastjson.JSONObject
at com.taosdata.kafka.connect.TDengineTaskTest.before(TDengineTaskTest.java:121)

[INFO] Running com.taosdata.kafka.connect.sink.TDengineSinkConnectorTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.018 s - in com.taosdata.kafka.connect.sink.TDengineSinkConnectorTest
[INFO] Running com.taosdata.kafka.connect.source.TimeStampOffsetTest
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.011 s - in com.taosdata.kafka.connect.source.TimeStampOffsetTest
[INFO] Running com.taosdata.kafka.connect.source.SourceInsertTest
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.004 s <<< FAILURE! - in com.taosdata.kafka.connect.source.SourceInsertTest
[ERROR] prepareData Time elapsed: 0.003 s <<< ERROR!
java.lang.NoClassDefFoundError: com/alibaba/fastjson/JSONObject
at com.taosdata.kafka.connect.source.SourceInsertTest.createConnection(SourceInsertTest.java:27)
at com.taosdata.kafka.connect.source.SourceInsertTest.prepareData(SourceInsertTest.java:21)

[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] TDengineTaskTest.before:121 » NoClassDefFound com/alibaba/fastjson/JSONObject
[ERROR] SourceInsertTest.prepareData:21->createConnection:27 » NoClassDefFound com/ali...
[INFO]
[ERROR] Tests run: 7, Failures: 0, Errors: 2, Skipped: 0
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.878 s
[INFO] Finished at: 2023-03-20T17:15:56+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.22.2:test (default-test) on project kafka-connect-tdengine: There are test failures.
[ERROR]
[ERROR] Please refer to /data/usr/kafka/connect/kafka-connect-tdengine-3.0/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
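The NoClassDefFoundError: com/alibaba/fastjson/JSONObject in the test run above suggests fastjson is missing from the test classpath. One possible fix (the version number is illustrative, not taken from the project) is declaring it as a test-scoped dependency in pom.xml:

```xml
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>fastjson</artifactId>
    <version>1.2.83</version>
    <scope>test</scope>
</dependency>
```

Alternatively, building with mvn clean package -DskipTests bypasses the failing tests and still produces the connector jar.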

TDengine Sink Connector data sync fails

Background: using the TDengine Sink Connector to sync data from Kafka into TDengine; the sync never succeeds. How can I view the sync logs?
Database version: 3.1.0.0
Kafka: 2.2.2
Symptom: using the OpenTSDB JSON protocol format, the data is never synced over, and there is no way to view the logs.
Sink connector status display (screenshot omitted)
sink-demo.json configuration (screenshot omitted):
{
  "name": "TDengineSinkConnector",
  "config": {
    "connector.class": "com.taosdata.kafka.connect.sink.TDengineSinkConnector",
    "tasks.max": "1",
    "topics": "bigdata_plateform_meters",
    "connection.url": "jdbc:TAOS://127.0.0.1:6030",
    "connection.user": "root",
    "connection.password": "taosdata",
    "connection.database": "bigdata_plateform",
    "db.schemaless": "json",
    "data.precision": "ms",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    "errors.tolerance": "all",
    "errors.deadletterqueue.topic.name": "bigdata_plateform_dead_letter_topic",
    "errors.deadletterqueue.topic.replication.factor": 1
  }
}

Kafka message content (screenshot omitted):
[
  { "metric": "sys.cpu.nice", "timestamp": 1648432611250000, "value": 18, "tags": { "host": "web01", "dc": "lga" } },
  { "metric": "sys.cpu.nice", "timestamp": 1648432311251000, "value": 9, "tags": { "host": "web02", "dc": "lga" } },
  { "metric": "sys.cpu.nice", "timestamp": 1648433311251000, "value": 9, "tags": { "host": "web02", "dc": "lga" } }
]
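One thing worth double-checking in this report: the sample timestamps are 16 digits (e.g. 1648432611250000), which looks like microseconds, yet the connector is configured with "data.precision": "ms". This is a guess from the digit count, not a confirmed diagnosis; a quick sketch (class and method names are illustrative) shows how differently the two interpretations resolve:

```java
import java.time.Instant;
import java.time.ZoneOffset;

public class PrecisionCheck {
    // Returns the UTC calendar year of an epoch value interpreted as milliseconds.
    static int yearAt(long epochMillis) {
        return Instant.ofEpochMilli(epochMillis).atZone(ZoneOffset.UTC).getYear();
    }

    public static void main(String[] args) {
        long ts = 1648432611250000L; // first timestamp from the Kafka message above

        // Interpreted as milliseconds (the configured data.precision = "ms"),
        // the timestamp lands tens of thousands of years in the future:
        System.out.println(yearAt(ts));

        // Interpreted as microseconds (i.e. data.precision = "us"),
        // it falls in March 2022, which matches the data:
        System.out.println(yearAt(ts / 1000));
    }
}
```

If the payload really carries microsecond timestamps, trying "data.precision": "us" may be worthwhile before digging further into the connector logs.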
