Comments (8)
Can you explain a little more what you are trying to do? What is the format you want the data written in when sent to Pub/Sub? In what format is the data you've written to Kafka stored?
Thanks @kamalaboulhosn for your help!
Yes, so I'm trying to sink data from Kafka topics that was written using kafka-avro-serializer, so the topics contain byte arrays consisting of the ID of the schema registered in the Confluent Schema Registry, followed by the payload of the event itself.
I would like to copy the data from those topics to Pub/Sub in a target format such as JSON.
I would like the connector to read the data, deserialize it, convert it to JSON, and send it to the Pub/Sub topic.
That's what the GCS sink connector does, for instance, using the following configuration:
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://localhost:8081",
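For context, those two properties sit inside a larger connector configuration. A minimal sketch of such a GCS sink config is below; the connector name, topic, and format class are illustrative, and bucket/credential settings that a real deployment needs are omitted:

```json
{
  "name": "gcs-sink-avro",
  "config": {
    "connector.class": "io.confluent.connect.gcs.GcsSinkConnector",
    "topics": "users",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  }
}
```

With the AvroConverter configured, the connector looks up each message's schema in the registry and hands the sink a structured record rather than raw bytes.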
I believe the notion of the schema registry is a concept specific to Confluent Kafka and not part of the generic, open-source Kafka Connect infrastructure. The GCS connector you link to is provided by Confluent, whereas this one is not. At this time, we do not support any lookup of schemas in a schema registry via this connector.
OK, so for now the only way to read Avro from Kafka topics would be to embed the full Avro schema within every Kafka message?
You can dump the Avro messages into Cloud Pub/Sub as-is, since they are just bytes. You'd then have to rely on your subscribers to decode the messages. If all messages on a topic use the same schema, then you could potentially take advantage of Pub/Sub's schema support.
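If the bytes are passed through as-is, a subscriber has to undo the kafka-avro-serializer framing itself. A minimal sketch, assuming the messages use Confluent's standard wire format (a 0x00 magic byte followed by the 4-byte big-endian schema ID, then the Avro body):

```python
import struct

def split_confluent_wire_format(payload: bytes):
    """Split a Confluent-framed payload into (schema_id, avro_body).

    kafka-avro-serializer prefixes every record with a 5-byte header:
    one magic byte (0x00) and the registry schema ID as a big-endian
    32-bit integer. The subscriber can use the ID to fetch the schema
    from the registry and decode the remaining bytes.
    """
    if len(payload) < 5 or payload[0] != 0:
        raise ValueError("payload is not in Confluent wire format")
    _magic, schema_id = struct.unpack(">bI", payload[:5])
    return schema_id, payload[5:]

# Example: schema ID 42 followed by the Avro-encoded body.
schema_id, body = split_confluent_wire_format(b"\x00\x00\x00\x00\x2a" + b"avro-bytes")
print(schema_id)  # 42
```

Decoding `body` would then require an Avro library (e.g. fastavro) plus a lookup of schema 42 in the registry, which is exactly the step this connector does not perform for you.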
Actually, we use Kafka topics with several schemas. For instance, UserDeleted and UserCreated messages have different schemas but need to be stored in the same topic partition to ensure ordering.
Yeah, so in that case, there is no way to convert the Avro into another format within the connector. You could store your schema in Pub/Sub and then manually attach the path to it as an attribute in your messages so that you can pull the schema and decode messages, though this would require your Kafka publisher to publish with the metadata in the headers.
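That convention could look like the following sketch. The attribute names (`schema_subject`, `schema_version`) are an assumption for illustration, not part of any connector contract:

```python
def build_message(avro_payload: bytes, subject: str, version: int):
    """Pair an Avro payload with attributes pointing at its schema.

    The subscriber reads these attributes, fetches the matching schema,
    and decodes the body. Attribute names here are hypothetical.
    """
    attributes = {
        "schema_subject": subject,       # e.g. "users-value"
        "schema_version": str(version),  # Pub/Sub attribute values must be strings
    }
    return avro_payload, attributes

data, attrs = build_message(b"\x00\x00\x00\x00\x2a" + b"avro-bytes", "users-value", 3)
print(attrs["schema_version"])  # "3"

# A publisher would then pass the attributes as keyword arguments, e.g.:
#   from google.cloud import pubsub_v1
#   publisher = pubsub_v1.PublisherClient()
#   publisher.publish(topic_path, data, **attrs)
```

As noted above, this only works end-to-end if the Kafka producer puts the schema metadata in the record headers so the connector (or publisher) can forward it.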
I have a topic that also has an Avro schema for the message key. I am configuring the key.converter for this connector, but I cannot see the messages being sunk to my Pub/Sub topic.
"key.converter": "io.confluent.connect.avro.AvroConverter",
"key.converter.schema.registry.url": "<%= @schema_registry_url %>"
On the Pub/Sub side, I created the topic and defined an Avro schema for it (for the message value, of course; I copied the Avro schema from the Confluent Schema Registry and created a new schema in GCP Pub/Sub for the topic). I don't see any error in my logs explaining why messages are not being sunk to the Pub/Sub topic.
Is there any way to sink the Kafka message key, e.g. {"mykey":"mykeyvalue"}, with this Pub/Sub sink connector?