For context on getting started with ingestion, check out our metadata ingestion guide.
To install this plugin, run:

```shell
pip install 'acryl-datahub[kafka]'
```
This plugin extracts the following:
- Topics from the Kafka broker
- Schemas associated with each topic from the schema registry
Check out the following recipe to get started with ingestion! See below for full configuration options.
For general pointers on writing and running a recipe, see our main recipe guide.
```yml
source:
  type: "kafka"
  config:
    # Coordinates
    connection:
      bootstrap: "broker:9092"
      schema_registry_url: http://localhost:8081

sink:
  # sink configs
```
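Once you've filled in a sink, you can run the recipe with the DataHub CLI (the filename here is just an example):

```shell
datahub ingest -c kafka_recipe.yml
```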
Note that a `.` is used to denote nested fields in the YAML recipe.
| Field | Description |
| ----- | ----------- |
| `connection.schema_registry_url` | Schema registry location. |
| `connection.schema_registry_config.<option>` | Extra schema registry config. These options will be passed into Kafka's SchemaRegistryClient. See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#schemaregistryclient. |
| `connection.consumer_config.<option>` | Extra consumer config. These options will be passed into Kafka's DeserializingConsumer. See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#deserializingconsumer and https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md. |
| `connection.producer_config.<option>` | Extra producer config. These options will be passed into Kafka's SerializingProducer. See https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#serializingproducer and https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md. |
| `topic_patterns.allow` | List of regex patterns for topics to include in ingestion. |
| `topic_patterns.deny` | List of regex patterns for topics to exclude from ingestion. |
| `topic_patterns.ignoreCase` | Whether to ignore case when matching topic patterns. |
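For instance, to ingest only topics that follow a particular naming convention while skipping Kafka's internal topics, you might combine `allow` and `deny` patterns as in the sketch below (the `sales_` prefix is purely illustrative):

```yml
source:
  type: "kafka"
  config:
    connection:
      bootstrap: "broker:9092"
      schema_registry_url: http://localhost:8081
    topic_patterns:
      # Ingest only topics under a (hypothetical) "sales_" prefix...
      allow:
        - "sales_.*"
      # ...and skip internal topics, which conventionally start with an underscore.
      deny:
        - "^_.*"

sink:
  # sink configs
```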
The options in the consumer config and schema registry config are passed to the Kafka DeserializingConsumer and SchemaRegistryClient, respectively.
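As a minimal sketch of how these pass-through options fit together, connecting to a SASL_SSL-secured broker and a schema registry behind basic auth might look like the following (hosts and credentials are placeholders):

```yml
source:
  type: "kafka"
  config:
    connection:
      bootstrap: "broker:9092"
      schema_registry_url: https://schema-registry:8081
      # Options here are passed through to the DeserializingConsumer
      # (standard librdkafka settings).
      consumer_config:
        security.protocol: "SASL_SSL"
        sasl.mechanism: "PLAIN"
        sasl.username: "${KAFKA_USERNAME}"
        sasl.password: "${KAFKA_PASSWORD}"
      # Options here are passed through to the SchemaRegistryClient.
      schema_registry_config:
        basic.auth.user.info: "${REGISTRY_USERNAME}:${REGISTRY_PASSWORD}"

sink:
  # sink configs
```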
For a full example with a number of security options, see this example recipe.
If you've got any questions on configuring this source, feel free to ping us on our Slack!