---
title: Redpanda
slug: /connectors/messaging/redpanda
---

{% connectorDetailsHeader
name="Redpanda"
stage="PROD"
platform="OpenMetadata"
availableFeatures=["Topics", "Sample Data"]
unavailableFeatures=[]
/ %}

In this section, we provide guides and references to use the Redpanda connector.

Configure and schedule Redpanda metadata workflows from the OpenMetadata UI:

- [Requirements](#requirements)
- [Metadata Ingestion](#metadata-ingestion)

{% partial file="/v1.4/connectors/ingestion-modes-tiles.md" variables={yamlPath: "/connectors/messaging/redpanda/yaml"} /%}

## Requirements

Connecting to Redpanda does not require any prior configuration.

Topic schemas are ingested separately by configuring the **Schema Registry URL**; only the **Bootstrap Servers** information is mandatory.

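As a reference, a minimal service connection could look like the YAML sketch below. The host names, ports, and service name are placeholders, the field names follow the Kafka-style connection schema, and the authoritative format is the YAML deployment page linked above.

```yaml
source:
  type: redpanda
  serviceName: local_redpanda        # placeholder service name
  serviceConnection:
    config:
      type: Redpanda
      # Only bootstrapServers is mandatory.
      bootstrapServers: localhost:9092
      # Optional: required only to ingest topic schemas.
      schemaRegistryURL: http://localhost:8081
  sourceConfig:
    config:
      type: MessagingMetadata
```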
## Metadata Ingestion

{% partial
  file="/v1.4/connectors/metadata-ingestion-ui.md"
  variables={
    connector: "Redpanda",
    selectServicePath: "/images/v1.4/connectors/redpanda/select-service.png",
    addNewServicePath: "/images/v1.4/connectors/redpanda/add-new-service.png",
    serviceConnectionPath: "/images/v1.4/connectors/redpanda/service-connection.png",
  }
/%}

{% stepsContainer %}

{% extraContent parentTagName="stepsContainer" %}

#### Connection Details

- **Bootstrap Servers**: List of brokers as comma-separated values of broker `host` or `host:port`. Example: `host1:9092,host2:9092`
- **Schema Registry URL**: URL of the Schema Registry used to ingest the schemas of the topics.
- **SASL Username**: SASL username for use with the PLAIN and SASL-SCRAM mechanisms.
- **SASL Password**: SASL password for use with the PLAIN and SASL-SCRAM mechanisms.
- **SASL Mechanism**: SASL mechanism to use for authentication.
- **Basic Auth User Info**: Schema Registry client HTTP credentials in the form of `username:password`. By default, user info is extracted from the URL if present.
- **Consumer Config**: Additional consumer configuration values. The accepted keys are listed in the [librdkafka configuration reference](https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md).
- **Schema Registry Config**: Additional Schema Registry configuration values. The accepted keys are listed in the [`SchemaRegistryClient` documentation](https://docs.confluent.io/5.5.1/clients/confluent-kafka-python/index.html#confluent_kafka.schema_registry.SchemaRegistryClient).

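The `host` / `host:port` format accepted by **Bootstrap Servers** can be illustrated with a short Python sketch. The `parse_bootstrap_servers` helper and the fallback port `9092` are illustrative assumptions for this example, not part of the connector itself.

```python
def parse_bootstrap_servers(servers: str) -> list[tuple[str, int]]:
    """Split a comma-separated broker list into (host, port) pairs.

    Entries may be `host` or `host:port`; 9092 (the conventional
    Kafka port) is assumed here when no port is given.
    """
    brokers = []
    for entry in servers.split(","):
        entry = entry.strip()
        if not entry:
            continue
        host, _, port = entry.partition(":")
        brokers.append((host, int(port) if port else 9092))
    return brokers

print(parse_bootstrap_servers("host1:9092,host2:9092,host3"))
# [('host1', 9092), ('host2', 9092), ('host3', 9092)]
```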
{% note %}
To ingest topic schemas, the `Schema Registry URL` must be provided.
{% /note %}

{% /extraContent %}

{% partial file="/v1.4/connectors/test-connection.md" /%}

{% partial file="/v1.4/connectors/messaging/configure-ingestion.md" /%}

{% partial file="/v1.4/connectors/ingestion-schedule-and-deploy.md" /%}

{% /stepsContainer %}

{% partial file="/v1.4/connectors/troubleshooting.md" /%}