Destinations
Kafka
Send processed data from Unstructured to Kafka.
The requirements are as follows.
- A Kafka cluster in Confluent Cloud. (Create a cluster.)
- The hostname and port number of the bootstrap Kafka cluster to connect to.
- The name of the topic to read messages from or write messages to on the cluster. Create a topic. Access available topics. A sketch of creating and listing topics programmatically appears after this list.
- For authentication, an API key and secret.
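If you want to create the topic or check which topics are available before configuring the connector, you can do so programmatically. The following is a minimal sketch, not part of the Unstructured product, that uses the confluent-kafka Python package's Admin API against a Confluent Cloud cluster; the bootstrap server address, API key, secret, and topic name are placeholder values that you would replace with your own.

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Placeholder connection values for a Confluent Cloud cluster.
conf = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
}

admin = AdminClient(conf)

# Create a topic for the connector to write to. Confluent Cloud requires a
# replication factor of 3.
futures = admin.create_topics(
    [NewTopic("unstructured-output", num_partitions=1, replication_factor=3)]
)
for topic, future in futures.items():
    try:
        future.result()  # Raises if topic creation failed (for example, if it already exists).
        print(f"Created topic: {topic}")
    except Exception as err:
        print(f"Topic {topic} was not created: {err}")

# List the topics that are currently available on the cluster.
metadata = admin.list_topics(timeout=10)
print("Available topics:", sorted(metadata.topics))
```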
To create the destination connector:
- On the sidebar, click Connectors.
- Click Destinations.
- Click New or Create Connector.
- Give the connector a unique Name.
- In the Provider area, click Kafka.
- Click Continue.
- Follow the on-screen instructions to fill in the fields as described later on this page.
- Click Save and Test.
Fill in the following fields:
- Name (required): A unique name for this connector.
- Bootstrap Server (required): The hostname of the bootstrap Kafka cluster to connect to.
- Port: The port number of the cluster.
- Group ID: The ID of the consumer group, if any, that is associated with the target Kafka cluster. (A consumer group is a way to allow a pool of consumers to divide the consumption of data over topics and partitions.) The default is `default_group_id` if not otherwise specified.
- Topic (required): The unique name of the topic to read messages from and write messages to on the cluster.
- Number of messages to consume: The maximum number of messages to get from the topic. The default is `100` if not otherwise specified.
- Batch Size: The maximum number of messages to send in a single batch. The default is `100` if not otherwise specified.
- API Key (required): The Kafka API key value.
- Secret (required): The secret value for the Kafka API key.
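Before clicking Save and Test, you can optionally confirm that the Bootstrap Server, Port, Topic, API Key, and Secret values work together. The sketch below, assuming the confluent-kafka Python package and placeholder connection values, produces a single test message to the topic and reports whether delivery succeeded.

```python
from confluent_kafka import Producer

# Placeholder values; substitute the same values you enter in the connector fields.
conf = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # Bootstrap Server and Port
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",     # API Key
    "sasl.password": "<API_SECRET>",  # Secret
}

producer = Producer(conf)

def delivery_report(err, msg):
    # Called once for the message to report delivery success or failure.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

# Produce one test message to the topic the connector will write to.
producer.produce("unstructured-output", value=b"connection test", callback=delivery_report)
producer.flush(10)
```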