Kafka
If you’re new to Unstructured, read this note first.
Before you can create a source connector, you must first sign up for Unstructured. After you sign up, the Unstructured user interface (UI) appears, which you use to create the source connector.
After you create the source connector, add it along with a destination connector to a workflow. Then run the workflow as a job. To learn how, try out the hands-on UI quickstart or watch the 4-minute video tutorial.
You can also create source connectors with the Unstructured API. Learn how.
If you need help, reach out to the community on Slack, or contact us directly.
You are now ready to start creating a source connector! Keep reading to learn how.
Ingest your files into Unstructured from Kafka.
The requirements are as follows.

- A Kafka cluster in Confluent Cloud. (Create a cluster.) The following video shows how to set up a Kafka cluster in Confluent Cloud:
- The hostname and port number of the bootstrap Kafka cluster to connect to.
- The name of the topic to read messages from or write messages to on the cluster. Create a topic. Access available topics.
- For authentication, an API key and secret.
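For reference, the bootstrap server, port, API key, and secret listed above map onto a standard Confluent Cloud client configuration: Confluent Cloud authenticates over SASL/PLAIN with TLS, using the API key as the username and the secret as the password. The sketch below (plain Python; the hostname, key, and secret shown are placeholders, not real values) builds the configuration dictionary that Kafka clients such as confluent-kafka accept:

```python
def build_confluent_config(bootstrap_server: str, port: int,
                           api_key: str, secret: str) -> dict:
    """Build a Confluent Cloud client config from the connector fields.

    Confluent Cloud uses SASL/PLAIN over TLS, with the API key as the
    SASL username and the API secret as the SASL password.
    """
    return {
        "bootstrap.servers": f"{bootstrap_server}:{port}",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": api_key,
        "sasl.password": secret,
    }

# Placeholder values for illustration only:
config = build_confluent_config(
    "pkc-xxxxx.us-east-1.aws.confluent.cloud", 9092, "MY_API_KEY", "MY_SECRET"
)
print(config["bootstrap.servers"])
```

The same key/secret pair you supply to this connector would fill the `sasl.username` and `sasl.password` fields here.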
To create the source connector:
- On the sidebar, click Connectors.
- Click Sources.
- Click New or Create Connector.
- Enter a unique Name for the connector.
- In the Provider area, click Kafka.
- Click Continue.
- Follow the on-screen instructions to fill in the fields as described later on this page.
- Click Save and Test.
Fill in the following fields:
- Name (required): A unique name for this connector.
- Bootstrap Server (required): The hostname of the bootstrap Kafka cluster to connect to.
- Port: The port number of the cluster.
- Group ID: The ID of the consumer group, if any, that is associated with the target Kafka cluster. (A consumer group is a way to allow a pool of consumers to divide the consumption of data over topics and partitions.) The default is `default_group_id` if not otherwise specified.
- Topic (required): The unique name of the topic to read messages from and write messages to on the cluster.
- Number of messages to consume: The maximum number of messages to get from the topic. The default is `100` if not otherwise specified.
- Batch Size: The maximum number of messages to send in a single batch. The default is `100` if not otherwise specified.
- API Key (required): The Kafka API key value.
- Secret (required): The secret value for the Kafka API key.
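To make the interplay between Number of messages to consume and Batch Size concrete, the sketch below (illustrative Python only, not Unstructured's actual implementation) shows how a consumer might cap the total messages read from the topic and then group them into batches:

```python
def batch_messages(messages, max_messages=100, batch_size=100):
    """Cap the stream at max_messages (Number of messages to consume),
    then split it into groups of at most batch_size messages (Batch Size).
    The defaults mirror the connector's defaults of 100 and 100."""
    capped = messages[:max_messages]
    return [capped[i:i + batch_size] for i in range(0, len(capped), batch_size)]

# 250 messages on the topic, cap at 100, batches of 40 -> 3 batches
batches = batch_messages(list(range(250)), max_messages=100, batch_size=40)
print([len(b) for b in batches])  # [40, 40, 20]
```

In other words, no more than Number of messages to consume are read per run, and those messages are delivered downstream in chunks of at most Batch Size.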