If you’re new to Unstructured, read this note first.

Before you can create a source connector, you must first sign in to your Unstructured account:

  • If you do not already have an Unstructured account, go to https://unstructured.io/contact and fill out the online form to indicate your interest.
  • If you already have an Unstructured account, go to https://platform.unstructured.io and sign in by using the email address, Google account, or GitHub account that is associated with your Unstructured account.

After you sign in, the Unstructured user interface (UI) appears, which you use to create your source connector.

After you create the source connector, add it along with a destination connector to a workflow. Then run the workflow as a job. To learn how, try out the hands-on UI quickstart or watch the 4-minute video tutorial.

You can also create source connectors with the Unstructured API. Learn how.
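As a rough sketch of the API route, the request body for a Kafka source connector might be built as follows. The endpoint path, connector type string, and config field names here are assumptions based on the UI fields described on this page; confirm them against the Unstructured API documentation before use.

```python
import json

# Hypothetical payload for creating a Kafka source connector via the
# Unstructured API. Field names mirror the UI fields on this page and
# are assumptions; all values are placeholders.
payload = {
    "name": "my-kafka-source",           # Name (required)
    "type": "kafka-cloud",               # assumed connector type identifier
    "config": {
        "bootstrap_servers": "pkc-12345.us-east-1.aws.confluent.cloud",  # Bootstrap Server
        "port": 9092,                    # Port
        "group_id": "default_group_id",  # Group ID (default shown)
        "topic": "my-topic",             # Topic (required)
        "num_messages_to_consume": 100,  # Number of messages to consume
        "batch_size": 100,               # Batch Size
        "kafka_api_key": "<YOUR_KAFKA_API_KEY>",   # API Key (required)
        "secret": "<YOUR_KAFKA_API_SECRET>",       # Secret (required)
    },
}

# The request itself would then be an authenticated POST of this JSON
# body to the sources endpoint (path assumed; see the API docs).
body = json.dumps(payload)
```

The same payload shape would apply to other source connector types, with only `type` and the keys inside `config` changing.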

If you need help, reach out to the community on Slack, or contact us directly.

You are now ready to start creating a source connector! Keep reading to learn how.

Ingest your files into Unstructured from Kafka.

To create the source connector, gather your Kafka cluster's connection details (described in the fields section later on this page), and then:

  1. On the sidebar, click Connectors.
  2. Click Sources.
  3. Click New or Create Connector.
  4. Give the connector a unique Name.
  5. In the Provider area, click Kafka.
  6. Click Continue.
  7. Follow the on-screen instructions to fill in the fields as described later on this page.
  8. Click Save and Test.

Fill in the following fields:

  • Name (required): A unique name for this connector.
  • Bootstrap Server (required): The hostname of the bootstrap server for the Kafka cluster to connect to.
  • Port: The port number of the cluster.
  • Group ID: The ID of the consumer group, if any, that is associated with the target Kafka cluster. (A consumer group is a way to allow a pool of consumers to divide the consumption of data over topics and partitions.) The default is default_group_id if not otherwise specified.
  • Topic (required): The unique name of the topic to read messages from and write messages to on the cluster.
  • Number of messages to consume: The maximum number of messages to get from the topic. The default is 100 if not otherwise specified.
  • Batch Size: The maximum number of messages to send in a single batch. The default is 100 if not otherwise specified.
  • API Key (required): The Kafka API key value.
  • Secret (required): The secret value for the Kafka API key.
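To see how these fields map onto a standard Kafka client configuration, here is a minimal sketch in the style of a Confluent-type consumer config dict. All values are placeholders (assumptions), and the SASL settings reflect the common case of a managed cluster secured with an API key and secret:

```python
# Sketch of consumer settings corresponding to the connector fields above.
# Every value here is a placeholder; substitute your own cluster details.
kafka_config = {
    # Bootstrap Server and Port combine into a single host:port pair.
    "bootstrap.servers": "pkc-12345.us-east-1.aws.confluent.cloud:9092",
    # Group ID: falls back to default_group_id when not specified.
    "group.id": "default_group_id",
    # On managed clusters, the API Key and Secret typically map to
    # SASL/PLAIN credentials (an assumption; check your provider's docs).
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<YOUR_KAFKA_API_KEY>",
    "sasl.password": "<YOUR_KAFKA_API_SECRET>",
}

topic = "my-topic"    # Topic: the topic messages are read from
max_messages = 100    # Number of messages to consume (default 100)
batch_size = 100      # Batch Size (default 100)
```

This dict is the shape a Confluent-style client would accept at construction time; the connector collects the same information through the UI fields instead.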

Learn more