
Batch process all your records to store structured outputs in LanceDB.

You will need:

The LanceDB prerequisites:

  • A LanceDB open source software (OSS) installation on a local machine, a server, or a virtual machine. (LanceDB Cloud is not supported.)

  • For LanceDB OSS with local data storage:

    • The local path to the folder where the LanceDB data is (or will be) stored. See Connect to a database in the LanceDB documentation.
    • The name of the target LanceDB table within the local data folder.
  • For LanceDB OSS with data storage in an Amazon S3 bucket:

    • The URI for the target Amazon S3 bucket and any target folder path within that bucket. Use the format s3://<bucket-name>[/<folder-name>].
    • The name of the target LanceDB table within the Amazon S3 bucket.
    • The AWS access key ID and AWS secret access key for the AWS IAM entity that has access to the Amazon S3 bucket.

    For more information, see AWS S3 in the LanceDB documentation.

  • For LanceDB OSS with data storage in an Azure Blob Storage account:

    • The name of the target Azure Blob Storage account.
    • The URI for the target container within that Azure Blob Storage account and any target folder path within that container. Use the format az://<container-name>[/<folder-name>].
    • The name of the target LanceDB table within the Azure Blob Storage account.
    • The access key for the Azure Blob Storage account.

    For more information, see Azure Blob Storage in the LanceDB documentation.

  • For LanceDB OSS with data storage in a Google Cloud Storage bucket:

    • The URI for the target Google Cloud Storage bucket and any target folder path within that bucket. Use the format gs://<bucket-name>[/<folder-name>].
    • The name of the target LanceDB table within the Google Cloud Storage bucket.
    • A single-line string that contains the contents of the downloaded service account key file for the Google Cloud service account that has access to the Google Cloud Storage bucket.

    For more information, see Google Cloud Storage in the LanceDB documentation.
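Whichever backend you choose, the destination URI must match one of the formats above. The following is an illustrative helper (not part of LanceDB or unstructured-ingest) that checks a URI against the documented s3://, az://, and gs:// formats, treating anything else as a local path:

```python
import re

# Matches s3://<bucket>[/<folder>], az://<container>[/<folder>],
# and gs://<bucket>[/<folder>], per the formats described above.
URI_PATTERN = re.compile(r"^(s3|az|gs)://[^/]+(/.*)?$")

def is_valid_lancedb_uri(uri: str) -> bool:
    """Return True for well-formed object-store URIs or local paths."""
    if URI_PATTERN.match(uri):
        return True
    # Anything that starts with a known scheme but failed the pattern is malformed;
    # everything else is assumed to be a local filesystem path.
    return not uri.startswith(("s3:", "az:", "gs:"))

print(is_valid_lancedb_uri("s3://my-bucket/lancedb"))  # True
print(is_valid_lancedb_uri("az://my-container"))       # True
print(is_valid_lancedb_uri("gs://my-bucket/data"))     # True
print(is_valid_lancedb_uri("/tmp/lancedb-data"))       # True (local path)
print(is_valid_lancedb_uri("s3:/missing-slash"))       # False (malformed)
```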

The LanceDB connector dependencies:

CLI, Python
pip install "unstructured-ingest[lancedb]"

You might also need to install additional dependencies, depending on your needs. Learn more.

The following environment variables:

  • For LanceDB OSS with local data storage:

    • LANCEDB_URI - The local path to the folder where the LanceDB data is stored, represented by --uri (CLI) or uri (Python).
    • LANCEDB_TABLE - The name of the target LanceDB table within the local data folder, represented by --table-name (CLI) or table_name (Python).
  • For LanceDB OSS with data storage in an Amazon S3 bucket:

    • LANCEDB_URI - The URI for the target Amazon S3 bucket and any target folder path within that bucket. Use the format s3://<bucket-name>[/<folder-name>]. This is represented by --uri (CLI) or uri (Python).
    • LANCEDB_TABLE - The name of the target LanceDB table within the Amazon S3 bucket, represented by --table-name (CLI) or table_name (Python).
    • AWS_ACCESS_KEY_ID - The AWS access key ID for the AWS IAM entity that has access to the Amazon S3 bucket, represented by --aws-access-key-id (CLI) or aws_access_key_id (Python).
    • AWS_SECRET_ACCESS_KEY - The AWS secret access key for the AWS IAM entity that has access to the Amazon S3 bucket, represented by --aws-secret-access-key (CLI) or aws_secret_access_key (Python).
  • For LanceDB OSS with data storage in an Azure Blob Storage account:

    • LANCEDB_URI - The URI for the target container within the Azure Blob Storage account and any target folder path within that container. Use the format az://<container-name>[/<folder-name>]. This is represented by --uri (CLI) or uri (Python).
    • LANCEDB_TABLE - The name of the target LanceDB table within the Azure Blob Storage account, represented by --table-name (CLI) or table_name (Python).
    • AZURE_STORAGE_ACCOUNT_NAME - The name of the target Azure Blob Storage account, represented by --azure-storage-account-name (CLI) or azure_storage_account_name (Python).
    • AZURE_STORAGE_ACCOUNT_KEY - The access key for the Azure Blob Storage account, represented by --azure-storage-account-key (CLI) or azure_storage_account_key (Python).
  • For LanceDB OSS with data storage in a Google Cloud Storage bucket:

    • LANCEDB_URI - The URI for the target Google Cloud Storage bucket and any target folder path within that bucket. Use the format gs://<bucket-name>[/<folder-name>]. This is represented by --uri (CLI) or uri (Python).
    • LANCEDB_TABLE - The name of the target LanceDB table within the Google Cloud Storage bucket, represented by --table-name (CLI) or table_name (Python).
    • GCS_SERVICE_ACCOUNT_KEY - A single-line string that contains the contents of the downloaded service account key file for the Google Cloud service account that has access to the Google Cloud Storage bucket, represented by --google-service-account-key (CLI) or google_service_account_key (Python).
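As an illustration only (not the connector's API), the environment variables for the Amazon S3 case map onto the documented Python parameter names as follows; for the Azure or Google Cloud backends, swap in the corresponding variables and parameters listed above. The placeholder defaults are used when the variables are not set in your shell:

```python
import os

# Parameter names (uri, table_name, aws_access_key_id, aws_secret_access_key)
# are the ones documented above for the Python SDK.
lancedb_kwargs = {
    "uri": os.environ.get("LANCEDB_URI", "s3://my-bucket/lancedb"),
    "table_name": os.environ.get("LANCEDB_TABLE", "my-table"),
    "aws_access_key_id": os.environ.get("AWS_ACCESS_KEY_ID", "<your-access-key-id>"),
    "aws_secret_access_key": os.environ.get("AWS_SECRET_ACCESS_KEY", "<your-secret-access-key>"),
}

print(sorted(lancedb_kwargs))  # ['aws_access_key_id', 'aws_secret_access_key', 'table_name', 'uri']
```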

These environment variables:

  • UNSTRUCTURED_API_KEY - Your Unstructured API key value.
  • UNSTRUCTURED_API_URL - Your Unstructured API URL.

Now call the Unstructured CLI or Python SDK. You can use any supported source connector; this example uses the local source connector:
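A sketch of the CLI form follows. The LanceDB destination flags (--uri, --table-name, and the credential flags) come from the environment-variable descriptions above; the local-source flag, the partitioning flags, and the LOCAL_FILE_INPUT_DIR variable are assumptions to verify against your installed version of unstructured-ingest:

```shell
# Sketch only: local source -> LanceDB destination (S3-backed in this example).
# Flag names other than --uri, --table-name, and the AWS credential flags are
# assumptions; check `unstructured-ingest --help` for your installed version.
unstructured-ingest \
  local \
    --input-path "$LOCAL_FILE_INPUT_DIR" \
    --partition-by-api \
    --api-key "$UNSTRUCTURED_API_KEY" \
    --partition-endpoint "$UNSTRUCTURED_API_URL" \
  lancedb \
    --uri "$LANCEDB_URI" \
    --table-name "$LANCEDB_TABLE" \
    --aws-access-key-id "$AWS_ACCESS_KEY_ID" \
    --aws-secret-access-key "$AWS_SECRET_ACCESS_KEY"
```

For the Azure or Google Cloud backends, replace the AWS credential flags with the corresponding flags listed in the environment-variable descriptions above.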