Azure
Batch process all your records to store structured outputs in an Azure Storage account.
You will need:
The Azure Storage account prerequisites.
If you are generating an SAS token, be sure to set the following permissions (a sketch for generating such a token programmatically follows this list):
- Read and List for reading from the container only.
- Write and List for writing to the container only.
- Read, Write, and List for both reading from and writing to the container.
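If you prefer to create the SAS token programmatically rather than in the Azure portal, the following is a minimal sketch using the azure-storage-blob package (not part of the connector dependencies below; install it with `pip install azure-storage-blob`). The account name, container name, and account key are placeholder values, and the sketch grants Read, Write, and List for both reading and writing.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Placeholder values; substitute your own storage account, container, and key.
sas_token = generate_container_sas(
    account_name="mystorageaccount",
    container_name="my-container",
    account_key="<storage-account-key>",
    # Read, Write, and List: suitable for both reading from and writing to the container.
    permission=ContainerSasPermissions(read=True, write=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)
print(sas_token)
```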
Here are some more details about these prerequisites:
- An Azure account. To create one, learn how.
- An Azure Storage account, and a container within that account. Create a storage account. Create a container.
- The Azure Storage remote URL, using the format `az://<container-name>/<path/to/file/or/folder/in/container/as/needed>`. For example, if your container is named `my-container` and there is a folder in the container named `my-folder`, the Azure Storage remote URL would be `az://my-container/my-folder/`. (A sketch showing how this URL breaks down follows this list.)
- An SAS token (recommended), access key, or connection string for the Azure Storage account. Create an SAS token (recommended). Get an access key. Get a connection string.
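To make the remote URL format concrete, here is a small standard-library sketch that splits an example remote URL into its container name and path. The URL shown is the example from the list above, not a real container.

```python
from urllib.parse import urlparse

# Example value from above; substitute your own container and folder.
remote_url = "az://my-container/my-folder/"

parsed = urlparse(remote_url)
assert parsed.scheme == "az"                  # Azure Storage remote URLs use the az:// scheme
container_name = parsed.netloc                # "my-container"
path_in_container = parsed.path.lstrip("/")   # "my-folder/"
print(container_name, path_in_container)
```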
The Azure Storage account connector dependencies:
pip install "unstructured-ingest[azure]"
You might also need to install additional dependencies, depending on your needs. Learn more.
These environment variables (a short sketch that reads and validates them follows this list):
- `AZURE_STORAGE_REMOTE_URL` - The Azure Storage remote URL, represented by `--remote-url` (CLI) or `remote_url` (Python). The remote URL takes the format `az://<container-name>/<path/to/file/or/folder/in/container/as/needed>`. For example, if your container is named `my-container` and there is a folder in the container named `my-folder`, the Azure Storage remote URL would be `az://my-container/my-folder/`.
- `AZURE_STORAGE_ACCOUNT_NAME` - The name of the Azure Storage account, represented by `--account-name` (CLI) or `account_name` (Python).
- `AZURE_STORAGE_ACCOUNT_KEY`, `AZURE_STORAGE_CONNECTION_STRING`, or `AZURE_STORAGE_SAS_TOKEN` - The access key, connection string, or SAS token for the Azure Storage account, depending on your security configuration, represented by `--account-key` (CLI) or `account_key` (Python), `--connection-string` (CLI) or `connection_string` (Python), or `--sas-token` (CLI) or `sas_token` (Python), respectively.
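Before running an ingest job, it can help to confirm that these variables are actually set. The following standard-library sketch reads them and checks that exactly one credential is provided; treating "exactly one" as the rule is an assumption about how you have chosen to authenticate, not a requirement imposed by the connector.

```python
import os

# Required for the Azure Storage destination; names match the variables above.
remote_url = os.environ["AZURE_STORAGE_REMOTE_URL"]
account_name = os.environ["AZURE_STORAGE_ACCOUNT_NAME"]

# Pick up whichever credential is set, depending on your security configuration.
credentials = {
    name: os.getenv(name)
    for name in (
        "AZURE_STORAGE_SAS_TOKEN",
        "AZURE_STORAGE_ACCOUNT_KEY",
        "AZURE_STORAGE_CONNECTION_STRING",
    )
}
provided = [name for name, value in credentials.items() if value]
if len(provided) != 1:
    raise SystemExit(f"Expected exactly one Azure credential, found: {provided or 'none'}")
print(f"Using {provided[0]} for account {account_name}, writing to {remote_url}")
```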
Now call the Unstructured Ingest CLI or the Unstructured Ingest Python library. The source connector can be any of the supported source connectors; the sketch at the end of this section uses the local source connector.
By default, the example sends files to Unstructured API services for processing. To process files locally instead, see the following instructions.
For the Unstructured Ingest CLI and the Unstructured Ingest Python library, you can use the `--partition-by-api` option (CLI) or the `partition_by_api` parameter (Python) to specify where files are processed:
- To do local file processing, omit `--partition-by-api` (CLI) or `partition_by_api` (Python), or explicitly specify `partition_by_api=False` (Python).

  Local file processing does not use an Unstructured API key or API URL, so you can also omit the following, if they appear:

  - `--api-key $UNSTRUCTURED_API_KEY` (CLI) or `api_key=os.getenv("UNSTRUCTURED_API_KEY")` (Python)
  - `--partition-endpoint $UNSTRUCTURED_API_URL` (CLI) or `partition_endpoint=os.getenv("UNSTRUCTURED_API_URL")` (Python)
  - The environment variables `UNSTRUCTURED_API_KEY` and `UNSTRUCTURED_API_URL`
- To send files to Unstructured API services for processing, specify `--partition-by-api` (CLI) or `partition_by_api=True` (Python).

  Unstructured API services also require an Unstructured API key and API URL. Add the following:

  - `--api-key $UNSTRUCTURED_API_KEY` (CLI) or `api_key=os.getenv("UNSTRUCTURED_API_KEY")` (Python)
  - `--partition-endpoint $UNSTRUCTURED_API_URL` (CLI) or `partition_endpoint=os.getenv("UNSTRUCTURED_API_URL")` (Python)
  - The environment variables `UNSTRUCTURED_API_KEY` and `UNSTRUCTURED_API_URL`, representing your API key and API URL, respectively.
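As a concrete starting point, here is a minimal Python sketch of an ingest pipeline that reads files from a local directory (the local source connector) and writes the processed output to Azure Storage, authenticating with an SAS token. It assumes the v2 `Pipeline.from_configs` pattern from the unstructured-ingest library; the module paths, class names, and the `LOCAL_FILE_INPUT_DIR` variable are assumptions that can differ between library versions, so treat it as a sketch rather than a drop-in script.

```python
import os

from unstructured_ingest.v2.pipeline.pipeline import Pipeline
from unstructured_ingest.v2.interfaces import ProcessorConfig
from unstructured_ingest.v2.processes.connectors.local import (
    LocalConnectionConfig,
    LocalDownloaderConfig,
    LocalIndexerConfig,
)
from unstructured_ingest.v2.processes.connectors.fsspec.azure import (
    AzureAccessConfig,
    AzureConnectionConfig,
    AzureUploaderConfig,
)
from unstructured_ingest.v2.processes.partitioner import PartitionerConfig

if __name__ == "__main__":
    Pipeline.from_configs(
        context=ProcessorConfig(),
        # Local source connector: read files from a directory on this machine.
        # LOCAL_FILE_INPUT_DIR is an assumed variable naming the directory to ingest.
        indexer_config=LocalIndexerConfig(input_path=os.getenv("LOCAL_FILE_INPUT_DIR")),
        downloader_config=LocalDownloaderConfig(),
        source_connection_config=LocalConnectionConfig(),
        # Send files to Unstructured API services; omit partition_by_api (or set it
        # to False) and the api_key/partition_endpoint arguments to process locally.
        partitioner_config=PartitionerConfig(
            partition_by_api=True,
            api_key=os.getenv("UNSTRUCTURED_API_KEY"),
            partition_endpoint=os.getenv("UNSTRUCTURED_API_URL"),
        ),
        # Azure Storage destination connector, authenticating with an SAS token.
        destination_connection_config=AzureConnectionConfig(
            access_config=AzureAccessConfig(
                account_name=os.getenv("AZURE_STORAGE_ACCOUNT_NAME"),
                sas_token=os.getenv("AZURE_STORAGE_SAS_TOKEN"),
            ),
        ),
        uploader_config=AzureUploaderConfig(remote_url=os.getenv("AZURE_STORAGE_REMOTE_URL")),
    ).run()
```

To authenticate with an access key or connection string instead, replace the `sas_token` argument with the credential field that matches the environment variable you set above.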