Delta Table
Batch process all your records to store structured outputs in a Delta Table in an Amazon S3 bucket.
You will need:
The Delta Table prerequisites for Amazon S3:
The following video shows how to fulfill the minimum set of S3 prerequisites:
The preceding video does not show how to create an AWS account or an S3 bucket.
For more information about prerequisites, see the following:
- An AWS account. Create an AWS account.
- An S3 bucket. Create an S3 bucket. Additional approaches are in the following video and in the how-to sections at the end of this page.
- For authenticated bucket read access, the authenticated AWS IAM user must have at minimum the `s3:ListBucket` and `s3:GetObject` permissions for that bucket. Learn how.
- For bucket write access, authenticated access to the bucket must be enabled (anonymous access must not be enabled), and the authenticated AWS IAM user must have at minimum the `s3:PutObject` permission for that bucket. Learn how.
- For authenticated access, an AWS access key and secret access key for the authenticated AWS IAM user in the account. Create an AWS access key and secret access key.
- If the target files are in the root of the bucket, the path to the bucket, formatted as `protocol://bucket/` (for example, `s3://my-bucket/`). If the target files are in a folder, the path to the target folder in the S3 bucket, formatted as `protocol://bucket/path/to/folder/` (for example, `s3://my-bucket/my-folder/`).
- If the target files are in a folder, make sure the authenticated AWS IAM user also has authenticated access to the folder. Enable authenticated folder access.
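As a sketch of the minimum permissions listed above, an IAM identity-based policy attached to the user might look like the following. The `Sid` and the bucket name `my-bucket` are illustrative assumptions; substitute your own bucket name.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "MinimumDeltaTableS3Access",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-bucket",
        "arn:aws:s3:::my-bucket/*"
      ]
    }
  ]
}
```

Note that `s3:ListBucket` applies to the bucket ARN itself, while `s3:GetObject` and `s3:PutObject` apply to the objects inside it (`/*`), which is why both `Resource` entries are needed.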
Add an access policy to an existing bucket
To use the Amazon S3 console to add an access policy that allows all authenticated AWS IAM users in the corresponding AWS account to read and write to an existing S3 bucket, do the following.
- Sign in to the AWS Management Console.
- Open the Amazon S3 Console.
- Browse to the existing bucket and open it.
- Click the Permissions tab.
- In the Bucket policy area, click Edit.
- In the Policy text area, copy the following JSON-formatted policy. To change the following policy to restrict it to a specific user in the AWS account, change `root` to that specific username. In this policy, replace the following:
  - Replace `<my-account-id>` with your AWS account ID.
  - Replace `<my-bucket-name>` in two places with the name of your bucket.
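A minimal bucket policy matching this description might look like the following. The `Action` list is an assumption based on the read and write permissions named in the prerequisites; `<my-account-id>` and `<my-bucket-name>` are the placeholders described above.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<my-account-id>:root"
      },
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::<my-bucket-name>",
        "arn:aws:s3:::<my-bucket-name>/*"
      ]
    }
  ]
}
```

Changing `root` in the `Principal` ARN to `user/<username>` restricts the policy to that single IAM user instead of all authenticated users in the account.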
- Click Save changes.
Create a bucket with AWS CloudFormation
To use the AWS CloudFormation console to create an Amazon S3 bucket that allows all authenticated AWS IAM users in the corresponding AWS account to read and write to the bucket, do the following.
- Save the following YAML to a file on your local machine, for example `create-s3-bucket.yaml`. To change the following bucket policy to restrict it to a specific user in the AWS account, change `root` to that specific username.
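A template consistent with the steps below (a `BucketName` parameter entered at stack creation, plus a bucket policy granting authenticated account users read and write) might look like this sketch. The resource names and `Description` text are illustrative assumptions.

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: S3 bucket readable and writable by authenticated IAM users in this account.
Parameters:
  BucketName:
    Type: String
    Description: Globally unique name for the new S3 bucket.
Resources:
  MyBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Ref BucketName
  MyBucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket: !Ref MyBucket
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            # Change "root" to user/<username> to restrict to one user.
            Principal:
              AWS: !Sub "arn:aws:iam::${AWS::AccountId}:root"
            Action:
              - s3:ListBucket
              - s3:GetObject
              - s3:PutObject
            Resource:
              - !Sub "arn:aws:s3:::${BucketName}"
              - !Sub "arn:aws:s3:::${BucketName}/*"
```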
- Sign in to the AWS Management Console.
- Open the AWS CloudFormation Console.
- Click Create stack > With new resources (standard).
- On the Create stack page, with Choose an existing template already selected, select Upload a template file.
- Click Choose file, and browse to and select the YAML file from your local machine.
- Click Next.
- Enter a unique Stack name and BucketName.
- Click Next two times.
- Click Submit.
- Wait until the Status changes to CREATE_COMPLETE.
- After the bucket is created, you can delete the YAML file if you want.
Create a bucket with the AWS CLI
To use the AWS CLI to create an Amazon S3 bucket that allows all authenticated AWS IAM users in the corresponding AWS account to read and write to the bucket, do the following.
- Copy the following script to a file on your local machine, for example a file named `create-s3-bucket.sh`. To change the following bucket policy to restrict it to a specific user in the AWS account, change `root` to that specific username. In this script, replace the following:
  - Replace `<my-account-id>` with your AWS account ID.
  - Replace `<my-unique-bucket-name>` with the name of your bucket.
  - Replace `<us-east-1>` with your AWS Region.
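A script consistent with this description might look like the following sketch. It uses the standard `aws s3api create-bucket` and `aws s3api put-bucket-policy` commands; the variable names are illustrative, and the placeholders match the replacements listed above.

```shell
#!/bin/bash
ACCOUNT_ID=<my-account-id>
BUCKET_NAME=<my-unique-bucket-name>
REGION=<us-east-1>

# Create the bucket in the target Region.
# Note: for us-east-1 specifically, omit the
# --create-bucket-configuration option.
aws s3api create-bucket \
  --bucket "$BUCKET_NAME" \
  --region "$REGION" \
  --create-bucket-configuration LocationConstraint="$REGION"

# Attach a bucket policy that lets authenticated IAM users in the
# account read and write. Change "root" to user/<username> to
# restrict the policy to a specific user.
aws s3api put-bucket-policy \
  --bucket "$BUCKET_NAME" \
  --policy "{
    \"Version\": \"2012-10-17\",
    \"Statement\": [{
      \"Effect\": \"Allow\",
      \"Principal\": {\"AWS\": \"arn:aws:iam::$ACCOUNT_ID:root\"},
      \"Action\": [\"s3:ListBucket\", \"s3:GetObject\", \"s3:PutObject\"],
      \"Resource\": [
        \"arn:aws:s3:::$BUCKET_NAME\",
        \"arn:aws:s3:::$BUCKET_NAME/*\"
      ]
    }]
  }"
```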
- Run the script, for example:
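For instance, assuming the script was saved as `create-s3-bucket.sh` in the current directory:

```shell
chmod +x create-s3-bucket.sh
./create-s3-bucket.sh
```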
- After the bucket is created, you can delete the script file if you want.
The Delta Table connector dependencies for Amazon S3:
You might also need to install additional dependencies, depending on your needs. Learn more.
The following environment variables:

- `AWS_S3_URL` - The path to the S3 bucket or folder, formatted as `s3://my-bucket/` (if the files are in the bucket's root) or `s3://my-bucket/my-folder/`, represented by `--table-uri` (CLI) or `table_uri` (Python).
- `AWS_ACCESS_KEY_ID` - The AWS access key ID for the authenticated AWS IAM user, represented by `--aws-access-key-id` (CLI) or `aws_access_key` (Python).
- `AWS_SECRET_ACCESS_KEY` - The corresponding AWS secret access key, represented by `--aws-secret-access-key` (CLI) or `aws_secret_access_key` (Python).
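For example, these variables can be set in a shell before running the CLI. The values shown are illustrative placeholders; substitute your own path and credentials.

```shell
export AWS_S3_URL="s3://my-bucket/my-folder/"
export AWS_ACCESS_KEY_ID="<your-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret-access-key>"
```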
Now call the Unstructured Ingest CLI or the Unstructured Ingest Python library. The source connector can be any of the ones supported. This example uses the local source connector.
This example sends files to Unstructured API services for processing by default. To process files locally instead, see the instructions at the end of this page.
For the Unstructured Ingest CLI and the Unstructured Ingest Python library, you can use the `--partition-by-api` option (CLI) or the `partition_by_api` parameter (Python) to specify where files are processed:
- To do local file processing, omit `--partition-by-api` (CLI) or `partition_by_api` (Python), or explicitly specify `partition_by_api=False` (Python).

  Local file processing does not use an Unstructured API key or API URL, so you can also omit the following, if they appear:

  - `--api-key $UNSTRUCTURED_API_KEY` (CLI) or `api_key=os.getenv("UNSTRUCTURED_API_KEY")` (Python)
  - `--partition-endpoint $UNSTRUCTURED_API_URL` (CLI) or `partition_endpoint=os.getenv("UNSTRUCTURED_API_URL")` (Python)
  - The environment variables `UNSTRUCTURED_API_KEY` and `UNSTRUCTURED_API_URL`
- To send files to Unstructured API services for processing, specify `--partition-by-api` (CLI) or `partition_by_api=True` (Python).

  Unstructured API services also require an Unstructured API key and API URL, which you provide by adding the following:

  - `--api-key $UNSTRUCTURED_API_KEY` (CLI) or `api_key=os.getenv("UNSTRUCTURED_API_KEY")` (Python)
  - `--partition-endpoint $UNSTRUCTURED_API_URL` (CLI) or `partition_endpoint=os.getenv("UNSTRUCTURED_API_URL")` (Python)
  - The environment variables `UNSTRUCTURED_API_KEY` and `UNSTRUCTURED_API_URL`, representing your API key and API URL, respectively.
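Putting the options together, a local-source ingest run that sends files to Unstructured API services and writes to the Delta Table destination might look like the following sketch. The `$LOCAL_FILE_INPUT_DIR` variable is an illustrative assumption, and the subcommand names follow the connector names used on this page; check your installed CLI version for the exact invocation.

```shell
unstructured-ingest \
  local \
    --input-path "$LOCAL_FILE_INPUT_DIR" \
    --partition-by-api \
    --api-key "$UNSTRUCTURED_API_KEY" \
    --partition-endpoint "$UNSTRUCTURED_API_URL" \
  delta-table \
    --table-uri "$AWS_S3_URL" \
    --aws-access-key-id "$AWS_ACCESS_KEY_ID" \
    --aws-secret-access-key "$AWS_SECRET_ACCESS_KEY"
```

To process files locally instead, drop `--partition-by-api`, `--api-key`, and `--partition-endpoint` from the command.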