- Images for Slack unfurl and scheduled deliveries
- Results in JSONL/CSV/Excel format
Configure cloud storage on Google Cloud Platform
Go to GCP → Cloud Storage and create a new bucket with the following details:
- Give it a unique bucket name, like `lightdash-cloud-file-storage-eu`
- Select a location (multi-region US, multi-region EU, or a single region like `europe-west1`)
- Select the storage class: default Standard
- Disable enforce public access and select fine-grained access control
- Protection: none
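If you prefer the command line over the console, the same bucket can be created with `gcloud` — a sketch, assuming the gcloud CLI is installed and authenticated; the bucket name and location are examples:

```shell
# Sketch: create the bucket from the CLI instead of the console.
# Bucket name and location are examples - adjust to your setup.
gcloud storage buckets create gs://lightdash-cloud-file-storage-eu \
  --location=EU \
  --default-storage-class=STANDARD
```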
Access key for service account
- Create an HMAC key for a service account (Cloud Storage → Settings → Interoperability) and copy the access key and secret.
- `S3_ENDPOINT` for Google is `https://storage.googleapis.com`
- `S3_REGION` for Google is `auto`
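Putting the Google-specific values together, the storage environment might look roughly like the sketch below. The access key and secret are placeholders, and `S3_BUCKET` is an assumed variable name — verify it against your Lightdash version:

```shell
# Example environment for GCS via the S3-compatible API (placeholder values)
export S3_ENDPOINT="https://storage.googleapis.com"
export S3_REGION="auto"
export S3_BUCKET="lightdash-cloud-file-storage-eu"  # assumed variable name; your bucket
export S3_ACCESS_KEY="GOOG1E..."                    # HMAC access key (placeholder)
export S3_SECRET_KEY="..."                          # HMAC secret (placeholder)
```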
Configure cloud storage on AWS
- Navigate to the S3 section of the AWS Management Console and click on the Create Bucket button.
- Give your bucket a name and select the region where you want to store your data.
- Next, you need to set the permissions for your bucket. Make it private.
- Navigate to the IAM section and click on the Users tab.
- Click on the user whose credentials you want to export.
- Click on the Security Credentials tab and locate the Access Keys section.
- Click on the Create Access Key button.
- Download the CSV file that contains your Access Key ID and Secret Access Key.
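The access-key steps can also be done with the AWS CLI — a sketch, where `your-user` is a placeholder IAM user name:

```shell
# Sketch: create an access key for an IAM user from the CLI.
# "your-user" is a placeholder - use the user whose credentials you want.
aws iam create-access-key --user-name your-user
# The JSON response contains AccessKeyId and SecretAccessKey;
# store them securely - the secret is shown only once.
```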
Set `S3_ENDPOINT` to the endpoint for your bucket's region, e.g. `https://s3.eu-west-1.amazonaws.com`.
Configure cloud storage using MinIO
Creating a bucket in MinIO
- Log in to the MinIO console and click on “Buckets” in the sidebar
- Click on “Create Bucket”
- Give your bucket a name and click “Create Bucket”
- Click on “Access Keys” in the sidebar
- Click “Create access key”
- Give your new access key a name and click “Create”
- Download the JSON file containing both your Access Key ID and Secret Access Key

Make sure to set `S3_FORCE_PATH_STYLE=true` in your environment variables.
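A MinIO setup might therefore look like the sketch below. The endpoint, region, and credentials are placeholders; note the path-style flag, which MinIO requires:

```shell
# Example environment for MinIO (placeholder values)
export S3_ENDPOINT="http://minio:9000"  # your MinIO server address (placeholder)
export S3_REGION="us-east-1"            # assumed region value - adjust to your MinIO config
export S3_ACCESS_KEY="..."              # from the downloaded JSON file
export S3_SECRET_KEY="..."              # from the downloaded JSON file
export S3_FORCE_PATH_STYLE="true"       # required: MinIO uses path-style URLs
```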
Azure Storage
Azure Blob Storage is not natively compatible with the S3 API. While Lightdash supports external object storage through integration with S3-compatible APIs, Azure’s storage service does not provide this compatibility out of the box, so you cannot use Azure Blob Storage as a drop-in replacement for S3 in Lightdash deployments. Instead, you can use one of the following S3-compatible solutions within your Azure setup:
- MinIO: S3-compatible object storage.
- s3proxy: a lightweight proxy that adds an S3-compatible API layer on top of Azure Blob Storage.
Configure Lightdash to use S3 credentials
To enable Lightdash to use your S3 bucket for cloud storage, you’ll need to set the following environment variables:

Using IAM roles
Lightdash also supports authentication via IAM roles. If you omit the `S3_ACCESS_KEY` and `S3_SECRET_KEY` variables, the S3 library will automatically attempt to use IAM roles. For more details on how this works, refer to the AWS SDK for JavaScript documentation on setting credentials in Node.js.
If you are using an IAM role to generate signed URLs, be aware that these URLs have a maximum validity of 7 days due to AWS limitations, regardless of the `S3_EXPIRATION_TIME` configuration.
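Putting this section together, a representative configuration might look like the sketch below. All values are placeholders; `S3_BUCKET` and the units of `S3_EXPIRATION_TIME` are assumptions to verify against your Lightdash version:

```shell
# Representative Lightdash S3 configuration (placeholder values)
export S3_ENDPOINT="https://s3.eu-west-1.amazonaws.com"  # provider endpoint (see sections above)
export S3_REGION="eu-west-1"
export S3_BUCKET="my-lightdash-bucket"  # assumed variable name (placeholder bucket)
# With static credentials:
export S3_ACCESS_KEY="AKIA..."          # placeholder
export S3_SECRET_KEY="..."              # placeholder
# Or omit S3_ACCESS_KEY/S3_SECRET_KEY entirely to fall back to IAM roles.
export S3_EXPIRATION_TIME="259200"      # signed URL validity (assumed seconds; 7-day cap with IAM roles)
```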