This is a reference to all environment variables that can be used to configure a Lightdash deployment.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| PGHOST | Hostname of the Postgres server that stores Lightdash data | Yes | |
| PGPORT | Port of the Postgres server that stores Lightdash data | Yes | |
| PGUSER | Username of the Postgres user used to access the Postgres server | Yes | |
| PGPASSWORD | Password for PGUSER | Yes | |
| PGDATABASE | Name of the database inside the Postgres server that stores Lightdash data | Yes | |
| PGCONNECTIONURI | Connection URI for the Postgres server in the format postgresql://user:password@host:port/db?params. This is an alternative to providing the individual PG variables above. | | |
| PGMAXCONNECTIONS | Maximum number of connections to the database | | |
| PGMINCONNECTIONS | Minimum number of connections to the database | | |
| LIGHTDASH_SECRET | Secret key used to secure various tokens in Lightdash. This must stay fixed between deployments; if the secret changes, you lose access to Lightdash data. | Yes | |
| SECURE_COOKIES | Only allow cookies to be stored over an HTTPS connection. Cookies are used to keep you logged in. Recommended to be set to true in production. | | false |
| COOKIES_MAX_AGE_HOURS | Number of hours a user session lasts before the user is automatically signed out. For example, if set to 24, the user is signed out after 24 hours of inactivity. | | |
| TRUST_PROXY | Tells the Lightdash server that it can trust the X-Forwarded-Proto header it receives in requests. Useful when running with SECURE_COOKIES=true behind a trusted HTTPS-terminating proxy. | | false |
| SITE_URL | Site URL where Lightdash is hosted, including the protocol, e.g. https://lightdash.mycompany.com | | http://localhost:8080 |
| INTERNAL_LIGHTDASH_HOST | Internal Lightdash host that the headless browser sends requests to when your Lightdash instance is not accessible from the internet. Must support HTTPS if SECURE_COOKIES=true. | | Same as SITE_URL |
| STATIC_IP | Static IP of the server, so users can add it to their warehouse allow-list | | http://localhost:8080 |
| LIGHTDASH_QUERY_MAX_LIMIT | Maximum number of rows a query can return | | 5000 |
| LIGHTDASH_QUERY_DEFAULT_LIMIT | Default number of rows to return in a query | | 500 |
| LIGHTDASH_QUERY_MAX_PAGE_SIZE | Maximum page size for paginated queries | | 2500 |
| SCHEDULER_ENABLED | Enables/disables the scheduler worker that triggers scheduled deliveries | | true |
| SCHEDULER_CONCURRENCY | Number of scheduled delivery jobs that can be processed concurrently | | 3 |
| SCHEDULER_JOB_TIMEOUT | Milliseconds after which a job times out so the scheduler worker can pick up other jobs | | 600000 (10 minutes) |
| SCHEDULER_SCREENSHOT_TIMEOUT | Timeout in milliseconds for taking screenshots | | |
| SCHEDULER_INCLUDE_TASKS | Comma-separated list of scheduler tasks to include | | |
| SCHEDULER_EXCLUDE_TASKS | Comma-separated list of scheduler tasks to exclude | | |
| LIGHTDASH_CSV_CELLS_LIMIT | Maximum number of cells in CSV file exports | | 100000 |
| LIGHTDASH_CHART_VERSION_HISTORY_DAYS_LIMIT | How far back chart version history goes, in days | | 3 |
| LIGHTDASH_PIVOT_TABLE_MAX_COLUMN_LIMIT | Maximum number of columns in a pivot table | | 60 |
| GROUPS_ENABLED | Enables/disables groups functionality | | false |
| AUTH_ENABLE_OIDC_LINKING | Enables/disables linking a new OIDC (aka SSO) identity to an existing user who already has another OIDC identity with the same email | | false |
| AUTH_ENABLE_OIDC_TO_EMAIL_LINKING | Enables/disables linking an OIDC identity to an existing user by email | | false |
| CUSTOM_VISUALIZATIONS_ENABLED | Enables/disables custom chart functionality | | false |
| LIGHTDASH_MAX_PAYLOAD | Maximum HTTP request body size | | 5mb |
| LIGHTDASH_LICENSE_KEY | License key for Lightdash Enterprise Edition. Talk to us about Lightdash Enterprise Edition. | | |
| HEADLESS_BROWSER_HOST | Hostname for the headless browser | | |
| HEADLESS_BROWSER_PORT | Port for the headless browser | | 3001 |
| ALLOW_MULTIPLE_ORGS | If set to true, new users registering on Lightdash get their own organization, separate from others | | false |
| LIGHTDASH_MODE | Mode for Lightdash (default, demo, pr, etc.) | | default |
| DISABLE_PAT | Disables personal access tokens | | false |
| PAT_ALLOWED_ORG_ROLES | Comma-separated list of organization roles allowed to use personal access tokens | | All roles |
| PAT_MAX_EXPIRATION_TIME_IN_DAYS | Maximum expiration time in days for personal access tokens | | |
| MAX_DOWNLOADS_AS_CODE | Maximum number of downloads as code | | 100 |
| EXTENDED_USAGE_ANALYTICS | Enables extended usage analytics | | false |
| USE_SECURE_BROWSER | Use secure WebSocket connections for the headless browser | | false |

Lightdash also accepts all standard Postgres environment variables.
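For example, a minimal core configuration could look like the following. All hostnames, credentials, and URLs below are placeholders:

```shell
# Minimal core Lightdash configuration -- every value here is a placeholder
export PGHOST=postgres.internal
export PGPORT=5432
export PGUSER=lightdash
export PGPASSWORD='change-me'
export PGDATABASE=lightdash
# Keep this secret stable across deployments, or existing Lightdash data becomes unreadable
export LIGHTDASH_SECRET='a-long-random-string'
export SITE_URL=https://lightdash.mycompany.com
export SECURE_COOKIES=true   # recommended in production (requires HTTPS)
export TRUST_PROXY=true      # only if running behind a trusted HTTPS-terminating proxy
```

Alternatively, a single PGCONNECTIONURI can replace the five PG* variables.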

SMTP

This is a reference to all the SMTP environment variables that can be used to configure a Lightdash email client.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| EMAIL_SMTP_HOST | Hostname of the email server | | |
| EMAIL_SMTP_PORT | Port of the email server | | 587 |
| EMAIL_SMTP_SECURE | Use a secure connection | | true |
| EMAIL_SMTP_USER | Auth user | | |
| EMAIL_SMTP_PASSWORD | Auth password | [1] | |
| EMAIL_SMTP_ACCESS_TOKEN | Auth access token for OAuth2 authentication | [1] | |
| EMAIL_SMTP_ALLOW_INVALID_CERT | Allow connections to TLS servers with self-signed or invalid TLS certificates | | false |
| EMAIL_SMTP_SENDER_EMAIL | The email address that sends emails | | |
| EMAIL_SMTP_SENDER_NAME | The name of the email address that sends emails | | Lightdash |

[1] Either EMAIL_SMTP_PASSWORD or EMAIL_SMTP_ACCESS_TOKEN must be provided.
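As a sketch, a password-based SMTP setup might look like this. The hostname and credentials are placeholders, and the EMAIL_SMTP_SECURE=false value assumes a server that upgrades the connection via STARTTLS on port 587 (a common but not universal arrangement):

```shell
# SMTP with password auth -- hostname and credentials are placeholders
export EMAIL_SMTP_HOST=smtp.example.com
export EMAIL_SMTP_PORT=587
export EMAIL_SMTP_SECURE=false   # assumes STARTTLS upgrade on port 587
export EMAIL_SMTP_USER=notifications@example.com
export EMAIL_SMTP_PASSWORD='change-me'   # or EMAIL_SMTP_ACCESS_TOKEN for OAuth2
export EMAIL_SMTP_SENDER_EMAIL=notifications@example.com
export EMAIL_SMTP_SENDER_NAME='Lightdash'
```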

SSO

These variables enable you to control Single Sign On (SSO) functionality.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| AUTH_DISABLE_PASSWORD_AUTHENTICATION | If "true", disables signing in with plain passwords | | false |
| AUTH_ENABLE_GROUP_SYNC | If "true", enables assigning SSO groups to Lightdash groups | | false |
| AUTH_GOOGLE_OAUTH2_CLIENT_ID | Required for Google SSO | | |
| AUTH_GOOGLE_OAUTH2_CLIENT_SECRET | Required for Google SSO | | |
| AUTH_OKTA_OAUTH_CLIENT_ID | Required for Okta SSO | | |
| AUTH_OKTA_OAUTH_CLIENT_SECRET | Required for Okta SSO | | |
| AUTH_OKTA_OAUTH_ISSUER | Required for Okta SSO | | |
| AUTH_OKTA_DOMAIN | Required for Okta SSO | | |
| AUTH_OKTA_AUTHORIZATION_SERVER_ID | Optional for Okta SSO with a custom authorization server | | |
| AUTH_OKTA_EXTRA_SCOPES | Optional for Okta SSO scopes (e.g. groups) without a custom authorization server | | |
| AUTH_ONE_LOGIN_OAUTH_CLIENT_ID | Required for OneLogin SSO | | |
| AUTH_ONE_LOGIN_OAUTH_CLIENT_SECRET | Required for OneLogin SSO | | |
| AUTH_ONE_LOGIN_OAUTH_ISSUER | Required for OneLogin SSO | | |
| AUTH_AZURE_AD_OAUTH_CLIENT_ID | Required for Azure AD | | |
| AUTH_AZURE_AD_OAUTH_CLIENT_SECRET | Required for Azure AD | | |
| AUTH_AZURE_AD_OAUTH_TENANT_ID | Required for Azure AD | | |
| AUTH_AZURE_AD_OIDC_METADATA_ENDPOINT | Optional for Azure AD | | |
| AUTH_AZURE_AD_X509_CERT_PATH | Optional for Azure AD | | |
| AUTH_AZURE_AD_X509_CERT | Optional for Azure AD | | |
| AUTH_AZURE_AD_PRIVATE_KEY_PATH | Optional for Azure AD | | |
| AUTH_AZURE_AD_PRIVATE_KEY | Optional for Azure AD | | |
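For instance, enabling Google SSO takes a client ID and secret from an OAuth app you create yourself; the values below are placeholders:

```shell
# Google SSO -- client credentials are placeholders from your own OAuth app
export AUTH_GOOGLE_ENABLED=true
export AUTH_GOOGLE_OAUTH2_CLIENT_ID='0000000000.apps.googleusercontent.com'
export AUTH_GOOGLE_OAUTH2_CLIENT_SECRET='change-me'
# Optionally, disable password sign-in once SSO is confirmed working
export AUTH_DISABLE_PASSWORD_AUTHENTICATION=true
```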

S3

These variables allow you to configure S3 Object Storage.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| S3_ENDPOINT | S3 endpoint for storing results | | |
| S3_BUCKET | Name of the S3 bucket for storing files | | |
| S3_REGION | Region where the S3 bucket is located | | |
| S3_ACCESS_KEY | Access key for authenticating with the S3 bucket | | |
| S3_SECRET_KEY | Secret key for authenticating with the S3 bucket | | |
| S3_EXPIRATION_TIME | Expiration time for scheduled delivery files | | 259200 (3 days) |
| S3_FORCE_PATH_STYLE | Force path-style addressing, needed for MinIO setups, e.g. http://your.s3.domain/BUCKET/KEY instead of http://BUCKET.your.s3.domain/KEY | | false |
| RESULTS_S3_BUCKET | Name of the S3 bucket used for storing query results | | S3_BUCKET |
| RESULTS_S3_REGION | Region where the S3 query storage bucket is located | | S3_REGION |
| RESULTS_S3_ACCESS_KEY | Access key for authenticating with the S3 query storage bucket | | S3_ACCESS_KEY |
| RESULTS_S3_SECRET_KEY | Secret key for authenticating with the S3 query storage bucket | | S3_SECRET_KEY |
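A sketch of an S3-compatible setup using a self-hosted MinIO server (endpoint, bucket, and keys are placeholders):

```shell
# Self-hosted MinIO as the S3-compatible store -- all values are placeholders
export S3_ENDPOINT=http://minio.internal:9000
export S3_BUCKET=lightdash-files
export S3_REGION=us-east-1
export S3_ACCESS_KEY='minio-access-key'
export S3_SECRET_KEY='minio-secret-key'
export S3_FORCE_PATH_STYLE=true   # MinIO needs path-style addressing
```

The RESULTS_S3_* variables only need to be set if query results should live in a different bucket from other files.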

Cache

Note that you will need an Enterprise License Key for this functionality.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| RESULTS_CACHE_ENABLED | Enables caching for chart results | | false |
| AUTOCOMPLETE_CACHE_ENABLED | Enables caching for filter autocomplete results | | false |
| CACHE_STALE_TIME_SECONDS | Defines how long cached results remain valid before being considered stale | | 86400 (24h) |

These variables are deprecated; use the RESULTS_S3_* versions instead.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| RESULTS_CACHE_S3_BUCKET | Deprecated - use RESULTS_S3_BUCKET | | S3_BUCKET |
| RESULTS_CACHE_S3_REGION | Deprecated - use RESULTS_S3_REGION | | S3_REGION |
| RESULTS_CACHE_S3_ACCESS_KEY | Deprecated - use RESULTS_S3_ACCESS_KEY | | S3_ACCESS_KEY |
| RESULTS_CACHE_S3_SECRET_KEY | Deprecated - use RESULTS_S3_SECRET_KEY | | S3_SECRET_KEY |

Logging

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| LIGHTDASH_LOG_LEVEL | The minimum level of log messages to display | | INFO |
| LIGHTDASH_LOG_FORMAT | The format of log messages | | pretty |
| LIGHTDASH_LOG_OUTPUTS | The outputs to send log messages to | | console |
| LIGHTDASH_LOG_CONSOLE_LEVEL | The minimum level of log messages to display on the console | | LIGHTDASH_LOG_LEVEL |
| LIGHTDASH_LOG_CONSOLE_FORMAT | The format of log messages on the console | | LIGHTDASH_LOG_FORMAT |
| LIGHTDASH_LOG_FILE_LEVEL | The minimum level of log messages to write to the log file | | LIGHTDASH_LOG_LEVEL |
| LIGHTDASH_LOG_FILE_FORMAT | The format of log messages in the log file | | LIGHTDASH_LOG_FORMAT |
| LIGHTDASH_LOG_FILE_PATH | The path to the log file | | ./logs/all.log |
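As a sketch, the per-output overrides let the console and log file use different levels and formats. This assumes the outputs list is comma-separated and that a machine-readable format value besides "pretty" exists (neither is confirmed by the table above):

```shell
# Verbose pretty logs on the console, warnings and above in a file
export LIGHTDASH_LOG_LEVEL=debug
export LIGHTDASH_LOG_OUTPUTS='console,file'     # assumed comma-separated
export LIGHTDASH_LOG_CONSOLE_FORMAT=pretty
export LIGHTDASH_LOG_FILE_LEVEL=warn
export LIGHTDASH_LOG_FILE_PATH=/var/log/lightdash/all.log
```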

Prometheus

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| LIGHTDASH_PROMETHEUS_ENABLED | Enables/disables the Prometheus metrics endpoint | | false |
| LIGHTDASH_PROMETHEUS_PORT | Port for the Prometheus metrics endpoint | | 9090 |
| LIGHTDASH_PROMETHEUS_PATH | Path for the Prometheus metrics endpoint | | /metrics |
| LIGHTDASH_PROMETHEUS_PREFIX | Prefix for metric names | | |
| LIGHTDASH_GC_DURATION_BUCKETS | Buckets for the duration histogram, in seconds | | 0.001, 0.01, 0.1, 1, 2, 5 |
| LIGHTDASH_EVENT_LOOP_MONITORING_PRECISION | Precision for event loop monitoring in milliseconds. Must be greater than zero. | | 10 |
| LIGHTDASH_PROMETHEUS_LABELS | Labels to add to all metrics. Must be valid JSON. | | |
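A minimal sketch of enabling the metrics endpoint, with an illustrative JSON label set:

```shell
# Expose Prometheus metrics on port 9090 at /metrics
export LIGHTDASH_PROMETHEUS_ENABLED=true
export LIGHTDASH_PROMETHEUS_PORT=9090
export LIGHTDASH_PROMETHEUS_PATH=/metrics
# Labels must be valid JSON; these key/value pairs are placeholders
export LIGHTDASH_PROMETHEUS_LABELS='{"service":"lightdash","env":"prod"}'
# Once the server is running, scrape with: curl http://localhost:9090/metrics
```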

Security

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| LIGHTDASH_CSP_REPORT_ONLY | Enables Content Security Policy (CSP) report-only mode. Recommended to be set to false in production. | | true |
| LIGHTDASH_CSP_ALLOWED_DOMAINS | Comma-separated list of domains that resources are allowed to be loaded from | | |
| LIGHTDASH_CSP_REPORT_URI | URI to send CSP violation reports to | | |
| LIGHTDASH_CORS_ENABLED | Enables Cross-Origin Resource Sharing (CORS) | | false |
| LIGHTDASH_CORS_ALLOWED_DOMAINS | Comma-separated list of domains that are allowed to make cross-origin requests | | |
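For example, a production hardening sketch that enforces CSP and allows a single cross-origin domain (the domains are placeholders):

```shell
# Enforce CSP instead of only reporting violations
export LIGHTDASH_CSP_REPORT_ONLY=false
export LIGHTDASH_CSP_ALLOWED_DOMAINS='https://cdn.example.com,https://fonts.example.com'
export LIGHTDASH_CSP_REPORT_URI='https://csp-reports.example.com/report'
# Allow one app domain to make cross-origin requests
export LIGHTDASH_CORS_ENABLED=true
export LIGHTDASH_CORS_ALLOWED_DOMAINS='https://app.mycompany.com'
```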

Analytics & Event Tracking

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| RUDDERSTACK_WRITE_KEY | RudderStack key used to track events (by default, Lightdash's key is used) | | |
| RUDDERSTACK_DATA_PLANE_URL | RudderStack data plane URL that events are tracked to (by default, Lightdash's data plane is used) | | |
| RUDDERSTACK_ANALYTICS_DISABLED | Set to true to disable RudderStack analytics | | |
| POSTHOG_PROJECT_API_KEY | API key for PostHog (by default, Lightdash's key is used) | | |
| POSTHOG_FE_API_HOST | Hostname for PostHog's front-end API | | |
| POSTHOG_BE_API_HOST | Hostname for PostHog's back-end API | | |

AI Analyst

These variables enable you to configure the AI Analyst functionality. Note that you will need an Enterprise Licence Key for this functionality.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| OPENAI_API_KEY | Required for AI Analyst | | |
| OPENAI_MODEL_NAME | Required for AI Analyst | | |
| AI_COPILOT_ENABLED | Required for AI Analyst | | |
| OPENAI_BASE_URL | Optional base URL for an OpenAI-compatible API | | |
| AI_COPILOT_EMBEDDING_SEARCH_ENABLED | Experimental. Required for the AI Analyst to use semantic search to find relevant data. | | |
| AI_COPILOT_TELEMETRY_ENABLED | Enables telemetry for AI Copilot | | false |
| AI_COPILOT_REQUIRES_FEATURE_FLAG | Requires a feature flag to use AI Copilot | | false |
| AI_DEFAULT_PROVIDER | Default AI provider to use | | |
| ANTHROPIC_API_KEY | API key for Anthropic | | |
| ANTHROPIC_MODEL_NAME | Model name for Anthropic | | |
| AZURE_AI_API_KEY | API key for Azure AI | | |
| AZURE_AI_ENDPOINT | Endpoint for Azure AI | | |
| AZURE_AI_API_VERSION | API version for Azure AI | | |
| AZURE_AI_DEPLOYMENT_NAME | Deployment name for Azure AI | | |
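A sketch of the minimum OpenAI-backed setup, per the required variables above. The key and model name are placeholders you must supply yourself:

```shell
# AI Analyst with OpenAI -- key and model name are placeholders
export AI_COPILOT_ENABLED=true
export OPENAI_API_KEY='change-me'
export OPENAI_MODEL_NAME='your-model-name'
# Only needed for an OpenAI-compatible proxy or gateway:
# export OPENAI_BASE_URL=https://llm-gateway.internal/v1
```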

Slack Integration

These variables enable you to configure the Slack integration.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| SLACK_SIGNING_SECRET | Required for Slack integration | | |
| SLACK_CLIENT_ID | Required for Slack integration | | |
| SLACK_CLIENT_SECRET | Required for Slack integration | | |
| SLACK_STATE_SECRET | Required for Slack integration | | slack-state-secret |
| SLACK_APP_TOKEN | App token for Slack | | |
| SLACK_PORT | Port for Slack integration | | 4351 |
| SLACK_SOCKET_MODE | Enable socket mode for Slack | | false |
| SLACK_CHANNELS_CACHED_TIME | Time in milliseconds to cache Slack channels | | 600000 (10 minutes) |
| SLACK_SUPPORT_URL | URL for Slack support | | |
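The required values come from your Slack app's settings page; everything below is a placeholder:

```shell
# Slack integration -- credentials are placeholders from your own Slack app
export SLACK_SIGNING_SECRET='change-me'
export SLACK_CLIENT_ID='1234567890.1234567890'
export SLACK_CLIENT_SECRET='change-me'
# Override the default state secret with a random string of your own
export SLACK_STATE_SECRET='a-random-string'
```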

GitHub Integration

These variables enable you to configure the GitHub integration.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| GITHUB_PRIVATE_KEY | GitHub private key for GitHub App authentication | | |
| GITHUB_APP_ID | GitHub Application ID | | |
| GITHUB_CLIENT_ID | GitHub OAuth client ID | | |
| GITHUB_CLIENT_SECRET | GitHub OAuth client secret | | |
| GITHUB_APP_NAME | Name of the GitHub App | | |
| GITHUB_REDIRECT_DOMAIN | Domain for GitHub OAuth redirection | | |

Microsoft Teams Integration

These variables enable you to configure Microsoft Teams integration.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| MICROSOFT_TEAMS_ENABLED | Enables Microsoft Teams integration | | false |

Google Cloud Platform

These variables enable you to configure Google Cloud Platform integration.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| GOOGLE_CLOUD_PROJECT_ID | Google Cloud Platform project ID | | |
| GOOGLE_DRIVE_API_KEY | Google Drive API key | | |
| AUTH_GOOGLE_ENABLED | Enables Google authentication | | false |
| AUTH_ENABLE_GCLOUD_ADC | Enables Google Cloud Application Default Credentials | | false |

Embedding

Note that you will need an Enterprise Licence Key for this functionality.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| EMBEDDING_ENABLED | Enables embedding functionality | | false |
| LIGHTDASH_IFRAME_EMBEDDING_DOMAINS | Comma-separated list of domains that are allowed to embed Lightdash in an iframe | | |

Service account

Note that you will need an Enterprise Licence Key for this functionality.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| SERVICE_ACCOUNT_ENABLED | Enables service account functionality | | false |

SCIM

Note that you will need an Enterprise Licence Key for this functionality.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| SCIM_ENABLED | Enables SCIM (System for Cross-domain Identity Management) | | false |

Sentry

These variables enable you to configure Sentry for error tracking.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| SENTRY_DSN | Sentry DSN for both frontend and backend | | |
| SENTRY_BE_DSN | Sentry DSN for backend only | | |
| SENTRY_FE_DSN | Sentry DSN for frontend only | | |
| SENTRY_BE_SECURITY_REPORT_URI | URI for Sentry backend security reports | | |
| SENTRY_TRACES_SAMPLE_RATE | Sample rate for Sentry traces (0.0 to 1.0) | | 0.1 |
| SENTRY_PROFILES_SAMPLE_RATE | Sample rate for Sentry profiles (0.0 to 1.0) | | 0.2 |
| SENTRY_ANR_ENABLED | Enables Sentry Application Not Responding (ANR) detection | | false |
| SENTRY_ANR_CAPTURE_STACKTRACE | Captures stacktraces for ANR events | | false |
| SENTRY_ANR_TIMEOUT | Timeout in milliseconds for ANR detection | | |

Intercom & Pylon

These variables enable you to configure Intercom and Pylon for customer support and feedback.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| INTERCOM_APP_ID | Intercom application ID | | |
| INTERCOM_APP_BASE | Base URL for the Intercom API | | https://api-iam.intercom.io |
| PYLON_APP_ID | Pylon application ID | | |
| PYLON_IDENTITY_VERIFICATION_SECRET | Secret for verifying Pylon identities | | |

Kubernetes

These variables enable you to configure Kubernetes integration.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| K8S_NODE_NAME | Name of the Kubernetes node | | |
| K8S_POD_NAME | Name of the Kubernetes pod | | |
| K8S_POD_NAMESPACE | Namespace of the Kubernetes pod | | |
| LIGHTDASH_CLOUD_INSTANCE | Identifier for the Lightdash Cloud instance | | |

Organization appearance

These variables allow you to customize the default appearance settings for your Lightdash instance's organizations. This color palette is applied to all organizations in your instance, and a different one cannot be chosen while these environment variables are set.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| OVERRIDE_COLOR_PALETTE_NAME | Name of the default color palette | | |
| OVERRIDE_COLOR_PALETTE_COLORS | Comma-separated list of hex color codes for the default color palette (must be 20 colors) | | |
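As a sketch, the palette is supplied as exactly 20 comma-separated hex codes. The name and colors below are illustrative placeholders:

```shell
export OVERRIDE_COLOR_PALETTE_NAME='Company palette'
# Exactly 20 comma-separated hex codes (placeholder values)
export OVERRIDE_COLOR_PALETTE_COLORS='#1f77b4,#ff7f0e,#2ca02c,#d62728,#9467bd,#8c564b,#e377c2,#7f7f7f,#bcbd22,#17becf,#aec7e8,#ffbb78,#98df8a,#ff9896,#c5b0d5,#c49c94,#f7b6d2,#c7c7c7,#dbdb8d,#9edae5'
```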

Initialize instance

When a new Lightdash instance is created, it has no organizations or projects. You can initialize an organization and a project using environment variables to simplify the deployment process.

Initialize instance is only available on Lightdash Enterprise plans.

For more information on our plans, visit our pricing page.

Currently, we only support Databricks project types and GitHub dbt configuration.

| Variable | Description | Required? | Default |
| --- | --- | --- | --- |
| LD_SETUP_ADMIN_NAME | Name of the admin user for initial setup | | Admin User |
| LD_SETUP_ADMIN_EMAIL | Email of the admin user for initial setup | | |
| LD_SETUP_ORGANIZATION_EMAIL_DOMAIN | Email domain to whitelist for the organization | | |
| LD_SETUP_ORGANIZATION_DEFAULT_ROLE | Default role for new organization members | | viewer |
| LD_SETUP_ORGANIZATION_NAME | Name of the organization | | |
| LD_SETUP_ADMIN_API_KEY | API key for the admin user; must start with the ldpat_ prefix | | |
| LD_SETUP_API_KEY_EXPIRATION | Number of days until the API key expires (0 for no expiration) | | 30 |
| LD_SETUP_SERVICE_ACCOUNT_TOKEN | A pre-set token for the service account; must start with the ldsvc_ prefix | | |
| LD_SETUP_SERVICE_ACCOUNT_EXPIRATION | Number of days until the service account token expires (0 for no expiration) | | 30 |
| LD_SETUP_PROJECT_NAME | Name of the project | | |
| LD_SETUP_PROJECT_CATALOG | Catalog name for the Databricks project | | |
| LD_SETUP_PROJECT_SCHEMA | Schema/database name for the project | | |
| LD_SETUP_PROJECT_HOST | Hostname of the Databricks server | | |
| LD_SETUP_PROJECT_HTTP_PATH | HTTP path for the Databricks connection | | |
| LD_SETUP_PROJECT_PAT | Personal access token for Databricks | | |
| LD_SETUP_START_OF_WEEK | Day to use as the start of the week | | SUNDAY |
| LD_SETUP_PROJECT_COMPUTE | JSON string with Databricks compute configuration, like {"name": "string", "httpPath": "string"} | | |
| LD_SETUP_DBT_VERSION | Version of dbt to use (e.g. v1.8) | | latest |
| LD_SETUP_GITHUB_PAT | GitHub personal access token | | |
| LD_SETUP_GITHUB_REPOSITORY | GitHub repository for the dbt project | | |
| LD_SETUP_GITHUB_BRANCH | GitHub branch for the dbt project | | |
| LD_SETUP_GITHUB_PATH | Subdirectory path within the GitHub repository | | / |
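Putting it together, a bootstrap sketch for an admin user plus a Databricks project backed by a GitHub dbt repo might look like this. Every hostname, token, and name is a placeholder:

```shell
# Bootstrap a fresh instance -- all values below are placeholders
export LD_SETUP_ADMIN_NAME='Jane Admin'
export LD_SETUP_ADMIN_EMAIL=jane@example.com
export LD_SETUP_ORGANIZATION_NAME='Example Org'
export LD_SETUP_PROJECT_NAME='Analytics'
# Databricks connection details
export LD_SETUP_PROJECT_HOST=adb-1234567890.0.azuredatabricks.net
export LD_SETUP_PROJECT_HTTP_PATH='/sql/1.0/warehouses/abc123'
export LD_SETUP_PROJECT_CATALOG=main
export LD_SETUP_PROJECT_SCHEMA=analytics
export LD_SETUP_PROJECT_PAT='databricks-pat-change-me'
# dbt project on GitHub
export LD_SETUP_GITHUB_PAT='github-pat-change-me'
export LD_SETUP_GITHUB_REPOSITORY=example-org/dbt-project
export LD_SETUP_GITHUB_BRANCH=main
```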

In order to log in as the admin user using SSO, you must also enable the following environment variable:

AUTH_ENABLE_OIDC_TO_EMAIL_LINKING=true

This allows you to link your SSO account with the provided email without using an invitation code.
This email is trusted: any user with an OIDC account with that email will be able to access the admin user.