In this blog post, A Practical Guide to Securing Streamlit Environment Vars with TOML, we will show you how to keep API keys, database URLs, and service credentials safe while building fast Streamlit apps.
Secrets are the backbone of most apps. If you hardcode them, you risk leaks; if you make them too hard to manage, teams slow down. This post explains a clean, practical way to secure environment variables for Streamlit using TOML files. We start with the big picture, then walk through hands-on steps you can apply today.
Why this matters
Streamlit makes it simple to build data apps. But simplicity should not mean unsafe. A disciplined secrets approach prevents accidental commits of API keys, reduces blast radius if a key leaks, and keeps dev, test, and prod neatly separated. Using TOML for secrets gives you an easy, typed, and structured configuration that Streamlit understands out of the box.
The technology behind it
Streamlit secrets
Streamlit provides st.secrets, a secure configuration interface that loads from a .streamlit/secrets.toml file during local development. On Streamlit Community Cloud, you store secrets in the app settings UI; those values are injected into st.secrets at runtime, never committed to your repo.
TOML essentials
TOML (Tom’s Obvious, Minimal Language) is a human-friendly configuration format with typed values and nested sections. Think of it like a safer, more structured alternative to .env files. It supports strings, integers, booleans, arrays, and tables (sections), which makes it ideal for grouping credentials and environment settings. Streamlit natively reads TOML for secrets, which is why it’s the preferred format here.
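As a quick illustration (the values here are placeholders, unrelated to the app below), a single TOML file can mix typed values and nested sections:
# illustrative TOML: typed values plus a nested section (table)
title = "Sales dashboard"        # string
max_rows = 500                   # integer
debug = false                    # boolean
regions = ["au", "nz"]           # array

[database]                       # table (section)
host = "db.example.com"
port = 5432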
High-level workflow
- Local development: Place secrets in .streamlit/secrets.toml, which you never commit.
- Streamlit Community Cloud: Add secrets in the app’s “Secrets” section; st.secrets reads them at runtime.
- Other hosting (Docker, VM, Kubernetes): Mount or generate a secrets.toml at deploy time, or read from environment variables as a fallback.
This pattern keeps secrets out of your codebase and makes promotion from dev to prod predictable.
Project structure
my-streamlit-app/
├─ app.py
├─ requirements.txt
└─ .streamlit/
└─ secrets.toml # local only; DO NOT COMMIT
Create your secrets.toml
Define logical sections so you can rotate secrets independently and limit what each component can access.
# .streamlit/secrets.toml (local dev only)
[api]
openai_key = "<your-openai-key>"
maps_key = "<your-maps-key>"
[database]
url = "postgresql+psycopg2://user:pass@host:5432/dbname"
[email]
smtp_host = "smtp.example.com"
smtp_user = "apikey"
smtp_password = "<your-smtp-password>"
Keep it out of git
Add this to your .gitignore:
.streamlit/secrets.toml
Use secrets in your Streamlit app
Access values through st.secrets. Optionally fall back to OS environment variables so your app can run even if a TOML file isn’t present in production.
import os
import streamlit as st
from sqlalchemy import create_engine

# Prefer st.secrets; fall back to environment variables when needed
OPENAI_KEY = st.secrets.get("api", {}).get("openai_key") or os.getenv("OPENAI_API_KEY")
DB_URL = st.secrets.get("database", {}).get("url") or os.getenv("DATABASE_URL")

# Example: use a DB engine without logging credentials
engine = create_engine(DB_URL) if DB_URL else None

st.title("Secrets demo")

if OPENAI_KEY:
    st.write("OpenAI key configured.")
else:
    st.warning("OpenAI key missing. Set api.openai_key in secrets or OPENAI_API_KEY env var.")

if engine:
    st.write("Database configured for:", engine.url.database)  # safe: database name only
else:
    st.warning("Database URL missing. Set database.url or DATABASE_URL.")
Never log secrets
- Do not print or write secrets to Streamlit widgets or logs.
- When debugging, reveal only non-sensitive portions, e.g., hostnames or database names.
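For example, a minimal sketch (assuming the DB_URL value from the snippet above and SQLAlchemy 1.4+ for make_url) that surfaces only non-sensitive parts:
import streamlit as st
from sqlalchemy.engine import make_url

# DB_URL is assumed to be loaded from st.secrets or an env var, as shown earlier
if DB_URL:
    url = make_url(DB_URL)
    # Show host and database name only; never echo url.password or the full URL
    st.caption(f"Connected to {url.host}/{url.database}")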
Local vs cloud
Local development
- Create .streamlit/secrets.toml as shown above.
- Run streamlit run app.py. st.secrets will load your TOML values.
Streamlit Community Cloud
- Open the app’s settings and add secrets under “Secrets”.
- Those values are encrypted and available via st.secrets at runtime.
- You don’t need a secrets.toml in the repo for cloud usage.
Docker and other hosts
Mount the secrets file at runtime so it is never baked into your image layers:
# Build the image
docker build -t my-streamlit-app .
# Run with secrets mounted into the container
docker run \
  -p 8501:8501 \
  -v $PWD/.streamlit/secrets.toml:/app/.streamlit/secrets.toml:ro \
  my-streamlit-app
Alternatively, generate the file on the host during deployment and keep it out of your source repository and image.
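For reference, here is a minimal, hypothetical Dockerfile for the project layout above; it deliberately copies only the app code and never .streamlit/secrets.toml:
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
# secrets.toml is intentionally NOT copied; mount it at runtime as shown above
EXPOSE 8501
CMD ["streamlit", "run", "app.py", "--server.address=0.0.0.0"]
Adding .streamlit/ to a .dockerignore file is a useful extra guard against a broader COPY instruction picking up the secrets file by accident.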
CI/CD patterns
In pipelines (e.g., GitHub Actions, GitLab CI, Azure DevOps), store secrets in the platform’s secure vault. At deploy time, write them into .streamlit/secrets.toml
just before starting the app.
GitHub Actions example
name: Deploy Streamlit app
on: [push]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Write secrets to TOML
        run: |
          mkdir -p .streamlit
          cat > .streamlit/secrets.toml << 'EOF'
          [api]
          openai_key = "${{ secrets.OPENAI_API_KEY }}"
          [database]
          url = "${{ secrets.DATABASE_URL }}"
          EOF
      - run: pip install -r requirements.txt
      - run: streamlit run app.py --server.headless true
Notes:
- Never echo secret values in logs.
- Grant the least permissions needed to deploy.
- Rotate repository or environment secrets periodically.
Validate required secrets
Fail fast with a small helper so missing secrets don’t become runtime surprises.
import streamlit as st

REQUIRED = [
    ("api", "openai_key"),
    ("database", "url"),
]

def require(secrets, section, key):
    if section not in secrets or key not in secrets[section]:
        raise KeyError(f"Missing secret: [{section}] {key}")
    return secrets[section][key]

try:
    # Check every required (section, key) pair before the app does any work
    for section, key in REQUIRED:
        require(st.secrets, section, key)
    OPENAI_KEY = st.secrets["api"]["openai_key"]
    DB_URL = st.secrets["database"]["url"]
except KeyError as e:
    st.error(str(e))
    st.stop()
Environment variables as a fallback
Some platforms prefer environment variables only. Keep a thin adapter in your code so you can run in both modes.
import os
import streamlit as st

def get_secret(section, key, env_var=None):
    # Try st.secrets first
    val = st.secrets.get(section, {}).get(key) if section in st.secrets else None
    if val:
        return val
    # Fallback to environment variable
    if env_var:
        return os.getenv(env_var)
    return None

OPENAI_KEY = get_secret("api", "openai_key", env_var="OPENAI_API_KEY")
Rotation, revocation, and scope
- Use separate keys per environment (dev, staging, prod) to limit blast radius.
- Rotate keys on a schedule or after personnel changes.
- Prefer narrowly scoped credentials (e.g., DB user with read-only access for dashboards).
- Revoke on suspicion of leakage and re-deploy immediately with new secrets.
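As an illustrative sketch (the environment label and key names are hypothetical, not part of the app above), give each deployment its own secrets file with its own credentials, so revoking one key never affects another environment:
# .streamlit/secrets.toml written for the staging deployment only
[app]
environment = "staging"   # safe to log; contains no secret material

[api]
openai_key = "<staging-only-key>"   # separate key per environment

[database]
# read-only user scoped to what the dashboard actually needs
url = "postgresql+psycopg2://dashboards_ro:<password>@staging-host:5432/dbname"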
Common pitfalls and fixes
- Secrets committed to git: Remove the file, rotate keys immediately, and add it to .gitignore (see the commands after this list).
- KeyError when reading st.secrets: Ensure the TOML structure matches your code’s expected sections and keys.
- Docker can’t find secrets: Verify the volume mount path matches your container’s working directory and that the file has read permissions.
- Mixed environments: Log which environment you are in (dev/staging/prod) without revealing secrets.
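If a secrets file has already been committed, a typical remediation sketch looks like the commands below; rotating the exposed keys is still mandatory because git history keeps old contents:
# Stop tracking the file (it stays on disk for local use)
git rm --cached .streamlit/secrets.toml
echo ".streamlit/secrets.toml" >> .gitignore
git commit -m "Stop tracking local secrets file"
# If the repo is shared or public, also rewrite history (e.g. with git filter-repo)
# and rotate every exposed credential with its provider.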
Security checklist
- Never hardcode secrets in code, notebooks, or markdown.
- Keep .streamlit/secrets.toml out of version control.
- Use st.secrets for Streamlit-native loading; use environment variables as a fallback.
- Validate required secrets at startup and fail fast.
- Rotate, scope, and audit keys regularly.
- In CI/CD, write secrets at runtime; do not bake into images.
Wrapping up
Securing Streamlit apps doesn’t have to be complicated. By leaning on TOML-based secrets and a few disciplined patterns, you get safer configuration, simpler deployments, and fewer surprises. Whether you deploy on Streamlit Community Cloud or your own infrastructure, the approach above scales cleanly from a single developer to a larger team.
If your team wants a second set of eyes on security, configuration, or CI/CD for data apps, CloudProinc.com.au can help you blueprint and implement a robust setup tailored to your stack.