A CLI tool for running read-only SQL queries against PostgreSQL databases across multiple environments (dev, staging, production).
Connections are enforced as read-only via `default_transaction_read_only=on`, so you can safely explore without risk of accidental writes.
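Because the option applies at the session level, PostgreSQL itself rejects any write statement. One way to attach it is through the libpq `options` parameter on a connection URL; the sketch below illustrates that general mechanism (the helper name is hypothetical, and explore-db's internals may differ):

```python
# Sketch: append default_transaction_read_only=on to a PostgreSQL URL
# via the libpq "options" parameter. Illustrative only.
from urllib.parse import quote

def make_read_only(dsn: str) -> str:
    """Return a DSN whose sessions start in read-only mode."""
    opts = "options=" + quote("-c default_transaction_read_only=on")
    sep = "&" if "?" in dsn else "?"
    return dsn + sep + opts

print(make_read_only("postgresql://user:password@host:5432/dbname"))
```

With the option in place, statements like `UPDATE` or `DELETE` fail with PostgreSQL's "cannot execute ... in a read-only transaction" error.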
Requires Python 3.9+.

```shell
pip install .
```

Or for development:

```shell
pip install -e .
```

On first run, explore-db creates a default config directory at `~/.explore-db/` containing:
| File | Purpose |
|---|---|
| `config.json` | Maps environment names to `.env` files (or direct connection strings) |
| `.env.dev` | Dev database connection |
| `.env.staging` | Staging database connection |
| `.env.production` | Production database connection |
To initialize manually:
```shell
explore-db --dev "SELECT 1"
```

Then edit the generated files to set your actual connection strings.
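The scaffolding step can be sketched roughly as follows. This is a simplified illustration with an overridable base directory and a hypothetical function name; the real tool writes to `~/.explore-db/` and its exact defaults may differ:

```python
import json
from pathlib import Path

# Assumed default mapping; mirrors the files listed in the table above.
DEFAULT_CONFIG = {
    "dev": ".env.dev",
    "staging": ".env.staging",
    "production": ".env.production",
}

def init_config_dir(base: Path) -> Path:
    """Create the config directory and default files if they do not exist."""
    base.mkdir(parents=True, exist_ok=True)
    cfg = base / "config.json"
    if not cfg.exists():
        cfg.write_text(json.dumps(DEFAULT_CONFIG, indent=2))
    for env_file in DEFAULT_CONFIG.values():
        path = base / env_file
        if not path.exists():
            # Placeholder connection string for the user to replace.
            path.write_text("DATABASE_URL=postgresql://user:password@host:5432/dbname\n")
    return cfg
```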
`~/.explore-db/config.json` maps each environment to either a `.env` filename or a direct PostgreSQL URL:
```json
{
  "dev": ".env.dev",
  "staging": ".env.staging",
  "production": "postgresql://user:password@host:5432/dbname"
}
```

Each `.env` file should contain a `DATABASE_URL` variable:

```
DATABASE_URL=postgresql://user:password@host:5432/dbname
```
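The lookup from environment name to connection string can be sketched as below. Assumptions: values starting with a PostgreSQL URL scheme are used verbatim, anything else is treated as a `.env` filename relative to the config directory, and the function name is hypothetical:

```python
import json
from pathlib import Path

def resolve_dsn(env: str, config_dir: Path) -> str:
    """Map an environment name to a PostgreSQL connection string."""
    config = json.loads((config_dir / "config.json").read_text())
    value = config[env]
    if value.startswith(("postgresql://", "postgres://")):
        return value  # direct connection string
    # Otherwise treat the value as a .env file containing DATABASE_URL=...
    for line in (config_dir / value).read_text().splitlines():
        if line.startswith("DATABASE_URL="):
            return line.split("=", 1)[1]
    raise KeyError(f"DATABASE_URL not found in {value}")
```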
Pick an environment with `--dev`, `--staging`, or `--production`, then provide SQL in one of three ways:
Inline query:

```shell
explore-db --dev "SELECT * FROM users LIMIT 10"
```

From a file:

```shell
explore-db --staging -f query.sql
```

Piped from stdin:

```shell
cat query.sql | explore-db --production
```

Results are printed as JSON with column names and row count:
```
Columns: ['id', 'name', 'email']
Row count: 2
[
  {
    "id": 1,
    "name": "Alice",
    "email": "alice@example.com"
  },
  {
    "id": 2,
    "name": "Bob",
    "email": "bob@example.com"
  }
]
```
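A layout like the one above can be produced with a small formatting helper. The function name is hypothetical and this is only a sketch of the output shape, not necessarily how explore-db renders it internally:

```python
import json

def format_results(columns, rows):
    """Render query results in the Columns / Row count / JSON layout shown above."""
    records = [dict(zip(columns, row)) for row in rows]
    return (
        f"Columns: {columns}\n"
        f"Row count: {len(rows)}\n"
        # default=str stringifies non-JSON types such as dates and UUIDs.
        + json.dumps(records, indent=2, default=str)
    )

print(format_results(
    ["id", "name", "email"],
    [(1, "Alice", "alice@example.com"), (2, "Bob", "bob@example.com")],
))
```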