# Database Magic
The %db and %%db magic commands give you direct access to your databases from notebook cells — connect to datasources, inspect schemas, and ask natural language questions that return pandas DataFrames.
## Quick Reference

### Cell Magic
```
%%db ask <datasource_id>
Your question in natural language
```

```
%%db ask <datasource_id> --model anthropic:claude-3-5-sonnet-20241022
Your question with a specific model
```

### Line Magic
```
%db list                                  # List available datasources
%db connect <datasource_id>               # Create SQLAlchemy engine
%db connect <datasource_id> --var engine  # Use custom variable name
%db schema <datasource_id>                # Show tables and columns
%db help                                  # Show help
```

## Natural Language SQL (`%%db ask`)
Ask questions in plain English — the AI generates SQL, executes it, and returns a DataFrame.
```
%%db ask sales_db
What were the top 10 customers by revenue last quarter?
```

Results are automatically stored in:

- `_df` — pandas DataFrame of results
- `_sql` — the generated SQL query
```python
# Use the results
print(_sql)   # See the generated query
_df.head()    # Explore the data
_df.plot()    # Visualize
```

### Specify a Model

```
%%db ask sales_db --model anthropic:claude-3-5-sonnet-20241022
Show me monthly revenue trends for this year
```

The default model comes from the AI model selected in the sidebar.
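Because `_sql` is an ordinary Python string, you can also tweak the generated query and re-run it yourself. A runnable sketch with stand-in values — in a real session `_sql` would come from `%%db ask` and the connection from `%db connect`; an in-memory SQLite database is used here only so the example is self-contained:

```python
import sqlite3

import pandas as pd

# Stand-ins for a live session: `conn` substitutes for a real engine,
# and `_sql` mimics a query that `%%db ask` might have generated.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, revenue REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [("Acme", 1200.0), ("Globex", 800.0)])
_sql = "SELECT name, revenue FROM customers ORDER BY revenue DESC LIMIT 10"

# Edit the generated SQL and execute it again with pandas
df = pd.read_sql(_sql.replace("LIMIT 10", "LIMIT 1"), conn)
print(df)  # a single row: Acme, 1200.0
```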
## Connecting to Datasources

### List Available Datasources
```
%db list
```

Lists all datasources from your credential vault with their IDs and types.
### Connect

```
%db connect sales_db
```

Creates a SQLAlchemy engine stored in a variable named after the datasource ID. Use `--var` to customize:
```
%db connect sales_db --var engine
```

Then use it directly with pandas or SQLAlchemy:
```python
import pandas as pd

df = pd.read_sql("SELECT * FROM orders LIMIT 100", engine)
```

### Inspect Schema
```
%db schema sales_db
```

Displays all tables and columns in the datasource — useful before asking questions so you understand the data structure.
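Once connected, the engine works with any pandas query pattern, including parameterized queries. A minimal sketch — in a real session `engine` would come from `%db connect sales_db --var engine`; an in-memory SQLite connection stands in here, and the `orders` table is invented for illustration:

```python
import sqlite3

import pandas as pd

# Stand-in for the engine `%db connect` would create
engine = sqlite3.connect(":memory:")
engine.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
engine.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, "open"), (2, "closed"), (3, "open")])

# Bind values via `params` rather than interpolating them into the SQL
df = pd.read_sql("SELECT * FROM orders WHERE status = ?", engine,
                 params=("open",))
```

Binding parameters this way avoids string formatting bugs and SQL injection when a query depends on notebook variables.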
## Supported Databases
| Database | Notes |
|---|---|
| PostgreSQL | Full support |
| MySQL / MariaDB | Full support |
| SQLite | Full support |
| BigQuery | GCP credentials required |
| Snowflake | Snowflake credentials required |
| Redshift | AWS credentials required |
| ClickHouse | Full support |
| DuckDB | Full support |
| MS SQL Server | Full support |
| Oracle | Full support |
| Databricks | Databricks token required |
Datasources are configured in your workspace credential vault — contact your admin or use the Hub to add connections.
## Tips

- Run `%db schema <id>` before asking questions to get better SQL generation
- Results in `_df` persist across cells — chain analysis steps
- Use `--model` to try a different AI if results aren't accurate
- Tab completion works for datasource IDs and flags
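Because `_df` persists, a later cell can pick up where a `%%db ask` cell left off. A sketch of that chaining — the `_df` below is a hypothetical stand-in with invented column names, since no live session is assumed:

```python
import pandas as pd

# Hypothetical `_df` as an earlier `%%db ask` cell might leave behind
_df = pd.DataFrame({"customer": ["Acme", "Globex", "Initech"],
                    "revenue": [1200.0, 800.0, 950.0]})

# A later cell continues the analysis directly on `_df`
top = _df.sort_values("revenue", ascending=False).head(2)
share = top["revenue"].sum() / _df["revenue"].sum()
```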