Most teams start with CSV exports and stay there longer than they should. Then someone asks why the revenue figures in the board deck don't match the database, and suddenly everyone wants a real solution.
## Why Manual CSV Exports Break Down
The export-and-paste workflow isn't just tedious; it's actively risky. You're making a manual copy of data at a specific moment in time, then putting that static snapshot in a place where people treat it as current.
According to a 2023 survey by Fivetran, 67% of data analysts spend more than 4 hours per week on manual data movement tasks that could be automated. That's across all tools, but spreadsheet-to-database syncing is consistently near the top of the list.
For a finance team running weekly reporting cycles, a stale Postgres export means P&L numbers that are days or weeks behind actual transaction data. The problem isn't technical incompetence; it's that the easy path (download CSV, paste into Sheets) doesn't scale past "once."
## The 4 Approaches, Compared
| Method | Setup time | Refresh | SQL control | Cost |
|---|---|---|---|---|
| CSV export/import | 5 min | Manual only | Full | Free |
| Apps Script + JDBC | 2-4 hours | Automated | Full | Free |
| ETL tools (Fivetran, Airbyte) | 30-60 min | Automated | Limited | $50–$500+/mo |
| DuckDB (via ModelMonkey) | 10 min | On-demand | Full | Included |
The right choice depends on whether you want to write code, pay for infrastructure, or just ask in plain language.
## Apps Script + JDBC: The Free But Painful Route
Google Apps Script ships a Jdbc service for connecting scripts to external databases: you write a function that opens a connection, runs a query, and writes results to a range, and time-driven triggers handle the refresh schedule. One caveat to verify up front: Google's Jdbc documentation lists MySQL, Microsoft SQL Server, and Oracle as the supported databases, so test that a jdbc:postgresql:// connection string actually opens against your instance before building a reporting workflow on it.
Here's the minimum viable implementation:

```javascript
function syncFromPostgres() {
  const conn = Jdbc.getConnection(
    'jdbc:postgresql://your-host:5432/your_db',
    'username',
    'password'
  );
  const stmt = conn.prepareStatement(
    'SELECT date, revenue, cogs, gross_margin ' +
    'FROM financials.p_and_l WHERE date >= CURRENT_DATE - 90'
  );
  const results = stmt.executeQuery();

  // Accumulate rows in memory and write them in one batch --
  // calling setValues() once per row is dramatically slower.
  const rows = [];
  while (results.next()) {
    rows.push([
      results.getString(1),
      results.getFloat(2),
      results.getFloat(3),
      results.getFloat(4)
    ]);
  }
  results.close();
  stmt.close();
  conn.close();

  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('P&L');
  if (rows.length > 0) {
    sheet.getRange(2, 1, rows.length, 4).setValues(rows);
  }
}

// Run once to schedule the refresh, e.g. hourly.
function installTrigger() {
  ScriptApp.newTrigger('syncFromPostgres')
    .timeBased()
    .everyHours(1)
    .create();
}
```
This works, but it has real limitations. Google's JDBC support requires your PostgreSQL instance to be publicly accessible or whitelisted against Google's IP ranges, a security tradeoff many teams won't accept. As of March 2026, Google doesn't publish a stable list of Apps Script IP addresses for allowlisting, which makes firewall configuration a guessing game.
The Apps Script execution timeout is 6 minutes for consumer accounts, 30 minutes for Workspace. If your query returns more than a few thousand rows, you'll hit the wall.
## ETL Connectors: Right for Some Teams, Overkill for Others
Tools like Fivetran, Airbyte, and Stitch will sync your Postgres tables to Google Sheets (or more commonly, to BigQuery, which you then connect to Sheets). The setup is genuinely easy: configure your source, authenticate your destination, schedule a sync.
The tradeoff is cost and abstraction. Fivetran's base plan starts around $100/month and prices by monthly active rows. Airbyte's cloud tier starts at $3 per million synced records. For a team that needs to sync 3 tables on a daily schedule, this is probably worth it. For an analyst who wants to run ad-hoc queries against a production Postgres instance, it's excessive.
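To see why the row-based pricing is negligible at small scale, here's a back-of-envelope estimate using the $3-per-million-records rate quoted above. The table names and daily row counts are hypothetical placeholders, not real workload numbers:

```python
# Rough monthly cost for a row-priced sync plan.
# Rate is the Airbyte cloud figure cited in the text;
# table sizes below are made-up examples -- substitute your own.
PRICE_PER_MILLION = 3.00

daily_rows_per_table = {
    "transactions": 50_000,  # hypothetical
    "products": 2_000,       # hypothetical
    "customers": 10_000,     # hypothetical
}

monthly_rows = sum(daily_rows_per_table.values()) * 30
monthly_cost = monthly_rows / 1_000_000 * PRICE_PER_MILLION

print(f"{monthly_rows:,} rows/month -> ${monthly_cost:.2f}/month")
# -> 1,860,000 rows/month -> $5.58/month
```

At this scale the per-record cost is trivial; the pricing argument only starts to bite when tables reach tens of millions of rows per month, or when a flat platform fee applies regardless of volume.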
The bigger issue: ETL connectors replicate entire tables. You don't write SQL against the source โ you configure which tables to sync, then query the destination. If you want to run a custom join across 4 tables with specific filter conditions, that logic has to live somewhere downstream.
## DuckDB: The Option Most Analysts Haven't Tried
DuckDB includes a core postgres extension that attaches a live PostgreSQL database as a queryable source. ModelMonkey uses this architecture to let you write SQL directly against your Postgres instance and land the results in your spreadsheet.
The connection setup looks like this internally:

```sql
ATTACH 'dbname=analytics user=readonly password=... host=db.internal port=5432'
    AS pg (TYPE postgres);

SELECT
    date_trunc('month', t.created_at) AS month,
    p.name AS product,
    SUM(t.amount) AS revenue,
    COUNT(*) AS transaction_count
FROM pg.transactions t
JOIN pg.products p ON t.product_id = p.id
WHERE t.created_at >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY 1, 2
ORDER BY 1 DESC, 3 DESC;
```
The results land directly in your sheet. You're not copying data into an intermediate warehouse โ you're querying Postgres through DuckDB's memory and writing the output to a range.
This is the approach described in ModelMonkey's DuckDB integration documentation, and it's meaningfully different from the ETL pattern because your SQL runs fresh each time, against live data.
## The One Thing Nobody Tells You About Postgres-to-Sheets Connections
The connection itself is the easy part. The hard part is credential management.
Every approach except CSV export requires your Postgres credentials to live somewhere: in Apps Script project properties, in a third-party connector's encrypted vault, or in a secrets management system. The threat model is different for each.
Apps Script stores credentials in script properties, which are visible to anyone with editor access to the project. That's often more people than you'd expect. The Google Apps Script documentation notes explicitly that "script properties are shared among all users of a script", meaning every collaborator on your spreadsheet can read your database password if you store it there.
ModelMonkey's Postgres integration uses envelope encryption via Google Cloud KMS, where credentials are wrapped with a per-user key. The plaintext password never persists to disk. That distinction matters if your Postgres instance has anything beyond test data.
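For readers unfamiliar with the envelope-encryption pattern itself, here is an illustrative sketch. This is not ModelMonkey's implementation: Fernet stands in for the Cloud KMS key operations, whereas in production the key-encryption key never leaves the KMS and the wrap/unwrap steps are remote API calls:

```python
# Envelope-encryption pattern, illustrated locally with Fernet.
# In a real KMS setup the KEK stays inside the KMS; only the
# wrapped DEK and the ciphertext are ever persisted.
from cryptography.fernet import Fernet

# 1. A per-user key-encryption key (KEK), held by the KMS.
kek = Fernet(Fernet.generate_key())

# 2. A fresh data-encryption key (DEK) encrypts the credential...
dek_key = Fernet.generate_key()
ciphertext = Fernet(dek_key).encrypt(b"postgres-password")

# 3. ...and the DEK itself is wrapped by the KEK. The plaintext
# DEK is then discarded rather than written to disk.
wrapped_dek = kek.encrypt(dek_key)

# To use the credential later: unwrap the DEK, then decrypt.
recovered_dek = Fernet(kek.decrypt(wrapped_dek))
password = recovered_dek.decrypt(ciphertext)
```

The practical consequence: compromising stored data yields only ciphertext and a wrapped key, neither usable without access to the KMS-held KEK.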
## Setting Up a Postgres Connection in ModelMonkey
- Open the ModelMonkey sidebar in Google Sheets
- Navigate to Data Sources and select Add Connection
- Choose PostgreSQL as the provider type
- Enter your host, port, database name, and schema
- Provide credentials; these are encrypted and never stored in plaintext
- Click Test Connection to verify access
- Once connected, run queries directly: "Pull last 90 days of orders grouped by product category"
ModelMonkey will translate that to SQL, run it against your Postgres instance via DuckDB, and write the results to your active sheet. You can re-run it anytime without re-entering credentials.
For teams already using DuckDB for Sheets analysis, the PostgreSQL connector fits into the same workflow: same duckdb_query tool, same output format, just a different data source.