The old mybusiness.googleapis.com API went dark in April 2022. Any setup guide referencing it is hitting deprecated endpoints that return nothing.
Two Paths: Sheets-Native vs. a Google My Business PostgreSQL Integration
Most FP&A teams don't need a PostgreSQL pipeline. They need GMB data in a model. Those are different things.
| | Sheets-native | PostgreSQL pipeline |
|---|---|---|
| Best for | 5–20 locations | 20+ locations |
| Who builds it | FP&A analyst | Data engineer |
| Setup time | 1–2 hours | 2–4 weeks |
| Revenue joins | SUMIFS across tabs | SQL at query time |
| Row volume (18 months, 6 metrics) | ~65K rows at 20 locations (long format) | ~27K wide / ~160K long at 50 locations |
| Retroactive adjustment risk | ±3–5% on recent weeks | Same — pull after T+7 |
Under 20 locations, the Sheets-native path gets you to the same analytical output in an afternoon. Over 20, the row volume and join complexity make the PostgreSQL ticket worth filing with your data team.
The Sheets-Native Path for FP&A Analysts
The Google Business Profile Performance API accepts OAuth tokens from Apps Script directly — no service account, no backend. Enable the API on the Cloud project attached to your script, then add the required scope, https://www.googleapis.com/auth/business.manage, to your script's appsscript.json, and ScriptApp.getOAuthToken() handles the rest.
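In appsscript.json that looks like the manifest below. One catch: listing oauthScopes explicitly turns off Apps Script's automatic scope detection, so the UrlFetchApp and Sheets scopes have to be listed too. The timeZone value is just a placeholder; use your own.

```json
{
  "timeZone": "America/New_York",
  "oauthScopes": [
    "https://www.googleapis.com/auth/business.manage",
    "https://www.googleapis.com/auth/script.external_request",
    "https://www.googleapis.com/auth/spreadsheets.currentonly"
  ]
}
```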
Here's a minimal script that pulls 90 days of direction requests and call clicks for one location into a sheet called GMB_Raw:
function pullGMBMetrics() {
  // Find your location ID in the GBP console URL
  const LOCATION_ID = 'locations/12345678901234567';
  const SHEET = SpreadsheetApp.getActiveSpreadsheet()
    .getSheetByName('GMB_Raw');
  const end = new Date();
  const start = new Date();
  start.setDate(end.getDate() - 90);
  // fetchMultiDailyMetricsTimeSeries is a GET; the metric list and
  // date range go in the query string, not a POST body.
  const metrics = [
    'BUSINESS_DIRECTION_REQUESTS',  // note the BUSINESS_ prefix
    'CALL_CLICKS',
    'WEBSITE_CLICKS',
    'BUSINESS_IMPRESSIONS_DESKTOP_MAPS'
  ];
  const params = metrics
    .map(m => 'dailyMetrics=' + m)
    .concat([
      'dailyRange.start_date.year=' + start.getFullYear(),
      'dailyRange.start_date.month=' + (start.getMonth() + 1),
      'dailyRange.start_date.day=' + start.getDate(),
      'dailyRange.end_date.year=' + end.getFullYear(),
      'dailyRange.end_date.month=' + (end.getMonth() + 1),
      'dailyRange.end_date.day=' + end.getDate()
    ])
    .join('&');
  const resp = UrlFetchApp.fetch(
    'https://businessprofileperformance.googleapis.com/v1/' +
      LOCATION_ID + ':fetchMultiDailyMetricsTimeSeries?' + params,
    { headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() } }
  );
  // Parse resp.getContentText() and write rows to SHEET
  // Columns: [date, location_id, direction_requests, call_clicks, website_clicks, impressions]
}
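Writing the parsed rows out is the fiddly part. Here's a minimal sketch of the flattening step; the response shape assumed here (one dailyMetricTimeSeries entry per metric, nested under multiDailyMetricTimeSeries) is my reading of the Performance API reference, so verify it against a live response before relying on it:

```javascript
// Flatten the assumed API response into one object per date, with
// metrics as keys. flattenTimeSeries is a hypothetical helper name.
function flattenTimeSeries(locationId, response) {
  const byDate = {};
  const series = (response.multiDailyMetricTimeSeries || [])
    .flatMap(group => group.dailyMetricTimeSeries || []);
  series.forEach(s => {
    ((s.timeSeries && s.timeSeries.datedValues) || []).forEach(dv => {
      const d = dv.date;
      const key = d.year + '-' +
        String(d.month).padStart(2, '0') + '-' +
        String(d.day).padStart(2, '0');
      if (!byDate[key]) byDate[key] = { date: key, location: locationId };
      // Values arrive as strings; days with no activity omit `value`.
      byDate[key][s.dailyMetric] = Number(dv.value || 0);
    });
  });
  return Object.values(byDate);
}
```

In Apps Script you'd then map each object to an array in GMB_Raw column order and write the batch with SHEET.getRange(row, 1, data.length, data[0].length).setValues(data).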
Once GMB_Raw has date, location, and metric columns, your model can reference it with standard multi-tab SUMIFS. This formula pulls direction requests for a specific store between the date range set in your Assumptions tab:
=SUMIFS('GMB_Raw'!C:C,
'GMB_Raw'!A:A,">="&Assumptions!$B$3,
'GMB_Raw'!A:A,"<="&Assumptions!$B$4,
'GMB_Raw'!B:B,Revenue!$A4)
Same pattern as any cross-tab lookup. Assumptions!$B$3 and $B$4 are your period start and end; Revenue!$A4 is the store identifier that ties the rows together.
Apps Script has a 6-minute execution timeout, and that, not the API's rate limit of 3,600 requests per minute (as of April 2026), is the binding constraint. Budget a few seconds per location for the fetch and sheet write, and roughly 50 locations fit in one nightly run before the script is killed. Set a time-driven trigger for 02:00 local time; each run pulls a rolling 90-day window.
When You Actually Need a Google My Business PostgreSQL Integration
Two things push you to PostgreSQL: location count and join complexity.
At 50 locations with 6 metric types over 18 months (~540 days), you're looking at roughly 160,000 rows in long format (50 × 540 × 6) or about 27,000 in wide format (50 × 540). Wide format is what FP&A models want — one row per location per day, metrics as columns, with the primary key on (location_id, report_date) doubling as the index. It queries faster, and the shape maps directly to how you'd lay out a Sheets tab.
The schema to hand your data team:
CREATE TABLE gmb_daily_metrics (
location_id TEXT NOT NULL,
report_date DATE NOT NULL,
impressions_maps INTEGER,
direction_requests INTEGER,
call_clicks INTEGER,
website_clicks INTEGER,
updated_at TIMESTAMPTZ DEFAULT NOW(),
PRIMARY KEY (location_id, report_date)
);
CREATE INDEX ON gmb_daily_metrics (report_date DESC);
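Hand them the load semantics too. Below is a sketch of the nightly upsert, assuming the table above; the literal values are placeholders. Because the primary key covers (location_id, report_date), re-pulling a window simply overwrites prior values, which is also how Google's retroactive adjustments get absorbed:

```sql
-- Idempotent nightly load: re-running a day's pull replaces old values.
INSERT INTO gmb_daily_metrics
  (location_id, report_date, impressions_maps,
   direction_requests, call_clicks, website_clicks)
VALUES
  ('locations/12345678901234567', '2026-03-01', 410, 37, 12, 58)
ON CONFLICT (location_id, report_date) DO UPDATE SET
  impressions_maps   = EXCLUDED.impressions_maps,
  direction_requests = EXCLUDED.direction_requests,
  call_clicks        = EXCLUDED.call_clicks,
  website_clicks     = EXCLUDED.website_clicks,
  updated_at         = NOW();
```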
The pipeline uses a service account rather than the Apps Script OAuth flow. A bare service account can't see your Business Profile, so it either gets invited as a manager on the profile or is granted domain-wide delegation so it can impersonate a user who can. That's the part that takes 2–4 weeks: IT creates the service account, IT approves the OAuth scope in the Google Workspace admin console, and an engineer wires up the nightly cron job.
Joining GMB Data to Revenue in PostgreSQL
Direction requests are a weak leading indicator of revenue — R² values of 0.3–0.5 against same-store revenue are typical, and the relationship is stronger in food service and retail than in professional services. That's not a reason to skip the analysis. It's the baseline you set with stakeholders before presenting a chart showing direction requests up 22% week-over-week. Without the R² context, someone will ask whether that's actually driving sales. With it, you have the honest answer ready.
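If you want that R² straight from the warehouse, PostgreSQL's regr_r2 aggregate computes it in one pass. A sketch, where weekly_gmb_revenue is an assumed view over the weekly location-revenue result set, not something the pipeline creates for you:

```sql
-- R² of weekly net revenue against direction requests, per location.
-- regr_r2 takes the dependent variable (revenue) first.
SELECT
  location_id,
  regr_r2(net_revenue, direction_requests) AS r_squared
FROM weekly_gmb_revenue
GROUP BY location_id
ORDER BY r_squared DESC NULLS LAST;
```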
The SQL that produces the output your model needs. One trap: if revenue_actuals is transaction-level, joining it straight to the daily metrics fans out the rows and inflates SUM(g.direction_requests), so aggregate revenue to one row per store per day first:
WITH daily_rev AS (
  SELECT store_id, transaction_date, SUM(net_revenue) AS net_revenue
  FROM revenue_actuals
  GROUP BY 1, 2
)
SELECT
  g.location_id,
  DATE_TRUNC('week', g.report_date) AS week_start,
  SUM(g.direction_requests) AS direction_requests,
  SUM(r.net_revenue) AS net_revenue
FROM gmb_daily_metrics g
JOIN daily_rev r
  ON g.location_id = r.store_id
  AND g.report_date = r.transaction_date
WHERE g.report_date >= '2025-10-01'
GROUP BY 1, 2
ORDER BY 2 DESC;
That result set drops cleanly into a Sheets tab — one row per location per week, ready for variance analysis against your budget.
What Actually Breaks
Retroactive adjustments. The Google Business Profile Performance API documentation notes that data may be adjusted retroactively for spam and fraud. In practice, ±3–5% corrections on the most recent 7–10 days are common. Pull with a T+7 lag if you need stable numbers in a board pack.
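The T+7 lag is a one-line change to the date window. A sketch, where stableWindow, lagDays, and windowDays are illustrative names, not API fields:

```javascript
// End the pull window `lagDays` before today so the most
// adjustment-prone recent days stay out of the board pack.
function stableWindow(today, lagDays, windowDays) {
  const end = new Date(today.getTime());
  end.setUTCDate(end.getUTCDate() - lagDays);
  const start = new Date(end.getTime());
  start.setUTCDate(start.getUTCDate() - windowDays);
  return { start: start, end: end };
}
```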
The 18-month hard cap. The API won't return data older than 18 months from the current date. There's no backfill path once the window closes — if you're not pulling and storing incrementally from day one, that history is gone.
Apps Script timeout at scale. The 6-minute execution limit breaks the Sheets-native path beyond 50 locations per run. If you're approaching that threshold, split locations across multiple triggered functions or move to the PostgreSQL pipeline.
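The split itself is simple. A sketch of the batching step, assuming each batch gets its own time-driven trigger (say, staggered ten minutes apart); batchLocations is a hypothetical helper name:

```javascript
// Split the location list into per-trigger batches so each
// triggered function stays under the 6-minute execution limit.
function batchLocations(locationIds, batchSize) {
  const batches = [];
  for (let i = 0; i < locationIds.length; i += batchSize) {
    batches.push(locationIds.slice(i, i + batchSize));
  }
  return batches;
}
```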
Bringing GMB Data Into Google Sheets from PostgreSQL
Once the pipeline runs, the practical question for FP&A is how the PostgreSQL data gets back into your model. ModelMonkey connects directly to a PostgreSQL data source, pulls query results into a refreshable Sheets table, and schedules the daily refresh — you write the SQL once and the model updates without anyone running a script. Try it free for 14 days.