Google Sheets Integration

Bridge Town integrates with Google Sheets through a snapshot-based workflow: import your sheet as an immutable CSV snapshot, build and run models against that snapshot, then write results back to the sheet. The live sheet is not accessed during model execution — freshness is controlled by when you call connect_google_sheet.

Before you start, you will need:

  • A Bridge Town account with at least one project
  • Access to the Bridge Town web app at app.bridgetown.builders
  • A Google account with access to the sheet(s) you want to import

Before Claude can import a Google Sheet, you must authorise Bridge Town in the web app. This is a one-time step per Google account.

  1. Open app.bridgetown.builders and navigate to your project.
  2. Go to Data Sources → Connect Google Sheet.
  3. Click Connect Google Account and complete the Google sign-in flow.
  4. Grant Bridge Town the requested permissions (read-only for import; read/write if you plan to export back).

Once authorised, Bridge Town stores an encrypted OAuth refresh token. You will not need to re-authenticate unless you revoke access in your Google account settings.

Ask Claude to connect your sheet, or call connect_google_sheet directly:

{
  "name": "connect_google_sheet",
  "arguments": {
    "sheet_url": "https://docs.google.com/spreadsheets/d/ABC123/edit",
    "project_name": "revenue-model",
    "tab_names": ["Sales", "Costs"]
  }
}

Omit tab_names to import all tabs. Supply schedule_interval_minutes to enable automatic refresh:

{
  "name": "connect_google_sheet",
  "arguments": {
    "sheet_url": "https://docs.google.com/spreadsheets/d/ABC123/edit",
    "project_name": "revenue-model",
    "schedule_interval_minutes": 60
  }
}

The tool reads each tab, converts it to CSV, and uploads it to Bridge Town’s data store as an immutable CSV snapshot.

Common import errors:

| Error | Cause | Fix |
| --- | --- | --- |
| OAuth credentials missing | Web app OAuth not completed | Complete the web-app authorisation steps above |
| Data source not found | Sheet URL or project name incorrect | Check both values |
| Tab not found | Tab name is case-sensitive | Verify the exact tab name in Google Sheets |
| Google API rate limit | Too many requests | Wait 60 seconds and retry |

The connect_google_sheet response includes row counts per tab and the snapshot_taken_at timestamp, which you can use to confirm freshness:

{
  "tabs_captured": ["Sales", "Costs"],
  "snapshot_taken_at": "2026-04-09T17:00:00+00:00",
  "rows_by_tab": { "Sales": 842, "Costs": 156 }
}

To explore the imported data, use query_data with SQL — Google Sheet snapshots are queryable alongside uploaded Parquet files. Each tab is exposed as a DuckDB table named {source_name}_{tab_name} (non-alphanumeric characters in the tab name are replaced with underscores). For example, if your data source is named revenue-model and the tab is Q1 Sales, the table name is revenue_model_Q1_Sales.

SELECT product_line, SUM(revenue) AS total
FROM revenue_model_Sales
GROUP BY product_line
ORDER BY total DESC
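The table-naming rule can be sketched in Python. This helper is hypothetical (it is not part of Bridge Town's API), and replacing every non-alphanumeric character with an underscore is an assumption beyond the documented example:

```python
import re

def snapshot_table_name(source_name: str, tab_name: str) -> str:
    """Derive the DuckDB table name for an imported tab:
    {source_name}_{tab_name}, with non-alphanumeric characters
    replaced by underscores."""
    raw = f"{source_name}_{tab_name}"
    return re.sub(r"[^0-9A-Za-z]", "_", raw)

# e.g. snapshot_table_name("revenue-model", "Q1 Sales") -> "revenue_model_Q1_Sales"
```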

The sources field in the query_data response includes snapshot_taken_at for each Google Sheets source so you can verify freshness. When you need current data, call connect_google_sheet again to refresh the snapshot before querying.
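A freshness check against snapshot_taken_at can be sketched as follows; the helper and the 15-minute threshold are illustrative, not part of the tooling:

```python
from datetime import datetime, timedelta, timezone

def is_stale(snapshot_taken_at: str, max_age_minutes: int = 15) -> bool:
    """True if the snapshot timestamp is older than max_age_minutes."""
    # snapshot_taken_at is ISO 8601, e.g. "2026-04-09T17:00:00+00:00"
    taken = datetime.fromisoformat(snapshot_taken_at)
    return datetime.now(timezone.utc) - taken > timedelta(minutes=max_age_minutes)
```

If the check reports a stale snapshot, call connect_google_sheet again before running query_data.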

Alternatively, generate a model that reads from the snapshot at runtime via read_csv() for more complex transformations.

Once data is imported, use create_file to create a model file, then patch_file or update_file to implement the logic that reads from it:

“Generate a revenue forecast model using the Sales and Costs data sources I just imported.”

Models reference imported data via DuckDB’s read_csv() function pointed at the snapshot path. Each model run mounts the snapshot that existed at run time, ensuring reproducibility.

After a model run completes, write its outputs back to Google Sheets with write_gsheet. Use spec.mode to choose the operation. Target the linked sheet by source_name (the name shown in list_data_sources) — this is the preferred approach because it avoids manual spreadsheet ID copy/paste and works naturally with the linked-sheet workflow.

spec.mode="replace" writes to a targeted A1 notation range. Only cells within the specified range are overwritten — content outside is preserved. Pass cell_range in A1 notation including the sheet name prefix ("Forecast!A1" to start from the top-left of a tab, or "Sheet1!B2:F20" for a sub-region).

{
  "name": "write_gsheet",
  "arguments": {
    "project_name": "revenue-model",
    "spec": {
      "mode": "replace",
      "run_id": "uuid-from-run",
      "output_name": "forecast.json",
      "source_name": "budget",
      "cell_range": "Forecast!A1"
    }
  }
}
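A1 references like those accepted by cell_range can be built from zero-based row and column indices. A minimal sketch (these helper names are illustrative, not part of the tool):

```python
def col_letters(col: int) -> str:
    """Zero-based column index -> spreadsheet column letters (0 -> A, 26 -> AA)."""
    letters = ""
    col += 1
    while col:
        col, rem = divmod(col - 1, 26)
        letters = chr(ord("A") + rem) + letters
    return letters

def a1(sheet: str, row: int, col: int) -> str:
    """Zero-based (row, col) -> a "Sheet!A1"-style reference."""
    return f"{sheet}!{col_letters(col)}{row + 1}"

# a1("Forecast", 0, 0) -> "Forecast!A1"
```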

spec.mode="append" adds rows after existing content without overwriting. Use this to accumulate a history of model runs.

{
  "name": "write_gsheet",
  "arguments": {
    "project_name": "revenue-model",
    "spec": {
      "mode": "append",
      "run_id": "uuid-from-run",
      "output_name": "monthly_actuals.json",
      "source_name": "budget",
      "sheet_name": "Actuals Log"
    }
  }
}

write_gsheet also supports spec.mode="clear" (clear values in a range) and spec.mode="batch" (write explicit values to multiple ranges in one call). All modes require the spreadsheet to already be connected via connect_google_sheet.
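As a sketch of what a batch call might look like, using the fields documented above — the writes array and its range/values shape are assumptions, so check the tool schema before relying on them:

```json
{
  "name": "write_gsheet",
  "arguments": {
    "project_name": "revenue-model",
    "spec": {
      "mode": "batch",
      "source_name": "budget",
      "writes": [
        { "cell_range": "Forecast!A1", "values": [["Month", "Revenue"]] },
        { "cell_range": "Forecast!D1", "values": [["Updated", "2026-04-09"]] }
      ]
    }
  }
}
```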

After writing data, apply an opinionated formatting preset with format_gsheet. This auto-detects the data bounds and applies header styling, borders, and (for variance reports) conditional colouring in a single call. Number formats are preserved by default — pass number_format in preset_params to override:

{
  "name": "format_gsheet",
  "arguments": {
    "project_name": "revenue-model",
    "preset": "financial_table",
    "source_name": "budget",
    "tab_name": "Forecast"
  }
}

Available presets: financial_table (general tables), kpi_dashboard (two-column key/value), variance_report (actuals vs budget with red/green variance), audit_log (append/journal tabs with timestamps and alternating row bands). Call format_gsheet(spec={"kind":"list_presets"}) to see all presets and their parameters.

For fine-grained control, use format_gsheet with the appropriate spec.kind (range_format, borders, freeze_panes, protect_range, validation, or conditional_format).
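For example, a freeze_panes call might look like the following — the parameter names inside spec (tab_name, rows) are assumptions for illustration; consult the tool schema for the exact shape:

```json
{
  "name": "format_gsheet",
  "arguments": {
    "project_name": "revenue-model",
    "source_name": "budget",
    "spec": { "kind": "freeze_panes", "tab_name": "Forecast", "rows": 1 }
  }
}
```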

If you’re in Claude.ai, you can ask Claude to walk you through the entire connection process using the built-in connect_google_sheets skill:

“Help me connect my Google Sheet to Bridge Town and set up a model.”

Claude will check whether OAuth is complete, collect the sheet URL and tab names, run the import, verify the data, and suggest next steps — all in one conversation.

To keep your snapshots current without manual intervention, set a refresh interval when connecting:

"schedule_interval_minutes": 1440

Common values: 60 (hourly), 1440 (daily), 10080 (weekly). To change the schedule on an existing connection, call connect_google_sheet again with the new interval — this updates the stored schedule without re-importing.
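For example, switching an existing connection to a weekly refresh reuses the same sheet URL and project name shown earlier:

```json
{
  "name": "connect_google_sheet",
  "arguments": {
    "sheet_url": "https://docs.google.com/spreadsheets/d/ABC123/edit",
    "project_name": "revenue-model",
    "schedule_interval_minutes": 10080
  }
}
```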

Security

  • Encryption at rest — OAuth refresh tokens are encrypted with AES-256-GCM before storage. The encryption key is held in AWS Secrets Manager, not in the database.
  • No client exposure — refresh tokens are never returned in API responses or MCP tool output. They are decrypted in-memory only, within the MCP server process, for the duration of each API call.
  • Least privilege — Bridge Town requests only the Google OAuth scopes required for the operations you perform (read-only for import; read/write for export).
  • Revocation — To disconnect a sheet, delete the data source in the web app under Data Sources. This removes the stored credentials immediately.
| Guide / Tool | Description |
| --- | --- |
| Multi-Model Pipelines | Chain models with PIPELINE and /upstream transport — Google Sheets are for external I/O at the edges, not intra-run transport |
| connect_google_sheet | Import a sheet as a CSV snapshot |
| list_data_sources | Discover linked sheets and their source_name for write-back |
| write_gsheet | Unified write tool (replace, append, clear, batch) |
| modify_gsheet_structure | Unified structure tool (ensure tab, insert/delete rows) |
| format_gsheet | Unified formatting tool (presets, borders, validation, etc.) |
| format_gsheet(spec={"kind":"list_presets"}) | Discover available formatting presets |
| query_data | Query snapshots (uploaded files and linked sheets) with SQL |
| create_file | Create a model file in the project |
| patch_file | Apply targeted model edits from instructions |
| update_file | Replace model source with full updated code |