This directory contains the data artifacts and infrastructure setup scripts for the MCP support for BigQuery & Google Maps demo.
This scenario demonstrates an AI Agent's ability to orchestrate enterprise data (BigQuery) and real-world geospatial context (Google Maps) to solve a complex business problem:
"How would you help a friend launch a new high-end sourdough bakery in Los Angeles?"
The agent autonomously queries BigQuery to find macro trends and uses Google Maps to validate micro-location details. The demo relies on three key datasets:
- Demographics: To identify neighborhoods with high foot traffic using census data (Macro Discovery).
- Market Data: To analyze competitor pricing and suggest a premium price point (Pricing Strategy).
- Sales History: To forecast potential revenue based on comparable store trends (Forecasting).
The diagram above illustrates the flow of information in this demo. The Agent, powered by Gemini 3 Pro Preview, orchestrates requests between the user and Google Cloud services. It uses a remote (Google hosted) MCP server to securely access BigQuery for demographic and sales data, and Google Maps APIs for real-world location analysis and validation.
```
launchmybakery/
├── data/                       # Pre-generated CSV files for BigQuery
│   ├── demographics.csv
│   ├── bakery_prices.csv
│   ├── sales_history_weekly.csv
│   └── foot_traffic.csv
├── adk_agent/                  # AI Agent application (ADK)
│   └── mcp_bakery_app/         # App directory
│       ├── agent.py            # Agent definition
│       └── tools.py            # Custom tools for the agent
├── setup/                      # Infrastructure setup scripts
│   ├── setup_bigquery.sh       # Script to provision the BigQuery dataset and tables
│   └── setup_env.sh            # Script to set up environment variables
├── cleanup/                    # Infrastructure cleanup scripts
│   └── cleanup_env.sh          # Script to remove resources from the environment
└── README.md                   # This documentation
```

Prerequisites:

- A Google Cloud project with billing enabled.
- Google Cloud Shell (recommended) or a local terminal with the `gcloud` CLI installed.
Follow these steps in Google Cloud Shell to provision the demo environment.
```
git clone https://github.com/google/mcp.git
cd mcp/examples/launchmybakery
```

Run the following commands to authenticate with your Google Cloud account. This is required for the ADK to access BigQuery.
```
gcloud config set project [YOUR-PROJECT-ID]
gcloud auth application-default login
```

Follow the prompts to complete the authentication process.
Run the environment setup script. This script will:
- Enable necessary Google Cloud APIs (Maps, BigQuery, remote MCP).
- Create a restricted Google Maps Platform API Key.
- Create a `.env` file with the required environment variables.
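For reference, the generated `.env` might look like the sketch below. The variable names and values here are illustrative assumptions, not the script's actual output; `setup_env.sh` is the source of truth for what gets written.

```shell
# Hypothetical .env contents (names and values are illustrative assumptions)
GOOGLE_CLOUD_PROJECT="your-project-id"
GOOGLE_MAPS_API_KEY="your-restricted-maps-api-key"
```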
```
chmod +x setup/setup_env.sh
./setup/setup_env.sh
```

Next, run the BigQuery setup script. This script automates the following:
- Creates a Cloud Storage bucket.
- Uploads the CSV data files.
- Creates the `mcp_bakery` BigQuery dataset.
- Loads the data into BigQuery tables.
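For the curious, these steps map roughly onto standard `gsutil` and `bq` commands like the sketch below. The bucket name is an assumption for illustration, and only one of the four table loads is shown; `setup_bigquery.sh` remains the authoritative implementation.

```shell
# Illustrative sketch of the provisioning steps (not the actual script).
PROJECT_ID="your-project-id"                 # assumption for this sketch
BUCKET="gs://${PROJECT_ID}-bakery-data"      # assumed bucket name

gsutil mb -p "${PROJECT_ID}" "${BUCKET}"     # create the Cloud Storage bucket
gsutil cp data/*.csv "${BUCKET}/"            # upload the CSV files
bq mk --dataset "${PROJECT_ID}:mcp_bakery"   # create the BigQuery dataset
bq load --source_format=CSV --autodetect \
  mcp_bakery.demographics "${BUCKET}/demographics.csv"   # load one table
```

Note that these commands require an authenticated `gcloud` session and billing-enabled project, which is why the script handles them for you.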
```
chmod +x ./setup/setup_bigquery.sh
./setup/setup_bigquery.sh
```

Create a virtual environment, install the ADK, and run the agent.
```
# Create a virtual environment
python3 -m venv .venv
# If the above fails, you may need to install python3-venv:
#   apt update && apt install python3-venv

# Activate the virtual environment
source .venv/bin/activate

# Install the ADK
pip install google-adk

# Navigate to the app directory
cd adk_agent/

# Run the ADK web interface
adk web
```

Open the link provided by `adk web` in your browser. You can now chat with the agent and ask it questions about the bakery data.
Sample Questions:
- "I'm looking to open my fourth bakery location in Los Angeles. I need a neighborhood with early activity. Find the zip code with the highest 'morning' foot traffic score."
- "Can you search for 'Bakeries' in that zip code to see if it's saturated? If there are too many, check for 'Specialty Coffee' shops, so I can position myself near them to capture foot traffic."
- "Okay, and I want to position this as a premium brand. What is the maximum price being charged for a 'Sourdough Loaf' in the LA Metro area?"
- "Now I want a revenue projection for December 2025. Look at my sales history and take data from my best-performing store for the 'Sourdough Loaf'. Run a forecast for December 2025 to estimate the quantity I'll sell. Then calculate the projected total revenue using just under the premium price we found (let's use $18)."
- "That'll cover my rent. Lastly, let's verify logistics. Find the closest 'Restaurant Depot' to the proposed area and make sure the drive time is under 30 minutes for daily restocking."
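As a sanity check on the revenue question above, the agent's final calculation is simple multiplication of forecast quantity by price. The quantity below is an assumed placeholder for illustration, not data from the demo tables; in the demo, the agent derives it from its BigQuery forecast.

```shell
# Back-of-envelope revenue projection (numbers are illustrative assumptions).
FORECAST_QTY=1500   # assumed December unit forecast for the Sourdough Loaf
PRICE=18            # just under the observed premium ceiling (~$18.50)
REVENUE=$((FORECAST_QTY * PRICE))
echo "Projected December revenue: \$${REVENUE}"
```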
To avoid incurring ongoing costs for BigQuery storage or other Google Cloud resources, you can run the cleanup script. This script will delete the BigQuery dataset, the Cloud Storage bucket, and the API keys created during setup. Navigate back to the root directory of the repository and run the following command:
```
chmod +x cleanup/cleanup_env.sh
./cleanup/cleanup_env.sh
```

The data in this repository is synthetic but structured to support specific demo narratives and successful agent reasoning chains.
| Table | Demo Purpose | Narrative Logic |
|---|---|---|
| `foot_traffic` | **Target Discovery**: finding the target neighborhood. | Morning activity spikes uniquely in 90403, allowing the Agent to pinpoint it as the optimal location for a morning-focused business like a bakery. |
| `demographics` | **Community Profiling**: analyzing market depth. | Santa Monica (90403) is modeled with a dense, established residential population, providing a stable baseline for customer volume. |
| `bakery_prices` | **Pricing Strategy**: setting a price point. | Erewhon Market has the highest price ceiling for a Sourdough Loaf (~$18.50), while the market average is ~$8.20. This allows the Agent to confidently suggest a premium price point of ~$15-18. |
| `sales_history` | **Forecasting**: predicting growth. | Silver Lake shows aggressive week-over-week growth trends, while Playa Vista represents a stable, high-volume flagship store, providing distinct patterns for forecasting models. |
