GoogleBigQueryTools enables Agno agents to perform petabyte-scale data analysis, execute complex SQL, and even run machine learning models directly within BigQuery, Google Cloud's data warehouse.
Prerequisites

- Set the following environment variables for your Google Cloud project:

  ```shell
  export GOOGLE_CLOUD_PROJECT="your-project-id"
  export GOOGLE_CLOUD_LOCATION="your-location"
  ```

- Instruct the agent to prepend the table name with the project ID and dataset name.
- Describe the table schemas in the instructions and use thinking tools for better responses.
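The table-naming convention above can be sketched with a small helper (hypothetical, for illustration only; it is not part of GoogleBigQueryTools). BigQuery expects fully-qualified references of the form `project_id.dataset_name.table_name`, optionally quoted with backticks:

```python
def qualify_table(project_id: str, dataset: str, table: str) -> str:
    """Build a fully-qualified BigQuery table reference.

    Hypothetical helper for illustration: BigQuery resolves tables
    as project_id.dataset_name.table_name, and backticks quote the
    identifier so hyphens in the project ID are handled safely.
    """
    return f"`{project_id}.{dataset}.{table}`"


# The agent should rewrite a bare table name like "users" into the
# fully-qualified form before calling the run_sql tool.
print(qualify_table("your-project-id", "test_dataset", "users"))
```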
```python
from agno.agent import Agent
from agno.models.google import Gemini
from agno.tools.google_bigquery import GoogleBigQueryTools

# ---------------------------------------------------------------------------
# Create Agent
# ---------------------------------------------------------------------------
agent = Agent(
    instructions=[
        "You are an expert BigQuery SQL writer.",
        "Always prepend the table name with your_project_id.your_dataset_name when the run_sql tool is invoked.",
    ],
    tools=[GoogleBigQueryTools(dataset="test_dataset")],
    model=Gemini(id="gemini-3-flash-preview", vertexai=True),
)

# ---------------------------------------------------------------------------
# Run Agent
# ---------------------------------------------------------------------------
if __name__ == "__main__":
    agent.print_response(
        "List the tables in the dataset. Tell me about the contents of one of the tables.",
        markdown=True,
    )
```
Run the Example

```shell
# Clone and set up the repo
git clone https://github.com/agno-agi/agno.git
cd agno/cookbook/91_tools

# Create and activate a virtual environment
./scripts/demo_setup.sh
source .venvs/demo/bin/activate

# Run the example
python google_bigquery_tools.py
```
For details, see the Google BigQuery cookbook.