# User Profile Store

The User Profile Store captures structured fields about users: name, preferred name, and custom fields you define.
| Aspect | Value |
|---|---|
| Scope | Per user |
| Persistence | Forever (updated as new info is learned) |
| Default mode | Always |
| Supported modes | Always, Agentic |
## Basic Usage
```python
from agno.agent import Agent
from agno.db.postgres import PostgresDb
from agno.learn import LearningMachine
from agno.models.openai import OpenAIResponses

agent = Agent(
    model=OpenAIResponses(id="gpt-5.2"),
    db=PostgresDb(db_url="postgresql+psycopg://ai:ai@localhost:5532/ai"),
    learning=LearningMachine(user_profile=True),
)

# Session 1: Share information
agent.print_response(
    "Hi! I'm Alice Chen, but please call me Ali.",
    user_id="alice@example.com",
    session_id="session_1",
)

# Session 2: Profile is recalled automatically
agent.print_response(
    "What's my name?",
    user_id="alice@example.com",
    session_id="session_2",
)
```
## Always Mode

Extraction happens automatically after each response; no tools are exposed to the agent.
```python
from agno.learn import LearningMachine, LearningMode, UserProfileConfig

# Assumes `Agent`, `OpenAIResponses`, and `db` from the Basic Usage example.
agent = Agent(
    model=OpenAIResponses(id="gpt-5.2"),
    db=db,
    learning=LearningMachine(
        user_profile=UserProfileConfig(mode=LearningMode.ALWAYS),
    ),
)
```
Tradeoff: extra LLM call per interaction.
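Conceptually, Always mode runs an extraction pass over each exchange and merges the result into the stored profile. A minimal sketch of that loop, with illustrative names only (none of these functions are Agno internals, and a real model call replaces the string matching):

```python
from dataclasses import dataclass, replace
from typing import Optional


@dataclass(frozen=True)
class Profile:
    name: Optional[str] = None
    preferred_name: Optional[str] = None


def extract_updates(message: str) -> dict:
    # Stand-in for the extra LLM call: in Always mode, a model reads the
    # latest exchange and returns any profile fields it can identify.
    updates = {}
    if "call me" in message:
        updates["preferred_name"] = message.split("call me")[-1].strip(" .!")
    return updates


def after_response(profile: Profile, user_message: str) -> Profile:
    # Merge newly extracted fields into the stored profile (update in place).
    updates = extract_updates(user_message)
    return replace(profile, **updates) if updates else profile


profile = Profile(name="Alice Chen")
profile = after_response(profile, "Hi! I'm Alice Chen, but please call me Ali.")
```

The per-interaction cost comes from `extract_updates`: it runs after every response whether or not the message contains profile information.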
## Agentic Mode

The agent receives an `update_user_profile` tool and decides when to update the profile.
```python
from agno.learn import LearningMachine, LearningMode, UserProfileConfig

# Assumes `Agent`, `OpenAIResponses`, and `db` from the Basic Usage example.
agent = Agent(
    model=OpenAIResponses(id="gpt-5.2"),
    db=db,
    learning=LearningMachine(
        user_profile=UserProfileConfig(mode=LearningMode.AGENTIC),
    ),
)

agent.print_response(
    "Please remember that my name is Bob Smith.",
    user_id="bob@example.com",
)
```
Tradeoff: the agent may miss implicit profile information.
## Default Fields

| Field | Description |
|---|---|
| `name` | Full name |
| `preferred_name` | Name the user prefers to be called |
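The base schema behaves like a small dataclass holding these two fields. A rough sketch for orientation (illustrative only; the real class is `agno.learn.schemas.UserProfile` and may differ):

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class UserProfileSketch:
    # Mirrors the default fields above.
    name: Optional[str] = field(
        default=None, metadata={"description": "Full name"}
    )
    preferred_name: Optional[str] = field(
        default=None,
        metadata={"description": "Name the user prefers to be called"},
    )


p = UserProfileSketch(name="Alice Chen", preferred_name="Ali")
```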
## Custom Schemas
Extend the base schema for your domain:
```python
from dataclasses import dataclass, field
from typing import Optional

from agno.learn.schemas import UserProfile

@dataclass
class CustomerProfile(UserProfile):
    company: Optional[str] = field(
        default=None,
        metadata={"description": "Company or organization"},
    )
    plan_tier: Optional[str] = field(
        default=None,
        metadata={"description": "Subscription tier: free | pro | enterprise"},
    )
    role: Optional[str] = field(
        default=None,
        metadata={"description": "Job title or role"},
    )
    timezone: Optional[str] = field(
        default=None,
        metadata={"description": "User's timezone"},
    )

# Assumes `Agent`, `OpenAIResponses`, `LearningMachine`, `UserProfileConfig`,
# and `db` from the earlier examples.
agent = Agent(
    model=OpenAIResponses(id="gpt-5.2"),
    db=db,
    learning=LearningMachine(
        user_profile=UserProfileConfig(schema=CustomerProfile),
    ),
)
```
The `metadata["description"]` entry tells the LLM what each field captures.
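Those descriptions are readable at runtime through the standard-library `dataclasses.fields` helper, which is presumably how they reach the extraction prompt. A small self-contained demonstration (the class here is an abbreviated copy for illustration, not imported from Agno):

```python
from dataclasses import dataclass, field, fields
from typing import Optional


@dataclass
class CustomerProfileDemo:
    company: Optional[str] = field(
        default=None, metadata={"description": "Company or organization"}
    )
    timezone: Optional[str] = field(
        default=None, metadata={"description": "User's timezone"}
    )


# Collect each field's description from its metadata mapping.
descriptions = {f.name: f.metadata["description"] for f in fields(CustomerProfileDemo)}
```

Anything placed in `metadata` is preserved verbatim on the `Field` object, so arbitrary hints can ride along with each field definition.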
## Accessing Profile Data
```python
lm = agent.get_learning_machine()

# Get the stored profile
profile = lm.user_profile_store.get(user_id="alice@example.com")
print(profile.name)
print(profile.preferred_name)

# Debug output
lm.user_profile_store.print(user_id="alice@example.com")
```
## Context Injection
Profiles are automatically injected into the system prompt:
```text
<user_profile>
Name: Alice Chen
Preferred Name: Ali
Company: Acme Corp
Role: Data Scientist
</user_profile>
```
No manual context building needed.
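The injected block can be thought of as a rendering of the non-empty profile fields. A hedged sketch of that formatting, assuming field names map to title-cased labels (the exact template Agno uses may differ):

```python
def render_user_profile(profile: dict) -> str:
    # Render non-empty fields as "Title Case Key: value" lines inside a
    # <user_profile> tag, matching the example block above.
    lines = [
        f"{key.replace('_', ' ').title()}: {value}"
        for key, value in profile.items()
        if value is not None
    ]
    return "<user_profile>\n" + "\n".join(lines) + "\n</user_profile>"


block = render_user_profile(
    {"name": "Alice Chen", "preferred_name": "Ali", "company": None}
)
```

Note that fields with no value (here `company`) are simply omitted rather than rendered as empty lines.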
## User Profile vs User Memory
| User Profile | User Memory |
|---|---|
| Structured fields | Unstructured text |
| Fixed schema | Flexible observations |
| Updated in place | Appended over time |
| Exact recall | Semantic search |
Use User Profile for: name, company, role, preferences with defined values.
Use User Memory for: observations like “prefers detailed explanations” or “works on ML projects.”