Base class for interacting with providers that implement the Open Responses API specification. This provides a foundation for multi-provider, interoperable LLM interfaces based on the OpenAI Responses API. Providers that implement this spec include Ollama (v0.13.3+) and OpenRouter.
## Key Differences from OpenAI Responses
- Configurable `base_url` for pointing to different API endpoints
- Stateless by default (no `previous_response_id` chaining)
- Flexible `api_key` handling for providers that don't require authentication
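The differences above can be sketched with a minimal, hypothetical client wrapper (names and structure are illustrative only, not the actual implementation): the same interface points at different endpoints via `base_url`, and an `Authorization` header is only sent when a real key is configured.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class OpenResponsesClient:
    """Illustrative sketch of a stateless, endpoint-configurable client."""

    base_url: str
    api_key: Optional[str] = None  # some providers (e.g. local Ollama) need no key

    def request_headers(self) -> dict:
        # Only attach an Authorization header when a usable key is set.
        headers = {"Content-Type": "application/json"}
        if self.api_key and self.api_key != "not-provided":
            headers["Authorization"] = f"Bearer {self.api_key}"
        return headers


# The same interface pointed at two different providers:
ollama = OpenResponsesClient(base_url="http://localhost:11434/v1")
openrouter = OpenResponsesClient(
    base_url="https://openrouter.ai/api/v1", api_key="sk-example"
)
```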
## Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `id` | `str` | `"not-provided"` | The ID of the model to use |
| `name` | `str` | `"OpenResponses"` | The name of the model |
| `provider` | `str` | `"OpenResponses"` | The provider of the model |
| `api_key` | `Optional[str]` | `"not-provided"` | The API key for authentication |
| `store` | `Optional[bool]` | `False` | Whether to store responses (disabled by default for compatibility across providers) |
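The defaults in the table above can be mirrored as a small dataclass. This is a hypothetical sketch for illustration, not the actual class definition:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class OpenResponsesParams:
    """Illustrative mirror of the documented parameter defaults."""

    id: str = "not-provided"
    name: str = "OpenResponses"
    provider: str = "OpenResponses"
    api_key: Optional[str] = "not-provided"
    store: Optional[bool] = False


# Typically only the model id needs to be overridden:
params = OpenResponsesParams(id="llama3.2")
```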
## Usage

For most use cases, prefer the provider-specific classes:

- `OllamaResponses` for Ollama
- `OpenRouterResponses` for OpenRouter