Chat streaming, provider settings, global conversations, and page agent management.
Stream an AI response. This is the primary AI endpoint.
Body:
```json
{
  "pageId": "string",
  "message": "string",
  "provider": "string (optional)",
  "model": "string (optional)"
}
```
Response: A server-sent events (SSE) stream carrying AI response chunks, tool calls, and tool results.
Auth: Edit permission on the page.
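A minimal client-side sketch of consuming this stream is shown below. The `/api/ai/chat` path and the exact shape of each `data:` line are assumptions, not part of the documented contract; substitute the actual route and event types for your deployment.

```ts
// Sketch: POST the documented body, then read the SSE stream chunk by chunk.
async function streamChat(pageId: string, message: string): Promise<void> {
  const res = await fetch("/api/ai/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ pageId, message }),
  });
  if (!res.ok || !res.body) throw new Error(`Chat request failed: ${res.status}`);

  const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();
  let buffer = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    // SSE events are separated by a blank line; payload lines start with "data: ".
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? "";
    for (const event of events) {
      for (const line of event.split("\n")) {
        if (line.startsWith("data: ")) {
          // Each payload may be a text chunk, a tool call, or a tool result.
          console.log("chunk:", line.slice(6));
        }
      }
    }
  }
}
```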
Check provider configuration status for the current user.
Update page-specific AI settings (provider, model).
Body:
```json
{
  "pageId": "string",
  "aiProvider": "string",
  "aiModel": "string"
}
```
Auth: Edit permission on the page.
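A hedged example of calling this endpoint from TypeScript; the route and HTTP method are assumptions, while the body fields match the documented shape.

```ts
// Hypothetical helper for updating a page's AI settings.
async function updatePageAISettings(pageId: string, aiProvider: string, aiModel: string) {
  const res = await fetch("/api/ai/page-settings", {
    method: "PATCH", // assumed; use the actual route and method for your deployment
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ pageId, aiProvider, aiModel }),
  });
  if (!res.ok) throw new Error(`Failed to update AI settings: ${res.status}`);
  return res.json();
}
```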
Load chat messages for a page in chronological order, including tool calls and results.
Query params: pageId
Auth: View permission on the page.
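For illustration, a possible fetch call; only the pageId query parameter is documented, and the route and response shape are assumptions.

```ts
// Hypothetical fetch of a page's chat history.
async function loadChatMessages(pageId: string) {
  const res = await fetch(`/api/ai/messages?pageId=${encodeURIComponent(pageId)}`);
  if (!res.ok) throw new Error(`Failed to load messages: ${res.status}`);
  // Assumed to return messages in chronological order, including tool calls/results.
  return res.json();
}
```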
Check AI provider configuration status.
Save API key for a provider.
Body:
```json
{
  "provider": "openrouter | google | openai | anthropic | xai",
  "apiKey": "string",
  "baseUrl": "string (optional)"
}
```
API keys are encrypted before storage.
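A sketch of saving a key from a TypeScript client. The provider union and body fields come from the documented request body; the route is an assumption.

```ts
type Provider = "openrouter" | "google" | "openai" | "anthropic" | "xai";

// Send the documented body; the key is encrypted server-side before storage.
async function saveProviderKey(provider: Provider, apiKey: string, baseUrl?: string) {
  const res = await fetch("/api/ai/providers", {
    method: "POST", // assumed route and method
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ provider, apiKey, ...(baseUrl ? { baseUrl } : {}) }),
  });
  if (!res.ok) throw new Error(`Failed to save API key: ${res.status}`);
}
```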
Update current provider/model selection.
Remove API key for a provider.
List the user's global AI conversations.
Create a new global AI conversation.
Get a specific global conversation.
Update conversation metadata.
Delete a global conversation.
List messages in a global conversation.
Send a message in a global conversation (streams AI response).
Delete a specific message.
Get AI usage statistics for a conversation.
Get the most recent active global conversation.
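The sketch below strings two of these endpoints together: create a global conversation, then send the first message and receive the streamed response. Every path and response shape here is an assumption; only the fact that the message endpoint streams the AI response is documented.

```ts
// Illustrative flow: create a global conversation, then send a message to it.
async function startGlobalConversation(firstMessage: string) {
  const created = await fetch("/api/ai/conversations", { method: "POST" });
  const { id } = await created.json(); // assumed response shape: { id: string, ... }

  // Sending a message returns a server-sent events stream with the AI response.
  const stream = await fetch(`/api/ai/conversations/${id}/messages`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: firstMessage }), // field name assumed
  });
  return stream.body; // consume as shown in the chat streaming example above
}
```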
Create a new page-based AI agent.
Get agent configuration.
Update agent configuration.
List conversations for a page agent.
Create a new conversation with a page agent.
Consult a page agent (agent-to-agent communication).
Body:
```json
{
  "agentId": "string",
  "question": "string",
  "context": "string (optional)"
}
```
List all page agents across multiple drives.
List available Ollama models for local AI processing.