# AI Mode
AI Mode is the AI layer of an Aksara application. It powers the Studio AI Console and AI Flows, exports your app as MCP-compatible tools, and adds analysis surfaces such as Schema Doctor, AI Debugger, Architecture Review, and Performance Analyzer.
## Start Here
Use this section in the same order you would adopt the features in a real project:
| Goal | Read First | Then Continue With |
|---|---|---|
| Explore your app in Studio | Interactive Console | AI Flows, AI Debugger, Architecture Review, Performance Analyzer |
| Connect an external AI agent | MCP Integration | Tools, Providers, AI Connectors |
| Automate larger tasks | Agent Mode | Agent Workflows, Planner, Agent Runtime |
| Generate or refactor code safely | CodeGen | Patch Engine, Safety |
| Inspect schema and query health | Schema Doctor | Project Graph, Query Engine |
## Studio vs AI Mode Docs
The documentation is split by responsibility.
Read Studio when you need the built-in web UI, Studio endpoints, or Studio configuration. Read AI Mode when you need the AI surfaces that live inside Studio or when you want to connect external agents and model providers.
## Five-Minute Tour
Start your app with your usual development server command, then try the three entry points that matter most:
- Open `http://127.0.0.1:8000/studio/ui` and use the AI Console.
- Fetch the MCP tool catalog from `http://127.0.0.1:8000/ai/tools/mcp`.
- Run `aksara doctor fix-plan` to see the remediation workflow Aksara can generate from live diagnostics.
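The tool catalog in the second step is JSON. A minimal sketch of reading it, assuming a payload shape modeled on the usual MCP convention of `name` / `description` / `inputSchema` per tool (the exact schema Aksara emits may differ):

```python
import json

# Hypothetical excerpt of the catalog served at /ai/tools/mcp; the payload
# shape here is an assumption, not Aksara's documented output.
catalog = json.loads("""
{"tools": [
  {"name": "list_articles",
   "description": "List Article records",
   "inputSchema": {"type": "object",
                   "properties": {"limit": {"type": "integer"}}}}
]}
""")

# An external agent would typically index the catalog by tool name.
tool_names = [tool["name"] for tool in catalog["tools"]]
```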
## Core Surfaces
| Surface | What It Does | Where to Learn More |
|---|---|---|
| AI Console | Natural-language interface inside Studio for asking questions about models, routes, queries, and migrations | console.md |
| AI Flows | Guided actions for model review, route review, query analysis, migration explanation, and diagnostics | flows.md |
| MCP Tools | Exports your models and routes as MCP-shaped tools at `/ai/tools/mcp` for external agents | mcp.md |
| Schema Doctor | Finds schema health problems and pairs them with actionable remediation output | schema-doctor.md |
| AI Debugger | Explains failures and points at likely root causes | debugger.md |
| Architecture Review | Reviews coupling, structure, and design pressure across the codebase | architecture-review.md |
| Performance Analyzer | Surfaces slow queries, missing indexes, and common ORM performance traps | performance-analyzer.md |
## Provider Configuration
AI Mode is provider-agnostic. Configure the provider and the model name that make sense for your environment.
```python
AKSARA = {
    "AI_PROVIDER": "anthropic",  # or "openai", "ollama", "custom_http"
    "AI_MODEL": "<provider-model-name>",
}
```
> **API key required**
>
> AI features that call an external LLM need a provider API key in the environment
> (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`). Ollama runs locally and needs no key.
> If no key is set, Aksara still works (Studio, MCP exports, and prompt packs are
> fully functional), but the AI Console and AI Flows cannot execute prompts.
See Connectors for the full list of environment variables.
See Providers, Bring Your Own LLM, and Ollama for concrete setups.
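The key-presence rule above can be sketched as a small helper. This is illustrative only: the mapping and function name are assumptions, not part of Aksara's API, and the authoritative variable list lives on the Connectors page.

```python
import os

# Provider-to-key mapping implied by the note above (illustrative; see the
# Connectors documentation for the real list of environment variables).
PROVIDER_ENV_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "ollama": None,  # runs locally, no key needed
}

def ai_prompts_enabled(provider, env=None):
    """True if the AI Console / AI Flows could execute prompts."""
    env = os.environ if env is None else env
    key_name = PROVIDER_ENV_KEYS.get(provider)
    if key_name is None:
        # Ollama needs no key; unknown providers are treated as disabled.
        return provider in PROVIDER_ENV_KEYS
    return bool(env.get(key_name))
```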
## Why Metadata Matters
AI Mode works best when the rest of your Aksara app is well described. Field metadata such as `ai_description`, `ai_sensitive`, and `ai_agent_writable` influences what the AI Console sees, what MCP exports expose, and what agent-driven write paths are allowed to change.
If you are new to those flags, start with Fields before you wire external agents into a production app.
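To make the effect of these flags concrete, here is a stand-in model of the behavior described above. The dataclass and filtering function are illustrative assumptions; Aksara's real field API may attach this metadata differently (see Fields for the actual syntax).

```python
from dataclasses import dataclass

# Illustrative stand-in for the field flags named above; not Aksara's real API.
@dataclass
class FieldMeta:
    ai_description: str = ""
    ai_sensitive: bool = False       # hide from AI surfaces and MCP exports
    ai_agent_writable: bool = False  # allow agent-driven write paths

def mcp_visible(fields):
    """Field names an MCP export would surface, per the rule described above."""
    return [name for name, meta in fields.items() if not meta.ai_sensitive]

fields = {
    "title": FieldMeta(ai_description="Article headline", ai_agent_writable=True),
    "tax_id": FieldMeta(ai_description="Taxpayer ID", ai_sensitive=True),
}
```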
## Next Reads
- Interactive Console
- MCP Integration
- Providers
- Schema Doctor
- AI Debugger
- Architecture Review
- Performance Analyzer
See Studio UI for the visual interface.
## Related Documentation
- Tools — AI-callable functions
- Context Engine — Context gathering
- Query Engine — Natural language queries
- Codegen — Code generation
- Patch Engine — Code modifications
- Planner — Task planning
- Agent Runtime — Agent execution
- Schema Doctor — Schema analysis
- Configuration — AI Mode settings
- Safety — Safety features
- Studio AI Helpers — Studio UI integration