Engines, providers, and integrations
This page is the compact reference: a quick way to check what Sentinel supports without digging through the longer runtime and configuration pages.
If you want the mechanics behind any of this, the deeper explanations live in Engines and runtime and in Configuration.
Engines
Sentinel currently supports four chat engines:
| Engine | Meaning |
|---|---|
| `sentinel` | Built-in harness and built-in engine path |
| `codex` | Local Codex runtime carried inside Sentinel thread state |
| `claude` | Local Claude Code runtime carried inside Sentinel thread state |
| `copilot` | Local GitHub Copilot runtime carried inside Sentinel thread state |
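As a rough sketch of what choosing one of these engines could look like, assuming a JSON configuration file with a hypothetical `engine` key (the real key name and file location are covered in Configuration, not here):

```json
{
  "engine": "claude"
}
```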
Model providers
Current AI provider support includes OpenAI, Anthropic, Google AI Studio, Google Vertex AI, xAI, Azure OpenAI, Amazon Bedrock, Groq, Cohere, Moonshot AI, Mistral, Ollama, OpenRouter, and Vercel AI Gateway.
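Model providers typically need credentials before they can be used. The fragment below is an illustrative sketch only: the `providers` and `apiKey` key names are assumptions, and the key values are placeholders, not real credentials. Check Configuration for the actual schema.

```json
{
  "providers": {
    "openai": { "apiKey": "YOUR_OPENAI_KEY" },
    "anthropic": { "apiKey": "YOUR_ANTHROPIC_KEY" }
  }
}
```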
Search providers
Search provider support includes Exa and SearXNG.
Voice transcription providers
Voice transcription provider support includes OpenAI, Groq, and Azure.
Integration providers
Integration support currently covers Gmail, Google Calendar, Google Drive, Airtable, Slack, Notion, GitHub, Linear, PostgreSQL, MySQL, MongoDB, Yahoo Finance, arXiv, and PubMed.
MCP transports
Sentinel supports the `stdio` and `http` MCP transports.
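A `stdio` transport launches a local server process and speaks MCP over its standard input and output, while an `http` transport connects to a server at a URL. As a hedged sketch of how the two might be declared side by side (the `mcpServers`, `transport`, `command`, and `url` key names here are illustrative assumptions, and the command and URL are placeholders):

```json
{
  "mcpServers": {
    "local-tools": {
      "transport": "stdio",
      "command": "my-mcp-server"
    },
    "remote-tools": {
      "transport": "http",
      "url": "https://example.com/mcp"
    }
  }
}
```

See Integrations and MCP for how Sentinel actually wires these up.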
Related pages
For the longer version, see Sentinel engine, Thread state, Codex runtime, Claude Code, GitHub Copilot, and Integrations and MCP.