Sentinel
Reference

Engines, providers, and integrations

This page is the compact reference: a quick way to check what Sentinel supports without digging through the longer runtime and configuration pages.

For the mechanics behind any of this, see Engines and runtime and Configuration.

Engines

Sentinel currently supports four chat engines:

Engine     Meaning
sentinel   Built-in harness and built-in engine path
codex      Local Codex runtime carried inside Sentinel thread state
claude     Local Claude Code runtime carried inside Sentinel thread state
copilot    Local GitHub Copilot runtime carried inside Sentinel thread state
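
As a sketch only, selecting one of these engines in a config file might look like the following. The file shape and the "engine" key are hypothetical illustrations, not Sentinel's documented schema; the engine values themselves come from the table above. See Configuration for the real details.

```json
{
  "engine": "claude"
}
```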

Model providers

Current AI provider support includes OpenAI, Anthropic, Google AI Studio, Google Vertex AI, xAI, Azure OpenAI, Amazon Bedrock, Groq, Cohere, Moonshot AI, Mistral, Ollama, OpenRouter, and Vercel AI Gateway.

Search providers

Search provider support includes Exa and SearXNG.

Voice transcription providers

Voice transcription provider support includes OpenAI, Groq, and Azure.

Integration providers

Integration support currently covers Gmail, Google Calendar, Google Drive, Airtable, Slack, Notion, GitHub, Linear, PostgreSQL, MySQL, MongoDB, Yahoo Finance, arXiv, and PubMed.

MCP transports

Sentinel supports the stdio and http MCP transports.
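
To illustrate the difference between the two transports, here is a minimal sketch in the style common to MCP clients: a stdio server is launched as a local process (command plus args), while an http server is reached at a URL. The "mcpServers" key, server names, and the example package and URL are assumptions borrowed from common MCP client conventions, not Sentinel's documented schema; see Configuration and Integrations and MCP for the actual format.

```json
{
  "mcpServers": {
    "local-files": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "remote-tools": {
      "url": "https://example.com/mcp"
    }
  }
}
```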

For the longer version, see Sentinel engine, Thread state, Codex runtime, Claude Code, GitHub Copilot, and Integrations and MCP.
