Standard MCP server
Built on the Model Context Protocol — the open standard for connecting tools to LLMs. Works with Claude, ChatGPT, Cursor, Continue, and custom agents over JSON-RPC.
Connect Cometly to Claude, ChatGPT, Cursor, and custom AI agents via Model Context Protocol — so your AI workflows operate on real attribution data, not guesses. Now in public beta.
Closed-won pipeline this month is $1.42M, led by Google Ads ($512K). LinkedIn Ads improved the most — +38% MoM on the back of three new outbound-warmed campaigns.
Agents query your real Cometly data — pipeline, CAC, LTV, journeys — through the same metrics layer your dashboards use. Numbers are always current, never a stale export.
MCP currently exposes Cometly's read endpoints — every metric your dashboards and Ask AI can return. Write actions (pause a campaign, sync an audience, save a dashboard from chat) are rolling out as the beta matures.
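Under the hood, an MCP client reaches those read endpoints as `tools/call` requests over JSON-RPC 2.0. A minimal sketch of what such a request looks like — the tool name `get_pipeline_metrics` and its arguments are hypothetical placeholders, not Cometly's actual tool names:

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP `tools/call` request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical example: closed-won pipeline this month, grouped by channel
msg = build_tool_call(
    "get_pipeline_metrics",
    {"period": "this_month", "group_by": "channel"},
)
```

In practice your MCP client (Claude Desktop, Cursor, a LangChain agent) constructs these messages for you; the point is that every client speaks the same wire format, which is why one server covers them all.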
Scoped API keys with per-workspace, per-tool permissions. Agents only see and act on what their key authorizes, and any token can be revoked in one click with a full audit log.
One MCP server, many clients. Plug into Claude Desktop, Cursor, Continue, ChatGPT custom actions, or your own LangChain / LlamaIndex agents — no custom integration work.
Generate an API key in Cometly, drop the MCP config into Claude Desktop or Cursor, done. Most teams have AI agents querying their funnel within an hour.
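For Claude Desktop, "drop the MCP config" means adding an entry to `claude_desktop_config.json`. A sketch of the shape, assuming a stdio launcher — the package name and environment variable below are illustrative placeholders, not Cometly's published values:

```json
{
  "mcpServers": {
    "cometly": {
      "command": "npx",
      "args": ["-y", "example-cometly-mcp-server"],
      "env": {
        "COMETLY_API_KEY": "<your scoped API key>"
      }
    }
  }
}
```

Cursor and other MCP clients accept an equivalent server entry in their own config files, pointing at the same server with the same key.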
Cometly's MCP server speaks the open standard, so any agent or LLM that supports MCP can query your attribution data — no custom integration work required.
Which clients work today, what's queryable vs roadmap, and how keys are scoped. Book a demo and we'll plug into your preferred MCP setup.
Talk to sales
Give Claude, Cursor, and your custom agents direct access to your live attribution data — so every answer is grounded in your real funnel instead of a guess.