
OMNI Chat

OMNI Chat is a public-beta chat surface at secapi.ai/chat. It pairs a large language model with the same MCP-routed tools that power the OMNI Datastream API: filings search, ownership graphs, enforcement events, dilution events, and section-level citations into the SEC corpus.

What it can do

  • Answer questions about specific filings — e.g. “What did Apple disclose about supply chain risk in their latest 10-K?”
  • Compare across companies — e.g. “Show me dilution events for the top three EV manufacturers in the last 12 months.”
  • Surface citations inline — every claim links back to the underlying filing section with a highlighted snippet.
  • Run multi-step research workflows — the model picks tools, fetches data, and summarizes its findings.

Sign in

OMNI Chat is gated behind a WorkOS sign-in. Visit secapi.ai/chat and you’ll be redirected to sign in if you’re not already authenticated. We use a short-lived, single-use ticket to authorize the WebSocket — your session token never appears in URLs.
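The ticket-then-socket handshake can be sketched as follows. This is a minimal illustration, not documented API surface: the /chat/ticket endpoint, the /chat/ws path, and the ticket query parameter are all assumed names.

```typescript
// Build the WebSocket URL from a single-use ticket. The session token
// never appears in a URL; only the short-lived ticket does.
export function chatSocketUrl(base: string, ticket: string): string {
  const url = new URL("/chat/ws", base); // assumed socket path
  url.protocol = url.protocol === "https:" ? "wss:" : "ws:";
  url.searchParams.set("ticket", ticket); // assumed parameter name
  return url.toString();
}

export async function connectChat(): Promise<WebSocket> {
  // 1. Exchange the cookie-based session for a short-lived, single-use ticket.
  //    (/chat/ticket is an assumed endpoint for illustration.)
  const res = await fetch("/chat/ticket", {
    method: "POST",
    credentials: "include",
  });
  const { ticket } = await res.json();
  // 2. Open the socket with the ticket; the server consumes it on first use.
  return new WebSocket(chatSocketUrl("https://secapi.ai", ticket));
}
```

The point of the extra round-trip is that a leaked URL (browser history, logs, Referer) only ever exposes a ticket that is already spent.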

What you’ll see

  • Streaming responses — tokens render as they arrive.
  • Tool call cards — when OMNI calls a Datastream tool (e.g. filings.search), a status card shows the call and expands to reveal the arguments and the result.
  • Citation pills — each citation links to a SEC.gov filing and includes the highlighted snippet that backed the claim.
  • Cost meter — running USD cost for the conversation, with the current model name (Claude Opus 4.6).
  • Stop button — interrupt mid-response. Display halts immediately; the backend may finish its current thought.
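One way to picture how these UI elements map onto the stream: the client folds incoming events into a single view state. The event names and shapes below are assumptions for illustration, not the documented wire protocol.

```typescript
// Assumed event shapes: tokens, tool-call status, citations, running cost.
type ChatEvent =
  | { kind: "token"; text: string }
  | { kind: "tool_call"; tool: string; status: "running" | "done" }
  | { kind: "citation"; url: string; snippet: string }
  | { kind: "cost"; usd: number };

interface ViewState {
  answer: string;                                  // streamed tokens, concatenated
  toolCalls: Record<string, "running" | "done">;   // tool name -> latest status card
  citations: { url: string; snippet: string }[];   // citation pills
  costUsd: number;                                 // cost meter, running total
}

export const emptyState: ViewState = {
  answer: "",
  toolCalls: {},
  citations: [],
  costUsd: 0,
};

// Pure reducer: each streamed event updates exactly one piece of the UI.
export function reduceEvent(state: ViewState, ev: ChatEvent): ViewState {
  switch (ev.kind) {
    case "token":
      return { ...state, answer: state.answer + ev.text };
    case "tool_call":
      return { ...state, toolCalls: { ...state.toolCalls, [ev.tool]: ev.status } };
    case "citation":
      return { ...state, citations: [...state.citations, { url: ev.url, snippet: ev.snippet }] };
    case "cost":
      return { ...state, costUsd: ev.usd };
  }
}
```

Because the reducer is pure, a refresh can replay the same event log to rebuild the conversation view.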

Limits and caveats

  • Beta — gated on the VITE_PUBLIC_FEATURE_CHAT_PUBLIC_BETA flag; a day-30 checkpoint decides whether to flip it fully public.
  • Read-only by default — permissionMode: "safe" is the v1 default; tools that mutate state are not exposed in this surface.
  • Single-session resumption — the URL carries ?sessionId=, so a refresh resumes the active conversation. A multi-session history sidebar ships in a follow-up.
  • Stop is local-only today — the visible stop button halts display immediately, but the backend may continue to bill the in-flight LLM call for ~30 seconds. Backend cancellation is a tracked follow-up.

Pricing

Beta usage is billed against your existing OMNI Datastream quota. The cost meter in the chat header shows a running USD total per conversation; per-org budget caps from the metering middleware (e.g. the Omni-Budget-Used header) still apply.
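A client can watch the metering headers on any Datastream response to avoid tripping a budget cap mid-conversation. Only Omni-Budget-Used is named in these docs; the companion Omni-Budget-Limit header below is an assumed name for illustration.

```typescript
// Compute remaining budget from metering headers, or null if either
// header is absent (e.g. metering disabled for the org).
export function budgetRemaining(headers: Headers): number | null {
  const used = headers.get("Omni-Budget-Used");    // documented header
  const limit = headers.get("Omni-Budget-Limit");  // assumed companion header
  if (used === null || limit === null) return null;
  return Number(limit) - Number(used);
}
```

A chat client might surface a warning in the cost meter once the remaining budget drops below some threshold.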

Reporting issues

File issues in the OMNI repo. For chat-specific problems, mention the sessionId from the URL in your report — it’s how we trace state on the backend.