COMPETITIVE INTELLIGENCE OS

Signal monitoring. AI synthesis. Delivered.

Tesseract Intelligence tracks what competitors ship, price, hire, and announce — then synthesizes it into structured briefings delivered to your inbox, Slack, or Discord. Built as infrastructure, not a dashboard.

5 · Phases shipped
48 · E2E tests green
229 · Unit tests passing
4 · Delivery channels

ARCHITECTURE

Built in 5 phases

Each phase is independently deployable and testable. No big-bang launches — every layer ships with its own E2E coverage before the next one starts.

01 · Foundation
  • Onboarding wizard
  • Competitor landscape config
  • Supabase + RLS
  • Clerk auth bypass for E2E
02 · Signal Pipeline
  • GitHub commit monitoring
  • Changelog RSS parsing
  • Pricing page diffs
  • Hiring signal detection
03 · Push API
  • API key generation (SHA-256)
  • Rate limiting by tier
  • Webhook ingestion
  • Context upsert endpoint
04 · Delivery
  • Resend transactional email
  • Slack Block Kit messages
  • Discord embed chunking
  • Scheduler orchestration
05 · Access Control
  • Stripe checkout + portal
  • Trial gating logic
  • Tier-based feature flags
  • Webhook subscription sync
06 · Dogfood Run

Knox as first user. Real competitors. Real keys. Real briefings. In queue.

SIGNAL PIPELINE

Four signal types, one pipeline

Pull fetchers run on a configurable schedule per competitor. Each fetcher is independently gated — enable only what's relevant to your landscape.

GitHub Activity

Public commit history, release tags, and repository activity. Spot when a competitor ships a feature before they announce it.

1hr in-memory cache · GitHub public API

Changelog RSS

RSS/Atom feed monitoring for product changelog pages. Parses both RSS 2.0 and Atom formats — catches the changelog-as-signal pattern most teams overlook.

RSS 2.0 + Atom · pub date sorting

Pricing Diffs

SHA-256 hash of the full pricing page HTML, stored in Postgres. Any change — even a minor copy edit — surfaces as a signal for review.

SHA-256 · pricing_snapshots table
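The hash-and-compare step described above can be sketched in a few lines. This is a minimal illustration, not the actual implementation; the function names and the shape of the stored snapshot are assumptions.

```typescript
import { createHash } from "node:crypto";

// Hash the full pricing page HTML; any byte-level change yields a new hash,
// which is why even a minor copy edit surfaces as a signal.
function hashPricingPage(html: string): string {
  return createHash("sha256").update(html).digest("hex");
}

// Compare against the last stored snapshot hash (hypothetically read from the
// pricing_snapshots table); a mismatch becomes a signal for review.
function pricingChanged(html: string, lastHash: string | null): boolean {
  return lastHash !== null && hashPricingPage(html) !== lastHash;
}
```

The first crawl stores a baseline hash and produces no signal; every later crawl only diffs two 64-character strings rather than two HTML documents.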

Hiring Signals

Greenhouse and Lever public job APIs. Headcount growth in specific departments (engineering, sales, ML) is a leading indicator of strategic direction.

Greenhouse API · Lever API · dept filtering
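The per-competitor gating described above suggests a common fetcher contract. A sketch, with illustrative names only (the interfaces and the stub fetchers below are not from the codebase):

```typescript
interface RawSignal {
  competitorId: string;
  type: string;
  title: string;
  sourceUrl: string;
}

// Illustrative fetcher contract: each signal source implements fetch()
// and is toggled per competitor in the landscape config.
interface SignalFetcher {
  type: "github" | "changelog" | "pricing" | "hiring";
  fetch(competitorId: string): Promise<RawSignal[]>;
}

type FetcherConfig = Record<SignalFetcher["type"], boolean>;

// Select only the fetchers this competitor has enabled.
function enabledFetchers(all: SignalFetcher[], config: FetcherConfig): SignalFetcher[] {
  return all.filter((f) => config[f.type]);
}

// Stub fetchers for the example.
const allFetchers: SignalFetcher[] = (["github", "changelog", "pricing", "hiring"] as const)
  .map((type) => ({ type, fetch: async () => [] }));

// A landscape that only cares about shipping velocity and pricing moves.
const active = enabledFetchers(allFetchers, {
  github: true,
  changelog: false,
  pricing: true,
  hiring: false,
}).map((f) => f.type);
```

Adding a fifth signal type then means implementing one interface and adding one config flag, without touching the scheduler.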

PIPELINE FLOW

Fetch signals → Classify with Haiku → Score relevance → Synthesize with Sonnet → Store briefing → Dispatch delivery
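The flow above can be sketched as a pipeline of injectable stages. Everything here (type names, the relevance threshold, the stub implementations) is hypothetical; it only shows the ordering the pipeline describes.

```typescript
type Signal = { title: string; relevance?: number };
type Briefing = { summary: string; signals: Signal[] };

// Hypothetical stage contract; the real system runs these per landscape.
interface PipelineSteps {
  fetchSignals(): Signal[];
  classify(s: Signal): Signal;          // Haiku: relevance score + tags
  synthesize(top: Signal[]): Briefing;  // Sonnet: structured briefing
  store(b: Briefing): void;
  dispatch(b: Briefing): void;
}

// Fetch → classify → keep top-scoring → synthesize → store → dispatch.
function runPipeline(steps: PipelineSteps, minRelevance = 0.5): Briefing {
  const classified = steps.fetchSignals().map((s) => steps.classify(s));
  const top = classified.filter((s) => (s.relevance ?? 0) >= minRelevance);
  const briefing = steps.synthesize(top);
  steps.store(briefing);
  steps.dispatch(briefing);
  return briefing;
}

// Demo run with stub stages: one relevant signal, one noise signal.
const demo = runPipeline({
  fetchSignals: () => [{ title: "New API" }, { title: "Typo fix" }],
  classify: (s) => ({ ...s, relevance: s.title === "New API" ? 0.9 : 0.2 }),
  synthesize: (top) => ({ summary: `${top.length} signal(s)`, signals: top }),
  store: () => {},
  dispatch: () => {},
});
```

Keeping the stages behind an interface like this is also what makes each phase independently testable, as the architecture section claims.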

AI BRIEFINGS

Synthesis, not summaries

Signals are classified by claude-haiku-4-5 for relevance scoring and tagging. The top signals from the past 7 days are then synthesized by claude-sonnet-4-6 into a structured briefing with mandatory source URLs for every claim.

The classifier returns a relevance score (0–1), a one-line summary, and tags. The briefing generator produces an executive summary, a categorized signal list, strategic implications, and a focus recommendation.

Executive summary (3–5 sentences)
Categorized signals with source URLs
Strategic implications per signal
Focus recommendation for the week

CLASSIFIER

claude-haiku-4-5-20251001

Runs on every raw signal. Returns relevance, summary, tags. Fast and cheap — designed to run at volume.
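A per-signal classification call might look roughly like this against the Anthropic Messages API. The prompt wording, token budget, and output contract are assumptions for illustration; only the model ID comes from this page.

```typescript
// Hypothetical request body for the Messages API (pass to messages.create).
function classifierRequest(signal: { title: string; content: string }) {
  return {
    model: "claude-haiku-4-5-20251001",
    max_tokens: 256,
    messages: [
      {
        role: "user" as const,
        content:
          `Classify this competitor signal. Reply with JSON only: ` +
          `{"relevance": <0-1>, "summary": "...", "tags": [...]}\n\n` +
          `${signal.title}\n${signal.content}`,
      },
    ],
  };
}

// Parse and validate the classifier's JSON reply; reject out-of-range scores
// rather than letting a malformed reply poison the relevance ranking.
function parseClassification(raw: string): { relevance: number; summary: string; tags: string[] } {
  const parsed = JSON.parse(raw);
  if (typeof parsed.relevance !== "number" || parsed.relevance < 0 || parsed.relevance > 1) {
    throw new Error("relevance must be a number in [0, 1]");
  }
  return { relevance: parsed.relevance, summary: String(parsed.summary), tags: parsed.tags ?? [] };
}
```

Validating the model's structured output at the boundary is what lets a cheap classifier run at volume without manual review.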

SYNTHESIZER

claude-sonnet-4-6

Runs once per landscape per scheduled cycle. Takes the top signals from the past 7 days and produces the full structured briefing. Source URLs are required per item — hallucination guardrail baked into the prompt.

PIPELINE TRIGGER

POST /api/scheduler/run

Pipeline-secret-authenticated endpoint. The scheduler iterates all active landscapes, running signal fetching, classification, and briefing generation in sequence. Designed for launchd, cron, or any external orchestrator.
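The secret check such an endpoint might perform can be sketched as below. The header name (`x-pipeline-secret`) and env var are assumptions, not the product's actual contract.

```typescript
import { createHash, timingSafeEqual } from "node:crypto";

// Compare via fixed-length SHA-256 digests so timingSafeEqual gets
// equal-length buffers and length differences don't leak.
function isAuthorized(presented: string | null, secret: string): boolean {
  if (!presented) return false;
  const a = createHash("sha256").update(presented).digest();
  const b = createHash("sha256").update(secret).digest();
  return timingSafeEqual(a, b);
}

// Sketch of a Next.js App Router handler for POST /api/scheduler/run.
export async function POST(req: Request): Promise<Response> {
  const secret = process.env.PIPELINE_SECRET ?? "";
  if (!secret || !isAuthorized(req.headers.get("x-pipeline-secret"), secret)) {
    return new Response("Unauthorized", { status: 401 });
  }
  // ...iterate active landscapes: fetch → classify → synthesize → dispatch...
  return Response.json({ ok: true });
}
```

An empty secret fails closed: if `PIPELINE_SECRET` is unset, every request is rejected rather than every request accepted.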

POST /api/push/signal
{
  "competitor_id": "uuid",
  "type": "product_update",
  "title": "New API released",
  "content": "They shipped streaming...",
  "source_url": "https://...",
  "relevance_score": 0.92,
  "tags": ["api", "infrastructure"]
}
KEY FORMAT
tss_live_<32 hex chars>

Stored as SHA-256 hash.
Never recoverable after creation.
Revocable in settings.
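The stated format and storage scheme can be sketched as follows; the helper names are hypothetical.

```typescript
import { createHash, randomBytes } from "node:crypto";

// Generate a key in the documented format: tss_live_ + 32 hex chars.
function generateApiKey(): string {
  return `tss_live_${randomBytes(16).toString("hex")}`;
}

// Only this hash is persisted; the plaintext key is shown once at creation,
// then discarded, which is why it is never recoverable afterward.
function hashApiKey(key: string): string {
  return createHash("sha256").update(key).digest("hex");
}

// On each request: hash the presented key and look up the hash, never the key.
function verifyApiKey(presented: string, storedHash: string): boolean {
  return hashApiKey(presented) === storedHash;
}
```

Revocation then reduces to deleting a hash row; no plaintext secret ever needs to be found and destroyed.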

PUSH API

Inject signals programmatically

The Pull pipeline covers public signals. The Push API covers everything else — your internal sales intel, customer calls, custom scrapers, anything that doesn't have a public feed.

Three endpoints: /signal for raw signals, /context for competitor metadata upserts, and /webhook for raw payload ingestion from any source.

API key auth — SHA-256 hash storage
Rate limiting per subscription tier
Webhook endpoint for any payload source
Trial-blocked — Pro+ only
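From a custom scraper's side, a push to `/signal` might be assembled like this. The base URL is a placeholder and the bearer-token auth header is an assumption; only the path and body shape come from the example above.

```typescript
// Hypothetical client helper: build the request for POST /api/push/signal.
function buildPushRequest(apiKey: string, signal: object) {
  return {
    url: "https://example.com/api/push/signal", // placeholder base URL
    method: "POST" as const,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // header name is an assumption
    },
    body: JSON.stringify(signal),
  };
}
```

Usage would be a single `fetch(r.url, r)` call from whatever process produced the intel: a CRM hook, a call-notes summarizer, a one-off scraper.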

DELIVERY

Briefings reach you where you work

Each channel is independently implemented and tier-gated. Email is available on all paid tiers. Slack and Discord are Pro+.

Email

Inline HTML briefings via Resend. Structured sections — executive summary, signals, implications. No template engine — assembled in code.

Slack

Block Kit messages with dividers, markdown sections, and source links. One block per signal so threads stay navigable.

Discord

Embed-based delivery, chunked 3-per-message to stay under Discord's 10-embed limit. Color-coded by signal type.
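The 3-per-message chunking is a plain array split; a sketch with an illustrative embed shape:

```typescript
// Minimal embed shape for illustration; real Discord embeds carry more fields.
interface Embed { title: string; color: number }

// Discord caps a message at 10 embeds; shipping 3 per message stays well
// under the limit and keeps each message scannable.
function chunkEmbeds(embeds: Embed[], size = 3): Embed[][] {
  const chunks: Embed[][] = [];
  for (let i = 0; i < embeds.length; i += size) {
    chunks.push(embeds.slice(i, i + size));
  }
  return chunks;
}
```

A briefing with seven signals therefore goes out as three messages (3 + 3 + 1), each embed color-coded by signal type.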

STACK

What it runs on

Next.js 16

App Router · Turbopack

Supabase

Postgres + RLS policies

Clerk Auth

JWT + E2E bypass

Anthropic Claude

Sonnet 4.6 + Haiku 4.5

Stripe

Checkout + webhooks

Resend

Transactional email

TypeScript

Strict · 229 unit tests

Playwright

48 E2E tests · all green

PHASE 06 — IN QUEUE

The dogfood run

The next phase is using it. Knox as the first user — real competitors, real API keys, real briefings delivered to a real Slack channel. Tracking against Crayon, Klue, and Kompyte: the established players in the space. If the system surfaces something they miss, it ships.