--- url: /supabase-tools/plugins/plugin-mcp-server.md description: >- MCP server for @sbtools/mcp-server — exposes all 21 studio tools and 3 resources over stdio for Claude Desktop, Cursor, and VS Code Copilot. --- # @sbtools/mcp-server [![npm](https://img.shields.io/npm/v/@sbtools/mcp-server.svg)](https://www.npmjs.com/package/@sbtools/mcp-server) A thin MCP ([Model Context Protocol](https://modelcontextprotocol.io/)) server that exposes every `@sbtools/plugin-migration-studio` tool and resource to AI agents via stdio. No business logic lives here — it is a registration layer over the existing tool catalog. ## Quick Start ```bash npx @sbtools/mcp-server --cwd /path/to/your/supabase-project ``` Or point directly at the built file if you are in the monorepo: ```bash node packages/mcp-server/dist/index.js --cwd /path/to/your/supabase-project ``` ## Claude Desktop Configuration Add to `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or the equivalent path on your OS: ```json { "mcpServers": { "sbtools": { "command": "node", "args": ["/absolute/path/to/packages/mcp-server/dist/index.js"], "cwd": "/absolute/path/to/your/supabase-project" } } } ``` With `npx`: ```json { "mcpServers": { "sbtools": { "command": "npx", "args": ["@sbtools/mcp-server"], "cwd": "/absolute/path/to/your/supabase-project" } } } ``` ## `--cwd` Flag The server walks up from its working directory looking for `supabase-tools.config.json` to locate the project root. If it is not found it falls back to `cwd`. Override explicitly: ```bash node dist/index.js --cwd /path/to/project ``` ## Tools (21) All tools discovered from `@sbtools/plugin-migration-studio`'s tool catalog are registered automatically. The MCP tool name matches the studio tool ID exactly (e.g. `studio-introspect`, `studio-release-gate`). 
### Understand Layer

| Tool ID | What it does |
|---|---|
| `studio-introspect` | Query live DB → `studio.schema.snapshot` artifact |
| `studio-sql-parse` | Parse migration files → `studio.sql.ast` artifact |
| `studio-intent-sync` | Confidence-score DB entities vs SQL → `studio.intent.sync-report` |
| `studio-intent-init` | Build intent graph from sync report → `studio.intent.graph` |

### Generate Layer

| Tool ID | What it generates |
|---|---|
| `studio-create-table` | `CREATE TABLE` migration |
| `studio-add-column` | `ALTER TABLE … ADD COLUMN` migration |
| `studio-add-index` | `CREATE INDEX` migration |
| `studio-add-constraint` | `ALTER TABLE … ADD CONSTRAINT` migration |
| `studio-add-rls-policy` | `CREATE POLICY` migration |
| `studio-add-function` | `CREATE OR REPLACE FUNCTION` migration |
| `studio-create-rpc` | RPC function (schema: public) migration |
| `studio-create-view` | `CREATE OR REPLACE VIEW` migration |
| `studio-greenfield-init` | Initialize empty intent graph for a new project |

### Validate Layer

| Tool ID | What it validates |
|---|---|
| `studio-rls-check` | RLS coverage gaps per entity |
| `studio-rpc-lint` | Function security (DEFINER, search\_path, exposure) |
| `studio-migration-plan` | Ordered SQL change plan vs intent graph |
| `studio-migration-lint` | Migration file risk flags and naming violations |
| `studio-release-gate` | Aggregated pass/fail gate with blocking reasons |

### Graph & Mapping

| Tool ID | Description |
|---|---|
| `studio-intent-patch` | Mutate a single entity's managed-status |
| `studio-endpoint-map` | Derive PostgREST endpoints from intent graph entities/functions |

## Resources (3)

Resources are read-only and available in any MCP client that supports resource listing.
| Resource URI | Content | |---|---| | `sbtools://studio/catalog` | Full tool + workflow catalog (filterable JSON) | | `sbtools://studio/intent-graph` | Current `studio.intent.graph` artifact data or `null` | | `sbtools://studio/llm-context` | Single-call orientation: intent graph summary, artifact freshness, gate status, catalog | Read `sbtools://studio/llm-context` first — it is the fastest way for an AI agent to understand the project state. ## Typical Agent Workflow ``` 1. Read sbtools://studio/llm-context ← orient: what exists, what's stale 2. Call studio-introspect ← snapshot the live DB 3. Call studio-sql-parse ← parse migration history 4. Call studio-intent-sync ← score confidence 5. Call studio-intent-init ← build intent graph 6. Call studio-release-gate ← validate before any apply ``` ## Monorepo Convenience If the MCP server package is installed alongside `plugin-migration-studio`, you can start it via: ```bash sbt mcp # or sbt mcp --cwd /path/to/project ``` ## See Also * [plugin-migration-studio →](./plugin-migration-studio) — all 21 tools with HTTP + CLI reference * [CLI Reference →](../cli-reference) — `sbt mcp` command --- --- url: /supabase-tools/plugins/plugin-db-test.md description: >- Run database tests using pgTAP. Supports live PostgreSQL (Docker) or in-memory PGlite mode. --- # @sbtools/plugin-db-test [![npm](https://img.shields.io/npm/v/@sbtools/plugin-db-test.svg)](https://www.npmjs.com/package/@sbtools/plugin-db-test) Plugin that runs database tests using pgTAP. Supports live database mode (Docker) or in-memory PGlite mode. 
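Tests are plain SQL files that use pgTAP's TAP functions (`plan`, assertions, `finish`). A minimal sketch of what a file under `supabase/tests` might contain — the table name and checks are illustrative, not from this project:

```sql
-- Illustrative pgTAP test file — adjust names to your schema.
begin;
select plan(2);

-- The table exists in the public schema
select has_table('public', 'profiles');

-- Row Level Security is enabled on it
select ok(
  (select relrowsecurity from pg_class where relname = 'profiles'),
  'RLS enabled on profiles'
);

select * from finish();
rollback;
```

Each `select` assertion emits one TAP line (`ok 1 - …`), matching the sample output below.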
## Quick Start ```bash npm install @sbtools/plugin-db-test ``` Add to config: `{ "path": "@sbtools/plugin-db-test" }` ```bash # Live mode (requires sbt start) npm run sbt -- test # In-memory mode (no Docker required) npm run sbt -- test --mem ``` ## Commands | Command | Description | |---------|-------------| | `test` | Run pgTAP tests from supabase/tests | | `test --mem` | Run tests in-memory via PGlite | | `test --server` | Run tests against server | ``` $ npm run sbt -- test Running pgTAP tests from supabase/tests Mode: live (Docker) supabase/tests/auth_policies.sql ok 1 - anon cannot read profiles ok 2 - authenticated user can read own profile ok 3 - service_role bypasses RLS supabase/tests/subscriptions.sql ok 4 - active subscription required for premium ok 5 - expired subscription denied 5/5 tests passed ``` ## Configuration Plugin config goes in `plugins[].config`: ```json { "plugins": [{ "path": "@sbtools/plugin-db-test", "config": { "testsDir": "supabase/tests" } }] } ``` | Key | Default | Description | |-----|---------|-------------| | `testsDir` | `supabase/tests` | Directory containing .sql test files | | `migrationsDir` | Root `paths.migrations` | Migrations to apply in --mem mode | --- --- url: /supabase-tools/plugins/plugin-deno-functions.md description: >- Document Supabase Edge Functions by statically analysing TypeScript source files. --- # @sbtools/plugin-deno-functions [![npm](https://img.shields.io/npm/v/@sbtools/plugin-deno-functions.svg)](https://www.npmjs.com/package/@sbtools/plugin-deno-functions) Plugin that documents Supabase Edge Functions by statically analysing TypeScript source files. 
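Because the analysis is static, it works from source shape alone. A sketch of the kind of handler the plugin can document — the names and response shape are illustrative (in a real edge function the handler is passed to `Deno.serve`):

```typescript
// Illustrative edge-function handler (names are examples, not from this repo).
// In a deployed function you would call Deno.serve(handler).
export async function handler(req: Request): Promise<Response> {
  if (req.method !== "POST") {
    return new Response("Method Not Allowed", { status: 405 });
  }
  const { to, subject } = (await req.json()) as { to: string; subject: string };
  // A real function would send the email here; this sketch just echoes input.
  return new Response(JSON.stringify({ queued: true, to, subject }), {
    headers: { "Content-Type": "application/json" },
  });
}
```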
## Quick Start ```bash npm install @sbtools/plugin-deno-functions ``` Add to config: ```json { "plugins": [ { "path": "@sbtools/plugin-deno-functions", "config": { "baseUrl": "/functions/v1", "configTomlPath": "supabase/config.toml" } } ] } ``` ```bash npx sbt edge-functions ``` ## Commands | Command | Description | |---------|-------------| | `edge-functions` | List discovered edge functions | | `edge-functions --brief` | Summary table only | | `edge-functions --json` | Output raw JSON | | `edge-functions --openapi` | Generate OpenAPI spec | ``` $ npx sbt edge-functions --brief Edge Functions (4 discovered) ┌──────────────────┬────────┬─────────────────────────┐ │ Function │ Verify │ Endpoint │ ├──────────────────┼────────┼─────────────────────────┤ │ send-email │ JWT │ /functions/v1/send-email │ │ process-payment │ JWT │ /functions/v1/process-… │ │ generate-report │ JWT │ /functions/v1/generate-… │ │ health-check │ none │ /functions/v1/health-c… │ └──────────────────┴────────┴─────────────────────────┘ ``` ## Configuration | Key | Default | Description | |-----|---------|-------------| | `baseUrl` | `/functions/v1` | URL prefix for edge function endpoints | | `configTomlPath` | `supabase/config.toml` | Path to Supabase config.toml | ## Integration Results integrate into Backend Atlas, Swagger UI/ReDoc, and `sbt status`. --- --- url: /supabase-tools/plugins/plugin-depgraph.md description: >- Visualize backend dependency relationships as interactive HTML graph and Mermaid diagrams. --- # @sbtools/plugin-depgraph [![npm](https://img.shields.io/npm/v/@sbtools/plugin-depgraph.svg)](https://www.npmjs.com/package/@sbtools/plugin-depgraph) Plugin that visualizes backend dependency relationships (tables, functions, triggers, policies, views, enums) as an interactive HTML graph and Mermaid diagrams. 
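The generated Mermaid encodes these dependencies as directed edges between typed nodes. An illustrative fragment (node names invented):

```mermaid
graph LR
  trg_update_timestamps[trigger: update_timestamps] --> users[table: users]
  pol_users_select[policy: users_select_own] --> users
  fn_current_org[function: current_org] --> memberships[table: memberships]
  v_active_members[view: active_members] --> memberships
```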
## Quick Start ```bash npm install @sbtools/plugin-depgraph ``` Add to config: `{ "path": "@sbtools/plugin-depgraph" }` ```bash # Ensure atlas data exists first npx sbt generate-atlas # Generate dependency graphs npx sbt depgraph # → docs/dependency-graph.html # → docs/dependency-graph.md ``` ## Commands | Command | Description | |---------|-------------| | `depgraph` | Generate both HTML and Mermaid | | `depgraph --html` | HTML only | | `depgraph --mermaid` | Mermaid only | | `depgraph --json` | Raw JSON output | ![Dependency graph dashboard page](../images/dashboard-dependencies.png) *Dependencies page — interactive graph with focus depth, palette presets, and node detail panel* ## Relationships Tracks: triggers→tables, policies→tables, functions→tables, views→tables, FK constraints, enum usage. ## Dashboard Graph Controls The dashboard `Dependencies` page (`/depgraph`) supports: * Node focus mode with selectable depth (`0..4`) * Palette presets by node type (`Default`, `Colorblind-safe`, `High contrast`, `Muted`) * Quick filters: orphan nodes, type multi-select, and connection-count buckets * Search, pan/zoom, and node detail inspection ## Requirements Run `sbt generate-atlas` first — plugin reads from `docs/backend-atlas-data.json`. --- --- url: /supabase-tools/plugins/plugin-erd.md description: >- Generate Mermaid ERD diagrams for each public table. Connects to DB to introspect columns, keys, and relationships. --- # @sbtools/plugin-erd [![npm](https://img.shields.io/npm/v/@sbtools/plugin-erd.svg)](https://www.npmjs.com/package/@sbtools/plugin-erd) Plugin that generates Mermaid ERD diagrams for each public table. Connects to the database to introspect columns, primary keys, and foreign keys. 
## Quick Start

```bash
npm install @sbtools/plugin-erd
```

Add to config: `{ "path": "@sbtools/plugin-erd" }`

```bash
# Ensure database is running
npx sbt start

npx sbt generate-erd
# Output: docs/entity-relations/<table>.md
```

## Commands

| Command | Description |
|---------|-------------|
| `generate-erd` | Generate Mermaid ERD for all public tables |

Sample generated ERD (`docs/entity-relations/users.md`):

````
```mermaid
erDiagram
  users {
    uuid id PK
    text email
    text full_name
    timestamptz created_at
  }
  organizations {
    uuid id PK
    text slug
  }
  users ||--o{ memberships : "has"
  organizations ||--o{ memberships : "has"
```
````

## Configuration

Plugin config goes in `plugins[].config`:

```json
{
  "plugins": [{
    "path": "@sbtools/plugin-erd",
    "config": {
      "erdOutput": "docs/development/entity-relations",
      "displayColumns": ["name", "email", "full_name", "slug", "title"]
    }
  }]
}
```

| Key | Default | Description |
|-----|---------|-------------|
| `erdOutput` | `<docsOutput>/entity-relations` | Output directory (derives from root `paths.docsOutput`) |
| `displayColumns` | `["name", "email", "full_name", "slug", "title"]` | Columns to display on referenced entities |

Global ERD display columns can also be set at the root level under `erd.displayColumns`.

## Dashboard Integration

![ERD Diagrams dashboard page](../images/dashboard-erd.png)
*ERD page — Mermaid entity-relationship diagram with table list sidebar*

When active, the ERD plugin contributes to the dashboard automatically — no extra commands needed.

* **`getAtlasData()`** — reads the generated `.md` files from `erdOutput` at `sbt generate-atlas` time and adds an `erd_diagrams` category to `backend-atlas-data.json`
* **`getDashboardView()`** — declares the ERD section so the dashboard router shows it in the nav

The dashboard ERD page (`/erd`) renders each diagram as an interactive Mermaid SVG with table search and a raw source toggle. If `erd_diagrams` is missing from atlas data (e.g.
`generate-erd` hasn't run yet), the dashboard falls back to reading `.md` files directly from the `erdOutput` directory at request time. > **Note on `erdOutput`:** If you set a custom `erdOutput` path in plugin config, make sure it matches where `generate-erd` writes files. The dashboard resolves this path from `supabase-tools.config.json` at runtime — a mismatch causes the ERD page to appear empty. --- --- url: /supabase-tools/plugins/plugin-frontend-usage.md description: >- Scan frontend .ts/.tsx/.js/.jsx files for Supabase SDK usage and generate interactive HTML report. --- # @sbtools/plugin-frontend-usage [![npm](https://img.shields.io/npm/v/@sbtools/plugin-frontend-usage.svg)](https://www.npmjs.com/package/@sbtools/plugin-frontend-usage) Plugin that scans frontend `.ts/.tsx/.js/.jsx` files for Supabase SDK usage and generates an interactive HTML report. ## Quick Start ```bash npm install @sbtools/plugin-frontend-usage ``` Add to config: `{ "path": "@sbtools/plugin-frontend-usage" }` ```bash npx sbt frontend-usage # → docs/frontend-usage.html ``` ## Commands | Command | Description | |---------|-------------| | `frontend-usage` | Scan, generate HTML, open in browser | | `frontend-usage --json` | Output raw JSON | | `frontend-usage --no-open` | Skip opening report | ## Patterns Detected Tables (`.from()`), RPCs (`.rpc()`), auth, storage, edge functions, REST calls. ## Configuration | Key | Default | Description | |-----|---------|-------------| | `scanPaths` | `["src/"]` | Directories to scan | --- --- url: /supabase-tools/plugins/plugin-logs.md description: >- Live Docker log tailing, pg_stat_statements query monitoring, and standalone HTML log viewer. --- # @sbtools/plugin-logs [![npm](https://img.shields.io/npm/v/@sbtools/plugin-logs.svg)](https://www.npmjs.com/package/@sbtools/plugin-logs) Plugin that adds live Docker log tailing, `pg_stat_statements` query monitoring, and a standalone HTML log viewer. 
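Query monitoring reads from the `pg_stat_statements` extension. The kind of ranking behind `logs pg-stats` looks roughly like this — a sketch only, the plugin's actual query may differ:

```sql
-- Sketch of a pg_stat_statements ranking (PostgreSQL 13+ column names).
select query, calls, total_exec_time, mean_exec_time
from pg_stat_statements
order by total_exec_time desc
limit 20;
```

Sorting by `mean_exec_time` or `calls` corresponds to the `--slow` and `--frequent` flags below.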
## Dashboard Integration

* The unified `sbt dashboard` also exposes live logs directly in the Logs page.
* Live stream is served by dashboard APIs (`/api/logs/stream`, `/api/logs/services`) so you can monitor logs without leaving the dashboard shell.
* `logs viewer` remains available as a standalone, dedicated viewer.

## Quick Start

```bash
npm install @sbtools/plugin-logs
```

Add to config: `{ "path": "@sbtools/plugin-logs" }`

```bash
# Tail all services
npx sbt logs

# Open log viewer
npx sbt logs viewer
# → http://localhost:3333
```

## Commands

| Command | Description |
|---------|-------------|
| `logs` | Tail all services (multiplexed) |
| `logs <service>` | Tail a single service |
| `logs --list` | List services |
| `logs pg-stats` | Top 20 queries by execution time |
| `logs pg-stats --slow` | By mean execution time |
| `logs pg-stats --frequent` | By call count |
| `logs viewer` | Start HTML log viewer |

```
$ npx sbt logs --list

Available services:
  ✓ functions  (running)
  ✓ db         (running)
  ✓ rest       (running)
  ✓ auth       (running)
  ✓ kong       (running)
  ✓ storage    (running)
  ✓ realtime   (running)
  ✓ studio     (running)
```

## Configuration

| Key | Default | Description |
|-----|---------|-------------|
| `viewerPort` | 3333 | Log viewer port |
| `tailLines` | 100 | Initial tail lines |
| `dbContainer` | supabase-db | DB container name |

---

---
url: /supabase-tools/plugins/plugin-migration-audit.md
description: >-
  Compare migration files with database tracking. Read-only drift detection,
  CLI/JSON/HTML/Atlas reporting.
---

# @sbtools/plugin-migration-audit

[![npm](https://img.shields.io/npm/v/@sbtools/plugin-migration-audit.svg)](https://www.npmjs.com/package/@sbtools/plugin-migration-audit)

Plugin that compares migration files on disk with `app_migrations.schema_migrations`. Detects drift, missing files, and pending migrations. Reports via CLI, JSON, HTML, and Backend Atlas. **Read-only** — makes zero schema modifications.
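Downstream consumers of the audit's versioned artifact should accept any compatible major version and degrade gracefully otherwise. A hedged sketch — the envelope field names here are assumptions for illustration, not the real schema:

```typescript
// Hypothetical artifact envelope shape, for illustration only.
interface ArtifactEnvelope {
  id: string;      // e.g. "migration.analysis"
  version: string; // semver string, e.g. "1.0.0"
  data: unknown;
}

// Accept any 1.y.z of a given artifact ID; reject other IDs and majors.
export function isCompatible(env: ArtifactEnvelope, id: string, major: number): boolean {
  if (env.id !== id) return false;
  const envMajor = Number(env.version.split(".")[0]);
  return envMajor === major;
}
```

When the check fails, a consumer would warn (e.g. "Run migration-audit to refresh") rather than crash.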
## Quick Start

```bash
npm install @sbtools/plugin-migration-audit
```

Add to config: `{ "path": "@sbtools/plugin-migration-audit" }`

```bash
# Run audit (DB optional — disk-only if unreachable)
npx sbt migration-audit
# → docs/migration-audit.html
# → CLI summary + open in browser
```

```
$ npx sbt migration-audit

Migration Audit
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Migrations Dir   supabase/migrations
Database         Connected
Tracking Table   Exists

Total: 63   Applied: 63   Pending: 0   Missing: 0

Migrations:
  APPLIED  20251218135835_create_users.sql           (2026-02-12T11:43:40)
  APPLIED  20251218135847_add_organizations.sql      (2026-02-12T11:43:40)
  APPLIED  20251224154957_create_subscriptions.sql   (2026-02-12T11:43:40)
  ...

✓ HTML report → docs/migration-audit.html
✓ Detail pages → docs/migration-audit/<migration>.html
```

Each migration has a detail page with SQL viewer, parsed operations, risk flags, and touched objects. Linked from the main report and Backend Atlas cards.

## Commands

| Command | Description |
|---------|-------------|
| `migration-audit` | CLI summary + HTML report + open browser |
| `migration-audit --json` | Output raw audit JSON |
| `migration-audit --html` | Generate HTML only |
| `migration-audit --no-open` | Skip opening browser |

![Migrations dashboard page](../images/dashboard-migrations.png)
*Migrations page — audit summary, inventory table with applied/pending/missing filters*

## Issues Detected

| Code | Severity | Description |
|------|----------|-------------|
| `MISSING_FILE` | error | Applied in DB but file missing on disk |
| `PENDING_MIGRATION` | warning | Files not yet applied |
| `NO_TRACKING_TABLE` | warning | `app_migrations.schema_migrations` missing |
| `ORDERING_GAP` | warning | Applied out of chronological order |
| `TIMESTAMP_PARSE_FAILURE` | info | Non-standard filename (no YYYYMMDDHHMMSS prefix) |
| `EMPTY_MIGRATION` | info | 0-byte migration files |

## Artifact

Produces `migration.analysis` v1.0.0.
Includes per-migration `sqlAnalysis` (operations, touched objects, risk flags, confidence). Consumed by migration-studio for schema fallback and migrations context. ## Configuration No config fields required. Uses `paths.migrations` and `paths.docsOutput`. Database optional — uses `DATABASE_URL` / `SUPABASE_DB_URL` / `POSTGRES_URL` when available. --- --- url: /supabase-tools/plugins/plugin-migration-studio.md description: >- Flagship plugin — 21 tools, 4 workflows, intent graph, release gate, HTTP tool surface for AI agents. --- # @sbtools/plugin-migration-studio [![npm](https://img.shields.io/npm/v/@sbtools/plugin-migration-studio.svg)](https://www.npmjs.com/package/@sbtools/plugin-migration-studio) The flagship plugin. Provides a complete **backend design platform** for Supabase projects across five layers: Understand → Design → Generate → Validate → Apply. Every tool is available via both CLI (`sbt studio-`) and HTTP (`POST /api/studio/`), making the platform directly usable by AI agents. ## Installation ```bash npm install @sbtools/plugin-migration-studio ``` ```json { "plugins": [ { "path": "@sbtools/plugin-migration-studio", "config": {} } ] } ``` ## Starting the Server ```bash sbt migration-studio # port 3335 (default) sbt migration-studio --port N # custom port sbt migration-studio --restart # force-restart if port in use ``` ## LLM / Agent Usage > **Using Claude Desktop, Cursor, or VS Code?** The [`@sbtools/mcp-server`](./plugin-mcp-server) package exposes every tool below as a first-class MCP tool — no HTTP needed. AI agents using the HTTP API can orient themselves in one call: ``` GET /api/studio/llm-context ``` Returns: intent graph summary, artifact freshness, full tool catalog with descriptions, migration count. 
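Most tool routes follow one convention — strip the `studio-` prefix from the CLI name and POST to `/api/studio/<tool>` (a few tools, e.g. `studio-intent-patch`, have bespoke routes). A sketch, assuming the default port 3335; the helper name is invented:

```typescript
// Map a studio CLI command (e.g. "studio-release-gate") to its HTTP route.
// Note: some tools use bespoke routes not covered by this convention.
export function studioRoute(cliCommand: string, port = 3335): string {
  const tool = cliCommand.replace(/^studio-/, "");
  return `http://localhost:${port}/api/studio/${tool}`;
}

// e.g. fetch(studioRoute("studio-introspect"), { method: "POST" })
```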
Then the agent can call any tool:

```
POST /api/studio/introspect    → snapshot the DB
POST /api/studio/intent-init   → build intent graph
POST /api/studio/create-table  → generate migration SQL
POST /api/studio/release-gate  → validate before applying
POST /api/apply                → apply if gate passes
```

Discover available tools and workflows:

```
GET /api/studio/catalog?audience=backend-dev&mode=managed&type=tools
```

## Tool Catalog (21 tools)

### Understand Layer

| CLI Command | HTTP Route | Produces Artifact |
|---|---|---|
| `studio-introspect` | `POST /api/studio/introspect` | `studio.schema.snapshot` |
| `studio-sql-parse` | `POST /api/studio/sql-parse` | `studio.sql.ast` |
| `studio-intent-sync` | `POST /api/studio/intent-sync` | `studio.intent.sync-report` |
| `studio-intent-init` | `POST /api/studio/intent-init` | `studio.intent.graph` |

### Generate Layer (Scaffold Tools)

| CLI Command | HTTP Route | What it generates |
|---|---|---|
| `studio-create-table` | `POST /api/studio/create-table` | `CREATE TABLE` migration |
| `studio-add-column` | `POST /api/studio/add-column` | `ALTER TABLE ... ADD COLUMN` |
| `studio-add-index` | `POST /api/studio/add-index` | `CREATE INDEX` |
| `studio-add-constraint` | `POST /api/studio/add-constraint` | `ALTER TABLE ... ADD CONSTRAINT` |
| `studio-add-rls-policy` | `POST /api/studio/add-rls-policy` | `CREATE POLICY` |
| `studio-add-function` | `POST /api/studio/add-function` | `CREATE OR REPLACE FUNCTION` |
| `studio-create-rpc` | `POST /api/studio/create-rpc` | RPC function (schema: public) |
| `studio-create-view` | `POST /api/studio/create-view` | `CREATE OR REPLACE VIEW` |

### Validate Layer

| CLI Command | HTTP Route | Produces Artifact |
|---|---|---|
| `studio-rls-check` | `POST /api/studio/rls-check` | `studio.rls.plan` + `studio.rls.report` |
| `studio-migration-lint` | `POST /api/studio/migration-lint` | `studio.migration.lint` |
| `studio-rpc-lint` | `POST /api/studio/rpc-lint` | `studio.rpc.plan` |
| `studio-migration-plan` | `POST /api/studio/migration-plan` | `studio.migration.plan` |
| `studio-release-gate` | `POST /api/studio/release-gate` | `studio.release.gate` |

### Graph & Mapping

| CLI Command | HTTP Route | Description |
|---|---|---|
| `studio-intent-patch` | `POST /api/studio/intent-graph/entity` | Mutate entity managed-status |
| `studio-endpoint-map` | `POST /api/studio/endpoint-map` | Derive PostgREST endpoints from intent graph |
| `studio-greenfield-init` | `POST /api/studio/greenfield-init` | Init empty intent graph for a new project |

## Workflow Catalog (4 workflows)

Workflows chain multiple tools with optional human checkpoints.
| Workflow ID | Steps | Use Case | |---|---|---| | `adopt-backend` | introspect → sql-parse → intent-sync → intent-init | Brownfield project adoption (2 checkpoints) | | `release-check` | migration-plan → rls-check → rpc-lint → release-gate | Pre-release validation | | `create-table` | create-table → sql-parse | Table + immediate AST update | | `add-rls-policy` | add-rls-policy → rls-check | Policy + immediate coverage check | Run the brownfield adoption workflow: ```bash sbt studio-adopt ``` Browse all workflows: `GET /api/studio/catalog?type=workflows` ## Intent Graph The intent graph (`studio.intent.graph`) is the central artifact. It assigns every DB entity a `managedStatus`: | Status | Meaning | |---|---| | `managed` | Full confidence — schema matches SQL, safe to modify | | `assisted` | Partial match — LLM/human should review before modifying | | `opaque` | Unknown origin — do not modify without explicit human sign-off | | `excluded` | Intentionally excluded from tracking | Read the current graph: `GET /api/studio/intent-graph` Mutate an entity's status: `POST /api/studio/intent-graph/entity` ## Release Gate The release gate (`studio-release-gate`) aggregates evidence from all validation tools into a single `{ status: 'pass' | 'fail', reasons: [] }` signal. `POST /api/apply` respects the gate — migrations are blocked if the gate has failed. This is the recommended final step before any migration apply, especially for LLM-driven pipelines. ## Migration Studio UI The plugin also provides a **CodeMirror 6 SQL editor** UI accessible at `http://localhost:3335` (or via the dashboard at `/studio`). 
Features:

* Syntax highlighting, autocomplete (tables/columns from DB schema)
* SQL analysis and dry-run validation
* Save/apply migration files
* Schema sidebar and migration list

## HTTP API Reference

| Route | Description |
|---|---|
| `GET /api/health` | Health check |
| `GET /api/events` | SSE — live cache invalidation events |
| `GET /api/schema` | Live DB schema (tables, columns, policies, functions) |
| `GET /api/templates` | Migration template list |
| `GET /api/migrations` | Migration file list |
| `POST /api/analyze` | SQL analysis (parse + classify operations) |
| `POST /api/validate` | SQL dry-run against live schema |
| `POST /api/save` | Save SQL to a migration file |
| `POST /api/apply` | Apply pending migrations (enforces release gate) |
| `GET /api/studio/llm-context` | Full project orientation for AI agents |
| `GET /api/studio/intent-graph` | Current intent graph or null |
| `GET /api/studio/catalog` | Filterable tool/workflow catalog |
| `GET /api/studio/adopt/status` | Current adoption workflow run state |
| `POST /api/studio/adopt/start` | Start adoption workflow |
| `POST /api/studio/adopt/resume` | Resume from checkpoint |
| `POST /api/studio/<tool>` | Run any discovered tool (21 total) |

## Artifacts Produced

| Artifact | Description |
|---|---|
| `studio.schema.snapshot` | Live DB schema as typed nodes |
| `studio.sql.ast` | SQL AST from migration files |
| `studio.intent.sync-report` | Confidence scores per entity (DB vs SQL) |
| `studio.intent.graph` | Typed entity graph with managed/assisted/opaque status |
| `studio.rls.plan` | Suggested RLS policies for uncovered entities |
| `studio.rls.report` | RLS coverage analysis and gaps |
| `studio.migration.lint` | Risk flags, naming violations, lock-safety |
| `studio.rpc.plan` | Function security audit results |
| `studio.migration.plan` | Ordered change plan with change-class annotations |
| `studio.release.gate` | Aggregated pass/fail gate with blocking reasons |
| `studio.apply.log` | Apply history with timestamps and output |
| `studio.workflow.run` | Current workflow run state (for resuming) |

## See Also

* [Platform Architecture →](./plugin-migration-studio-platform) — 5-layer design and contributing guide
* [CLI Reference →](../cli-reference) — Full command listing with flags

---

---
url: /supabase-tools/plugins/plugin-scaffold.md
description: >-
  Scaffold new supabase-tools plugins with consistent boilerplate and SDK
  integration.
---

# @sbtools/plugin-scaffold

[![npm](https://img.shields.io/npm/v/@sbtools/plugin-scaffold.svg)](https://www.npmjs.com/package/@sbtools/plugin-scaffold)

Plugin that scaffolds new supabase-tools plugins with consistent boilerplate.

## Quick Start

```bash
npm install @sbtools/plugin-scaffold
```

Add to config: `{ "path": "@sbtools/plugin-scaffold" }`

```bash
# Internal plugin (packages/plugin-<name>/)
npx sbt scaffold-plugin analytics

# External plugin (sibling directory)
npx sbt scaffold-plugin my-feature --external

# With Atlas hooks
npx sbt scaffold-plugin dashboard --hooks
```

## Commands

| Command | Description |
|---------|-------------|
| `scaffold-plugin <name>` | Create internal plugin |
| `scaffold-plugin <name> --external` | Create external plugin |
| `scaffold-plugin <name> --hooks` | Include Atlas hook stubs |

```
$ npx sbt scaffold-plugin analytics

Scaffolding plugin: analytics
  ✓ packages/plugin-analytics/package.json
  ✓ packages/plugin-analytics/tsconfig.json
  ✓ packages/plugin-analytics/src/index.ts
  ✓ packages/plugin-analytics/README.md

Done — run `npm install` to link the new package.
```

## Configuration

No config required.

---

---
url: /supabase-tools/plugins/plugin-typegen.md
description: >-
  Generate TypeScript types from the running Supabase instance using the
  PostgREST types endpoint.
--- # @sbtools/plugin-typegen [![npm](https://img.shields.io/npm/v/@sbtools/plugin-typegen.svg)](https://www.npmjs.com/package/@sbtools/plugin-typegen) Plugin that generates TypeScript types from the running Supabase instance using the PostgREST types endpoint. ## Quick Start ```bash npm install @sbtools/plugin-typegen ``` Add to config: `{ "path": "@sbtools/plugin-typegen" }` ```bash # Ensure database is running npx sbt start npx sbt generate-types # Output: src/integrations/supabase/types.ts ``` ## Commands | Command | Description | |---------|-------------| | `generate-types` | Fetch types from PostgREST and write to file | ``` $ npx sbt generate-types Fetching types from http://localhost:54321/rest/v1/?apikey=... ✓ Written to src/integrations/supabase/types.ts (248 tables, 12 views, 34 enums) ``` ## Configuration Plugin config goes in `plugins[].config`: ```json { "plugins": [{ "path": "@sbtools/plugin-typegen", "config": { "typesOutput": "src/types/supabase.ts" } }] } ``` | Key | Default | Description | |-----|---------|-------------| | `typesOutput` | `src/integrations/supabase/types.ts` | Output file path | Environment: `SUPABASE_TYPES_SCHEMAS` to limit schemas (comma-separated). --- --- url: /supabase-tools/ui-web.md description: >- @sbtools/ui-web — Shared React UI package: SSR renderers for standalone HTML reports and the Vite React dashboard SPA. --- # @sbtools/ui-web [![npm](https://img.shields.io/npm/v/@sbtools/ui-web.svg)](https://www.npmjs.com/package/@sbtools/ui-web) Shared UI package for supabase-tools. Provides two main things: 1. **SSR renderers** — Node.js functions that generate standalone HTML files for plugin CLI commands 2. **React dashboard SPA** — Vite-built single-page app served by `sbt dashboard` (port 3400) The dashboard is bundled into `@sbtools/core` at build time — end users do not need to install this package directly. It is primarily a dependency for plugins that generate standalone HTML pages. 
## Installation ```bash npm install @sbtools/ui-web ``` ## SSR Renderers Server-side page renderers used by plugin CLI commands to produce standalone HTML reports: | Export | Used by | |--------|---------| | `renderDepgraphPage` | `sbt depgraph --html` | | `renderFrontendUsagePage` | `sbt frontend-usage` | | `renderMigrationAuditPage` | `sbt migration-audit --html` | | `renderMigrationDetailPage` | Per-migration detail pages from `migration-audit` | | `renderLogsViewerPage` | `sbt logs viewer` | | `renderMigrationStudioPage` | `sbt migration-studio` (legacy standalone page) | | `renderRawDocument` | Base HTML document shell — used by all above | ### Usage ```ts import { renderDepgraphPage, renderRawDocument } from "@sbtools/ui-web"; ``` All renderers accept typed data objects and return an HTML string. Write it to a file with `writeFileInDir` from `@sbtools/sdk`. ### Shared Tokens CSS design tokens are exported for use in SSR pages and plugin renderers: ```ts import { SHARED_TOKENS_CSS, SHARED_TOKENS_DARK } from "@sbtools/ui-web"; ``` These provide the same light/dark mode CSS custom properties used by the dashboard SPA, ensuring visual consistency between standalone HTML reports and the dashboard. ## React Dashboard SPA The dashboard SPA at `sbt dashboard` (`http://localhost:3400`) is a Vite-built React application. 
It includes: ![Dashboard overview page](../images/dashboard-overview.png) *Overview — entity stats, mini charts, and the Entity Explorer table* ### Pages | Page | Route | Description | |------|-------|-------------| | Overview | `/` | Entity stats, mini charts, clickable filter tabs | | Details | `/details/:section/:id` | Node detail view with metadata grid, edge tables | | Migrations | `/migrations` | Migration list, audit status, embedded/pop-out Studio | | Migration Studio | `/migration-studio` | React dashboard page for Migration Studio (connects to `sbt migration-studio` server) | | Dependency Graph | `/depgraph` | Interactive graph with focus depth, palette, filters | | ERD | `/erd` | Mermaid ERD diagrams per table with search and raw toggle | | Frontend Usage | `/frontend-usage` | Filter-driven SDK usage views: hot components, component map, resource impact | | Logs | `/logs` | Live Docker log stream with service filters and inline search | | Commands | `/commands` | Run any `sbt` command from the browser with live streaming output | | Not Found | `*` | 404 fallback | ### Shared Components * `StatCard` — Clickable stat cards with tone, used on Overview * `Badge` — Status/type badges with tone map support * `DataTable` / `AppDataTable` — Sortable, paginated tables with column resize * `CardGrid` / `GenericSection` — Plugin-driven section renderers * `SearchInput` — Global search with Ctrl+K hotkey, arrow navigation * `CodeBlock` — Syntax-highlighted code viewer * `ValueRenderer` — Collapsible JSON tree, SQL highlighting, auto-detection * `EmptyState` — Unified empty state * `Dropdown` — Multi-action button menus * `MiniBarChart` / `MiniDonutChart` — Inline charts (Recharts) * `MermaidRenderer` — Renders Mermaid diagrams as SVG ### Design System All colors, spacing, and typography use CSS custom properties from `tokens.css`. Light/dark mode is supported via `[data-theme="dark"]`. Fonts are Geist + Geist Mono (loaded from Fontsource). 
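Standalone pages can opt into the same theming by keying their styles off the shared custom properties and the `[data-theme="dark"]` selector. A sketch — the token names below are placeholders, import `SHARED_TOKENS_CSS` for the real definitions:

```css
/* Placeholder token names — the real tokens come from SHARED_TOKENS_CSS. */
:root {
  --surface: #ffffff;
  --text: #111111;
}
[data-theme="dark"] {
  --surface: #0b0c0f;
  --text: #e8e8e8;
}
.report {
  background: var(--surface);
  color: var(--text);
  font-family: "Geist", sans-serif;
}
```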
### Dashboard API routes (served by `sbt dashboard`) | Route | Returns | |-------|---------| | `GET /api/atlas-data` | `backend-atlas-data.json` | | `GET /api/dashboard-config` | Plugin-contributed section definitions | | `GET /api/commands` | All registered commands with category | | `GET /api/run/stream?command=...` | SSE: stream `sbt <command>` output | | `GET /api/logs/services` | Docker service statuses | | `GET /api/logs/stream?services=...` | Live SSE Docker log stream | | `GET /api/fs/list?scope=...&path=...` | Safe file listing | | `GET /api/fs/file?scope=...&path=...` | Safe file content | ## Development To run the dashboard SPA with Vite HMR during development: ```bash # 1. Start the backend (serves API on port 3400) sbt dashboard # 2. Generate atlas data (required for /api/atlas-data) sbt generate-atlas # 3. Start Vite dev server (proxies /api/* to port 3400) npm run dev # from packages/ui-web/ # or npm run dashboard:dev # from repo root ``` The UI runs at `http://localhost:5173`. Set `DASHBOARD_API_PORT` if the backend uses a different port. ## Dependencies | Dependency | Purpose | |------------|---------| | `react`, `react-dom` | SSR (`renderToStaticMarkup`) and dashboard SPA | | `mermaid` | ERD diagram rendering | | `recharts` | Mini bar/donut charts | | `@codemirror/*` | SQL editor in Migration Studio page | | `lucide-react` | Icon set | | `vite`, `@vitejs/plugin-react` | Dashboard SPA build (devDependencies) | --- --- url: /supabase-tools/architecture.md --- # Architecture Documentation for supabase-tools architecture decisions and cross-package contracts. ## Package Dependencies * [Package & Artifact Dependencies](./package-dependencies.md) — NPM deps, artifact produce/consume, command flows, real-time update requirements ## Versioned Artifacts Versioned artifacts are the contract layer for cross-plugin collaboration. New integrations should use artifact-based contracts instead of implicit file conventions or hook-time object sharing.
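Concretely, the consumer-side compatibility rule documented in the policy pages linked below — "need 1.x, accept any 1.y.z, reject 2.x" — can be sketched as a small guard. This is a minimal illustration; `isCompatibleVersion` is a hypothetical helper, not an `@sbtools/sdk` export:

```typescript
// Same-major compatibility guard for artifact schema versions.
// A consumer that needs major 1 accepts any 1.y.z but rejects 2.x.
// Note: isCompatibleVersion is illustrative, not an SDK export.
function isCompatibleVersion(wanted: string, actual: string): boolean {
  const wantedMajor = Number(wanted.split(".")[0]);
  const actualMajor = Number(actual.split(".")[0]);
  return Number.isInteger(wantedMajor) && wantedMajor === actualMajor;
}

// A consumer needing migration.analysis 1.x:
isCompatibleVersion("1.0.0", "1.4.2"); // true  — additive minor, safe to read
isCompatibleVersion("1.0.0", "2.0.0"); // false — breaking major, degrade gracefully
```

Anything beyond the major-match check (tolerating unknown fields, falling back when the artifact is missing or stale) is behavioral and covered by the consumer rules in the compatibility policy.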
* [Artifact ID Registry](./artifact-registry.md) — Official registry of artifact IDs and ownership * [Artifact Contract Guide](./artifact-contract-guide.md) — How to produce and consume artifacts * [Artifact Compatibility Policy](./artifact-compatibility-policy.md) — Semver behavior and contributor checklist * [Implicit File Contracts](./implicit-file-contracts.md) — Documented output paths and merge semantics (legacy; prefer artifacts) --- --- url: /supabase-tools/architecture/artifact-compatibility-policy.md --- # Artifact Compatibility Policy This policy defines how artifact schemas evolve and how producers and consumers must behave for backward compatibility. ## Semver behavior | Bump | Meaning | Consumer impact | |------|---------|-----------------| | **MAJOR** | Breaking data schema changes (field removed, type changed, structure changed) | Consumers must be updated to handle the new schema | | **MINOR** | Additive compatible fields (new optional fields, new categories) | Consumers may ignore unknown fields; safe to upgrade | | **PATCH** | Docs, bugfix semantics, no structural break | No consumer impact | ## Producer rules 1. **Validate output** against schema before write. Use `validateArtifactEnvelope` before `writeArtifact` in strict mode. 2. **Include freshness inputs** so consumers can detect staleness (e.g. `sourceHash`, `snapshotHash`). 3. **Document confidence limits** where relevant (e.g. "DB unreachable — disk-only analysis"). 4. **Never remove or change types** of existing fields in a PATCH or MINOR. Use a new MAJOR for breaking changes. ## Consumer rules 1. **Read exact compatible major** when possible. If you need `migration.analysis` 1.x, accept any 1.y.z. 2. **Tolerate unknown fields** — additive minor changes must not break consumers. 3. **Degrade gracefully** when artifact is missing, invalid, or stale: * Show a warning (e.g. "Run migration-audit to refresh") * Provide a fallback path (e.g. 
live computation) when feasible * Do not crash or block the user without a clear recovery path 4. **Prefer strict major matching** — do not assume 2.x is compatible with 1.x. ## Deprecation process 1. **Announce** deprecation in release notes and schema docs. 2. **Publish** new major artifact schema with migration notes. 3. **Block** incompatible consumers in CI/integration tests. 4. **Remove** deprecated major after documented cutover release (e.g. 2 releases later). 5. **Update** the [artifact registry](./artifact-registry.md) with deprecation status. ## Contributor checklist When adding or modifying artifacts: * \[ ] Schema changes follow semver (MAJOR for breaking, MINOR for additive). * \[ ] Producer validates output before write. * \[ ] Consumer degrades gracefully when artifact missing/invalid/stale. * \[ ] Artifact ID is registered in the registry. * \[ ] Contract tests exist for producer and consumer. * \[ ] Migration notes documented for any breaking change. --- --- url: /supabase-tools/architecture/artifact-contract-guide.md --- # Artifact Contract Authoring Guide This guide explains how to produce and consume versioned artifacts in supabase-tools plugins. ## What is a versioned artifact? A versioned artifact is a **persisted, typed, semantically-versioned envelope** produced by one package and consumed by one or more others. It replaces implicit file conventions and hook-time object sharing with explicit contracts. ## Envelope structure Every artifact follows the canonical envelope schema: ```json { "id": "migration.analysis", "version": "1.0.0", "producer": "@sbtools/plugin-migration-audit", "generatedAt": "2026-02-15T00:00:00.000Z", "schemaRef": "https://sbtools.dev/contracts/migration.analysis/1.0.0", "inputs": { "projectRoot": "/workspace", "sourceHash": "sha256:...", "snapshotHash": "sha256:..." }, "meta": { "toolVersion": "0.3.0", "buildId": "..." }, "data": {} } ``` * **id**: Stable identifier (see [artifact registry](./artifact-registry.md)). 
Never include version in the ID. * **version**: Full semver (MAJOR.MINOR.PATCH) for schema compatibility. * **producer**: Package name that produced the artifact. * **generatedAt**: ISO 8601 timestamp. * **schemaRef**: Optional URL to schema documentation. * **inputs**: Fingerprints for freshness/staleness (e.g. source hashes). * **meta**: Tool/build metadata. * **data**: The actual payload (schema varies per artifact). ## Storage convention Default path: `.sbt/artifacts/<id>/<version>/latest.json` Example: `migration.analysis` at `1.0.0` → `.sbt/artifacts/migration.analysis/1.0.0/latest.json` Optional immutable snapshots: `.sbt/artifacts/<id>/<version>/<snapshot>.json` ## Producer checklist 1. **Validate output** against your schema before write. 2. **Include freshness inputs** (e.g. source hash, snapshot hash) so consumers can detect staleness. 3. **Use the SDK helpers** (`writeArtifact`, `ArtifactEnvelope`) for consistency. 4. **Document confidence limits** where relevant (e.g. "disk-only analysis when DB unreachable"). 5. **Register the artifact** in the [artifact registry](./artifact-registry.md). ## Consumer checklist 1. **Read exact compatible major** when possible. 2. **Tolerate unknown fields** (additive minor changes). 3. **Degrade gracefully** when artifact is missing, invalid, or stale — show a warning or fallback, do not crash. 4. **Prefer `readArtifact`** from the SDK for path resolution and validation. ## SDK usage ```typescript import { writeArtifact, readArtifact, type ArtifactEnvelope } from "@sbtools/sdk"; // Producer const produced: ArtifactEnvelope = { id: "migration.analysis", version: "1.0.0", producer: "@sbtools/plugin-migration-audit", generatedAt: new Date().toISOString(), inputs: { projectRoot: ctx.projectRoot, sourceHash: "..." }, meta: { toolVersion: "0.4.0" }, data: myAuditResult, }; writeArtifact(ctx, produced); // Consumer const envelope = readArtifact(ctx, "migration.analysis", "1.0.0"); if (!envelope) { ui.warn("Migration analysis artifact not found.
Run migration-audit first."); return; } // envelope.data is typed as MyData ``` ## Plugin capability declaration Plugins can declare artifact capabilities in their `SbtPlugin` definition: ```typescript const plugin: SbtPlugin = { name: "@sbtools/plugin-migration-audit", version: "0.4.0", artifactCapabilities: { produces: ["migration.analysis"], }, // ... }; ``` This enables tooling to validate artifact availability and surface warnings. --- --- url: /supabase-tools/architecture/artifact-registry.md --- # Artifact ID Registry This document is the **single source of truth** for official artifact IDs in supabase-tools. All new artifact IDs must be registered here and follow the [artifact contract guide](./artifact-contract-guide.md). ## Naming rules * `id` is stable and **never** includes version suffixes (e.g. `migration.analysis`, not `migration.analysis.v1`). * Use dot-separated lowercase: `.` or `..`. * Avoid duplicate semantics under different IDs. ## Status definitions | Status | Meaning | |--------|---------| | **Active** | Producer and consumer both exist and work end-to-end | | **Producing** | Producer writes the artifact; no consumer reads it yet | | **Planned** | Defined in the plan but no implementation yet | | **Convention** | Naming convention for a family of artifacts | | **Optional** | Low-priority; implement only if ROI is justified | ## Official registry | Artifact ID | Owner Package | Schema Version | Status | Description | |-------------|---------------|----------------|--------|-------------| | `atlas.data` | core | — | Planned | Optional wrapper for backend atlas data contract | | `docs.route-manifest` | core | — | Planned | Plugin-generated page routes and labels | | `openapi.partial.deno-functions` | plugin-deno-functions | 1.0.0 | Active | Deno functions partial OpenAPI spec (consumed by core docs) | | `openapi.partial.` | (producing plugin) | — | Convention | Plugin partial OpenAPI specs; merged deterministically | | `migration.analysis` | 
plugin-migration-audit | 1.0.0 | Active | Migration audit result; per-migration sqlAnalysis; consumed by migration-studio | | `migration.lineage` | plugin-migration-audit | — | Planned | Migration dependency/lineage graph | | `migration.staleness` | plugin-migration-audit | — | Planned | Staleness and drift metrics | | `migration.studio.draft` | plugin-migration-studio | — | Planned | Studio-owned draft migration metadata | | `studio.schema.snapshot` | plugin-migration-studio | 1.0.0 | Active | Live DB state — tables, columns, constraints, indexes, policies, functions, views, triggers, extensions | | `studio.sql.ast` | plugin-migration-studio | 1.0.0 | Active | Migration file parse results — per-file AST, extracted intent nodes, aggregated entity/policy/function arrays | | `studio.intent.sync-report` | plugin-migration-studio | 1.0.0 | Active | Confidence-scored match between DB snapshot and SQL AST; matched, unmatchedDb, unmatchedIntent lists | | `studio.intent.graph` | plugin-migration-studio | 1.0.0 | Active | Final intent graph — managed/assisted/opaque entity nodes, opaque blocks, managed scope declaration | | `studio.rls.plan` | plugin-migration-studio | 1.0.0 | Active | RLS policy plan for managed entities — proposed policies, coverage gaps | | `studio.rls.report` | plugin-migration-studio | 1.0.0 | Active | RLS coverage report — per-entity gap analysis, SECURITY DEFINER warnings | | `studio.rpc.plan` | plugin-migration-studio | 1.0.0 | Active | RPC/function security audit — DEFINER\_NO\_SEARCH\_PATH, public exposure, empty body | | `studio.migration.plan` | plugin-migration-studio | 1.0.0 | Active | Ordered SQL change plan with change-class annotations; includes `snapshotHash` | | `studio.migration.lint` | plugin-migration-studio | 1.0.0 | Active | Migration lint results — destructive ops, missing transactions, naming violations | | `studio.release.gate` | plugin-migration-studio | 1.0.0 | Active | Release gate decision — pass/fail with blocking reasons; 
read by `POST /api/apply` | | `studio.workflow.run` | plugin-migration-studio | 1.0.0 | Active | Workflow run state — step results, status, timestamps, current step pointer | | `studio.apply.log` | plugin-migration-studio | 1.0.0 | Active | Apply audit record — `appliedAt`, truncated `output`, `success`; written after every successful apply | | `typescript.schema-types` | plugin-typegen | — | Optional | Typegen output metadata (path, hash, timestamp) | | `depgraph.graph` | plugin-depgraph | 1.0.0 | Producing | Dependency graph | | `frontend.usage` | plugin-frontend-usage | 1.0.0 | Producing | Frontend usage scan results | | `runtime.service-health` | plugin-logs | — | Optional | Runtime service health snapshots | | `runtime.query-stats` | plugin-logs | — | Optional | Query statistics snapshots | ## Adding a new artifact 1. Propose the ID and schema in a PR. 2. Add a row to this registry with owner, schema version, status, and description. 3. Add a schema file and examples per the [contract guide](./artifact-contract-guide.md). 4. Ensure contract tests for producer and consumer. ## Deprecation When deprecating an artifact: 1. Publish a new major artifact schema with migration notes. 2. Block incompatible consumers in CI/integration tests. 3. Remove the deprecated major after the documented cutover release. 4. Update this registry with deprecation status and removal date. --- --- url: /supabase-tools/cli-reference.md description: >- Complete command reference for sbt — core commands, plugin commands, flags, and environment variables. --- # CLI Reference All commands: `npx sbt <command> [options]` Run `npx sbt help` to list available commands (including installed plugins).
## Core Commands ### Docker | Command | Description | |---------|-------------| | `start` | Start Supabase Docker stack | | `stop` | Stop all services | | `restart` | Restart all services | | `status` | Show service URLs, keys, and connection info | ### Database | Command | Description | |---------|-------------| | `migrate` | Apply SQL migrations from `supabase/migrations/` | | `snapshot [schema...] [all]` | Export DB objects (functions, views, triggers, policies, types, enums) to filesystem | | `watch [--scope migration]` | Watch DB/files and keep migration artifacts fresh (`migration-audit --no-open`) | | `dashboard [--port N]` | Start unified dashboard UI (React) with APIs for atlas data, live logs, and file browsing | `migrate` env vars: * `MIGRATION_BASELINE=1` — record all migrations as applied without running them * `MIGRATION_REAPPLY=1` — force reapply even if DB has existing tables `watch` options: * `--scope migration` — phase-1 scope (default) * `--debounce-ms N` — debounce bursty events (default: `1500`) * `--no-db-hooks` — skip DB trigger/event-trigger helper install * `--verbose` — print event payloads ### Generation | Command | Description | |---------|-------------| | `generate-atlas` | Generate Backend Atlas data (`.sbt/docs/backend-atlas-data.json` by default) | | `init` | Generate `supabase-tools.config.json` with defaults | ### Other | Command | Description | |---------|-------------| | `help` / `-h` / `--help` | Show all available commands | `dashboard` options: * `--port N` — listen on custom port (default: 3400) `dashboard` API routes: * `GET /api/atlas-data` — Backend Atlas JSON * `GET /api/dashboard-config` — Combined dashboard section definitions (core + plugins) * `GET /api/commands` — All registered commands (core + plugin) with category metadata * `GET /api/run/stream?command=...` — SSE: spawn `sbt <command>` and stream stdout/stderr * `GET /api/logs/services` — Current Docker service statuses * `GET /api/logs/stream?services=...` — Live SSE
log stream from Docker * `GET /api/fs/list?scope=...&path=...` — Safe file listing (`snapshot`, `migrations`, `docs`, `project`) * `GET /api/fs/file?scope=...&path=...` — Safe file content view in browser ## Plugin Commands ### plugin-db-test | Command | Description | |---------|-------------| | `test` | Run pgTAP database tests against live DB | | `test --mem` | Run tests in-memory using PGlite | Config: `testsDir`, `migrationsDir` ### plugin-deno-functions | Command | Description | |---------|-------------| | `edge-functions` | List discovered edge functions | | `edge-functions --brief` | Summary table only | | `edge-functions --json` | Raw JSON output | | `edge-functions --openapi` | Generate OpenAPI spec at `docs/edge-functions-openapi.json` | Config: `baseUrl`, `configTomlPath` ### plugin-depgraph | Command | Description | |---------|-------------| | `depgraph` | Generate HTML + Mermaid dependency graph | | `depgraph --html` | HTML only | | `depgraph --mermaid` | Mermaid only | | `depgraph --json` | Raw JSON to stdout | | `depgraph --no-open` | Skip opening in browser | Requires `generate-atlas` first. 
Config: `typesFilePath` ### docs (core) | Command | Description | |---------|-------------| | `docs` / `docs all` | Start all doc services | | `docs swagger` | Swagger UI (port 8081) | | `docs redoc` | ReDoc (port 8082) | | `docs schemaspy` | SchemaSpy (port 8083, `/schemaspy/`) | | `docs stop` | Stop all docs containers | ### plugin-erd | Command | Description | |---------|-------------| | `generate-erd` | Generate Mermaid ERD per public table | Config: `erdOutput`, `displayColumns` ### plugin-frontend-usage | Command | Description | |---------|-------------| | `frontend-usage` | Scan frontend for Supabase SDK usage, generate HTML report | | `frontend-usage --json` | Raw JSON output | | `frontend-usage --no-open` | Skip opening in browser | Config: `scanPaths` ### plugin-migration-audit | Command | Description | |---------|-------------| | `migration-audit` | Compare disk migrations vs DB; CLI summary + HTML report + detail pages | | `migration-audit --json` | Output raw audit JSON | | `migration-audit --html` | Generate HTML only | | `migration-audit --no-open` | Skip opening browser | Produces `migration.analysis` artifact. Detail pages at `{docsOutput}/migration-audit/<migration>.html`. ### plugin-migration-studio | Command | Description | |---------|-------------| | `migration-studio` | Start schema-aware migration authoring UI at `http://localhost:3335` | | `migration-studio --port N` | Use custom port | | `migration-studio --restart` | Kill existing process on port, then start (auto-retry on port conflict) | Requires DB for schema introspection (falls back to atlas-data/artifact when unreachable).
#### Brownfield Adoption | Command | Description | |---------|-------------| | `studio-introspect` | Query live DB → `studio.schema.snapshot` artifact | | `studio-sql-parse` | Parse migration files → `studio.sql.ast` artifact | | `studio-adopt` | Full adoption workflow (introspect → sql-parse → review → intent-sync → approve → intent-init) | | `studio-catalog [--audience <audience>] [--mode <mode>] [--type <type>]` | List discovered tools/workflows from catalog with persona/control-mode filters | | `studio-intent-patch --entity <entity> --action <action> [--status managed\|assisted]` | Mutate a single entity's managed-status in the intent graph | | `studio-endpoint-map` | Derive PostgREST `EndpointNode` declarations for all managed entities/functions | #### Scaffold Tools (Generate Layer) | Command | Description | |---------|-------------| | `studio-create-table --schema <schema> --name <name> [--columns <columns>] [--no-rls]` | Generate `CREATE TABLE` migration | | `studio-add-column --entity <entity> --name <name> --type <type> [--nullable] [--default <value>]` | Generate `ALTER TABLE ... ADD COLUMN` migration (requires intent graph) | | `studio-add-rls-policy --entity <entity> --name <name> --command <command> --roles <roles>` | Generate `CREATE POLICY` migration | | `studio-add-index --entity <entity> --name <name> --columns <columns> [--unique] [--method <method>]` | Generate `CREATE INDEX` migration | | `studio-add-constraint --entity <entity> --name <name> --type <type> [--options...]` | Generate `ALTER TABLE ...
ADD CONSTRAINT` migration | | `studio-add-function --schema <schema> --name <name> --returns <type> --language <language> --body-file <file>` | Generate `CREATE OR REPLACE FUNCTION` migration | | `studio-create-rpc --name <name> --returns <type> --language <language> --body-file <file>` | Same as add-function, forces `schema: public` | | `studio-create-view --schema <schema> --name <name> --query "SELECT ..."` | Generate `CREATE OR REPLACE VIEW` migration | | `studio-greenfield-init` | Initialize an empty intent graph for a new project (no DB introspection needed) | #### Validation Tools (Validate Layer) | Command | Description | |---------|-------------| | `studio-rls-check` | RLS coverage check — gap analysis per managed entity → `studio.rls.plan` + `studio.rls.report` | | `studio-rpc-lint` | Function security audit (DEFINER/search\_path/exposure) → `studio.rpc.plan` | | `studio-migration-plan` | Intent graph diff → ordered SQL change plan → `studio.migration.plan` | | `studio-lint` | Migration file lint (destructive ops, naming, lock safety) → `studio.migration.lint` | | `studio-migration-lint` | Alias for `studio-lint` | | `studio-release-gate` | Aggregate RLS/RPC/lint findings → pass/fail → `studio.release.gate` | | `studio-release-check` | One-shot: runs full release-check workflow (no server needed). Exits 0 on pass, 1 on fail. `--json` for raw output.
| | `mcp [--cwd <path>]` | Start the MCP stdio server — exposes all 21 studio tools + 3 resources to Claude Desktop, Cursor, and VS Code Copilot | ### plugin-logs | Command | Description | |---------|-------------| | `logs` | Tail all running services | | `logs <service>` | Tail specific service | | `logs --list` | List services with running/stopped status | | `logs --tail N` | Number of historical lines (default: 100) | | `logs --no-color` | Disable ANSI colors | | `logs --timestamps` | Show Docker timestamps | | `logs pg-stats` | Query performance stats | | `logs pg-stats --slow` | Top 20 by mean execution time | | `logs pg-stats --frequent` | Top 20 by call count | | `logs pg-stats --reset` | Reset pg\_stat\_statements | | `logs pg-stats --json` | Raw JSON output | | `logs viewer` | Start HTML log viewer (default port 3333) | | `logs viewer --port N` | Custom viewer port | Config: `viewerPort`, `tailLines`, `dbContainer` ### plugin-scaffold | Command | Description | |---------|-------------| | `scaffold-plugin <name>` | Create internal plugin in `packages/` | | `scaffold-plugin <name> --external` | Create external plugin at project root | | `scaffold-plugin <name> --hooks` | Include Atlas/status/OpenAPI hook stubs | ### plugin-typegen | Command | Description | |---------|-------------| | `generate-types` | Generate TypeScript types from running DB | Config: `typesOutput`. Env: `SUPABASE_TYPES_SCHEMAS` (comma-separated schemas). ## Environment Variables **Database connection** (checked in order): 1. `DATABASE_URL` 2. `SUPABASE_DB_URL` 3. `POSTGRES_URL` 4. Falls back to `db.url` in config **Debug mode:** * `SBT_DEBUG=1` — verbose output for diagnosing issues --- --- url: /supabase-tools/configuration.md description: >- Full reference for supabase-tools.config.json — paths, db, api, plugins, and validation. --- # Configuration All fields in `supabase-tools.config.json` are optional. Defaults work out of the box.
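For illustration, a minimal config might register a single plugin and rely on defaults for everything else (this sketch assumes `@sbtools/plugin-erd` is installed):

```json
{
  "plugins": [
    { "path": "@sbtools/plugin-erd", "config": {} }
  ]
}
```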
## Full Reference ```json { "paths": { "migrations": "supabase/migrations", "snapshot": "supabase/current", "docsOutput": "docs", "functions": "supabase/functions" }, "db": { "url": "postgresql://postgres:postgres@localhost:54322/postgres", "container": "supabase-db" }, "api": { "url": "http://localhost:54321", "studioUrl": "http://localhost:54323", "inbucketUrl": "http://localhost:54324" }, "project": { "name": "your-project-name" }, "plugins": [ { "path": "@sbtools/plugin-erd", "config": { "displayColumns": ["name", "email"] } }, { "path": "@sbtools/plugin-typegen", "config": { "typesOutput": "src/types/supabase.ts" } }, { "path": "@sbtools/plugin-db-test", "config": { "testsDir": "supabase/tests" } } ] } ``` ## Overrides **Database URL** — Set via env: `DATABASE_URL`, `SUPABASE_DB_URL`, or `POSTGRES_URL`. **External Supabase** — Set `api.url` to your instance (e.g. `https://your-project.supabase.co`). **Plugin config** — Each plugin has its own `config` object for plugin-specific settings (output paths, feature flags, etc.). Set `enabled: false` to disable without removing the entry. See individual plugin docs for available config keys. ## Validation Invalid config produces clear errors: ``` ❌ Invalid supabase-tools.config.json: • api.url: api.url must be a valid URL • plugins.0.path: Plugin path must be a non-empty string ``` Unknown top-level keys are rejected (strict mode). --- --- url: /supabase-tools/getting-started.md description: >- Install @sbtools/core and plugins, create supabase-tools.config.json, and run your first commands. 
--- # Getting Started ## Installation ```bash # Install the core CLI + migration studio (flagship plugin) npm install @sbtools/core @sbtools/plugin-migration-studio # Optional additional plugins npm install @sbtools/plugin-erd npm install @sbtools/plugin-migration-audit npm install @sbtools/plugin-logs ``` **npm:** [@sbtools/core](https://www.npmjs.com/package/@sbtools/core) · [@sbtools/plugin-migration-studio](https://www.npmjs.com/package/@sbtools/plugin-migration-studio) · [All packages](https://www.npmjs.com/org/sbtools) ## Configuration Run `npx sbt init` to create `supabase-tools.config.json`, or create it manually: ```json { "plugins": [ { "path": "@sbtools/plugin-migration-studio", "config": {} }, { "path": "@sbtools/plugin-migration-audit", "config": {} } ] } ``` You can use either: * **npm package name** — `"path": "@sbtools/plugin-migration-studio"` (resolved from node\_modules) * **filesystem path** — `"path": "./local-plugins/my-plugin"` (for local development) ## Quick Start (Human) ```bash # Start Supabase Docker services npx sbt start # Check service URLs and connection info npx sbt status # Start the dashboard UI (port 3400) npx sbt dashboard # Start the migration studio server (port 3335) npx sbt migration-studio # Run pending migrations npx sbt migrate ``` ## Quick Start (AI Agent / LLM) The migration studio exposes a typed HTTP tool surface that AI agents can call directly. ```bash # Start the studio server npx sbt migration-studio ``` Then the agent can orient itself in one call: ``` GET http://localhost:3335/api/studio/llm-context ``` This returns the current intent graph state, artifact freshness, full tool catalog with descriptions, and migration count — everything needed to understand the project without exploring files. **Typical agent workflow:** ``` # 1. 
Understand the current backend POST /api/studio/introspect → live DB schema → studio.schema.snapshot POST /api/studio/sql-parse → migration SQL AST → studio.sql.ast POST /api/studio/intent-init → build intent graph → studio.intent.graph # 2. Generate a migration POST /api/studio/create-table → CREATE TABLE SQL → saved to migrations/ # 3. Validate before applying POST /api/studio/release-gate → { status: 'pass' | 'fail', reasons: [] } # 4. Apply if safe POST /api/apply → runs migrations (blocked if gate fails) ``` Discover all available tools: ``` GET /api/studio/catalog?audience=backend-dev&mode=managed&type=tools ``` ## Requirements * Node.js 18+ * Docker (for `start`, `migrate`, and Docker-based plugins) ## Next Steps * [CLI Reference](./cli-reference) — All commands and flags * [Configuration](./configuration) — Full config schema * [Migration Studio](./plugins/plugin-migration-studio) — Complete tool reference * [Writing Plugins](./writing-plugins) — Extend with your own plugin --- --- url: /supabase-tools/architecture/implicit-file-contracts.md --- # Implicit File Contracts and Output Paths This document catalogs the implicit file conventions used by core and plugins. These are real but previously undocumented contracts. Prefer versioned artifacts (`.sbt/artifacts/`) for cross-plugin data sharing.
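To make the contrast with versioned artifacts concrete, here is what consuming an implicit file contract looks like from the consumer's side — the path is hard-coded and nothing checks schema version or freshness. This is a minimal sketch; `readAtlasData` and its signature are illustrative, not part of any package:

```typescript
import { existsSync, readFileSync } from "node:fs";
import { join } from "node:path";

// Consuming an implicit file contract: the consumer hard-codes the agreed
// path and must handle absence itself -- no id, version, or freshness metadata.
function readAtlasData(docsOutput: string): Record<string, unknown> | null {
  const atlasPath = join(docsOutput, "backend-atlas-data.json");
  if (!existsSync(atlasPath)) return null; // e.g. generate-atlas not run yet
  return JSON.parse(readFileSync(atlasPath, "utf8"));
}

// Degrade gracefully when the file is missing:
readAtlasData("/definitely/missing/docs-dir"); // null
```

A versioned-artifact consumer gets the same graceful-degradation behavior plus id/version path resolution and validation through the SDK's `readArtifact`.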
## Core output paths | Path | Producer | Purpose | |------|----------|---------| | `{docsOutput}/backend-atlas-data.json` | `sbt generate-atlas` | Atlas JSON data; plugins contribute via `getAtlasData` | ## Plugin output paths (by convention) | Path | Producer | Purpose | |------|----------|---------| | `{docsOutput}/migration-audit.html` | plugin-migration-audit | Full migration audit report | | `{docsOutput}/migration-audit/<migration>.html` | plugin-migration-audit | Per-migration detail pages (SQL viewer, operations, risk) | | `{docsOutput}/entity-relations/*.md` | plugin-erd | Mermaid ERD diagrams | | `{docsOutput}/openapi-spec.json` | core (docs) | Merged OpenAPI spec | ## Artifact paths (versioned, preferred) | Path | Artifact ID | Purpose | |------|-------------|---------| | `.sbt/artifacts/migration.analysis/1.0.0/latest.json` | migration.analysis | Migration audit result | | `.sbt/artifacts/depgraph.graph/1.0.0/latest.json` | depgraph.graph | Dependency graph | | `.sbt/artifacts/openapi.partial.deno-functions/1.0.0/latest.json` | openapi.partial.deno-functions | Deno functions OpenAPI partial | | `.sbt/artifacts/frontend.usage/1.0.0/latest.json` | frontend.usage | Frontend SDK usage scan | See [artifact registry](./artifact-registry.md) for the full contract set. ## Merge semantics * **Atlas categories**: Plugins add keys to `data.categories`. Key collision overwrites previous value. Use unique, namespaced keys (e.g. `migration_audit`, not `migrations`). * **Atlas stats**: Stats are appended; `countKey` collision updates `meta.object_counts`. Use lowercase-with-underscores for labels to avoid collision. * **Dashboard**: Plugins contribute sections via `getDashboardView()` returning JSON-serializable `DashboardSectionDef[]`. * **OpenAPI**: Deep merge of paths/components. Path collisions are overwritten; plugin should namespace or avoid PostgREST paths. ## Migration path New plugins should use versioned artifacts for data sharing.
Atlas/OpenAPI hooks remain for UI integration but should use unique, prefixed identifiers. --- --- url: /supabase-tools/plugins/plugin-migration-studio-contributing.md description: >- How to contribute new Migration Studio tools and workflows using the catalog-driven architecture. --- # Migration Studio Contributing This guide explains how to add new tools and workflows to `@sbtools/plugin-migration-studio` using the current single-source catalog model. ## Architecture recap Tool and workflow catalogs are discovery-driven: * Tools are discovered from: `packages/plugin-migration-studio/src/tools/modules/*.tool.ts` * Workflows are discovered from: `packages/plugin-migration-studio/src/workflows/*.workflow.ts` Core implementation logic lives in: * `packages/plugin-migration-studio/src/tools/core/*.core.ts` CLI and HTTP wiring are generated from discovered tool definitions: * `packages/plugin-migration-studio/src/index.ts` * `packages/plugin-migration-studio/src/server.ts` ## Add a new tool 1. Create core logic: * `packages/plugin-migration-studio/src/tools/core/studio-<name>.core.ts` 2. Create tool module: * `packages/plugin-migration-studio/src/tools/modules/studio-<name>.tool.ts` 3.
Export: * `tool` (`StudioToolDefinition`) * `metadata` (`ToolAudienceMetadata`) Minimum tool module shape: ```ts import { runMyTool } from "../core/studio-my-tool.core.js"; import type { StudioToolDefinition, ToolAudienceMetadata } from "../tool-definition.js"; export const tool: StudioToolDefinition = { id: "studio-my-tool", async run(ctx) { await runMyTool(ctx); }, cli: { command: "studio-my-tool", description: "Describe command", help: "help text...", parseArgs: () => undefined, }, http: { method: "POST", path: "/api/studio/my-tool", parseRequest: async () => undefined, }, }; export const metadata: ToolAudienceMetadata = { title: "My Tool", whatItDoes: "Plain-English behavior summary.", whenToUse: "When this tool should be used.", whatItNeeds: ["input A", "input B"], whatItProduces: ["artifact X"], audience: "backend-dev", controlModes: ["managed", "assisted"], }; ``` Notes: * `cli.aliases` is optional for backward-compatible command aliases. * `audience` and `controlModes` power persona/mode filtering in `studio-catalog` and `/api/studio/catalog`. ## Add a new workflow 1. Create file: * `packages/plugin-migration-studio/src/workflows/<name>.workflow.ts` 2. Export a `workflow` object of type `StudioWorkflowDefinition`. 3. Reference existing tool IDs in workflow steps.
Example: ```ts import type { WorkflowStep } from "@sbtools/sdk"; import type { StudioWorkflowDefinition } from "./workflow-definition.js"; const steps: WorkflowStep[] = [ { id: "my-step", tool: "studio-my-tool", inputArtifacts: [], outputArtifact: { id: "artifact.my", version: "1.0.0" }, }, ]; export const workflow: StudioWorkflowDefinition = { id: "my-workflow", description: "One-line workflow purpose", steps, }; ``` ## Validation checklist Run: ```bash npm run build -w packages/plugin-migration-studio npm run test -w packages/plugin-migration-studio npm run test:e2e -w packages/plugin-migration-studio ``` Recommended tests: * Tool unit tests under `packages/plugin-migration-studio/tests/tools/` * Workflow/catalog tests under `packages/plugin-migration-studio/tests/workflows/` * DB-aware e2e tests under `packages/plugin-migration-studio/tests/e2e/` DB-backed e2e behavior: * If DB is unavailable, tests should warn and skip DB-dependent flows. * Start local stack with `sbt start` to run full DB-dependent e2e paths. * Set `SBT_STUDIO_E2E_REQUIRE_DB=1` to fail fast instead of warning+skip when DB is unavailable. ## Catalog surfaces for contributors * CLI: * `sbt studio-catalog --type all` * `sbt studio-catalog --audience backend-dev --mode managed` * HTTP: * `GET /api/studio/catalog?type=all` * `GET /api/studio/catalog?audience=business&mode=assisted` Use these to confirm your new tool/workflow appears with expected metadata. --- --- url: /supabase-tools/plugins/plugin-migration-studio-platform.md description: >- Full-stack backend design platform — understand, design, generate, validate, and apply schema changes with confidence. --- # Migration Studio Platform [![npm](https://img.shields.io/npm/v/@sbtools/plugin-migration-studio.svg)](https://www.npmjs.com/package/@sbtools/plugin-migration-studio) The Migration Studio Platform is a workflow-driven backend design system that replaces raw SQL authoring with structured intent. It covers five layers: 1. 
**Understand** — Introspect your live DB and parse migrations into a confidence-scored *intent graph* 2. **Design** — Visual Schema Builder in the dashboard (tables, policies, functions, RPCs, views) 3. **Generate** — Scaffold tools write migration files from intent; no raw SQL authoring needed 4. **Validate** — RLS coverage check, migration lint, RPC security audit, release gate 5. **Apply** — Gate-enforced apply with snapshot staleness detection and audit log Works for both brownfield (existing Postgres/Supabase projects) and greenfield (start from scratch). ## Quick Start **Brownfield (existing project):** ```bash sbt studio-adopt # introspect DB + parse migrations → intent graph sbt studio-release-check # one-shot: rls-check + rpc-lint + lint + release-gate (exits 0/1) sbt migration-studio # open editor, apply ``` **CI / pre-push hook (no server needed):** ```bash sbt studio-release-check --json | jq '.status' # → "pass" or "fail" ``` **Greenfield (new project):** ```bash sbt studio-greenfield-init # create empty intent graph # open dashboard → Schema Builder → design tables and policies sbt studio-release-gate # run gate sbt migration-studio # apply ``` ## Commands ### Layer 1 — Understand | Command | Description | |---------|-------------| | `studio-introspect` | Query live DB → `studio.schema.snapshot` | | `studio-sql-parse` | Parse migration files → `studio.sql.ast` | | `studio-adopt` | Full adoption workflow (introspect → sql-parse → intent-sync → review → approve → intent-init) | | `studio-catalog [--audience <audience>] [--mode <mode>] [--type <type>]` | List discovered tools/workflows from catalog with persona/control-mode filters | | `studio-intent-patch --entity <entity> --action <action> [--status managed\|assisted]` | Mutate a single entity's managed-status in the intent graph | | `studio-endpoint-map` | Derive PostgREST `EndpointNode` declarations for all managed entities and public-schema functions | ### Layer 3 — Generate | Command | Description | |---------|-------------| |
`studio-greenfield-init` | Initialize an empty intent graph (mode: `greenfield`) | | `studio-create-table --schema <schema> --name <name> [--no-rls]` | `CREATE TABLE` migration | | `studio-add-column --entity <entity> --name <name> --type <type> [--nullable] [--default <value>]` | `ALTER TABLE ... ADD COLUMN` (requires intent graph) | | `studio-add-rls-policy --entity <entity> --name <name> --command <command> --roles <roles> [--using <expression>]` | `CREATE POLICY` migration | | `studio-add-index --entity <entity> --name <name> --columns <columns> [--unique]` | `CREATE INDEX` migration | | `studio-add-constraint --entity <entity> --name <name> --type <type>` | `ALTER TABLE ... ADD CONSTRAINT` | | `studio-add-function --schema <schema> --name <name> --returns <type> --language <language> --body-file <file>` | `CREATE OR REPLACE FUNCTION` | | `studio-create-rpc --name <name> --returns <type> --language <language> --body-file <file>` | Same as add-function, forces `schema: public` | | `studio-create-view --schema <schema> --name <name> --query "SELECT ..."` | `CREATE OR REPLACE VIEW` | ### Layer 4 — Validate | Command | Description | |---------|-------------| | `studio-rls-check` | RLS coverage check per managed entity → `studio.rls.plan` + `studio.rls.report` | | `studio-rpc-lint` | Function security audit (DEFINER/search\_path/exposure) → `studio.rpc.plan` | | `studio-migration-plan` | Diff intent graph vs DB snapshot, classify changes → `studio.migration.plan` | | `studio-lint` | Lint migration files (destructive ops, naming, lock safety) → `studio.migration.lint` | | `studio-migration-lint` | Alias for `studio-lint` | | `studio-release-gate` | Aggregate all findings → pass/fail decision → `studio.release.gate` | | `studio-release-check` | **One-shot:** run full release-check workflow without a server. Exits 0 on pass, 1 on fail. `--json` for raw output.
| ## Pipeline ``` Live DB ──────────────────► introspect ──────► studio.schema.snapshot Migration files (.sql) ────► sql-parse ──────► studio.sql.ast │ intent-sync (confidence scoring) │ studio.intent.sync-report │ intent-init (build graph) │ studio.intent.graph ``` Each step produces a **versioned artifact** written to `.sbt/artifacts/`. Artifacts are JSON files with stable IDs — readable, diffable, and usable as inputs to downstream tools. ## The adoption workflow The **adopt-backend** workflow runs all four tools in sequence with two human checkpoints: | Step | Tool | Output artifact | Checkpoint | |------|------|-----------------|------------| | 1 | `studio-introspect` | `studio.schema.snapshot` | — | | 2 | `studio-sql-parse` | `studio.sql.ast` | — | | 3 | `studio-intent-sync` | `studio.intent.sync-report` | — | | — | *(review confidence scores)* | — | **review** | | — | *(confirm managed scope)* | — | **approve** | | 4 | `studio-intent-init` | `studio.intent.graph` | — | After step 3 the workflow pauses. You review the sync report to see which entities were matched between DB and migrations, and at what confidence. After you approve, `intent-init` builds the final intent graph.
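At the review checkpoint you can triage the sync report yourself before approving. A minimal sketch of that step (the `SyncMatch` shape mirrors the `studio.intent.sync-report` artifact; `needsReview` is a hypothetical helper, not an SDK export):

```ts
// Flag sync-report matches that fall below the "managed" confidence threshold.
interface SyncMatch {
  objectId: string;
  objectType: string;
  confidence: number;
  driftDetails?: string;
}

function needsReview(matched: SyncMatch[], threshold = 0.8): SyncMatch[] {
  // Anything below the managed threshold deserves a manual look before approval.
  return matched.filter((m) => m.confidence < threshold);
}

const matched: SyncMatch[] = [
  { objectId: "public.users", objectType: "entity", confidence: 0.98 },
  {
    objectId: "public.orders",
    objectType: "entity",
    confidence: 0.73,
    driftDetails: "column 'legacy_id' in DB but not in SQL",
  },
];

console.log(needsReview(matched).map((m) => m.objectId)); // → [ 'public.orders' ]
```

Entities that survive this filter are safe to approve into the managed scope; the rest are exactly what the checkpoint exists for.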
### Workflow catalog (current) | Workflow | Steps | Status | |------|-------|--------| | `adopt-backend` | `studio-introspect -> studio-sql-parse -> studio-intent-sync -> studio-intent-init` | Implemented | | `release-check` | `studio-rls-check -> studio-rpc-lint -> studio-lint -> studio-release-gate` | Implemented | | `create-table` | `studio-create-table -> studio-lint` | Implemented (guided workflow definition) | | `add-rls-policy` | `studio-add-rls-policy -> studio-rls-check` | Implemented (guided workflow definition) | ## What each tool produces ### `studio-introspect` → `studio.schema.snapshot` Queries `pg_catalog` and `information_schema` for the live state of your database: * All user tables with full column definitions (type, nullable, default, identity, generated) * Constraints (PRIMARY KEY, UNIQUE, CHECK, FOREIGN KEY) * Indexes (method, uniqueness, partial predicate) * RLS policies (command, roles, USING/WITH CHECK expressions) * Functions (language, security type, return type) * Views and materialized views * Triggers * Extensions System schemas (`auth`, `storage`, `realtime`, `supabase_functions`, `extensions`, `pg_catalog`, `information_schema`) are excluded from all queries. 
```json // .sbt/artifacts/studio.schema.snapshot/1.0.0/latest.json { "id": "studio.schema.snapshot", "version": "1.0.0", "data": { "capturedAt": "2025-01-15T14:22:31.000Z", "pgVersion": "PostgreSQL 15.6", "entities": [ { "id": "public.users", "schema": "public", "name": "users", "managedStatus": "managed", "confidence": 0.9, "columns": [ { "name": "id", "type": "uuid", "nullable": false, "default": "gen_random_uuid()" }, { "name": "email", "type": "text", "nullable": false }, { "name": "created_at", "type": "timestamptz", "nullable": true, "default": "now()" } ], "constraints": [ { "name": "users_pkey", "type": "primary_key", "columns": ["id"], "definition": "PRIMARY KEY (id)" } ], "indexes": [] } ], "policies": [...], "extensions": [ { "id": "extension:uuid-ossp", "type": "extension", "name": "uuid-ossp", "details": { "version": "1.1" } } ] } } ``` ### `studio-sql-parse` → `studio.sql.ast` Reads every `.sql` file in your migrations directory alphabetically. For each file: * Parses all statements using `@supabase/pg-parser` (real Postgres C parser compiled to WASM) * Extracts structured intent nodes: entities, policies, functions, views, triggers, extensions * Runs the regex analyzer for risk metadata (destructive ops, transaction wrapping) * Records opaque blocks (DO $$ ... $$, statements that don't map to an intent node type) All extracted nodes are aggregated into `allEntities`, `allPolicies`, etc. for easy access by downstream tools. ```json // .sbt/artifacts/studio.sql.ast/1.0.0/latest.json { "data": { "migrationsDir": "supabase/migrations", "parsedAt": "2025-01-15T14:22:35.000Z", "files": [ { "filename": "20240101_init.sql", "statementCount": 4, "opaqueBlockCount": 1, "entities": [{ "id": "public.users", "schema": "public", "name": "users", "columns": [...] 
}], "policies": [{ "id": "public.users.users_select", "command": "SELECT", "roles": ["authenticated"] }], "riskMeta": { "riskFlags": { "hasTransaction": true, "hasDestructive": false } } } ], "allEntities": [...], "allPolicies": [...] } } ``` ### `studio-intent-sync` → `studio.intent.sync-report` Matches entities between the DB snapshot and the SQL AST by their stable ID (`schema.name`). Produces a confidence score per entity: | Condition | Score | |-----------|-------| | Entity found in both DB and migration files (base) | 0.85 | | Column count exactly matches | +0.05 | | All column names match | +0.05 | | All column types match | +0.03 | | Each column in DB missing from SQL | −0.10 | | Each type mismatch per column | −0.05 | | Entity in DB only (no migration found) | **0.35 fixed** | | Entity in migrations only (not yet in DB) | **0.70 fixed** | Score ≥ 0.80 → `managed`. 0.50–0.79 → `assisted`. < 0.50 → `opaque`. ```json // .sbt/artifacts/studio.intent.sync-report/1.0.0/latest.json { "data": { "syncedAt": "2025-01-15T14:22:37.000Z", "matched": [ { "objectId": "public.users", "objectType": "entity", "confidence": 0.98 }, { "objectId": "public.orders", "objectType": "entity", "confidence": 0.73, "driftDetails": "column 'legacy_id' in DB but not in SQL" } ], "unmatchedDb": [ { "objectId": "public.audit_log", "objectType": "entity", "reason": "no_sql_source" } ], "unmatchedIntent": [ { "objectId": "public.invoices", "objectType": "entity" } ], "summary": { "matchedCount": 14, "unmatchedDbCount": 2, "unmatchedIntentCount": 1, "averageConfidence": 0.881 } } } ``` ### `studio-intent-init` → `studio.intent.graph` Builds the intent graph from the sync report. High-confidence entities become `managed` — the platform understands them well enough to generate, update, and lint changes. Low-confidence or DB-only entities become `opaque` blocks, preserved verbatim. 
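Condensed into code, the scoring rules above look roughly like this (an illustrative sketch only, not the actual `studio-intent-sync` implementation; `Col`, `scoreEntity`, and `classify` are invented names and column shapes are simplified):

```ts
interface Col { name: string; type: string }

function scoreEntity(dbCols: Col[] | null, sqlCols: Col[] | null): number {
  if (dbCols && !sqlCols) return 0.35; // in DB only: no migration found
  if (!dbCols && sqlCols) return 0.7;  // in migrations only: not yet in DB
  if (!dbCols || !sqlCols) return 0;   // nothing to score

  let score = 0.85; // base: entity found in both sources
  if (dbCols.length === sqlCols.length) score += 0.05; // column count matches
  const sqlByName = new Map(sqlCols.map((c): [string, Col] => [c.name, c]));
  const missing = dbCols.filter((c) => !sqlByName.has(c.name));
  if (missing.length === 0) score += 0.05; // all column names match
  score -= 0.1 * missing.length;           // penalty per DB column absent from SQL
  const mismatched = dbCols.filter((c) => {
    const sql = sqlByName.get(c.name);
    return sql !== undefined && sql.type !== c.type;
  });
  if (missing.length === 0 && mismatched.length === 0) score += 0.03; // all types match
  score -= 0.05 * mismatched.length;       // penalty per type mismatch
  return Math.min(1, Math.max(0, score));
}

function classify(score: number): "managed" | "assisted" | "opaque" {
  return score >= 0.8 ? "managed" : score >= 0.5 ? "assisted" : "opaque";
}

const usersCols: Col[] = [
  { name: "id", type: "uuid" },
  { name: "email", type: "text" },
  { name: "created_at", type: "timestamptz" },
];

console.log(scoreEntity(usersCols, usersCols).toFixed(2)); // "0.98" → managed
```

This also shows why a perfect two-source match lands at 0.98 rather than 1.0: the base score caps how much agreement alone can prove.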
```json // .sbt/artifacts/studio.intent.graph/1.0.0/latest.json { "data": { "version": "1.0.0", "mode": "brownfield-managed", "entities": [ { "id": "public.users", "managedStatus": "managed", "confidence": 0.98, "columns": [...] }, { "id": "public.orders", "managedStatus": "assisted", "confidence": 0.73, "columns": [...] } ], "opaqueBlocks": [ { "id": "entity:public.audit_log", "reason": "too-complex", "astAvailable": false, "touchedObjects": ["public.audit_log"] } ], "managedScope": { "schemas": { "public": "managed", "auth": "excluded" }, "explicitExclusions": [] } } } ``` ## Running the workflow Run via CLI (`sbt studio-adopt`), dashboard Adoption page (`sbt migration-studio` then `sbt dashboard` → Adoption → Start Adoption), or API (`startWorkflow` / `resumeWorkflow` from `engine/runner.js`). The Adoption page fetches live data from the studio server (port 3335); it does not use `backend-atlas-data.json`. ## Scaffold Tools (Layer 3 — Generate) All scaffold tools write a timestamped `.sql` migration file to your migrations directory and return `{ sql, filename }`. No DB connection required (except `studio-add-column`, which needs the intent graph). | Tool | Generates | Intent graph required? | |------|-----------|----------------------| | `studio-create-table` | `CREATE TABLE ... ENABLE ROW LEVEL SECURITY` | No | | `studio-add-column` | `ALTER TABLE ... ADD COLUMN ...` | Yes (entity must be managed) | | `studio-add-rls-policy` | `CREATE POLICY "..." ON ... FOR ... TO ... USING (...)` | No | | `studio-add-index` | `CREATE INDEX ... ON ... (...)` | No | | `studio-add-constraint` | `ALTER TABLE ... ADD CONSTRAINT ...` | No | | `studio-add-function` | `CREATE OR REPLACE FUNCTION ...` | No | | `studio-create-rpc` | Same as add-function, forces `schema: public` | No | | `studio-create-view` | `CREATE OR REPLACE VIEW schema.name AS <query>` | No | ## Validation Tools (Layer 4 — Validate) Each validation tool reads existing artifacts and produces structured findings.
No DB writes. ### `studio-rls-check` → `studio.rls.plan` + `studio.rls.report` For every managed entity with RLS enabled: * Verifies SELECT/INSERT/UPDATE/DELETE policy coverage (ALL command counts for all) * Flags tables with no policies at all * Flags SECURITY DEFINER functions that could bypass RLS ### `studio-rpc-lint` → `studio.rpc.plan` For every `FunctionNode` in the intent graph: * `DEFINER_NO_SEARCH_PATH` — definer function without `search_path` set (SQL injection risk) * `DEFINER_PUBLIC_EXPOSURE` — definer function in public schema exposed to PostgREST without explicit auth guard * `EMPTY_FUNCTION_BODY` — function registered in intent graph with no body ### `studio-migration-plan` → `studio.migration.plan` Diffs the intent graph against the live DB snapshot. Classifies each change: | Change class | Example | Safety | |---|---|---| | `additive_safe` | Add nullable column | Always safe | | `additive_with_default` | Add NOT NULL column with default | Safe with lock note | | `type_change_narrowing` | `text` → `varchar(100)` | Risky | | `drop` | Drop column or table | Dangerous | | `policy_change` | Update RLS expression | Review carefully | | `constraint_change` | Add FK | Table scan required | Additive changes are ordered first; destructive changes last. Includes a `snapshotHash` to detect stale plans. ### `studio-lint` → `studio.migration.lint` Reads the SQL AST artifact. 
Checks per migration file: * `TRUNCATE_DETECTED` — error (blocks release gate) * `DROP_DETECTED` — warning * `DESTRUCTIVE_NO_TRANSACTION` — warning (DROP/TRUNCATE not wrapped in transaction) * `LOW_PARSE_CONFIDENCE` — info (high opaque block ratio) * `NAMING_VIOLATION` — info (filename not timestamp-prefixed) ### `studio-release-gate` → `studio.release.gate` Aggregates all findings from RLS report, RPC plan, and migration lint: * Any lint error or RLS gap → `status: 'fail'` (blocking) * Any warning → gate warning (non-blocking) * `NO_VALIDATION` when no evidence artifacts exist `POST /api/apply` reads this artifact. If `status === 'fail'` → 422 response, apply blocked. If no gate artifact → apply proceeds with a `gateWarning` in the response. ## Greenfield Workflow For new projects with no existing DB: ```bash sbt studio-greenfield-init # creates empty intent graph (mode: 'greenfield') # open dashboard → /schema-builder → design tables, policies, functions, views # each "Generate Migration" button writes a .sql file to your migrations dir sbt studio-release-gate # validate before applying sbt migration-studio # apply via browser editor ``` ## Intent Graph Mutation After the adoption workflow, you can reclassify entities without re-running the full workflow: ```bash # Exclude a table from management (e.g. a Supabase internal table) sbt studio-intent-patch --entity public.audit_log --action exclude # Promote an assisted entity to managed sbt studio-intent-patch --entity public.orders --action set-status --status managed ``` The Adoption dashboard page provides the same functionality via per-row "Manage" and "Exclude" buttons with color-coded status badges. ## Endpoint Mapping Derives PostgREST endpoint declarations from the intent graph: ```bash sbt studio-endpoint-map ``` For each managed entity: creates a `table-crud` endpoint with `allowedRoles` derived from its RLS policies. For each managed public-schema function: creates an `rpc` endpoint. 
Results are written back into the intent graph as `EndpointNode[]`. ## Apply (Layer 5) `POST /api/apply` in the migration studio server enforces the full apply safety chain: 1. **Gate check** — reads `studio.release.gate`; blocks with 422 if `status === 'fail'` 2. **Snapshot staleness** — reads `studio.migration.plan`'s `snapshotHash`, recomputes from current snapshot; includes `snapshotStale: true` in the response if the DB has changed since the plan was generated (non-blocking warning) 3. **Apply** — runs migrations 4. **Audit log** — writes `studio.apply.log` with `{ appliedAt, output, success: true }` for every successful apply ## Dashboard Integration * **Adoption page** (`/adoption`) — workflow status, Start/Resume/Restart, step table, intent graph entity list with interactive status badges and "Manage"/"Exclude" buttons per row, "Map Endpoints" button. Requires `sbt migration-studio` running. * **Schema Builder page** (`/schema-builder`) — visual forms for creating tables, RLS policies, functions, RPCs, and views; live SQL preview; "Initialize Greenfield Project" when no intent graph exists; Release Gate panel. * **Overview** — intent entities appear as a tab after `sbt generate-atlas`. The plugin contributes `studio_intent_entities` via `getAtlasData()`. 
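The gate check in step 1 of the apply chain above reduces to a small decision function, sketched here with illustrative names (`GateArtifact`, `GateResult`, and `gateCheck` are not real exports; the actual logic lives in the studio server's `POST /api/apply` route):

```ts
interface GateArtifact {
  data: { status: "pass" | "fail"; reasons?: string[] };
}

type GateResult =
  | { proceed: true; gateWarning?: string }
  | { proceed: false; httpStatus: 422; reasons: string[] };

function gateCheck(gate: GateArtifact | null): GateResult {
  if (!gate) {
    // No gate artifact: apply proceeds, but the response carries a warning.
    return { proceed: true, gateWarning: "no release gate artifact found" };
  }
  if (gate.data.status === "fail") {
    // Blocking: the route answers 422 and the migrations never run.
    return { proceed: false, httpStatus: 422, reasons: gate.data.reasons ?? [] };
  }
  return { proceed: true };
}
```

Note the asymmetry: a failing gate blocks hard, but a *missing* gate only warns, so projects that have not yet adopted the validation layer can still apply.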
## Reading the artifacts All artifacts land in `.sbt/artifacts/` as `latest.json` files: ``` .sbt/artifacts/ studio.schema.snapshot/1.0.0/latest.json ← live DB state studio.sql.ast/1.0.0/latest.json ← migration file parse results studio.intent.sync-report/1.0.0/latest.json ← confidence-scored match studio.intent.graph/1.0.0/latest.json ← final intent graph + endpoints studio.workflow.run/1.0.0/latest.json ← run state (step results, status) studio.rls.plan/1.0.0/latest.json ← proposed RLS policies studio.rls.report/1.0.0/latest.json ← coverage gaps and warnings studio.rpc.plan/1.0.0/latest.json ← function security findings studio.migration.plan/1.0.0/latest.json ← ordered SQL change plan studio.migration.lint/1.0.0/latest.json ← migration file lint results studio.release.gate/1.0.0/latest.json ← pass/fail release decision studio.apply.log/1.0.0/latest.json ← most recent apply record ``` Read them from code: ```ts import { readArtifactOrNull } from "@sbtools/sdk"; import type { SchemaSnapshotData } from "@sbtools/plugin-migration-studio/src/artifacts/writers.js"; const snapshot = readArtifactOrNull(ctx, "studio.schema.snapshot", "1.0.0"); if (snapshot) { console.log(`${snapshot.data.entities.length} tables found`); console.log(`${snapshot.data.policies.length} RLS policies`); } ``` ## Workflow engine The engine (`engine/runner.ts`) is a ~120-line sequential pipeline runner with no external framework dependency: ```ts import { startWorkflow, resumeWorkflow } from "...engine/runner.js"; import type { WorkflowStep, ToolRegistry } from "...engine/runner.js"; ``` Key behaviors: * **Sequential execution** — steps run in order; a step cannot start until the previous completes * **Checkpoint pause** — steps with `checkpoint: 'review'` or `'approve'` pause the run and persist state; the caller resumes when ready * **Failure isolation** — a failing step records the error in the run artifact and stops; subsequent steps are not called * **Skip conditions** — steps with 
`skipWhen: (run) => boolean` are skipped if the condition holds at runtime * **Full audit trail** — every step result (status, artifact produced, timestamps, error) is written to `studio.workflow.run` Resuming a paused run: ```ts // Load persisted run from artifact storage const existingRun = loadWorkflowRun(ctx); if (existingRun?.status === 'waiting_checkpoint') { const continued = await resumeWorkflow(existingRun, steps, ctx, registry); } ``` ## Artifact IDs | Artifact | ID | Version | Written by | |----------|----|---------|-----------| | DB schema snapshot | `studio.schema.snapshot` | 1.0.0 | `studio-introspect` | | SQL AST parse results | `studio.sql.ast` | 1.0.0 | `studio-sql-parse` | | Confidence sync report | `studio.intent.sync-report` | 1.0.0 | `studio-intent-sync` | | Intent graph | `studio.intent.graph` | 1.0.0 | `studio-intent-init`, `studio-intent-patch`, `studio-endpoint-map`, `studio-greenfield-init` | | Workflow run | `studio.workflow.run` | 1.0.0 | workflow engine | | RLS plan | `studio.rls.plan` | 1.0.0 | `studio-rls-check` | | RLS coverage report | `studio.rls.report` | 1.0.0 | `studio-rls-check` | | RPC security plan | `studio.rpc.plan` | 1.0.0 | `studio-rpc-lint` | | Migration change plan | `studio.migration.plan` | 1.0.0 | `studio-migration-plan` | | Migration lint results | `studio.migration.lint` | 1.0.0 | `studio-lint` | | Release gate decision | `studio.release.gate` | 1.0.0 | `studio-release-gate` | | Apply audit log | `studio.apply.log` | 1.0.0 | `POST /api/apply` | ## Confidence scoring explained Confidence is a 0.0–1.0 score per entity node that answers: *"How well does the platform understand this object?"* | Range | Status | What the platform does | |-------|--------|------------------------| | **0.80 – 1.0** | `managed` | Full management. Can generate, update, lint, gate. | | **0.50 – 0.79** | `assisted` | Managed with caution. User must review before updates. | | **< 0.50** | `opaque` | Preserved verbatim. 
Not touched by generation. | An entity gets a high confidence score when: * It appears in both the live DB and the migration files * Its column names and types match between both sources * There are no unexplained columns in the DB that don't appear in any migration An entity gets a low confidence score when: * It exists only in the DB with no migration history (manually created, seeded externally) * Its DB columns diverge significantly from what the migrations declare Opaque nodes are not deleted — they become `OpaqueBlock` entries in the intent graph with `reason: 'too-complex'`. Future tools can promote opaque nodes to assisted or managed as more intent graph node types are added. ## Managed scope rules The platform never touches what it doesn't understand: * **`managed`** — fully controlled; platform can generate and update SQL * **`assisted`** — partially understood; user reviews before any changes * **`opaque`** — unknown or too-complex; preserved verbatim, never rewritten * **`excluded`** — explicitly opted out; ignored even if parseable System schemas (`auth`, `storage`, `realtime`, etc.) are always `excluded`. User schemas default to `managed` in the scope declaration but individual objects within them may be `opaque` based on confidence. ## Dependencies `pg` is required for `studio-introspect`. It is already listed in `plugin-migration-studio`'s dependencies. `@supabase/pg-parser` (WASM) is required for `studio-sql-parse`. It lazy-loads the WASM binary on first parse call — no explicit initialization needed. 
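In code, the managed-scope rules above amount to a guard that a generation tool could run before writing any SQL. A sketch (`assertWritable` and `EntityRef` are hypothetical names, not real exports):

```ts
type ManagedStatus = "managed" | "assisted" | "opaque" | "excluded";

interface EntityRef {
  id: string;
  managedStatus: ManagedStatus;
}

function assertWritable(entity: EntityRef, userReviewed = false): void {
  if (entity.managedStatus === "managed") return; // fully controlled
  if (entity.managedStatus === "assisted" && userReviewed) return; // user signed off
  // opaque and excluded entities are never rewritten
  throw new Error(`refusing to modify ${entity.id}: status is '${entity.managedStatus}'`);
}

assertWritable({ id: "public.users", managedStatus: "managed" }); // ok
```

An `assisted` entity passes only with an explicit review flag, matching the "user reviews before any changes" contract above.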
## See also * [Migration Studio editor](./plugin-migration-studio.md) — the SQL authoring UI * [Migration Audit](./plugin-migration-audit.md) — drift detection against the DB tracking table * [Artifact Registry](../architecture/artifact-registry.md) — all artifact IDs in the project * [Artifact Contract Guide](../architecture/artifact-contract-guide.md) — how to produce and consume artifacts --- --- url: /supabase-tools/plans/platform-status-and-roadmap.md --- # Migration Studio Platform — Status & Roadmap to 100% > Last updated: 2026-02-20 (catalog expansion + persona roadmap update) > This document is a living status report. Update the checkboxes and sections as work completes. *** ## The Vision A **workflow-driven backend platform** that replaces raw SQL authoring with structured intent: 1. **Understand** what exists (introspect live DB + parse migrations → confidence-scored knowledge graph) 2. **Design** what should exist (visual schema builder, policy builder, function builder in the dashboard) 3. **Generate** atomic SQL from that intent (migration files, not raw editing) 4. **Validate** before applying (lint, RLS coverage check, release gate) 5. **Apply** with confidence (gate must pass; audit trail in artifacts) The end state: a developer opens the dashboard, designs a table with columns and RLS policies through forms, clicks "Generate Migration", reviews the SQL in the editor, and applies it — without writing a line of SQL directly. *** ## Layer Map The platform has five conceptual layers. Implementation status follows. ``` ┌─────────────────────────────────────────────────────────────────┐ │ 5. APPLY │ Gate check → sbt migrate → audit trail │ ├─────────────────┼─────────────────────────────────────────────────┤ │ 4. VALIDATE │ RLS check, RPC lint, migration lint, release gate│ ├─────────────────┼─────────────────────────────────────────────────┤ │ 3. 
GENERATE │ Scaffold tools → migration files │ ├─────────────────┼─────────────────────────────────────────────────┤ │ 2. DESIGN │ Visual builder UI (forms → intent → preview) │ ├─────────────────┼─────────────────────────────────────────────────┤ │ 1. UNDERSTAND │ Introspect DB + parse migrations → intent graph │ └─────────────────┴─────────────────────────────────────────────────┘ ``` *** ## Current Status by Layer ### Layer 1 — Understand ✅ Complete The "brownfield adoption" chain is fully implemented and tested. | Component | File | Status | |---|---|---| | SQL parser (WASM) | `sql-parser.ts` | ✅ Done | | DB introspect tool | `tools/modules/studio-introspect.tool.ts` | ✅ Done | | Migration file parser | `tools/modules/studio-sql-parse.tool.ts` | ✅ Done | | Confidence scorer | `tools/modules/studio-intent-sync.tool.ts` | ✅ Done | | Intent graph builder | `tools/modules/studio-intent-init.tool.ts` | ✅ Done | | Workflow engine | `engine/runner.ts` | ✅ Done | | adopt-backend workflow | `workflows/adopt-backend.workflow.ts` | ✅ Done | | `sbt studio-adopt` CLI | `index.ts` | ✅ Done | | HTTP adopt routes | `server.ts` | ✅ Done | | Dashboard Adoption page | `pages/Adoption.tsx` | ✅ Done (interactive — status badges, Exclude/Manage buttons) | | Artifact writers + schemas | `artifacts/writers.ts` | ✅ Done | **What works end-to-end today:** Run `sbt studio-adopt`, watch it introspect your live DB, parse your migration files, score confidence per entity, and produce an intent graph that classifies every table/function/policy as `managed`, `assisted`, or `opaque`. 
**What works end-to-end in Phase 10:** * `sbt studio-intent-patch --entity public.logs --action exclude` patches a single entity's classification and persists to disk * `sbt studio-endpoint-map` derives `EndpointNode` declarations for all managed entities and public-schema functions * The Adoption dashboard page is now interactive: color-coded status badges, "Manage" / "Exclude" buttons per row, "Map Endpoints" button with result count *** ### Layer 2 — Design ✅ Complete A visual Schema Builder page exists at `/schema-builder` in the dashboard. It calls the existing scaffold HTTP routes on port 3335. | Component | File | Status | |---|---|---| | Schema entity builder (new table form) | `pages/SchemaBuilder.tsx` | ✅ Done | | RLS policy builder (visual form) | `pages/SchemaBuilder.tsx` | ✅ Done | | Live SQL preview (client-side) | `pages/SchemaBuilder.tsx` | ✅ Done | | Column editor (add/remove/change type) | `pages/SchemaBuilder.tsx` | ✅ Done (inline table) | | Scaffold HTTP routes | `server.ts` | ✅ All 13 routes wired | | Scaffold tools (backend) | `tools/modules/studio-*.tool.ts` + `tools/core/studio-*.core.ts` | ✅ All 8 tools | | Function builder (form + body editor) | `pages/SchemaBuilder.tsx` | ✅ Done (Phase 11) | | RPC builder (public schema, PostgREST) | `pages/SchemaBuilder.tsx` | ✅ Done (Phase 11) | | View builder | `pages/SchemaBuilder.tsx` | ✅ Done (Phase 11) | | Greenfield init (start with no DB) | `pages/SchemaBuilder.tsx` | ✅ Done (Phase 9) | | Intent graph mutation from UI | `pages/Adoption.tsx` | ✅ Done (Phase 10 — entity row actions) | **What works:** Open the dashboard at `/schema-builder` → fill in forms for any object type (table, RLS policy, function, RPC, view) → live SQL preview updates as you type → click "Generate Migration" → file written to disk → ready to apply. The Adoption page lets you reclassify entities directly and map endpoints. *** ### Layer 3 — Generate ✅ Complete All scaffold tools are implemented. 
Each generates a migration SQL file and writes it to the migrations directory. #### Scaffold tools | Tool | File | Generates | Status | |---|---|---|---| | add-column | `tools/core/studio-add-column.core.ts` | `ALTER TABLE ... ADD COLUMN ...` | ✅ | | add-function | `tools/core/studio-add-function.core.ts` | `CREATE OR REPLACE FUNCTION ...` | ✅ | | create-rpc | `tools/core/studio-create-rpc.core.ts` | Same, forced `schema: public` | ✅ | | create-table | `tools/core/studio-create-table.core.ts` | `CREATE TABLE ... ENABLE ROW LEVEL SECURITY` | ✅ (Phase 6) | | add-rls-policy | `tools/core/studio-add-rls-policy.core.ts` | `CREATE POLICY ...` | ✅ (Phase 6) | | add-index | `tools/core/studio-add-index.core.ts` | `CREATE INDEX ...` | ✅ (Phase 6) | | add-constraint | `tools/core/studio-add-constraint.core.ts` | `ALTER TABLE ... ADD CONSTRAINT ...` | ✅ (Phase 6) | | create-view | `tools/core/studio-create-view.core.ts` | `CREATE OR REPLACE VIEW ...` | ✅ (Phase 11) | **Note on intent graph dependency:** `add-column` requires the intent graph. All other scaffold tools work without it. `create-table` optionally writes a minimal `EntityNode` after creation for greenfield bootstrap. #### Workflow coverage | Workflow | Steps | Status | |---|---|---| | `adopt-backend` | introspect → sql-parse → intent-sync (review) → intent-init (approve) | ✅ Done | | `greenfield-init` | init-graph (empty) → Schema Builder forms → generate-create-table × N | ✅ Done (Phase 9) | | `release-check` | rls-check → rpc-lint → migration-lint → release-gate | ✅ Cataloged | | `create-table` | generate-create-table → migration-lint | ✅ Cataloged (guided) | | `add-rls-policy` | generate-add-rls-policy → rls-check preview | ✅ Cataloged (guided) | *** ### Layer 4 — Validate ✅ Complete All five validation tools are implemented, tested, wired to HTTP routes, and exposed as CLI commands. 
| Artifact | Writer | Tool | CLI | HTTP | Status | |---|---|---|---|---|---| | `studio.rls.plan` | ✅ `writeRlsPlanArtifact` | ✅ `tools/rls-check.ts` | ✅ `studio-rls-check` | ✅ `POST /api/studio/rls-check` | Done | | `studio.rls.report` | ✅ `writeRlsReportArtifact` | ✅ (same tool) | ✅ | ✅ | Done | | `studio.rpc.plan` | ✅ `writeRpcPlanArtifact` | ✅ `tools/rpc-lint.ts` | ✅ `studio-rpc-lint` | ✅ `POST /api/studio/rpc-lint` | Done | | `studio.migration.plan` | ✅ `writeMigrationPlanArtifact` | ✅ `tools/migration-plan.ts` | ✅ `studio-migration-plan` | ✅ `POST /api/studio/migration-plan` | Done | | `studio.migration.lint` | ✅ `writeMigrationLintArtifact` | ✅ `tools/core/studio-migration-lint.core.ts` | ✅ `studio-lint` (alias: `studio-migration-lint`) | ✅ `POST /api/studio/migration-lint` | Done | | `studio.release.gate` | ✅ `writeReleaseGateArtifact` | ✅ `tools/release-gate.ts` | ✅ `studio-release-gate` | ✅ `POST /api/studio/release-gate` | Done | **What each tool does:** * **`rls-check`** — Reads intent graph. For every managed entity, verifies SELECT/INSERT/UPDATE/DELETE policy coverage (ALL command counts for all). Flags SECURITY DEFINER functions without `search_path`. Produces `studio.rls.plan` (proposed policies) and `studio.rls.report` (coverage + gap analysis). * **`rpc-lint`** — Reads all `FunctionNode` entries in intent graph. Checks: DEFINER\_NO\_SEARCH\_PATH, DEFINER\_PUBLIC\_EXPOSURE (definer in public schema), EMPTY\_FUNCTION\_BODY. Skips opaque/excluded functions. Produces `studio.rpc.plan`. * **`migration-plan`** — Diffs intent graph vs schema snapshot. Classifies each change as `additive_safe` / `additive_with_default` / `type_change_narrowing` / `drop` / `policy_change` / `constraint_change`. Sorts additive changes first, destructive last. Produces `studio.migration.plan` with a `snapshotHash`. * **`migration-lint`** — Reads SQL AST artifact. 
Checks per migration file: TRUNCATE\_DETECTED (error), DROP\_DETECTED (warning), DESTRUCTIVE\_NO\_TRANSACTION (warning), LOW\_PARSE\_CONFIDENCE (info), NAMING\_VIOLATION (non-timestamped filename). Produces `studio.migration.lint`. * **`release-gate`** — Aggregates findings from RLS report, RPC plan, and migration lint (all optional). NO\_VALIDATION if no evidence. RLS gaps and lint errors → blocking. All warnings → gate warnings. Produces `studio.release.gate` with pass/fail status. *** ### Layer 5 — Apply ✅ Complete The core `sbt migrate` command and the migration studio's `POST /api/apply` route exist and work. Gate enforcement, snapshot staleness detection, and an audit log are all implemented. | Component | Status | Notes | |---|---|---| | `sbt migrate` | ✅ Done | Core command, applies SQL files | | `POST /api/apply` in studio | ✅ Done | Browser-triggered apply | | Release gate enforcement at apply | ✅ Done | 422 block when `studio.release.gate` status is `fail`; warning header when no gate artifact | | Snapshot staleness check | ✅ Done (Phase 11) | Reads `studio.migration.plan` hash, re-hashes current snapshot; returns `snapshotStale: true` if changed (non-blocking warning) | | Audit artifact written on apply | ✅ Done (Phase 11) | Writes `studio.apply.log` (`STUDIO_ARTIFACTS.APPLY_LOG`) with `appliedAt`, `output`, `success` after every successful apply | *** ## Tools & Workflows Catalogue (Complete Picture) ### Tools: what exists vs what the artifact system anticipates | Tool name | Artifact out | Implemented | CLI | HTTP | |---|---|---|---|---| | `studio-introspect` | `studio.schema.snapshot` | ✅ | ✅ | ✅ | | `studio-sql-parse` | `studio.sql.ast` | ✅ | ✅ | ✅ | | `studio-intent-sync` | `studio.intent.sync-report` | ✅ | (via adopt) | ✅ (`POST /api/studio/intent-sync`) | | `studio-intent-init` | `studio.intent.graph` | ✅ | (via adopt) | ✅ (`POST /api/studio/intent-init`) | | `studio-add-column` | migration file | ✅ | ✅ | ✅ | | `studio-add-function` | migration 
file | ✅ | ✅ | ✅ | | `studio-create-rpc` | migration file | ✅ | ✅ | ✅ | | `studio-create-table` | migration file | ✅ | ✅ | ✅ | | `studio-add-rls-policy` | migration file | ✅ | ✅ | ✅ | | `studio-add-index` | migration file | ✅ | ✅ | ✅ | | `studio-add-constraint` | migration file | ✅ | ✅ | ✅ | | `studio-create-view` | migration file | ✅ | ✅ | ✅ | | `studio-rls-check` | `studio.rls.plan` + `studio.rls.report` | ✅ | ✅ | ✅ | | `studio-rpc-lint` | `studio.rpc.plan` | ✅ | ✅ | ✅ | | `studio-migration-plan` | `studio.migration.plan` | ✅ | ✅ | ✅ | | `studio-lint` | `studio.migration.lint` | ✅ | ✅ (`studio-migration-lint` alias) | ✅ | | `studio-release-gate` | `studio.release.gate` | ✅ | ✅ | ✅ | | `studio-greenfield-init` | `studio.intent.graph` (greenfield) | ✅ | ✅ | ✅ | | `studio-intent-patch` | `studio.intent.graph` (mutated) | ✅ | ✅ | ✅ | | `studio-endpoint-map` | `studio.intent.graph` (endpoints added) | ✅ | ✅ | ✅ | ### Workflows: what exists vs what's planned | Workflow | Steps | Implemented | |---|---|---| | `adopt-backend` | introspect → sql-parse → intent-sync (review) → intent-init (approve) | ✅ | | `greenfield-init` | init-graph (empty) → Schema Builder forms → generate-create-table × N | ✅ (tool + UI) | | `create-table` | generate-create-table → migration-lint | ✅ (cataloged guided workflow) | | `add-rls-policy` | generate-add-rls-policy → rls-check preview | ✅ (cataloged guided workflow) | | `release-check` | rls-check → rpc-lint → migration-lint → release-gate | ✅ | *** ## Gap Analysis: How Far to 100% ``` Layer 1: Understand ████████████████████ 100% (endpoint mapping + UI mutation complete) Layer 2: Design ████████████████████ 100% (table + RLS + function + RPC + view builders) Layer 3: Generate ████████████████████ 100% (all 8 scaffold tools + greenfield-init) Layer 4: Validate ████████████████████ 100% (5/5 validation tools, CLI + HTTP wired) Layer 5: Apply ████████████████████ 100% (apply + gate enforcement + snapshot verify + audit log) Overall platform: 
100% for original core capability layers; catalog richness now expanding ✅ ``` The infrastructure investment is high-quality and pays forward. Core capability layers are complete; current work is about catalog ergonomics, persona coverage, and guided workflow breadth. *** ## Roadmap to 100% ### Phase 6 — Missing Scaffold Tools (Layer 3 completion) The most impactful next batch: filling in the generation tools that users actually need. **Priority 1 — `generate-create-table`** ```ts interface CreateTableInput { schema: string; name: string; columns: Array<{ name: string; type: string; nullable: boolean; default?: string; identity?: boolean }>; primaryKey?: string[]; // column names; defaults to first identity column enableRls: boolean; // default: true (always yes for Supabase) } ``` Generates: ```sql CREATE TABLE public.users ( id uuid PRIMARY KEY DEFAULT gen_random_uuid(), email text NOT NULL, created_at timestamptz DEFAULT now() ); ALTER TABLE public.users ENABLE ROW LEVEL SECURITY; ``` Does **not** require an existing intent graph — this is the greenfield entry point. After writing the file, optionally writes a minimal `EntityNode` into the intent graph to avoid a round-trip `studio-adopt` for brand-new tables. 
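Given the `CreateTableInput` shape above, the rendering step might look like the following sketch. It is simplified: it does not implement the "defaults to first identity column" primary-key fallback, and it skips file writing and the optional intent-graph write-back.

```typescript
// Sketch only: renders CreateTableInput to SQL. The shipped tool also writes
// the migration file and optionally records a minimal EntityNode.
interface CreateTableInput {
  schema: string;
  name: string;
  columns: Array<{
    name: string;
    type: string;
    nullable: boolean;
    default?: string;
    identity?: boolean;
  }>;
  primaryKey?: string[];
  enableRls: boolean;
}

export function renderCreateTable(input: CreateTableInput): string {
  const pk = input.primaryKey ?? [];
  const lines = input.columns.map((c) => {
    const parts = [c.name, c.type];
    // A single-column primary key is declared inline; PK implies NOT NULL.
    if (pk.length === 1 && pk[0] === c.name) parts.push("PRIMARY KEY");
    else if (!c.nullable) parts.push("NOT NULL");
    if (c.identity) parts.push("GENERATED ALWAYS AS IDENTITY");
    if (c.default !== undefined) parts.push(`DEFAULT ${c.default}`);
    return `  ${parts.join(" ")}`;
  });
  if (pk.length > 1) lines.push(`  PRIMARY KEY (${pk.join(", ")})`);
  const table = `${input.schema}.${input.name}`;
  let sql = `CREATE TABLE ${table} (\n${lines.join(",\n")}\n);`;
  if (input.enableRls) sql += `\nALTER TABLE ${table} ENABLE ROW LEVEL SECURITY;`;
  return sql;
}
```

Feeding it the `users` example from above reproduces the generated SQL shown earlier, including the trailing `ENABLE ROW LEVEL SECURITY` statement.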
**Priority 2 — `generate-add-rls-policy`** ```ts interface AddRlsPolicyInput { entityId: string; policy: { name: string; command: 'SELECT' | 'INSERT' | 'UPDATE' | 'DELETE' | 'ALL'; roles: string[]; using?: string; // USING expression withCheck?: string; // WITH CHECK expression permissive?: boolean; // default true }; } ``` Generates: ```sql CREATE POLICY "users_select_authenticated" ON public.users FOR SELECT TO authenticated USING (auth.uid() = id); ``` **Priority 3 — `generate-add-index` and `generate-add-constraint`** Simple generators for common schema additions: ```sql -- index CREATE INDEX users_email_idx ON public.users (email); -- FK constraint ALTER TABLE public.orders ADD CONSTRAINT orders_user_id_fkey FOREIGN KEY (user_id) REFERENCES public.users (id) ON DELETE CASCADE; ``` *** ### Phase 7 — Validation Chain (Layer 4) Five tools that read the intent graph + DB snapshot and produce structured findings. No DB writes — pure analysis. **`rls-check`** (highest value): * Input: `studio.intent.graph`, `studio.schema.snapshot` * Logic: for each managed entity with RLS enabled, verify SELECT/INSERT/UPDATE/DELETE coverage. Flag tables with no policies. Flag SECURITY DEFINER functions. * Output: `studio.rls.plan` (what policies should exist) + `studio.rls.report` (gaps + warnings) * This one tool immediately answers: "is my Supabase project secure?" **`migration-lint`**: * Input: `studio.sql.ast` (parsed migration files) * Logic: reuses `riskMeta` already extracted by sql-parse. Adds: naming convention checks, lock-unsafe patterns, missing transaction wrappers around destructive ops. * Output: `studio.migration.lint` **`rpc-lint`**: * Input: `studio.intent.graph` (FunctionNode list) * Logic: check security model, search\_path, exposed to PostgREST without auth guard. * Output: `studio.rpc.plan` **`migration-plan`**: * Input: `studio.intent.graph` + `studio.schema.snapshot` * Logic: diff intent graph against live DB. Classify each delta by `ChangeClass`. 
Order changes safely. * Output: `studio.migration.plan` **`release-gate`**: * Input: `studio.rls.report`, `studio.rpc.plan`, `studio.migration.lint` * Logic: aggregate all findings, produce single pass/fail. * Output: `studio.release.gate` *** ### Phase 8 — Dashboard Schema Builder (Layer 2) The visual builder is the layer users actually see. It calls the existing scaffold HTTP routes. **Schema Builder panel** (new section in Adoption page or standalone route `/builder`): ``` ┌─────────────────────────────────────────────────────────────────┐ │ New Table [Cancel] │ ├─────────────────────────────────────────────────────────────────┤ │ Schema [public ▼] Name [users___________] │ ├─────────────────────────────────────────────────────────────────┤ │ Columns [+ Add Column] │ │ ┌────────────────┬──────────────────┬──────────┬──────────────┐ │ │ │ id │ uuid │ NOT NULL │ [Delete] │ │ │ │ email │ text │ NOT NULL │ [Delete] │ │ │ │ created_at │ timestamptz │ NULL │ [Delete] │ │ │ └────────────────┴──────────────────┴──────────┴──────────────┘ │ ├─────────────────────────────────────────────────────────────────┤ │ [x] Enable RLS │ ├─────────────────────────────────────────────────────────────────┤ │ Generated SQL ──────────────────────────────────────────────── │ │ CREATE TABLE public.users ( │ │ id uuid PRIMARY KEY DEFAULT gen_random_uuid(), │ │ email text NOT NULL, │ │ created_at timestamptz DEFAULT now() │ │ ); │ │ ALTER TABLE public.users ENABLE ROW LEVEL SECURITY; │ └──────────────────────────────────────────────────────[Generate]─┘ ``` **RLS Policy Builder** (companion to entity builder): ``` ┌─────────────────────────────────────────────────────────────────┐ │ Add Policy for: public.users [Cancel] │ ├─────────────────────────────────────────────────────────────────┤ │ Name [users_select_authenticated_____] │ │ Command [SELECT ▼] Roles [authenticated ▼] │ │ USING [auth.uid() = id___________________] │ │ WITH CHECK [________________________] (optional) │ 
├─────────────────────────────────────────────────────────────────┤ │ Generated SQL ──────────────────────────────────────────────── │ │ CREATE POLICY "users_select_authenticated" │ │ ON public.users FOR SELECT TO authenticated │ │ USING (auth.uid() = id); │ └──────────────────────────────────────────────────────[Generate]─┘ ``` "Generate" → calls `POST /api/studio/scaffold/create-table` → file written to disk → opens file in migration studio editor for final review → user clicks Apply. *** ### Phase 9 — Greenfield Workflow + Gate Enforcement (Layers 1+5) ✅ Complete **Greenfield init (✅):** * `sbt studio-greenfield-init` / `POST /api/studio/greenfield-init` — creates empty intent graph with `mode: 'greenfield'` * Schema Builder page shows "Project Setup" panel: if no intent graph exists, offers "Initialize Greenfield Project" button * After init, user designs tables via the Schema Builder forms and generates migrations * "Release Gate" panel on Schema Builder page lets user run the gate before applying **Gate enforcement at apply (✅):** * `POST /api/apply` now reads `studio.release.gate` before proceeding * If gate artifact exists and `status: 'fail'` → 422 response, lists blocking issues, apply blocked * If no gate artifact → apply proceeds but response includes `gateWarning` field advising the user to run the gate * Snapshot hash staleness check is still not implemented (Phase 10 scope) *** ### Phase 10 — Intent Graph Mutation & Endpoint Mapping ✅ Complete **Intent graph mutation (✅):** * `tools/intent-patch.ts` + `sbt studio-intent-patch` — patches a single entity's `managedStatus`; `exclude` action adds to `managedScope.explicitExclusions`; `set-status` removes from exclusions when promoting * `POST /api/studio/intent-graph/entity` HTTP route * Adoption page: color-coded status badges, per-row "Manage" and "Exclude" action buttons **Endpoint mapping (✅):** * `tools/endpoint-map.ts` + `sbt studio-endpoint-map` — derives `EndpointNode[]` in intent graph * For 
each managed entity → `table-crud` endpoint with `allowedRoles` from associated policies * For each managed public-schema function → `rpc` endpoint * `POST /api/studio/endpoint-map` HTTP route * Adoption page: "Map Endpoints" button shows total count and breakdown after run *** ### Phase 11 — Full Vision ✅ Complete **`generate-create-view` tool (✅):** * `tools/generate-create-view.ts` + `sbt studio-create-view --schema public --name <name> --query "SELECT ..."` — no intent graph required * `POST /api/studio/scaffold/create-view` HTTP route **Apply improvements — Layer 5 completion (✅):** * **Audit log:** after a successful `POST /api/apply`, writes `studio.apply.log` artifact (`appliedAt`, `output`, `success`) — always a record of the most recent apply * **Snapshot staleness check:** if a `studio.migration.plan` artifact exists at apply time, recomputes the current snapshot hash and includes `snapshotStale: true` in the response when the snapshot has changed since the plan was generated — warns without blocking **Schema Builder UI — Layer 2 completion (✅):** * `FunctionBuilder` component — schema, name, params table (add/remove rows), return type, language (sql/plpgsql), security (invoker/definer), inline body textarea, live SQL preview; calls `POST /api/studio/scaffold/add-function` * `RpcBuilder` component — same as FunctionBuilder but forces `schema: public`; calls `POST /api/studio/scaffold/create-rpc` * `ViewBuilder` component — schema, name, SELECT query textarea, live SQL preview; calls `POST /api/studio/scaffold/create-view` * All three builders appear in the Schema Builder page below the existing Table and Policy builders **New artifact constant:** * `STUDIO_ARTIFACTS.APPLY_LOG` — `studio.apply.log` artifact for apply audit records *** ## Completion Estimate by Phase | Phase | Layer | What it unlocks | Rough effort | |---|---|---|---| | **6** — Scaffold tools | 3 | Create table + RLS policies from CLI/HTTP | Medium | | **7** — Validation chain | 4 |
Security analysis, lint, release gate | Medium-large | | **8** — Dashboard builder | 2 | Visual low-code creation in browser | Large (UI-heavy) | | **9** — Greenfield + gate | 1+5 | Start from scratch; safe apply | Medium | | **10** — Mutation + endpoints | 1 | Manage classifications; API surface | Medium | **At Phase 6 completion**: The platform is a usable CLI toolkit for brownfield adoption + basic scaffolding. Still no visual builder. **At Phase 8 completion**: The dashboard can create tables, add columns, and write RLS policies through forms. This is the "low-code" milestone — the original vision becomes real for the most common operations. **At Phase 10 completion**: Full vision. Greenfield and brownfield covered. Every Supabase backend object has a visual builder. The intent graph is the source of design truth, kept in sync with SQL migrations, gated before every apply. *** ## What Exists vs What Was Anticipated The artifact system and SDK types were designed for the full vision from the start. Comparing what was anticipated (defined in `writers.ts` / `constants.ts` / `studio-types.ts`) against what's implemented: | Anticipated artifact | Writer ✅ | Schema ✅ | Tool ❓ | |---|---|---|---| | `studio.schema.snapshot` | ✅ | ✅ | ✅ Implemented | | `studio.sql.ast` | ✅ | ✅ | ✅ Implemented | | `studio.intent.sync-report` | ✅ | ✅ | ✅ Implemented | | `studio.intent.graph` | ✅ | ✅ | ✅ Implemented | | `studio.workflow.run` | ✅ | ✅ | ✅ Implemented | | `studio.rls.plan` | ✅ | ✅ | ✅ Implemented (Phase 7) | | `studio.rls.report` | ✅ | ✅ | ✅ Implemented (Phase 7) | | `studio.rpc.plan` | ✅ | ✅ | ✅ Implemented (Phase 7) | | `studio.migration.plan` | ✅ | ✅ | ✅ Implemented (Phase 7) | | `studio.migration.lint` | ✅ | ✅ | ✅ Implemented (Phase 7) | | `studio.release.gate` | ✅ | ✅ | ✅ Implemented (Phase 7) | | `studio.apply.log` | ✅ | ✅ | ✅ Implemented (Phase 11) | All 12 artifact contracts are fully producing. 
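Every artifact in this table is a JSON document on disk produced by its writer. A minimal sketch of a shared envelope; only the artifact id and `generatedAt` are referenced elsewhere in this plan, and the remaining field names are assumptions (the real contracts live in `writers.ts` / `studio-types.ts`):

```typescript
// Illustrative sketch of an artifact envelope. Field names beyond the id and
// generatedAt are assumptions; the real contracts are in writers.ts.
interface ArtifactEnvelope<T> {
  id: string;          // e.g. "studio.release.gate"
  generatedAt: string; // ISO timestamp; the tool run matrix uses this as "last run"
  data: T;             // tool-specific payload
}

export function makeArtifact<T>(id: string, data: T): ArtifactEnvelope<T> {
  return { id, generatedAt: new Date().toISOString(), data };
}
```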
*** ## Key Architectural Constraint The platform now supports both strict and loose paths: * strict (`managed`): intent-graph-backed, validation-gated * loose (`assisted`/`loose`): selected generators can run without full adoption state Remaining constraint to improve: parameterized scaffold workflows (`create-table`, `add-rls-policy`) are cataloged as guided flows, but still rely on explicit user/tool input rather than fully automatic engine execution. *** ## Persona-Oriented Expansion (Next) To make this a backend-building platform accessible across backgrounds, extend catalogs by audience and control mode. ### Control Modes * `managed`: strict intent ownership + gating * `assisted`: recommendations + selective enforcement * `loose`: minimal constraints, generate/analyze only ### Backend Dev Oriented Tools (Proposed) 1. `studio-contract-check` — detect breaking schema/API contract changes 2. `studio-perf-check` — index/query risk heuristics and lock-risk scoring 3. `studio-data-backfill-plan` — staged data backfill plan for non-null/default migrations 4. `studio-drift-check` — live DB vs intent/migration drift report 5. `studio-rollback-plan` — conservative rollback strategy artifact ### Business/Product Oriented Tools (Proposed) 1. `studio-impact-summary` — plain-English summary of changes, risk, blast radius 2. `studio-policy-plain-language` — translate RLS/policy intent to business language 3. `studio-feature-to-schema` — convert feature brief into suggested entities/endpoints 4. `studio-release-readiness-report` — non-technical go/no-go report 5. `studio-kpi-surface-map` — map schema/API changes to affected metrics and owners ### Backend Dev Oriented Workflows (Proposed) 1. `safe-release`: migration-plan → lint → rls-check → rpc-lint → release-gate → apply 2. `expand-contract`: additive-safe rollout before destructive follow-up 3. `brownfield-hardening`: adopt → intent-patch → endpoint-map → release-check ### Business/Product Oriented Workflows (Proposed) 1. 
`feature-intake`: brief → proposed schema/API package → review artifact 2. `change-approval`: impact-summary → risk thresholds → sign-off artifact 3. `release-brief`: release-check output → stakeholder summary → publish artifact --- --- url: /supabase-tools/architecture/package-dependencies.md --- # Package & Artifact Dependencies Dependency map for all 13 packages (sdk + core + ui-web + 10 plugins): npm dependencies, artifact produce/consume, filesystem I/O, and command flows. Use this to understand real-time update requirements (e.g. Migration Studio needs fresh `migration.analysis` to show migration list/status). **Note:** `@sbtools/plugin-atlas-html` and `@sbtools/plugin-docs-server` were previously separate packages but have been merged into `@sbtools/core` as of v0.3.0. The `atlas-html` and `docs` commands are now built-in. ## NPM Package Dependencies | Package | Depends On | |---------|------------| | `@sbtools/sdk` | — | | `@sbtools/ui-web` | `react`, `react-dom` | | `@sbtools/core` | `@sbtools/sdk`, `@sbtools/ui-web`, `dotenv`, `pg`, `zod` | | `@sbtools/plugin-migration-studio` | `@sbtools/sdk`, `@sbtools/ui-web`, `pg`, `@codemirror/*` | | `@sbtools/plugin-migration-audit` | `@sbtools/sdk`, `@sbtools/ui-web`, `pg` | | `@sbtools/plugin-deno-functions` | `@sbtools/sdk` | | `@sbtools/plugin-depgraph` | `@sbtools/sdk`, `@sbtools/ui-web` | | `@sbtools/plugin-erd` | `@sbtools/sdk`, `pg` | | `@sbtools/plugin-typegen` | `@sbtools/sdk` | | `@sbtools/plugin-db-test` | `@sbtools/sdk`, `pg`, `@electric-sql/pglite` | | `@sbtools/plugin-logs` | `@sbtools/sdk`, `@sbtools/ui-web` | | `@sbtools/plugin-frontend-usage` | `@sbtools/sdk`, `@sbtools/ui-web` | | `@sbtools/plugin-scaffold` | `@sbtools/sdk` | ## Artifact Produce / Consume | Package | Produces | Consumes | |---------|----------|----------| | plugin-migration-studio | `migration.studio.draft` (planned) | `migration.analysis` | | plugin-migration-audit | `migration.analysis` | — | | @sbtools/core (docs) | — | 
`openapi.partial.deno-functions` | | plugin-deno-functions | `openapi.partial.deno-functions` | — | | plugin-depgraph | `depgraph.graph` | — | | plugin-frontend-usage | `frontend.usage` | — | ## Artifact Write Triggers | Artifact | Producer | Trigger | |----------|----------|---------| | `migration.analysis` | plugin-migration-audit | `sbt migration-audit` only (not `generate-atlas`) | | `openapi.partial.deno-functions` | plugin-deno-functions | `sbt edge-functions` or `sbt generate-atlas` | | `depgraph.graph` | plugin-depgraph | `sbt depgraph` | | `frontend.usage` | plugin-frontend-usage | `sbt frontend-usage` | ## Command → Output Chain ``` sbt snapshot → supabase/current/ (full snapshot) sbt generate-atlas → reads: supabase/current/ + all plugins getAtlasData → writes: docs/backend-atlas-data.json sbt atlas-html → reads: docs/backend-atlas-data.json → writes: docs/backend-atlas.html sbt migration-audit → reads: supabase/migrations/, DB → writes: migration.analysis artifact, docs/migration-audit.html sbt depgraph → reads: docs/backend-atlas-data.json, supabase/current/, types.ts → writes: depgraph.graph artifact, docs/dependency-graph.html, .md sbt edge-functions → reads: supabase/functions/ → writes: openapi.partial.deno-functions artifact (when functions exist) sbt frontend-usage → reads: src/ (or configured scanPaths) → writes: frontend.usage artifact, docs/frontend-usage.html sbt docs [swagger|redoc|atlas|schemaspy|all] → reads: .sbt/openapi-spec.json, openapi.partial.* artifacts → serves: Swagger, ReDoc, Atlas, SchemaSpy sbt watch --scope migration → listens: Postgres NOTIFY (sbt_watch_events), migrations dir changes → runs: sbt migration-audit --no-open (debounced) → writes: migration.analysis artifact, .sbt/watch/last-event.json → signals: migration-studio /api/events (SSE) ``` ## Migration Studio Dependencies (Real-Time Updates) | Studio Feature | Data Source | How to Refresh | |----------------|-------------|----------------| | Migrations list + status | 
`migration.analysis` artifact | Run `sbt migration-audit` | | Schema: Database | Live DB | Connect to DB | | Schema: Cached atlas | `docs/backend-atlas-data.json` | Run `sbt generate-atlas` | | Schema: Table names only | `migration.analysis` (fallback) | Run `sbt migration-audit` | | Schema: None | — | No snapshot/atlas/artifact | **Important:** The `migration.analysis` artifact is written **only** by `sbt migration-audit`. Running `sbt generate-atlas` does **not** write it (migration-audit's `getAtlasData` contributes to atlas data but skips artifact write). | Goal | Commands | |------|----------| | Show migrations list with status | `sbt migration-audit` | | Rich schema from cache | `sbt generate-atlas` | | Full schema from DB | Connect Studio to DB | | Both migrations list and atlas schema | `sbt migration-audit` + `sbt generate-atlas` | ## File Inputs by Consumer | Consumer | Inputs | |----------|--------| | **core** snapshot | DB (live) | | **core** generate-atlas | `supabase/current/` + all plugins `getAtlasData` | | **core** atlas-html | `docs/backend-atlas-data.json` | | **core** docs | `.sbt/openapi-spec.json`, `openapi.partial.deno-functions` artifact | | plugin-depgraph | `docs/backend-atlas-data.json`, `supabase/current/`, `types.ts` | | plugin-migration-audit | `supabase/migrations/`, DB | | plugin-migration-studio | `supabase/migrations/`, DB, `docs/backend-atlas-data.json`, `migration.analysis` artifact | ## Related * [Artifact ID Registry](./artifact-registry.md) — Official artifact IDs * [Implicit File Contracts](./implicit-file-contracts.md) — Output paths and merge semantics --- --- url: /supabase-tools/plans/adoption-page-visualization-extensions.md --- # Plan: Extend Adoption Page Visualizations Maximize visual value from intent graph and adoption workflow data using existing libraries (Mermaid, Recharts), tab layout, and optional graph libraries. Includes **tools & workflows** visualization and **business-oriented** metrics for stakeholders. 
*** ## 1. Existing Assets (Zero New Dependencies) | Asset | Location | Used For | |-------|----------|----------| | **Mermaid** | `MermaidRenderer.tsx` | ER diagrams, flowcharts; dark mode support | | **Recharts** | `MiniBarChart`, `MiniDonutChart` | Bar, donut/pie charts | | **Custom SVG graph** | `Depgraph.tsx` | Interactive nodes + edges; zoom/pan; type palette | | **Intent graph API** | `GET /api/studio/intent-graph` | Full `IntentGraph` (entities, views, functions, triggers, policies, endpoints, opaqueBlocks, managedScope) | *** ## 2. Data Available (Intent Graph) | Data | Shape | Visual Use | |------|-------|------------| | `entities` | id, schema, name, managedStatus, confidence, columns, constraints, indexes | Charts, tables, graph nodes; FK from constraints | | `policies` | id, entity, name, command, roles, managedStatus | Entity→Policy graph edges; per-entity bar | | `triggers` | id, entity, name, events, function | Entity→Trigger→Function graph edges | | `functions` | id, schema, name, args, returnType, security | Graph nodes; security bar chart | | `views` | id, schema, name, materialized | Graph nodes; type breakdown | | `endpoints` | id, type, entity, allowedRoles, operations | Endpoint table; type donut (table-crud vs rpc) | | `opaqueBlocks` | rawSql, sourceSpan, reason | Opaque blocks table + reason breakdown | | `managedScope` | schemas, explicitExclusions | Schema status summary | *** ## 3. 
Proposed Layout: Tabbed Sections ``` ┌─────────────────────────────────────────────────────────────────┐ │ Adoption Workflow │ │ [Status pill] [Start/Resume/Restart] [Map Endpoints] │ └─────────────────────────────────────────────────────────────────┘ ┌─────────────────────────────────────────────────────────────────┐ │ [Overview] [Entities] [Graph] [Endpoints] [Policies] [Opaque] │ ← Tabs └─────────────────────────────────────────────────────────────────┘ ┌─────────────────────────────────────────────────────────────────┐ │ Tab content │ └─────────────────────────────────────────────────────────────────┘ ``` *** ## 4. Tab Content (Per Tab) ### Tab 1: Overview * **Stats row**: Counts for entities, views, functions, triggers, policies, endpoints, opaque blocks. * **MiniDonutChart**: Managed status (managed / assisted / opaque / excluded). * **MiniBarChart**: Entity count per schema (or policy count per entity). * **Confidence histogram** (Recharts BarChart): buckets (0–0.5, 0.5–0.8, 0.8–1.0). * **Mode & scope summary**: `mode`, `managedScope.schemas` summary. **Libraries**: Recharts (existing). *** ### Tab 2: Entities (current table + enrichment) * Keep current table (Entity, Schema, Status, Confidence, Actions). * Add column: **# Policies** (from `policies.filter(p => p.entity === e.id)`). * Add column: **# Triggers** (from `triggers.filter(t => t.entity === e.id)`). * Optional: Expandable row to show columns/constraints. **New dependencies**: None. *** ### Tab 3: Graph (intent graph visualization) **Option A — Mermaid erDiagram (recommended, no new deps)** * Build Mermaid `erDiagram` from entities + FK constraints. * Render with `MermaidRenderer` (dark mode). * Edges: `Entity1 ||--o{ Entity2 : "FK"`. * Cluster by schema if many tables. * Simple and reliable for entity relationships. **Option B — Interactive SVG (like Depgraph)** * Reuse Depgraph layout/rendering pattern. * Nodes: entity, function, view, trigger, policy, endpoint. 
* Edges: FK, policy→entity, trigger→entity, trigger→function, endpoint→entity. * Benefit: zoom/pan, selection, type filters. * Cost: New `buildIntentGraphModel()` in `model.ts`; no new lib. **Option C — @xyflow/react (React Flow)** * Industry-standard interactive graph. * Auto-layout (dagre), minimap, selection. * Cost: ~50–80KB gzipped, new dependency. * Use when: graph is large (>50 nodes) and needs professional UX. **Recommendation**: Start with Option A (Mermaid). Add Option B if users request interactivity without adding React Flow. Add Option C only if graph size and UX justify it. *** ### Tab 4: Endpoints * Table: Endpoint ID, Type (table-crud / rpc / view), Entity, Exposed via, Allowed roles, Operations. * **MiniDonutChart**: table-crud vs rpc vs view counts. * **MiniBarChart**: Endpoints per entity (top N). * Data from `intentGraph.endpoints` (populated by Map Endpoints). **Libraries**: Recharts. *** ### Tab 5: Policies * Table: Policy ID, Entity, Command, Roles, Managed status. * **MiniBarChart**: Policy count per entity. * **MiniDonutChart**: Command breakdown (SELECT / INSERT / UPDATE / DELETE / ALL). * Link to entity in Entities tab. **Libraries**: Recharts. *** ### Tab 6: Opaque Blocks (when present) * Table: ID, Source (file:lines), Reason, Touched objects. * **MiniDonutChart**: Reason breakdown (unrepresentable / user-excluded / too-complex). * Expandable row with `rawSql`. * Shown only when `opaqueBlocks.length > 0`. **Libraries**: Recharts. *** ## 5. 
Implementation Checklist | Task | Effort | Notes | |------|--------|-------| | Fetch full IntentGraph instead of only `entities` | S | Update `fetchIntentGraph` typings and state | | Add tab layout (Overview, Entities, Graph, Endpoints, Policies, Opaque) | M | Tab component or simple button-tabs | | Overview: stats + MiniDonut + MiniBar + confidence histogram | M | Reuse existing chart components | | Entities: keep table, add Policy/Trigger counts | S | Derive from `policies`, `triggers` | | Graph tab: Mermaid erDiagram from entities + FK | M | `buildIntentMermaid(entities)` helper | | Endpoints tab: table + donut + bar | M | New tab content | | Policies tab: table + bar + donut | M | New tab content | | Opaque tab: table + donut, conditional render | S | Gate on `opaqueBlocks.length` | | Optional: Intent graph model + Depgraph-style SVG | L | If Mermaid is insufficient | *** ## 6. Library Choices Summary | Need | Choice | Rationale | |------|--------|-----------| | Entity-relationship diagram | **Mermaid** | Already in use; reliable; no new deps | | Bar/pie/donut charts | **Recharts** | Already in use; proven | | Tab layout | **CSS + state** | No dependency; simple | | Interactive graph (optional) | **Depgraph pattern** or **@xyflow/react** | Reuse vs industry standard | *** ## 7. Data Flow ``` GET /api/studio/intent-graph → full IntentGraph (entities, views, functions, triggers, policies, endpoints, opaqueBlocks, managedScope) → Store in React state → Each tab derives its view from this single source ``` No extra API calls per tab. Map Endpoints runs once; refresh updates `endpoints` in the stored graph. *** ## 8. 
Tools & Workflows Visualization ### 8.1 What Exists | Concept | Source | API | |---------|--------|-----| | Adoption workflow | `adopt-backend` | `GET /api/studio/adopt/status`, `POST /api/studio/adopt/start`, `POST /api/studio/adopt/resume` | | Release gate | `runReleaseGate` | `POST /api/studio/release-gate` | | RLS check | `runRlsCheck` | `POST /api/studio/rls-check` | | Migration lint | `runMigrationLint` | `POST /api/studio/migration-lint` | | RPC lint | `runRpcLint` | `POST /api/studio/rpc-lint` | | Migration plan | `runMigrationPlan` | `POST /api/studio/migration-plan` | ### 8.2 Workflow Pipeline (Mermaid flowchart) Render the **adopt-backend** workflow as a flowchart: ``` introspect → sql-parse → [review checkpoint] → intent-sync → [approve checkpoint] → intent-init ``` * Nodes: step name + artifact produced. * Edges: data flow; checkpoint nodes styled differently (diamond or callout). * Status overlay: completed (green), running (spinner), failed (red), not started (gray). * Reuse `MermaidRenderer` with `flowchart` or `flowchart LR`. ### 8.3 Validation Pipeline (Mermaid flowchart) Render the **validation pipeline** that feeds release gate: ``` rls-check → rpc-lint → migration-lint → release-gate ``` * Each tool produces an artifact; release-gate aggregates. * Show which tools have been run (evidence in gate) vs missing. * Link to Run buttons per tool. ### 8.4 Tool Grid / Run Matrix * **Tool** | **Last run** | **Status** | **Output** | **Run**. * Derive "last run" from artifact `generatedAt` (requires read endpoints for artifacts, or run-on-demand and show result). * Status: pass/fail from artifact (rls report, migration lint) or from gate evidence. *Note: Artifacts are on disk; no GET endpoints exist for them. Options: (a) add `GET /api/studio/artifacts/:id` to read latest artifact, or (b) run tools on-demand from UI and display result. Option (b) is already used for release-gate.* *** ## 9. 
Business-Oriented Visualizations Translate technical data into metrics stakeholders care about: **readiness**, **risk**, **progress**, and **API surface**. ### 9.1 Release Readiness Scorecard | Metric | Source | Display | |--------|--------|---------| | **Gate status** | `ReleaseGate.status` | Large pass/fail badge; green/red | | **Blocking issues** | `ReleaseGate.blocking.length` | Count + list; "3 issues must be fixed" | | **Warnings** | `ReleaseGate.warnings.length` | Count; "2 warnings to review" | | **Evidence** | `ReleaseGate.evidence` | "Checked: RLS, RPC, Lint" or "Missing: run rls-check, rpc-lint, lint" | **Visual**: Card/summary at top of Adoption or a dedicated "Release Readiness" tab. One-click "Run Gate" to refresh. ### 9.2 Risk Overview | Risk Type | Source | Business Language | |-----------|--------|-------------------| | **RLS gaps** | `RlsReportData.gaps` | "X tables lack RLS policies — data may be exposed" | | **Destructive changes** | `MigrationPlanData.destructiveCount` | "X destructive changes pending — review before apply" | | **Migration lint errors** | `MigrationLintData.errorCount` | "X lint errors in migration files" | | **RPC security** | `RpcPlanData.securityDefinerCount`, `missingSearchPathCount` | "X functions use definer; Y lack search\_path — security risk" | | **Opaque blocks** | `IntentGraph.opaqueBlocks.length` | "X SQL regions not fully modeled — manual review needed" | **Visual**: MiniDonut or horizontal bar for risk categories. Color-coded: red (blocking), amber (warning), green (ok). 
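The table above reduces to a small mapping function from technical counts to business-language rows. A sketch, with input field names flattened from the listed sources (the names and the severity assignments are assumptions, not the real artifact types):

```typescript
// Sketch of the business-language risk mapping from the Risk Overview table.
// The input field names flatten the listed artifact sources; the phrasing of
// each row quotes the table above.
interface RiskInputs {
  rlsGapCount: number;       // RlsReportData.gaps
  destructiveCount: number;  // MigrationPlanData.destructiveCount
  lintErrorCount: number;    // MigrationLintData.errorCount
  opaqueBlockCount: number;  // IntentGraph.opaqueBlocks.length
}

type Severity = "red" | "amber" | "green";

export function riskSummary(
  r: RiskInputs,
): Array<{ text: string; severity: Severity }> {
  const rows: Array<{ text: string; severity: Severity }> = [];
  if (r.rlsGapCount > 0)
    rows.push({
      text: `${r.rlsGapCount} tables lack RLS policies — data may be exposed`,
      severity: "red",
    });
  if (r.destructiveCount > 0)
    rows.push({
      text: `${r.destructiveCount} destructive changes pending — review before apply`,
      severity: "amber",
    });
  if (r.lintErrorCount > 0)
    rows.push({
      text: `${r.lintErrorCount} lint errors in migration files`,
      severity: "red",
    });
  if (r.opaqueBlockCount > 0)
    rows.push({
      text: `${r.opaqueBlockCount} SQL regions not fully modeled — manual review needed`,
      severity: "amber",
    });
  if (rows.length === 0)
    rows.push({ text: "No outstanding risks detected", severity: "green" });
  return rows;
}
```

Each row then renders as one color-coded bar or donut segment in the Risk tab.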
### 9.3 Adoption Progress | Metric | Source | Display | |--------|--------|---------| | **Workflow %** | `WorkflowRun.steps` completed / total | Progress bar: "75% complete (3/4 steps)" | | **Managed scope** | `IntentGraph.entities` by managedStatus | "X managed, Y assisted, Z opaque" | | **Endpoint coverage** | `IntentGraph.endpoints` vs entities | "X of Y tables exposed as API endpoints" | **Visual**: Progress bar for workflow; donut for managed vs unmanaged. ### 9.4 API Surface (Business Value) | Metric | Source | Display | |--------|--------|---------| | **Total endpoints** | `IntentGraph.endpoints.length` | "N API endpoints" | | **Table CRUD** | Endpoints with `type: "table-crud"` | "X REST table endpoints" | | **RPC** | Endpoints with `type: "rpc"` | "X RPC endpoints" | | **Coverage** | Managed entities with endpoints / total managed | "X% of managed tables exposed" | **Visual**: Stat cards; "Your backend exposes N endpoints to the frontend." ### 9.5 Migration Health | Metric | Source | Display | |--------|--------|---------| | **Pending changes** | `MigrationPlanData.changes.length` | "X changes queued" | | **Safe vs risky** | `additive_safe` vs `destructive` | "Y safe, Z require review" | | **Change classes** | Group by `ChangeClass` | Bar: additive\_safe | additive\_with\_default | drop | … **Visual**: Bar chart; tooltip with remediation hints. 
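A sketch of how the 9.3 progress metrics could be computed. The `steps` and `managedStatus` fields follow the Source column above, while the exact `WorkflowRun`/`IntentGraph` types are assumed:

```typescript
// Assumed minimal shapes for the progress metrics.
interface StepRun {
  id: string;
  status: "completed" | "running" | "failed" | "not-started";
}
interface Entity {
  name: string;
  managedStatus: "managed" | "assisted" | "opaque";
}

// Workflow %: completed steps over total, e.g. "75% complete (3/4 steps)".
function workflowPercent(steps: StepRun[]): number {
  if (steps.length === 0) return 0;
  const done = steps.filter((s) => s.status === "completed").length;
  return Math.round((100 * done) / steps.length);
}

// Managed scope: "X managed, Y assisted, Z opaque".
function managedScope(entities: Entity[]): string {
  const count = (s: Entity["managedStatus"]) =>
    entities.filter((e) => e.managedStatus === s).length;
  return `${count("managed")} managed, ${count("assisted")} assisted, ${count("opaque")} opaque`;
}
```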
### 9.6 Suggested Business-Oriented Tabs | Tab | Audience | Content | |-----|----------|---------| | **Readiness** | PM, Tech Lead | Release gate, blocking count, evidence, Run Gate | | **Risk** | Security, Ops | RLS gaps, destructive changes, RPC security, opaque blocks | | **Progress** | PM | Workflow %, managed scope donut, adoption checklist | | **API Surface** | Product | Endpoint counts, table vs RPC split, coverage % | | **Tools** | Dev | Workflow flowchart, validation pipeline, tool run matrix | ### 9.7 Combined Tab Structure (Technical + Business) ``` [Readiness] [Progress] [Risk] [API Surface] | [Overview] [Entities] [Graph] [Endpoints] [Policies] [Opaque] [Tools] ↑────────── Business-focused ──────────↑ ↑──────────── Technical / Developer ────────────────↑ ``` * **Readiness** — Gate status, blocking/warnings, one-click Run Gate. * **Progress** — Workflow completion %, managed scope, adoption checklist. * **Risk** — Aggregated risk categories with business language. * **API Surface** — Endpoint counts, coverage. * **Overview** — Stats, donuts, confidence (technical overview). * **Entities** — Table with Manage/Exclude. * **Graph** — Mermaid ER or interactive. * **Endpoints** — Endpoint table + charts. * **Policies** — Policy table + charts. * **Opaque** — Unmodeled SQL blocks. * **Tools** — Workflow + validation flowcharts, tool run matrix. *** ## 10. Benefits Summary * **No new deps for core features**: Mermaid + Recharts + custom tabs. * **Full data usage**: Every major IntentGraph field has at least one visual. * **Tabs**: Clear separation; manageable complexity. * **Charts**: Status distribution, schema breakdown, confidence, endpoints, policies. * **Graph**: Entity relationships (Mermaid) with optional interactive SVG later. * **Opaque blocks**: Visibility into unmodeled SQL. * **Endpoints**: Clear table and charts instead of only a count pill. * **Tools & workflows**: Adoption and validation pipelines as flowcharts; tool run status. 
* **Business-oriented**: Release readiness scorecard, risk overview, adoption progress, API surface metrics — language stakeholders understand. --- --- url: /supabase-tools/plans/platform-catalog-completion-and-expansion.md --- # Plan: Platform Catalog Completion and Expansion ## Goal Close the current tool/workflow catalog gaps against `docs/plans/platform-status-and-roadmap.md`, then extend the roadmap with additional high-impact tools/workflows for the backend-building vision. ## Part A — Implement Missing Catalog Items ### A1) Add Missing Workflow Modules Create new workflow definitions under `packages/plugin-migration-studio/src/workflows/`: 1. `create-table.workflow.ts` * `studio-create-table` → `studio-lint` 2. `add-rls-policy.workflow.ts` * `studio-add-rls-policy` → `studio-rls-check` 3. `release-check.workflow.ts` * `studio-rls-check` → `studio-rpc-lint` → `studio-lint` → `studio-release-gate` These are cataloged workflow definitions (discoverable by `*.workflow.ts`) and can be run via engine consumers. ### A2) Resolve Naming Inconsistency: `studio-lint` vs `studio-migration-lint` Add command/ID compatibility support so roadmap and CLI expectations are both valid: * Keep current canonical tool id and command `studio-lint` (no breaking change). * Add alias command `studio-migration-lint` that executes the same tool. * Update docs to clearly mark alias/canonical naming. ### A3) Resolve Tool Surface Mismatch in Roadmap Roadmap currently claims HTTP exposure for `studio-intent-sync` and `studio-intent-init`. Implementation choice: * Add HTTP routes for both tools through catalog metadata (minimal handlers, no behavioral rewrite), or * Update roadmap to mark them workflow-only. Preferred: expose HTTP for consistency with catalog table in roadmap. 
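For illustration, one of the A1 workflow modules, `release-check.workflow.ts`, might look like the sketch below. The `WorkflowDef` name and the `id`/`steps` shape are assumptions; the real contract is whatever the `*.workflow.ts` discovery expects:

```typescript
// Hypothetical workflow-definition shape; not the real SDK contract.
interface WorkflowDef {
  id: string;
  description: string;
  steps: { toolId: string }[]; // executed in order by the engine
}

// Sketch of release-check.workflow.ts: run every validator, then the gate.
const releaseCheck: WorkflowDef = {
  id: "release-check",
  description: "Run RLS check, RPC lint, and migration lint, then the release gate.",
  steps: [
    { toolId: "studio-rls-check" },
    { toolId: "studio-rpc-lint" },
    { toolId: "studio-lint" },
    { toolId: "studio-release-gate" },
  ],
};
```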
### A4) Update Roadmap to Current Architecture Revise `docs/plans/platform-status-and-roadmap.md` so references match real structure: * `tools/modules/*.tool.ts` + `tools/core/*.core.ts` * `workflows/*.workflow.ts` * current command names + aliases * updated workflow inventory and completion status ## Part B — Expand Vision with New Tools/Workflows Append a new section to roadmap proposing (not implementing yet) additional platform-capability items. ### B1) New Tool Proposals 1. `studio-drift-check` * Compare live DB vs latest migration+intent state, produce drift report artifact. 2. `studio-rollback-plan` * Generate conservative rollback SQL guidance for last N planned changes. 3. `studio-data-backfill-plan` * Detect non-null/default migrations that need data backfill and generate staged plan. 4. `studio-policy-simulate` * Evaluate RLS policy outcomes for sample roles/queries before apply. 5. `studio-breaking-change-check` * Detect API/contract-breaking DB changes (drops/renames/type narrowing) and severity-rank them. ### B2) New Workflow Proposals 1. `safe-release` * plan → lint → rls-check → rpc-lint → release-gate → apply 2. `greenfield-bootstrap` * greenfield-init → create-table × N → add-rls-policy × N → release-check 3. `rpc-publish` * add-function/create-rpc → rpc-lint → endpoint-map → release-gate 4. `schema-change-with-backfill` * add-column/create-table → data-backfill-plan → migration-lint → release-check ## Validation After Part A implementation: 1. `npm run build -w packages/plugin-migration-studio` 2. `npm run test -w packages/plugin-migration-studio` 3. Verify discovery lists new workflows. 4. Verify alias command registration and intended behavior. ## Constraints * No destructive behavior changes. * Keep backward compatibility for existing command users. * Keep single-source catalog architecture (`*.tool.ts` / `*.workflow.ts` discovery). 
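To make the B1 `studio-drift-check` proposal concrete, drift detection could start as a set-diff between live table names and intent-graph entity names. This is an illustration of the idea only; the proposal is not implemented, and every name below is hypothetical:

```typescript
// Hypothetical drift report for the proposed studio-drift-check tool.
interface DriftReport {
  missingFromIntent: string[]; // exists in the live DB, not modeled
  missingFromDb: string[];     // modeled in the intent graph, absent in the DB
  inSync: boolean;
}

function diffTables(liveTables: string[], intentEntities: string[]): DriftReport {
  const intent = new Set(intentEntities);
  const live = new Set(liveTables);
  const missingFromIntent = liveTables.filter((t) => !intent.has(t)).sort();
  const missingFromDb = intentEntities.filter((t) => !live.has(t)).sort();
  return {
    missingFromIntent,
    missingFromDb,
    inSync: missingFromIntent.length === 0 && missingFromDb.length === 0,
  };
}
```

A real implementation would compare columns, indexes, and policies as well, but the report shape above is enough to feed a drift artifact.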
--- --- url: /supabase-tools/plugins.md description: >- List of available plugins, installation, and activation in supabase-tools.config.json. --- # Plugins Plugins extend supabase-tools with additional commands. Install a plugin via npm and add it to `supabase-tools.config.json` to use its commands. ## Available Plugins | Plugin | npm | Description | |--------|-----|-------------| | [plugin-db-test](/plugins/plugin-db-test) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-db-test.svg)](https://www.npmjs.com/package/@sbtools/plugin-db-test) | pgTAP test runner (live + PGlite) | | [plugin-deno-functions](/plugins/plugin-deno-functions) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-deno-functions.svg)](https://www.npmjs.com/package/@sbtools/plugin-deno-functions) | Edge function documentation | | [plugin-depgraph](/plugins/plugin-depgraph) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-depgraph.svg)](https://www.npmjs.com/package/@sbtools/plugin-depgraph) | Dependency graph visualization | | [plugin-erd](/plugins/plugin-erd) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-erd.svg)](https://www.npmjs.com/package/@sbtools/plugin-erd) | Mermaid ERD diagram generation | | [plugin-frontend-usage](/plugins/plugin-frontend-usage) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-frontend-usage.svg)](https://www.npmjs.com/package/@sbtools/plugin-frontend-usage) | Frontend Supabase SDK usage scanner | | [plugin-logs](/plugins/plugin-logs) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-logs.svg)](https://www.npmjs.com/package/@sbtools/plugin-logs) | Docker logs, pg\_stat\_statements | | [plugin-migration-audit](/plugins/plugin-migration-audit) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-migration-audit.svg)](https://www.npmjs.com/package/@sbtools/plugin-migration-audit) | Migration file vs DB tracking — drift detection | | [plugin-migration-studio](/plugins/plugin-migration-studio) | 
[![npm](https://img.shields.io/npm/v/@sbtools/plugin-migration-studio.svg)](https://www.npmjs.com/package/@sbtools/plugin-migration-studio) | **Flagship plugin.** 21 tools + 4 workflows; intent graph; release gate; HTTP tool surface for AI agents (port 3335) | | [plugin-scaffold](/plugins/plugin-scaffold) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-scaffold.svg)](https://www.npmjs.com/package/@sbtools/plugin-scaffold) | Scaffold new plugins | | [plugin-typegen](/plugins/plugin-typegen) | [![npm](https://img.shields.io/npm/v/@sbtools/plugin-typegen.svg)](https://www.npmjs.com/package/@sbtools/plugin-typegen) | TypeScript type generation | The `docs` command (Swagger UI, ReDoc, SchemaSpy) and the `dashboard` command (React SPA) are **built into core** — no plugin required. ## Installation ```bash # Example: install ERD plugin npm install @sbtools/plugin-erd ``` ## Activation Add to `supabase-tools.config.json`: ```json { "plugins": [ { "path": "@sbtools/plugin-erd", "config": {} } ] } ``` Use `"path": "@sbtools/plugin-<name>"` for npm packages, or a relative path for local plugins. --- --- url: /supabase-tools/sdk.md description: >- SDK types, PluginContext, UI utilities, error classes, container/compose utils, and helpers for building plugins. --- # SDK API Reference The `@sbtools/sdk` package provides shared types, interfaces, and utilities for supabase-tools plugins.
## Installation ```bash npm install @sbtools/sdk ``` ## Core Types * **SbtPlugin** — Plugin contract with `name`, `version`, `commands`, and optional hooks * **PluginContext** — Runtime context passed to every plugin hook: * `projectRoot`, `toolsDir`, `sbtDataDir`, `artifactsDir` — absolute directory paths * `apiUrl` — Supabase API URL * `paths` — shared config paths resolved to absolute: `migrations`, `snapshot`, `docsOutput`, `functions` * `pluginConfig` — plugin-specific config from `plugins[].config` * `siblingPlugins` — other loaded plugins for cross-plugin collaboration * **ResolvedPaths** — Type for `ctx.paths` * **SbtPluginCommand** — Command definition with `name`, `description`, `run` * **ArtifactCapabilities** — Declare which artifact IDs a plugin `produces` and `consumes` (used by tooling to validate artifact availability) ## UI Utilities ```ts import { ui } from "@sbtools/sdk"; ui.info("Message"); ui.success("Done"); ui.warn("Warning"); ui.error("Error"); ui.step("Step"); ui.detail("Detail"); ui.heading("Heading"); ui.table([["a", "b"], ["1", "2"]], 2); ``` ## Error Classes * `ConfigError` — Config validation failures * `DatabaseError` — DB connection/query failures * `SnapshotError` — Snapshot generation failures * `PluginError` — Plugin load/hook failures * `SbtError` — Generic CLI errors Use `handleError(err)` for consistent error output. ## Filesystem Utilities * `ensureDir(path)` — Create directory recursively * `readText(path)` — Read file as UTF-8 * `writeFileInDir(dir, filename, content)` — Write file in directory * `safeName(str)` — Replace non-word chars with underscore (for identifiers; preserves dots and hyphens) * `safeFileName(baseName, maxLength?)` — Truncate long filenames with hash * `sanitizeSlug(str)` — Hyphenated slug (e.g. plugin names, directory names) * `sanitizeIdentifier(str)` — Alphanumeric + underscore (e.g. 
Mermaid node IDs) ## Container Utilities For Docker Compose project naming (used by core and plugins that interact with containers): * `sanitizeContainerPrefix(projectName)` — Sanitize raw project name into valid Docker prefix * `deriveContainerPrefix(projectRoot)` — Read `supabase-tools.config.json` for `project.name`, fallback to basename, return sanitized prefix ```ts import { deriveContainerPrefix, sanitizeContainerPrefix } from "@sbtools/sdk"; const prefix = deriveContainerPrefix(ctx.projectRoot); const container = `${prefix}-supabase-db`; // Or when you already have the name: const prefix2 = sanitizeContainerPrefix(config.project.name); ``` ## Compose Utilities Extract values from Docker Compose YAML files: * `extractComposeKey(composePath, patterns)` — First matching regex capture * `extractSupabaseKeys(composePath)` — Returns `{ anonKey, serviceKey }` in one read (see `SupabaseKeys` type) ```ts import { extractComposeKey, extractSupabaseKeys } from "@sbtools/sdk"; const { anonKey, serviceKey } = extractSupabaseKeys(path.join(ctx.toolsDir, "docker-compose.db.yml")); const jwtSecret = extractComposeKey(composePath, [/JWT_SECRET:\s*([^\s]+)/]); ``` ## CLI Utilities * `hasFlag(args, ...names)` — Check for CLI flags (e.g. `--help`, `-h`) * `getArg(args, name)` — Get CLI argument value (e.g. 
`--port 3000` → `"3000"`) * `openFile(path)` — Open file in default editor * `withHelp(helpText, fn)` — Wrap a command handler to provide `--help` / `-h` support * `loadPackageVersion(importMetaUrl)` — Load version from nearest package.json (use `import.meta.url`) ### withHelp Example ```ts import { withHelp } from "@sbtools/sdk"; const HELP = ` my-command — does something useful Usage: sbt my-command [--flag] [--arg VALUE] Options: --flag Enable feature --arg VALUE Set value -h, --help Show this help `; const myCommand = withHelp(HELP, async (args: string[], ctx: PluginContext) => { // Command implementation }); ``` ### loadPackageVersion Example ```ts import { loadPackageVersion } from "@sbtools/sdk"; const plugin: SbtPlugin = { name: "@sbtools/plugin-example", version: loadPackageVersion(import.meta.url), // Reads ../package.json commands: [/* ... */], }; ``` ## Plugin Config Helpers Typed accessors for plugin config values — replaces unsafe `(ctx.pluginConfig.foo as string) ?? default` casts: * `getConfigString(ctx, key, fallback)` — Get a string config value * `getConfigNumber(ctx, key, fallback)` — Get a number config value (parses strings too) * `getConfigStringArray(ctx, key, fallback)` — Get a string array config value * `resolveConfigPath(ctx, key, fallbackRelPath)` — Get a path config value; relative paths are resolved against `ctx.projectRoot` ```ts import { getConfigString, getConfigNumber, getConfigStringArray, resolveConfigPath } from "@sbtools/sdk"; const output = getConfigString(ctx, "outputDir", "docs/my-output"); const port = getConfigNumber(ctx, "port", 3333); const scanDirs = getConfigStringArray(ctx, "scanPaths", ["src/"]); const typesOut = resolveConfigPath(ctx, "typesOutput", "src/types/supabase.ts"); ``` ## Artifact Helpers Read and write typed artifact envelopes stored in `.sbt/artifacts/`: * `writeArtifact(dir, id, version, payload)` — Write artifact JSON file * `readArtifact(dir, id, version)` — Read artifact; throws if missing or 
version mismatch * `readArtifactOrNull(dir, id, version)` — Read artifact; returns `null` if missing * `createArtifactWriter(opts)` — Factory for writing multiple artifacts from the same plugin ```ts import { writeArtifact, readArtifactOrNull } from "@sbtools/sdk"; // Write await writeArtifact(ctx.artifactsDir, "my.artifact", "1.0.0", { items: [] }); // Read (returns null when missing instead of throwing) const result = readArtifactOrNull(ctx.artifactsDir, "migration.analysis", "1.0.0"); if (result) { /* result.payload is the data */ } ``` ## DB Utilities Optional; plugins that need DB access must have `pg` installed. SDK exposes thin wrappers: * `resolveDbUrl()` — From `DATABASE_URL`, `SUPABASE_DB_URL`, `POSTGRES_URL`, or default local URL * `createPgClient()` — Create `pg.Client` (throws if `pg` not installed) * `testConnection(client)` — Returns `true` if connect succeeds * `disconnectClient(client)` — Safe disconnect ```ts import { createPgClient, testConnection, disconnectClient } from "@sbtools/sdk"; const client = createPgClient(); try { if (await testConnection(client)) { /* ... */ } } finally { await disconnectClient(client); } ``` ## Migration Scanner * `scanMigrationFiles(dir)` — Returns `MigrationFileInfo[]` (.sql files, sorted) * `parseTimestampPrefix(filename)` — Extract `YYYYMMDDHHMMSS` from migration filename ```ts import { scanMigrationFiles, parseTimestampPrefix } from "@sbtools/sdk"; const files = scanMigrationFiles(ctx.paths.migrations); const ts = parseTimestampPrefix("20240101120000_foo.sql"); // "20240101120000" ``` ## SQL Analyzer * `analyzeMigrationSql(sql)` — Regex-based DDL classifier. Returns `MigrationSqlAnalysis` with `operations`, `touchedObjectKeys`, `riskFlags`, `confidence`. Used by migration-audit and migration-studio. No Node.js deps; can run in browser. ## Dashboard View Use **`getDashboardView()`** to contribute sections to the `sbt dashboard` UI. Returns JSON-serializable config — no JS strings. 
### Interfaces * **`DashboardSectionDef`** — section config (id, title, description, dataKey, layout, stats, card, table, link) * **`DashboardCardDef`** — card layout (titleField, subtitleField, searchFields, badges, details) * **`DashboardTableDef`** — table layout (columns with header, field, format) * **`DashboardStatDef`** — stat row (label, field, tone) * **`DashboardBadgeDef`** — badge (field, toneMap) * **`DashboardDetailDef`** — detail row (label, field, format) * **`DashboardView`** — `{ sections: DashboardSectionDef[] }` ### Example ```ts import type { DashboardSectionDef, DashboardView } from "@sbtools/sdk"; export function getMyPluginDashboardView(): DashboardView { return { sections: [ { id: "my-items", title: "My Items", description: "Items from the database.", dataKey: "my_items", layout: "cards", stats: [ { label: "Total", field: "summary.total", tone: "good" }, ], card: { titleField: "name", subtitleField: "type", searchFields: ["name", "description"], badges: [{ field: "status", toneMap: { active: "good", inactive: "warn" } }], details: [ { label: "Description", field: "description" }, { label: "Created", field: "created_at", format: "date" }, ], }, }, ], }; } ``` Formats: `text`, `code`, `date`, `bytes`, `ms`, `number`. Tones: `default`, `good`, `warn`, `bad`, `accent`. ## Schema Filters (Parameterized Queries) **`SchemaFilter`** — `{ clause: string; params: string[] }` — used by snapshot extractors to inject schema filters into SQL queries using parameterized placeholders (`$1`, `$2`, ...) instead of string interpolation. ```ts import type { SchemaFilter } from "@sbtools/sdk"; const filter: SchemaFilter = { clause: "AND n.nspname IN ($1, $2)", params: ["public", "extensions"], }; const result = await client.query( `SELECT * FROM pg_namespace n WHERE true ${filter.clause}`, filter.params ); ``` **Core utility:** `getSchemaFilter(schemas, column)` in `@sbtools/core` generates these filters from user config. 
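A minimal sketch of what a generator like `getSchemaFilter` could do: build numbered placeholders so schema names travel as query parameters, never as interpolated SQL. The function body below is an assumption, not the core implementation:

```typescript
interface SchemaFilter {
  clause: string;
  params: string[];
}

// Build "AND <column> IN ($1, $2, ...)" with one placeholder per schema.
function buildSchemaFilter(schemas: string[], column: string): SchemaFilter {
  if (schemas.length === 0) return { clause: "", params: [] };
  const placeholders = schemas.map((_, i) => `$${i + 1}`).join(", ");
  return { clause: `AND ${column} IN (${placeholders})`, params: schemas };
}
```

Passing `clause` into the query text and `params` into the query call reproduces the parameterized pattern shown above.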
## Building Plugins See [plugin-scaffold](/plugins/plugin-scaffold) to scaffold a new plugin, or inspect existing plugins in the repository. --- --- url: /supabase-tools/publishing.md description: How to publish @sbtools packages to npm using changesets and CI. --- # Version and Release Guide ## Prerequisites 1. Create the `@sbtools` organization on [npmjs.com](https://www.npmjs.com) (free for public packages) 2. `npm login` to authenticate ## First-Time Publish 1. `npm run build` — compile all packages 2. `npx changeset publish` — publishes all packages at 0.1.0 ## Subsequent Releases 1. Make your changes 2. `npx changeset` — interactively create a changeset describing what changed 3. Commit the changeset file with your PR 4. When ready to release: * `npx changeset version` — bumps package versions and updates CHANGELOGs * `npm run release` — builds and publishes all changed packages ## Deprecated Packages `@sbtools/plugin-atlas-html` and `@sbtools/plugin-docs-server` were merged into `@sbtools/core` as of v0.3.0. To mark them deprecated on npm: ```bash npm deprecate @sbtools/plugin-atlas-html "Merged into @sbtools/core >=0.3.0. Remove from your plugins config." npm deprecate @sbtools/plugin-docs-server "Merged into @sbtools/core >=0.3.0. Remove from your plugins config." ``` The core plugin loader already detects these names in config and prints a warning instead of crashing. ## Notes * All packages are linked: version bumps propagate across `@sbtools/*` * `access: "public"` is required for scoped packages * `NPM_TOKEN` secret is required for CI/CD publishing * Run `npm run lint:conventions` before releasing to check for convention violations --- --- url: /supabase-tools/writing-plugins.md description: Build custom supabase-tools plugins — from scaffold to publish. --- # Writing Plugins A plugin is a default-exported object that satisfies `SbtPlugin`. It can add CLI commands and hook into core operations like `status`, `generate-atlas`, and `docs`. 
## Scaffold ```bash npx sbt scaffold-plugin my-plugin # packages/plugin-my-plugin/ npx sbt scaffold-plugin my-plugin --external # plugin-my-plugin/ at project root npx sbt scaffold-plugin my-plugin --hooks # include Atlas/status/OpenAPI stubs ``` ## Minimal Plugin ```ts import type { SbtPlugin, PluginContext } from "@sbtools/sdk"; import { ui } from "@sbtools/sdk"; const plugin: SbtPlugin = { name: "@sbtools/plugin-my-plugin", version: "1.0.0", commands: [ { name: "my-command", description: "Does something useful", async run(args: string[], ctx: PluginContext) { ui.step("Working..."); // ctx.projectRoot, ctx.paths, ctx.pluginConfig are available ui.success("Done"); }, }, ], }; export default plugin; ``` Register it in `supabase-tools.config.json`: ```json { "plugins": [ { "path": "./packages/plugin-my-plugin", "config": {} } ] } ``` Run it: `npx sbt my-command` ## PluginContext Every command and hook receives `ctx`: | Property | Type | Description | |----------|------|-------------| | `projectRoot` | `string` | Absolute path to the project root | | `toolsDir` | `string` | Absolute path to the core package (Docker files) | | `sbtDataDir` | `string` | Project-local runtime data (`.sbt/`) | | `artifactsDir` | `string` | Artifact storage path (`.sbt/artifacts/`) | | `apiUrl` | `string` | Supabase API URL | | `paths` | `ResolvedPaths` | `migrations`, `snapshot`, `docsOutput`, `functions` — all absolute | | `pluginConfig` | `Record<string, unknown>` | Plugin-specific config from the `config` block | | `siblingPlugins` | `SbtPlugin[]` | Other loaded plugins for cross-plugin collaboration | ## Reading Config Use the typed config helpers from the SDK (preferred over raw casts): ```ts import { getConfigString, getConfigNumber, getConfigStringArray, resolveConfigPath } from "@sbtools/sdk"; const output = getConfigString(ctx, "outputDir", "docs/my-output"); const port = getConfigNumber(ctx, "port", 3333); const scanDirs = getConfigStringArray(ctx, "scanPaths", ["src/"]); const typesOut =
resolveConfigPath(ctx, "typesOutput", "src/types/supabase.ts"); ``` ## CLI Flags ```ts import { hasFlag, getArg } from "@sbtools/sdk"; async run(args: string[], ctx: PluginContext) { if (hasFlag(args, "--json")) { /* output JSON */ } if (hasFlag(args, "-h", "--help")) { /* show help */ } const port = getArg(args, "--port") ?? "3000"; } ``` ## Hooks Hooks let your plugin contribute to core commands. All are optional. | Hook | Called by | Returns | |------|-----------|---------| | `getAtlasData(ctx)` | `sbt generate-atlas` | `{ categories: Record<string, unknown[]>, stats: { label, value }[] }` | | `getDashboardView()` | `sbt dashboard` | `{ sections: DashboardSectionDef[] }` — JSON-serializable | | `getStatusLines(ctx)` | `sbt status` | `string[]` — lines appended to status output | | `getOpenApiSpec(ctx)` | `sbt docs` | Partial OpenAPI 3.0 object (deep-merged into the PostgREST spec) | ### getStatusLines Example ```ts const plugin: SbtPlugin = { name: "@sbtools/plugin-example", version: "1.0.0", commands: [/* ... */], async getStatusLines(ctx) { return [" My Plugin: 3 items tracked"]; }, }; ``` ### getAtlasData Example ```ts const plugin: SbtPlugin = { name: "@sbtools/plugin-example", version: "1.0.0", commands: [/* ...
*/], async getAtlasData(ctx) { return { categories: { my_items: [{ name: "Item 1" }, { name: "Item 2" }] }, stats: [{ label: "My Items", value: 2 }], }; }, }; ``` ### getDashboardView Example Create a `dashboard.ts` file in your plugin: ```ts // src/dashboard.ts import type { DashboardSectionDef, DashboardView } from "@sbtools/sdk"; export function getMyPluginDashboardView(): DashboardView { return { sections: [ { id: "my-items", title: "My Items", description: "Items extracted from the database.", dataKey: "my_items", layout: "cards", card: { titleField: "name", subtitleField: "type", searchFields: ["name", "description"], badges: [{ field: "status", toneMap: { active: "good", inactive: "warn" } }], details: [{ label: "Description", field: "description" }], }, }, ], }; } ``` Then in `src/index.ts`: ```ts import { getMyPluginDashboardView } from "./dashboard.js"; const plugin: SbtPlugin = { name: "@sbtools/plugin-example", version: "1.0.0", commands: [/* ... */], getDashboardView: () => getMyPluginDashboardView(), }; ``` See [SDK API — Dashboard View](/sdk/#dashboard-view) for full documentation. ## SDK Utilities See the full list in [SDK API](/sdk/). Most commonly used: ```ts import { ui, ensureDir, writeFileInDir, readText, openFile } from "@sbtools/sdk"; ensureDir(outputDir); writeFileInDir(outputDir, "report.html", html); const content = readText(filePath); openFile(outputPath); // open in default app ``` ## Publishing 1. Build: `npm run build` (or `tsc` in your plugin directory) 2. Publish: `npm publish --access public` 3. Users install via `npm install @your-scope/plugin-name` and add it to their config