SuperDoc tools work with any LLM provider or agent framework that supports tool use. The SDK ships tool definitions in multiple formats — pick the one that matches your stack, write a conversation loop (or let the framework handle it), and dispatch tool calls through the SDK. Each example below opens a document, gives the model SuperDoc tools, and lets it review and edit the content.
LLM tools are in alpha. Tool names and schemas may change between releases.

Cloud platforms

Use SuperDoc tools with cloud AI platforms. You write the agentic loop and control the conversation directly.
npm install @superdoc-dev/sdk @aws-sdk/client-bedrock-runtime
import { BedrockRuntimeClient, ConverseCommand } from '@aws-sdk/client-bedrock-runtime';
import {
  createSuperDocClient, chooseTools, dispatchSuperDocTool,
  formatToolResult, mergeDiscoveredTools,
} from '@superdoc-dev/sdk';

const client = createSuperDocClient();
await client.connect();
await client.doc.open({ doc: './contract.docx' });

// Anthropic format → Bedrock toolSpec shape
const { tools } = await chooseTools({ provider: 'anthropic' });
const toolConfig = { tools: [] };
mergeDiscoveredTools(toolConfig, { tools }, { provider: 'anthropic', target: 'bedrock' });

const bedrock = new BedrockRuntimeClient({ region: 'us-east-1' });
const messages = [
  { role: 'user', content: [{ text: 'Review this contract.' }] },
];

while (true) {
  const res = await bedrock.send(new ConverseCommand({
    modelId: 'us.anthropic.claude-sonnet-4-6',
    messages,
    system: [{ text: 'You edit .docx files using SuperDoc tools.' }],
    toolConfig,
  }));

  const output = res.output?.message;
  if (!output) break;
  messages.push(output);

  const toolUses = output.content?.filter((b) => b.toolUse) ?? [];
  if (!toolUses.length) break;

  const results = [];
  for (const block of toolUses) {
    const { name, input, toolUseId } = block.toolUse;
    const result = await dispatchSuperDocTool(client, name, input ?? {});
    results.push(formatToolResult(result, { target: 'bedrock', toolUseId }));
  }
  messages.push({ role: 'user', content: results });
}

await client.doc.save({ inPlace: true });
await client.dispose();
Auth: AWS credentials via aws configure, env vars, or IAM role. No API key needed.

Agent frameworks

Use SuperDoc tools with agent frameworks. The framework manages the agentic loop — you configure tools and let it run.
npm install @superdoc-dev/sdk ai @ai-sdk/openai zod
import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { createSuperDocClient, chooseTools, dispatchSuperDocTool } from '@superdoc-dev/sdk';
import { z } from 'zod';

const client = createSuperDocClient();
await client.connect();
await client.doc.open({ doc: './contract.docx' });

// All tools — no discover_tools since the framework manages a fixed tool set
const { tools: sdTools } = await chooseTools({ provider: 'vercel', mode: 'all' });

// Wrap as Vercel AI tool() objects
const tools = {};
for (const t of sdTools) {
  tools[t.function.name] = tool({
    description: t.function.description,
    parameters: z.object({}).passthrough(),
    execute: async (args) =>
      dispatchSuperDocTool(client, t.function.name, args),
  });
}

// generateText handles the agentic loop automatically
const result = await generateText({
  model: openai('gpt-4o'),
  system: 'You edit .docx files using SuperDoc tools.',
  prompt: 'Review this contract.',
  tools,
  maxSteps: 20,
});

console.log(result.text);
await client.doc.save({ inPlace: true });
await client.dispose();
Auth: OPENAI_API_KEY env var. Swap openai(...) for anthropic(...), google(...), etc.

Tool format reference

The SDK ships pre-formatted tools for each integration. The conversion is minimal:
| Integration | Type | SDK format | SDK helpers | Native shape |
| --- | --- | --- | --- | --- |
| AWS Bedrock | Cloud platform | anthropic | mergeDiscoveredTools, formatToolResult, formatToolError | { toolSpec: { name, description, inputSchema: { json } } } |
| Google Vertex AI | Cloud platform | generic | sanitizeToolSchemas, mergeDiscoveredTools | { functionDeclarations: [...] } |
| Vercel AI SDK | Framework | vercel | (none) | Wrap in Vercel tool() with z.object({}).passthrough() |
| LangChain | Framework | generic | (none) | Wrap in DynamicStructuredTool |
| OpenAI | Direct API | openai | formatToolResult | Pass directly |
| Anthropic | Direct API | anthropic | formatToolResult | Pass directly |
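
To illustrate how minimal these conversions are, the sketch below maps the function-call tool shape used in the framework example above ({ function: { name, description, parameters } }) into the Bedrock and Vertex native shapes from the table. The helper names toBedrockToolSpec and toVertexDeclarations are hypothetical, not SDK exports; in practice mergeDiscoveredTools does this for you.

```javascript
// Hypothetical helpers (not SDK exports) showing the conversions in the
// table above. Input tools use the function-call shape seen in the
// framework example: { function: { name, description, parameters } }.
function toBedrockToolSpec(tool) {
  const { name, description, parameters } = tool.function;
  // Bedrock Converse wraps each tool in a toolSpec with a JSON schema.
  return { toolSpec: { name, description, inputSchema: { json: parameters } } };
}

function toVertexDeclarations(tools) {
  // Vertex AI groups all tools under a single functionDeclarations array.
  return {
    functionDeclarations: tools.map((t) => ({
      name: t.function.name,
      description: t.function.description,
      parameters: t.function.parameters,
    })),
  };
}
```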

The discover_tools pattern

All provider examples above use essential mode (default) — 5 core tools plus discover_tools. When the model needs more tools (comments, formatting, tables, etc.), it calls discover_tools to load them dynamically. Handle this in your agentic loop. Since discover_tools is a meta-tool (not a document operation), intercept it before dispatching:
import { chooseTools, dispatchSuperDocTool, mergeDiscoveredTools } from '@superdoc-dev/sdk';

// Inside the tool-call loop: `name` and `args` come from the model's
// tool-use block, and `toolConfig` is the tool set sent to the provider.
let result;
if (name === 'discover_tools') {
  // discover_tools is a meta-tool — handle client-side via chooseTools
  const discovered = await chooseTools({ provider: 'anthropic', groups: args.groups });
  mergeDiscoveredTools(toolConfig, discovered, { provider: 'anthropic', target: 'bedrock' });
  result = discovered;
} else {
  result = await dispatchSuperDocTool(client, name, args);
}
Framework-managed examples (Vercel AI, LangChain) use mode: 'all' instead, since they can’t inject new tools mid-conversation.
See the LLM Tools guide for details.
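
mergeDiscoveredTools handles the merge for Bedrock and Vertex shapes. If you target a provider without an SDK helper, a merge in the same spirit, deduplicating by tool name, is straightforward; the sketch below is illustrative, not the SDK's implementation.

```javascript
// Hypothetical merge (mirrors the intent of mergeDiscoveredTools):
// append newly discovered tool specs, skipping names already present
// so repeated discover_tools calls don't duplicate entries.
function mergeByName(existing, discovered, getName) {
  const seen = new Set(existing.map(getName));
  for (const t of discovered) {
    if (!seen.has(getName(t))) {
      seen.add(getName(t));
      existing.push(t);
    }
  }
  return existing;
}
```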

Tracked changes

For contract review workflows, you typically want all edits to appear as tracked changes so a human can accept or reject them. Two approaches:

Instruct the model

Tell the model to use changeMode: "tracked" in its apply_mutations calls:
## System prompt
All edits must use changeMode: "tracked" so they appear as
tracked changes for human review.
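
If you dispatch tool calls yourself (as in the Bedrock loop above), you can also enforce the rule defensively rather than relying on the prompt alone. This sketch assumes apply_mutations accepts a changeMode field, as described above; the helper name is hypothetical.

```javascript
// Hypothetical guard: rewrite apply_mutations arguments before dispatch
// so edits are always tracked, even if the model omits changeMode.
function enforceTracked(name, args) {
  if (name === 'apply_mutations') {
    return { ...args, changeMode: 'tracked' };
  }
  return args;
}

// Usage in the dispatch loop:
//   const result = await dispatchSuperDocTool(client, name, enforceTracked(name, input ?? {}));
```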

Use the headless editor

If you don’t need agentic tool use — just want the model to suggest edits — use the headless editor with documentMode: 'suggesting':
import { Editor } from 'superdoc/super-editor';
import { readFile, writeFile } from 'node:fs/promises';

const docx = await readFile('./contract.docx');
const editor = await Editor.open(docx, { documentMode: 'suggesting' });

// Get suggestions from your LLM (structured output, no tool use)
const suggestions = await getSuggestions(editor.state.doc.textContent);

// Apply each suggestion as a tracked change
for (const s of suggestions) {
  const matches = editor.commands.search(s.find, { highlight: false });
  if (!matches.length) continue;

  editor.commands.insertTrackedChange({
    from: matches[0].from,
    to: matches[0].to,
    text: s.replace,
    user: { name: 'AI Reviewer', email: 'ai@example.com' },
  });
}

const result = await editor.exportDocx();
await writeFile('./reviewed.docx', Buffer.from(result));
editor.destroy();

Best practices

  • Start with essential mode. Load 5 tools + discover_tools. The model loads more groups when needed. This keeps token usage low.
  • Use apply_mutations for text edits. It batches multiple rewrites in one call, reducing round trips.
  • Feed errors back. When a tool call fails, return the error as a tool result. Most models self-correct on the next turn.
  • Pin your model version. Use a specific model ID rather than an alias to avoid behavior changes between releases.
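
The "feed errors back" point can be sketched for the Bedrock loop shown earlier. The toolResult fields here (toolUseId, status, content) follow the Bedrock Converse message shape; treat the exact field names as an assumption to verify against your provider, or use formatToolError where available.

```javascript
// Sketch: wrap a failed tool call as an error toolResult block so the
// model sees the failure text and can self-correct on the next turn.
// (Field names follow the Bedrock Converse shape; verify per provider.)
function errorToolResult(toolUseId, err) {
  return {
    toolResult: {
      toolUseId,
      status: 'error',
      content: [{ text: String(err?.message ?? err) }],
    },
  };
}

// Usage inside the Bedrock loop:
//   try { ... } catch (err) { results.push(errorToolResult(toolUseId, err)); }
```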

Example repository

Complete, runnable examples for all cloud platforms and frameworks (Node.js and Python) are available at examples/ai/.
  • LLM Tools — tool selection, dispatch, and the full API
  • Skills — reusable prompt templates
  • MCP Server — Model Context Protocol integration
  • SDKs — typed Node.js and Python wrappers