SuperDoc tools work with any LLM provider or agent framework that supports tool use. The SDK ships tool definitions in multiple formats — pick the one that matches your stack, write a conversation loop (or let the framework handle it), and dispatch tool calls through the SDK.
Each example below opens a document, gives the model SuperDoc tools, and lets it review and edit the content.
LLM tools are in alpha. Tool names and schemas may change between releases.
Cloud platforms
Use SuperDoc tools with cloud AI platforms. You write the agentic loop and control the conversation directly.
AWS Bedrock
Google Vertex AI
npm install @superdoc-dev/sdk @aws-sdk/client-bedrock-runtime
import { BedrockRuntimeClient, ConverseCommand } from '@aws-sdk/client-bedrock-runtime';
import {
createSuperDocClient, chooseTools, dispatchSuperDocTool,
formatToolResult, mergeDiscoveredTools,
} from '@superdoc-dev/sdk';
const client = createSuperDocClient();
await client.connect();
await client.doc.open({ doc: './contract.docx' });
// Anthropic format → Bedrock toolSpec shape
const { tools } = await chooseTools({ provider: 'anthropic' });
const toolConfig = { tools: [] };
mergeDiscoveredTools(toolConfig, { tools }, { provider: 'anthropic', target: 'bedrock' });
const bedrock = new BedrockRuntimeClient({ region: 'us-east-1' });
const messages = [
{ role: 'user', content: [{ text: 'Review this contract.' }] },
];
while (true) {
const res = await bedrock.send(new ConverseCommand({
modelId: 'us.anthropic.claude-sonnet-4-6',
messages,
system: [{ text: 'You edit .docx files using SuperDoc tools.' }],
toolConfig,
}));
const output = res.output?.message;
if (!output) break;
messages.push(output);
const toolUses = output.content?.filter((b) => b.toolUse) ?? [];
if (!toolUses.length) break;
const results = [];
for (const block of toolUses) {
const { name, input, toolUseId } = block.toolUse;
const result = await dispatchSuperDocTool(client, name, input ?? {});
results.push(formatToolResult(result, { target: 'bedrock', toolUseId }));
}
messages.push({ role: 'user', content: results });
}
await client.doc.save({ inPlace: true });
await client.dispose();
pip install superdoc-sdk boto3
import boto3
from superdoc import (
SuperDocClient, choose_tools, dispatch_superdoc_tool,
format_tool_result, merge_discovered_tools,
)
client = SuperDocClient()
client.connect()
client.doc.open(doc="./contract.docx")
# Anthropic format → Bedrock toolSpec shape
sd_tools = choose_tools(provider="anthropic")
tool_config = {"tools": []}
merge_discovered_tools(tool_config, sd_tools, provider="anthropic", target="bedrock")
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
messages = [{"role": "user", "content": [{"text": "Review this contract."}]}]
while True:
response = bedrock.converse(
modelId="us.anthropic.claude-sonnet-4-6",
messages=messages,
system=[{"text": "You edit .docx files using SuperDoc tools."}],
toolConfig=tool_config,
)
output = response["output"]["message"]
messages.append(output)
tool_uses = [b for b in output.get("content", []) if "toolUse" in b]
if not tool_uses:
break
tool_results = []
for block in tool_uses:
tu = block["toolUse"]
result = dispatch_superdoc_tool(client, tu["name"], tu.get("input", {}))
tool_results.append(
format_tool_result(result, target="bedrock", tool_use_id=tu["toolUseId"])
)
messages.append({"role": "user", "content": tool_results})
client.doc.save(in_place=True)
client.dispose()
Auth: AWS credentials via aws configure, env vars, or IAM role. No API key needed.
npm install @superdoc-dev/sdk @google-cloud/vertexai
import { VertexAI } from '@google-cloud/vertexai';
import {
createSuperDocClient, chooseTools, dispatchSuperDocTool,
sanitizeToolSchemas,
} from '@superdoc-dev/sdk';
const client = createSuperDocClient();
await client.connect();
await client.doc.open({ doc: './contract.docx' });
// Generic format → Vertex function declarations (sanitized for Vertex compatibility)
const { tools } = await chooseTools({ provider: 'generic' });
const sanitized = sanitizeToolSchemas(tools, 'vertex');
const vertexTools = [{
functionDeclarations: sanitized.map((t) => ({
name: t.name,
description: t.description,
parameters: t.parameters,
})),
}];
const vertexAI = new VertexAI({ project: 'your-project', location: 'us-central1' });
const model = vertexAI.getGenerativeModel({
model: 'gemini-2.5-pro',
tools: vertexTools,
systemInstruction: { role: 'system', parts: [{ text: 'You edit .docx files using SuperDoc tools.' }] },
});
const chat = model.startChat();
let response = await chat.sendMessage([{ text: 'Review this contract.' }]);
while (true) {
const parts = response.response.candidates?.[0]?.content.parts ?? [];
const calls = parts.filter((p) => p.functionCall);
if (!calls.length) break;
const results = [];
for (const part of calls) {
const { name, args } = part.functionCall;
const result = await dispatchSuperDocTool(client, name, args ?? {});
results.push({ functionResponse: { name, response: result } });
}
response = await chat.sendMessage(results);
}
await client.doc.save({ inPlace: true });
await client.dispose();
pip install superdoc-sdk google-cloud-aiplatform
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, FunctionDeclaration, Part
from superdoc import SuperDocClient, choose_tools, dispatch_superdoc_tool, sanitize_tool_schemas
client = SuperDocClient()
client.connect()
client.doc.open(doc="./contract.docx")
# Generic format → Vertex function declarations (sanitized for Vertex compatibility)
result = choose_tools(provider="generic")
sanitized = sanitize_tool_schemas(result["tools"], "vertex")
vertex_tools = [Tool(function_declarations=[
FunctionDeclaration(name=t["name"], description=t["description"], parameters=t["parameters"])
for t in sanitized
])]
vertexai.init(project="your-project", location="us-central1")
model = GenerativeModel(
"gemini-2.5-pro",
tools=vertex_tools,
system_instruction="You edit .docx files using SuperDoc tools.",
)
chat = model.start_chat()
response = chat.send_message("Review this contract.")
while True:
calls = [p for p in response.candidates[0].content.parts if p.function_call.name]
if not calls:
break
responses = []
for part in calls:
name = part.function_call.name
args = dict(part.function_call.args) if part.function_call.args else {}
result = dispatch_superdoc_tool(client, name, args)
responses.append(Part.from_function_response(name=name, response=result))
response = chat.send_message(responses)
client.doc.save(in_place=True)
client.dispose()
Auth: gcloud auth application-default login or a service account key.
Agent frameworks
Use SuperDoc tools with agent frameworks. The framework manages the agentic loop — you configure tools and let it run.
npm install @superdoc-dev/sdk ai @ai-sdk/openai zod
import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { createSuperDocClient, chooseTools, dispatchSuperDocTool } from '@superdoc-dev/sdk';
import { z } from 'zod';
const client = createSuperDocClient();
await client.connect();
await client.doc.open({ doc: './contract.docx' });
// All tools — no discover_tools since the framework manages a fixed tool set
const { tools: sdTools } = await chooseTools({ provider: 'vercel', mode: 'all' });
// Wrap as Vercel AI tool() objects
const tools = {};
for (const t of sdTools) {
tools[t.function.name] = tool({
description: t.function.description,
parameters: z.object({}).passthrough(),
execute: async (args) =>
dispatchSuperDocTool(client, t.function.name, args),
});
}
// generateText handles the agentic loop automatically
const result = await generateText({
model: openai('gpt-4o'),
system: 'You edit .docx files using SuperDoc tools.',
prompt: 'Review this contract.',
tools,
maxSteps: 20,
});
console.log(result.text);
await client.doc.save({ inPlace: true });
await client.dispose();
Auth: OPENAI_API_KEY env var. Swap openai(...) for anthropic(...), google(...), etc.
npm install @superdoc-dev/sdk @langchain/openai @langchain/core @langchain/langgraph zod
import { ChatOpenAI } from '@langchain/openai';
import { DynamicStructuredTool } from '@langchain/core/tools';
import { createReactAgent } from '@langchain/langgraph/prebuilt';
import { HumanMessage } from '@langchain/core/messages';
import { z } from 'zod';
import { createSuperDocClient, chooseTools, dispatchSuperDocTool } from '@superdoc-dev/sdk';
const client = createSuperDocClient();
await client.connect();
await client.doc.open({ doc: './contract.docx' });
// All tools — no discover_tools since the framework manages a fixed tool set
const { tools: sdTools } = await chooseTools({ provider: 'generic', mode: 'all' });
// Wrap as LangChain DynamicStructuredTool objects
const tools = sdTools.map(
(t) => new DynamicStructuredTool({
name: t.name,
description: t.description,
schema: z.object({}).passthrough(),
func: async (args) => {
const result = await dispatchSuperDocTool(client, t.name, args);
return JSON.stringify(result);
},
}),
);
const agent = createReactAgent({
llm: new ChatOpenAI({ model: 'gpt-4o' }),
tools,
prompt: 'You edit .docx files using SuperDoc tools.',
});
const result = await agent.invoke({
messages: [new HumanMessage('Review this contract.')],
});
console.log(result.messages.at(-1).content);
await client.doc.save({ inPlace: true });
await client.dispose();
pip install superdoc-sdk langchain-openai langgraph
import json
from langchain_openai import ChatOpenAI
from langchain_core.tools import StructuredTool
from langgraph.prebuilt import create_react_agent
from langchain_core.messages import HumanMessage
from superdoc import SuperDocClient, choose_tools, dispatch_superdoc_tool
client = SuperDocClient()
client.connect()
client.doc.open(doc="./contract.docx")
# All tools — no discover_tools since the framework manages a fixed tool set
result = choose_tools(provider="generic", mode="all")
# Wrap as LangChain StructuredTool objects
def make_tool(t):
def invoke(**kwargs) -> str:
return json.dumps(dispatch_superdoc_tool(client, t["name"], kwargs))
return StructuredTool.from_function(
func=invoke, name=t["name"], description=t["description"],
infer_schema=False,
)
tools = [make_tool(t) for t in result["tools"]]
agent = create_react_agent(
model=ChatOpenAI(model="gpt-4o"),
tools=tools,
prompt="You edit .docx files using SuperDoc tools.",
)
result = agent.invoke(
{"messages": [HumanMessage(content="Review this contract.")]}
)
print(result["messages"][-1].content)
client.doc.save(in_place=True)
client.dispose()
Auth: OPENAI_API_KEY env var. Swap ChatOpenAI for ChatAnthropic, ChatGoogleGenerativeAI, etc.
The SDK ships pre-formatted tools for each integration. The conversion is minimal:
| Integration | Type | SDK format | SDK helpers | Native shape |
|---|---|---|---|---|
| AWS Bedrock | Cloud platform | anthropic | mergeDiscoveredTools, formatToolResult, formatToolError | { toolSpec: { name, description, inputSchema: { json } } } |
| Google Vertex AI | Cloud platform | generic | sanitizeToolSchemas, mergeDiscoveredTools | { functionDeclarations: [...] } |
| Vercel AI SDK | Framework | vercel | — | Wrap in Vercel tool() with z.object({}).passthrough() |
| LangChain | Framework | generic | — | Wrap in DynamicStructuredTool |
| OpenAI | Direct API | openai | formatToolResult | Pass directly |
| Anthropic | Direct API | anthropic | formatToolResult | Pass directly |
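The mappings in the table can be sketched as plain dictionary reshaping. This is an illustration only; the sample tool definition and both helper functions below are hypothetical, and in practice the SDK's mergeDiscoveredTools and sanitizeToolSchemas helpers do this conversion for you.

```python
# A tool definition in the SDK's generic format (hypothetical sample).
generic_tool = {
    "name": "apply_mutations",
    "description": "Batch text edits against the open document.",
    "parameters": {
        "type": "object",
        "properties": {"mutations": {"type": "array"}},
        "required": ["mutations"],
    },
}

def to_bedrock_tool_spec(tool):
    """Generic format -> Bedrock Converse toolSpec shape (see table above)."""
    return {
        "toolSpec": {
            "name": tool["name"],
            "description": tool["description"],
            "inputSchema": {"json": tool["parameters"]},
        }
    }

def to_vertex_declarations(tools):
    """Generic format -> Vertex functionDeclarations shape (see table above)."""
    return {
        "functionDeclarations": [
            {
                "name": t["name"],
                "description": t["description"],
                "parameters": t["parameters"],
            }
            for t in tools
        ]
    }
```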
All provider examples above use essential mode (default) — 5 core tools plus discover_tools. When the model needs more tools (comments, formatting, tables, etc.), it calls discover_tools to load them dynamically.
Handle this in your agentic loop. Since discover_tools is a meta-tool (not a document operation), intercept it before dispatching:
import { chooseTools, mergeDiscoveredTools } from '@superdoc-dev/sdk';
// Inside the tool-call loop: `name` and `args` come from the model's tool-use block
if (name === 'discover_tools') {
// discover_tools is a meta-tool — handle client-side via chooseTools
const discovered = await chooseTools({ provider: 'anthropic', groups: args.groups });
mergeDiscoveredTools(toolConfig, discovered, { provider: 'anthropic', target: 'bedrock' });
result = discovered;
} else {
result = await dispatchSuperDocTool(client, name, args);
}
Framework-managed examples (Vercel AI, LangChain) use mode: 'all' instead, since they can’t inject new tools mid-conversation.
See the LLM Tools guide for details.
Tracked changes
For contract review workflows, you typically want all edits to appear as tracked changes so a human can accept or reject them. Two approaches:
Instruct the model
Tell the model to use changeMode: "tracked" in its apply_mutations calls:
## System prompt
All edits must use changeMode: "tracked" so they appear as
tracked changes for human review.
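Prompting is best-effort; a model can still omit the field. If you want a hard guarantee, you can also normalize arguments before dispatching. A minimal sketch, assuming apply_mutations accepts a top-level changeMode field as shown in the system prompt above (the enforce_tracked helper is hypothetical):

```python
def enforce_tracked(name: str, args: dict) -> dict:
    """Force tracked changes on edit calls, regardless of what the model sent."""
    if name == "apply_mutations":
        # Overwrite (or add) changeMode so every edit lands as a tracked change
        return {**args, "changeMode": "tracked"}
    return args

# In the loop, call it just before dispatching:
#   result = dispatch_superdoc_tool(client, name, enforce_tracked(name, args))
```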
Use the headless editor
If you don’t need agentic tool use — just want the model to suggest edits — use the headless editor with documentMode: 'suggesting':
import { Editor } from 'superdoc/super-editor';
import { readFile, writeFile } from 'node:fs/promises';
const docx = await readFile('./contract.docx');
const editor = await Editor.open(docx, { documentMode: 'suggesting' });
// Get suggestions from your LLM (structured output, no tool use)
const suggestions = await getSuggestions(editor.state.doc.textContent);
// Apply each suggestion as a tracked change
for (const s of suggestions) {
const matches = editor.commands.search(s.find, { highlight: false });
if (!matches.length) continue;
editor.commands.insertTrackedChange({
from: matches[0].from,
to: matches[0].to,
text: s.replace,
user: { name: 'AI Reviewer', email: 'ai@example.com' },
});
}
const result = await editor.exportDocx();
await writeFile('./reviewed.docx', Buffer.from(result));
editor.destroy();
Best practices
- Start with essential mode. Load 5 tools + discover_tools. The model loads more groups when needed. This keeps token usage low.
- Use apply_mutations for text edits. It batches multiple rewrites in one call, reducing round trips.
- Feed errors back. When a tool call fails, return the error as a tool result. Most models self-correct on the next turn.
- Pin your model version. Use a specific model ID rather than an alias to avoid behavior changes between releases.
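The "feed errors back" practice can be sketched as a wrapper around dispatch. This is an assumption-laden illustration, not SDK API: safe_tool_result is a hypothetical helper, dispatch stands in for dispatch_superdoc_tool, and the result shape shown is the Bedrock Converse toolResult (other providers use their own error shapes).

```python
def safe_tool_result(dispatch, name: str, args: dict, tool_use_id: str) -> dict:
    """Run a tool call; on failure, return the error text as a tool result
    so the model can see what went wrong and self-correct next turn."""
    try:
        result = dispatch(name, args)
        content = [{"json": result}]
        status = "success"
    except Exception as exc:
        content = [{"text": f"Tool {name} failed: {exc}"}]
        status = "error"
    # Bedrock Converse toolResult shape; adapt for other providers
    return {
        "toolResult": {
            "toolUseId": tool_use_id,
            "content": content,
            "status": status,
        }
    }
```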
Example repository
Complete, runnable examples for all cloud platforms and frameworks (Node.js and Python) are available at examples/ai/.
- LLM Tools — tool selection, dispatch, and the full API
- Skills — reusable prompt templates
- MCP Server — Model Context Protocol integration
- SDKs — typed Node.js and Python wrappers