The React Assistant SDK is an npm package that ships an MCP client, a conversation manager, an LLM provider abstraction, and @bindjs/renderer, the React renderer for BindJS. Drop it into your React app, configure how it should reach an LLM, and you have a fully functional in-app AI assistant that calls your Metabind tools and renders results as React components.
The SDK ships React components and a useAssistant hook — there isn’t a framework-agnostic core today. If you’re using Vue, Svelte, or vanilla DOM, wrap @bindjs/renderer directly or wait for a framework-agnostic release.

Requirements

  • React 18+
  • A bundler (Vite, Next.js, webpack, etc.)
  • A Metabind project with at least one published Type
  • Either: a Metabind project token (for the managed Agent proxy), or an Anthropic API key (BYOK direct mode)

Install via npm

npm install @metabind/assistant-sdk @bindjs/renderer
Or with pnpm/yarn:
pnpm add @metabind/assistant-sdk @bindjs/renderer
yarn add @metabind/assistant-sdk @bindjs/renderer
@bindjs/renderer is the BindJS renderer for React. The SDK depends on it.

Two ways to configure the LLM

The SDK accepts an LLMProvider. Two implementations ship:
  • MetabindAgentProvider: use when you want Metabind to manage LLM access. Calls the Agent proxy at agent.metabind.ai. The proxy holds the LLM key, runs the tool loop server-side, and streams responses back as SSE. Recommended for production.
  • AnthropicProvider: use when you want to bring your own Anthropic API key (BYOK direct mode) for development or testing. The SDK calls Anthropic directly from the browser.
Custom providers can implement the public LLMProvider interface.

Configure with the managed Agent proxy

import { MetabindAssistant, MetabindAgentProvider } from "@metabind/assistant-sdk";

const assistant = new MetabindAssistant({
  server: {
    serverURL: "https://mcp.metabind.ai/my-org/oak-and-ivory",
    serverHeaders: { Authorization: `Bearer ${projectToken}` }
  },
  llmProvider: new MetabindAgentProvider({ apiKey: projectToken })
});
The Agent proxy:
  • Authenticates with a Metabind project token (Bearer header).
  • Runs the LLM call and the tool loop on the server side. Tool results stream back to the browser as Server-Sent Events.
  • Handles secret-bearing LLM keys server-side, so your bundle doesn’t ship any third-party provider keys.

Configure with direct BYOK mode

import { MetabindAssistant, AnthropicProvider } from "@metabind/assistant-sdk";

const assistant = new MetabindAssistant({
  server: {
    serverURL: "https://mcp.metabind.ai/my-org/oak-and-ivory",
    serverHeaders: { Authorization: `Bearer ${projectToken}` }
  },
  llmProvider: new AnthropicProvider({
    apiKey: anthropicKey,
    model: "claude-sonnet-4-6"
  })
});
Do not ship a real Anthropic API key in production browser code — anyone with DevTools can read it from network traffic or memory. BYOK mode is appropriate for local development, internal tools, or apps where the key reaches the SDK from an authenticated user-managed source. For production, prefer the Agent proxy.
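If the key must reach the browser at all, one pattern consistent with this advice is to let each user supply their own key at runtime instead of bundling one. A minimal sketch, assuming the key was collected in a settings form and stored under an app-chosen localStorage entry (the "anthropic-key" name and the error handling are illustrative, not part of the SDK):
import { MetabindAssistant, AnthropicProvider } from "@metabind/assistant-sdk";

// Illustrative only: read a user-supplied Anthropic key from localStorage
// rather than shipping one in the bundle. "anthropic-key" is an assumption.
function createByokAssistant(projectToken: string): MetabindAssistant {
  const anthropicKey = window.localStorage.getItem("anthropic-key");
  if (!anthropicKey) {
    throw new Error("Ask the user for their Anthropic API key first.");
  }
  return new MetabindAssistant({
    server: {
      serverURL: "https://mcp.metabind.ai/my-org/oak-and-ivory",
      serverHeaders: { Authorization: `Bearer ${projectToken}` }
    },
    llmProvider: new AnthropicProvider({
      apiKey: anthropicKey,
      model: "claude-sonnet-4-6"
    })
  });
}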

Drop in the chat surface

The SDK ships a default React component:
import { MetabindAssistantView } from "@metabind/assistant-sdk/react";

export function AssistantPanel({ assistant }) {
  return <MetabindAssistantView assistant={assistant} />;
}
MetabindAssistantView is a fully featured chat UI: input, streaming, tool call rendering, scrolling, accessibility. It’s themed via CSS variables — drop it into a styled container and it picks up your app’s typography and colors. For a fully custom UI, use the useAssistant hook directly. See Custom host UI.

Native rendering of tool output

Interactive Tool output renders inline as React components via @bindjs/renderer. The default MetabindAssistantView wires this up automatically. Override the renderer if you want to wrap the result with your own framing:
import { BindJSRenderer } from "@bindjs/renderer";

<MetabindAssistantView
  assistant={assistant}
  renderToolOutput={(output) => <BindJSRenderer spec={output.ui} />}
/>
The renderer produces real React components, not iframes — they participate in your app’s layout, theme, and event handling.
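For example, to add your own framing around each result, a wrapper like this sketch works; the card element and its class name belong to your app, while renderToolOutput, output.ui, and BindJSRenderer are the pieces shown above:
import { BindJSRenderer } from "@bindjs/renderer";

// Wrap each tool result in the host app's own card chrome.
// "tool-result-card" is an app-level class, not an SDK concept.
<MetabindAssistantView
  assistant={assistant}
  renderToolOutput={(output) => (
    <div className="tool-result-card">
      <BindJSRenderer spec={output.ui} />
    </div>
  )}
/>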

Server-side rendering

@bindjs/renderer works with Next.js, Remix, and any SSR framework. Tool output specs serialize to JSON; the renderer converts them on either server or client. For Next.js App Router, the assistant client itself manages live state, so wrap it in a client component:
"use client";

import { MetabindAssistant } from "@metabind/assistant-sdk";
import { MetabindAssistantView } from "@metabind/assistant-sdk/react";
// ...
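A fuller sketch of that client component, assuming the project token arrives as a prop from a server component or route handler in your app:
"use client";

import { useMemo } from "react";
import { MetabindAssistant, MetabindAgentProvider } from "@metabind/assistant-sdk";
import { MetabindAssistantView } from "@metabind/assistant-sdk/react";

// Construct the assistant once per mount on the client; where projectToken
// comes from (server component, route handler, loader) is up to your app.
export function AssistantPanelClient({ projectToken }: { projectToken: string }) {
  const assistant = useMemo(
    () =>
      new MetabindAssistant({
        server: {
          serverURL: "https://mcp.metabind.ai/my-org/oak-and-ivory",
          serverHeaders: { Authorization: `Bearer ${projectToken}` }
        },
        llmProvider: new MetabindAgentProvider({ apiKey: projectToken })
      }),
    [projectToken]
  );

  return <MetabindAssistantView assistant={assistant} />;
}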

Conversation state

Conversation history is held in memory. You can read assistant.conversation.messages directly to render your own UI or to feed the conversation into your app’s state. If your app needs persistence across reloads, serialize the messages array (e.g., to localStorage, IndexedDB, or your backend) and rehydrate on next mount.
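A minimal localStorage sketch, assuming the messages serialize cleanly with JSON.stringify; how the saved array is rehydrated (fed back into your own UI state or restored into a new assistant) depends on your app, so only the save and load halves are shown:
import { MetabindAssistant } from "@metabind/assistant-sdk";

const STORAGE_KEY = "metabind-conversation"; // app-chosen key, not an SDK name

// Snapshot the in-memory history, e.g. after each completed turn.
function saveConversation(assistant: MetabindAssistant): void {
  window.localStorage.setItem(
    STORAGE_KEY,
    JSON.stringify(assistant.conversation.messages)
  );
}

// Read a previous snapshot on mount; returns [] when nothing was saved.
function loadSavedMessages(): unknown[] {
  const json = window.localStorage.getItem(STORAGE_KEY);
  return json ? JSON.parse(json) : [];
}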

Streaming

Streaming is on by default. SSE events from the Agent proxy render as they arrive; in BYOK mode, tokens stream from the LLM and tool calls render their results as soon as they complete. The conversation state is reactive — components subscribed via the SDK’s hooks re-render on each delta.
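As a rough illustration of that reactive model, a component subscribed through the hook re-renders as deltas arrive. The import path and the messages field below are assumptions about the hook's shape; see Custom host UI for its actual contract:
// Assumed shape for illustration: useAssistant returning the live message list.
import { useAssistant } from "@metabind/assistant-sdk/react";

function StreamingTranscript({ assistant }) {
  const { messages } = useAssistant(assistant); // re-renders on each delta
  return (
    <ol>
      {messages.map((message, index) => (
        <li key={index}>{JSON.stringify(message)}</li>
      ))}
    </ol>
  );
}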

Authentication patterns

  • Backend mints token. Your backend authenticates the user, mints a Metabind project token, returns it to the client. Most secure; recommended for production.
  • Static token (dev only). Hardcoded in client config. Don’t ship to production.
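A sketch of the backend-minted pattern; the /api/metabind-token route and its { token } response shape are placeholders for whatever your backend exposes:
import { MetabindAssistant, MetabindAgentProvider } from "@metabind/assistant-sdk";

// Ask your own backend (already authenticated via session cookies) for a
// short-lived Metabind project token, then build the assistant with it.
// "/api/metabind-token" and the { token } response shape are placeholders.
async function createAssistantForUser(): Promise<MetabindAssistant> {
  const response = await fetch("/api/metabind-token", { credentials: "include" });
  if (!response.ok) {
    throw new Error(`Token request failed: ${response.status}`);
  }
  const { token } = await response.json();

  return new MetabindAssistant({
    server: {
      serverURL: "https://mcp.metabind.ai/my-org/oak-and-ivory",
      serverHeaders: { Authorization: `Bearer ${token}` }
    },
    llmProvider: new MetabindAgentProvider({ apiKey: token })
  });
}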

TypeScript

The SDK is TypeScript-first. Conversation state, tool calls, and component specs are fully typed. Tool input and output schemas can be code-generated from your Metabind project for compile-time correctness:
npx @metabind/cli generate-types --project oak-and-ivory --out src/metabind-types.ts
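What the generated file exports depends on your project; purely as an illustration, a per-Type input type might be consumed like this (the CreateOrderInput name is hypothetical):
// Hypothetical: assumes the generated file exports input/output types named
// after your published Types; "CreateOrderInput" is illustrative only.
import type { CreateOrderInput } from "./metabind-types";

function buildOrderDraft(draft: CreateOrderInput): CreateOrderInput {
  // Any field that drifts from the published schema now fails at compile time.
  return draft;
}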

Bundle size

The SDK is roughly 25 KB minified + gzipped, plus @bindjs/renderer at around 60 KB. Tree-shakable — unused features (e.g., the BYOK Anthropic provider when you only use the Agent proxy) drop out of production builds.

What you write vs. what the SDK does

You write                                           SDK does
Project URL + token from your backend               Connection, auth, retries
LLM provider config (Agent proxy vs. BYOK)          Tool calls, schema validation, conversation state
<MetabindAssistantView> placement (or custom UI)    Streaming, message rendering, tool result rendering

Assistant SDK overview

Conceptual: when to embed vs. connect to an external host.

LLM provider configuration

Agent proxy vs. BYOK; key custody.

Custom host UI

Replace the default chat UI with your own.

iOS SDK

The iOS equivalent.