The React Assistant SDK is an npm package that ships an MCP client, conversation manager, an LLM provider abstraction, and the React renderer (@bindjs/renderer) for BindJS. Drop it into your React app, configure how it should reach an LLM, and you have a fully functional in-app AI assistant that calls your Metabind tools and renders results as React components.
The SDK ships React components and a useAssistant hook — there isn’t a framework-agnostic core today. If you’re using Vue, Svelte, or vanilla DOM, wrap @bindjs/renderer directly or wait for a framework-agnostic release.
Requirements
- React 18+
- A bundler (Vite, Next.js, webpack, etc.)
- A Metabind project with at least one published Type
- Either: a Metabind project token (for the managed Agent proxy), or an Anthropic API key (BYOK direct mode)
Install via npm
@bindjs/renderer is the BindJS renderer for React. The SDK depends on it.
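A typical install pulls in the SDK alongside the renderer. The SDK package name below is illustrative — this page doesn’t name the exact npm package, so check your Metabind dashboard or registry listing; @bindjs/renderer is the renderer dependency named above.

```shell
# Package name is illustrative -- confirm the exact SDK package in your
# Metabind dashboard. @bindjs/renderer is the BindJS renderer dependency.
npm install @metabind/assistant-sdk @bindjs/renderer
```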
Two ways to configure the LLM
The SDK accepts an LLMProvider. Two implementations ship:
| Provider | Use when |
|---|---|
| MetabindAgentProvider | You want Metabind to manage LLM access. Calls the Agent proxy at agent.metabind.ai. The proxy holds the LLM key, runs the tool loop server-side, and streams responses back as SSE. Recommended for production. |
| AnthropicProvider | You want to bring your own Anthropic API key (BYOK direct mode) for development or testing. The SDK calls Anthropic directly from the browser. |
Both implement the LLMProvider interface.
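To make the role of the abstraction concrete, here is a minimal sketch of what an LLMProvider contract could look like, with a mock implementation. The interface shape (a streaming `send` method and the message type) is an assumption for illustration — the SDK’s real interface may differ; check its typings.

```typescript
// Hypothetical shape of the LLMProvider contract -- the real SDK interface
// may differ. The sketch shows the role the abstraction plays: the host hands
// it the conversation, and it streams back response text.
interface ChatMessage {
  role: "user" | "assistant" | "tool";
  content: string;
}

interface LLMProvider {
  // Send the conversation so far; yield response text chunks as they arrive.
  send(messages: ChatMessage[]): AsyncIterable<string>;
}

// A trivial mock provider, handy for tests and UI demos with no network.
class EchoProvider implements LLMProvider {
  async *send(messages: ChatMessage[]): AsyncIterable<string> {
    const last = messages[messages.length - 1];
    yield `echo: ${last.content}`;
  }
}

async function demo(): Promise<string> {
  const provider: LLMProvider = new EchoProvider();
  let out = "";
  for await (const chunk of provider.send([{ role: "user", content: "hi" }])) {
    out += chunk;
  }
  return out;
}
```

Because the contract is small, a custom provider (for a different LLM vendor or an internal gateway) is a reasonable extension point.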
Configure with the Agent proxy (recommended)
- Authenticates with a Metabind project token (Bearer header).
- Runs the LLM call and the tool loop on the server side. Tool results stream back to the browser as Server-Sent Events.
- Handles secret-bearing LLM keys server-side, so your bundle doesn’t ship any third-party provider keys.
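The points above can be sketched as the request the provider sends on your behalf. The endpoint path, payload shape, and `AgentProxyConfig` field names are assumptions for illustration — in practice the SDK builds this request for you; you only supply the project URL and token.

```typescript
// Sketch of what the Agent-proxy transport does under the hood. Endpoint
// path and payload shape are assumptions; the SDK handles this internally.
interface AgentProxyConfig {
  projectUrl: string; // your Metabind project
  token: string;      // project token minted by your backend
}

function buildAgentRequest(config: AgentProxyConfig, userMessage: string) {
  return {
    url: "https://agent.metabind.ai/v1/chat", // illustrative path
    headers: {
      Authorization: `Bearer ${config.token}`, // project token, not an LLM key
      Accept: "text/event-stream",             // responses stream back as SSE
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ project: config.projectUrl, message: userMessage }),
  };
}

const req = buildAgentRequest(
  { projectUrl: "https://myproject.metabind.ai", token: "mb_abc123" },
  "hello"
);
```

Note that only the Metabind project token crosses the wire from the browser — the Anthropic (or other) LLM key stays on the proxy.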
Configure with direct BYOK mode
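In BYOK mode you hand the SDK your Anthropic API key and it calls Anthropic directly from the browser. The option names and validation below are a sketch, not the SDK’s real constructor signature — check its typings. The one hard rule: treat this as a development/testing path and never ship a real key in a production bundle.

```typescript
// BYOK sketch: option names are assumptions for illustration. Anthropic API
// keys start with "sk-ant-", which makes a cheap guard against accidentally
// passing a Metabind project token here instead.
interface AnthropicProviderOptions {
  apiKey: string; // your Anthropic API key (dev/test only -- never ship it)
  model?: string; // optional Claude model id
}

function makeAnthropicConfig(opts: AnthropicProviderOptions) {
  if (!opts.apiKey.startsWith("sk-ant-")) {
    throw new Error("expected an Anthropic API key (sk-ant-...)");
  }
  return opts;
}
```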
Drop in the chat surface
The SDK ships a default React component, MetabindAssistantView: a fully featured chat UI with input, streaming, tool call rendering, scrolling, and accessibility built in. It’s themed via CSS variables — drop it into a styled container and it picks up your app’s typography and colors.
For a fully custom UI, use the useAssistant hook directly. See Custom host UI.
Native rendering of tool output
Interactive tool output renders inline as React components via @bindjs/renderer. The default MetabindAssistantView wires this up automatically. Override the renderer if you want to wrap the result with your own framing:
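The wrapping pattern can be shown with a React-free stand-in: the real override would wrap the rendered React component, but modeling “render” as spec → string keeps the idea visible without a React runtime. All names here (`ToolOutputSpec`, `withFraming`) are illustrative, not SDK APIs.

```typescript
// React-free stand-in for the renderer-override idea. The real SDK renderer
// produces React elements; here "render" is spec -> string so the wrapping
// pattern is runnable anywhere. All names are illustrative.
type ToolOutputSpec = { component: string; props: Record<string, unknown> };
type Render = (spec: ToolOutputSpec) => string;

const defaultRender: Render = (spec) =>
  `<${spec.component} ${JSON.stringify(spec.props)}>`;

// Higher-order renderer: add your own framing around every tool result.
function withFraming(render: Render): Render {
  return (spec) => `<div class="tool-result">${render(spec)}</div>`;
}

const render = withFraming(defaultRender);
const html = render({ component: "Chart", props: { points: 3 } });
```

The same higher-order shape applies to the React version: accept the default renderer, return a renderer that wraps its output.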
Server-side rendering
@bindjs/renderer works with Next.js, Remix, and any SSR framework. Tool output specs serialize to JSON; the renderer converts them on either server or client.
For Next.js App Router, the assistant client itself manages live state, so wrap it in a client component:
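The serialization claim is easy to see in miniature: a spec is plain JSON, so it survives the server/client boundary with an ordinary stringify/parse round trip. The spec shape below is illustrative.

```typescript
// Tool output specs are plain JSON, so they can cross the server/client
// boundary in an SSR framework. The spec shape here is illustrative.
const spec = {
  component: "DataTable",
  props: { columns: ["name", "qty"], rows: [["widget", 2]] },
};

// Serialize on the server (e.g. in a loader or server component)...
const wire = JSON.stringify(spec);

// ...and revive on the client before handing it to the renderer.
const revived = JSON.parse(wire);
```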
Conversation state
Conversation history is held in memory. You can readassistant.conversation.messages directly to render your own UI or to feed the conversation into your app’s state. If your app needs persistence across reloads, serialize the messages array (e.g., to localStorage, IndexedDB, or your backend) and rehydrate on next mount.
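A minimal persistence sketch, assuming a simple message shape: serialize the messages array on change and rehydrate on the next mount. A `Map` stands in for `localStorage` so the sketch runs anywhere; swap in `window.localStorage` in the browser, and persist whatever `assistant.conversation.messages` actually contains.

```typescript
// Persistence sketch: Map stands in for localStorage so this runs outside a
// browser; the message shape is illustrative.
type Message = { role: "user" | "assistant"; content: string };

const storage = new Map<string, string>(); // stand-in for localStorage
const KEY = "assistant-conversation";

function saveConversation(messages: Message[]): void {
  storage.set(KEY, JSON.stringify(messages));
}

function loadConversation(): Message[] {
  const raw = storage.get(KEY);
  return raw ? (JSON.parse(raw) as Message[]) : [];
}

saveConversation([{ role: "user", content: "hi" }]);
const restored = loadConversation();
```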
Streaming
Streaming is on by default. SSE events from the Agent proxy render as they arrive; in BYOK mode, tokens stream from the LLM and tool calls render their results as soon as they complete. The conversation state is reactive — components subscribed via the SDK’s hooks re-render on each delta.
Authentication patterns
- Backend mints token. Your backend authenticates the user, mints a Metabind project token, returns it to the client. Most secure; recommended for production.
- Static token (dev only). Hardcoded in client config. Don’t ship to production.
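The backend-mints-token pattern can be sketched as a small cached token source: the client asks your backend for a short-lived project token and reuses it until it expires. The endpoint and response shape are assumptions; the fetcher is injected so the sketch runs without a network.

```typescript
// Backend-minted-token sketch. Endpoint and response shape are assumptions;
// inject the real fetcher in the browser (see usage comment below).
type TokenResponse = { token: string; expiresAt: number }; // epoch ms
type Fetcher = () => Promise<TokenResponse>;

function makeTokenSource(fetchToken: Fetcher, now: () => number = Date.now) {
  let cached: TokenResponse | null = null;
  return async (): Promise<string> => {
    // Re-mint only when there is no token yet or the cached one has expired.
    if (!cached || cached.expiresAt <= now()) {
      cached = await fetchToken();
    }
    return cached.token;
  };
}

// Usage (hypothetical backend route):
// const getToken = makeTokenSource(() =>
//   fetch("/api/metabind-token").then((r) => r.json()));
```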
TypeScript
The SDK is TypeScript-first. Conversation state, tool calls, and component specs are fully typed. Tool input and output schemas can be code-generated from your Metabind project for compile-time correctness.
Bundle size
The SDK is roughly 25 KB minified + gzipped, plus @bindjs/renderer at around 60 KB. It is tree-shakable — unused features (e.g., the BYOK Anthropic provider when you only use the Agent proxy) drop out of production builds.
What you write vs. what the SDK does
| You write | SDK does |
|---|---|
| Project URL + token from your backend | Connection, auth, retries |
| LLM provider config (Agent proxy vs. BYOK) | Tool calls, schema validation, conversation state |
| <MetabindAssistantView> placement (or custom UI) | Streaming, message rendering, tool result rendering |
Related
Assistant SDK overview
Conceptual: when to embed vs. connect to an external host.
LLM provider configuration
Agent proxy vs. BYOK; key custody.
Custom host UI
Replace the default chat UI with your own.
iOS SDK
The iOS equivalent.