The Assistant SDK is the embed path for Metabind. It packages an MCP-aware AI client, conversation state, and the BindJS native renderer into a single SDK you drop into your app. Configure how it should reach an LLM, give it a host surface, and the SDK handles the rest.

## Documentation Index
Fetch the complete documentation index at: https://docs.metabind.ai/llms.txt
Use this file to discover all available pages before exploring further.
## What the SDK gives you

### Use the Assistant SDK when
- You’re building an iOS, Android, or web product and want an in-product AI assistant.
- You want to ship MCP tools that you built (in Metabind) without exposing a third-party MCP host to your users.
- You want native UI rendering of tool results — SwiftUI, Compose, React — not WebViews.
- You want Metabind to manage LLM key custody and the tool loop server-side (the default).
### Use a connected MCP host instead when
- Your users already work in Claude Desktop or ChatGPT, and you want to extend those surfaces.
- You don’t want to ship a UI yourself — connecting to an existing host is faster.
- The use case is developer-facing rather than end-user-facing.
## Three platforms, one model
| Platform | Package | Renderer |
|---|---|---|
| iOS | metabind-ai-apple (Swift Package) | SwiftUI via bindjs-apple |
| Android | metabind-ai-android (Maven) | Jetpack Compose via bindjs-android |
| React (web) | @metabind/assistant-sdk (npm) | React via @bindjs/renderer |
Each platform ships the same model: a `MetabindAssistant` configured with a server and an `LLMProvider`, plus a default `MetabindAssistantView` chat surface — adapted to the platform’s idioms. The web release is React-only today; non-React frameworks can use `@bindjs/renderer` directly.
## Two ways to reach an LLM

The SDK is provider-pluggable through an `LLMProvider` abstraction. Two implementations ship:
- `MetabindAgentProvider` — calls Metabind’s hosted Agent proxy at `agent.metabind.ai`. The proxy authenticates with a Metabind project token, holds the LLM key, runs the tool loop server-side, and streams responses back as SSE. The proxy supports Anthropic, OpenAI, and Google — selected per-project in MCP App Studio. Recommended for production.
- `AnthropicProvider` — bring-your-own-key (BYOK) Anthropic, called directly from the client. Useful for development, internal tools, or apps where the key reaches the SDK from an authenticated user-managed source.
Implement the `LLMProvider` protocol if you need to integrate something else. See LLM provider configuration.
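To make the abstraction concrete, here is a sketch of what a custom provider could look like. The `LLMProvider` interface, the message shape, and the streaming signature shown here are assumptions for illustration — the SDK’s real protocol may differ; the toy provider simply streams the last user message back word by word.

```typescript
// Hypothetical shape of the SDK's provider abstraction. The real
// interface in the Metabind SDKs may differ; illustration only.
interface LLMMessage {
  role: "user" | "assistant";
  content: string;
}

interface LLMProvider {
  // Yields response text deltas so the SDK can stream them into the UI.
  complete(messages: LLMMessage[]): AsyncIterable<string>;
}

// Toy provider that "streams" the last user message back word by word.
class EchoProvider implements LLMProvider {
  async *complete(messages: LLMMessage[]): AsyncIterable<string> {
    const last = messages[messages.length - 1];
    for (const word of last.content.split(" ")) {
      yield word + " ";
    }
  }
}

// Drain a provider's stream into a single string (what a chat surface
// effectively does as deltas arrive).
async function collect(
  provider: LLMProvider,
  messages: LLMMessage[]
): Promise<string> {
  let out = "";
  for await (const delta of provider.complete(messages)) out += delta;
  return out.trimEnd();
}
```

A real provider would replace the generator body with calls to your backend or model endpoint, yielding deltas as they arrive.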
## What you bring
- A Metabind project with at least one published Type. The SDK connects to its production endpoint by default; you can override to draft for testing.
- A Metabind project token (for Agent proxy mode), or an Anthropic API key (for BYOK mode). Mint project tokens server-side so they aren’t hard-coded in the binary.
- A surface in your app to render the assistant — a screen, a sheet, a sidebar, an inline panel. The SDK ships default UI, but you can fully replace it.
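The advice above to mint project tokens server-side can be sketched as follows. The minting call, token format, and TTL are all assumptions — consult the actual project-token documentation for the real contract; only the pattern (authenticate the user, then return a short-lived token for the SDK session) is the point.

```typescript
// Sketch of server-side token minting. The token format and the
// minting call are hypothetical stand-ins, kept self-contained here;
// a real backend would make an authenticated HTTPS call to Metabind.
interface ProjectToken {
  token: string;
  expiresAt: number; // epoch milliseconds
}

// Stand-in for a call to Metabind's token-minting API.
async function mintFromMetabind(
  userId: string,
  ttlMs: number
): Promise<ProjectToken> {
  return {
    token: `mb_${userId}_${Math.random().toString(36).slice(2)}`,
    expiresAt: Date.now() + ttlMs,
  };
}

// Handler your app backend exposes: authenticate the user first, then
// mint a short-lived token the client hands to the SDK for the session.
async function handleTokenRequest(
  authedUserId: string
): Promise<ProjectToken> {
  const ONE_HOUR = 60 * 60 * 1000;
  return mintFromMetabind(authedUserId, ONE_HOUR);
}
```

Because the token is minted per user/session with a short TTL, nothing secret ships in the binary, and revocation is a matter of refusing to mint.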
## What you don’t write
- Tool-calling logic. In Agent proxy mode the proxy runs the loop; in BYOK mode the SDK does. Either way, you don’t write it.
- Streaming. SSE deltas flow into the conversation state and re-render the UI.
- Native rendering. Interactive Tool output renders through BindJS without you wiring it up.
- Schema validation. Inputs and outputs are validated against the project’s tool schemas before they reach the renderer.
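The streaming point above can be made concrete. In Agent proxy mode, responses arrive as SSE; a minimal accumulator that folds text deltas into one assistant message might look like the following. The `data:` payload shape and the `[DONE]` sentinel are assumptions, not the proxy’s documented wire format.

```typescript
// Minimal SSE text-delta accumulator. The payload shape shown here is
// hypothetical; the Agent proxy's actual wire format may differ.
interface Delta {
  type: "text";
  text: string;
}

// Parse raw SSE lines and fold text deltas into one assistant message.
function accumulate(sseLines: string[]): string {
  let message = "";
  for (const line of sseLines) {
    if (!line.startsWith("data: ")) continue; // skip blanks/comments
    const payload = line.slice("data: ".length);
    if (payload === "[DONE]") break;
    const delta = JSON.parse(payload) as Delta;
    if (delta.type === "text") message += delta.text;
  }
  return message;
}
```

The SDK does this (and the re-render on each delta) for you; the sketch is only to show what “SSE deltas flow into the conversation state” means mechanically.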
## Token and key handling
| What | Where it lives |
|---|---|
| Metabind project token | Mint per user / session on your backend. The SDK holds it for the session and uses it to authenticate to the MCP server and (in proxy mode) to the Agent proxy. |
| LLM provider API key (Agent proxy mode) | Server-side only. Your client never sees it. |
| LLM provider API key (BYOK direct mode) | Delivered to the SDK by your auth flow; ideally short-lived. Don’t hard-code in production. |
## Conversation state

The SDK maintains conversation state in memory by default — useful for ephemeral chats. For persistence across launches or reloads, read `assistant.conversation.messages`, serialize to your platform’s storage, and rehydrate on next launch. For multi-device conversations (start on iPhone, continue on iPad), persist the conversation server-side and resume by ID.
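A persistence round-trip under this model might look like the sketch below. The `assistant.conversation.messages` accessor is from the text above; the message shape, storage key, and helper names are illustrative assumptions, with a `Map` standing in for your platform’s real storage (UserDefaults, SharedPreferences, localStorage, …).

```typescript
// Sketch of persisting conversation state across launches. The message
// shape is an assumption; substitute your platform's real types/storage.
interface Message {
  role: "user" | "assistant";
  content: string;
}

// Stand-in for platform storage, keeping the sketch self-contained.
const storage = new Map<string, string>();

// On backgrounding/unload: serialize the messages you read from
// assistant.conversation.messages.
function persist(conversationId: string, messages: Message[]): void {
  storage.set(`conv:${conversationId}`, JSON.stringify(messages));
}

// On next launch: rehydrate, then hand the messages back to the SDK.
function rehydrate(conversationId: string): Message[] {
  const raw = storage.get(`conv:${conversationId}`);
  return raw ? (JSON.parse(raw) as Message[]) : [];
}
```

For the multi-device case, the same serialize/rehydrate pair would target your server keyed by conversation ID instead of local storage.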
## Native rendering details

When a Metabind Interactive Tool returns, its UI resource is handed to the SDK’s renderer:
- iOS: rendered as SwiftUI views, embedded inside your `View` hierarchy.
- Android: rendered as Composables, embedded inside your `@Composable` tree.
- Web: rendered as React components, embedded inside your React tree.
## What ships in the SDK

- Default chat surface (`MetabindAssistantView`) with sender and assistant messages.
- Native rendering of Interactive Tool output via BindJS.
- `MetabindAgentProvider` (proxy) and `AnthropicProvider` (BYOK direct) on iOS, Android, and web.
- Custom `LLMProvider` implementations for anything else.
## Related

- iOS SDK — Swift Package: install, configure, embed.
- Android SDK — Maven: install, configure, embed.
- LLM provider configuration — Agent proxy vs. BYOK; key custody.
- Custom host UI — drive your own chat surface with the lower-level API.