
Documentation Index

Fetch the complete documentation index at: https://docs.metabind.ai/llms.txt

Use this file to discover all available pages before exploring further.

The Assistant SDK is the embed path for Metabind. It packages an MCP-aware AI client, conversation state, and the BindJS native renderer into a single SDK you drop into your app. Configure how it should reach an LLM, give it a host surface, and the SDK handles the rest.

What the SDK gives you

The Assistant SDK sits between your app's UI and your Metabind project's tools. It packages an MCP client, an LLM provider, a conversation manager, and a native renderer.
The result: your users get a chat-like assistant that can call your Metabind tools and render the results in true native components — no web view, no rebuild for layout changes, no release cycle when you add a tool.

Use the Assistant SDK when

  • You’re building an iOS, Android, or web product and want an in-product AI assistant.
  • You want to ship MCP tools that you built (in Metabind) without exposing a third-party MCP host to your users.
  • You want native UI rendering of tool results — SwiftUI, Compose, React — not WebViews.
  • You want Metabind to manage LLM key custody and the tool loop server-side (the default).

Use a connected MCP host instead when

  • Your users already work in Claude Desktop or ChatGPT, and you want to extend those surfaces.
  • You don’t want to ship a UI yourself — connecting to an existing host is faster.
  • The use case is developer-facing rather than end-user-facing.
A project’s tools work in both modes simultaneously. You can ship an Assistant SDK in your app and let power users also connect Claude Desktop to the same project.

Three platforms, one model

  • iOS: metabind-ai-apple (Swift Package); renderer: SwiftUI via bindjs-apple
  • Android: metabind-ai-android (Maven); renderer: Jetpack Compose via bindjs-android
  • React (web): @metabind/assistant-sdk (npm); renderer: React via @bindjs/renderer
Each SDK exposes the same shape — a MetabindAssistant configured with a server and an LLMProvider, plus a default MetabindAssistantView chat surface — adapted to the platform’s idioms. The web release is React-only today; non-React frameworks can use @bindjs/renderer directly.
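Across all three platforms, configuration reduces to the same two inputs: a server and a provider. A minimal TypeScript sketch of that shared shape; the constructor signature, field names, and URL below are assumptions for illustration, not the published API:

```typescript
// Hypothetical sketch of the shape every platform SDK exposes.
// Field and type names are assumptions; check the platform guides
// for the real signatures.
interface LLMProvider {
  name: string;
}

interface AssistantConfig {
  serverUrl: string;      // your Metabind project's MCP endpoint
  provider: LLMProvider;  // e.g. MetabindAgentProvider or AnthropicProvider
}

class MetabindAssistant {
  constructor(readonly config: AssistantConfig) {}
}

const assistant = new MetabindAssistant({
  serverUrl: "https://example.metabind.ai/mcp", // placeholder URL
  provider: { name: "MetabindAgentProvider" },
});
```

The platform SDKs adapt this shape to their idioms (Swift initializers, Kotlin builders, React props), but the two inputs stay the same.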

Two ways to reach an LLM

The SDK is provider-pluggable through an LLMProvider abstraction. Two implementations ship:
  • MetabindAgentProvider — calls Metabind’s hosted Agent proxy at agent.metabind.ai. The proxy authenticates with a Metabind project token, holds the LLM key, runs the tool loop server-side, and streams responses back as SSE. The proxy supports Anthropic, OpenAI, and Google — selected per-project in MCP App Studio. Recommended for production.
  • AnthropicProvider — bring-your-own-key (BYOK) Anthropic, called directly from the client. Useful for development, internal tools, or apps where the key reaches the SDK from an authenticated user-managed source.
Custom providers can implement the public LLMProvider protocol if you need to integrate something else. See LLM provider configuration.
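If neither shipped provider fits, the protocol is small enough to sketch. The interface and method signature below are assumptions, not the SDK's actual protocol; the toy implementation streams deltas from a local echo in place of a real model call:

```typescript
// Assumed message and provider shapes for illustration only.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

interface LLMProvider {
  // Stream assistant text deltas for a conversation.
  stream(messages: ChatMessage[]): AsyncIterable<string>;
}

// A toy provider that echoes the last user message word by word,
// showing where a custom integration would make real network calls.
class EchoProvider implements LLMProvider {
  async *stream(messages: ChatMessage[]): AsyncIterable<string> {
    const last = messages[messages.length - 1];
    for (const word of last.content.split(" ")) {
      yield word + " ";
    }
  }
}

// Drain a provider's stream into a single string.
async function collect(provider: LLMProvider, messages: ChatMessage[]) {
  let out = "";
  for await (const delta of provider.stream(messages)) {
    out += delta;
  }
  return out.trim();
}
```

A real custom provider would replace the echo loop with calls to whatever backend you need, while the SDK's conversation state and renderer stay unchanged.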

What you bring

  • A Metabind project with at least one published Type. The SDK connects to its production endpoint by default; you can override to draft for testing.
  • A Metabind project token (for Agent proxy mode), or an Anthropic API key (for BYOK mode). Mint project tokens server-side so they aren’t hard-coded in the binary.
  • A surface in your app to render the assistant — a screen, a sheet, a sidebar, an inline panel. The SDK ships default UI, but you can fully replace it.

What you don’t write

  • Tool calling logic. In Agent proxy mode the proxy runs the loop; in BYOK mode the SDK does. Either way, you don't write it.
  • Streaming. SSE deltas flow into the conversation state and re-render the UI.
  • Native rendering. Interactive Tool output renders through BindJS without you wiring it up.
  • Schema validation. Inputs and outputs are validated against the project’s tool schemas before they reach the renderer.
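You don't have to write the streaming step, but it helps to see the mechanism. A sketch of folding SSE data lines into an accumulating assistant message; the `{"delta": ...}` payload shape and the `[DONE]` sentinel are assumptions for illustration, not the proxy's documented wire format:

```typescript
// Fold one SSE chunk into the accumulated assistant message text.
// Assumes each data line carries JSON like {"delta":"..."} and that
// the stream ends with a "[DONE]" sentinel (both assumptions).
function applySseChunk(current: string, chunk: string): string {
  let text = current;
  for (const line of chunk.split("\n")) {
    if (!line.startsWith("data: ")) continue; // skip blanks and comments
    const payload = line.slice("data: ".length);
    if (payload === "[DONE]") continue;       // end-of-stream sentinel
    text += JSON.parse(payload).delta ?? "";
  }
  return text;
}
```

In the SDK this fold happens inside the conversation manager, which is why each delta triggers a UI re-render without any wiring on your side.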

Token and key handling

  • Metabind project token: mint per user / session on your backend. The SDK holds it for the session and uses it to authenticate to the MCP server and (in proxy mode) to the Agent proxy.
  • LLM provider API key (Agent proxy mode): lives server-side only. Your client never sees it.
  • LLM provider API key (BYOK direct mode): delivered to the SDK by your auth flow; ideally short-lived. Don't hard-code it in production.
For production, prefer the Agent proxy — your binary ships only a Metabind project token, and there is no third-party LLM key to leak.
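The custody rule above can be enforced with a small client-side refresher: the client never stores a long-lived secret, only a short-lived project token it re-mints from your backend. The token shape, expiry scheme, and helper names here are hypothetical:

```typescript
// Hypothetical short-lived token shape; adapt to what your backend mints.
type ProjectToken = { value: string; expiresAt: number };

function isExpired(token: ProjectToken, now = Date.now()): boolean {
  return now >= token.expiresAt;
}

// Reuse the current token while it is live; otherwise mint a fresh one
// from your backend. A small clock skew keeps us from presenting a
// token that expires mid-request.
async function ensureToken(
  current: ProjectToken | null,
  mint: () => Promise<ProjectToken>,
  skewMs = 30_000
): Promise<ProjectToken> {
  if (current && !isExpired(current, Date.now() + skewMs)) return current;
  return mint();
}
```

Here `mint` would call your authenticated backend endpoint; the endpoint itself is where the Metabind project token is actually created, so no signing material ships in the binary.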

Conversation state

The SDK maintains conversation state in memory by default — useful for ephemeral chats. For persistence across launches or reloads, read assistant.conversation.messages, serialize to your platform’s storage, and rehydrate on next launch. For multi-device conversations (start on iPhone, continue on iPad), persist the conversation server-side and resume by ID.
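A minimal sketch of the serialize-and-rehydrate step, with a Map standing in for localStorage / UserDefaults / SharedPreferences. The message shape is an assumption; adapt the fields to what assistant.conversation.messages actually exposes:

```typescript
// Assumed stored-message shape for illustration.
type StoredMessage = { role: string; content: string; at: number };

const KEY = "metabind.conversation";

// In-memory shim standing in for your platform's persistent storage.
const storage = new Map<string, string>();

// Serialize the conversation on app background / page unload.
function saveConversation(messages: StoredMessage[]): void {
  storage.set(KEY, JSON.stringify(messages));
}

// Rehydrate on next launch; empty conversation if nothing was saved.
function loadConversation(): StoredMessage[] {
  const raw = storage.get(KEY);
  return raw ? (JSON.parse(raw) as StoredMessage[]) : [];
}
```

For the multi-device case, the same serialization would go to your server keyed by a conversation ID instead of local storage.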

Native rendering details

When a Metabind Interactive Tool returns, its UI resource is handed to the SDK’s renderer:
  • iOS: Rendered as SwiftUI views. Embedded inside your View hierarchy.
  • Android: Rendered as Composables. Embedded inside your @Composable tree.
  • Web: Rendered as React components. Embedded inside your React tree.
The renderer respects your app’s typography, color scheme, and accessibility settings — it’s a native view rendered by your platform’s UI runtime, not an iframe or WebView.

What ships in the SDK

  • Default chat surface (MetabindAssistantView) with user and assistant messages.
  • Native rendering of Interactive Tool output via BindJS.
  • MetabindAgentProvider (proxy) and AnthropicProvider (BYOK direct) on iOS, Android, and web.
  • The public LLMProvider protocol, so you can implement a custom provider for anything else.

iOS SDK

Swift Package — install, configure, embed.

Android SDK

Maven — install, configure, embed.

LLM provider configuration

Agent proxy vs. BYOK; key custody.

Custom host UI

Drive your own chat surface with the lower-level API.