The iOS Assistant SDK is a Swift Package that ships an MCP client, a conversation manager, an LLM provider abstraction, and the native SwiftUI renderer for BindJS. Drop it into your app, configure how it should reach an LLM, and you have a fully functional in-app AI assistant that calls your Metabind tools and renders results as native SwiftUI.
## Requirements
- iOS 16+
- Xcode 15+
- Swift 5.9+
- A Metabind project with at least one published Type
- Either: a Metabind project token (for the managed Agent proxy), or an Anthropic API key (BYOK direct mode)
## Install via Swift Package Manager
In Xcode, choose File → Add Package Dependencies and paste the package repository URL, or declare the dependency directly in Package.swift. Then add MetabindAssistant to your target’s dependencies.
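A minimal Package.swift sketch of the manifest route. The repository URL below is a placeholder, not the package's confirmed location; use the URL from your Metabind dashboard.

```swift
// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v16)],
    dependencies: [
        // Placeholder URL; replace with the repository URL from your Metabind dashboard.
        .package(url: "https://github.com/metabind/metabind-assistant-ios", from: "1.0.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                // Expose the MetabindAssistant library to your app target.
                .product(name: "MetabindAssistant", package: "metabind-assistant-ios"),
            ]
        ),
    ]
)
```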
## Two ways to configure the LLM
The SDK accepts an LLMProvider. Two implementations ship today:
| Provider | Use when |
|---|---|
| MetabindAgentProvider | You want Metabind to manage LLM access. Calls the Agent proxy at agent.metabind.ai. The proxy holds the LLM key, runs the tool loop server-side, and streams responses back as SSE. Recommended for production. |
| AnthropicProvider | You want to bring your own Anthropic API key (BYOK direct mode) for development or testing. The SDK calls Anthropic directly. |
Implement the LLMProvider protocol yourself if you need to integrate something else.
## Configure with the Agent proxy (recommended)
- Authenticates with a Metabind project token (Bearer header).
- Runs the LLM call and the tool loop on the server side. Tool results are streamed back to the client as Server-Sent Events.
- Handles secret-bearing LLM keys server-side, so your iOS binary doesn’t ship any third-party provider keys.
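A configuration sketch under stated assumptions: the initializer labels (projectURL:, token:) and the project path are illustrative, not confirmed SDK API. In production the token should come from your backend (see Authentication patterns below).

```swift
import Foundation
import MetabindAssistant

// Placeholder token; mint this on your backend and hand it to the app at runtime.
let projectToken = "<token-minted-by-your-backend>"

// The Agent proxy host comes from the docs; the path and labels are assumptions.
let provider = MetabindAgentProvider(
    projectURL: URL(string: "https://agent.metabind.ai/v1/projects/my-project")!,
    token: projectToken  // sent to the Agent proxy as a Bearer header
)
let assistant = MetabindAssistant(provider: provider)
```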
## Configure with direct BYOK mode
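A development-only sketch; the apiKey: initializer label is an assumption about the SDK's API. Never ship a third-party provider key inside a production binary.

```swift
import MetabindAssistant

// BYOK direct mode: the SDK calls Anthropic with your own key (dev/testing only).
let provider = AnthropicProvider(apiKey: "sk-ant-...")
let assistant = MetabindAssistant(provider: provider)
```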
## Drop in the chat surface
The SDK ships a default chat UI as a SwiftUI view. MetabindAssistantView renders the conversation UI, handles input, streams responses, and renders Interactive Tool output as native SwiftUI views inline. It respects your app’s color scheme, dynamic type, and accessibility settings.
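A minimal host screen sketch. It assumes MetabindAssistantView is initialized with the assistant instance; the exact initializer label is illustrative.

```swift
import SwiftUI
import MetabindAssistant

struct AssistantScreen: View {
    let assistant: MetabindAssistant

    var body: some View {
        NavigationStack {
            // The default chat surface: input, streaming, and tool rendering included.
            MetabindAssistantView(assistant: assistant)
                .navigationTitle("Assistant")
        }
    }
}
```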
For a fully custom UI, use the lower-level API on MetabindAssistant — assistant.send(...), assistant.conversation, assistant.cancel(), and assistant.isProcessing. See Custom host UI.
## Native rendering of tool output
When the assistant calls an Interactive Tool, the result renders as native SwiftUI inline.
## Conversation state
Conversation history is held in memory by the Conversation observable on the assistant. You can read assistant.conversation.messages directly to render your own UI or to feed the conversation into your app’s state. If your app needs persistence across launches, serialize the messages array (e.g., to Core Data, SwiftData, or your backend) and rehydrate on next launch.
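A persistence sketch that writes the messages array to a JSON file. It assumes the message type conforms to Codable; verify that before relying on it.

```swift
import Foundation
import MetabindAssistant

// Serialize the in-memory conversation so it can be rehydrated on next launch.
func saveConversation(from assistant: MetabindAssistant) throws {
    let data = try JSONEncoder().encode(assistant.conversation.messages)
    let url = URL.documentsDirectory.appending(path: "conversation.json")
    try data.write(to: url, options: .atomic)
}
```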
## Mock mode for previews
The SDK includes MockMCPServer and helpers for SwiftUI previews so you can iterate on the chat UI without wiring up a live server:
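Hypothetical preview wiring: the SDK's preview helpers may expose a different entry point, so treat every name below except MockMCPServer as a placeholder.

```swift
import SwiftUI
import MetabindAssistant

#Preview {
    MetabindAssistantView(
        assistant: MetabindAssistant(
            provider: MockLLMProvider(),  // placeholder canned-response provider
            server: MockMCPServer()       // in-memory MCP server from the SDK
        )
    )
}
```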
## Authentication patterns
- Backend mints the token. Your backend authenticates the user, mints a Metabind project token, and hands it to the iOS app; the SDK passes it to the Agent proxy (see the sketch after this list). Most secure; recommended for production.
- Static token (dev only). Hardcoded in the app. Don’t ship to production.
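A sketch of the backend-minted flow. The /assistant-token route and response shape belong to your own backend; they are illustrative, not Metabind API.

```swift
import Foundation

// Your backend's response wrapper for a freshly minted project token.
struct AssistantTokenResponse: Decodable {
    let token: String
}

// Exchange the user's session for a Metabind project token via your backend.
func fetchProjectToken(userSessionToken: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.example.com/assistant-token")!)
    request.setValue("Bearer \(userSessionToken)", forHTTPHeaderField: "Authorization")
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(AssistantTokenResponse.self, from: data).token
}
```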
## Performance
- The SDK ships compiled Swift; no JavaScript runtime on iOS.
- BindJS layouts compile to SwiftUI views at render time; subsequent renders reuse cached compilation.
- Streaming SSE events from the Agent proxy render as they arrive.
## What you write vs. what the SDK does
| You write | SDK does |
|---|---|
| Project URL + token from your backend | Connection, auth, retries |
| LLM provider config (Agent proxy vs. BYOK) | Tool calls, schema validation, conversation state |
| MetabindAssistantView placement (or custom UI) | Streaming, message rendering, tool result rendering |
## Related
- Assistant SDK overview: when to embed vs. connect to an external host (conceptual).
- LLM provider configuration: Agent proxy vs. BYOK; key custody.
- Custom host UI: replace the default chat UI with your own.
- Android SDK: the Android equivalent.