

The iOS Assistant SDK is a Swift Package that ships an MCP client, a conversation manager, an LLM provider abstraction, and the native SwiftUI renderer for BindJS. Drop it into your app, configure how it should reach an LLM, and you have a fully functional in-app AI assistant that calls your Metabind tools and renders results as native SwiftUI.

Requirements

  • iOS 16+
  • Xcode 15+
  • Swift 5.9+
  • A Metabind project with at least one published Type
  • Either: a Metabind project token (for the managed Agent proxy), or an Anthropic API key (BYOK direct mode)

Install via Swift Package Manager

In Xcode: File → Add Package Dependencies → paste:
https://github.com/metabindai/metabind-ai-apple
Or in Package.swift:
dependencies: [
  .package(url: "https://github.com/metabindai/metabind-ai-apple", from: "1.0.0")
]
Add MetabindAssistant to your target’s dependencies.
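To wire the package into a target in Package.swift, declare the product dependency. A minimal sketch, assuming the package exposes a product named MetabindAssistant (check the package manifest if the product name differs):

```swift
// In Package.swift, alongside the package dependency above.
// Assumes the package exposes a product named "MetabindAssistant".
targets: [
  .target(
    name: "MyApp",
    dependencies: [
      .product(name: "MetabindAssistant", package: "metabind-ai-apple")
    ]
  )
]
```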

Two ways to configure the LLM

The SDK accepts an LLMProvider. Two implementations ship today:
  • MetabindAgentProvider: use when you want Metabind to manage LLM access. Calls the Agent proxy at agent.metabind.ai; the proxy holds the LLM key, runs the tool loop server-side, and streams responses back as SSE. Recommended for production.
  • AnthropicProvider: use when you want to bring your own Anthropic API key (BYOK direct mode) for development or testing. The SDK calls Anthropic directly.
Custom providers can conform to the public LLMProvider protocol if you need to integrate something else.
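A custom provider might, for example, forward the conversation to your own LLM gateway. A hypothetical sketch: the real LLMProvider requirements ship with the SDK, so the method shape below (a single streaming send) is an illustrative assumption, not the SDK's actual API:

```swift
import Foundation

// Hypothetical sketch of a custom provider. The real `LLMProvider`
// requirements are defined by the SDK; the `send` shape below is assumed.
struct MyGatewayProvider /*: LLMProvider */ {
  let endpoint: URL
  let apiKey: String

  // Forward the conversation to your own LLM gateway and stream
  // response lines back as they arrive.
  func send(messages: [[String: String]]) async throws -> AsyncThrowingStream<String, Error> {
    var request = URLRequest(url: endpoint)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(["messages": messages])

    let (bytes, _) = try await URLSession.shared.bytes(for: request)
    return AsyncThrowingStream { continuation in
      Task {
        do {
          for try await line in bytes.lines { continuation.yield(line) }
          continuation.finish()
        } catch {
          continuation.finish(throwing: error)
        }
      }
    }
  }
}
```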
Configure with the Agent proxy

import MetabindAssistant

let assistant = MetabindAssistant(
  server: MCPServerConfig(
    serverURL: URL(string: "https://mcp.metabind.ai/my-org/oak-and-ivory")!,
    serverHeaders: ["Authorization": "Bearer \(projectToken)"]
  ),
  llmProvider: MetabindAgentProvider(
    apiKey: projectToken
  )
)
The Agent proxy:
  • Authenticates with a Metabind project token (Bearer header).
  • Runs the LLM call and the tool loop on the server side. Tool results are streamed back to the client as Server-Sent Events.
  • Handles secret-bearing LLM keys server-side, so your iOS binary doesn’t ship any third-party provider keys.

Configure with direct BYOK mode

import MetabindAssistant

let assistant = MetabindAssistant(
  server: MCPServerConfig(
    serverURL: URL(string: "https://mcp.metabind.ai/my-org/oak-and-ivory")!,
    serverHeaders: ["Authorization": "Bearer \(projectToken)"]
  ),
  llmProvider: AnthropicProvider(
    apiKey: anthropicKey,
    model: "claude-sonnet-4-6"
  )
)
Do not ship a real Anthropic API key in a production iOS app — anyone can extract it from the binary. BYOK mode is for development, internal tools, or apps where the key reaches the SDK from an authenticated user-managed source. For production use, prefer the Agent proxy.

Drop in the chat surface

The SDK ships a default chat UI as a SwiftUI view:
import SwiftUI
import MetabindAssistant

struct AssistantScreen: View {
  let assistant: MetabindAssistant

  var body: some View {
    MetabindAssistantView(assistant: assistant)
      .navigationTitle("Assistant")
  }
}
MetabindAssistantView renders the conversation UI, handles input, streams responses, and renders Interactive Tool output inline as native SwiftUI views. It respects your app's color scheme, Dynamic Type, and accessibility settings. For a fully custom UI, use the lower-level API on MetabindAssistant: assistant.send(...), assistant.conversation, assistant.cancel(), and assistant.isProcessing. See Custom host UI.
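A minimal custom chat surface built on those lower-level calls might look like the following sketch. It uses only the API members named above (send, conversation, cancel, isProcessing); the message properties (id, text) are assumptions about the SDK's message model, so adjust them to the actual types:

```swift
import SwiftUI
import MetabindAssistant

// Sketch of a custom host UI. Assumes `conversation.messages` elements
// expose `id` and `text` properties; adapt to the SDK's real message model.
struct CustomAssistantScreen: View {
  let assistant: MetabindAssistant
  @State private var draft = ""

  var body: some View {
    VStack {
      // Render the conversation history directly from the assistant.
      List(assistant.conversation.messages, id: \.id) { message in
        Text(message.text)
      }
      HStack {
        TextField("Ask something…", text: $draft)
        if assistant.isProcessing {
          // Allow the user to interrupt an in-flight response.
          Button("Stop") { assistant.cancel() }
        } else {
          Button("Send") {
            let text = draft
            draft = ""
            Task { try await assistant.send(text) }
          }
        }
      }
      .padding()
    }
  }
}
```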

Native rendering of tool output

When the assistant calls an Interactive Tool, the result renders inline as SwiftUI. In a chat turn, a Metabind ProductCard appears as a real native component with image, title, price, and shop button. Because the card is a SwiftUI view rather than a WebView, it animates with your app's transitions, respects Dynamic Type, and supports VoiceOver.

Conversation state

Conversation history is held in memory by the Conversation observable on the assistant. You can read assistant.conversation.messages directly to render your own UI or to feed the conversation into your app’s state. If your app needs persistence across launches, serialize the messages array (e.g., to Core Data, SwiftData, or your backend) and rehydrate on next launch.
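A persistence sketch using a JSON file in the app's Documents directory, assuming the SDK's message type conforms to Codable and is named Message (if either assumption is wrong, map the messages to your own Codable DTO first):

```swift
import Foundation
import MetabindAssistant

// Persistence sketch. Assumes the SDK's message type is named `Message`
// and conforms to Codable; otherwise, map to your own Codable DTO.
enum ConversationStore {
  static var fileURL: URL {
    FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
      .appendingPathComponent("assistant-conversation.json")
  }

  // Serialize the in-memory conversation to disk.
  static func save(_ assistant: MetabindAssistant) throws {
    let data = try JSONEncoder().encode(assistant.conversation.messages)
    try data.write(to: fileURL, options: .atomic)
  }

  // Rehydrate on next launch; returns an empty array if nothing was saved.
  static func load() throws -> [Message] {
    guard FileManager.default.fileExists(atPath: fileURL.path) else { return [] }
    let data = try Data(contentsOf: fileURL)
    return try JSONDecoder().decode([Message].self, from: data)
  }
}
```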

Mock mode for previews

The SDK includes MockMCPServer and helpers for SwiftUI previews so you can iterate on chat UI without wiring up a live server:
#Preview {
  MetabindAssistantView(assistant: MetabindAssistant.preview())
}

Authentication patterns

  • Backend mints token. Your backend authenticates the user, mints a Metabind project token, and hands it to the iOS app. The SDK passes it to the Agent proxy. Most secure; recommended for production.
  • Static token (dev only). Hardcoded in the app. Don’t ship to production.
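The backend-minted flow can be sketched as follows. The endpoint path (/metabind/token) and the response shape are assumptions about your own backend, not a Metabind API:

```swift
import Foundation
import MetabindAssistant

// Sketch of the backend-minted token pattern. The URL and response
// shape below describe a hypothetical app backend, not a Metabind API.
struct TokenResponse: Decodable { let token: String }

func makeAssistant(userSessionToken: String) async throws -> MetabindAssistant {
  // 1. Ask your authenticated backend for a short-lived project token.
  var request = URLRequest(url: URL(string: "https://api.example.com/metabind/token")!)
  request.setValue("Bearer \(userSessionToken)", forHTTPHeaderField: "Authorization")
  let (data, _) = try await URLSession.shared.data(for: request)
  let projectToken = try JSONDecoder().decode(TokenResponse.self, from: data).token

  // 2. Hand the token to the SDK for both the MCP server and the Agent proxy.
  return MetabindAssistant(
    server: MCPServerConfig(
      serverURL: URL(string: "https://mcp.metabind.ai/my-org/oak-and-ivory")!,
      serverHeaders: ["Authorization": "Bearer \(projectToken)"]
    ),
    llmProvider: MetabindAgentProvider(apiKey: projectToken)
  )
}
```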

Performance

  • The SDK ships compiled Swift; no JavaScript runtime on iOS.
  • BindJS layouts compile to SwiftUI views at render time; subsequent renders reuse cached compilation.
  • Streaming SSE events from the Agent proxy render as they arrive.

What you write vs. what the SDK does

  • You write: project URL + token wiring from your backend. The SDK handles: connection, auth, retries.
  • You write: LLM provider config (Agent proxy vs. BYOK). The SDK handles: tool calls, schema validation, conversation state.
  • You write: MetabindAssistantView placement (or custom UI). The SDK handles: streaming, message rendering, tool result rendering.

Related pages

  • Assistant SDK overview: conceptual guidance on when to embed vs. connect to an external host.
  • LLM provider configuration: Agent proxy vs. BYOK; key custody.
  • Custom host UI: replace the default chat UI with your own.
  • Android SDK: the Android equivalent.