
The Android Assistant SDK is a Maven library that ships an MCP client, conversation manager, an LLM provider abstraction, and the native Jetpack Compose renderer for BindJS. Drop it in, configure how it should reach an LLM, and you have a fully functional in-app AI assistant that calls your Metabind tools and renders results as native Compose.
The Android Assistant SDK is at parity with iOS in design but lags slightly in shipped code. The API shapes shown below mirror the iOS SDK; check the package’s release notes for the latest available symbols.

Requirements

  • Android 8.0 (API 26)+
  • Kotlin 1.9+
  • Compose Compiler 1.5+
  • A Metabind project with at least one published Type
  • Either: a Metabind project token (for the managed Agent proxy), or an Anthropic API key (BYOK direct mode)

Install via Maven

In your module’s build.gradle.kts:
dependencies {
  implementation("ai.metabind:assistant:1.0.0")
}
Or in build.gradle:
dependencies {
  implementation 'ai.metabind:assistant:1.0.0'
}
The SDK pulls in bindjs-android (the Compose renderer) as a transitive dependency.
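If your project uses a Gradle version catalog, the same coordinate can be declared there instead (the alias below is just an illustrative name):

```toml
# gradle/libs.versions.toml (alias name is illustrative)
[libraries]
metabind-assistant = { module = "ai.metabind:assistant", version = "1.0.0" }
```

Then depend on it with implementation(libs.metabind.assistant) in the module’s build script.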

Two ways to configure the LLM

The SDK accepts an LLMProvider. Two implementations ship:
  • MetabindAgentProvider: Use when you want Metabind to manage LLM access. Calls the Agent proxy at agent.metabind.ai. The proxy holds the LLM key, runs the tool loop server-side, and streams responses back as SSE. Recommended for production.
  • AnthropicProvider: Use when you want to bring your own Anthropic API key (BYOK direct mode) for development or testing. The SDK calls Anthropic directly.
Custom providers can implement the public LLMProvider interface.

Configure with the Agent proxy

import ai.metabind.assistant.MetabindAssistant
import ai.metabind.assistant.MCPServerConfig
import ai.metabind.assistant.llm.MetabindAgentProvider

val assistant = MetabindAssistant(
  server = MCPServerConfig(
    serverURL = "https://mcp.metabind.ai/my-org/oak-and-ivory",
    serverHeaders = mapOf("Authorization" to "Bearer $projectToken")
  ),
  llmProvider = MetabindAgentProvider(apiKey = projectToken)
)
The Agent proxy:
  • Authenticates with a Metabind project token (Bearer header).
  • Runs the LLM call and the tool loop on the server side. Tool results are streamed back to the client as Server-Sent Events.
  • Handles secret-bearing LLM keys server-side, so your Android binary doesn’t ship any third-party provider keys.
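For intuition about the transport, a Server-Sent Events stream is a sequence of data: lines separated by blank lines. A minimal, self-contained sketch of that grouping (illustrative only; the SDK parses the stream for you, and the payload shape shown is a made-up example):

```kotlin
// Illustrative only: groups the "data:" lines of an SSE stream into events.
// The SDK does this internally; the JSON payload below is a made-up shape.
fun parseSseEvents(raw: String): List<String> =
    raw.split("\n\n")                      // events are blank-line separated
        .map { event ->
            event.lines()
                .filter { it.startsWith("data:") }
                .joinToString("\n") { it.removePrefix("data:").trim() }
        }
        .filter { it.isNotEmpty() }

fun main() {
    val stream = "data: {\"delta\":\"Hel\"}\n\ndata: {\"delta\":\"lo\"}\n\n"
    println(parseSseEvents(stream)) // two events: {"delta":"Hel"} and {"delta":"lo"}
}
```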

Configure with direct BYOK mode

import ai.metabind.assistant.MetabindAssistant
import ai.metabind.assistant.MCPServerConfig
import ai.metabind.assistant.llm.AnthropicProvider

val assistant = MetabindAssistant(
  server = MCPServerConfig(
    serverURL = "https://mcp.metabind.ai/my-org/oak-and-ivory",
    serverHeaders = mapOf("Authorization" to "Bearer $projectToken")
  ),
  llmProvider = AnthropicProvider(
    apiKey = anthropicKey,
    model = "claude-sonnet-4-6"
  )
)
Do not ship a real Anthropic API key in a production Android app — anyone can extract it from the APK. BYOK mode is for development, internal tools, or apps where the key reaches the SDK from an authenticated user-managed source. For production use, prefer the Agent proxy.

Drop in the chat surface

import androidx.compose.runtime.Composable
import ai.metabind.assistant.ui.MetabindAssistantView

@Composable
fun AssistantScreen(assistant: MetabindAssistant) {
  MetabindAssistantView(assistant = assistant)
}
MetabindAssistantView is a Compose surface that handles input, streaming, and tool output rendering. It respects your app’s Material theme, color scheme, and content padding. For a fully custom UI, use the lower-level API on MetabindAssistant: assistant.send(...), assistant.conversation, assistant.cancel(), and assistant.isProcessing. See Custom host UI.

Native rendering of tool output

When the assistant calls an Interactive Tool, the result renders as Composables inside the conversation:
In a chat turn, a Metabind ProductCard renders inline as a real Compose composable (image, title, price, and shop button), not a WebView.
The card animates with your app’s transitions, respects Material You theming, and supports TalkBack. The runtime does use a JavaScript engine under the hood: the BindJS compiler translates the layout spec to Compose at render time. The combined footprint of the SDK plus JS engine is around 12 MB.

Conversation state

Conversation state is exposed as observable Kotlin Flows on the assistant. You can read assistant.conversation.messages directly to render your own UI or to feed the conversation into your app’s state. If your app needs persistence across launches, serialize the messages list (e.g., to Room, DataStore, or your backend) and rehydrate it on the next launch.
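A minimal, self-contained round-trip sketch of that serialization idea. The Message shape here is a stand-in; the SDK’s real message type may differ, and production code should use a real serializer (e.g., kotlinx.serialization) with Room or DataStore:

```kotlin
// Stand-in message shape for the sketch; map the SDK's real fields the same way.
data class Message(val role: String, val text: String)

// Encode one message per line, fields separated by a control character.
// Good enough for a sketch; use a real serializer in production.
fun encode(messages: List<Message>): String =
    messages.joinToString("\n") { "${it.role}\u0001${it.text}" }

fun decode(blob: String): List<Message> =
    blob.lineSequence()
        .filter { it.isNotEmpty() }
        .map { line ->
            val (role, text) = line.split('\u0001', limit = 2)
            Message(role, text)
        }
        .toList()
```

On launch, read the stored blob, decode it, and seed your UI state before the assistant reconnects.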

Coroutines and lifecycle

All SDK calls are suspending functions. Conversation state and isProcessing are StateFlows you can collect from LaunchedEffect or collectAsState.
val messages by assistant.conversation.messages.collectAsState()
val isProcessing by assistant.isProcessing.collectAsState()
The default MetabindAssistantView does this for you; the lower-level API is for fully custom UIs.
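Putting those pieces together, a custom host UI might look like the following sketch. It uses only the symbols named above (send, conversation.messages, isProcessing); exact signatures may differ from the shipped package, and send’s argument here is an assumption:

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.*
import kotlinx.coroutines.launch
import ai.metabind.assistant.MetabindAssistant

// Sketch of a fully custom host UI on the lower-level API.
// send(...)'s exact signature is an assumption; check the release notes.
@Composable
fun MiniAssistantScreen(assistant: MetabindAssistant) {
    val messages by assistant.conversation.messages.collectAsState()
    val isProcessing by assistant.isProcessing.collectAsState()
    val scope = rememberCoroutineScope()

    Column {
        messages.forEach { message ->
            Text(text = message.toString()) // render each turn your own way
        }
        Button(
            enabled = !isProcessing,
            onClick = { scope.launch { assistant.send("Show me new arrivals") } }
        ) {
            Text(if (isProcessing) "Thinking…" else "Send")
        }
    }
}
```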

Authentication patterns

  • Backend mints token. Your backend authenticates the user, mints a Metabind project token, and hands it to the Android app. The SDK passes it to the Agent proxy. This is the most secure option and the recommended pattern for production.
  • Static token (dev only). Hardcoded in the app. Don’t ship to production.

ProGuard / R8

The SDK ships with consumer ProGuard rules, so typical setups need no app-side configuration. If you are shrinking aggressively and tool calls fail with serialization errors, verify that the consumer rules were applied by inspecting app/build/outputs/mapping/.
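If the consumer rules are not being applied, a defensive app-side fallback might look like this sketch (an assumption for illustration, not the SDK’s official rule set; prefer fixing rule propagation):

```
# proguard-rules.pro: fallback sketch only, not official SDK rules
-keep class ai.metabind.** { *; }
-keepattributes Signature, *Annotation*
```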

Performance

  • Compose composition reuses cached layouts across renders.
  • Streaming SSE events from the Agent proxy render as they arrive.
  • The JavaScript engine starts on first use (~50 ms cold start) and stays warm for the session.
  • Memory footprint of SDK + engine: roughly 12 MB.

What you write vs. what the SDK does

  • You write: the project URL and token from your backend. The SDK handles: connection, auth, and retries.
  • You write: the LLM provider config (Agent proxy vs. BYOK). The SDK handles: tool calls, schema validation, and conversation state.
  • You write: the MetabindAssistantView placement (or a custom UI). The SDK handles: streaming, message rendering, and tool result rendering.

Assistant SDK overview

Conceptual: when to embed vs. connect to an external host.

LLM provider configuration

Agent proxy vs. BYOK; key custody.

Custom host UI

Replace the default chat UI with your own.

iOS SDK

The iOS equivalent.