The Android Assistant SDK is a Maven library that ships an MCP client, a conversation manager, an LLM provider abstraction, and the native Jetpack Compose renderer for BindJS. Drop it in, configure how it should reach an LLM, and you have a fully functional in-app AI assistant that calls your Metabind tools and renders results as native Compose.
Documentation Index
Fetch the complete documentation index at: https://docs.metabind.ai/llms.txt
Use this file to discover all available pages before exploring further.
The Android Assistant SDK is at parity with iOS in design but lags slightly in shipped code. The API shapes shown below mirror the iOS SDK; check the package’s release notes for the latest available symbols.
Requirements
- Android 8.0 (API 26)+
- Kotlin 1.9+
- Compose Compiler 1.5+
- A Metabind project with at least one published Type
- Either: a Metabind project token (for the managed Agent proxy), or an Anthropic API key (BYOK direct mode)
Install via Maven
In your module's build.gradle.kts, add the SDK dependency. It pulls in bindjs-android (the Compose renderer) as a transitive dependency, so you don't need to declare the renderer separately.
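A sketch of the declaration; the exact Maven coordinates and version are assumptions here, so confirm them against the package's release notes:

```kotlin
// build.gradle.kts -- coordinates are illustrative, not confirmed.
dependencies {
    implementation("ai.metabind:assistant-android:<latest-version>")
    // bindjs-android (the Compose renderer) arrives transitively;
    // no separate declaration is needed.
}
```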
Two ways to configure the LLM
The SDK accepts an LLMProvider. Two implementations ship:
| Provider | Use when |
|---|---|
| MetabindAgentProvider | You want Metabind to manage LLM access. Calls the Agent proxy at agent.metabind.ai. The proxy holds the LLM key, runs the tool loop server-side, and streams responses back as SSE. Recommended for production. |
| AnthropicProvider | You want to bring your own Anthropic API key (BYOK direct mode) for development or testing. The SDK calls Anthropic directly. |
Both implement the LLMProvider interface.
Configure with the Agent proxy (recommended)
- Authenticates with a Metabind project token (Bearer header).
- Runs the LLM call and the tool loop on the server side. Tool results are streamed back to the client as Server-Sent Events.
- Handles secret-bearing LLM keys server-side, so your Android binary doesn’t ship any third-party provider keys.
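A configuration sketch for the Agent proxy path. Per the parity note above, the Kotlin API shapes mirror the iOS SDK, so the constructor names and parameters shown here are assumptions, not confirmed symbols:

```kotlin
// Sketch only: symbol names mirror the iOS-shaped API and may differ
// in the shipped Android artifact -- check the package release notes.
val assistant = MetabindAssistant(
    projectUrl = "https://your-project.metabind.ai",  // your Metabind project
    provider = MetabindAgentProvider(
        projectToken = tokenFromYourBackend,  // minted server-side; see Authentication patterns
    ),
)
```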
Configure with direct BYOK mode
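A sketch of the BYOK path under the same assumed API shapes; AnthropicProvider is the shipped provider name, but the parameter shown is an assumption:

```kotlin
// Sketch only: for development and testing, point the assistant at
// Anthropic directly. Don't ship a third-party key in a production binary.
val assistant = MetabindAssistant(
    projectUrl = "https://your-project.metabind.ai",
    provider = AnthropicProvider(
        apiKey = BuildConfig.ANTHROPIC_API_KEY,  // injected at build time, dev builds only
    ),
)
```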
Drop in the chat surface
MetabindAssistantView is a Compose surface that handles input, streaming, and tool output rendering. It respects your app’s Material theme, color scheme, and content padding.
For a fully custom UI, use the lower-level API on MetabindAssistant — assistant.send(...), assistant.conversation, assistant.cancel(), and assistant.isProcessing. See Custom host UI.
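Placing the default surface might look like the following sketch; AssistantScreen is a hypothetical host composable, and the MetabindAssistantView parameters are assumed from the iOS-shaped API:

```kotlin
// Sketch: drop the default chat surface into any Compose screen.
@Composable
fun AssistantScreen(assistant: MetabindAssistant) {
    MetabindAssistantView(
        assistant = assistant,
        modifier = Modifier.fillMaxSize(),  // inherits your app's Material theme
    )
}
```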
Native rendering of tool output
When the assistant calls an Interactive Tool, the result renders as Composables inside the conversation.
Conversation state
Conversation state is exposed as an observable Kotlin Flow on the assistant. You can read assistant.conversation.messages directly to render your own UI or to feed the conversation into your app's state. If your app needs persistence across launches, serialize the messages list (e.g., to Room, DataStore, or your backend) and rehydrate on next launch.
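If you do persist conversations yourself, one minimal sketch is a line-based encoding you can store in a Room or DataStore string column. The Message shape and the format are app-side assumptions here, not SDK types:

```kotlin
// Hypothetical stand-in for the SDK's message type (assumed shape).
data class Message(val role: String, val content: String)

// Escape the two characters the format reserves (newline separates
// messages, tab separates fields), plus the escape character itself.
fun escape(s: String): String =
    s.replace("\\", "\\\\").replace("\n", "\\n").replace("\t", "\\t")

fun unescape(s: String): String = buildString {
    var i = 0
    while (i < s.length) {
        val c = s[i]
        if (c == '\\' && i + 1 < s.length) {
            when (s[i + 1]) { 'n' -> append('\n'); 't' -> append('\t'); else -> append(s[i + 1]) }
            i += 2
        } else { append(c); i++ }
    }
}

// One message per line: "role<TAB>content".
fun encodeMessages(messages: List<Message>): String =
    messages.joinToString("\n") { m -> "${escape(m.role)}\t${escape(m.content)}" }

fun decodeMessages(blob: String): List<Message> =
    if (blob.isEmpty()) emptyList()
    else blob.split("\n").map { line ->
        val (role, content) = line.split("\t", limit = 2)
        Message(unescape(role), unescape(content))
    }
```

The round-trip is lossless even when message content contains newlines, tabs, or backslashes, since those are escaped before the join.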
Coroutines and lifecycle
All SDK calls are suspending functions. Conversation state and isProcessing are StateFlows you can collect from LaunchedEffect or collectAsState.
MetabindAssistantView does this for you; the lower-level API is for fully custom UIs.
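A sketch of collecting these flows in a fully custom host. MessageRow and SendBar are hypothetical app-side composables, and the flow and property names are assumed from the iOS-shaped API described above:

```kotlin
// Sketch: custom host UI driven by the assistant's StateFlows.
@Composable
fun CustomAssistantScreen(assistant: MetabindAssistant) {
    val messages by assistant.conversation.messages.collectAsState()
    val busy by assistant.isProcessing.collectAsState()
    val scope = rememberCoroutineScope()

    LazyColumn { items(messages) { msg -> MessageRow(msg) } }  // your rendering
    SendBar(enabled = !busy) { text ->
        scope.launch { assistant.send(text) }  // suspending call
    }
}
```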
Authentication patterns
- Backend mints token. Your backend authenticates the user, mints a Metabind project token, hands it to the Android app. The SDK passes it to the Agent proxy. Most secure; recommended for production.
- Static token (dev only). Hardcoded in the app. Don’t ship to production.
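The backend-minted pattern might be wired up as in this sketch; fetchProjectToken and the surrounding structure are app-side assumptions, not SDK API:

```kotlin
// Sketch: your backend authenticates the user and mints a short-lived
// Metabind project token; the app only ever sees that token.
suspend fun createAssistant(api: YourBackendApi): MetabindAssistant {
    val token = api.fetchProjectToken()  // hypothetical backend endpoint
    return MetabindAssistant(
        projectUrl = "https://your-project.metabind.ai",
        provider = MetabindAgentProvider(projectToken = token),
    )
}
```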
ProGuard / R8
The SDK ships with consumer ProGuard rules. No app-side configuration is needed for typical setups. If you're aggressively shrinking and tool calls fail with serialization errors, ensure the rules are picked up by checking app/build/outputs/mapping/.
Performance
- Compose composition reuses cached layouts across renders.
- Streaming SSE events from the Agent proxy render as they arrive.
- The JavaScript engine starts on first use (~50 ms cold start) and stays warm for the session.
- Memory footprint of SDK + engine: roughly 12 MB.
What you write vs. what the SDK does
| You write | SDK does |
|---|---|
| Project URL + token from your backend | Connection, auth, retries |
| LLM provider config (Agent proxy vs. BYOK) | Tool calls, schema validation, conversation state |
| MetabindAssistantView placement (or custom UI) | Streaming, message rendering, tool result rendering |
Related
Assistant SDK overview
Conceptual: when to embed vs. connect to an external host.
LLM provider configuration
Agent proxy vs. BYOK; key custody.
Custom host UI
Replace the default chat UI with your own.
iOS SDK
The iOS equivalent.