@capgo/capacitor-llm
Tutorial
by github.com/Cap-go

LLM

Run Large Language Models locally on-device with Apple Intelligence and MLX support


Tutorial on LLM

Using @capgo/capacitor-llm

LLM Plugin interface for interacting with on-device language models.

Install

bun add @capgo/capacitor-llm
bunx cap sync

What This Plugin Exposes

  • createChat - Creates a new chat session.
  • sendMessage - Sends a message to the AI in a specific chat session.
  • getReadiness - Gets the readiness status of the LLM.
  • setModel - Sets the model configuration. iOS: use "Apple Intelligence" as the path for the system model, or provide a path to a MediaPipe model. Android: provide the path to a MediaPipe model file (in the assets or files directory).

Example Usage

createChat

Creates a new chat session.

import { CapgoLLM } from '@capgo/capacitor-llm';

await CapgoLLM.createChat();

sendMessage

Sends a message to the AI in a specific chat session.

import { CapgoLLM } from '@capgo/capacitor-llm';

// Pass the id of the chat session you want to send the message to.
await CapgoLLM.sendMessage({ chatId: 'my-chat-id', message: 'Hello!' });

getReadiness

Gets the readiness status of the LLM.

import { CapgoLLM } from '@capgo/capacitor-llm';

await CapgoLLM.getReadiness();

setModel

Sets the model configuration. iOS: use "Apple Intelligence" as the path for the system model, or provide a path to a MediaPipe model. Android: provide the path to a MediaPipe model file (in the assets or files directory).

import { CapgoLLM } from '@capgo/capacitor-llm';

// On iOS, the "Apple Intelligence" path selects the system model.
// On Android, pass the path to a bundled MediaPipe model file instead.
await CapgoLLM.setModel({ path: 'Apple Intelligence' });

Full Reference