# Llamatik
Kotlin-first llama.cpp integration for on-device and remote LLM inference.
- Kotlin Multiplatform: Android, iOS, desktop (JVM) and more
- Offline inference via bundled native bindings
- Embeddings + text generation + streaming
- Schema-constrained JSON generation (optional JSON Schema → valid JSON output)
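The page names `LlamaBridge` and `GenStream` but does not show their signatures, so the sketch below is a hypothetical illustration of the streaming-generation pattern, not the real API: a stub `LlamaBridge` that yields tokens lazily via a `Sequence`, and a caller that consumes them as they arrive.

```kotlin
// Hypothetical sketch — `LlamaBridge` and its methods are assumptions
// illustrating the streaming pattern; consult the API reference for
// the actual signatures.
class LlamaBridge(private val modelPath: String) {
    // Stub: yields canned tokens lazily, standing in for native inference.
    fun generateStream(prompt: String): Sequence<String> = sequence {
        yield("Hello")
        yield(", ")
        yield("world")
    }

    // Stub: a fixed-size embedding vector, standing in for a real model call.
    fun embed(text: String): FloatArray = FloatArray(4) { it.toFloat() }
}

fun main() {
    val bridge = LlamaBridge("model.gguf") // hypothetical model path
    val output = StringBuilder()
    for (token in bridge.generateStream("Say hello")) {
        output.append(token) // consume each token as it is produced
    }
    println(output) // prints "Hello, world"
}
```

Consuming a lazy `Sequence` (or, in real apps, a coroutine `Flow`) lets the UI render partial output before generation finishes, which is the main reason to prefer streaming over a single blocking call on-device.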
## Quick links
- Getting started: head to Installation and Quickstart
- Guides: embeddings, generation, streaming, JSON mode
- API reference: `LlamaBridge` and `GenStream`
- Troubleshooting: common build/linking pitfalls (especially iOS)