Introduction

Welcome to the official Akio documentation. Never depend on a model provider or Google a command again.

Akio is a plug-and-play autonomous AI agent with embedded model inference, written in Rust. No OpenAI, no Anthropic, no Ollama — the inference runs directly inside the binary.


Akio is under active development. The project is not yet fully functional and may not work as expected out of the box. If you'd like to contribute, feel free to open a pull request. Found a bug, vulnerability, or something unusual? Please open an issue.


What is Akio?

Akio is a local autonomous AI agent that can assist you with many tasks. It embeds llama.cpp directly into a single Rust binary, so model inference happens locally on your machine — completely offline, with no API keys, no cloud dependencies, and no external model provider required.

It supports GPU acceleration via Metal on macOS and OpenMP on Linux, and ships with support for multiple GGUF-quantized models from Hugging Face.
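Since Akio consumes GGUF-quantized models from Hugging Face, a model file can be fetched ahead of time with the Hugging Face CLI. This is a sketch only: the repository and file names below are placeholders, not models Akio ships with — check the model listings for the actual names.

```shell
# Placeholder repo/file names; substitute a real GGUF repository.
huggingface-cli download <user>/<model>-GGUF <model>.Q4_K_M.gguf --local-dir models
```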


Available Tools

Akio comes with built-in tools the agent can use autonomously:

| Tool | Description |
| --- | --- |
| `shell` | Execute shell commands |
| `read` | Read file contents |
| `write` | Write or create files |
| `glob` | Find files matching patterns |
| `websearch` | Search the web via DuckDuckGo |

More tools and features are coming later.
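To make the tool list above concrete, here is a minimal Rust sketch of what a pluggable tool interface could look like. The `Tool` trait, the `ShellTool` struct, and their method names are hypothetical illustrations, not Akio's actual API.

```rust
use std::process::Command;

/// Hypothetical tool interface; Akio's real API may differ.
trait Tool {
    fn name(&self) -> &str;
    fn description(&self) -> &str;
    fn run(&self, input: &str) -> Result<String, String>;
}

/// Sketch of a `shell`-style tool: runs its input as a shell command.
struct ShellTool;

impl Tool for ShellTool {
    fn name(&self) -> &str {
        "shell"
    }

    fn description(&self) -> &str {
        "Execute shell commands"
    }

    fn run(&self, input: &str) -> Result<String, String> {
        let output = Command::new("sh")
            .arg("-c")
            .arg(input)
            .output()
            .map_err(|e| e.to_string())?;
        if output.status.success() {
            Ok(String::from_utf8_lossy(&output.stdout).into_owned())
        } else {
            Err(String::from_utf8_lossy(&output.stderr).into_owned())
        }
    }
}

fn main() {
    let tool = ShellTool;
    match tool.run("echo hello") {
        Ok(out) => print!("{out}"),
        Err(err) => eprintln!("tool error: {err}"),
    }
}
```

With a trait like this, the agent can hold a list of `Box<dyn Tool>` values and dispatch to whichever tool the model selects by name.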


Coming Soon

  • Voice interaction
  • Image/video generation
  • Support for more models
  • And more...

PoC (old Python version)

This is an older proof of concept from the previous Python version, demonstrating Akio's features. The project has since been rewritten in Rust.