# Installation
Akio is written in Rust and embeds llama.cpp for local model inference. You can build it from source using Cargo.
## Prerequisites
- Rust (latest stable)
- CMake (used to build llama.cpp)
- A C/C++ compiler (clang or gcc)
On macOS:

```shell
xcode-select --install
brew install cmake
```
On Linux (Debian/Ubuntu):

```shell
sudo apt install build-essential cmake
```
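Before building, it can save time to confirm the whole toolchain is on your `PATH`. This is an optional sketch, not part of Akio itself; it just probes for the tools listed above:

```shell
# Print the version of each required tool, or a hint if it is missing
for tool in rustc cargo cmake cc; do
  if command -v "$tool" >/dev/null 2>&1; then
    "$tool" --version | head -n 1
  else
    echo "$tool not found - install it before building"
  fi
done
```

The loop never aborts on a missing tool, so you get a full report in one pass.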
## Installation
### Install from git repository

```shell
cargo install --git https://github.com/Fastiraz/akio
```
### Build from source

```shell
git clone --recurse-submodules https://github.com/Fastiraz/akio.git
cd akio
cargo build --release
```
The binary will be available at `target/release/akio`.
## Download a model

Akio downloads GGUF models from Hugging Face. Use the `pull` command to fetch a model:

```shell
akio pull ggml-org/Qwen3-0.6B-GGUF
```
Available models:

| Repository | File | Parameters |
|---|---|---|
| ggml-org/Qwen3-0.6B-GGUF | Qwen3-0.6B-Q4_0.gguf | 0.6B |
| ggml-org/Qwen3-8B-GGUF | Qwen3-8B-Q4_K_M.gguf | 8B |
| ggml-org/gpt-oss-20b-GGUF | gpt-oss-20b-Q4_0.gguf | 20B |
Models are cached in `~/.akio/models/`.
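To see what has already been downloaded, you can inspect the cache directory directly. A minimal sketch (the `mkdir -p` only matters if you have not pulled anything yet, since the directory is otherwise created on first pull):

```shell
# Ensure the cache directory exists, then list any downloaded GGUF files
mkdir -p ~/.akio/models
ls -lh ~/.akio/models/
```

Deleting a file from this directory frees the disk space; re-running `akio pull` fetches it again.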