Installation

Akio is written in Rust and embeds llama.cpp for local model inference. You can build it from source using Cargo.


Prerequisites

  • Rust (latest stable)
  • CMake (used to build llama.cpp)
  • A C/C++ compiler (clang or gcc)

On macOS:

xcode-select --install
brew install cmake

On Linux:

sudo apt install build-essential cmake

Install Akio

Install from git repository

cargo install --git https://github.com/Fastiraz/akio
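By default, cargo install places the compiled binary in ~/.cargo/bin, so that directory needs to be on your PATH for the akio command to resolve:

```shell
# cargo install drops the akio binary into ~/.cargo/bin by default;
# add that directory to PATH if your shell does not already include it
export PATH="$HOME/.cargo/bin:$PATH"
```

Add the export line to your shell profile (e.g. ~/.bashrc or ~/.zshrc) to make it permanent.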

Build from source

git clone --recurse-submodules https://github.com/Fastiraz/akio.git
cd akio

cargo build --release

The binary will be available at target/release/akio.
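If you want to run akio without typing the full target/release path, one option is to copy the binary onto your PATH; ~/.local/bin is a common per-user choice here, not something the project mandates:

```shell
# Copy the freshly built binary to a directory on PATH
# (the [ -f ... ] guard assumes the cargo build above has already run)
mkdir -p "$HOME/.local/bin"
if [ -f target/release/akio ]; then
  install -m 0755 target/release/akio "$HOME/.local/bin/akio"
fi
```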


Download a model

Akio downloads GGUF models from Hugging Face. Use the pull command to fetch a model:

akio pull ggml-org/Qwen3-0.6B-GGUF
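As a sketch of what a pull resolves to: Hugging Face serves repository files through its standard resolve URL layout, so the model file above maps to a URL like the one constructed below (the exact mechanics of akio pull may differ):

```shell
# Build the Hugging Face download URL for a given repo and GGUF file,
# following the hub's https://huggingface.co/<repo>/resolve/main/<file> layout
REPO="ggml-org/Qwen3-0.6B-GGUF"
FILE="Qwen3-0.6B-Q4_0.gguf"
URL="https://huggingface.co/${REPO}/resolve/main/${FILE}"
echo "$URL"
```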

Available models:

Repository                   File                     Size
ggml-org/Qwen3-0.6B-GGUF     Qwen3-0.6B-Q4_0.gguf     0.6B
ggml-org/Qwen3-8B-GGUF       Qwen3-8B-Q4_K_M.gguf     8B
ggml-org/gpt-oss-20b-GGUF    gpt-oss-20b-Q4_0.gguf    20B

Models are cached in ~/.akio/models/.
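To see what is already cached locally, you can list that directory (it only exists once a first pull has completed):

```shell
# List any GGUF files already cached by akio
CACHE_DIR="$HOME/.akio/models"
ls -lh "$CACHE_DIR" 2>/dev/null || echo "no models cached yet"
```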