Be11aMer/offline-ai-android
Offline AI on Android

This project explores running local language models entirely offline on Android devices, using open-source tools such as llama.cpp and UserLAnd, with various Linux distributions (Arch, Alpine, Ubuntu, Kali) installed in rooted or non-rooted environments.

The goal: turn old or underused Android devices into capable AI inference machines with complete user control and no cloud dependency.


What’s Included

  • Install and run local LLMs (e.g., TinyLLaMA) on Android using UserLAnd
  • Compile and use llama.cpp on-device (OnePlus 3T, 6GB RAM)
  • Scripts for installing dependencies and launching models
  • Architecture for reproducible experiments and logging
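The compile-and-run step above can be sketched as follows, assuming an Arch Linux session inside UserLAnd on an aarch64 phone. The package list, the `llama-cli` binary name, and the model filename are assumptions for illustration, not taken from this repo's scripts (older llama.cpp builds produce `./main` instead):

```shell
# Inside the UserLAnd Arch session (aarch64). Package names and paths
# are illustrative assumptions; adapt for Alpine/Ubuntu/Kali.
sudo pacman -S --needed git base-devel cmake

git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# CPU-only release build; phones have no usable GPU backend here.
cmake -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build -j"$(nproc)"

# Run a small quantized GGUF model such as TinyLLaMA
# (hypothetical filename -- substitute whatever you downloaded).
./build/bin/llama-cli -m ../models/tinyllama-1.1b-chat-q4_k_m.gguf \
    -p "Hello from an offline phone:" -n 64
```

On a 6GB device, 4-bit quantized models in the 1B-parameter range leave comfortable headroom; larger models may be killed by Android's low-memory manager.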

Requirements

  • Android phone (6GB RAM or more recommended)
  • UserLAnd app installed from Play Store
  • Working session using Arch Linux (or other supported distro)
  • 3–5GB free storage for model + tools
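The RAM and storage requirements above can be checked from inside the Linux session before downloading anything. A minimal sketch, assuming a standard `/proc/meminfo` and `df` (the `meets_minimum` helper is made up for illustration, not a script in this repo):

```shell
#!/bin/sh
# Hypothetical pre-flight check; not a script shipped in this repo.

# meets_minimum TOTAL REQUIRED -> prints "ok" or "low" (values in kB)
meets_minimum() {
    [ "$1" -ge "$2" ] && echo ok || echo low
}

RAM_REQ_KB=$((6 * 1024 * 1024))     # 6GB RAM recommended
DISK_REQ_KB=$((5 * 1024 * 1024))    # up to 5GB for model + tools

ram_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
free_kb=$(df -k "$HOME" | awk 'NR==2 {print $4}')

echo "RAM check:     $(meets_minimum "$ram_kb" "$RAM_REQ_KB")"
echo "Storage check: $(meets_minimum "$free_kb" "$DISK_REQ_KB")"
```

Note that `/proc/meminfo` reports the RAM visible to the kernel, which is slightly below the marketed figure, so a "6GB" phone may read as ~5.7GB.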

πŸ“ Repo Structure

offline-ai-android/
├── scripts/                # install and run helpers
├── docs/                   # setup logs and markdown walkthroughs
├── userland-setup/         # Linux distro notes
├── experiments/            # test runs, logs, model benchmarks
├── models/                 # tested gguf models
└── assets/                 # screenshots, diagrams
