Lumi Userbot

A Telegram userbot integrated with a local LLM through LM Studio.

It responds to messages, remembers per-chat history, and speaks in different moods. It is triggered by the name "Lumi".

Built with: Python, Telethon, LM Studio

User-facing messages: Russian


Features

Local AI Integration

  • LM Studio API (llama-3.1-8b-instruct)
  • Automatic web search via DuckDuckGo
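The web-search step could be sketched as follows. This is an illustrative outline, not the bot's actual `ai/search.py`: it assumes the third-party `duckduckgo_search` package, and the function names are invented for the example.

```python
# Sketch of the automatic web-search step (names are illustrative,
# not the bot's actual API). Assumes `pip install duckduckgo-search`.

def format_search_context(results: list[dict], limit: int = 3) -> str:
    """Condense raw search hits into a short context block for the LLM prompt."""
    lines = []
    for hit in results[:limit]:
        title = hit.get("title", "").strip()
        body = hit.get("body", "").strip()
        lines.append(f"- {title}: {body}")
    return "\n".join(lines)

def web_search(query: str, limit: int = 3) -> str:
    """Run a DuckDuckGo text search and return an LLM-ready context string."""
    from duckduckgo_search import DDGS
    with DDGS() as ddgs:
        results = list(ddgs.text(query, max_results=limit))
    return format_search_context(results, limit)
```

The formatted snippet would then be prepended to the model prompt so the LLM can answer from fresh results.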

Smart Memory System

  • Per-chat conversation history
  • Persistent memory notes stored in data/
  • Contextual responses
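The memory design above can be sketched in a few lines. This is a minimal illustration, assuming a bounded per-chat deque for history and JSON files under `data/` for notes; the real logic lives in `utils/history.py` and `utils/storage.py` and may differ.

```python
# Minimal sketch of the per-chat memory idea (paths and names are
# illustrative; see utils/storage.py and utils/history.py for the real code).
import json
from collections import defaultdict, deque
from pathlib import Path

HISTORY_MAX = 20          # mirrors the HISTORY_MAX setting in config.py
DATA_DIR = Path("data")   # persistent memory notes live here

# In-memory rolling history: one bounded deque per chat id.
history = defaultdict(lambda: deque(maxlen=HISTORY_MAX))

def remember(chat_id: int, role: str, text: str) -> None:
    history[chat_id].append({"role": role, "content": text})

def save_notes(chat_id: int, notes: list[str]) -> None:
    DATA_DIR.mkdir(exist_ok=True)
    path = DATA_DIR / f"{chat_id}.json"
    path.write_text(json.dumps(notes, ensure_ascii=False), encoding="utf-8")

def load_notes(chat_id: int) -> list[str]:
    path = DATA_DIR / f"{chat_id}.json"
    return json.loads(path.read_text(encoding="utf-8")) if path.exists() else []
```

Bounding the deque keeps prompts within the model's context window; notes survive restarts because they live on disk.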

Multiple Personalities

  • friendly, sarcastic, formal, funny
  • aggressive, shy, creative, philosophical, minimal, chaotic
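Moods presumably map to system prompts. A sketch of that mapping, with invented prompt texts (the real ones are in `ai/moods.py`):

```python
# Illustrative mood-to-prompt mapping; prompt texts are invented examples,
# not the contents of ai/moods.py.
MOODS = {
    "friendly": "You are Lumi, a warm and helpful companion.",
    "sarcastic": "You are Lumi; answer with dry, biting sarcasm.",
    "formal": "You are Lumi; answer in a strictly formal register.",
    "minimal": "You are Lumi; answer in as few words as possible.",
}
DEFAULT_MOOD = "friendly"

def system_prompt(mood: str) -> str:
    """Fall back to the default mood if an unknown name is requested."""
    return MOODS.get(mood, MOODS[DEFAULT_MOOD])
```

Switching mood then just means swapping which system prompt is sent with each request.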

Management Commands

  • Memory operations
  • Mood switching
  • Owner-only controls
📸 Screenshots

(images omitted in this text version)

  • Chat example
  • Memory system
  • Mood selection
  • Commands
  • Reset function

Installation

# Clone repository
git clone <repo_url>
cd lumi-bot

# Setup virtual environment
python -m venv .venv

# Activate environment
# Linux/macOS:
source .venv/bin/activate
# Windows:
.venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

Configuration

Copy .env.example to .env and configure:

TG_API_ID=12345678
TG_API_HASH=your_api_hash
OWNER_IDS=123456789,987654321
SESSION_NAME=lumi_userbot
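The `.env` values above can be loaded roughly like this. A sketch under the assumption that `config.py` reads the environment directly; `parse_owner_ids` is an invented helper name.

```python
# Sketch of reading the .env settings above; helper names are illustrative.
import os

def parse_owner_ids(raw: str) -> set[int]:
    """OWNER_IDS is a comma-separated list of Telegram user ids."""
    return {int(part) for part in raw.split(",") if part.strip()}

# Using the sample values from .env.example:
os.environ.setdefault("OWNER_IDS", "123456789,987654321")
os.environ.setdefault("SESSION_NAME", "lumi_userbot")

OWNER_IDS = parse_owner_ids(os.environ["OWNER_IDS"])
SESSION_NAME = os.environ["SESSION_NAME"]
```

Parsing ids into a `set` makes owner checks (`sender_id in OWNER_IDS`) O(1).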

Running

python main.py

Important: LM Studio must be running at http://localhost:1234


Project Structure

lumi/
├── main.py           # entry point
├── config.py         # all constants and env variables
├── requirements.txt
├── .env.example
├── ai/
│   ├── model.py      # ask_model, clean_response
│   ├── moods.py      # mood prompts
│   └── search.py     # classifier, DuckDuckGo, analyze_and_search
├── bot/
│   ├── handler.py    # message handler, rate limiting
│   ├── commands.py   # all /commands
│   └── console.py    # console debug mode
└── utils/
    ├── storage.py    # chat data persistence (saved to data/)
    └── history.py    # in-memory conversation history

Commands Reference

General

Command              Description
/ping                Check response time
/model               Show active model
/prompt              Show current system prompt

Memory Management

Command              Description
/memorize <text>     Save a note
/show_memory         List saved notes
/forget              Delete all notes
/forget <number>     Delete a specific note

Mood Control

Command              Description
/mood                Show current mood
/mood <mood>         Set mood
/mood list           List available moods

Available moods: sarcastic, friendly, formal, funny, aggressive, shy, creative, philosophical, minimal, chaotic

Owner Commands

Command              Description
/reset               Clear history and memory, and reset the mood
/set_prompt <text>   Override the system prompt for the current chat

LM Studio Configuration

API Endpoint: http://localhost:1234/v1/chat/completions
Default Model: llama-3.1-8b-instruct
Request Timeout: 60 seconds
Temperature: 0.5
Max Tokens: 200
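Using only the parameters listed above, a request to the LM Studio endpoint might look like the following. This is a sketch against LM Studio's OpenAI-compatible chat-completions API using only the standard library; the bot's actual `ask_model` in `ai/model.py` may be implemented differently.

```python
# Sketch of a chat-completions request with the parameters above.
# LM Studio exposes an OpenAI-compatible API at this endpoint.
import json
import urllib.request

API_URL = "http://localhost:1234/v1/chat/completions"

def build_payload(messages: list[dict]) -> dict:
    return {
        "model": "llama-3.1-8b-instruct",
        "messages": messages,
        "temperature": 0.5,
        "max_tokens": 200,
    }

def ask_model(messages: list[dict]) -> str:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(messages)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:  # 60 s timeout
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

`ask_model` requires a running LM Studio server; `build_payload` is pure and shows exactly which knobs the configuration above controls.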

Technical Notes

  • User-facing messages: Russian
  • Developer resources: English
  • History size: Configurable via HISTORY_MAX in config.py
  • Memory files: Stored in data/ directory, excluded from git
  • PROJECT_LINKS: Customizable in config.py

Made with local AI
