Merged · 27 commits
e92acb4
Improve logging
avelytchko Mar 4, 2026
b57cc37
Fix linter
avelytchko Mar 4, 2026
33aae20
Add rate limit check
avelytchko Mar 4, 2026
538a090
Implement Grok API support
avelytchko Mar 4, 2026
0806b07
Parametrize user limits
avelytchko Mar 4, 2026
e91f722
Fix linter
avelytchko Mar 4, 2026
6db35fb
Add disable=broad-exception-caught for pylint
avelytchko Mar 4, 2026
1e61c42
Implement conversation context
avelytchko Mar 4, 2026
8c19200
Add Ukrainian translation to some log messages
avelytchko Mar 4, 2026
ab9188e
Modify system prompt to LLM
avelytchko Mar 4, 2026
606d45a
Implement MAX_CONTEXT_CHARS
avelytchko Mar 4, 2026
98e47dc
Fix LLM rate limiting and error handling issues
avelytchko Mar 4, 2026
5c6c98b
Remove PR_MESSAGE.md
avelytchko Mar 4, 2026
725c3e1
Fix LLM API helpers to propagate exceptions and add plain text instru…
avelytchko Mar 4, 2026
c9865ba
Fix linter
avelytchko Mar 4, 2026
c3b6179
Add disable=broad-exception-caught for pylint
avelytchko Mar 4, 2026
28bb1c7
Fix linter
avelytchko Mar 4, 2026
a8dd712
Fix cleanup task initialization
avelytchko Mar 4, 2026
3be82cc
feat: Add SQLite persistence for user data across restarts
avelytchko Mar 4, 2026
47b25a6
Fix linter
avelytchko Mar 4, 2026
1de8512
Fix linter
avelytchko Mar 4, 2026
dbdf3aa
debug: Add detailed logging for SQLite database operations
avelytchko Mar 4, 2026
deababe
Fix linter
avelytchko Mar 4, 2026
1696d2d
Update dependencies
avelytchko Mar 5, 2026
5e0b958
fix: Apply code review improvements
avelytchko Mar 5, 2026
8e89347
fix: Apply critical code review fixes
avelytchko Mar 5, 2026
9f15bb8
fix: Apply final code review improvements
avelytchko Mar 5, 2026
5 changes: 5 additions & 0 deletions .gitignore
@@ -5,6 +5,11 @@
# instagram_cookies.txt should not be tracked by git because it has cookies
instagram_cookies.txt

# SQLite database
src/data/
*.db
*.db-journal

# Byte-compiled / optimized / compiled Python files
__pycache__/
*.py[cod]
3 changes: 3 additions & 0 deletions Dockerfile
@@ -19,6 +19,9 @@ COPY src /bot

WORKDIR /bot

# Create data directory for SQLite database
RUN mkdir -p /bot/data

# https://stackoverflow.com/questions/58701233/docker-logs-erroneously-appears-empty-until-container-stops
ENV PYTHONUNBUFFERED=1

10 changes: 9 additions & 1 deletion README.md
@@ -27,13 +27,21 @@ docker build . -t downloader-bot:latest
```
docker run -d --name downloader-bot --restart always --env-file .env downloader-bot:latest
```
To persist user data (conversation history, rate limits) between restarts, add a volume:
```
docker run -d --name downloader-bot --restart always --env-file .env -v bot-data:/bot/data downloader-bot:latest
```
or use the prebuilt image from **Docker Hub**
```
docker run -d --name downloader-bot --restart always --env-file .env ovchynnikov/load-bot-linux:latest
```
With persistent data:
```
docker run -d --name downloader-bot --restart always --env-file .env -v bot-data:/bot/data ovchynnikov/load-bot-linux:latest
```
or if you use Instagram cookies
```
Comment on lines +31 to 43

⚠️ Potential issue | 🟡 Minor

Add language identifiers to new fenced code blocks.

Lines 31, 35, 39, and 43 trigger MD040 (fenced-code-language). Please annotate these command fences (e.g., bash) to keep docs lint-clean.

📝 Suggested doc fix

-```
+```bash
 docker run -d --name downloader-bot --restart always --env-file .env -v bot-data:/bot/data downloader-bot:latest

-```
+```bash
 docker run -d --name downloader-bot --restart always --env-file .env ovchynnikov/load-bot-linux:latest

-```
+```bash
 docker run -d --name downloader-bot --restart always --env-file .env -v bot-data:/bot/data ovchynnikov/load-bot-linux:latest

-```
+```bash
 docker run -d --name downloader-bot --restart always --env-file .env -v bot-data:/bot/data -v /absolute/path/to/instagram_cookies.txt:/bot/instagram_cookies.txt ovchynnikov/load-bot-linux:latest
🧰 Tools
🪛 markdownlint-cli2 (0.21.0)

[warning] lines 31, 35, 39, 43: Fenced code blocks should have a language specified (MD040, fenced-code-language)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` around lines 31 - 43, The README's fenced code blocks for the
Docker commands (the three docker run examples and the instagram cookies
example) are missing language identifiers and trigger MD040; update each
triple-backtick fence that wraps those commands to use a bash language tag
(e.g., replace ``` with ```bash) so all four code blocks are annotated (the
docker run without volume, the docker run using Docker Hub image, the docker run
with persistent volume, and the instagram cookies docker run).

docker run -d --name downloader-bot --restart always --env-file .env -v /absolute/path/to/instagram_cookies.txt:/bot/instagram_cookies.txt ovchynnikov/load-bot-linux:latest
docker run -d --name downloader-bot --restart always --env-file .env -v bot-data:/bot/data -v /absolute/path/to/instagram_cookies.txt:/bot/instagram_cookies.txt ovchynnikov/load-bot-linux:latest
```
or if you want to use the GPU power of an Intel chip, set the USE_GPU_COMPRESSING=True variable
```
5 changes: 5 additions & 0 deletions docker-compose.yml
@@ -10,8 +10,13 @@ services:
restart: unless-stopped
volumes:
- ./src:/app:cached # Use bind mount for development
- bot-data:/bot/data # Persistent storage for SQLite database
deploy:
resources:
limits:
cpus: '1'
memory: 512M

volumes:
bot-data:
driver: local
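With the named volume declared above, the SQLite database in /bot/data survives container recreation. A typical workflow might look like the following (a sketch only — it assumes a Docker daemon is available and that the compose file lives in the current directory; the exact service name is outside the shown hunk):

```shell
# Start the stack; Docker creates the bot-data volume on first run
docker compose up -d

# Confirm the volume exists and see where it lives on the host
docker volume inspect bot-data

# Recreating the container leaves /bot/data (and bot.db) intact
docker compose up -d --force-recreate
```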
87 changes: 87 additions & 0 deletions src/db_storage.py
@@ -0,0 +1,87 @@
"""SQLite storage for bot user data persistence."""

import sqlite3
import json
import os
import time
from logger import debug


class BotStorage:
"""Handles persistent storage of user data in SQLite."""

def __init__(self, db_path="data/bot.db"):
"""Initialize database connection and create tables."""
os.makedirs(os.path.dirname(db_path), exist_ok=True)
self.db_path = db_path
self.conn = sqlite3.connect(db_path, check_same_thread=False)
self._create_tables()
debug("Database initialized at %s", db_path)

def _create_tables(self):
"""Create tables if they don't exist."""
cursor = self.conn.cursor()
cursor.execute("""
CREATE TABLE IF NOT EXISTS user_data (
user_id INTEGER PRIMARY KEY,
conversation_context TEXT,
rate_limit_timestamps TEXT,
daily_count INTEGER DEFAULT 0,
daily_date TEXT,
last_seen REAL
)
""")
cursor.execute("CREATE INDEX IF NOT EXISTS idx_user_data_last_seen ON user_data(last_seen)")
self.conn.commit()

def load_user_data(self, user_id):
"""Load user data from database."""
cursor = self.conn.cursor()
cursor.execute("SELECT * FROM user_data WHERE user_id = ?", (user_id,))
row = cursor.fetchone()
if row:
return {
"conversation_context": json.loads(row[1]) if row[1] else [],
"rate_limit_timestamps": json.loads(row[2]) if row[2] else [],
"daily_count": row[3],
"daily_date": row[4],
"last_seen": row[5],
}
return None

def save_user_data(self, user_id, conversation_context, rate_limit_timestamps, daily_count, daily_date, last_seen):
"""Save user data to database."""
cursor = self.conn.cursor()
cursor.execute(
"""
INSERT OR REPLACE INTO user_data
(user_id, conversation_context, rate_limit_timestamps, daily_count, daily_date, last_seen)
VALUES (?, ?, ?, ?, ?, ?)
""",
(
user_id,
json.dumps(conversation_context),
json.dumps(rate_limit_timestamps),
daily_count,
daily_date,
last_seen,
),
)
self.conn.commit()

def delete_user_data(self, user_id):
"""Delete user data from database."""
cursor = self.conn.cursor()
cursor.execute("DELETE FROM user_data WHERE user_id = ?", (user_id,))
self.conn.commit()

def get_stale_users(self, ttl_seconds):
"""Get list of user IDs that haven't been seen within TTL."""
current_time = time.time()
cursor = self.conn.cursor()
cursor.execute("SELECT user_id FROM user_data WHERE last_seen < ?", (current_time - ttl_seconds,))
return [row[0] for row in cursor.fetchall()]

def close(self):
"""Close database connection."""
self.conn.close()
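The save/load round-trip above boils down to JSON-encoding the two list fields into a single row keyed by user_id. A standalone sketch of the same schema and queries, using an in-memory database instead of data/bot.db:

```python
import json
import sqlite3
import time

# In-memory DB for the demo; BotStorage itself opens data/bot.db.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS user_data (
        user_id INTEGER PRIMARY KEY,
        conversation_context TEXT,
        rate_limit_timestamps TEXT,
        daily_count INTEGER DEFAULT 0,
        daily_date TEXT,
        last_seen REAL
    )
""")

# Save: JSON-encode the list fields, as save_user_data does.
now = time.time()
conn.execute(
    "INSERT OR REPLACE INTO user_data VALUES (?, ?, ?, ?, ?, ?)",
    (42, json.dumps([{"role": "user", "content": "hi"}]),
     json.dumps([now]), 3, "2026-03-04", now),
)
conn.commit()

# Load: decode the JSON fields back, as load_user_data does.
row = conn.execute("SELECT * FROM user_data WHERE user_id = ?", (42,)).fetchone()
context = json.loads(row[1]) if row[1] else []
daily_count = row[3]

# Stale-user query: rows whose last_seen falls outside the TTL window,
# mirroring get_stale_users.
ttl_seconds = 3600
stale = [r[0] for r in conn.execute(
    "SELECT user_id FROM user_data WHERE last_seen < ?",
    (now - ttl_seconds,),
).fetchall()]
print(context[0]["content"])  # hi
print(stale)                  # [] — user 42 was seen just now
```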
coderabbitai[bot] marked this conversation as resolved.