
Fridge

AI-assisted home fridge inventory with use-by suggestions and hazard highlighting (see Description.md and docs/inventory-and-spoilage-alerts.md).

App (web/)

Stack: Next.js 15, React 19, Prisma + SQLite, and the OpenAI API for structured hazard and shelf-life inference (default model gpt-5.4-mini), with a local-rules fallback when the API is not configured.

Run on your Mac

  1. Install Node.js (LTS, includes npm).
  2. In a terminal:

     cd web
     cp .env.example .env

  3. Edit web/.env and set your OPENAI_API_KEY (from OpenAI API keys).
  4. Then:

     npm install
     npx prisma db push
     npm run dev

  5. Open http://localhost:3000.

Without OPENAI_API_KEY, the app still runs; use-by and hazard hints use local rules and tables only.

Environment (typical setup)

Variable         Required        Notes
OPENAI_API_KEY   Yes (for AI)    Your sk-… key.
OPENAI_MODEL     No              Defaults to gpt-5.4-mini.
DATABASE_URL     Yes             Default in .env.example: file:./dev.db (SQLite under web/prisma/).
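A typical web/.env based on the table above might look like the following sketch; the values are illustrative placeholders, and the real template lives in web/.env.example:

```shell
# web/.env — illustrative values only; copy from .env.example and edit
OPENAI_API_KEY=sk-your-key-here
# OPENAI_MODEL=gpt-5.4-mini    # optional; this is the default
DATABASE_URL="file:./dev.db"   # SQLite database under web/prisma/
```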

Optional: local LLM (Ollama / LM Studio)

If you set OPENAI_BASE_URL (e.g. http://127.0.0.1:11434/v1 for Ollama), the app talks to that server instead of OpenAI. Set OPENAI_MODEL to the local model's name (e.g. llama3.2). If you also keep an sk-… key configured, a failed local call can retry against OpenAI using OPENAI_FALLBACK_MODEL (default gpt-5.4-mini). See the web/.env.example history or ask in the repo if you need this.
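Putting those variables together, a local-LLM setup might look like this sketch (values illustrative; Ollama's OpenAI-compatible endpoint is assumed to be at its default port):

```shell
# Illustrative web/.env for a local Ollama server
OPENAI_BASE_URL=http://127.0.0.1:11434/v1   # Ollama's OpenAI-compatible endpoint
OPENAI_MODEL=llama3.2                       # must match the locally pulled model name
# Optional cloud retry if the local call fails:
OPENAI_API_KEY=sk-your-key-here
OPENAI_FALLBACK_MODEL=gpt-5.4-mini
DATABASE_URL="file:./dev.db"
```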

Repo layout

Fridge/
├── Description.md          # Master feature list
├── docs/                   # Feature specs
├── README.md
└── web/                    # Next.js application
    ├── prisma/
    │   └── schema.prisma
    ├── src/
    │   ├── app/            # Routes & API
    │   ├── components/
    │   └── lib/            # Inference, rules, Prisma client
    ├── package.json
    └── .env.example

API (local)

  • GET /api/health — Liveness + LLM config (llmConfigured, provider, model).
  • GET /api/items?sort=useBy|addedAt — Active inventory.
  • POST /api/items — Create item (runs inference; optional useBy override).
  • POST /api/infer — Preview suggestion without saving.
  • PATCH /api/items/[id] — Update item.
  • DELETE /api/items/[id] — Remove item.
  • POST /api/items/[id]/consume — Mark consumed.
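Assuming the dev server from npm run dev is listening on port 3000, the endpoints above can be exercised with curl. The JSON body field (name) is an assumption for illustration; check the Prisma schema and route handlers in web/ for the actual shape:

```shell
# Liveness + LLM config (llmConfigured, provider, model)
curl http://localhost:3000/api/health

# Preview a use-by/hazard suggestion without saving ("name" field is assumed)
curl -X POST http://localhost:3000/api/infer \
  -H 'Content-Type: application/json' \
  -d '{"name": "raw chicken breast"}'

# Create an item, sorted inventory, and mark-consumed
curl -X POST http://localhost:3000/api/items \
  -H 'Content-Type: application/json' \
  -d '{"name": "milk"}'
curl 'http://localhost:3000/api/items?sort=useBy'
```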

Caveats

  • OpenAI usage is billed; calls need internet.
  • Accuracy: Models can be wrong about food safety. The UI includes disclaimers; curated rules bias toward caution.

If anything fails (Prisma errors, port 3000 already in use), check the terminal output from npm run dev or npx prisma db push.
