Your actual pet, living on your actual desktop.
liveondesk turns a photo of your pet into a personalized animated sprite — trained on your pet specifically — that walks on top of open windows, jumps between them like platforms, hides in corners, dances when music plays, and thinks out loud in a floating bubble. Not a generic character. Your dog. Your cat. Whatever creature you photographed.
Upload a photo during onboarding. The app isolates your pet from the background, identifies the animal type, and trains a LoRA model on fal.ai in roughly two minutes. From that model it generates animation frames for every behavior cycle — walking, idle, jumping, sleeping, dancing, sniffing — and assembles them into a spritesheet. Your pet is live on the desktop within five minutes of first launch.
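The isolation step can be sketched with Vision's foreground-mask and animal-recognition requests (macOS 14+). Function and variable names here are illustrative assumptions, not the app's actual API:

```swift
import Vision

/// Isolate the pet from the background and guess the species, fully on-device.
func isolatePet(from cgImage: CGImage) throws -> (mask: CVPixelBuffer, species: String?) {
    let handler = VNImageRequestHandler(cgImage: cgImage)

    // 1. Separate the foreground subject (the pet) from the background.
    let maskRequest = VNGenerateForegroundInstanceMaskRequest()
    // 2. Identify the animal type (Vision recognizes cats and dogs).
    let animalRequest = VNRecognizeAnimalsRequest()
    try handler.perform([maskRequest, animalRequest])

    guard let observation = maskRequest.results?.first else {
        throw NSError(domain: "liveondesk", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "no foreground subject found"])
    }
    // A soft mask scaled to the source image, ready for compositing.
    let mask = try observation.generateScaledMaskForImage(
        forInstances: observation.allInstances, from: handler)

    let species = animalRequest.results?.first?.labels.first?.identifier
    return (mask, species)
}
```

The mask then feeds color extraction and the base sprite; the species label seeds the behavior set.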
The desktop layer is a transparent, borderless NSPanel that floats above all other windows without stealing focus or blocking clicks. SpriteKit handles physics: gravity, collision, and the satisfying moment when a window closes under your pet and it falls to the next surface.
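A minimal sketch of that overlay window, assuming AppKit and SpriteKit; the `PetPanel` name is illustrative:

```swift
import AppKit
import SpriteKit

/// Borderless, transparent, click-through panel that floats over every window.
final class PetPanel: NSPanel {
    init(scene: SKScene) {
        super.init(contentRect: NSScreen.main?.frame ?? .zero,
                   styleMask: [.borderless, .nonactivatingPanel],
                   backing: .buffered, defer: false)
        isOpaque = false
        backgroundColor = .clear
        hasShadow = false
        level = .floating                                  // above normal app windows
        ignoresMouseEvents = true                          // clicks pass through to apps below
        collectionBehavior = [.canJoinAllSpaces, .fullScreenAuxiliary]

        let view = SKView(frame: contentRect(forFrameRect: frame))
        view.allowsTransparency = true
        scene.backgroundColor = .clear                     // only the sprite is visible
        view.presentScene(scene)                           // SpriteKit owns gravity + collisions
        contentView = view
    }
}
```

`.nonactivatingPanel` is what keeps the pet from stealing focus; `ignoresMouseEvents` is what keeps it from blocking clicks.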
Thoughts are generated by Apple Foundation Models on-device (macOS 26+), with MLX fallback for older Apple Silicon Macs, and GPT-4o-mini as the cloud fallback. The pet knows what app you have open, what time it is, and how long you've been inactive.
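The three-tier fallback can be sketched as a simple chain; the `ThoughtEngine` protocol and the context fields are assumptions for illustration:

```swift
import Foundation

/// What the pet knows when it thinks: frontmost app, time, idle duration.
struct ThoughtContext {
    let frontmostApp: String
    let date: Date
    let idleSeconds: TimeInterval
}

protocol ThoughtEngine {
    var isAvailable: Bool { get }       // e.g. OS version, model downloaded, API key set
    func generateThought(_ context: ThoughtContext) async throws -> String
}

struct TieredThoughts {
    /// Ordered by preference: Apple Foundation Models, then MLX, then GPT-4o-mini.
    let tiers: [ThoughtEngine]

    func generate(_ context: ThoughtContext) async -> String {
        for engine in tiers where engine.isAvailable {
            if let thought = try? await engine.generateThought(context) {
                return thought
            }
        }
        return "zzz…"   // static fallback phrase if every tier fails
    }
}
```

Each tier degrades gracefully: a failed on-device call falls through to the next engine instead of surfacing an error.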
- Swift / AppKit / SpriteKit — transparent floating window, 2D physics engine
- CGWindowListCopyWindowInfo — window geometry at 2 Hz, no Screen Recording permission required
- Vision framework — on-device foreground isolation, animal detection, color extraction
- fal.ai — FLUX LoRA training, frame generation, Wan 2.1 video cycles
- Apple Foundation Models / MLX / GPT-4o-mini — thought generation, tiered by availability
- Supabase — auth, sprite storage, cross-device sync
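The window-geometry poll from the stack above can be sketched like this. `CGWindowListCopyWindowInfo` returns bounds for on-screen windows without Screen Recording permission (window *contents* would require it; only `kCGWindowBounds` is read here). The 2 Hz timer and the SpriteKit hook are illustrative:

```swift
import CoreGraphics
import Foundation

/// One on-screen window the pet can stand on.
struct WindowSurface {
    let windowID: CGWindowID
    let frame: CGRect   // global display coordinates, origin at top-left
}

func visibleSurfaces() -> [WindowSurface] {
    let options: CGWindowListOption = [.optionOnScreenOnly, .excludeDesktopElements]
    guard let info = CGWindowListCopyWindowInfo(options, kCGNullWindowID) as? [[String: Any]]
    else { return [] }

    return info.compactMap { entry in
        guard entry[kCGWindowLayer as String] as? Int == 0,          // normal app windows only
              let id = entry[kCGWindowNumber as String] as? Int,
              let boundsDict = entry[kCGWindowBounds as String] as? [String: Any],
              let frame = CGRect(dictionaryRepresentation: boundsDict as CFDictionary)
        else { return nil }
        return WindowSurface(windowID: CGWindowID(id), frame: frame)
    }
}

// Poll at 2 Hz and feed the surfaces into the physics scene as platforms.
let poll = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
    let surfaces = visibleSurfaces()
    // scene.updatePlatforms(surfaces)   // illustrative hook into the SpriteKit scene
}
```

Filtering on layer 0 skips the menu bar, the Dock, and other system overlays, so the pet only lands on real application windows.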
1. Transparent window with static sprite — validate the core visual experience first
2. Window detection and basic physics
3. Behavior state machine
4. Thought bubble with static phrases
5. Onboarding pipeline (photo to Vision to color-matched base sprite)
6. AI generation pipeline (LoRA, frame generation, spritesheet)
7. Dynamic thoughts with LLM
8. Monetization (StoreKit 2)
9. App Store distribution
Step 1 is the only gate that matters. If a real pet standing on real windows doesn't delight the first ten people who try it, iterate on that effect before building anything else.
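The behavior state machine from the roadmap can be sketched as a weighted transition table. States match the behavior cycles named earlier; the transition sets and the event hooks are illustrative assumptions:

```swift
import Foundation

enum PetState: CaseIterable {
    case idle, walking, jumping, sleeping, dancing, sniffing
}

struct BehaviorMachine {
    var state: PetState = .idle

    /// Candidate next states per current state. A real version would also
    /// react to events: music playing, a window closing, user inactivity.
    private let transitions: [PetState: [PetState]] = [
        .idle:     [.walking, .sniffing, .sleeping, .idle],
        .walking:  [.idle, .jumping, .sniffing],
        .jumping:  [.walking, .idle],
        .sleeping: [.idle],
        .dancing:  [.idle, .walking],
        .sniffing: [.idle, .walking],
    ]

    /// Advance to a random neighbor state at the end of each animation cycle.
    mutating func tick() -> PetState {
        state = transitions[state]?.randomElement() ?? .idle
        return state
    }

    /// Event-driven override: music detection jumps straight to dancing.
    mutating func musicStarted() { state = .dancing }
}
```

Keeping transitions per-cycle (rather than per-frame) means each spritesheet loop plays to completion before the pet changes its mind.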
Mac App Store primary. Signed and notarized DMG with Sparkle for direct distribution. No Screen Recording permission required — window geometry only.
macOS 14+ — Apple Silicon and Intel