A private, intelligent iOS app for recording and exploring your thoughts through voice with AI-powered insights
- One-tap audio recording with real-time waveform visualization
- On-device speech recognition using Apple's Speech framework
- Automatic transcription with confidence scoring
- Haptic feedback and intuitive UI design
- Local LLM powered by Leap iOS SDK with LFM2-1.2B model
- Contextual memory exploration and pattern analysis
- Personal chat interface that understands your thought patterns
- No cloud dependency - all AI processing happens on-device
- Kuzu Graph Database integration for complex relationship modeling
- MCP (Model Context Protocol) Server for structured thought queries
- Semantic search across thoughts, emotions, people, and activities
- Graph-based pattern detection and insight generation
- Natural language queries: "What makes me stressed?", "Fun memories with friends"
- Emotion tracking and sentiment analysis over time
- Health correlation analysis (HealthKit integration)
- Location-based thought clustering
- Keyword networks and topic modeling
- Interactive 3D thought network using Three.js
- Temporal emotion timeline with intensity tracking
- Swift Charts for statistical insights
- Pattern analysis across time, location, and context
- 100% Local Processing - no cloud sync or external APIs
- Face ID/Touch ID app lock protection
- File-level encryption (NSFileProtectionComplete)
- On-device speech recognition and AI inference
- Your thoughts never leave your device
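On-device transcription like the above can be sketched with Apple's Speech framework. Setting `requiresOnDeviceRecognition` (iOS 13+) keeps audio off Apple's servers; the function name, locale, and callback shape here are illustrative, not the app's actual implementation:

```swift
import Speech

// Sketch of fully on-device transcription, assuming the audio file
// was recorded by the app. `requiresOnDeviceRecognition` guarantees
// the audio never leaves the device.
func transcribe(fileAt url: URL, completion: @escaping (String?) -> Void) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        completion(nil)   // on-device model unavailable for this locale
        return
    }
    let request = SFSpeechURLRecognitionRequest(url: url)
    request.requiresOnDeviceRecognition = true
    recognizer.recognitionTask(with: request) { result, _ in
        guard let result, result.isFinal else { return }
        // Transcription segments carry per-word confidence scores,
        // which the app can surface as confidence scoring.
        completion(result.bestTranscription.formattedString)
    }
}
```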
At the heart of Tapestry lies LFM2-1.2B, a versatile 1.2-billion-parameter language model that shows how capable privacy-first AI can be. Despite its compact size, LFM2 drives multiple sophisticated AI capabilities throughout the app:
- Keyword Extraction: Precisely identifies key concepts from voice transcriptions using constrained generation to ensure consistent, structured output
- Entity Recognition: Extracts people, places, emotions, and activities with high accuracy while maintaining strict output formats
- Metadata Generation: Creates structured thought metadata including sentiment scores, confidence ratings, and contextual tags
- Memory Synthesis: Acts as your personal memory companion, weaving together insights from your thought patterns with natural, empathetic responses
- Context Awareness: Maintains conversation context while drawing from your personal knowledge graph to provide meaningful, personalized insights
- Adaptive Personality: Develops understanding of your communication style and emotional patterns over time
- Tool Selection: Intelligently chooses appropriate MCP tools based on natural language queries ("What makes me stressed?" → emotion pattern analysis tools)
- Parameter Generation: Converts conversational requests into precise database query parameters
- Result Interpretation: Transforms raw database results into meaningful, conversational insights
- Graph Database Queries: Generates complex Cypher queries for the Kuzu knowledge graph, enabling sophisticated relationship exploration
- Dynamic Query Optimization: Adapts query complexity based on data patterns and user intent
- Multi-hop Reasoning: Creates compound queries that traverse multiple relationship types to uncover hidden insights
- Temporal Analysis: Identifies trends in emotions, activities, and thoughts across different time periods
- Correlation Discovery: Detects relationships between health metrics, locations, people, and emotional states
- Narrative Generation: Transforms statistical patterns into compelling, personal insights
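To make the graph-query generation concrete, here is the kind of Cypher the model might emit for "What makes me stressed?", using the node labels and relationship types from the Kuzu schema described later in this README (`Thought`, `Emotion`, `Keyword`; `EXPRESSES_EMOTION`, `MENTIONS`). The property names `name` and `text` are assumptions for illustration:

```swift
// Hypothetical multi-hop Cypher query: find keywords that co-occur
// most often with thoughts expressing stress.
let stressQuery = """
MATCH (t:Thought)-[:EXPRESSES_EMOTION]->(e:Emotion {name: 'stress'}),
      (t)-[:MENTIONS]->(k:Keyword)
RETURN k.text, COUNT(t) AS frequency
ORDER BY frequency DESC
LIMIT 10
"""
```

A query like this traverses two relationship types in one pass, which is what lets the app answer "what makes me X?" questions from graph structure rather than keyword matching.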
The case for LFM2-1.2B rests on its balance of capability and privacy. While larger models require cloud processing or massive local resources, LFM2's compact architecture enables:
- Real-time Performance: Sub-second response times for all AI operations
- Battery Efficiency: Minimal impact on device battery life during extended use
- Device Compatibility: Runs smoothly on iPhone models back to the A12 Bionic chip
- Complete Privacy: Zero data leaves your device - every AI operation happens locally
- Multi-modal Intelligence: Single model handles text generation, structured extraction, function calling, and conversational AI
This combination sets Tapestry apart among personal AI apps: enterprise-grade intelligence with consumer-friendly privacy and performance. LFM2 shows that massive models and cloud dependencies are not prerequisites for intelligent, personalized experiences.
- UI: SwiftUI with iOS 17+ features
- Audio: AVFoundation (m4a/AAC, optimized for voice)
- AI/LLM: Leap iOS SDK with LFM2-1.2B local model inference
- Graph Database: Kuzu (embedded graph database)
- Storage: Core Data + Kuzu dual storage system
- Search: Vector embeddings with NaturalLanguage framework
- 3D Visualization: WKWebView + Three.js + 3D Force Graph
- MCP Server: Model Context Protocol for structured AI interactions
- Semantic Embeddings: Vector similarity for thought relationships
- Health Integration: HealthKit correlation analysis
- Location Context: CoreLocation for geographical insights
- Biometric Security: LocalAuthentication framework integration
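The semantic-search layer can be sketched with the NaturalLanguage framework's built-in sentence embeddings (iOS 14+). The threshold and helper name are illustrative, not the app's actual code:

```swift
import NaturalLanguage

// Sketch of semantic similarity between two thoughts using Apple's
// on-device sentence embeddings. Cosine distance: smaller is closer.
func related(_ thought: String, to other: String, within threshold: Double = 0.9) -> Bool {
    guard let embedding = NLEmbedding.sentenceEmbedding(for: .english) else {
        return false   // embedding model unavailable for this language
    }
    let distance = embedding.distance(between: thought, and: other, distanceType: .cosine)
    return distance < threshold
}
```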
Tapestry/
├── TapestryApp.swift # App entry point with MCP server init
├── Models/
│ ├── Tapestry.xcdatamodeld # Core Data schema
│ ├── PersistenceController.swift # Core Data + vector search
│ ├── ChatMessage.swift # Chat interface models
│ └── ThoughtPatternModels.swift # Graph analysis models
├── Views/
│ ├── ContentView.swift # Main app navigation
│ ├── RecordView.swift # Voice recording interface
│ ├── ChatView.swift # Oracle AI chat interface
│ ├── TimelineView.swift # Thought timeline browser
│ ├── ThoughtPatternExplorationView.swift # Graph visualization
│ └── AnalyzeView.swift # Analytics dashboard
├── Services/
│ ├── AudioManager.swift # Audio recording & playback
│ ├── SpeechManager.swift # Speech recognition + embeddings
│ ├── ChatStore.swift # AI conversation management
│ ├── KuzuManager.swift # Graph database operations
│ ├── MCPThoughtServer.swift # MCP protocol server
│ ├── HealthKitManager.swift # Health data integration
│ ├── LocationManager.swift # Location context collection
│ └── BiometricAuthManager.swift # Security & authentication
├── MCP Tools/ # Specialized database query tools
│ ├── MCPThoughtServerTools.swift # Core search & retrieval
│ ├── MCPThoughtServerAnalysisTools.swift # Pattern analysis
│ ├── MCPThoughtServerContextTools.swift # Location & health correlation
│ └── MCPThoughtServerStatTools.swift # Statistical insights
├── Graph/
│ └── index.html # 3D force graph visualization
└── Resources/
├── LFM2-1.2B-8da4w_output_8da8w-seq_4096.bundle # Local LLM model
└── Assets.xcassets # App icons & images
- Xcode 15.0+
- iOS 17.0+ target device
- ~2GB free space (for LFM2-1.2B model and app data)
1. Clone & Open Project

   ```bash
   git clone <repository-url>
   cd ThoughtDiary
   open Tapestry.xcodeproj
   ```

2. Install Dependencies (auto-resolved via SPM)
   - Kuzu Swift: https://github.com/kuzudb/kuzu-swift.git
   - Leap iOS SDK: https://github.com/liquid4all/leap-ios

3. Add 3D Graph Assets to the `Graph/` folder:
   - Download `three.min.js` from Three.js
   - Download `3d-force-graph.min.js` from 3D Force Graph

4. Configure Project
   - Set development team in Signing & Capabilities
   - Enable required capabilities (see permissions below)
   - Build and run on an iOS 17+ device or simulator
- Microphone: Voice recording for thoughts
- Speech Recognition: On-device transcription
- HealthKit (optional): Health correlation analysis
- Location (optional): Geographic context for thoughts
- Face ID/Touch ID (optional): App security lock
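The permissions above map to standard Info.plist usage-description keys; the description strings below are illustrative placeholders, not the app's actual copy:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>Tapestry records your voice notes.</string>
<key>NSSpeechRecognitionUsageDescription</key>
<string>Transcription happens entirely on-device.</string>
<key>NSHealthShareUsageDescription</key>
<string>Optionally correlate thoughts with health data.</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string>Optionally tag thoughts with location context.</string>
<key>NSFaceIDUsageDescription</key>
<string>Optionally lock the app with Face ID.</string>
```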
- Thought: Audio recordings with transcribed text and metadata
- User settings: Preferences and configuration
- Nodes: `Thought`, `Keyword`, `Person`, `Emotion`, `Activity`, `Location`, `HealthContext`
- Relationships: `SIMILAR`, `MENTIONS`, `INVOLVES_PERSON`, `EXPRESSES_EMOTION`, `LOCATED_AT`, `HAS_HEALTH_CONTEXT`
- Advanced Queries: Pattern analysis, sentiment tracking, relationship mapping
- `search_thoughts` - Natural language thought search
- `analyze_patterns` - Emotion/keyword/activity pattern analysis
- `find_similar_thoughts` - Semantic similarity search
- `get_emotional_timeline` - Temporal emotion tracking
- `get_person_interactions` - Social relationship analysis
- `get_health_correlated_thoughts` - Health-thought correlations
- `get_database_stats` - Knowledge graph statistics
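A structured tool call for one of these tools might look like the following sketch. The tool name comes from the list above; the `Codable` wrapper and argument names are assumptions for illustration, not the MCP server's actual types:

```swift
import Foundation

// Hypothetical shape of a tool invocation emitted by the LLM and
// dispatched to the MCP server.
struct ToolCall: Codable {
    let name: String
    let arguments: [String: String]
}

let call = ToolCall(name: "search_thoughts",
                    arguments: ["query": "fun memories with friends"])
// Encode to JSON for transport to the in-process MCP server.
let payload = try! JSONEncoder().encode(call)
```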
The built-in Oracle assistant provides intelligent conversations about your thoughts:
- "What thoughts bring me happiness?" → Searches for positive emotions and related memories
- "What makes me stressed lately?" → Analyzes recent stress patterns and triggers
- "Fun memories with [Person]" → Retrieves person-specific positive experiences
- "How does my sleep affect my mood?" → Correlates health data with emotional states
- "Show me thoughts from last week" → Temporal filtering with context
- Core Data for reliable app data and settings
- Kuzu for complex graph relationships and advanced queries
- Vector embeddings for semantic search across both systems
- Local LFM2-1.2B inference with Leap iOS SDK (no API keys needed)
- MCP protocol for structured AI-database interactions
- Intelligent query classification and tool selection
- Multi-purpose AI: conversation, extraction, function calling, and Cypher generation
- Zero external network requests for core functionality
- All AI processing happens locally on device
- Advanced encryption for sensitive thought data
- Optional biometric security layers
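The optional biometric layer can be sketched with the LocalAuthentication framework; the function name, reason string, and fallback behavior here are illustrative:

```swift
import LocalAuthentication

// Sketch of the optional Face ID/Touch ID app lock.
func unlockApp(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)   // biometrics unavailable; app stays locked
        return
    }
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your thoughts") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```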
- Health data correlation (heart rate, sleep, activity)
- Location-based memory clustering
- Temporal pattern recognition across multiple dimensions
- Social relationship mapping and analysis
- Advanced NLP: Named entity recognition, topic modeling
- Visualization: More chart types, graph layout algorithms
- Export: Secure backup and data portability options
- Integrations: Calendar events, weather correlation
- Multi-modal: Photo and document attachment support
- Local-First: All data processing happens on your device
- End-to-End Privacy: No cloud sync, APIs, or external services
- Secure Storage: File-level encryption with iOS security features
- Minimal Permissions: Only requests necessary device access
- Biometric Protection: Optional Face ID/Touch ID app lock
- Transparent Processing: Open source architecture, no hidden data collection
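The file-level encryption mentioned above corresponds to iOS data protection; a minimal sketch, assuming recordings live in the app's Documents directory, is:

```swift
import Foundation

// Sketch of applying NSFileProtectionComplete to a stored recording.
// With this attribute, iOS keeps the file encrypted whenever the
// device is locked.
func protect(fileAt url: URL) throws {
    try FileManager.default.setAttributes(
        [.protectionKey: FileProtectionType.complete],
        ofItemAtPath: url.path
    )
}
```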
- Device: iPhone running iOS 17.0 or later
- Storage: ~2-3GB for app, models, and user data
- Performance: A12 Bionic chip or newer recommended for optimal LFM2-1.2B performance
- Network: Not required for core functionality (100% offline capable)
Experience the future of personal knowledge management with Tapestry.