Sckry is an intelligent "Network Search Assistant" that makes your LinkedIn network visual and searchable. It treats your professional connections as a database, using Generative AI to structure messy data and enable natural-language queries (e.g., "Find VCs in my network").
- Framework: Next.js 16.0 (App Router) + React 19.2
- Language: TypeScript
- Styling: Tailwind CSS v4 (using `@tailwindcss/postcss`), `tailwind-merge`, `clsx`
- UI Library: Radix UI primitives, Framer Motion (animations), Lucide React (icons)
- Visualization: Recharts
- Database: Supabase (PostgreSQL)
- AI/LLM: Google Gemini (`@google/genai`), used for both RAG and data classification
- Scraping Engine: Playwright (server-side) with client feedback streaming
- Validation: Zod
- Payments: Stripe
The application is structured into three distinct layers:
Unlike typical scrapers that run silently, Sckry implements a "remote browser" pattern. The server runs Playwright but streams status updates and screenshots back to the client via Server-Sent Events (SSE).
- `hooks/use-linkedin-sync.ts`: Manages the connection to the server, handling the SSE stream.
- `lib/client-scraper.ts`: Client-side console-injection scraper (fallback/alternative).
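A minimal sketch of how the client side might decode the SSE stream. The event shape (`SyncEvent`) and frame parsing below are assumptions for illustration, not the project's actual contract; the real hook may rely on `EventSource` instead of parsing frames manually.

```typescript
// Hypothetical event shape for the sync stream (field names are assumptions).
type SyncEvent =
  | { type: "status"; message: string }
  | { type: "screenshot"; dataUrl: string }
  | { type: "done"; profileCount: number };

// Parse a chunk of concatenated SSE frames ("data: {...}\n\n") into events.
function parseSseChunk(chunk: string): SyncEvent[] {
  return chunk
    .split("\n\n")
    .filter((frame) => frame.startsWith("data: "))
    .map((frame) => JSON.parse(frame.slice(6)) as SyncEvent);
}
```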
Raw LinkedIn data is messy. Sckry uses Gemini to "clean" this data and interpret user questions.
- `lib/gemini/classifier.ts`: Extracts structured fields (Title, Company, Industry) from raw headlines using parallel batch processing.
- `lib/gemini/rag.ts`: The RAG engine that interprets user queries (e.g., "Find growth engineers") into structured search parameters.
A robust Single Page Application (SPA) experience within Next.js.
- `app/page.tsx`: The main orchestrator, using `framer-motion` to transition between screens (Sync -> Progress -> Graph -> Subscription).
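The screen flow can be modeled as a simple ordered state machine; this is a simplification for illustration, not the actual `app/page.tsx` logic, which would feed a state like this into `framer-motion` transitions.

```typescript
// The four screens the orchestrator cycles through, in order.
type Screen = "sync" | "progress" | "graph" | "subscription";

const SCREEN_ORDER: Screen[] = ["sync", "progress", "graph", "subscription"];

// Advance to the next screen, staying on the last one once reached.
function nextScreen(current: Screen): Screen {
  const i = SCREEN_ORDER.indexOf(current);
  return SCREEN_ORDER[Math.min(i + 1, SCREEN_ORDER.length - 1)];
}
```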
- User Trigger: User enters credentials.
- Remote Execution: Server spins up Playwright.
- Feedback Loop: Server streams progress and screenshots ("remote browser" view) to the client.
- Completion: Scraped profiles are passed to the Classification engine.
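The feedback loop above hinges on the server exposing its progress as an SSE response. A sketch of that plumbing, assuming the Playwright worker yields progress events (status lines and base64 screenshots) as an async stream; the route path and event shape are illustrative, not the project's actual API.

```typescript
// Hypothetical progress events emitted while Playwright logs in and
// paginates connections (field names are assumptions).
type ProgressEvent =
  | { type: "status"; message: string }
  | { type: "screenshot"; dataUrl: string };

// Encode one event as an SSE frame.
function toSseFrame(event: ProgressEvent): string {
  return `data: ${JSON.stringify(event)}\n\n`;
}

// Wrap an async stream of progress events as a streaming SSE Response,
// suitable for returning from a Next.js route handler.
function sseResponse(events: AsyncIterable<ProgressEvent>): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const e of events) {
        controller.enqueue(encoder.encode(toSseFrame(e)));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
    },
  });
}
```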
- Input: Raw connection data (Name, Headline, URL).
- Processing: Headlines are split into batches and sent to Gemini in parallel.
- Prompting: Gemini acts as a "LinkedIn expert" to infer Industry and standardize Titles.
- Output: Structured, queryable data saved to Supabase.
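The batching step can be sketched as below. `classifyBatch` stands in for the actual Gemini call in `lib/gemini/classifier.ts` and is injected here so the orchestration is self-contained; the type names and batch size are assumptions.

```typescript
// Raw profile as scraped (fields per the pipeline's input step).
type RawProfile = { name: string; headline: string; url: string };
// Profile after Gemini has inferred structured fields.
type Classified = RawProfile & { title: string; company: string; industry: string };

// Split an array into batches of size n.
function chunk<T>(items: T[], n: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += n) out.push(items.slice(i, i + n));
  return out;
}

// Send all batches to the model in parallel and flatten the results.
// `classifyBatch` would wrap a Gemini structured-output prompt.
async function classifyAll(
  profiles: RawProfile[],
  classifyBatch: (batch: RawProfile[]) => Promise<Classified[]>,
  batchSize = 20
): Promise<Classified[]> {
  const batches = chunk(profiles, batchSize);
  const results = await Promise.all(batches.map(classifyBatch));
  return results.flat();
}
```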
- User Query: User types "Show me investors".
- Interpretation: Gemini maps "investors" to keywords like `["VC", "Partner", "Angel"]`.
- Database Query: Structured keywords filter the Postgres database.
- Result: Clean results displayed on the Network Graph.
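The keyword-to-database step can be sketched as a pure filter builder. The `SearchParams` shape and the `title` column name are assumptions; the output is a supabase-js `.or()` filter string, usable as `supabase.from("connections").select("*").or(buildTitleFilter(params))`.

```typescript
// Hypothetical structured parameters emitted by the RAG step.
type SearchParams = { keywords: string[]; industries?: string[] };

// Build a supabase-js `.or()` filter matching any keyword against the
// classified title column, e.g. "title.ilike.%VC%,title.ilike.%Partner%".
function buildTitleFilter(params: SearchParams): string {
  return params.keywords.map((k) => `title.ilike.%${k}%`).join(",");
}
```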