diff --git a/.gitignore b/.gitignore index 5833bcc..24bbfa9 100755 --- a/.gitignore +++ b/.gitignore @@ -14,6 +14,11 @@ node_modules/ .env.test.local .env.production.local +# Local Copilot helper folders +.github/instructions/ +.github/prompts/ +.github/chatmodes/ + # Logs npm-debug.log* yarn-debug.log* @@ -56,3 +61,4 @@ temp/ # Supabase .branches .temp +repomix-output.xml diff --git a/README.md b/README.md index 0ca6d67..fdafeb0 100755 --- a/README.md +++ b/README.md @@ -1,82 +1,133 @@ -# Mindley App +# Mindley -A full-stack application built with React/TypeScript frontend and Supabase -backend. +Mindley logo -## Project Structure +A small productivity and knowledge-sharing platform with a React + TypeScript frontend and a Supabase backend (Edge Functions written for Deno). + +> [!NOTE] +> This README is a concise developer-focused reference. For detailed API docs or deployment runbooks check the `supabase/` folder and the `frontend/README.md`. + +## Quick overview + +- Frontend: React + TypeScript, Vite, Tailwind CSS. +- Backend: Supabase project with Deno-based Edge Functions in `supabase/functions/`. +- Local dev: Frontend runs with `npm run dev` (in `frontend/`); Supabase functions use the Supabase CLI for local serving. + +## Key features + +- Resource listing and detail pages +- Authentication (OTP sign-in flow) using Supabase +- Background job monitoring via Supabase Functions +- Small component library and UI primitives in `src/components/ui` + +## Repository layout ``` -├── frontend/ # React/TypeScript frontend (deployed via Vercel) -├── supabase/ # Supabase backend and Edge Functions -│ └── functions/ # Deno-based Edge Functions -└── .github/workflows/ # CI/CD pipelines +frontend/ # React app (Vite + TypeScript) + public/ # static assets (icons, manifest) + src/ # source code (components, pages, hooks, services) +supabase/ # Supabase project and Edge Functions (Deno) + functions/ # serverless functions (create-resource, jobs, etc.) 
+package.json # workspace-level scripts and tooling (if present) +README.md # this file ``` -## CI/CD Pipeline +## Getting started (local) -### Continuous Integration +Prerequisites: -The CI pipeline runs on every push and pull request to `main` and `develop` -branches. It includes: +- Node 18+ (or project-compatible runtime) +- npm or pnpm +- Supabase CLI (for local testing of functions) -#### Frontend CI +1. Frontend -- **Dependencies**: Installs npm packages with caching -- **Type Checking**: Validates TypeScript types with `tsc --noEmit` -- **Linting**: Runs ESLint to check code quality -- **Build**: Verifies the project builds successfully +```bash +cd frontend +npm install +npm run dev +``` -#### Supabase Functions CI +Visit http://localhost:5173 (or the address printed by Vite). -- **Syntax Check**: Validates Deno/TypeScript syntax for all Edge Functions -- **Format Check**: Ensures consistent code formatting with `deno fmt` +2. Supabase functions (local) -#### Security Scanning +```bash +# from repo root or inside the supabase directory +supabase login # one-time, if not already logged in +supabase start # optional local DB + emulators +supabase functions serve # serve Edge Functions locally +``` -- **Dependency Audit**: Scans for known vulnerabilities in npm packages -- **SARIF Upload**: Reports security findings to GitHub Security tab +3. 
Environment -### Deployment +Create a `.env.local` in `frontend/` (do not commit): -- **Frontend**: Automatically deployed by Vercel on push to `main` -- **Supabase Functions**: Deployed manually via Supabase CLI +```env +VITE_SUPABASE_URL=https://your-project.supabase.co +VITE_SUPABASE_ANON_KEY=public-anon-key +``` -## Development +## Scripts and common commands -### Frontend +From `frontend/`: ```bash -cd frontend -npm install -npm run dev +npm run dev # start dev server +npm run build # build production bundle +npm run preview # preview production build locally +npm run lint # run ESLint +npm run test # run tests (project tests if configured) ``` -### Supabase Functions +Supabase / functions (examples): ```bash -# Requires Supabase CLI -supabase functions serve +deno check supabase/functions/*/index.ts # type check functions +deno fmt supabase/functions/*/index.ts # format functions ``` -## Available Scripts +## Architecture notes -### Frontend +- Frontend organizes UI primitives under `src/components/ui` to keep shared building blocks. +- Services that talk to Supabase are in `src/services/` (e.g., `resourceService.ts`, `jobService.ts`). +- Hooks such as `use-auth` and `use-reliable-realtime` encapsulate auth state and realtime behaviors. -- `npm run dev` - Start development server -- `npm run build` - Build for production -- `npm run lint` - Run ESLint -- `npm run preview` - Preview production build +## CI / Deployment -### Supabase +- Frontend is designed to deploy to Vercel (see `frontend/vercel.json`). +- GitHub Actions run type checks, linters, build verification, and security scans for dependencies. +- Supabase functions are typically deployed using the Supabase CLI/CI; check the `.github/workflows` directory for CI workflow examples. 
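The job monitor UI derives a progress percentage from the workflow steps returned by `JobService.getJobWithSteps`, counting both completed and skipped steps as done. A standalone sketch of that calculation; the `Step` shape here is an assumption modeled on the component's usage, not the actual type from `src/types/job`:

```typescript
// Minimal model of a workflow step; the real type lives in src/types/job.
type StepStatus = "pending" | "running" | "completed" | "failed" | "skipped";

interface Step {
  status: StepStatus;
}

// Completed and skipped steps both count toward progress,
// mirroring the filter used in the job monitor component.
export function progressPercentage(steps: Step[]): number {
  const done = steps.filter(
    (s) => s.status === "completed" || s.status === "skipped"
  ).length;
  return steps.length > 0 ? Math.round((done / steps.length) * 100) : 0;
}
```

The guard on `steps.length` avoids a `NaN` percentage for jobs whose steps have not been created yet.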
-- `deno check index.ts` - Type check functions -- `deno fmt index.ts` - Format code +## Tests and quality gates -## Environment Variables +- Type checking: `tsc --noEmit` (frontend) +- Linting: ESLint configured in `frontend/eslint.config.js` +- Formatting: Prettier / Deno fmt for Supabase functions -Create `.env.local` in the frontend directory: +## Useful paths -``` -VITE_SUPABASE_URL=your_supabase_url -VITE_SUPABASE_ANON_KEY=your_supabase_anon_key -``` +- Frontend entry: `frontend/src/main.tsx` +- UI components: `frontend/src/components/ui` +- Supabase functions: `supabase/functions/*/index.ts` + +## Troubleshooting + +> [!TIP] +> If the frontend can't connect to Supabase locally, double-check `VITE_SUPABASE_URL` and `VITE_SUPABASE_ANON_KEY` in `frontend/.env.local` and ensure `supabase start` (local emulators) is running when using local endpoints. + +## Next steps / suggestions + +- Add end-to-end tests (Cypress or Playwright) for the main flows (auth, resource CRUD). +- Add a small runbook for deploying Supabase functions with the CLI in CI. 
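The `VITE_SUPABASE_URL` / `VITE_SUPABASE_ANON_KEY` pair is all the frontend needs to reach Supabase. A hedged sketch of how client setup might validate them; the helper name and fail-fast behavior are assumptions for illustration, not code from this repo (in the app the values come from `import.meta.env`):

```typescript
// Hypothetical helper: resolve Supabase settings from Vite-style env vars.
// Taking the env object as a parameter keeps the logic testable; in the
// real app these values come from import.meta.env.
export interface SupabaseConfig {
  url: string;
  anonKey: string;
}

export function resolveSupabaseConfig(
  env: Record<string, string | undefined>
): SupabaseConfig {
  const url = env.VITE_SUPABASE_URL;
  const anonKey = env.VITE_SUPABASE_ANON_KEY;
  if (!url || !anonKey) {
    // Fail fast: a missing key otherwise surfaces later as confusing 401s.
    throw new Error("Missing VITE_SUPABASE_URL or VITE_SUPABASE_ANON_KEY");
  }
  return { url, anonKey };
}
```

The resulting config would then be passed to `createClient` from `@supabase/supabase-js`.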
diff --git a/frontend/index.html b/frontend/index.html index 0f52abe..8fc1f2e 100755 --- a/frontend/index.html +++ b/frontend/index.html @@ -33,8 +33,10 @@ + + diff --git a/frontend/package.json b/frontend/package.json index ada4cd0..ed2290a 100755 --- a/frontend/package.json +++ b/frontend/package.json @@ -11,12 +11,14 @@ }, "dependencies": { "@hookform/resolvers": "^5.2.1", + "@radix-ui/react-alert-dialog": "^1.1.15", "@radix-ui/react-avatar": "^1.1.10", "@radix-ui/react-collapsible": "^1.1.11", "@radix-ui/react-dialog": "^1.1.14", "@radix-ui/react-dropdown-menu": "^2.1.15", "@radix-ui/react-label": "^2.1.7", "@radix-ui/react-popover": "^1.1.14", + "@radix-ui/react-progress": "^1.1.7", "@radix-ui/react-select": "^2.2.5", "@radix-ui/react-separator": "^1.1.7", "@radix-ui/react-slot": "^1.2.3", @@ -27,7 +29,7 @@ "clsx": "^2.1.1", "cmdk": "^1.1.1", "input-otp": "^1.4.2", - "lucide-react": "^0.539.0", + "lucide-react": "^0.540.0", "next-themes": "^0.4.6", "react": "^19.1.1", "react-dom": "^19.1.1", diff --git a/frontend/public/mindley-icon.png b/frontend/public/mindley-icon.png new file mode 100755 index 0000000..4554fc1 Binary files /dev/null and b/frontend/public/mindley-icon.png differ diff --git a/frontend/public/mindley-og.png b/frontend/public/mindley-og.png deleted file mode 100755 index 708aa63..0000000 --- a/frontend/public/mindley-og.png +++ /dev/null @@ -1 +0,0 @@ -PLACEHOLDER diff --git a/frontend/public/vite.svg b/frontend/public/vite.svg deleted file mode 100755 index 0fee287..0000000 --- a/frontend/public/vite.svg +++ /dev/null @@ -1,40 +0,0 @@ - diff --git a/frontend/src/components/add-resource-form.tsx b/frontend/src/components/add-resource-form.tsx index b9cdded..2364c1d 100755 --- a/frontend/src/components/add-resource-form.tsx +++ b/frontend/src/components/add-resource-form.tsx @@ -29,7 +29,7 @@ export function AddResourceForm({ }: AddResourceFormProps) { const [url, setUrl] = useState(""); const [language, setLanguage] = useState<"original" | 
"italian" | "english">( - "original", + "original" ); const handleSubmit = (e: React.FormEvent) => { @@ -43,7 +43,6 @@ export function AddResourceForm({ // Reset form setUrl(""); - setLanguage("english"); }; const isValidUrl = (url: string) => { @@ -81,7 +80,7 @@ export function AddResourceForm({ type="url" value={url} onChange={(e) => setUrl(e.target.value)} - placeholder="https://youtube.com/watch?v=... or https://example.com/article" + placeholder="Resource Link" className="pl-10 bg-background focus:bg-background placeholder:text-card-foreground/70" disabled={isLoading} /> @@ -113,19 +112,17 @@ export function AddResourceForm({ disabled={!canSubmit} className="whitespace-nowrap" > - {isLoading - ? ( - <> -
- Processing... - - ) - : ( - <> - - Add Resource - - )} + {isLoading ? ( + <> +
+ Processing... + + ) : ( + <> + + Add Resource + + )}
diff --git a/frontend/src/components/compact-resource-filters.tsx b/frontend/src/components/compact-resource-filters.tsx index 0853bc8..1544af1 100755 --- a/frontend/src/components/compact-resource-filters.tsx +++ b/frontend/src/components/compact-resource-filters.tsx @@ -55,7 +55,7 @@ export function CompactResourceFilters({ )} > {/* Results header with controls */} -
+

Your Resources

diff --git a/frontend/src/components/icons/google.tsx b/frontend/src/components/icons/google.tsx new file mode 100755 index 0000000..095d1dc --- /dev/null +++ b/frontend/src/components/icons/google.tsx @@ -0,0 +1,30 @@ +export function GoogleIcon({ className }: { className?: string }) { + return ( + + + + + + + ); +} + +export default GoogleIcon; diff --git a/frontend/src/components/icons/youtube.tsx b/frontend/src/components/icons/youtube.tsx new file mode 100755 index 0000000..fe0f064 --- /dev/null +++ b/frontend/src/components/icons/youtube.tsx @@ -0,0 +1,17 @@ +export function YouTubeIcon({ className }: { className?: string }) { + return ( + + {/* Official-ish YouTube play button: rounded red rect with white triangle */} + + + + ); +} + +export default YouTubeIcon; diff --git a/frontend/src/components/job-monitor.tsx b/frontend/src/components/job-monitor.tsx new file mode 100755 index 0000000..5374bfe --- /dev/null +++ b/frontend/src/components/job-monitor.tsx @@ -0,0 +1,343 @@ +import { useState, useEffect } from "react"; +import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"; +import { Badge } from "@/components/ui/badge"; +import { Button } from "@/components/ui/button"; +import { Progress } from "@/components/ui/progress"; +import { useJobNotifications } from "@/hooks/use-job-notifications"; +import { useAuth } from "@/hooks/use-auth"; +import { JobService } from "@/services/jobService"; +import type { Job } from "@/types/job"; +import { + Clock, + CheckCircle, + XCircle, + AlertTriangle, + Play, + Pause, + MoreHorizontal, +} from "lucide-react"; +import { + DropdownMenu, + DropdownMenuContent, + DropdownMenuItem, + DropdownMenuTrigger, +} from "@/components/ui/dropdown-menu"; + +interface JobProgressInfo { + completed: number; + total: number; + percentage: number; +} + +const JobStatusIcon = ({ status }: { status: Job["status"] }) => { + switch (status) { + case "pending": + return ; + case "running": + return ; + case 
"completed": + return ; + case "failed": + return ; + case "cancelled": + return ; + default: + return ; + } +}; + +const JobStatusBadge = ({ status }: { status: Job["status"] }) => { + const variants = { + pending: "secondary", + running: "default", + completed: "default", + failed: "destructive", + cancelled: "secondary", + } as const; + + const labels = { + pending: "Pending", + running: "Running", + completed: "Completed", + failed: "Failed", + cancelled: "Cancelled", + }; + + return ( + + + {labels[status]} + + ); +}; + +const JobCard = ({ + job, + onCancel, + showProgress = true, +}: { + job: Job; + onCancel?: (jobId: string) => void; + showProgress?: boolean; +}) => { + const [progress, setProgress] = useState({ + completed: 0, + total: 0, + percentage: 0, + }); + + useEffect(() => { + const loadProgress = async () => { + if ( + !showProgress || + (job.status !== "running" && job.status !== "pending") + ) + return; + + try { + const progressInfo = await JobService.getJobWithSteps(job.id); + if (progressInfo) { + const completedSteps = progressInfo.steps.filter( + (step) => step.status === "completed" || step.status === "skipped" + ).length; + const totalSteps = progressInfo.steps.length; + const percentage = + totalSteps > 0 + ? 
Math.round((completedSteps / totalSteps) * 100) + : 0; + + setProgress({ + completed: completedSteps, + total: totalSteps, + percentage, + }); + } + } catch (error) { + console.error("Error loading job progress:", error); + } + }; + + loadProgress(); + + // Refresh progress every 10 seconds for running jobs + let interval: NodeJS.Timeout; + if (job.status === "running") { + interval = setInterval(loadProgress, 10000); + } + + return () => { + if (interval) { + clearInterval(interval); + } + }; + }, [job.id, job.status, showProgress]); + + const formatDate = (dateString: string) => { + return new Date(dateString).toLocaleString("it-IT", { + day: "2-digit", + month: "2-digit", + year: "numeric", + hour: "2-digit", + minute: "2-digit", + }); + }; + + const getDuration = (startDate?: string | null, endDate?: string | null) => { + if (!startDate) return null; + + const start = new Date(startDate); + const end = endDate ? new Date(endDate) : new Date(); + const diff = end.getTime() - start.getTime(); + + const minutes = Math.floor(diff / (1000 * 60)); + const seconds = Math.floor((diff % (1000 * 60)) / 1000); + + if (minutes > 0) { + return `${minutes}m ${seconds}s`; + } + return `${seconds}s`; + }; + + return ( + + +

+ + {job.workflow_name} + +
+ + {(job.status === "running" || job.status === "pending") && + onCancel && ( + + + + + + onCancel(job.id)}> + Cancel Workflow + + + + )} +
+
+ + +
+ {/* Progress bar for running jobs */} + {showProgress && + (job.status === "running" || job.status === "pending") && ( +
+
+ Progress + + {progress.completed}/{progress.total} steps +
+ +
+ {progress.percentage}% completed +
+
+ )} + + {/* Job details */} +
+
+ Created: +

{formatDate(job.created_at)}

+
+ {job.started_at && ( +
+ Started: +

{formatDate(job.started_at)}

+
+ )} + {job.completed_at && ( +
+ Completed: +

{formatDate(job.completed_at)}

+
+ )} + {job.started_at && ( +
+ Duration: +

+ {getDuration(job.started_at, job.completed_at) || + "In progress..."} +

+
+ )} +
+ + {/* Error message */} + {job.error_message && ( +
+

{job.error_message}

+
+ )} +
+
+ + ); +}; + +export const JobMonitor = () => { + const { user } = useAuth(); + const { jobs, activeJobs, recentJobs, isLoading, error, cancelJob } = + useJobNotifications({ + showToasts: true, + userId: user?.id, + }); + + const handleCancelJob = async (jobId: string) => { + try { + await cancelJob(jobId); + } catch (error) { + console.error("Error cancelling job:", error); + } + }; + + if (!user) { + return ( + + +

+ Sign in to view your workflows +

+
+
+ ); + } + + if (isLoading) { + return ( + + +

+ Loading workflows... +

+
+
+ ); + } + + if (error) { + return ( + + +

+ Error loading workflows: {error} +

+
+
+ ); + } + + return ( +
+ {/* Active Jobs */} + {activeJobs.length > 0 && ( +
+

Active Workflows

+
+ {activeJobs.map((job) => ( + + ))} +
+
+ )} + + {/* Recent Jobs */} + {recentJobs.length > 0 && ( +
+

Recent Workflows

+
+ {recentJobs.slice(0, 5).map((job) => ( + + ))} +
+
+ )} + + {/* No Jobs */} + {jobs.length === 0 && ( + + +

+ No workflows found +

+
+
+ )} +
+ ); +}; diff --git a/frontend/src/components/nav-main.tsx b/frontend/src/components/nav-main.tsx index 8f2ec26..aaf6549 100755 --- a/frontend/src/components/nav-main.tsx +++ b/frontend/src/components/nav-main.tsx @@ -14,6 +14,7 @@ import { SidebarMenuSub, SidebarMenuSubButton, SidebarMenuSubItem, + useSidebar, } from "@/components/ui/sidebar"; export function NavMain({ @@ -30,6 +31,7 @@ export function NavMain({ }[]; }[]; }) { + const { isMobile, setOpenMobile } = useSidebar(); return ( Platform @@ -54,7 +56,13 @@ export function NavMain({ {item.items?.map((subItem) => ( - + { + // on mobile, close the sidebar when navigating + if (isMobile) setOpenMobile(false); + }} + > {subItem.title} diff --git a/frontend/src/components/nav-projects.tsx b/frontend/src/components/nav-projects.tsx index d4be151..e66769e 100755 --- a/frontend/src/components/nav-projects.tsx +++ b/frontend/src/components/nav-projects.tsx @@ -32,7 +32,7 @@ export function NavProjects({ icon: LucideIcon; }[]; }) { - const { isMobile } = useSidebar(); + const { isMobile, setOpenMobile } = useSidebar(); return ( @@ -41,7 +41,12 @@ export function NavProjects({ {projects.map((item) => ( - + { + if (isMobile) setOpenMobile(false); + }} + > {item.name} diff --git a/frontend/src/components/resource-card.tsx b/frontend/src/components/resource-card.tsx index ff7ca40..5d73421 100755 --- a/frontend/src/components/resource-card.tsx +++ b/frontend/src/components/resource-card.tsx @@ -1,4 +1,6 @@ -import { Calendar, ExternalLink, FileText, User, Youtube } from "lucide-react"; +import { Calendar, ExternalLink, FileText, User } from "lucide-react"; +import YouTubeIcon from "@/components/icons/youtube"; +import { useEffect, useRef, useState } from "react"; import { Badge } from "@/components/ui/badge"; import { Button } from "@/components/ui/button"; import { @@ -17,9 +19,34 @@ interface ResourceCardProps { } export function ResourceCard({ resource, onViewDetails }: ResourceCardProps) { + const tagsContainerRef = 
useRef(null); + const tagRefs = useRef>([]); + const [forceNewLine, setForceNewLine] = useState(false); + + useEffect(() => { + const check = () => { + const refs = tagRefs.current; + if (!refs[0] || !refs[2]) { + setForceNewLine(false); + return; + } + const sameLine = refs[0].offsetTop === refs[2].offsetTop; + setForceNewLine(sameLine); + }; + + check(); + + const ro = new ResizeObserver(check); + if (tagsContainerRef.current) ro.observe(tagsContainerRef.current); + window.addEventListener("resize", check); + return () => { + ro.disconnect(); + window.removeEventListener("resize", check); + }; + }, [resource.tags]); const getContentTypeIcon = () => { return resource.content_type === "youtube" ? ( - + ) : ( ); @@ -88,7 +115,7 @@ export function ResourceCard({ resource, onViewDetails }: ResourceCardProps) {
-
+
{getContentTypeIcon()} @@ -135,26 +162,39 @@ export function ResourceCard({ resource, onViewDetails }: ResourceCardProps) {

-
- {resource.tags.slice(0, 3).map((tag, index) => ( - - {tag} - - ))} - {resource.tags.length > 3 && ( - - +{resource.tags.length - 3} - - )} +
+
+ {resource.tags.slice(0, 3).map((tag, index) => ( + { + tagRefs.current[index] = el; + }} + className="inline-block" + > + + {tag} + + + ))} + + {resource.tags.length > 3 && ( + <> + {forceNewLine &&
} + + +{resource.tags.length - 3} + + + )} +
diff --git a/frontend/src/components/ui/alert-dialog.tsx b/frontend/src/components/ui/alert-dialog.tsx new file mode 100755 index 0000000..02e0005 --- /dev/null +++ b/frontend/src/components/ui/alert-dialog.tsx @@ -0,0 +1,139 @@ +import * as React from "react"; +import * as AlertDialogPrimitive from "@radix-ui/react-alert-dialog"; + +import { cn } from "@/lib/utils"; +import { buttonVariants } from "@/components/ui/button"; + +const AlertDialog = AlertDialogPrimitive.Root; + +const AlertDialogTrigger = AlertDialogPrimitive.Trigger; + +const AlertDialogPortal = AlertDialogPrimitive.Portal; + +const AlertDialogOverlay = React.forwardRef< + React.ElementRef, + React.ComponentPropsWithoutRef +>(({ className, ...props }, ref) => ( + +)); +AlertDialogOverlay.displayName = AlertDialogPrimitive.Overlay.displayName; + +const AlertDialogContent = React.forwardRef< + React.ElementRef, + React.ComponentPropsWithoutRef +>(({ className, ...props }, ref) => ( + + + + +)); +AlertDialogContent.displayName = AlertDialogPrimitive.Content.displayName; + +const AlertDialogHeader = ({ + className, + ...props +}: React.HTMLAttributes) => ( +
+); +AlertDialogHeader.displayName = "AlertDialogHeader"; + +const AlertDialogFooter = ({ + className, + ...props +}: React.HTMLAttributes) => ( +
+); +AlertDialogFooter.displayName = "AlertDialogFooter"; + +const AlertDialogTitle = React.forwardRef< + React.ElementRef, + React.ComponentPropsWithoutRef +>(({ className, ...props }, ref) => ( + +)); +AlertDialogTitle.displayName = AlertDialogPrimitive.Title.displayName; + +const AlertDialogDescription = React.forwardRef< + React.ElementRef, + React.ComponentPropsWithoutRef +>(({ className, ...props }, ref) => ( + +)); +AlertDialogDescription.displayName = + AlertDialogPrimitive.Description.displayName; + +const AlertDialogAction = React.forwardRef< + React.ElementRef, + React.ComponentPropsWithoutRef +>(({ className, ...props }, ref) => ( + +)); +AlertDialogAction.displayName = AlertDialogPrimitive.Action.displayName; + +const AlertDialogCancel = React.forwardRef< + React.ElementRef, + React.ComponentPropsWithoutRef +>(({ className, ...props }, ref) => ( + +)); +AlertDialogCancel.displayName = AlertDialogPrimitive.Cancel.displayName; + +export { + AlertDialog, + AlertDialogPortal, + AlertDialogOverlay, + AlertDialogTrigger, + AlertDialogContent, + AlertDialogHeader, + AlertDialogFooter, + AlertDialogTitle, + AlertDialogDescription, + AlertDialogAction, + AlertDialogCancel, +}; diff --git a/frontend/src/components/ui/progress.tsx b/frontend/src/components/ui/progress.tsx new file mode 100755 index 0000000..aefd46e --- /dev/null +++ b/frontend/src/components/ui/progress.tsx @@ -0,0 +1,26 @@ +import * as React from "react"; +import * as ProgressPrimitive from "@radix-ui/react-progress"; + +import { cn } from "@/lib/utils"; + +const Progress = React.forwardRef< + React.ElementRef, + React.ComponentPropsWithoutRef +>(({ className, value, ...props }, ref) => ( + + + +)); +Progress.displayName = ProgressPrimitive.Root.displayName; + +export { Progress }; diff --git a/frontend/src/components/ui/sheet.tsx b/frontend/src/components/ui/sheet.tsx index 1a1c723..5a5d54d 100755 --- a/frontend/src/components/ui/sheet.tsx +++ b/frontend/src/components/ui/sheet.tsx @@ -19,8 
+19,8 @@ const SheetOverlay = React.forwardRef< >(({ className, ...props }, ref) => ( , + extends React.ComponentPropsWithoutRef, VariantProps {} const SheetContent = React.forwardRef< @@ -82,7 +79,7 @@ const SheetHeader = ({
@@ -96,7 +93,7 @@ const SheetFooter = ({
diff --git a/frontend/src/components/ui/sidebar.tsx b/frontend/src/components/ui/sidebar.tsx index da73b01..d895909 100755 --- a/frontend/src/components/ui/sidebar.tsx +++ b/frontend/src/components/ui/sidebar.tsx @@ -69,7 +69,7 @@ const SidebarProvider = React.forwardRef< children, ...props }, - ref, + ref ) => { const isMobile = useIsMobile(); const [openMobile, setOpenMobile] = React.useState(false); @@ -88,10 +88,9 @@ const SidebarProvider = React.forwardRef< } // This sets the cookie to keep the sidebar state. - document.cookie = - `${SIDEBAR_COOKIE_NAME}=${openState}; path=/; max-age=${SIDEBAR_COOKIE_MAX_AGE}`; + document.cookie = `${SIDEBAR_COOKIE_NAME}=${openState}; path=/; max-age=${SIDEBAR_COOKIE_MAX_AGE}`; }, - [setOpenProp, open], + [setOpenProp, open] ); // Helper to toggle the sidebar. @@ -131,29 +130,23 @@ const SidebarProvider = React.forwardRef< setOpenMobile, toggleSidebar, }), - [ - state, - open, - setOpen, - isMobile, - openMobile, - setOpenMobile, - toggleSidebar, - ], + [state, open, setOpen, isMobile, openMobile, setOpenMobile, toggleSidebar] ); return (
); - }, + } ); SidebarProvider.displayName = "SidebarProvider"; @@ -184,7 +177,7 @@ const Sidebar = React.forwardRef< children, ...props }, - ref, + ref ) => { const { isMobile, state, openMobile, setOpenMobile } = useSidebar(); @@ -193,7 +186,7 @@ const Sidebar = React.forwardRef<
button]:hidden", + // Use theme-aware sidebar background and a subtle backdrop blur + shadow + "bg-sidebar backdrop-blur-sm shadow-2xl" + )} + style={ + { "--sidebar-width": SIDEBAR_WIDTH_MOBILE } as React.CSSProperties + } side={side} > @@ -242,7 +239,7 @@ const Sidebar = React.forwardRef< "group-data-[side=right]:rotate-180", variant === "floating" || variant === "inset" ? "group-data-[collapsible=icon]:w-[calc(var(--sidebar-width-icon)_+_theme(spacing.4))]" - : "group-data-[collapsible=icon]:w-[--sidebar-width-icon]", + : "group-data-[collapsible=icon]:w-[--sidebar-width-icon]" )} />
@@ -268,7 +265,7 @@ const Sidebar = React.forwardRef<
); - }, + } ); Sidebar.displayName = "Sidebar"; @@ -319,7 +316,7 @@ const SidebarRail = React.forwardRef< "group-data-[collapsible=offcanvas]:translate-x-0 group-data-[collapsible=offcanvas]:after:left-full group-data-[collapsible=offcanvas]:hover:bg-sidebar", "[[data-side=left][data-collapsible=offcanvas]_&]:-right-2", "[[data-side=right][data-collapsible=offcanvas]_&]:-left-2", - className, + className )} {...props} /> @@ -337,7 +334,7 @@ const SidebarInset = React.forwardRef< className={cn( "relative flex w-full flex-1 flex-col bg-background", "md:peer-data-[variant=inset]:m-2 md:peer-data-[state=collapsed]:peer-data-[variant=inset]:ml-2 md:peer-data-[variant=inset]:ml-0 md:peer-data-[variant=inset]:rounded-xl md:peer-data-[variant=inset]:shadow", - className, + className )} {...props} /> @@ -355,7 +352,7 @@ const SidebarInput = React.forwardRef< data-sidebar="input" className={cn( "h-8 w-full bg-background shadow-none focus-visible:ring-2 focus-visible:ring-sidebar-ring", - className, + className )} {...props} /> @@ -418,7 +415,7 @@ const SidebarContent = React.forwardRef< data-sidebar="content" className={cn( "flex min-h-0 flex-1 flex-col gap-2 overflow-auto group-data-[collapsible=icon]:overflow-hidden", - className, + className )} {...props} /> @@ -454,7 +451,7 @@ const SidebarGroupLabel = React.forwardRef< className={cn( "flex h-8 shrink-0 items-center rounded-md px-2 text-xs font-medium text-sidebar-foreground/70 outline-none ring-sidebar-ring transition-[margin,opacity] duration-200 ease-linear focus-visible:ring-2 [&>svg]:size-4 [&>svg]:shrink-0", "group-data-[collapsible=icon]:-mt-8 group-data-[collapsible=icon]:opacity-0", - className, + className )} {...props} /> @@ -477,7 +474,7 @@ const SidebarGroupAction = React.forwardRef< // Increases the hit area of the button on mobile. 
"after:absolute after:-inset-2 after:md:hidden", "group-data-[collapsible=icon]:hidden", - className, + className )} {...props} /> @@ -543,7 +540,7 @@ const sidebarMenuButtonVariants = cva( variant: "default", size: "default", }, - }, + } ); const SidebarMenuButton = React.forwardRef< @@ -564,7 +561,7 @@ const SidebarMenuButton = React.forwardRef< className, ...props }, - ref, + ref ) => { const Comp = asChild ? Slot : "button"; const { isMobile, state } = useSidebar(); @@ -601,7 +598,7 @@ const SidebarMenuButton = React.forwardRef< /> ); - }, + } ); SidebarMenuButton.displayName = "SidebarMenuButton"; @@ -628,7 +625,7 @@ const SidebarMenuAction = React.forwardRef< "group-data-[collapsible=icon]:hidden", showOnHover && "group-focus-within/menu-item:opacity-100 group-hover/menu-item:opacity-100 data-[state=open]:opacity-100 peer-data-[active=true]/menu-button:text-sidebar-accent-foreground md:opacity-0", - className, + className )} {...props} /> @@ -650,7 +647,7 @@ const SidebarMenuBadge = React.forwardRef< "peer-data-[size=default]/menu-button:top-1.5", "peer-data-[size=lg]/menu-button:top-2.5", "group-data-[collapsible=icon]:hidden", - className, + className )} {...props} /> @@ -684,9 +681,11 @@ const SidebarMenuSkeleton = React.forwardRef<
); @@ -703,7 +702,7 @@ const SidebarMenuSub = React.forwardRef< className={cn( "mx-3.5 flex min-w-0 translate-x-px flex-col gap-1 border-l border-sidebar-border px-2.5 py-0.5", "group-data-[collapsible=icon]:hidden", - className, + className )} {...props} /> @@ -738,7 +737,7 @@ const SidebarMenuSubButton = React.forwardRef< size === "sm" && "text-xs", size === "md" && "text-sm", "group-data-[collapsible=icon]:hidden", - className, + className )} {...props} /> diff --git a/frontend/src/components/ui/toast.tsx b/frontend/src/components/ui/toast.tsx index d3c3b75..7bad056 100755 --- a/frontend/src/components/ui/toast.tsx +++ b/frontend/src/components/ui/toast.tsx @@ -15,7 +15,7 @@ const ToastViewport = React.forwardRef< ref={ref} className={cn( "fixed top-0 z-[100] flex max-h-screen w-full flex-col-reverse p-4 sm:bottom-0 sm:right-0 sm:top-auto sm:flex-col md:max-w-[420px]", - className, + className )} {...props} /> @@ -30,18 +30,22 @@ const toastVariants = cva( default: "border bg-background text-foreground", destructive: "destructive group border-destructive bg-destructive text-destructive-foreground", + success: + "success group border-success bg-success text-success-foreground", + primary: + "primary group border-primary bg-primary text-primary-foreground", }, }, defaultVariants: { variant: "default", }, - }, + } ); const Toast = React.forwardRef< React.ElementRef, - & React.ComponentPropsWithoutRef - & VariantProps + React.ComponentPropsWithoutRef & + VariantProps >(({ className, variant, ...props }, ref) => { return ( @@ -75,8 +79,8 @@ const ToastClose = React.forwardRef< {toasts.map(function ({ id, title, description, action, ...props }) { diff --git a/frontend/src/data/t.txt b/frontend/src/data/t.txt deleted file mode 100755 index 30d74d2..0000000 --- a/frontend/src/data/t.txt +++ /dev/null @@ -1 +0,0 @@ -test \ No newline at end of file diff --git a/frontend/src/hooks/use-job-notifications.tsx b/frontend/src/hooks/use-job-notifications.tsx new file mode 
100755 index 0000000..82e0969 --- /dev/null +++ b/frontend/src/hooks/use-job-notifications.tsx @@ -0,0 +1,628 @@ +import { useState, useEffect, useCallback, useRef } from "react"; +import { useToast } from "@/hooks/use-toast"; +import { JobService } from "@/services/jobService"; +import type { Job, JobStep, JobNotification, JobWithSteps } from "@/types/job"; +import type { WorkflowErrorNotification } from "@/types/workflowError"; +import { useReliableRealtime } from "./use-reliable-realtime"; + +export interface UseJobNotificationsOptions { + showToasts?: boolean; + userId?: string; +} + +interface PollerEvent { + source: string; + payload: { + new: Job | JobStep; + old: Job | JobStep | Record; + event: string; + }; + isSynthetic: boolean; +} + +export function useJobNotifications(options: UseJobNotificationsOptions = {}) { + const { showToasts = true, userId } = options; + const { toast } = useToast(); + const [jobs, setJobs] = useState([]); + const [isLoading, setIsLoading] = useState(true); + const [error, setError] = useState(null); + const lastStatusNotifiedRef = useRef>({}); + const lastStepNotifiedRef = useRef>(new Set()); + + /** + * Maps technical node names to user-friendly names for better error messages + */ + const mapNodeNameToUserFriendly = useCallback((nodeName: string): string => { + const baseNodeName = nodeName.replace(/\d+$/, "").trim(); + + // AI/LLM nodes + if (baseNodeName === "Basic LLM Chain") return "AI Model"; + if (baseNodeName === "OpenRouter model") return "AI Model"; + if (baseNodeName === "OpenRouter fallback model") return "AI Backup Model"; + if (baseNodeName === "Structured Output Parser") + return "AI Response Processing"; + + // Content extraction nodes + if (nodeName === "Get video transcription") return "Video Transcription"; + if (nodeName === "Scrape video") return "Video Content Extraction"; + if (nodeName === "Scrape website") return "Website Content Extraction"; + + // Database nodes + if (baseNodeName === "Create a row") 
return "Database Save"; + if (nodeName === "Get a row") return "Database Check"; + + // Processing nodes + if (nodeName === "Get video ID") return "Video Processing"; + if (nodeName.includes("Get Title, Author, Published Date")) + return "Content Analysis"; + if (nodeName.includes("Get Resource Language")) return "Language Detection"; + if (nodeName === "Set Resource Language") return "Language Processing"; + + // Duplicate checking nodes + if (nodeName.includes("Check If Resource Is In Current User Collection")) + return "Duplicate Check"; + if (nodeName.includes("Check If Current User Already Has Resource")) + return "Duplicate Check"; + if (nodeName === "Check if resource link is not already in database") + return "Duplicate Check"; + + // Control flow nodes + if (nodeName === "Check if YouTube Video") return "Content Type Detection"; + if (nodeName === "Switch") return "Content Processing"; + if (nodeName === "Same Language") return "Language Processing"; + + // Update step nodes (map to their corresponding step) + if (nodeName.includes("Update Step - ")) { + const stepName = nodeName.replace("Update Step - ", ""); + if (stepName.includes("Duplicates Check")) return "Duplicate Check"; + if (stepName.includes("Content Type Detection")) + return "Content Type Detection"; + if (stepName.includes("Content Extracted")) return "Content Extraction"; + if (stepName.includes("AI Complete")) return "AI Processing"; + if (stepName.includes("Database Save")) return "Database Save"; + if (stepName.includes("Handle Duplicates")) return "Duplicate Handling"; + return stepName; // fallback to step name without prefix + } + + const cleanName = nodeName.replace(/\d+$/, "").trim(); + + return cleanName || "Workflow Node"; + }, []); + + const loadJobs = useCallback(async () => { + if (!userId) return; + + try { + setIsLoading(true); + const userJobs = await JobService.getUserJobs(); + setJobs(userJobs); + setError(null); + } catch (err) { + setError(err instanceof Error ? 
err.message : "Failed to load jobs"); + } finally { + setIsLoading(false); + } + }, [userId]); + + const activeJobs = jobs.filter( + (job) => job.status === "running" || job.status === "pending" + ); + + const recentJobs = jobs.filter((job) => { + const today = new Date().toDateString(); + const jobDate = new Date(job.created_at).toDateString(); + return jobDate === today; + }); + + const showNotification = useCallback( + ( + notification: (JobNotification | WorkflowErrorNotification) & { + customVariant?: "default" | "destructive" | "success" | "primary"; + } + ) => { + if (!showToasts) return; + + const getToastVariant = ( + type: JobNotification["type"] | "workflow_error" + ) => { + switch (type) { + case "job_completed": + return "default"; + case "job_failed": + case "workflow_error": + return "destructive"; + case "step_updated": + // If the step itself failed, show destructive styling + return "default"; + default: + return "default"; + } + }; + + const getToastTitle = ( + notification: (JobNotification | WorkflowErrorNotification) & { + customVariant?: "default" | "destructive" | "success" | "primary"; + } + ) => { + if (notification.customVariant === "primary") { + return "Resource Already Available"; + } + + switch (notification.type) { + case "job_created": + return `Workflow started`; + case "step_updated": { + const stepNotification = notification as JobNotification; + if (stepNotification.step?.status === "running") + return `Step: ${stepNotification.step.step_name}`; + if (stepNotification.step?.status === "completed") + return `Step completed: ${stepNotification.step.step_name}`; + if (stepNotification.step?.status === "failed") + return `Step failed: ${stepNotification.step.step_name}`; + return `Step: ${stepNotification.step?.step_name ?? 
""}`; + } + case "job_failed": + return "Workflow failed"; + case "workflow_error": { + const workflowNotification = + notification as WorkflowErrorNotification; + const nodeName = + workflowNotification.workflowError.error_node || "Node Failed"; + const userFriendlyName = mapNodeNameToUserFriendly(nodeName); + return `Workflow Error: ${userFriendlyName}`; + } + default: + return "Notification"; + } + }; + + const desc = + notification.message && notification.message.trim().length > 0 + ? notification.message + : undefined; + console.log("[JobNotifications] Trigger toast:", { + type: notification.type, + title: getToastTitle(notification), + message: notification.message, + }); + + // Determine variant + let variant = + notification.customVariant || + (getToastVariant(notification.type) as + | "default" + | "destructive" + | "success" + | "primary" + | undefined); + if ( + !notification.customVariant && + notification.type === "step_updated" && + (notification as JobNotification).step?.status === "failed" + ) { + variant = "destructive"; + } + + toast({ + title: getToastTitle(notification), + description: desc, + variant, + duration: + notification.type === "step_updated" && + (notification as JobNotification).step?.status === "running" + ? 
15000 + : 12000, + }); + }, + [showToasts, toast, mapNodeNameToUserFriendly] + ); + + const updateLocalJob = useCallback((updatedJob: Job) => { + setJobs((prevJobs) => { + const existingIndex = prevJobs.findIndex( + (job) => job.id === updatedJob.id + ); + if (existingIndex >= 0) { + const newJobs = [...prevJobs]; + newJobs[existingIndex] = updatedJob; + return newJobs; + } else { + return [updatedJob, ...prevJobs]; + } + }); + }, []); + + // --- Unified event handler (realtime + polling synthetic) --- + const handleEvent = useCallback( + async (evt: { source: string; payload: any; isSynthetic: boolean }) => { + if (!userId) return; + const payload = evt.payload; + + // Handle workflow error events + if (evt.source === "workflow_errors") { + const workflowError = payload.new; + const eventType = (payload as any).eventType ?? (payload as any).event; + + if (eventType === "INSERT" && workflowError.user_id === userId) { + showNotification({ + type: "workflow_error", + workflowError, + message: + workflowError.error_message || + "An error occurred during workflow execution", + }); + } + return; + } + + // Handle existing job/job_step events + if (evt.source === "jobs") { + const job = payload.new as Job; + const oldJob = payload.old as Job; + const eventType = (payload as any).eventType ?? 
(payload as any).event; + + if (eventType === "INSERT") { + updateLocalJob(job); + showNotification({ + type: "job_created", + job, + message: `Workflow "${job.workflow_name}" started (status: ${job.status}).`, + }); + lastStatusNotifiedRef.current[job.id] = job.status; + } else if (eventType === "UPDATE") { + updateLocalJob(job); + const alreadyNotifiedStatus = + lastStatusNotifiedRef.current[job.id] === job.status; + const statusActuallyChanged = oldJob?.status !== job.status; + if (!alreadyNotifiedStatus || statusActuallyChanged) { + if (job.status === "failed") { + showNotification({ + type: "job_failed", + job, + message: + job.error_message || `Workflow "${job.workflow_name}" failed`, + }); + } + lastStatusNotifiedRef.current[job.id] = job.status; + } + } + } else if (evt.source === "job_steps") { + const step = payload.new as JobStep; + const oldStep = payload.old as JobStep; + const eventType = (payload as any).eventType ?? (payload as any).event; + const statusChanged = oldStep?.status !== step.status; + const isInteresting = ["running", "completed", "failed"].includes( + step.status + ); + if ( + statusChanged && + isInteresting && + (eventType === "UPDATE" || eventType === undefined) + ) { + try { + const jobWithSteps = await JobService.getJobWithSteps(step.job_id); + if (jobWithSteps && jobWithSteps.user_id === userId) { + if (jobWithSteps.status === "failed") return; // skip if job already failed + const stepsArr = jobWithSteps.steps || []; + const total = stepsArr.length; + const completedSteps = stepsArr.filter((s: JobStep) => + ["completed", "skipped"].includes(s.status) + ).length; + + // Smart next step logic to handle conditional workflows + // If there are completed steps with higher step_order than pending steps, + // those pending steps should be considered as skipped/obsolete + const completedStepsArray = stepsArr.filter((s: JobStep) => + ["completed", "skipped"].includes(s.status) + ); + const maxCompletedOrder = + completedStepsArray.length 
> 0 + ? Math.max( + ...completedStepsArray.map((s: JobStep) => s.step_order) + ) + : 0; + + // First, look for running steps (highest priority) + let nextStep = stepsArr + .filter((s: JobStep) => s.status === "running") + .sort( + (a: JobStep, b: JobStep) => a.step_order - b.step_order + )[0]; + + // If no running steps, look for pending steps after the highest completed step + if (!nextStep) { + nextStep = stepsArr + .filter( + (s: JobStep) => + s.status === "pending" && s.step_order > maxCompletedOrder + ) + .sort( + (a: JobStep, b: JobStep) => a.step_order - b.step_order + )[0]; + } + + // Fallback: if still no step found, use original logic (for edge cases) + if (!nextStep) { + nextStep = stepsArr + .filter((s: JobStep) => + ["running", "pending"].includes(s.status) + ) + .sort( + (a: JobStep, b: JobStep) => a.step_order - b.step_order + )[0]; + } + + const isLastStepJustCompleted = + step.status === "completed" && completedSteps === total; + + // Handle special cases for duplicate handling + const isSameUserDuplicate = + step.step_name === "Handle Duplicates: Same User" && + step.status === "completed"; + const isDifferentUserDuplicate = + step.step_name === "Handle Duplicates: Different User" && + step.status === "completed"; + const isDifferentUserFailed = + step.step_name.includes("Different User") && + step.status === "failed"; + + console.log("[JobNotifications] Step analysis:", { + stepName: step.step_name, + status: step.status, + isSameUserDuplicate, + isDifferentUserDuplicate, + isDifferentUserFailed, + metadata: step.metadata, + isLastStepJustCompleted, + maxCompletedOrder, + nextStepName: nextStep?.step_name, + nextStepOrder: nextStep?.step_order, + }); + + let description = ""; + let shouldShowToast = true; + let customVariant: + | "default" + | "destructive" + | "success" + | "primary" + | undefined = undefined; + + if (step.status === "running") { + description = "Running..."; + } else if (step.status === "failed") { + if (isDifferentUserFailed) 
{ + // Handle failure case for Different User duplicate handling + description = "Failed to process resource"; + customVariant = "destructive"; + } else { + description = "Failed."; + } + } else if (isSameUserDuplicate) { + // Resource already in user's collection - show info toast and redirect + const resourceId = step.metadata?.reference_id; + const resourceTitle = step.metadata?.reference_title; + + if (resourceId && resourceTitle) { + description = `Resource already in your collection: ${resourceTitle}`; + customVariant = "primary"; + + // Redirect to resource detail page + setTimeout(() => { + window.location.href = `/resource/${resourceId}`; + }, 2000); + } else { + description = "Resource already in your collection"; + customVariant = "primary"; + } + } else if (isDifferentUserDuplicate) { + // Resource copied from another user - suppress success toast + shouldShowToast = false; // Success job step toast suppressed + } else if (step.status === "completed") { + if (isLastStepJustCompleted) { + // Regular workflow completion + description = + "Resource successfully added to your collection"; + customVariant = "success"; + + // Reload page after short delay + setTimeout(() => { + window.location.reload(); + }, 2000); + } else if (nextStep) { + description = `Currently running step: ${nextStep.step_name}`; + } + } + + const stepKey = `${step.job_id}:${step.id}:${step.status}`; + if ( + !lastStepNotifiedRef.current.has(stepKey) && + shouldShowToast + ) { + if ( + !( + isLastStepJustCompleted && + step.status === "completed" && + !isSameUserDuplicate && + !isDifferentUserDuplicate + ) + ) { + const notif = { + type: "step_updated" as const, + job: jobWithSteps, + step, + message: description, + customVariant, + }; + showNotification(notif); + } + lastStepNotifiedRef.current.add(stepKey); + } + } + } catch (e) { + console.error("[JobNotifications] Error handling step event", e); + } + } + } + }, + [userId, updateLocalJob, showNotification] + ); + + // Poller builds 
synthetic UPDATE-like events by diffing recent state + const previousJobsRef = useRef<Record<string, Job>>({}); + const previousStepsRef = useRef<Record<string, JobStep>>({}); + const poller = useCallback(async (): Promise<PollerEvent[]> => { + if (!userId) return []; + try { + const jobs = await JobService.getUserJobs(); + const events: PollerEvent[] = []; + jobs.forEach((job: JobWithSteps) => { + const prev = previousJobsRef.current[job.id]; + if (!prev) { + events.push({ + source: "jobs", + payload: { new: job, old: {}, event: "INSERT" }, + isSynthetic: true, + }); + } else if (prev.status !== job.status) { + events.push({ + source: "jobs", + payload: { new: job, old: prev, event: "UPDATE" }, + isSynthetic: true, + }); + } + previousJobsRef.current[job.id] = job; + // Steps + (job.steps || []).forEach((step: JobStep) => { + const stepKey = step.id; + const prevStep = previousStepsRef.current[stepKey]; + if (!prevStep) { + // treat as running update when not pending (skip initial pending noise) + if (["running", "completed", "failed"].includes(step.status)) { + events.push({ + source: "job_steps", + payload: { new: step, old: {}, event: "UPDATE" }, + isSynthetic: true, + }); + } + } else if (prevStep.status !== step.status) { + events.push({ + source: "job_steps", + payload: { new: step, old: prevStep, event: "UPDATE" }, + isSynthetic: true, + }); + } + previousStepsRef.current[stepKey] = step; + }); + }); + return events; + } catch (e) { + console.warn("[JobNotifications] Poller error", e); + return []; + } + }, [userId]); + + useReliableRealtime({ + sources: userId + ? [ + { + key: "jobs", + filter: { + event: "*", + schema: "public", + table: "jobs", + filter: `user_id=eq.${userId}`, + }, + }, + { + key: "job_steps", + filter: { event: "UPDATE", schema: "public", table: "job_steps" }, + }, + { + key: "workflow_errors", + filter: { + event: "INSERT", + schema: "public", + table: "workflow_errors", + filter: `user_id=eq.${userId}`, + }, + }, + ] + : [], + poller, // fallback + onEvent: handleEvent, + }); + + useEffect(() => { + loadJobs(); + }, [loadJobs]); + + const getJobProgress = useCallback( + async ( + jobId: string + ): Promise<{ completed: number; total: number; percentage: number }> => { + try { + const jobWithSteps = await JobService.getJobWithSteps(jobId); + if (!jobWithSteps) return { completed: 0, total: 0, percentage: 0 }; + + const completedSteps = jobWithSteps.steps.filter( + (step) => step.status === "completed" || step.status === "skipped" + ).length; + const totalSteps = jobWithSteps.steps.length; + const percentage = + totalSteps > 0 ? Math.round((completedSteps / totalSteps) * 100) : 0; + + return { completed: completedSteps, total: totalSteps, percentage }; + } catch (error) { + console.error("Error calculating job progress:", error); + return { completed: 0, total: 0, percentage: 0 }; + } + }, + [] + ); + + const createJob = useCallback( + async (request: Parameters<typeof JobService.createJob>[0]) => { + if (!userId) throw new Error("User not authenticated"); + + try { + const jobWithSteps = await JobService.createJob(request); + await loadJobs(); // Refresh jobs list + return jobWithSteps.id; + } catch (error) { + setError( + error instanceof Error ? error.message : "Failed to create job" + ); + throw error; + } + }, + [userId, loadJobs] + ); + + const cancelJob = useCallback( + async (jobId: string) => { + try { + await JobService.cancelJob(jobId); + await loadJobs(); // Refresh jobs list + } catch (error) { + setError( + error instanceof Error ?
error.message : "Failed to cancel job" + ); + throw error; + } + }, + [loadJobs] + ); + + return { + jobs, + activeJobs, + recentJobs, + isLoading, + error, + loadJobs, + createJob, + cancelJob, + getJobProgress, + }; +} diff --git a/frontend/src/hooks/use-reliable-realtime.ts b/frontend/src/hooks/use-reliable-realtime.ts new file mode 100755 index 0000000..3ed863f --- /dev/null +++ b/frontend/src/hooks/use-reliable-realtime.ts @@ -0,0 +1,231 @@ +import { useCallback, useEffect, useRef, useState } from 'react'; +import { supabase } from '@/lib/supabase'; +import type { RealtimeChannel } from '@supabase/supabase-js'; + +/** + * Reliable realtime hook with automatic HTTP polling fallback. + * + * Sections: + * - subscribeRealtime: Establish websocket channels + * - unsubscribe (cleanupChannels): Remove channels + * - startPolling / stopPolling: Fallback interval logic + * - error handling: track failures & switch modes + * - reconnection: exponential backoff + online listener + */ + +export type ConnectionMode = 'connected' | 'polling' | 'error' | 'initializing'; + +export interface PostgresChangeFilter { + event: string; // 'INSERT' | 'UPDATE' | 'DELETE' | '*' + schema: string; // usually 'public' + table: string; + filter?: string; // e.g. 
user_id=eq.xxx +} + +export interface RealtimeSource { + /** Logical key used for synthetic events coming from this source */ + key: string; + /** Postgres Changes filter definition */ + filter: PostgresChangeFilter; +} + +export interface SyntheticEvent<T = unknown> { + source: string; // key of source + payload: T; // domain payload (can mimic supabase payload shape) + isSynthetic: boolean; // true if generated via polling +} + +export interface UseReliableRealtimeConfig { + sources: RealtimeSource[]; + pollIntervalMs?: number; + poller?: () => Promise<SyntheticEvent[]>; // produce synthetic delta events + onEvent: (evt: SyntheticEvent) => void; // consumer handler (toast logic lives elsewhere) + maxFailuresBeforeBackoff?: number; +} + +export interface UseReliableRealtimeState { + mode: ConnectionMode; + error: string | null; + lastActivityAt: number | null; + failures: number; + forceReconnect: () => void; +} + +export function useReliableRealtime(config: UseReliableRealtimeConfig): UseReliableRealtimeState { + const { sources, pollIntervalMs = 5000, poller, onEvent, maxFailuresBeforeBackoff = 3 } = config; + + const [mode, setMode] = useState<ConnectionMode>('initializing'); + const [error, setError] = useState<string | null>(null); + const [failures, setFailures] = useState(0); + const [lastActivityAt, setLastActivityAt] = useState<number | null>(null); + + const channelsRef = useRef<Record<string, RealtimeChannel>>({}); + const pollTimerRef = useRef<number | null>(null); + const reconnectTimerRef = useRef<number | null>(null); + const isMountedRef = useRef(true); + // Track current subscription generation to ignore stale channel callbacks + const generationRef = useRef(0); + // Track last logged status per channel (per generation) to avoid log spam + const lastStatusRef = useRef<Record<string, string>>({}); + // Mirror mode in ref for access inside callbacks without stale closure + const modeRef = useRef(mode); + useEffect(() => { modeRef.current = mode; }, [mode]); + // Track last time a (channel,status) was logged across generations to throttle noisy repeats + const statusLogTrackerRef = useRef<Record<string, { status: string; lastLogged: number }>>({}); + + //
---------------- Polling ---------------- + const clearPolling = () => { + if (pollTimerRef.current) { + window.clearInterval(pollTimerRef.current); + pollTimerRef.current = null; + } + }; + + const startPolling = useCallback(() => { + if (!poller) return; // nothing to do + if (pollTimerRef.current) return; // already polling + setMode((m) => (m === 'connected' ? m : 'polling')); + pollTimerRef.current = window.setInterval(async () => { + try { + const events = await poller(); + events.forEach((evt) => { + setLastActivityAt(Date.now()); + onEvent({ ...evt, isSynthetic: true }); + }); + } catch (e: any) { + console.warn('[useReliableRealtime] Poller error', e); + } + }, pollIntervalMs) as unknown as number; + }, [poller, pollIntervalMs, onEvent]); + + const stopPolling = useCallback(() => { + clearPolling(); + }, []); + + // ---------------- Subscription ---------------- + const cleanupChannels = useCallback(() => { + Object.values(channelsRef.current).forEach((ch) => { + try { supabase.removeChannel(ch); } catch (_) { /* noop */ } + }); + channelsRef.current = {}; + }, []); + + const subscribeRealtime = useCallback(() => { + cleanupChannels(); + if (!sources.length) return; + setMode((m) => (m === 'polling' ? 
m : 'initializing')); + let subscribedCount = 0; + let aborted = false; + generationRef.current += 1; + const currentGeneration = generationRef.current; + lastStatusRef.current = {}; + + sources.forEach((source) => { + const channel = supabase + .channel(`reliable-${source.key}`) + .on('postgres_changes', source.filter as any, (payload) => { + if (!isMountedRef.current) return; + // Ignore events from stale generations + if (generationRef.current !== currentGeneration) return; + setLastActivityAt(Date.now()); + onEvent({ source: source.key, payload, isSynthetic: false }); + }) + .subscribe((status, err) => { + if (!isMountedRef.current) return; + // Ignore callbacks from stale generations + if (generationRef.current !== currentGeneration) return; + if (err) { + if (lastStatusRef.current[source.key] !== 'ERROR') { + console.warn('[useReliableRealtime] Channel error', source.key, err); + lastStatusRef.current[source.key] = 'ERROR'; + } + aborted = true; + setError(err.message || 'Channel error'); + setFailures((f) => f + 1); + setMode('polling'); + startPolling(); + return; + } + if (status === 'SUBSCRIBED') { + subscribedCount += 1; + if (subscribedCount === sources.length && !aborted) { + setMode('connected'); + setError(null); + setFailures(0); + stopPolling(); + } + } else if (['TIMED_OUT', 'CHANNEL_ERROR'].includes(status) || (status === 'CLOSED' && modeRef.current === 'connected')) { + // Throttle logging for very chatty CLOSED statuses: once per channel per 60s + const globalKey = source.key; + const tracker = statusLogTrackerRef.current[globalKey]; + const now = Date.now(); + const shouldLog = !tracker || tracker.status !== status || (now - tracker.lastLogged) > 60000; + if (shouldLog) { + console.warn('[useReliableRealtime] Channel status issue', source.key, status); + statusLogTrackerRef.current[globalKey] = { status, lastLogged: now }; + } + // Also ensure per-generation suppression for subsequent repeats within same generation + 
lastStatusRef.current[source.key] = status; + // Move to polling (degraded) only once + if (modeRef.current !== 'polling') { + modeRef.current = 'polling'; // proactively update ref to avoid race causing repeated CLOSED logs + setMode('polling'); + startPolling(); + } + } + }); + channelsRef.current[source.key] = channel; + }); + }, [sources, cleanupChannels, onEvent, startPolling, stopPolling]); + + // ---------------- Reconnection ---------------- + const scheduleReconnect = useCallback(() => { + if (!isMountedRef.current) return; + if (reconnectTimerRef.current) window.clearTimeout(reconnectTimerRef.current); + const backoffMs = Math.min(30000, 1000 * Math.pow(2, Math.max(0, failures - maxFailuresBeforeBackoff))); + reconnectTimerRef.current = window.setTimeout(() => { + if (navigator.onLine) { + subscribeRealtime(); + } else { + scheduleReconnect(); // keep waiting + } + }, backoffMs) as unknown as number; + }, [failures, maxFailuresBeforeBackoff, subscribeRealtime]); + + const forceReconnect = useCallback(() => { + setFailures(0); + subscribeRealtime(); + }, [subscribeRealtime]); + + // ---------------- Lifecycle ---------------- + const teardown = useCallback(() => { + cleanupChannels(); + stopPolling(); + if (reconnectTimerRef.current) window.clearTimeout(reconnectTimerRef.current); + }, [cleanupChannels, stopPolling]); + + useEffect(() => { + isMountedRef.current = true; + subscribeRealtime(); + if (poller) startPolling(); // safety net until realtime established + return () => { + isMountedRef.current = false; + teardown(); + }; + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [JSON.stringify(sources)]); + + // Reconnect when browser returns online + useEffect(() => { + const handleOnline = () => { if (mode === 'polling') scheduleReconnect(); }; + window.addEventListener('online', handleOnline); + return () => window.removeEventListener('online', handleOnline); + }, [mode, scheduleReconnect]); + + // If stuck in polling, keep scheduling 
reconnect attempts. + useEffect(() => { + if (mode === 'polling') scheduleReconnect(); + }, [mode, failures, scheduleReconnect]); + + return { mode, error, lastActivityAt, failures, forceReconnect }; +} diff --git a/frontend/src/hooks/useReliableRealtime.ts b/frontend/src/hooks/useReliableRealtime.ts new file mode 100755 index 0000000..9719836 --- /dev/null +++ b/frontend/src/hooks/useReliableRealtime.ts @@ -0,0 +1,221 @@ +import { useCallback, useEffect, useRef, useState } from 'react'; +import { supabase } from '@/lib/supabase'; +import type { RealtimeChannel } from '@supabase/supabase-js'; + +/** + * Reliable realtime hook with automatic HTTP polling fallback. + * + * Sections: + * - subscribeRealtime: Establish websocket channels + * - unsubscribe (cleanupChannels): Remove channels + * - startPolling / stopPolling: Fallback interval logic + * - error handling: track failures & switch modes + * - reconnection: exponential backoff + online listener + */ + +export type ConnectionMode = 'connected' | 'polling' | 'error' | 'initializing'; + +export interface PostgresChangeFilter { + event: string; // 'INSERT' | 'UPDATE' | 'DELETE' | '*' + schema: string; // usually 'public' + table: string; + filter?: string; // e.g. 
user_id=eq.xxx +} + +export interface RealtimeSource { + /** Logical key used for synthetic events coming from this source */ + key: string; + /** Postgres Changes filter definition */ + filter: PostgresChangeFilter; +} + +export interface SyntheticEvent<T = unknown> { + source: string; // key of source + payload: T; // domain payload (can mimic supabase payload shape) + isSynthetic: boolean; // true if generated via polling +} + +export interface UseReliableRealtimeConfig { + sources: RealtimeSource[]; + pollIntervalMs?: number; + poller?: () => Promise<SyntheticEvent[]>; // produce synthetic delta events + onEvent: (evt: SyntheticEvent) => void; // consumer handler (toast logic lives elsewhere) + maxFailuresBeforeBackoff?: number; +} + +export interface UseReliableRealtimeState { + mode: ConnectionMode; + error: string | null; + lastActivityAt: number | null; + failures: number; + forceReconnect: () => void; +} + +export function useReliableRealtime(config: UseReliableRealtimeConfig): UseReliableRealtimeState { + const { sources, pollIntervalMs = 5000, poller, onEvent, maxFailuresBeforeBackoff = 3 } = config; + + const [mode, setMode] = useState<ConnectionMode>('initializing'); + const [error, setError] = useState<string | null>(null); + const [failures, setFailures] = useState(0); + const [lastActivityAt, setLastActivityAt] = useState<number | null>(null); + + const channelsRef = useRef<Record<string, RealtimeChannel>>({}); + const pollTimerRef = useRef<number | null>(null); + const reconnectTimerRef = useRef<number | null>(null); + const isMountedRef = useRef(true); + // Track current subscription generation to ignore stale channel callbacks + const generationRef = useRef(0); + // Track last logged status per channel (per generation) to avoid log spam + const lastStatusRef = useRef<Record<string, string>>({}); + // Mirror mode in ref for access inside callbacks without stale closure + const modeRef = useRef(mode); + useEffect(() => { modeRef.current = mode; }, [mode]); + + // ---------------- Polling ---------------- + const clearPolling = () => { + if (pollTimerRef.current) {
window.clearInterval(pollTimerRef.current); + pollTimerRef.current = null; + } + }; + + const startPolling = useCallback(() => { + if (!poller) return; // nothing to do + if (pollTimerRef.current) return; // already polling + setMode((m) => (m === 'connected' ? m : 'polling')); + pollTimerRef.current = window.setInterval(async () => { + try { + const events = await poller(); + events.forEach((evt) => { + setLastActivityAt(Date.now()); + onEvent({ ...evt, isSynthetic: true }); + }); + } catch (e: any) { + console.warn('[useReliableRealtime] Poller error', e); + } + }, pollIntervalMs) as unknown as number; + }, [poller, pollIntervalMs, onEvent]); + + const stopPolling = useCallback(() => { + clearPolling(); + }, []); + + // ---------------- Subscription ---------------- + const cleanupChannels = useCallback(() => { + Object.values(channelsRef.current).forEach((ch) => { + try { supabase.removeChannel(ch); } catch (_) { /* noop */ } + }); + channelsRef.current = {}; + }, []); + + const subscribeRealtime = useCallback(() => { + cleanupChannels(); + if (!sources.length) return; + setMode((m) => (m === 'polling' ? 
m : 'initializing')); + let subscribedCount = 0; + let aborted = false; + generationRef.current += 1; + const currentGeneration = generationRef.current; + lastStatusRef.current = {}; + + sources.forEach((source) => { + const channel = supabase + .channel(`reliable-${source.key}`) + .on('postgres_changes', source.filter as any, (payload) => { + if (!isMountedRef.current) return; + // Ignore events from stale generations + if (generationRef.current !== currentGeneration) return; + setLastActivityAt(Date.now()); + onEvent({ source: source.key, payload, isSynthetic: false }); + }) + .subscribe((status, err) => { + if (!isMountedRef.current) return; + // Ignore callbacks from stale generations + if (generationRef.current !== currentGeneration) return; + if (err) { + if (lastStatusRef.current[source.key] !== 'ERROR') { + console.warn('[useReliableRealtime] Channel error', source.key, err); + lastStatusRef.current[source.key] = 'ERROR'; + } + aborted = true; + setError(err.message || 'Channel error'); + setFailures((f) => f + 1); + setMode('polling'); + startPolling(); + return; + } + if (status === 'SUBSCRIBED') { + subscribedCount += 1; + if (subscribedCount === sources.length && !aborted) { + setMode('connected'); + setError(null); + setFailures(0); + stopPolling(); + } + } else if (['TIMED_OUT', 'CHANNEL_ERROR'].includes(status) || (status === 'CLOSED' && modeRef.current === 'connected')) { + // Only log once per channel per status per generation + if (lastStatusRef.current[source.key] !== status) { + console.warn('[useReliableRealtime] Channel status issue', source.key, status); + lastStatusRef.current[source.key] = status; + } + if (modeRef.current !== 'polling') { + setMode('polling'); + startPolling(); + } + } + }); + channelsRef.current[source.key] = channel; + }); + }, [sources, cleanupChannels, onEvent, startPolling, stopPolling]); + + // ---------------- Reconnection ---------------- + const scheduleReconnect = useCallback(() => { + if (!isMountedRef.current) 
return; + if (reconnectTimerRef.current) window.clearTimeout(reconnectTimerRef.current); + const backoffMs = Math.min(30000, 1000 * Math.pow(2, Math.max(0, failures - maxFailuresBeforeBackoff))); + reconnectTimerRef.current = window.setTimeout(() => { + if (navigator.onLine) { + subscribeRealtime(); + } else { + scheduleReconnect(); // keep waiting + } + }, backoffMs) as unknown as number; + }, [failures, maxFailuresBeforeBackoff, subscribeRealtime]); + + const forceReconnect = useCallback(() => { + setFailures(0); + subscribeRealtime(); + }, [subscribeRealtime]); + + // ---------------- Lifecycle ---------------- + const teardown = useCallback(() => { + cleanupChannels(); + stopPolling(); + if (reconnectTimerRef.current) window.clearTimeout(reconnectTimerRef.current); + }, [cleanupChannels, stopPolling]); + + useEffect(() => { + isMountedRef.current = true; + subscribeRealtime(); + if (poller) startPolling(); // safety net until realtime established + return () => { + isMountedRef.current = false; + teardown(); + }; + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [JSON.stringify(sources)]); + + // Reconnect when browser returns online + useEffect(() => { + const handleOnline = () => { if (mode === 'polling') scheduleReconnect(); }; + window.addEventListener('online', handleOnline); + return () => window.removeEventListener('online', handleOnline); + }, [mode, scheduleReconnect]); + + // If stuck in polling, keep scheduling reconnect attempts. 
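Both hook variants compute their reconnect delay with the same capped exponential-backoff expression inside `scheduleReconnect`. As a standalone sketch (the `backoffMs` helper name is hypothetical; the hooks inline this expression rather than exporting it):

```typescript
// Sketch of the backoff computation used by scheduleReconnect
// (assumption: extracted into a pure helper purely for illustration).
function backoffMs(failures: number, maxFailuresBeforeBackoff = 3): number {
  // Delay stays at the 1 s base until failures exceed the threshold,
  // then doubles per extra failure, capped at 30 s.
  return Math.min(
    30000,
    1000 * Math.pow(2, Math.max(0, failures - maxFailuresBeforeBackoff))
  );
}

console.log(backoffMs(1)); // 1000 (below threshold: 1 s base)
console.log(backoffMs(5)); // 4000 (2^(5-3) * 1000)
console.log(backoffMs(20)); // 30000 (capped)
```

With the default `maxFailuresBeforeBackoff = 3`, the first few reconnect attempts retry after one second, so brief websocket drops recover quickly while persistent failures back off toward the 30-second ceiling.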
+ useEffect(() => { + if (mode === 'polling') scheduleReconnect(); + }, [mode, failures, scheduleReconnect]); + + return { mode, error, lastActivityAt, failures, forceReconnect }; +} diff --git a/frontend/src/index.css b/frontend/src/index.css index d1b54ca..c0d8b95 100755 --- a/frontend/src/index.css +++ b/frontend/src/index.css @@ -20,8 +20,8 @@ --popover: 0 0% 92.9412%; --popover-foreground: 0 0% 23.1373%; --primary: 221.8269 83.2% 50.9804%; - --primary-pastello: 216, 62%, 48%; --primary-foreground: 0 0% 100%; + --primary-pastello: 216, 62%, 48%; --secondary: 275.0685 100% 42.9412%; --secondary-foreground: 0 0% 89.8039%; --muted: 0 0% 90.1961%; @@ -30,6 +30,8 @@ --accent-foreground: 0 0% 89.8039%; --destructive: 0 87.4477% 53.1373%; --destructive-foreground: 0 0% 100%; + --success: 142.5 62% 40%; + --success-foreground: 0 0% 100%; --border: 0 0% 90.1961%; /* --input: 0 0% 74.1176%; */ --input: 0 0% 90.1961%; @@ -40,11 +42,13 @@ --chart-4: 218.6014 82.659% 66.0784%; --chart-5: 207.4138 46.4% 49.0196%; --sidebar: 0 0% 96.0784%; + /* Alias for Tailwind config which expects --sidebar-background */ + --sidebar-background: var(--sidebar); --sidebar-foreground: 0 0% 18.8235%; --sidebar-primary: 246.6207 77.5401% 63.3333%; --sidebar-primary-foreground: 0 0% 0%; --sidebar-accent: 225 93.617% 63.1373%; - --sidebar-accent-foreground: 0 0% 18.8235%; + --sidebar-accent-foreground: 0 0% 89.8039%; --sidebar-border: 0 0% 90.1961%; --sidebar-ring: 246.6207 77.5401% 63.3333%; --font-sans: Poppins, ui-sans-serif, sans-serif, system-ui; @@ -80,6 +84,7 @@ --popover-foreground: 0 0% 89.8039%; --primary: 221.8269 83.2% 50.9804%; --primary-foreground: 0 0% 100%; + --primary-pastello: 216, 62%, 48%; --secondary: 275.0685 100% 42.9412%; --secondary-foreground: 0 0% 89.8039%; --muted: 0 0% 21.9608%; @@ -88,6 +93,8 @@ --accent-foreground: 0 0% 89.8039%; --destructive: 0 87.4477% 53.1373%; --destructive-foreground: 0 0% 100%; + --success: 142.5 62% 40%; + --success-foreground: 0 0% 100%; 
--border: 0 0% 14.1176%; /* --input: 0 0% 27.0588%; */ --input: 0 0% 14.1176%; @@ -98,6 +105,8 @@ --chart-4: 218.6014 82.659% 66.0784%; --chart-5: 207.4138 46.4% 49.0196%; --sidebar: 0 0% 9.0196%; + /* Alias for Tailwind config which expects --sidebar-background */ + --sidebar-background: var(--sidebar); --sidebar-foreground: 0 0% 89.8039%; --sidebar-primary: 246.6207 77.5401% 63.3333%; --sidebar-primary-foreground: 0 0% 100%; diff --git a/frontend/src/pages/Dashboard.tsx b/frontend/src/pages/Dashboard.tsx index 13042ec..e5f6c15 100755 --- a/frontend/src/pages/Dashboard.tsx +++ b/frontend/src/pages/Dashboard.tsx @@ -1,4 +1,4 @@ -import { useState, useMemo, useEffect } from "react"; +import { useState, useMemo, useEffect, useRef } from "react"; import { useNavigate } from "react-router-dom"; import { AppSidebar } from "@/components/app-sidebar"; import { ModeToggle } from "@/components/mode-toggle"; @@ -23,16 +23,26 @@ import { import { resourceService } from "@/services/resourceService"; import { useToast } from "@/hooks/use-toast"; -import { createClient } from "@supabase/supabase-js"; +import { useJobNotifications } from "@/hooks/use-job-notifications"; import type { Resource, CreateResourceRequest } from "@/types/resource"; -const supabaseUrl = import.meta.env.VITE_SUPABASE_URL; -const supabaseAnonKey = import.meta.env.VITE_SUPABASE_ANON_KEY; -const supabase = createClient(supabaseUrl, supabaseAnonKey); - export default function Dashboard() { const { toast } = useToast(); const navigate = useNavigate(); + const getToastVariant = (type: string) => { + switch (type) { + case "job_completed": + return "default" as const; + case "job_failed": + return "destructive" as const; + case "step_updated": + return "default" as const; + case "resource_ready": + return "success" as const; + default: + return "default" as const; + } + }; const [resources, setResources] = useState([]); const [isAddingResource, setIsAddingResource] = useState(false); const [filters, setFilters] 
= useState({ @@ -44,66 +54,63 @@ export default function Dashboard() { }); const [user, setUser] = useState(null); - // Load resources and setup realtime subscription - useEffect(() => { - let subscription: ReturnType | null = null; - - const loadResources = () => { - resourceService - .getAllResources() - .then((data) => { - setResources(data); - }) - .catch(console.error); - }; + // Initialize job notifications + useJobNotifications({ + showToasts: true, + userId: user?.id, + }); - loadResources(); + // Load user & resources with polling fallback (replaces inline realtime here) + const previousResourceIdsRef = useRef>(new Set()); + useEffect(() => { + let cancelled = false; + let interval: number | null = null; - // Realtime setup with retry logic - const setupRealtime = async () => { + const fetchUser = async () => { try { const { data: { user: supaUser }, - } = await supabase.auth.getUser(); - if (!supaUser) return; - setUser(supaUser); + } = await import("@/lib/supabase").then((m) => + m.supabase.auth.getUser() + ); + if (!cancelled) setUser(supaUser || null); + } catch (e) { + /* ignore */ + } + }; - subscription = supabase - .channel(`resources-${supaUser.id}-${Date.now()}`) - .on( - "postgres_changes", - { - event: "INSERT", - schema: "public", - table: "resources", - }, - (payload) => { - console.log("[Realtime] INSERT event received:", payload); - // Check if the resource belongs to the current user - if (payload.new && payload.new.user_id === supaUser.id) { - loadResources(); - toast({ - title: "Resource ready!", - description: - "A new resource has been processed and is now available in your dashboard.", - duration: 6000, - variant: "default", - }); - } + const loadResources = async () => { + try { + const data = await resourceService.getAllResources(); + if (cancelled) return; + // detect new resources for toast + const currentIds = new Set(data.map((r) => r.id)); + data.forEach((r) => { + if (r.id && !previousResourceIdsRef.current.has(r.id)) { + if 
(previousResourceIdsRef.current.size > 0) { + toast({ + title: "Resource ready!", + description: + "A new resource has been processed and is now available.", + duration: 6000, + variant: getToastVariant("resource_ready"), + }); } - ) - .subscribe((status, err) => { - console.log("[Realtime] Subscription status:", status, err); - }); - } catch (err) { - console.error("[Realtime] Setup error:", err); + } + }); + previousResourceIdsRef.current = currentIds; + setResources(data); + } catch (e) { + console.error("Error loading resources", e); } }; - setupRealtime(); - + fetchUser(); + loadResources(); + interval = window.setInterval(loadResources, 5000) as unknown as number; return () => { - if (subscription) supabase.removeChannel(subscription); + cancelled = true; + if (interval) window.clearInterval(interval); }; }, [toast]); @@ -165,8 +172,8 @@ export default function Dashboard() { switch (filters.sortBy) { case "date": - aValue = new Date(a.published_date ?? ""); - bValue = new Date(b.published_date ?? ""); + aValue = new Date(a.processed_date ?? ""); + bValue = new Date(b.processed_date ?? ""); break; case "title": aValue = a.title.toLowerCase(); @@ -191,18 +198,29 @@ export default function Dashboard() { const handleAddResource = async (data: CreateResourceRequest) => { if (!user) return; setIsAddingResource(true); - toast({ - title: "Processing resource...", - description: - "Your resource is being processed. It will appear automatically in the dashboard when ready.", - duration: 6000, - variant: "default", - }); + try { + // Job creation now delegated entirely to n8n (avoid duplicate jobs) + toast({ + title: "Processing started!", + description: + "Workflow started. 
You will receive notifications during processing.", + duration: 6000, + variant: "default", + }); + + // Start the resource creation process await resourceService.createResource({ ...data, user_id: user.id, }); + } catch (error) { + console.error("Error starting resource processing:", error); + toast({ + title: "Error", + description: "Failed to start processing. Please try again.", + variant: "destructive", + }); } finally { setIsAddingResource(false); } @@ -287,7 +305,7 @@ export default function Dashboard() {

) : ( -
+
{filteredResources.map((resource) => ( { const { id } = useParams<{ id: string }>(); const navigate = useNavigate(); + const { toast } = useToast(); const [resource, setResource] = useState(null); const [isLoading, setIsLoading] = useState(true); + const [isDeleting, setIsDeleting] = useState(false); + const [showDeleteDialog, setShowDeleteDialog] = useState(false); + const [showSuccessDialog, setShowSuccessDialog] = useState(false); + const [showErrorDialog, setShowErrorDialog] = useState(false); + const [deleteError, setDeleteError] = useState(""); + + // Ensure the page is scrolled to top when the component mounts or when the resource id changes (so it always opens at the top). + useEffect(() => { + try { + window.scrollTo({ top: 0, left: 0, behavior: "auto" }); + } catch (err) { + // ignore in non-browser environments + } + }, [id]); useEffect(() => { // Simulate API call - in the future this will be a real API call @@ -61,7 +98,7 @@ const ResourceDetail = () => { const getContentTypeIcon = (contentType: string) => { return contentType === "youtube" ? ( - + ) : ( ); @@ -106,31 +143,57 @@ const ResourceDetail = () => { } catch { // Fallback to copying URL to clipboard navigator.clipboard.writeText(window.location.href); + toast({ + title: "Link copied", + description: "Resource link copied to clipboard", + }); } } else { // Fallback for browsers that don't support Web Share API navigator.clipboard.writeText(window.location.href); + toast({ + title: "Link copied", + description: "Resource link copied to clipboard", + }); } }; + const handleDeleteConfirm = () => { + setShowDeleteDialog(true); + }; + const handleDeleteResource = async () => { - if ( - resource && - window.confirm( - `Are you sure you want to delete "${resource.title}"? 
This action cannot be undone.` - ) - ) { - try { - await resourceService.deleteResource(resource.id); - alert("Resource deleted successfully."); - navigate("/dashboard"); - } catch (err) { - console.error("Error deleting resource:", err); - alert("Failed to delete resource. Please try again."); - } + if (!resource) return; + + setIsDeleting(true); + setShowDeleteDialog(false); + + try { + await resourceService.deleteResource(resource.id); + setShowSuccessDialog(true); + } catch (err) { + console.error("Error deleting resource:", err); + setDeleteError( + err instanceof Error + ? err.message + : "An unexpected error occurred while deleting the resource" + ); + setShowErrorDialog(true); + } finally { + setIsDeleting(false); } }; + const handleSuccessDialogClose = () => { + setShowSuccessDialog(false); + navigate("/dashboard"); + }; + + const handleErrorDialogClose = () => { + setShowErrorDialog(false); + setDeleteError(""); + }; + if (isLoading) { return ( @@ -258,21 +321,32 @@ const ResourceDetail = () => { {/* Image for smaller screens */}
{resource.thumbnail_link ? ( - { + className="block" + > +
+ { +
+ ) : ( -
+
{resource.title ? `${resource.title} thumbnail` @@ -370,14 +444,72 @@ const ResourceDetail = () => {
- + + + + + + {/* Icon */} +
+
+ +
+
+ + {/* Title */} + + Confirm Deletion + + + {/* Description */} + + Are you sure you want to delete{" "} + + "{resource.title}" + + ? This action cannot be undone. + +
+ + + + Cancel + + + {isDeleting ? ( + <> +
+ Deleting... + + ) : ( + "Delete Resource" + )} + + + +
@@ -436,21 +568,32 @@ const ResourceDetail = () => { {/* Image - Only visible on xl screens and larger */}
{resource.thumbnail_link ? ( - { + className="block w-full" + > +
+ { +
+ ) : ( -
+
{resource.title ? `${resource.title} thumbnail` @@ -466,7 +609,7 @@ const ResourceDetail = () => { })()} {/* Content placeholder - in real implementation this would be the processed content */} -
+

Additional Information

Resource added on: {formatDate(resource.processed_date)} @@ -477,6 +620,82 @@ const ResourceDetail = () => {

+ + {/* Success Dialog */} + + + + {/* Icon */} +
+
+ +
+
+ + {/* Title */} + + Resource Deleted Successfully + + + {/* Description */} + + The resource{" "} + + "{resource?.title}" + {" "} + has been successfully removed from your collection. + +
+ + + + +
+
+ + {/* Error Dialog */} + + + + {/* Icon */} +
+
+ +
+
+ + {/* Title */} + + Delete Failed + + + {/* Description */} + + Failed to delete the resource. Please try again. + {deleteError && ( +
+ Error: {deleteError} +
+ )} +
+
+ + + + + +
+
); diff --git a/frontend/src/pages/SignIn.tsx b/frontend/src/pages/SignIn.tsx index 6c0ef16..35681a3 100755 --- a/frontend/src/pages/SignIn.tsx +++ b/frontend/src/pages/SignIn.tsx @@ -4,6 +4,7 @@ import { zodResolver } from "@hookform/resolvers/zod"; import { Link, useNavigate } from "react-router-dom"; import { Eye, EyeOff } from "lucide-react"; import mindleyIcon from "@/assets/mindley-icon.svg"; +import GoogleIcon from "@/components/icons/google"; import { Button } from "@/components/ui/button"; import { Input } from "@/components/ui/input"; @@ -43,7 +44,7 @@ export default function SignInPage() { try { const { data: authData, error } = await auth.signIn( data.email, - data.password, + data.password ); if (error) { @@ -163,9 +164,11 @@ export default function SignInPage() { className="absolute right-0 top-0 h-full px-3 py-2 hover:bg-transparent" onClick={() => setShowPassword(!showPassword)} > - {showPassword - ? - : } + {showPassword ? ( + + ) : ( + + )} )}
@@ -191,24 +194,7 @@ export default function SignInPage() { onClick={handleGoogleSignIn} disabled={isLoading} > - - - - - - + Google diff --git a/frontend/src/pages/SignUp.tsx b/frontend/src/pages/SignUp.tsx index 2c58203..082984b 100755 --- a/frontend/src/pages/SignUp.tsx +++ b/frontend/src/pages/SignUp.tsx @@ -4,6 +4,7 @@ import { zodResolver } from "@hookform/resolvers/zod"; import { Link, useNavigate } from "react-router-dom"; import { Eye, EyeOff, Mail } from "lucide-react"; import mindleyIcon from "@/assets/mindley-icon.svg"; +import GoogleIcon from "@/components/icons/google"; import { Button } from "@/components/ui/button"; import { Input } from "@/components/ui/input"; @@ -48,7 +49,7 @@ export default function SignUpPage() { try { const { data: authData, error } = await auth.signUp( data.email, - data.password, + data.password ); if (error) { @@ -240,9 +241,11 @@ export default function SignUpPage() { className="absolute right-0 top-0 h-full px-3 py-2 hover:bg-transparent" onClick={() => setShowPassword(!showPassword)} > - {showPassword - ? - : } + {showPassword ? ( + + ) : ( + + )} )}
@@ -278,11 +281,14 @@ export default function SignUpPage() { size="sm" className="absolute right-0 top-0 h-full px-3 py-2 hover:bg-transparent" onClick={() => - setShowConfirmPassword(!showConfirmPassword)} + setShowConfirmPassword(!showConfirmPassword) + } > - {showConfirmPassword - ? - : } + {showConfirmPassword ? ( + + ) : ( + + )} )}
@@ -308,24 +314,7 @@ export default function SignUpPage() { onClick={handleGoogleSignUp} disabled={isLoading} > - - - - - - + Google diff --git a/frontend/src/services/jobService.ts b/frontend/src/services/jobService.ts new file mode 100755 index 0000000..5207a3e --- /dev/null +++ b/frontend/src/services/jobService.ts @@ -0,0 +1,260 @@ +import { supabase } from '@/lib/supabase'; +import type { Job, JobStep, JobWithSteps, CreateJobRequest, JobStatus, JobStepStatus } from '@/types/job'; + +export class JobService { + /** + * Creates a new job with predefined steps using Edge Function + */ + static async createJob(request: CreateJobRequest): Promise { + const { data: { session } } = await supabase.auth.getSession(); + + if (!session) { + throw new Error('User not authenticated'); + } + + // Map the request to match the edge function interface + const jobData = { + workflow_name: request.workflow_name, + resource_id: request.resource_id, + metadata: {}, + steps: request.steps.map((step, index) => ({ + step_name: step.name, + step_type: step.type, + step_order: index + 1, + metadata: step.metadata || {} + })) + }; + + const { data, error } = await supabase.functions.invoke('create-job', { + body: jobData, + headers: { + Authorization: `Bearer ${session.access_token}`, + } + }); + + if (error) { + throw new Error(`Failed to create job: ${error.message}`); + } + + return data; + } + + /** + * Gets all jobs for the current user using Edge Function + */ + static async getUserJobs(status?: JobStatus, limit: number = 10): Promise { + const { data: { session } } = await supabase.auth.getSession(); + + if (!session) { + throw new Error('User not authenticated'); + } + + const params = new URLSearchParams(); + params.append('limit', limit.toString()); + if (status) { + params.append('status', status); + } + + // Append the query string so the status/limit arguments actually reach the edge function (same pattern as getJobWithSteps) + const { data, error } = await supabase.functions.invoke('get-job-status?' + params.toString(), { + method: 'GET', + headers: { + Authorization: `Bearer ${session.access_token}`, + } + }); + +
if (error) { + throw new Error(`Failed to fetch jobs: ${error.message}`); + } + + return data || []; + } + + /** + * Gets a specific job with its steps using Edge Function + */ + static async getJobWithSteps(jobId: string): Promise { + const { data: { session } } = await supabase.auth.getSession(); + + if (!session) { + throw new Error('User not authenticated'); + } + + const params = new URLSearchParams(); + params.append('job_id', jobId); + + const { data, error } = await supabase.functions.invoke('get-job-status?' + params.toString(), { + method: 'GET', + headers: { + Authorization: `Bearer ${session.access_token}`, + } + }); + + if (error) { + throw new Error(`Failed to fetch job: ${error.message}`); + } + if (!data) return null; + // Normalize shape: edge function returns job_steps for historical reasons; map to steps expected by frontend types + const normalized: any = { ...data }; + if (!normalized.steps && Array.isArray(normalized.job_steps)) { + normalized.steps = normalized.job_steps; + } + return normalized as JobWithSteps; + } + + /** + * Updates job step status using Edge Function + */ + static async updateJobStepByName( + jobId: string, + stepName: string, + status: JobStepStatus, + errorMessage?: string, + outputData?: Record, + workflowExecutionId?: string + ): Promise<{ updated_step: JobStep; job: JobWithSteps }> { + const { data: { session } } = await supabase.auth.getSession(); + + if (!session) { + throw new Error('User not authenticated'); + } + + const updateData = { + job_id: jobId, + step_name: stepName, + status, + error_message: errorMessage, + output_data: outputData, + workflow_execution_id: workflowExecutionId + }; + + const { data, error } = await supabase.functions.invoke('update-job-step', { + body: updateData, + headers: { + Authorization: `Bearer ${session.access_token}`, + } + }); + + if (error) { + throw new Error(`Failed to update job step: ${error.message}`); + } + + return data; + } + + /** + * Legacy methods using direct 
Supabase access (for backward compatibility) + */ + + /** + * Updates job status directly + */ + static async updateJobStatus( + jobId: string, + status: JobStatus, + errorMessage?: string, + workflowExecutionId?: string + ): Promise { + const updates: Partial = { status }; + + if (status === 'running') { + updates.started_at = new Date().toISOString(); + } else if (status === 'completed' || status === 'failed') { + updates.completed_at = new Date().toISOString(); + } + + if (errorMessage) { + updates.error_message = errorMessage; + } + + if (workflowExecutionId) { + updates.workflow_execution_id = workflowExecutionId; + } + + const { error } = await supabase + .from('jobs') + .update(updates) + .eq('id', jobId); + + if (error) { + throw new Error(`Failed to update job status: ${error.message}`); + } + } + + /** + * Updates job step status + */ + static async updateJobStepStatus( + stepId: string, + status: JobStepStatus, + errorMessage?: string, + outputData?: Record + ): Promise { + const updates: Partial = { status }; + + if (status === 'running') { + updates.started_at = new Date().toISOString(); + } else if (status === 'completed' || status === 'failed' || status === 'skipped') { + updates.completed_at = new Date().toISOString(); + } + + if (errorMessage) { + updates.error_message = errorMessage; + } + + if (outputData) { + updates.output_data = outputData; + } + + const { error } = await supabase + .from('job_steps') + .update(updates) + .eq('id', stepId); + + if (error) { + throw new Error(`Failed to update job step status: ${error.message}`); + } + } + + /** + * Cancels a job and all its pending steps + */ + static async cancelJob(jobId: string): Promise { + // Update job status + await this.updateJobStatus(jobId, 'cancelled'); + + // Mark all pending/running steps as skipped (job_step_status has no 'cancelled' value) + const { error } = await supabase + .from('job_steps') + .update({ + status: 'skipped', + completed_at: new Date().toISOString() + }) + .eq('job_id', jobId) + .in('status',
['pending', 'running']); + + if (error) { + throw new Error(`Failed to cancel job steps: ${error.message}`); + } + } + + /** + * Deletes old completed jobs (older than specified days) + */ + static async cleanupOldJobs(daysOld: number = 30): Promise { + const cutoffDate = new Date(); + cutoffDate.setDate(cutoffDate.getDate() - daysOld); + + const { data, error } = await supabase + .from('jobs') + .delete() + .in('status', ['completed', 'failed', 'cancelled']) + .lt('completed_at', cutoffDate.toISOString()) + .select('id'); + + if (error) { + throw new Error(`Failed to cleanup old jobs: ${error.message}`); + } + + return data?.length || 0; + } +} diff --git a/frontend/src/types/job.ts b/frontend/src/types/job.ts new file mode 100755 index 0000000..3e70ea7 --- /dev/null +++ b/frontend/src/types/job.ts @@ -0,0 +1,51 @@ +export type JobStatus = 'pending' | 'running' | 'completed' | 'failed' | 'cancelled'; +export type JobStepStatus = 'pending' | 'running' | 'completed' | 'failed' | 'skipped'; + +export interface Job { + id: string; + user_id: string; + workflow_name: string; + workflow_execution_id: string | null; + status: JobStatus; + created_at: string; + started_at: string | null; + completed_at: string | null; + error_message: string | null; + metadata: Record; + resource_id: number | null; +} + +export interface JobStep { + id: string; + job_id: string; + step_name: string; + step_type: string; + status: JobStepStatus; + started_at: string | null; + completed_at: string | null; + error_message: string | null; + output_data: Record | null; + step_order: number; + metadata: Record; +} + +export interface JobWithSteps extends Job { + steps: JobStep[]; +} + +export interface JobNotification { + type: 'job_created' | 'job_updated' | 'step_updated' | 'job_completed' | 'job_failed'; + job: Job; + step?: JobStep; + message: string; +} + +export interface CreateJobRequest { + workflow_name: string; + resource_id?: number; + steps: Array<{ + name: string; + type: string; + 
metadata?: Record; + }>; +} diff --git a/frontend/src/types/workflowError.ts b/frontend/src/types/workflowError.ts new file mode 100755 index 0000000..ab6e23a --- /dev/null +++ b/frontend/src/types/workflowError.ts @@ -0,0 +1,18 @@ +// Types for workflow error handling +export interface WorkflowError { + id: string; + user_id: string; + workflow_execution_id?: string; + workflow_name: string; + error_message: string; + error_node?: string; + error_data?: Record; + created_at: string; + notified_at?: string; +} + +export interface WorkflowErrorNotification { + type: 'workflow_error'; + workflowError: WorkflowError; + message: string; +} diff --git a/frontend/tailwind.config.js b/frontend/tailwind.config.js index bc89768..81a6f8b 100755 --- a/frontend/tailwind.config.js +++ b/frontend/tailwind.config.js @@ -40,6 +40,10 @@ module.exports = { DEFAULT: "hsl(var(--destructive))", foreground: "hsl(var(--destructive-foreground))", }, + success: { + DEFAULT: "hsl(var(--success))", + foreground: "hsl(var(--success-foreground))", + }, border: "hsl(var(--border))", input: "hsl(var(--input))", ring: "hsl(var(--ring))", diff --git a/package-lock.json b/package-lock.json index cee3b0a..40c9210 100755 --- a/package-lock.json +++ b/package-lock.json @@ -18,12 +18,14 @@ "version": "0.0.0", "dependencies": { "@hookform/resolvers": "^5.2.1", + "@radix-ui/react-alert-dialog": "^1.1.15", "@radix-ui/react-avatar": "^1.1.10", "@radix-ui/react-collapsible": "^1.1.11", "@radix-ui/react-dialog": "^1.1.14", "@radix-ui/react-dropdown-menu": "^2.1.15", "@radix-ui/react-label": "^2.1.7", "@radix-ui/react-popover": "^1.1.14", + "@radix-ui/react-progress": "^1.1.7", "@radix-ui/react-select": "^2.2.5", "@radix-ui/react-separator": "^1.1.7", "@radix-ui/react-slot": "^1.2.3", @@ -34,7 +36,7 @@ "clsx": "^2.1.1", "cmdk": "^1.1.1", "input-otp": "^1.4.2", - "lucide-react": "^0.539.0", + "lucide-react": "^0.540.0", "next-themes": "^0.4.6", "react": "^19.1.1", "react-dom": "^19.1.1", @@ -1150,6 +1152,40 @@ 
"integrity": "sha512-XnbHrrprsNqZKQhStrSwgRUQzoCI1glLzdw79xiZPoofhGICeZRSQ3dIxAKH1gb3OHfNf4d6f+vAv3kil2eggA==", "license": "MIT" }, + "node_modules/@radix-ui/react-alert-dialog": { + "version": "1.1.15", + "resolved": "https://registry.npmjs.org/@radix-ui/react-alert-dialog/-/react-alert-dialog-1.1.15.tgz", + "integrity": "sha512-oTVLkEw5GpdRe29BqJ0LSDFWI3qu0vR1M0mUkOQWDIUnY/QIkLpgDMWuKxP94c2NAC2LGcgVhG1ImF3jkZ5wXw==", + "license": "MIT", + "dependencies": { + "@radix-ui/primitive": "1.1.3", + "@radix-ui/react-compose-refs": "1.1.2", + "@radix-ui/react-context": "1.1.2", + "@radix-ui/react-dialog": "1.1.15", + "@radix-ui/react-primitive": "2.1.3", + "@radix-ui/react-slot": "1.2.3" + }, + "peerDependencies": { + "@types/react": "*", + "@types/react-dom": "*", + "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", + "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" + }, + "peerDependenciesMeta": { + "@types/react": { + "optional": true + }, + "@types/react-dom": { + "optional": true + } + } + }, + "node_modules/@radix-ui/react-alert-dialog/node_modules/@radix-ui/primitive": { + "version": "1.1.3", + "resolved": "https://registry.npmjs.org/@radix-ui/primitive/-/primitive-1.1.3.tgz", + "integrity": "sha512-JTF99U/6XIjCBo0wqkU5sK10glYe27MRRsfwoiq5zzOEZLHU3A3KCMa5X/azekYRCJ0HlwI0crAXS/5dEHTzDg==", + "license": "MIT" + }, "node_modules/@radix-ui/react-arrow": { "version": "1.1.7", "resolved": "https://registry.npmjs.org/@radix-ui/react-arrow/-/react-arrow-1.1.7.tgz", @@ -1287,20 +1323,20 @@ } }, "node_modules/@radix-ui/react-dialog": { - "version": "1.1.14", - "resolved": "https://registry.npmjs.org/@radix-ui/react-dialog/-/react-dialog-1.1.14.tgz", - "integrity": "sha512-+CpweKjqpzTmwRwcYECQcNYbI8V9VSQt0SNFKeEBLgfucbsLssU6Ppq7wUdNXEGb573bMjFhVjKVll8rmV6zMw==", + "version": "1.1.15", + "resolved": "https://registry.npmjs.org/@radix-ui/react-dialog/-/react-dialog-1.1.15.tgz", + "integrity": 
"sha512-TCglVRtzlffRNxRMEyR36DGBLJpeusFcgMVD9PZEzAKnUs1lKCgX5u9BmC2Yg+LL9MgZDugFFs1Vl+Jp4t/PGw==", "license": "MIT", "dependencies": { - "@radix-ui/primitive": "1.1.2", + "@radix-ui/primitive": "1.1.3", "@radix-ui/react-compose-refs": "1.1.2", "@radix-ui/react-context": "1.1.2", - "@radix-ui/react-dismissable-layer": "1.1.10", - "@radix-ui/react-focus-guards": "1.1.2", + "@radix-ui/react-dismissable-layer": "1.1.11", + "@radix-ui/react-focus-guards": "1.1.3", "@radix-ui/react-focus-scope": "1.1.7", "@radix-ui/react-id": "1.1.1", "@radix-ui/react-portal": "1.1.9", - "@radix-ui/react-presence": "1.1.4", + "@radix-ui/react-presence": "1.1.5", "@radix-ui/react-primitive": "2.1.3", "@radix-ui/react-slot": "1.2.3", "@radix-ui/react-use-controllable-state": "1.2.2", @@ -1322,6 +1358,78 @@ } } }, + "node_modules/@radix-ui/react-dialog/node_modules/@radix-ui/primitive": { + "version": "1.1.3", + "resolved": "https://registry.npmjs.org/@radix-ui/primitive/-/primitive-1.1.3.tgz", + "integrity": "sha512-JTF99U/6XIjCBo0wqkU5sK10glYe27MRRsfwoiq5zzOEZLHU3A3KCMa5X/azekYRCJ0HlwI0crAXS/5dEHTzDg==", + "license": "MIT" + }, + "node_modules/@radix-ui/react-dialog/node_modules/@radix-ui/react-dismissable-layer": { + "version": "1.1.11", + "resolved": "https://registry.npmjs.org/@radix-ui/react-dismissable-layer/-/react-dismissable-layer-1.1.11.tgz", + "integrity": "sha512-Nqcp+t5cTB8BinFkZgXiMJniQH0PsUt2k51FUhbdfeKvc4ACcG2uQniY/8+h1Yv6Kza4Q7lD7PQV0z0oicE0Mg==", + "license": "MIT", + "dependencies": { + "@radix-ui/primitive": "1.1.3", + "@radix-ui/react-compose-refs": "1.1.2", + "@radix-ui/react-primitive": "2.1.3", + "@radix-ui/react-use-callback-ref": "1.1.1", + "@radix-ui/react-use-escape-keydown": "1.1.1" + }, + "peerDependencies": { + "@types/react": "*", + "@types/react-dom": "*", + "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", + "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" + }, + "peerDependenciesMeta": { + "@types/react": { + "optional": true + }, + 
"@types/react-dom": { + "optional": true + } + } + }, + "node_modules/@radix-ui/react-dialog/node_modules/@radix-ui/react-focus-guards": { + "version": "1.1.3", + "resolved": "https://registry.npmjs.org/@radix-ui/react-focus-guards/-/react-focus-guards-1.1.3.tgz", + "integrity": "sha512-0rFg/Rj2Q62NCm62jZw0QX7a3sz6QCQU0LpZdNrJX8byRGaGVTqbrW9jAoIAHyMQqsNpeZ81YgSizOt5WXq0Pw==", + "license": "MIT", + "peerDependencies": { + "@types/react": "*", + "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" + }, + "peerDependenciesMeta": { + "@types/react": { + "optional": true + } + } + }, + "node_modules/@radix-ui/react-dialog/node_modules/@radix-ui/react-presence": { + "version": "1.1.5", + "resolved": "https://registry.npmjs.org/@radix-ui/react-presence/-/react-presence-1.1.5.tgz", + "integrity": "sha512-/jfEwNDdQVBCNvjkGit4h6pMOzq8bHkopq458dPt2lMjx+eBQUohZNG9A7DtO/O5ukSbxuaNGXMjHicgwy6rQQ==", + "license": "MIT", + "dependencies": { + "@radix-ui/react-compose-refs": "1.1.2", + "@radix-ui/react-use-layout-effect": "1.1.1" + }, + "peerDependencies": { + "@types/react": "*", + "@types/react-dom": "*", + "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", + "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" + }, + "peerDependenciesMeta": { + "@types/react": { + "optional": true + }, + "@types/react-dom": { + "optional": true + } + } + }, "node_modules/@radix-ui/react-direction": { "version": "1.1.1", "resolved": "https://registry.npmjs.org/@radix-ui/react-direction/-/react-direction-1.1.1.tgz", @@ -1654,6 +1762,30 @@ } } }, + "node_modules/@radix-ui/react-progress": { + "version": "1.1.7", + "resolved": "https://registry.npmjs.org/@radix-ui/react-progress/-/react-progress-1.1.7.tgz", + "integrity": "sha512-vPdg/tF6YC/ynuBIJlk1mm7Le0VgW6ub6J2UWnTQ7/D23KXcPI1qy+0vBkgKgd38RCMJavBXpB83HPNFMTb0Fg==", + "license": "MIT", + "dependencies": { + "@radix-ui/react-context": "1.1.2", + "@radix-ui/react-primitive": "2.1.3" + }, + "peerDependencies": { + 
"@types/react": "*", + "@types/react-dom": "*", + "react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc", + "react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc" + }, + "peerDependenciesMeta": { + "@types/react": { + "optional": true + }, + "@types/react-dom": { + "optional": true + } + } + }, "node_modules/@radix-ui/react-roving-focus": { "version": "1.1.10", "resolved": "https://registry.npmjs.org/@radix-ui/react-roving-focus/-/react-roving-focus-1.1.10.tgz", @@ -4144,9 +4276,9 @@ } }, "node_modules/lucide-react": { - "version": "0.539.0", - "resolved": "https://registry.npmjs.org/lucide-react/-/lucide-react-0.539.0.tgz", - "integrity": "sha512-VVISr+VF2krO91FeuCrm1rSOLACQUYVy7NQkzrOty52Y8TlTPcXcMdQFj9bYzBgXbWCiywlwSZ3Z8u6a+6bMlg==", + "version": "0.540.0", + "resolved": "https://registry.npmjs.org/lucide-react/-/lucide-react-0.540.0.tgz", + "integrity": "sha512-armkCAqQvO62EIX4Hq7hqX/q11WSZu0Jd23cnnqx0/49yIxGXyL/zyZfBxNN9YDx0ensPTb4L+DjTh3yQXUxtQ==", "license": "ISC", "peerDependencies": { "react": "^16.5.1 || ^17.0.0 || ^18.0.0 || ^19.0.0" diff --git a/supabase/config.toml b/supabase/config.toml index d71e3b5..6cab027 100755 --- a/supabase/config.toml +++ b/supabase/config.toml @@ -392,3 +392,47 @@ entrypoint = "./functions/delete-resource/index.ts" # Specifies static files to be bundled with the function. Supports glob patterns. # For example, if you want to serve static HTML pages in your function: # static_files = [ "./functions/delete-resource/*.html" ] + +[functions.create-job] +enabled = true +verify_jwt = true +import_map = "./functions/create-job/deno.json" +# Uncomment to specify a custom file path to the entrypoint. +# Supported file extensions are: .ts, .js, .mjs, .jsx, .tsx +entrypoint = "./functions/create-job/index.ts" +# Specifies static files to be bundled with the function. Supports glob patterns. 
+# For example, if you want to serve static HTML pages in your function: +# static_files = [ "./functions/create-job/*.html" ] + +[functions.update-job-step] +enabled = true +verify_jwt = true +import_map = "./functions/update-job-step/deno.json" +# Uncomment to specify a custom file path to the entrypoint. +# Supported file extensions are: .ts, .js, .mjs, .jsx, .tsx +entrypoint = "./functions/update-job-step/index.ts" +# Specifies static files to be bundled with the function. Supports glob patterns. +# For example, if you want to serve static HTML pages in your function: +# static_files = [ "./functions/update-job-step/*.html" ] + +[functions.get-job-status] +enabled = true +verify_jwt = true +import_map = "./functions/get-job-status/deno.json" +# Uncomment to specify a custom file path to the entrypoint. +# Supported file extensions are: .ts, .js, .mjs, .jsx, .tsx +entrypoint = "./functions/get-job-status/index.ts" +# Specifies static files to be bundled with the function. Supports glob patterns. +# For example, if you want to serve static HTML pages in your function: +# static_files = [ "./functions/get-job-status/*.html" ] + +[functions.workflow-error-notifier] +enabled = true +verify_jwt = true +import_map = "./functions/workflow-error-notifier/deno.json" +# Uncomment to specify a custom file path to the entrypoint. +# Supported file extensions are: .ts, .js, .mjs, .jsx, .tsx +entrypoint = "./functions/workflow-error-notifier/index.ts" +# Specifies static files to be bundled with the function. Supports glob patterns. 
+# For example, if you want to serve static HTML pages in your function: +# static_files = [ "./functions/workflow-error-notifier/*.html" ] \ No newline at end of file diff --git a/supabase/functions/create-job/.npmrc b/supabase/functions/create-job/.npmrc new file mode 100755 index 0000000..48c6388 --- /dev/null +++ b/supabase/functions/create-job/.npmrc @@ -0,0 +1,3 @@ +# Configuration for private npm package dependencies +# For more information on using private registries with Edge Functions, see: +# https://supabase.com/docs/guides/functions/import-maps#importing-from-private-registries diff --git a/supabase/functions/create-job/deno.json b/supabase/functions/create-job/deno.json new file mode 100755 index 0000000..f6ca845 --- /dev/null +++ b/supabase/functions/create-job/deno.json @@ -0,0 +1,3 @@ +{ + "imports": {} +} diff --git a/supabase/functions/create-job/index.ts b/supabase/functions/create-job/index.ts new file mode 100755 index 0000000..f92bebf --- /dev/null +++ b/supabase/functions/create-job/index.ts @@ -0,0 +1,267 @@ +// Setup type definitions for built-in Supabase Runtime APIs +import "jsr:@supabase/functions-js/edge-runtime.d.ts"; +import { createClient } from "jsr:@supabase/supabase-js@2"; + +interface CreateJobRequest { + workflow_name: string; + resource_id?: number; + workflow_execution_id?: string; + metadata?: Record<string, unknown>; + user_id?: string; // UUID expected when using service role (n8n) + user_email?: string; // Alternative lookup if user_id not provided + steps: Array<{ + step_name: string; + step_type: string; + step_order: number; + metadata?: Record<string, unknown>; + }>; +} + +// Align with DB enums: job_status (pending, running, completed, failed, cancelled) and job_step_status (pending, running, completed, failed, skipped) +type JobStatus = "pending" | "running" | "completed" | "failed" | "cancelled"; +type StepStatus = "pending" | "running" | "completed" | "failed" | "skipped"; + +function isServiceRoleToken(token: string): boolean { + try { + const
payloadPart = token.split(".")[1]; + if (!payloadPart) return false; + const json = atob(payloadPart); + const payload = JSON.parse(json); + return payload.role === "service_role"; + } catch { + return false; + } +} + +function isUuid(value: string | undefined): boolean { + if (!value) return false; + return /^[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i + .test(value); +} + +console.log("Create Job Function started"); + +Deno.serve(async (req) => { + try { + // CORS headers + const corsHeaders = { + "Access-Control-Allow-Origin": "*", + "Access-Control-Allow-Headers": + "authorization, x-client-info, apikey, content-type", + }; + + if (req.method === "OPTIONS") { + return new Response("ok", { headers: corsHeaders }); + } + + // Get authorization header + const authHeader = req.headers.get("authorization"); + if (!authHeader) { + return new Response( + JSON.stringify({ error: "Authorization header required" }), + { + status: 401, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Check if it's service role key (from n8n) or user JWT (from frontend) + const token = authHeader.replace("Bearer ", ""); + + // Decode the JWT to check the role + const isServiceRole = isServiceRoleToken(token); + + // Parse the request body ONCE here + const requestBody: CreateJobRequest = await req.json(); + + let supabaseClient; + let userId: string | undefined; + + if (isServiceRole) { + // n8n call with service role key + supabaseClient = createClient( + Deno.env.get("SUPABASE_URL")!, + Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!, + { + auth: { + autoRefreshToken: false, + persistSession: false, + }, + }, + ); + + // For n8n calls, get user_id from request body + userId = requestBody.user_id; + + // If user_id not provided or invalid UUID, but user_email is provided, resolve it + if ((!userId || !isUuid(userId)) && requestBody.user_email) { + const { data: userList, error: userLookupError } = await supabaseClient + 
.auth.admin.listUsers(); + if (userLookupError || !userList?.users) { + return new Response( + JSON.stringify({ error: "Unable to resolve user by email" }), + { + status: 400, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + const found = userList.users.find((u) => + u.email === requestBody.user_email + ); + if (!found) { + return new Response( + JSON.stringify({ error: "Unable to resolve user by email" }), + { + status: 400, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + userId = found.id; + } + + if (!userId || !isUuid(userId)) { + return new Response( + JSON.stringify({ + error: "Valid user_id (UUID) or user_email required for n8n calls", + }), + { + status: 400, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + } else { + // Frontend call with user JWT + supabaseClient = createClient( + Deno.env.get("SUPABASE_URL")!, + Deno.env.get("SUPABASE_ANON_KEY")!, + { + global: { + headers: { Authorization: authHeader }, + }, + auth: { + autoRefreshToken: false, + persistSession: false, + }, + }, + ); + + // Get user from JWT + const { data: { user }, error: userError } = await supabaseClient.auth + .getUser(); + + if (userError || !user) { + return new Response( + JSON.stringify({ error: "Invalid authorization token" }), + { + status: 401, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + userId = user.id; + } + + // Use the already parsed request body + const { + workflow_name, + resource_id, + workflow_execution_id, + metadata, + steps, + } = requestBody; + + if (!workflow_name || !steps || steps.length === 0) { + return new Response( + JSON.stringify({ error: "workflow_name and steps are required" }), + { + status: 400, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Create the job + const { data: job, error: jobError } = await supabaseClient + .from("jobs") + .insert({ + user_id: userId, 
+ workflow_name, + resource_id, + workflow_execution_id, + status: "pending" as JobStatus, + metadata, + created_at: new Date().toISOString(), + }) + .select() + .single(); + + if (jobError) { + console.error("Error creating job:", jobError); + return new Response( + JSON.stringify({ + error: "Failed to create job", + details: jobError.message, + }), + { + status: 500, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Create job steps + const jobSteps = steps.map((step) => ({ + job_id: job.id, + step_name: step.step_name, + step_type: step.step_type, + step_order: step.step_order, + status: "pending" as StepStatus, + metadata: step.metadata, + })); + + const { data: createdSteps, error: stepsError } = await supabaseClient + .from("job_steps") + .insert(jobSteps) + .select(); + + if (stepsError) { + console.error("Error creating job steps:", stepsError); + // Try to cleanup the job if steps creation failed + await supabaseClient.from("jobs").delete().eq("id", job.id); + + return new Response( + JSON.stringify({ error: "Failed to create job steps" }), + { + status: 500, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + return new Response( + JSON.stringify({ + ...job, + steps: createdSteps, + }), + { + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } catch (error) { + console.error("Error in create-job function:", error); + return new Response( + JSON.stringify({ error: "Internal server error" }), + { + status: 500, + headers: { "Content-Type": "application/json" }, + }, + ); + } +}); diff --git a/supabase/functions/create-resource/index.ts b/supabase/functions/create-resource/index.ts index 53a8307..5b18752 100755 --- a/supabase/functions/create-resource/index.ts +++ b/supabase/functions/create-resource/index.ts @@ -104,15 +104,3 @@ Deno.serve(async (req) => { }, ); }); - -/* To invoke locally: - - 1. 
Run `supabase start` (see: https://supabase.com/docs/reference/cli/supabase-start) - 2. Make an HTTP request: - - curl -i --location --request POST 'http://127.0.0.1:54321/functions/v1/create-resource' \ - --header 'Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0' \ - --header 'Content-Type: application/json' \ - --data '{"name":"Functions"}' - -*/ diff --git a/supabase/functions/delete-resource/index.ts b/supabase/functions/delete-resource/index.ts index f4a2447..0c8113b 100755 --- a/supabase/functions/delete-resource/index.ts +++ b/supabase/functions/delete-resource/index.ts @@ -77,15 +77,3 @@ Deno.serve(async (req) => { } return new Response(null, { status: 204, headers: corsHeaders }); }); - -/* To invoke locally: - - 1. Run `supabase start` (see: https://supabase.com/docs/reference/cli/supabase-start) - 2. Make an HTTP request: - - curl -i --location --request POST 'http://127.0.0.1:54321/functions/v1/delete-resource' \ - --header 'Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0' \ - --header 'Content-Type: application/json' \ - --data '{"name":"Functions"}' - -*/ diff --git a/supabase/functions/get-job-status/.npmrc b/supabase/functions/get-job-status/.npmrc new file mode 100755 index 0000000..48c6388 --- /dev/null +++ b/supabase/functions/get-job-status/.npmrc @@ -0,0 +1,3 @@ +# Configuration for private npm package dependencies +# For more information on using private registries with Edge Functions, see: +# https://supabase.com/docs/guides/functions/import-maps#importing-from-private-registries diff --git a/supabase/functions/get-job-status/deno.json b/supabase/functions/get-job-status/deno.json new file mode 100755 index 0000000..f6ca845 --- /dev/null +++ b/supabase/functions/get-job-status/deno.json @@ 
-0,0 +1,3 @@ +{ + "imports": {} +} diff --git a/supabase/functions/get-job-status/index.ts b/supabase/functions/get-job-status/index.ts new file mode 100755 index 0000000..98d2c9f --- /dev/null +++ b/supabase/functions/get-job-status/index.ts @@ -0,0 +1,191 @@ +// Setup type definitions for built-in Supabase Runtime APIs +import "jsr:@supabase/functions-js/edge-runtime.d.ts"; +import { createClient } from "jsr:@supabase/supabase-js@2"; + +console.log("Get Job Status Function started"); + +function decodeJwt(token: string): any | null { + try { + const payload = token.split(".")[1]; + if (!payload) return null; + return JSON.parse(atob(payload)); + } catch { + return null; + } +} + +Deno.serve(async (req) => { + try { + // CORS headers + const corsHeaders = { + "Access-Control-Allow-Origin": "*", + "Access-Control-Allow-Headers": + "authorization, x-client-info, apikey, content-type", + }; + + if (req.method === "OPTIONS") { + return new Response("ok", { headers: corsHeaders }); + } + + // Get authorization header + const authHeader = req.headers.get("authorization"); + if (!authHeader) { + return new Response( + JSON.stringify({ error: "Authorization header required" }), + { + status: 401, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + const token = authHeader.replace("Bearer ", ""); + const decoded = decodeJwt(token); + const isServiceRole = decoded?.role === "service_role"; + + const supabaseUrl = Deno.env.get("SUPABASE_URL")!; + const supabaseKey = isServiceRole + ? Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")! + : Deno.env.get("SUPABASE_ANON_KEY")!; + + const supabase = createClient(supabaseUrl, supabaseKey, { + global: !isServiceRole + ? 
{ headers: { Authorization: authHeader } } + : undefined, + auth: { autoRefreshToken: false, persistSession: false }, + }); + + const url = new URL(req.url); + let userId: string | undefined; + const jobIdParam = url.searchParams.get("job_id"); + + if (isServiceRole) { + // Priority: explicit user_id query param + userId = url.searchParams.get("user_id") || undefined; + // If not provided but job_id present, fetch job to derive user_id + if (!userId && jobIdParam) { + const { data: owningJob } = await supabase + .from("jobs") + .select("id, user_id") + .eq("id", jobIdParam) + .single(); + if (owningJob) userId = owningJob.user_id; + } + if (!userId) { + return new Response( + JSON.stringify({ + error: "user_id (or resolvable job_id) required with service role", + }), + { + status: 400, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + } else { + // User JWT path + const { data: { user }, error: userError } = await supabase.auth + .getUser(); + if (userError || !user) { + return new Response( + JSON.stringify({ error: "Invalid authorization token" }), + { + status: 401, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + userId = user.id; + } + + const jobId = jobIdParam; + const limit = parseInt(url.searchParams.get("limit") || "10"); + const status = url.searchParams.get("status"); + + if (jobId) { + // Get specific job with steps + const { data: job, error: jobError } = await supabase + .from("jobs") + .select(` + *, + job_steps (*) + `) + .eq("id", jobId) + .eq("user_id", userId) + .single(); + + if (jobError || !job) { + return new Response( + JSON.stringify({ error: "Job not found or access denied" }), + { + status: 404, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Sort steps by order + if (job.job_steps) { + job.job_steps.sort((a: any, b: any) => a.step_order - b.step_order); + } + + return new Response( + JSON.stringify(job), + { + headers: { 
...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } else { + // Get all jobs for user + let query = supabase + .from("jobs") + .select(` + *, + job_steps (*) + `) + .eq("user_id", userId) + .order("created_at", { ascending: false }) + .limit(limit); + + if (status) { + query = query.eq("status", status); + } + + const { data: jobs, error: jobsError } = await query; + + if (jobsError) { + console.error("Error fetching jobs:", jobsError); + return new Response( + JSON.stringify({ error: "Failed to fetch jobs" }), + { + status: 500, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Sort steps by order for each job + jobs?.forEach((job: any) => { + if (job.job_steps) { + job.job_steps.sort((a: any, b: any) => a.step_order - b.step_order); + } + }); + + return new Response( + JSON.stringify(jobs || []), + { + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + } catch (error) { + console.error("Error in get-job-status function:", error); + return new Response( + JSON.stringify({ error: "Internal server error" }), + { + status: 500, + headers: { "Content-Type": "application/json" }, + }, + ); + } +}); diff --git a/supabase/functions/read-resource/index.ts b/supabase/functions/read-resource/index.ts index e7d4d63..df6f3a5 100755 --- a/supabase/functions/read-resource/index.ts +++ b/supabase/functions/read-resource/index.ts @@ -95,15 +95,3 @@ Deno.serve(async (req) => { ); } }); - -/* To invoke locally: - - 1. Run `supabase start` (see: https://supabase.com/docs/reference/cli/supabase-start) - 2. 
Make an HTTP request: - - curl -i --location --request POST 'http://127.0.0.1:54321/functions/v1/read-resource' \ - --header 'Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0' \ - --header 'Content-Type: application/json' \ - --data '{"name":"Functions"}' - -*/ diff --git a/supabase/functions/update-job-step/.npmrc b/supabase/functions/update-job-step/.npmrc new file mode 100755 index 0000000..48c6388 --- /dev/null +++ b/supabase/functions/update-job-step/.npmrc @@ -0,0 +1,3 @@ +# Configuration for private npm package dependencies +# For more information on using private registries with Edge Functions, see: +# https://supabase.com/docs/guides/functions/import-maps#importing-from-private-registries diff --git a/supabase/functions/update-job-step/deno.json b/supabase/functions/update-job-step/deno.json new file mode 100755 index 0000000..f6ca845 --- /dev/null +++ b/supabase/functions/update-job-step/deno.json @@ -0,0 +1,3 @@ +{ + "imports": {} +} diff --git a/supabase/functions/update-job-step/index.ts b/supabase/functions/update-job-step/index.ts new file mode 100755 index 0000000..10ae451 --- /dev/null +++ b/supabase/functions/update-job-step/index.ts @@ -0,0 +1,333 @@ +// Setup type definitions for built-in Supabase Runtime APIs +import "jsr:@supabase/functions-js/edge-runtime.d.ts"; +import { createClient } from "jsr:@supabase/supabase-js@2"; + +interface UpdateJobStepRequest { + job_id: string; + step_name: string; + status: "pending" | "running" | "completed" | "failed" | "skipped"; + error_message?: string; + output_data?: Record<string, unknown>; + workflow_execution_id?: string; + resource_id?: number; + metadata?: Record<string, unknown>; +} + +console.log("Update Job Step Function started"); + +function isServiceRoleToken(token: string): boolean { + try { + const payloadPart = token.split(".")[1]; + if (!payloadPart) return false; + const json = atob(payloadPart);
+ const payload = JSON.parse(json); + return payload.role === "service_role"; + } catch { + return false; + } +} + +Deno.serve(async (req) => { + console.log("All headers:", [...req.headers.entries()]); + + try { + // CORS headers + const corsHeaders = { + "Access-Control-Allow-Origin": "*", + "Access-Control-Allow-Headers": + "authorization, x-client-info, apikey, content-type", + }; + + if (req.method === "OPTIONS") { + return new Response("ok", { headers: corsHeaders }); + } + + // Get authorization header + const authHeader = req.headers.get("authorization"); + if (!authHeader) { + return new Response( + JSON.stringify({ error: "Authorization header required" }), + { + status: 401, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Check if it's a service role key (from n8n) or user JWT (from frontend) + const token = authHeader.replace("Bearer ", ""); + const isServiceRoleKey = isServiceRoleToken(token); + + let supabase; + let user; + + if (isServiceRoleKey) { + // n8n workflow call with service role key + supabase = createClient( + Deno.env.get("SUPABASE_URL")!, + Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!, + { + auth: { + autoRefreshToken: false, + persistSession: false, + }, + }, + ); + + // For n8n calls, we need to get user_id from the job record + const bodyText = await req.text(); + const requestBody = JSON.parse(bodyText); + const { job_id } = requestBody; + + if (!job_id) { + return new Response( + JSON.stringify({ error: "job_id is required" }), + { + status: 400, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Get job to find user_id + const { data: job, error: jobError } = await supabase + .from("jobs") + .select("id, user_id") + .eq("id", job_id) + .single(); + + if (jobError || !job) { + return new Response( + JSON.stringify({ error: "Job not found" }), + { + status: 404, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Create a 
mock user object for compatibility + user = { id: job.user_id }; + + // Re-create request object since we consumed the body + req = new Request(req.url, { + method: req.method, + headers: req.headers, + body: bodyText, + }); + } else { + // Frontend call - verify JWT + supabase = createClient( + Deno.env.get("SUPABASE_URL")!, + Deno.env.get("SUPABASE_ANON_KEY")!, + { + global: { + headers: { Authorization: authHeader }, + }, + auth: { + autoRefreshToken: false, + persistSession: false, + }, + }, + ); + + // Get user from JWT + const { data: { user: authUser }, error: userError } = await supabase.auth + .getUser(); + + if (userError || !authUser) { + return new Response( + JSON.stringify({ error: "Invalid authorization token" }), + { + status: 401, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + user = authUser; + } + + const { + job_id, + step_name, + status, + error_message, + output_data, + workflow_execution_id, + resource_id, + metadata, + }: UpdateJobStepRequest = await req.json(); + + if (!job_id || !step_name || !status) { + return new Response( + JSON.stringify({ error: "job_id, step_name, and status are required" }), + { + status: 400, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Verify job belongs to user (if service role we already fetched job earlier; still enforce user match) + const { data: job, error: jobError } = await supabase + .from("jobs") + .select("id, user_id") + .eq("id", job_id) + .single(); + + if (jobError || !job || (!isServiceRoleKey && job.user_id !== user.id)) { + return new Response( + JSON.stringify({ error: "Job not found or access denied" }), + { + status: 404, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Update workflow execution ID if provided + if (workflow_execution_id || resource_id !== undefined) { + const jobUpdate: Record<string, unknown> = {}; + if (workflow_execution_id) { + jobUpdate.workflow_execution_id =
workflow_execution_id; + } + if (resource_id !== undefined) jobUpdate.resource_id = resource_id; + if (Object.keys(jobUpdate).length) { + await supabase + .from("jobs") + .update(jobUpdate) + .eq("id", job_id); + } + } + + // Prepare update data + const updateData: any = { + status, + error_message, + }; + + if (status === "running" && !error_message) { + // Step explicitly moved to running + updateData.started_at = new Date().toISOString(); + } else if ( + status === "completed" || status === "failed" || status === "skipped" + ) { + // Terminal state: ensure completed_at set + const nowIso = new Date().toISOString(); + updateData.completed_at = nowIso; + // If step jumped directly from pending -> terminal without a running phase, also set started_at + updateData.started_at = updateData.started_at || nowIso; + } + + if (output_data) { + updateData.output_data = output_data; + } + if (metadata) { + updateData.metadata = metadata; + } + + // Update job step + const { data: updatedStep, error: updateError } = await supabase + .from("job_steps") + .update(updateData) + .eq("job_id", job_id) + .eq("step_name", step_name) + .select() + .single(); + + if (updateError) { + console.error("Error updating job step:", updateError); + return new Response( + JSON.stringify({ error: "Failed to update job step" }), + { + status: 500, + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } + + // Get updated job with all steps for response + const { data: jobWithSteps } = await supabase + .from("jobs") + .select(` + *, + job_steps (*) + `) + .eq("id", job_id) + .single(); + + // Auto-finalization & progression logic for the parent job + if (jobWithSteps && jobWithSteps.job_steps) { + const steps: any[] = jobWithSteps.job_steps; + const total = steps.length; + const anyFailed = steps.some((s) => s.status === "failed"); + const allTerminal = steps.every((s) => + ["completed", "failed", "skipped"].includes(s.status) + ); + const anyProgress = steps.some((s) => 
+ ["running", "completed", "failed", "skipped"].includes(s.status) + ); + + let newStatus: string | undefined; + if ( + jobWithSteps.status !== "failed" && jobWithSteps.status !== "completed" + ) { + if (anyFailed) { + newStatus = "failed"; + } else if (allTerminal) { + newStatus = "completed"; + } else if (jobWithSteps.status === "pending" && anyProgress) { + // Move from pending -> running when first step reports progress (even if directly completed) + newStatus = "running"; + } + } + + if (newStatus) { + const jobUpdate: Record<string, unknown> = { status: newStatus }; + const nowIso = new Date().toISOString(); + // Ensure started_at is populated the first time we leave pending (even if we jump straight to completed/failed) + if (!jobWithSteps.started_at) { + jobUpdate.started_at = nowIso; + } + if (newStatus === "failed" || newStatus === "completed") { + jobUpdate.completed_at = nowIso; + } + + const { error: jobStatusError } = await supabase + .from("jobs") + .update(jobUpdate) + .eq("id", job_id); + + if (!jobStatusError) { + // Reflect status change locally without refetching + jobWithSteps.status = newStatus; + if (jobUpdate.completed_at) { + jobWithSteps.completed_at = jobUpdate.completed_at; + } + } else { + console.error("Failed to auto-update job status:", jobStatusError); + } + } + } + + return new Response( + JSON.stringify({ + updated_step: updatedStep, + job: jobWithSteps, + }), + { + headers: { ...corsHeaders, "Content-Type": "application/json" }, + }, + ); + } catch (error) { + console.error("Error in update-job-step function:", error); + return new Response( + JSON.stringify({ error: "Internal server error" }), + { + status: 500, + headers: { "Content-Type": "application/json" }, + }, + ); + } +}); diff --git a/supabase/functions/update-resource/index.ts b/supabase/functions/update-resource/index.ts index c073a78..2a6f104 100755 --- a/supabase/functions/update-resource/index.ts +++ b/supabase/functions/update-resource/index.ts @@ -93,15 +93,3 @@ Deno.serve(async
(req) => { { status: 200, headers: corsHeaders }, ); }); - -/* To invoke locally: - - 1. Run `supabase start` (see: https://supabase.com/docs/reference/cli/supabase-start) - 2. Make an HTTP request: - - curl -i --location --request POST 'http://127.0.0.1:54321/functions/v1/update-resource' \ - --header 'Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0' \ - --header 'Content-Type: application/json' \ - --data '{"name":"Functions"}' - -*/ diff --git a/supabase/functions/workflow-error-notifier/.npmrc b/supabase/functions/workflow-error-notifier/.npmrc new file mode 100755 index 0000000..48c6388 --- /dev/null +++ b/supabase/functions/workflow-error-notifier/.npmrc @@ -0,0 +1,3 @@ +# Configuration for private npm package dependencies +# For more information on using private registries with Edge Functions, see: +# https://supabase.com/docs/guides/functions/import-maps#importing-from-private-registries diff --git a/supabase/functions/workflow-error-notifier/deno.json b/supabase/functions/workflow-error-notifier/deno.json new file mode 100755 index 0000000..f6ca845 --- /dev/null +++ b/supabase/functions/workflow-error-notifier/deno.json @@ -0,0 +1,3 @@ +{ + "imports": {} +} diff --git a/supabase/functions/workflow-error-notifier/index.ts b/supabase/functions/workflow-error-notifier/index.ts new file mode 100755 index 0000000..775c461 --- /dev/null +++ b/supabase/functions/workflow-error-notifier/index.ts @@ -0,0 +1,127 @@ +import "jsr:@supabase/functions-js/edge-runtime.d.ts"; +import { createClient } from "jsr:@supabase/supabase-js@2"; + +const corsHeaders = { + "Access-Control-Allow-Origin": "*", + "Access-Control-Allow-Headers": + "authorization, x-client-info, apikey, content-type", + "Access-Control-Allow-Methods": "POST, OPTIONS", +}; + +interface WorkflowErrorPayload { + user_id?: string; + workflow_execution_id?: string; + workflow_name: 
string; + error_message: string; + error_node?: string; + error_data?: Record<string, unknown>; +} + +Deno.serve(async (req) => { + // Handle CORS preflight + if (req.method === "OPTIONS") { + return new Response("ok", { headers: corsHeaders }); + } + + if (req.method !== "POST") { + return new Response( + JSON.stringify({ error: "Method not allowed" }), + { status: 405, headers: corsHeaders }, + ); + } + + try { + // Initialize Supabase client with service role + const supabaseClient = createClient( + Deno.env.get("SUPABASE_URL") ?? "", + Deno.env.get("SUPABASE_SERVICE_ROLE_KEY") ?? "", + { + auth: { + autoRefreshToken: false, + persistSession: false, + }, + }, + ); + + // Parse request body + const payload: WorkflowErrorPayload = await req.json(); + + // Validate required fields + if (!payload.workflow_name || !payload.error_message) { + return new Response( + JSON.stringify({ + error: "Missing required fields: workflow_name, error_message", + }), + { status: 400, headers: corsHeaders }, + ); + } + + // If user_id not provided, try to extract from workflow_execution_id by looking up the job that has this workflow_execution_id + let userId = payload.user_id; + + if (!userId && payload.workflow_execution_id) { + const { data: job } = await supabaseClient + .from("jobs") + .select("user_id") + .eq("workflow_execution_id", payload.workflow_execution_id) + .single(); + + if (job) { + userId = job.user_id; + } + } + + if (!userId) { + return new Response( + JSON.stringify({ + error: "Could not determine user_id for workflow error notification", + }), + { status: 400, headers: corsHeaders }, + ); + } + + // Insert workflow error record + const { data: workflowError, error: insertError } = await supabaseClient + .from("workflow_errors") + .insert({ + user_id: userId, + workflow_execution_id: payload.workflow_execution_id, + workflow_name: payload.workflow_name, + error_message: payload.error_message, + error_node: payload.error_node, + error_data: payload.error_data || {}, + created_at:
new Date().toISOString(), + notified_at: new Date().toISOString(), + }) + .select() + .single(); + + if (insertError) { + console.error("Error inserting workflow error:", insertError); + return new Response( + JSON.stringify({ + error: "Failed to log workflow error", + details: insertError.message, + }), + { status: 500, headers: corsHeaders }, + ); + } + + return new Response( + JSON.stringify({ + success: true, + workflow_error: workflowError, + }), + { status: 200, headers: corsHeaders }, + ); + } catch (error) { + console.error("Unexpected error in workflow-error-notifier:", error); + return new Response( + JSON.stringify({ + error: "Internal server error", + message: error.message, + }), + { status: 500, headers: corsHeaders }, + ); + } +});