Fall 2025 Semester Changes (From Greenfield) #60
base: main
Conversation
…but includes variations that will be needed
back-into-app-wd
This was for testing; no longer needed.
…ed search tag layout.
Updated Search Page UI
…in/signup functions
Signin/Signout page using Clerk + Authentication
…sliders in main section, added further description to fragmented). Refactored search file (components, constants, and utils)
Updated theme variables and search page. Wrapped root layout in AppTh…
Feature/client revised search fs
fully wired landing page
Search Page Color Adjustments
Rendering Bug Fixes
Feature/statue id delete warning wd
Added new secrets/variables
…in-db-fs Added create functionality and resolved various consistency errors
Feature/hard delete yy
Added details about ongoing bugs and errors in the README.
Various Tests Added
Added a section on Technical Architecture with an image.
Doc/readme yy
Implemented set null
…tables-and-field-type-validation fixed update on images table
Walkthrough
This PR introduces a comprehensive Khmer Statuary Project application with artifact upload, metadata processing, admin review workflows, statue search functionality, CRUD database management, and authentication. It includes Next.js pages, API routes, UI components, Supabase schema migrations, and integration with Clerk auth and OpenRouter LLM services.
Changes
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
Areas requiring extra attention:
Possibly related PRs
Pre-merge checks and finishing touches
❌ Failed checks (1 warning, 1 inconclusive)
✨ Finishing touches
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Pull request overview
This pull request contains comprehensive changes for the Fall 2025 semester, including new authentication middleware, database migrations, image upload functionality, LLM metadata extraction, and various utility modules for a Cambodian artifact cataloging system.
Key Changes:
- Added Clerk authentication middleware with admin role protection
- Implemented Supabase integration for database and storage operations
- Created LLM-based metadata extraction system using OpenRouter API
- Added rate limiting for uploads and image processing workflows
- Included multiple database migrations for artifact metadata management
Reviewed changes
Copilot reviewed 109 out of 122 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| middleware.ts | Implements Clerk authentication with public/admin route protection |
| tsconfig.json | Updates TypeScript configuration for Next.js with ts-node support |
| package.json | Adds dependencies for Clerk, Supabase, MUI, and other libraries |
| next.config.ts | Configures Next.js image optimization for Supabase storage |
| supabase/migrations/* | Multiple SQL migrations for artifact metadata tables and functions |
| lib/upload/uploadImage.ts | Handles image uploads to Supabase storage with validation |
| lib/llm/*.ts | LLM integration for metadata extraction from descriptions |
| lib/pending-images/pendingImages.ts | Fetches and processes pending image approvals |
| lib/rate-limit/uploadRateLimit.ts | Implements in-memory rate limiting for uploads |
| lib/db/supabase.ts | Initializes Supabase client with service role key |
| lib/registry.ts | Defines table access control and CRUD permissions |
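For orientation, the following is a minimal sketch of the public/admin route-protection pattern summarized for middleware.ts above. It assumes Clerk's clerkMiddleware/createRouteMatcher API and mirrors the route patterns quoted later in this review; treat it as illustrative rather than the project's exact middleware.

```ts
// Minimal sketch (assumed API: @clerk/nextjs/server clerkMiddleware, v6-style).
import { clerkMiddleware, createRouteMatcher } from '@clerk/nextjs/server';

// Same admin patterns that appear later in this review.
const isAdminRoute = createRouteMatcher(['/admin(.*)', '/api/admin(.*)']);

export default clerkMiddleware(async (auth, req) => {
  // Only admin routes require a signed-in user; everything else stays public.
  if (isAdminRoute(req)) {
    await auth.protect();
  }
});

export const config = {
  // Standard Clerk matcher: skip static assets and Next internals.
  matcher: ['/((?!_next|.*\\..*).*)', '/(api|trpc)(.*)'],
};
```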
}

const newImageId = crypto.randomUUID();
const internalReferenceNumber = `IRN-${Date.now()}-${crypto.randomBytes(4).toString('hex')}`;
Copilot
AI
Dec 16, 2025
The internal reference number format combines timestamp and random bytes, but could collide if multiple uploads happen at the exact same millisecond. Consider using only the UUID (generated on line 27) as the internal reference number, or include more randomness in the format.
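If the reviewer's suggestion were adopted, a minimal sketch (names mirroring the snippet above, not verified against the actual handler) might look like this:

```ts
import crypto from 'node:crypto';

// Derive the reference number from the UUID that is already generated,
// so uniqueness no longer depends on the millisecond timestamp.
const newImageId = crypto.randomUUID();
const internalReferenceNumber = `IRN-${newImageId}`;
```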
long_description_input TEXT,
misc_information_input TEXT,
ai_generated_input BOOLEAN,
)
Copilot
AI
Dec 16, 2025
Syntax error: trailing comma after the last function parameter. Remove the comma after ai_generated_input BOOLEAN on line 12.
long_description_input,
misc_information_input,
COALESCE(ai_generated_input, TRUE),
COALESCE(created_at_input, NOW())
Copilot
AI
Dec 16, 2025
Variable created_at_input is referenced but never defined in the function parameters. This will cause a runtime error. Either add this parameter to the function signature or replace with a different value.
};
})
Copilot
AI
Dec 16, 2025
The filter removes null values from the array, but the return type should account for this. Additionally, filtering out null entries could lead to a mismatch between database records and returned results. Consider handling invalid paths differently, such as logging the error and returning a placeholder or throwing an error early.
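One hedged way to address both points (the declared return type and the silently dropped entries) is to narrow with a type predicate and log what gets removed. The names and shape below are illustrative only, not the project's actual code.

```ts
// Illustrative sketch: filter out nulls with a type guard so the declared
// return type matches what callers actually receive, and log dropped entries.
type EnhancedImage = { image_id: string; url: string }; // assumed shape for the example

function keepValidImages(images: (EnhancedImage | null)[]): EnhancedImage[] {
  return images.filter((img): img is EnhancedImage => {
    if (img === null) {
      console.warn('Dropping image with invalid storage path');
      return false;
    }
    return true;
  });
}
```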
| variant="contained" | ||
| startIcon={<CloudUpload />} | ||
| onClick={handleUpload} | ||
| disabled={!selectedFile || (metadataMode === 'ai' && !aiMetadataAvailable)} |
Copilot
AI
Dec 16, 2025
This negation always evaluates to false.
Actionable comments posted: 7
Note
Due to the large number of review comments, Critical severity comments were prioritized as inline comments.
🟠 Major comments (28)
supabase/migrations/20251026005531_handle_new_image_upload.sql-1-4 (1)
1-4: Add input validation for required parameters. The function accepts TEXT parameters without validation. Empty or NULL values could create orphaned records or violate business logic constraints.
Add validation at the beginning of the function body:
 BEGIN
+  -- Validate inputs
+  IF image_url_input IS NULL OR image_url_input = '' THEN
+    RAISE EXCEPTION 'image_url_input cannot be NULL or empty';
+  END IF;
+
+  IF image_gcs_input IS NULL OR image_gcs_input = '' THEN
+    RAISE EXCEPTION 'image_gcs_input cannot be NULL or empty';
+  END IF;
+
   SELECT gen_random_uuid() INTO new_internal_ref_id;
Committable suggestion skipped: line range outside the PR's diff.
supabase/migrations/20251026005531_handle_new_image_upload.sql-42-42 (1)
42-42: Remove hardcoded admin_id. The hardcoded UUID 'cacf0e78-f815-4a47-abed-179aa74c40eb' appears to be a placeholder or test value that shouldn't be deployed to production. This prevents proper tracking of which admin is responsible for reviewing each image.
Add admin_id as a function parameter:
 CREATE OR REPLACE FUNCTION handle_new_image_upload(
   image_url_input TEXT,
-  image_gcs_input TEXT
+  image_gcs_input TEXT,
+  admin_id_input TEXT
 ) RETURNS TEXT AS $$
Then use the parameter:
 VALUES (
   new_internal_ref_id,
   'pending_review',
-  'cacf0e78-f815-4a47-abed-179aa74c40eb',
+  admin_id_input,
   next_approval_id
 );
Alternatively, if the approval should initially have no assigned admin (pending assignment), use NULL:
-  'cacf0e78-f815-4a47-abed-179aa74c40eb',
+  NULL,
Committable suggestion skipped: line range outside the PR's diff.
package.json-25-26 (1)
25-26: Remove redundant Supabase packages and avoid GitHub source installs. You have two Supabase client packages:
- Line 25: @supabase-js/source from GitHub
- Line 26: @supabase/supabase-js from npm
This is redundant and confusing. Installing directly from GitHub bypasses version locking, makes builds less reproducible, and can cause dependency resolution issues. Use only the official npm package @supabase/supabase-js.
Apply this diff to remove the GitHub source install:
-    "@supabase-js/source": "github:supabase/supabase-js",
     "@supabase/supabase-js": "^2.76.1",
package.json-31-31 (1)
31-31: Remove the deprecated mui package. The mui package at version 0.0.1 is a deprecated placeholder and should not be used. The actual Material-UI library is @mui/material (already included on line 22). This legacy package serves no purpose and can cause confusion.
Apply this diff to remove it:
     "lucide-react": "^0.544.0",
-    "mui": "^0.0.1",
     "next": "^15.3.2",
app/search/components/CustomTag.tsx-39-49 (1)
39-49: Fix accessibility issue: white text on light background for unfilled chips. Lines 43-44 create a contrast issue. When isFilled is false (non-binary types with falsy values), backgroundColor falls back to Material-UI's light gray default (via color='default' on line 36), but text remains white (line 44). This results in poor contrast that fails WCAG accessibility standards.
Apply this diff to ensure proper contrast:
 sx={{
   cursor: 'pointer',
   maxWidth: '100%',
   borderRadius: 2,
   backgroundColor: isFilled ? themeTokens.secondaryBlue : undefined,
-  color: '#fff',
+  color: isFilled ? '#fff' : undefined,
   borderColor: isFilled ? undefined : themeTokens.secondaryBlue,
   '&:hover': {
     backgroundColor: isFilled ? alpha(themeTokens.secondaryBlue, 0.8) : alpha(themeTokens.secondaryBlue, 0.1),
   },
 }}
This allows Material-UI to apply the appropriate default text color (typically dark) for unfilled chips, ensuring sufficient contrast with the light background.
app/signin/[[...signin]]/page.tsx-204-210 (1)
204-210: Sign-up link also needs redirect URL validation. The same open redirect concern applies to the sign-up link construction.
Ensure the redirectUrl used here is already validated (per the earlier fix), or apply the same validation before use in the href.
app/search/utils.ts-27-27 (1)
27-27: Import statement should be at the top of the file. The import from @/app/search/constants appears after function definitions on line 27. Imports should be placed at the top of the file for readability and to follow JavaScript/TypeScript conventions.
Move the import to the top of the file:
+import { BASIC_FIELDS } from '@/app/search/constants';
+
 export const formatFileSize = (bytes: number): string => {
   // ...
 };

 // ... other functions ...

-import { BASIC_FIELDS } from '@/app/search/constants';
-
 type BasicState = Record<string, unknown>;
Committable suggestion skipped: line range outside the PR's diff.
app/signin/[[...signin]]/page.tsx-19-21 (1)
19-21: Open redirect vulnerability: redirect_url is not validated. The redirect_url query parameter is used directly for navigation after sign-in. A malicious actor could craft a URL like /signin?redirect_url=https://evil.com to redirect users to a phishing site after authentication.
Validate that the redirect URL is a relative path or belongs to an allowed domain:
   const searchParams = useSearchParams();
-  const redirectUrl = searchParams.get('redirect_url') || '/';
+  const rawRedirect = searchParams.get('redirect_url') || '/';
+  // Only allow relative URLs (starting with /) and reject absolute URLs
+  const redirectUrl = rawRedirect.startsWith('/') && !rawRedirect.startsWith('//')
+    ? rawRedirect
+    : '/';
lib/rate-limit/uploadRateLimit.ts-80-86 (1)
80-86: Remove unnecessary sort - records are already chronological. Since Date.now() is always called at the time of recording, new uploads are inherently in chronological order. The sort after every push adds unnecessary O(n log n) overhead.
 export function recordUpload(): void {
   const now = Date.now();
   uploadRecords.push({ timestamp: now });
-
-  // Keep records sorted by timestamp (oldest first) for efficient cleanup
-  uploadRecords.sort((a, b) => a.timestamp - b.timestamp);
 }
lib/rate-limit/uploadRateLimit.ts-20-22 (1)
20-22: Module-level throw may crash the application unexpectedly. Throwing at module load time means any import of this file will crash the entire application if the environment variables are missing. This is especially problematic in testing or during builds where these env vars may not be set.
Consider lazy initialization or graceful degradation:
-if (!process.env.UPLOAD_RATE_LIMIT_MAX || !process.env.UPLOAD_RATE_LIMIT_WINDOW_MS) {
-  throw new Error('UPLOAD_RATE_LIMIT_MAX and UPLOAD_RATE_LIMIT_WINDOW_MS must be set in environment variables');
-}
-
-const DEFAULT_CONFIG: RateLimitConfig = {
-  maxUploads: parseInt(process.env.UPLOAD_RATE_LIMIT_MAX, 10),
-  // Convert minutes to milliseconds: minutes × 60 seconds × 1000 milliseconds
-  // Supports decimal values (e.g., 0.5 = 30 seconds, 1.5 = 90 seconds)
-  windowMs: parseFloat(process.env.UPLOAD_RATE_LIMIT_WINDOW_MS) * 60 * 1000,
-};
+function getDefaultConfig(): RateLimitConfig {
+  if (!process.env.UPLOAD_RATE_LIMIT_MAX || !process.env.UPLOAD_RATE_LIMIT_WINDOW_MS) {
+    throw new Error('UPLOAD_RATE_LIMIT_MAX and UPLOAD_RATE_LIMIT_WINDOW_MS must be set in environment variables');
+  }
+  return {
+    maxUploads: parseInt(process.env.UPLOAD_RATE_LIMIT_MAX, 10),
+    windowMs: parseFloat(process.env.UPLOAD_RATE_LIMIT_WINDOW_MS) * 60 * 1000,
+  };
+}
+
+let _defaultConfig: RateLimitConfig | null = null;
+const DEFAULT_CONFIG = (): RateLimitConfig => {
+  if (!_defaultConfig) _defaultConfig = getDefaultConfig();
+  return _defaultConfig;
+};
Then update function signatures to call DEFAULT_CONFIG() instead of using it directly.
Committable suggestion skipped: line range outside the PR's diff.
app/api/admin/reset-rate-limit/route.ts-11-27 (1)
11-27: Security: Any authenticated user can reset the global rate limit. The TODO comment acknowledges that admin role checking is missing. Allowing any authenticated user to reset the rate limit defeats its purpose and could enable abuse. This should be addressed before deploying to production.
Consider implementing the admin check now, or at minimum restricting access via middleware:
 export async function POST() {
   const { userId } = await auth();

   if (!userId) {
     return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
   }

-  // TODO: Add admin role check here
-  // For now, any authenticated user can reset (for testing)
-  // In production, add: if (metadata?.role !== 'admin') return 403
+  // Check admin role - implement based on your auth setup
+  const { sessionClaims } = await auth();
+  if (sessionClaims?.metadata?.role !== 'admin') {
+    return NextResponse.json({ error: 'Forbidden - Admin access required' }, { status: 403 });
+  }

   resetRateLimit();
lib/pending-images/pendingImages.ts-79-169 (1)
79-169: Filter out null values before returning the images array. The map callback returns null for invalid storage paths (line 85), but these nulls are not filtered from enhancedImages before returning. This will cause the response to contain null entries, which may break consumers expecting an array of image objects.
Apply this diff to filter out null values:
- return { images: enhancedImages };
+ return { images: enhancedImages.filter((img): img is NonNullable<typeof img> => img !== null) };
app/upload/page.tsx-11-21 (1)
11-21: Add user feedback during redirect delay and extract magic number. The forced redirect after upload has several UX concerns:
- No visual feedback: Users see a 2-second delay with no indication they're being redirected
- No opt-out: Users cannot cancel the redirect if they want to upload more artifacts
- Magic number: The 2000 millisecond delay is hardcoded
Consider this improved implementation:
+const REDIRECT_DELAY_MS = 2000;
+
 export default function UploadPage() {
   const router = useRouter();
   const [redirectPending, setRedirectPending] = useState(false);
+  const [secondsRemaining, setSecondsRemaining] = useState(2);

   useEffect(() => {
     if (!redirectPending) return;

+    // Countdown timer
+    const countdownInterval = setInterval(() => {
+      setSecondsRemaining((prev) => prev - 1);
+    }, 1000);
+
     const timer = setTimeout(() => {
       router.push('/search');
-    }, 2000);
+    }, REDIRECT_DELAY_MS);

-    return () => clearTimeout(timer);
+    return () => {
+      clearTimeout(timer);
+      clearInterval(countdownInterval);
+    };
   }, [redirectPending, router]);
Then add visual feedback in the ArtifactsUpload component or as an overlay showing "Redirecting to search in {secondsRemaining} seconds..." with a "Stay on this page" button.
Committable suggestion skipped: line range outside the PR's diff.
README.md-120-120 (1)
120-120: Update eslint-config-next to match Next.js 15.3.2. The CI/CD build failure is likely caused by a version mismatch: next is pinned to ^15.3.2 while eslint-config-next is pinned to 15.1.6. Update eslint-config-next in package.json to ^15.3.2 to align with the Next.js version. Version mismatches between Next.js and its ESLint configuration are a common cause of build failures and would explain the mysterious build-time error blocking the pipeline.
lib/upload/uploadImage.ts-13-63 (1)
13-63: Make error handling consistent throughout the function. The function throws errors for validation failures (lines 14, 19, 24) but returns { error } objects for upload failures (lines 59-63). This inconsistent pattern forces callers to handle both thrown exceptions and error return values. Choose one pattern consistently.
Recommendation: Either throw errors consistently, or return result objects consistently. Since the function is async and used in API routes, returning a discriminated union type would be clearer:
type UploadResult =
  | { success: true; id: string; internalReferenceNumber: string; publicUrl: string; gcsPath: string }
  | { success: false; error: string };

export async function handleUploadImage({ file }: { file: File }): Promise<UploadResult> {
  if (!file) return { success: false, error: 'No file uploaded' };

  // ... validation checks return error objects instead of throwing
  if (!ALLOWED_MIME_TYPES.has(file.type)) {
    const ext = file.name.split('.').pop()?.toLowerCase();
    if (!ext || !ALLOWED_EXTENSIONS.has(ext)) {
      return { success: false, error: 'Unsupported file type. Only JPG, JPEG, PNG, or WEBP images are allowed.' };
    }
  }

  if (file.size > maxSize) {
    return { success: false, error: 'File too large. Maximum size is 10MB.' };
  }

  // ... rest of function

  return {
    success: true,
    id: newImageId,
    internalReferenceNumber,
    publicUrl,
    gcsPath,
  };
}
app/api/admin/deny/route.ts-6-12 (1)
6-12: Missing authentication check for admin route. This admin endpoint allows any user to deny/reject images without authentication. Add Clerk auth verification consistent with other admin routes like pending-images.
 import { NextRequest, NextResponse } from 'next/server';
 import { supabase } from '@/lib/db/supabase';
+import { auth } from '@clerk/nextjs/server';

 const TEMP_TABLE = 'temp_artifact_metadata';
+const ADMIN_ID = process.env.ADMIN_ID;

 export async function POST(request: NextRequest) {
   try {
+    const { userId } = await auth();
+    if (userId !== ADMIN_ID) {
+      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
+    }
+
     const { imageId } = await request.json();
app/api/admin/image-url/[imageId]/route.ts-12-50 (1)
12-50: Missing authentication check for admin route. This route is under /api/admin/ but lacks authentication verification. Other admin routes in this PR (e.g., app/api/admin/pending-images/route.ts) check auth() from Clerk and validate the user ID against ADMIN_ID. Without this check, any user can generate signed URLs for any image.
Add authentication similar to other admin routes:
 import { NextResponse } from 'next/server';
 import { supabase } from '@/lib/db/supabase';
+import { auth } from '@clerk/nextjs/server';
+
+const ADMIN_ID = process.env.ADMIN_ID;

 const storageBucket = 'spark';

 // ... normalizeStoragePath ...

 export async function GET(request: Request, { params }: { params: Promise<{ imageId: string }> }) {
   try {
+    const { userId } = await auth();
+    if (userId !== ADMIN_ID) {
+      return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
+    }
+
     const { imageId } = await params;
     // ... rest of handler
lib/db/insertArtifactMetadata.ts-26-35 (1)
26-35: Excessive logging of potentially sensitive metadata. These console.log statements output full metadata including descriptions and miscellaneous information to server logs. This could expose PII or sensitive artifact details in production logs.
Consider reducing logging verbosity or using structured logging with appropriate log levels:
- console.log('Inserting metadata for image:', imageId);
- console.log('Internal Reference Number:', internalReferenceNumber);
- console.log('GCS Path:', gcsPath);
- console.log('Basic metadata:', JSON.stringify(basicSearchMetadata, null, 2));
- console.log('Advanced metadata:', JSON.stringify(advancedSearchMetadata, null, 2));
- console.log('Short description:', shortDescription);
- console.log('Long description:', longDescription);
- console.log('Misc information:', miscInformation);
- console.log('AI generated:', aiGenerated);
- console.log('Created at timestamp:', createdAt);
+ console.log('Inserting metadata for image:', imageId);
Or use debug-level logging that can be disabled in production.
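As one possible shape for that debug-level logging, the sketch below gates the verbose output behind an environment flag; the DEBUG_METADATA_LOGS name is an assumption for illustration, not an existing project setting.

```ts
// Hypothetical helper: verbose metadata logging only when explicitly enabled.
const debugLog = (...args: unknown[]): void => {
  if (process.env.DEBUG_METADATA_LOGS === 'true') {
    console.log(...args);
  }
};

// Usage inside insertArtifactMetadata (illustrative):
// debugLog('Basic metadata:', JSON.stringify(basicSearchMetadata, null, 2));
```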
app/api/process-metadata/route.ts-105-117 (1)
105-117: Missing null check for manualMetadata in manual processing path. If body.manualMetadata is undefined, accessing manualMetadata.basicSearchMetadata on line 110 will throw a runtime error. Consider validating that manualMetadata exists when processWithAI is false.
 } else {
   // Manual Processing Path
   const { manualMetadata, longDescription } = body;

+  if (!manualMetadata?.basicSearchMetadata || !manualMetadata?.advancedSearchMetadata) {
+    return NextResponse.json(
+      { error: 'manualMetadata with basicSearchMetadata and advancedSearchMetadata is required for manual processing' },
+      { status: 400 }
+    );
+  }
+
   metadata = {
     basicSearchMetadata: manualMetadata.basicSearchMetadata,
     advancedSearchMetadata: manualMetadata.advancedSearchMetadata,
app/signup/[[...signup]]/page.tsx-93-97 (1)
93-97: OAuth redirect also uses unvalidated URL. The same redirectUrl is passed to redirectUrlComplete in the OAuth flow. While Clerk may have its own validation, it's safer to validate on the client side as well for consistency.
app/signup/[[...signup]]/page.tsx-24-24 (1)
24-24: Potential open redirect vulnerability. The redirectUrl is taken directly from query parameters and used in router.push() without validation. An attacker could craft a URL like /signup?redirect_url=https://malicious.com to redirect users after signup. Consider validating that the redirect URL is relative or matches an allowlist.
+  const isValidRedirect = (url: string) => {
+    // Only allow relative URLs or same-origin
+    return url.startsWith('/') && !url.startsWith('//');
+  };
+
   const searchParams = useSearchParams();
-  const redirectUrl = searchParams.get('redirect_url') || '/';
+  const rawRedirect = searchParams.get('redirect_url') || '/';
+  const redirectUrl = isValidRedirect(rawRedirect) ? rawRedirect : '/';
Also applies to: 70-70
app/admin/admin-review/page.tsx-119-154 (1)
119-154: Non-atomic two-step operation can leave orphan statues.
handleAddToNew creates a statue first, then approves the image. If the approve step fails, an empty statue remains in the database with no cleanup. Consider adding error handling to delete the orphan statue on approval failure, or use a transactional endpoint.
 if (!approveResponse.ok) {
+  // Attempt to clean up the orphan statue
+  try {
+    await fetch(`/api/admin/statues/${createData.statue.id}`, { method: 'DELETE' });
+  } catch {
+    console.error('Failed to clean up orphan statue:', createData.statue.id);
+  }
   throw new Error(approveData.error || 'Failed to approve image');
 }
app/admin/admin-review/page.tsx-75-93 (1)
75-93: Optimistic state update without actual persistence.
handleSaveMetadata updates local state with new metadata (line 88), but the API call is commented out. This gives false feedback to the user—data appears saved but isn't persisted. Consider either implementing the API call or removing the local state update until the backend is ready.
 const handleSaveMetadata = useCallback(async (imageId: string, metadata: PendingImage['metadata']) => {
   // TODO: API call to update metadata
   console.log('Saving metadata for', imageId, metadata);

-  try {
-    // TODO: Add API call here
-    // await fetch('/api/admin/update-metadata', {
-    //   method: 'POST',
-    //   headers: { 'Content-Type': 'application/json' },
-    //   body: JSON.stringify({ imageId: internalReferenceNumber, metadata }),
-    // });
-
-    // Update local state with new metadata
-    setImages((prev) => prev.map((img) => (img?.image_id === imageId ? { ...img, metadata } : img)));
-  } catch (error) {
-    console.error('Failed to save metadata:', error);
-    setError('Failed to save metadata');
-  }
+  // NOTE: Metadata saving is not yet implemented
+  setError('Metadata saving is not yet available');
 }, []);
lib/db/statueSearchQueryBuilder.ts-28-115 (1)
28-115: Duplicate type definitions and SEARCH_SELECT constant. These types (SupabaseLocation, SupabaseCurrentLocation, SupabaseImage, SupabaseAuctionEvent, SupabaseStatueRow) and SEARCH_SELECT are already defined in lib/db/statueSearch.types.ts (lines 33-64 per relevant snippets). Import them to avoid drift and duplication.
 import { supabase } from './supabase';
-import type { StatueSearchFilters, StatueSearchRow } from './statueSearch.types';
+import type {
+  StatueSearchFilters,
+  StatueSearchRow,
+  SupabaseLocation,
+  SupabaseCurrentLocation,
+  SupabaseImage,
+  SupabaseAuctionEvent,
+  SupabaseStatueRow,
+  SEARCH_SELECT,
+} from './statueSearch.types';
Note: Ensure these types are exported from statueSearch.types.ts if not already.
app/api/admin/db-view/[table]/route.ts-23-29 (1)
23-29: Add explicit authentication check to match other admin endpoints. This endpoint lacks the explicit authentication verification found in other admin routes (app/api/admin/reset-rate-limit/route.ts, app/api/admin/pending-images/route.ts). While Clerk middleware provides app-level protection, add an explicit check for consistency and defense-in-depth:
import { auth } from '@clerk/nextjs/server';

export async function GET(request: NextRequest, { params }: { params: Promise<{ table: string }> }) {
  const { userId } = await auth();
  if (!userId) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }
  // ... rest of handler
app/api/admin/db-view/[table]/route.ts-102-103 (1)
102-103: Consolidate primary key resolution with TABLE_REGISTRY. Same as above—use TABLE_REGISTRY[table].primaryKey instead of hardcoding. This also doesn't handle composite keys, which could cause silent failures for junction tables.
- // Determine primary key based on table FIRST
- const primaryKey = table === 'statues' ? 'statue_id' : table === 'images' ? 'internal_reference_number' : 'id';
+ const config = TABLE_REGISTRY[table];
+ if (!config) {
+   return NextResponse.json({ error: `Table "${table}" is not registered` }, { status: 400 });
+ }
+ if (Array.isArray(config.primaryKey)) {
+   return NextResponse.json({ error: 'Composite key updates not supported via this endpoint' }, { status: 400 });
+ }
+ const primaryKey = config.primaryKey;
app/api/admin/db-view/[table]/route.ts-36-36 (1)
36-36: Use TABLE_REGISTRY for primary key instead of hardcoding. This logic duplicates line 103 and could drift from TABLE_REGISTRY.primaryKey. Since you already fetch config on line 39, use config.primaryKey for ordering.
- // Determine primary key for ordering
- const orderBy = table === 'statues' ? 'statue_id' : table === 'images' ? 'internal_reference_number' : 'id';
-
  // Get table configuration from registry
  const config = TABLE_REGISTRY[table];
  if (!config) {
    return NextResponse.json({ error: `Table "${table}" is not registered` }, { status: 400 });
  }

+ // Determine primary key for ordering from registry
+ const orderBy = Array.isArray(config.primaryKey) ? config.primaryKey[0] : config.primaryKey;
+
  // Build query
Committable suggestion skipped: line range outside the PR's diff.
lib/crud-handlers/delete.ts-393-437 (1)
393-437: No transaction wrapping - partial failures may leave inconsistent state. This function performs multiple sequential deletes across many tables but doesn't use a transaction. If any delete fails mid-way, the database could be left in an inconsistent state with some related records deleted but others remaining.
Consider wrapping the operations in a Supabase transaction or using a stored procedure. If Supabase transactions aren't available directly, you might need to use
supabase.rpc() with a database function, or at minimum document this limitation and ensure callers understand the risk.
// Example using RPC with a database function:
// await supabase.rpc('hard_delete_statue_with_relations', { statue_id: statueId });
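A sketch of what the RPC-based route could look like on the application side, assuming a database function named as in the comment above (the function name and parameter key are taken from the review text and are not verified against the schema):

```ts
import { supabase } from '@/lib/db/supabase';

// Delegates the multi-table delete to a single Postgres function so it runs
// atomically inside the database rather than as many sequential client calls.
export async function hardDeleteStatue(statueId: number): Promise<void> {
  const { error } = await supabase.rpc('hard_delete_statue_with_relations', {
    statue_id: statueId,
  });
  if (error) {
    throw new Error(`Hard delete failed for statue ${statueId}: ${error.message}`);
  }
}
```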
🟡 Minor comments (21)
.env.example-3-3 (1)
3-3: Fix incorrect delimiter for environment variable.Line 3 uses a colon (
:) instead of an equals sign (=). This will prevent the environment variable from being parsed correctly.Apply this diff:
-SUPABASE_SERVICE_ROLE_KEY: service_role_key_here
+SUPABASE_SERVICE_ROLE_KEY=service_role_key_here
.env.example-12-13 (1)
12-13: Fix invalid comment syntax and add trailing newline.
.envfiles use#for comments, not//. The inline comment will be parsed as part of the value, causingNEXT_PUBLIC_BYPASS_CLERKto be set totrue // put true or false...instead of justtrue. Also, add a trailing newline at end of file.Apply this diff:
-NEXT_PUBLIC_BYPASS_CLERK=true // put true or false for testing endpoints without auth protection (for curl, postmanm etc,)
-NODE_ENV=development
+# Set to true or false for testing endpoints without auth protection (for curl, postman, etc.)
+NEXT_PUBLIC_BYPASS_CLERK=true
+NODE_ENV=development
+
supabase/migrations/20251107000000_add_description_fields.sql-66-67 (1)
66-67: Add explicit type validation for year fields ininsertArtifactMetadata.ts.While the LLM response uses JSON schema enforcement (constraining to numbers/null) and the manual form uses
type="number"inputs,insertArtifactMetadata.tslacks explicit validation before passingbasicSearchMetadatato the RPC. Add a check to ensurefirstAppearanceYearandfirstAppearanceYearOutsideCambodiaare numbers or null before calling the RPC, preventing potential integer cast failures in the SQL function.supabase/migrations/20251107062412_recreate_temp_artifact_metadata.sql-1-41 (1)
1-41: Remove the redundant recreate migration or consolidate into the original.The two migrations create an identical table schema with no meaningful differences in the column definitions. The only changes in the recreate migration are stylistic: it removes the safer
IF NOT EXISTSclause and replaces it withDROP TABLE IF EXISTSfollowed by an unsafeCREATE TABLEstatement (withoutIF NOT EXISTS).Since the schema is identical, this recreate migration is redundant. If the first migration hasn't been deployed to any environment, remove this migration and keep only the original create with its safer
IF NOT EXISTS syntax. If the first migration was already deployed, consolidate both into a single idempotent migration that uses CREATE TABLE IF NOT EXISTS rather than DROP TABLE.
app/search/utils.ts-14-15 (1)
14-15: File extension extraction may fail for files without extensions.If a filename has no extension (e.g.,
"README"),file.name.split('.').pop()returns"README", andfileExtensionbecomes".readme", which is incorrect behavior.- const fileExtension = '.' + file.name.split('.').pop()?.toLowerCase(); + const nameParts = file.name.split('.'); + const fileExtension = nameParts.length > 1 ? '.' + nameParts.pop()?.toLowerCase() : '';app/admin/admin-review/components/ApprovalDrawer/README.md-26-177 (1)
26-177: Update README documentation to reflect actual implementation instead of incomplete TODOs.The TODO section describes outdated backend integration steps. Verification shows the folder functionality is already fully implemented using the statues table and API (
/api/admin/statues) rather than a separate folders table and /api/admin/folders endpoint. The approve/deny routes exist but use a different database schema (temp_artifact_metadata table) than documented.
Replace the TODO section with accurate documentation of the current implementation: folders are managed through statues endpoints, metadata is stored in temp_artifact_metadata, and the ApprovalDrawer component already fetches and manages folder selections dynamically.
examples/exampleStatueSearchModuleUsage.ts-18-30 (1)
examples/exampleStatueSearchModuleUsage.ts-18-30 (1)
18-30: Minor: Incomplete status code handling in switch statement.The switch statement handles
NORMAL,EMPTY_RESULT, andCONNECTION_ERRORcases but omits theEXCEED_LIMITstatus code defined inSTATUE_SEARCH_STATUS_CODES(visible in lib/search/statueSearchStatus.ts). While the default case will catch it, explicitly handling all defined status codes improves clarity and maintainability.Apply this diff to add the missing case:
case STATUE_SEARCH_STATUS_CODES.CONNECTION_ERROR: console.error(`[${payload.statusCode}] The search failed due to a database connectivity issue`, payload.error); break; + case STATUE_SEARCH_STATUS_CODES.EXCEED_LIMIT: + console.warn(`[${payload.statusCode}] The search exceeded the maximum result limit`); + break; default: console.warn(`[${payload.statusCode}] Unexpected status code`, payload.status); }app/api/upload/route.ts-26-31 (1)
26-31: Minor: Validated field is not used in this handler.The
shortDescription field is validated in the upload endpoint but never passed to handleUploadImage or used elsewhere in this handler. The field is validated again on the frontend and subsequently used in the process-metadata endpoint. The validation here is redundant and can be removed.
examples/exampleStatueSearchModuleUsage.ts-13-13 (1)
13-13:__dirnameis undefined in ESM context and will cause a runtime error.Line 13 uses
__dirname, which is not available in ES Module context. Since this file uses ESM imports and the project targets"module": "esnext"in tsconfig, this will fail withReferenceError: __dirname is not definedat runtime.Use
import.meta.url instead:
import { fileURLToPath } from 'node:url';
import { dirname } from 'node:path';

const __dirname = dirname(fileURLToPath(import.meta.url));
lib/schemas/search_json.json-16-16 (1)
16-16: Reconsider the default value for repatriated field.Defaulting
repatriatedtotrueseems unusual for a search schema, as it would filter to only repatriated artifacts. For a search filter schema, this field should likely default tofalse, benull, or be omitted to avoid unintentionally restrictive searches.Consider changing to:
- "repatriated": true + "repatriated": falseOr omit the field entirely if it's optional.
lib/schemas/search_json.json-18-27 (1)
18-27: Review boolean and numeric default values in advanced search.The advanced section has several potentially problematic defaults:
- All boolean flags (
base_present,inscription,multiple_heads,fragmentary) default totruenumber_of_armsdefaults to4These specific defaults would create a very restrictive search filter that matches only artifacts with all these characteristics. For a search schema, these should likely default to
null/undefined(to not filter) orfalse(for booleans), unless these defaults represent the most common search use case.Consider:
"advanced": { "image_source": "", "material": "", - "base_present": true, - "inscription": true, - "multiple_heads": true, - "fragmentary": true, - "number_of_arms": 4, + "base_present": false, + "inscription": false, + "multiple_heads": false, + "fragmentary": false, + "number_of_arms": null, "limbs_present": [], "parts_fragmented": [] }lib/upload/uploadImage.ts-29-29 (1)
29-29: Validate file extension extraction.The file extension extraction doesn't validate the result. If the filename has no extension,
fileExtensionwill be the entire filename, leading to an incorrect file path.Apply this diff:
- const fileExtension = file.name.split('.').pop(); + const fileExtension = file.name.split('.').pop(); + if (!fileExtension) { + throw new Error('File must have a valid extension'); + }Committable suggestion skipped: line range outside the PR's diff.
app/search/constants.ts-18-23 (1)
18-23: Typo:placehldershould beplaceholder.This typo will cause the placeholder text to not render for the "Suspected Current Location" field.
{ id: 'suspectedCurrentLocation', label: 'Suspected Current Location', type: 'text', - placehlder: 'e.g. Getty Museum', + placeholder: 'e.g. Getty Museum', },app/api/admin/approve/route.test.ts-125-149 (1)
125-149: Test name does not match expected behavior.The test is named
'returns 400 when unable to determine internal reference number'but the assertion at line 147 expectsstatusto be200withsuccess: true. Either the test name is incorrect, or the assertion is wrong.- it('returns 400 when unable to determine internal reference number', async () => { + it('returns 200 when internal reference cannot be determined but route still succeeds', async () => {app/api/admin/crud/route.ts-28-30 (1)
28-30: Update action also requires an id but lacks validation.The delete action validates for a required
id, but update also requires one (as defined inUpdateRequest). Consider adding similar validation for the update action.if (body.action === 'delete' && (body.id === undefined || body.id === null)) { return NextResponse.json({ error: 'Delete action requires an id' }, { status: 400 }); } + + if (body.action === 'update' && (body.id === undefined || body.id === null)) { + return NextResponse.json({ error: 'Update action requires an id' }, { status: 400 }); + }app/api/admin/crud/route.ts-65-66 (1)
65-66: Use NotImplementedError for unimplemented actions.
NotImplementedErroris defined but never used. The default case throws a genericError, which returns a 500 instead of the intended 501 with guidance. For the 'read' action (which is allowed inTABLE_REGISTRYbut not implemented), this would incorrectly return 500.default: - throw new Error(`Unsupported action: ${payload.action as string}`); + throw new NotImplementedError(payload.action); }lib/llm/callMetadataLLM.ts-182-184 (1)
182-184: Avoid logging response content in production.Logging up to 2000 characters of the raw response can expose sensitive metadata to logs. Consider guarding this log statement with a development-only check.
const responseText = await response.text(); - console.log('Raw OpenRouter response text:', responseText.substring(0, 2000)); + if (process.env.NODE_ENV !== 'production') { + console.log('Raw OpenRouter response text:', responseText.substring(0, 2000)); + }app/admin/db-view/page.tsx-756-774 (1)
756-774: Date parsing may throw on invalid strings.If
formData[col]contains an invalid date string,new Date(formData[col] as string)will produce an "Invalid Date" object, andtoISOString()will throw. Add a guard.if (isDateField) { - // Convert database date string to YYYY-MM-DD format for input - const dateValue = formData[col] ? new Date(formData[col] as string).toISOString().split('T')[0] : ''; + // Convert database date string to YYYY-MM-DD format for input + let dateValue = ''; + if (formData[col]) { + const parsed = new Date(formData[col] as string); + if (!isNaN(parsed.getTime())) { + dateValue = parsed.toISOString().split('T')[0]; + } + }app/upload/components/DescriptionMetadataEntry.tsx-68-81 (1)
68-81: Potentially confusing rate limit message.When
remaining > 0butaiMetadataAvailableis false, the message states AI is unavailable while also showing uploads remaining. This could confuse users—if uploads remain, why is AI unavailable? Clarify the distinction between rate limiting and other unavailability reasons.app/components/Navbar.tsx-62-71 (1)
62-71: Missingidattribute foraria-labelledbyreference.The
Menucomponent referencesaria-labelledby: 'admin-button', but theButtonelement doesn't haveid="admin-button". This breaks the accessibility association.- <Button variant="contained" color="primary" onClick={handleAdminMenuOpen} sx={{ position: 'relative' }}> + <Button id="admin-button" variant="contained" color="primary" onClick={handleAdminMenuOpen} sx={{ position: 'relative' }}> Admin </Button>app/search/components/SearchForm.tsx-226-235 (1)
226-235: Number values not converted to string when editing chip.When clicking a chip with a numeric value,
advancedInputValueis set to an empty string instead of the stringified number, losing the current value.const handleChipClick = (selection: AdvancedSelection) => { if (selection.type === 'binary') { handleToggleBinary(selection.id); return; } setActiveParamId(selection.id); - setAdvancedInputValue(typeof selection.value === 'string' ? selection.value : ''); + setAdvancedInputValue( + typeof selection.value === 'string' ? selection.value : String(selection.value ?? '') + ); focusAdvancedInput(); };
const ADMIN_ID = process.env.ADMIN_ID!;

export async function GET() {
  const { userId } = await auth();
  if (userId !== ADMIN_ID) {
    return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
  }
Critical: Missing runtime validation for ADMIN_ID environment variable.
Line 5 uses a non-null assertion operator on process.env.ADMIN_ID without runtime validation. If this environment variable is not set, ADMIN_ID will be undefined, and the auth check on line 9 could pass when userId is also undefined, creating a severe security vulnerability that grants admin access to unauthenticated requests.
Apply this diff to add runtime validation:
-const ADMIN_ID = process.env.ADMIN_ID!;
+const ADMIN_ID = process.env.ADMIN_ID;
+
+if (!ADMIN_ID) {
+ throw new Error('ADMIN_ID environment variable is required');
+}
export async function GET() {
const { userId } = await auth();
🤖 Prompt for AI Agents
In app/api/admin/pending-images/route.ts around lines 5 to 11, the code uses
process.env.ADMIN_ID! without runtime validation which can lead to ADMIN_ID
being undefined and allowing unauthorized access when both userId and ADMIN_ID
are undefined; replace the non-null assertion with a runtime check that throws
or returns a 500 error at module initialization (or inside GET) if ADMIN_ID is
missing or empty, ensure ADMIN_ID is read as a string (and parsed if numeric)
into a properly validated constant before performing the auth comparison, and
fail fast with a clear error message so the route cannot inadvertently grant
admin access.
const { data: maxStatue, error: maxError } = await supabase
  .from('statues')
  .select('statue_id')
  .order('statue_id', { ascending: false })
  .limit(1)
  .single();

// PGRST116 is "no rows returned" which is fine for empty table
if (maxError && maxError.code !== 'PGRST116') {
  return NextResponse.json({ error: maxError.message }, { status: 500 });
}

const nextStatueId = maxStatue ? maxStatue.statue_id + 1 : 1;
Race condition when computing next statue_id.
Concurrent POST requests can read the same max statue_id and attempt to insert duplicates. If statue_id has a unique constraint, one request will fail; if not, duplicates are created.
Consider using a database sequence or transaction with row locking:
+ // Use a transaction with advisory lock to prevent race conditions
+ const { data: newStatue, error: insertError } = await supabase.rpc('create_statue_with_next_id');
Or create a PostgreSQL function that atomically handles ID generation:
CREATE OR REPLACE FUNCTION create_statue_with_next_id()
RETURNS statues AS $$
DECLARE
next_id INTEGER;
new_statue statues;
BEGIN
LOCK TABLE statues IN SHARE ROW EXCLUSIVE MODE;
SELECT COALESCE(MAX(statue_id), 0) + 1 INTO next_id FROM statues;
INSERT INTO statues (statue_id) VALUES (next_id) RETURNING * INTO new_statue;
RETURN new_statue;
END;
$$ LANGUAGE plpgsql;
const formData = await request.formData();
const file = formData.get('file') as File;
const shortDescription = formData.get('shortDescription') as string;

// Validate that short description is provided
if (!shortDescription || shortDescription.trim() === '') {
  return NextResponse.json({ error: 'Short description is required' }, { status: 400 });
}

try {
  const result = await handleUploadImage({ file });
Critical: Missing file validation before upload.
The file field is extracted from formData (line 25) but never validated before being passed to handleUploadImage (line 34). If no file is provided in the request, file will be null, which could cause runtime errors in the upload handler.
Apply this diff to add file validation:
const formData = await request.formData();
const file = formData.get('file') as File;
const shortDescription = formData.get('shortDescription') as string;
+ // Validate that file is provided
+ if (!file) {
+ return NextResponse.json({ error: 'File is required' }, { status: 400 });
+ }
+
// Validate that short description is provided
if (!shortDescription || shortDescription.trim() === '') {
return NextResponse.json({ error: 'Short description is required' }, { status: 400 });
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
const formData = await request.formData();
const file = formData.get('file') as File;
const shortDescription = formData.get('shortDescription') as string;
// Validate that short description is provided
if (!shortDescription || shortDescription.trim() === '') {
  return NextResponse.json({ error: 'Short description is required' }, { status: 400 });
}
try {
  const result = await handleUploadImage({ file });

const formData = await request.formData();
const file = formData.get('file') as File;
const shortDescription = formData.get('shortDescription') as string;
// Validate that file is provided
if (!file) {
  return NextResponse.json({ error: 'File is required' }, { status: 400 });
}
// Validate that short description is provided
if (!shortDescription || shortDescription.trim() === '') {
  return NextResponse.json({ error: 'Short description is required' }, { status: 400 });
}
try {
  const result = await handleUploadImage({ file });
🤖 Prompt for AI Agents
In app/api/upload/route.ts around lines 24 to 34, the code reads the file from
formData but never validates it; add a guard before calling handleUploadImage
that checks the file is present and is a File (and optionally non-empty), and if
not return NextResponse.json({ error: 'File is required' }, { status: 400 }); so
the upload handler never receives a null value and runtime errors are prevented.
import { NextResponse } from 'next/server';

const isAdminRoute = createRouteMatcher(['/admin(.*)', '/api/admin(.*)']);
const skipAuth = process.env.NEXT_PUBLIC_BYPASS_CLERK === 'true';
Critical security concern: Client-accessible authentication bypass.
The NEXT_PUBLIC_BYPASS_CLERK environment variable is exposed to the client browser due to the NEXT_PUBLIC_ prefix. This means any user can inspect the client-side code and discover whether authentication is bypassed, and potentially attempt to manipulate client state or environment to exploit this.
Apply this diff to use a server-only environment variable:
-const skipAuth = process.env.NEXT_PUBLIC_BYPASS_CLERK === 'true';
+const skipAuth = process.env.BYPASS_CLERK === 'true';
Ensure you update .env.example and any documentation to reflect the renamed variable without the NEXT_PUBLIC_ prefix.
+const skipAuth = process.env.BYPASS_CLERK === 'true';Ensure you update .env.example and any documentation to reflect the renamed variable without the NEXT_PUBLIC_ prefix.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
- const skipAuth = process.env.NEXT_PUBLIC_BYPASS_CLERK === 'true';
+ const skipAuth = process.env.BYPASS_CLERK === 'true';
🤖 Prompt for AI Agents
In middleware.ts around line 6, the code reads the client-exposed
NEXT_PUBLIC_BYPASS_CLERK variable; replace it with a server-only environment
variable (e.g. process.env.BYPASS_CLERK) so the bypass flag is not exposed to
the browser, keeping the same truth check (compare to 'true'); update
.env.example and any docs to rename NEXT_PUBLIC_BYPASS_CLERK to BYPASS_CLERK and
document that it is server-only, and search the repo for any other usages of
NEXT_PUBLIC_BYPASS_CLERK to update them accordingly.
| "@prisma/client": "^6.19.0", | ||
| "@supabase-js/source": "github:supabase/supabase-js", | ||
| "@supabase/supabase-js": "^2.76.1", | ||
| "@tailwindcss/postcss": "^4.1.16", |
🧩 Analysis chain
🌐 Web query:
Tailwind CSS v4 @tailwindcss/postcss Next.js 15 configuration guide
💡 Result:
Quick guide — Tailwind CSS v4 + @tailwindcss/postcss in a Next.js 15 app
- Install
- npm install -D tailwindcss @tailwindcss/postcss postcss. [1][2]
- PostCSS config (root)
- Create postcss.config.mjs and export the plugin:
export default { plugins: { '@tailwindcss/postcss': {}, }, };
(ESM file recommended). [1][2]
- Global CSS
- In your global CSS (e.g., styles/globals.css) import Tailwind:
@import "tailwindcss";
Then import that CSS in your app entry (app/layout.js or pages/_app.js). [1][3]
- Configuration model
- v4 defaults to “zero config” and a CSS-first approach. You can configure theme, safelists, and extra sources inside your CSS using @theme and @source directives rather than a tailwind.config.js file (optional — tailwind.config.js still supported if you prefer). [1]
- Common gotchas
- Ensure you use @tailwindcss/postcss (not tailwindcss) in PostCSS; postcss must be installed (v8+). If Tailwind appears not to apply, try clearing .next and restarting the dev server. [1][5]
References
- Tailwind v4 release + CSS-first / installation notes. [1]
- Tailwind docs — Using PostCSS (installation steps). [2]
- Next.js docs (Tailwind integration example). [3]
If you want, I can generate the exact sample files (package.json scripts, postcss.config.mjs, styles/globals.css, and app/layout.js or pages/_app.js) tailored to your repo.
🏁 Script executed:
# Check for PostCSS config
fd -t f '(postcss|tailwind).*\.(config|js|mjs|cjs|ts)' || echo "Searching for config files..."
# Check package.json for conflicting packages and Next.js version
cat -n package.json | head -30
Repository: BU-Spark/se-stolen-artifacts
Length of output: 1164
🏁 Script executed:
# Search for any conflicting tailwindcss package or duplicate Tailwind entries
rg "tailwindcss|@tailwindcss" package.json -i -A 1 -B 1Repository: BU-Spark/se-stolen-artifacts
Length of output: 172
🏁 Script executed:
# Check for PostCSS configuration in common locations
ls -la | grep -E "postcss|tailwind" || echo "No config files in root"
cat -n postcss.config.js 2>/dev/null || cat -n postcss.config.mjs 2>/dev/null || echo "PostCSS config not found"
Repository: BU-Spark/se-stolen-artifacts
Length of output: 120
🏁 Script executed:
# Check for tailwind.config files
ls -la tailwind.config.* 2>/dev/null || echo "No tailwind.config found"
Repository: BU-Spark/se-stolen-artifacts
Length of output: 96
Add missing PostCSS configuration and install the postcss package.
Tailwind CSS v4 with @tailwindcss/postcss requires:
- The
postcss package (v8+) in dependencies—it's currently missing
- A
postcss.config.mjs file in the project root with:
export default { plugins: { '@tailwindcss/postcss': {} } };
- A global CSS file importing
@import "tailwindcss";and imported inapp/layout.js
Without these, Tailwind v4 will not work despite the package being listed. Your current setup is incomplete.
🤖 Prompt for AI Agents
In package.json around line 27, the project lists "@tailwindcss/postcss" but is
missing the required PostCSS package and configuration; install the postcss
dependency (v8+), add a project-root file named postcss.config.mjs exporting
default { plugins: { '@tailwindcss/postcss': {} } }, and create/import a global
CSS file that contains @import "tailwindcss"; then import that global CSS from
app/layout.js so Tailwind v4 is properly initialized.
SELECT COALESCE(MAX(statue_id), 0) + 1 INTO next_statue_id FROM statues;
SELECT COALESCE(MAX(approval_id), 0) + 1 INTO next_approval_id FROM approval;
Race condition: MAX() + 1 pattern is unsafe for concurrent operations.
Computing IDs using MAX(column) + 1 creates a race condition when multiple requests execute simultaneously. Two concurrent transactions can compute the same next_statue_id or next_approval_id, leading to primary key violations or unpredictable behavior.
Replace manual ID computation with PostgreSQL sequences or SERIAL/IDENTITY columns:
Option 1 (recommended): Use SERIAL or IDENTITY columns
Modify the table definitions to use auto-incrementing columns:
-- In the table creation migration
ALTER TABLE statues ALTER COLUMN statue_id ADD GENERATED BY DEFAULT AS IDENTITY;
ALTER TABLE approval ALTER COLUMN approval_id ADD GENERATED BY DEFAULT AS IDENTITY;
Then update the function to use DEFAULT for these columns:
- SELECT COALESCE(MAX(statue_id), 0) + 1 INTO next_statue_id FROM statues;
- SELECT COALESCE(MAX(approval_id), 0) + 1 INTO next_approval_id FROM approval;
-
-- Insert the new statue row
- INSERT INTO statues (statue_id)
- VALUES (next_statue_id);
+ INSERT INTO statues (statue_id)
+ VALUES (DEFAULT)
+ RETURNING statue_id INTO next_statue_id;
-- ... (images insert)
INSERT INTO approval (
image_id,
status,
admin_id,
- approval_id
+ approval_id
)
VALUES (
new_internal_ref_id,
'pending_review',
'cacf0e78-f815-4a47-abed-179aa74c40eb',
- next_approval_id
+ DEFAULT
);Option 2: Use explicit sequences
-- Create sequences if they don't exist
CREATE SEQUENCE IF NOT EXISTS statues_statue_id_seq;
CREATE SEQUENCE IF NOT EXISTS approval_approval_id_seq;
Update the function:
- SELECT COALESCE(MAX(statue_id), 0) + 1 INTO next_statue_id FROM statues;
- SELECT COALESCE(MAX(approval_id), 0) + 1 INTO next_approval_id FROM approval;
+ SELECT nextval('statues_statue_id_seq') INTO next_statue_id;
+ SELECT nextval('approval_approval_id_seq') INTO next_approval_id;
🤖 Prompt for AI Agents
In supabase/migrations/20251026005531_handle_new_image_upload.sql around lines
13-14 the function computes new IDs using SELECT COALESCE(MAX(...),0)+1 which
causes a race condition under concurrency; fix by switching to database-managed
sequences/identity columns: modify the statues and approval tables to use
GENERATED BY DEFAULT AS IDENTITY (or SERIAL/IDENTITY) so inserts can rely on
DEFAULT for the id, or create explicit sequences if you must and use
nextval(sequence) when inserting; update this migration/function to remove
MAX()+1 and either let INSERT use DEFAULT for the identity columns or call
nextval on the created sequences.
| CREATE OR REPLACE FUNCTION insert_llm_artifact_metadata_withGCS( | ||
| image_id_input TEXT, | ||
| gcs_path_input TEXT, | ||
| internal_reference_number_input TEXT, -- Added internal_reference_number | ||
| basic_search_metadata_input JSONB, | ||
| advanced_search_metadata_input JSONB, | ||
| short_description_input TEXT, | ||
| long_description_input TEXT, | ||
| ai_generated_input BOOLEAN | ||
| ) | ||
| RETURNS UUID AS $$ | ||
| DECLARE | ||
| new_id UUID := gen_random_uuid(); | ||
| BEGIN | ||
| INSERT INTO artifact_metadata_upload_log ( | ||
| id, | ||
| image_id, | ||
| gcs_path, | ||
| internal_reference_number, -- Added internal_reference_number | ||
| subject, | ||
| dealer_name, | ||
| suspected_current_location, | ||
| artifact_title, | ||
| photograph_location, | ||
| first_appearance_year, | ||
| first_appearance_year_outside_cambodia, | ||
| repatriated, | ||
| image_source, | ||
| material, | ||
| has_inscription, | ||
| multiple_heads, | ||
| four_arms, | ||
| eight_arms, | ||
| ten_arms, | ||
| over_ten_arms, | ||
| fragmentary, | ||
| fragments_from_multiple_statues, | ||
| head_present, | ||
| torso_present, | ||
| shoulder_elbow_present, | ||
| elbow_wrist_present, | ||
| hip_knee_present, | ||
| knee_ankle_present, | ||
| foot_present, | ||
| base_present, | ||
| fragmented_at_neck, | ||
| fragmented_at_shoulder, | ||
| fragmented_at_elbow, | ||
| fragmented_at_wrist, | ||
| fragmented_at_upper_leg, | ||
| fragmented_at_knee, | ||
| fragmented_at_ankle, | ||
| short_description, | ||
| long_description, | ||
| ai_generated, | ||
| created_at | ||
| ) VALUES ( | ||
| new_id, | ||
| image_id_input, | ||
| gcs_path_input, | ||
| internal_reference_number_input, -- Insert internal_reference_number | ||
| basic_search_metadata_input->>'subject', | ||
| basic_search_metadata_input->>'dealerName', | ||
| basic_search_metadata_input->>'suspectedCurrentLocation', | ||
| basic_search_metadata_input->>'artifactTitle', | ||
| basic_search_metadata_input->>'photographLocation', | ||
| (basic_search_metadata_input->>'firstAppearanceYear')::INTEGER, | ||
| (basic_search_metadata_input->>'firstAppearanceYearOutsideCambodia')::INTEGER, | ||
| COALESCE((basic_search_metadata_input->>'repatriated')::BOOLEAN, FALSE), | ||
| advanced_search_metadata_input->>'imageSource', | ||
| advanced_search_metadata_input->>'material', | ||
| COALESCE((advanced_search_metadata_input->>'hasInscription')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'multipleHeads')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fourArms')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'eightArms')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'tenArms')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'overTenArms')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentary')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentsFromMultipleStatues')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'headPresent')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'torsoPresent')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'shoulderElbowPresent')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'elbowWristPresent')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'hipKneePresent')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'kneeAnklePresent')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'footPresent')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'basePresent')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentedAtNeck')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentedAtShoulder')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentedAtElbow')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentedAtWrist')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentedAtUpperLeg')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentedAtKnee')::BOOLEAN, FALSE), | ||
| COALESCE((advanced_search_metadata_input->>'fragmentedAtAnkle')::BOOLEAN, FALSE), | ||
| short_description_input, | ||
| long_description_input, | ||
| ai_generated_input, | ||
| NOW() | ||
| ); | ||
| RETURN new_id; | ||
| END; | ||
| $$ LANGUAGE plpgsql; No newline at end of file |
🧩 Analysis chain
🏁 Script executed:
rg -n -A 20 'insert_llm_artifact_metadata_withGCS' --type=ts
Repository: BU-Spark/se-stolen-artifacts
Length of output: 54
🏁 Script executed:
rg -n 'insert_llm_artifact_metadata_withGCS|insertArtifactMetadata' -A 15
Repository: BU-Spark/se-stolen-artifacts
Length of output: 13841
🏁 Script executed:
cat -n lib/db/insertArtifactMetadata.ts
Repository: BU-Spark/se-stolen-artifacts
Length of output: 2611
🏁 Script executed:
rg -n 'insert_llm_artifact_metadata_with_timestamp' supabase/migrations/
Repository: BU-Spark/se-stolen-artifacts
Length of output: 526
This function is not being called by the application; it has been superseded by a later migration.
The TypeScript code calls insert_llm_artifact_metadata_with_timestamp (created in migrations dated 20251207+), but this migration creates insert_llm_artifact_metadata_withGCS. Additionally, the reviewed function accepts 8 parameters while TypeScript passes 9 parameters (including misc_information_input), which the reviewed function does not support.
The original approval overlooked this critical disconnect between the reviewed SQL and the actual RPC interface being used by the application.
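To make the disconnect concrete, here is a hedged sketch of the application-side call that the migration would need to support. The RPC name comes from the review text above; the parameter keys and types are assumptions for illustration only, not the verified signature.

```ts
import { supabase } from '@/lib/db/supabase';

// Illustrative only: parameter keys are assumptions, not the confirmed
// signature of insert_llm_artifact_metadata_with_timestamp.
export async function insertMetadataViaRpc(args: {
  imageId: string;
  gcsPath: string;
  internalReferenceNumber: string;
  basicSearchMetadata: Record<string, unknown>;
  advancedSearchMetadata: Record<string, unknown>;
  shortDescription: string;
  longDescription: string;
  miscInformation: string; // the 9th argument the reviewed SQL function does not accept
  aiGenerated: boolean;
}): Promise<void> {
  const { error } = await supabase.rpc('insert_llm_artifact_metadata_with_timestamp', {
    image_id_input: args.imageId,
    gcs_path_input: args.gcsPath,
    internal_reference_number_input: args.internalReferenceNumber,
    basic_search_metadata_input: args.basicSearchMetadata,
    advanced_search_metadata_input: args.advancedSearchMetadata,
    short_description_input: args.shortDescription,
    long_description_input: args.longDescription,
    misc_information_input: args.miscInformation,
    ai_generated_input: args.aiGenerated,
  });
  if (error) {
    throw new Error(`Metadata insert failed: ${error.message}`);
  }
}
```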
Pull Request: Fall 2025 Semester Changes (From Greenfield)
Description
Includes all changes from the start of the semester. All details in
README.md.
Summary by CodeRabbit
New Features
Documentation
✏️ Tip: You can customize this high-level summary in your review settings.