Litscreen
Built a multi-model AI screening tool solo in 8 weeks - 89% faster than the manual process.

Design research teams were spending 45+ minutes per screen manually tagging UI components, categorising patterns, and writing metadata - unscalable for any team handling real volume. The results were inconsistent across reviewers and impossible to search.
When Gemini 3 Pro launched, I noticed it had strong visual reasoning - far better than anything available for image understanding at the time. Claude, on the other hand, was the best model for structured reasoning and scoring. Most builders would default to one model and accept the tradeoff. I decided to play to the strengths of each: Gemini handles the visual parsing and component recognition, Claude takes that structured output and applies reasoning, scoring, and metadata generation. Two models, two jobs, one pipeline - with a human-in-the-loop verification step so the output stays trustworthy.
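The two-model split can be sketched as a simple orchestration: one function per model, chained, with the output defaulting to a pending-review state. This is a minimal illustration, not the production code - the model calls are stubbed and all type and function names are assumptions.

```typescript
// Two models, two jobs, one pipeline - both model calls stubbed for illustration.

interface VisualParse {
  components: string[]; // UI components the vision model identifies
  layout: string;       // coarse layout description
}

interface ScreenRecord {
  tags: string[];
  score: number;        // 0-100 quality score from the reasoning model
  status: "pending_review" | "published";
}

// Stage 1: Gemini handles visual parsing and component recognition (stub).
async function parseVisual(imageUrl: string): Promise<VisualParse> {
  return { components: ["nav-bar", "card-grid"], layout: "dashboard" };
}

// Stage 2: Claude applies reasoning, scoring, and metadata generation (stub).
async function scoreAndTag(parse: VisualParse): Promise<ScreenRecord> {
  return {
    tags: [...parse.components, parse.layout],
    score: 82,
    // Human-in-the-loop: nothing is published without sign-off.
    status: "pending_review",
  };
}

async function processScreen(imageUrl: string): Promise<ScreenRecord> {
  const parse = await parseVisual(imageUrl); // model 1: vision
  return scoreAndTag(parse);                 // model 2: reasoning
}
```

The key design choice is that the vision model emits structured data, not prose, so the reasoning model works from a typed intermediate representation rather than re-interpreting the image.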


A full-stack AI research tool with an admin dashboard, public search interface, and component library. The AI pipeline processes uploaded screenshots end-to-end: image analysis, metadata extraction, contextual tagging, structured output, human review, and database publishing. Built on Next.js, Node.js, and Supabase. Deployed to production.
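The end-to-end flow above amounts to a small state machine with one gated transition: a screen only reaches the database after human review. A hedged sketch, assuming stage names that mirror the write-up (the actual statuses may differ):

```typescript
// Pipeline stages with a human-review gate before publishing (illustrative).

type Stage =
  | "uploaded"        // screenshot received
  | "analyzed"        // image analysis complete
  | "tagged"          // metadata extraction + contextual tagging done
  | "pending_review"  // structured output awaiting a human
  | "published";      // live in the searchable database

const next: Record<Stage, Stage | null> = {
  uploaded: "analyzed",
  analyzed: "tagged",
  tagged: "pending_review",
  pending_review: "published",
  published: null,
};

function advance(stage: Stage, humanApproved = false): Stage {
  // The only gated transition: publishing requires explicit approval.
  if (stage === "pending_review" && !humanApproved) {
    return stage;
  }
  return next[stage] ?? stage;
}
```

Modelling the gate explicitly is what keeps the output trustworthy: the AI stages can run fully automatically because nothing they produce goes live on its own.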
Workflow dropped from 45 minutes to 5 minutes per screen - an 89% efficiency gain. 23,000 lines of production code shipped in 8 weeks. Proved that a PM with a sharp technical point of view can move as fast as a small engineering team.
Stack: Next.js (frontend + API routes), Node.js, Supabase (PostgreSQL + storage), Cloudinary CDN. AI layer: Gemini 3 Pro for image understanding and component recognition, Claude Sonnet for reasoning, metadata generation, and scoring, with human-in-the-loop (HITL) verification to catch hallucinations. Search: PostgreSQL full-text search with custom ranking logic. The prompt architecture was iterated heavily: structured output formats, token optimisation (SOPs, context window management), and a 40% cost reduction through prompt engineering.
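The "custom ranking logic" over Postgres full-text search might look like the sketch below: a standard `ts_rank` query, then a blend of the text-match rank with the model-assigned quality score so strong screens outrank keyword-only matches. Table name, column names, and weights are assumptions for illustration.

```typescript
// Assumed schema: screens(id, title, search_vector tsvector, score, status).
const searchSql = `
  SELECT id, title, score,
         ts_rank(search_vector, websearch_to_tsquery('english', $1)) AS text_rank
  FROM screens
  WHERE search_vector @@ websearch_to_tsquery('english', $1)
    AND status = 'published'
  LIMIT 50;
`;

// Custom ranking: blend Postgres text relevance with the 0-100 quality
// score from the AI pipeline (weights are illustrative).
function blendRank(textRank: number, qualityScore: number): number {
  return 0.7 * textRank + 0.3 * (qualityScore / 100);
}
```

Fetching a wider candidate set from SQL and re-ranking in the application layer keeps the relevance weights easy to tune without a migration.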