| name | maintain-coding-standards |
| description | Guide code development to follow project coding standards while writing new features. Automatically applies TypeScript patterns, React best practices, API route structure, testing patterns, and style guidelines. Use when building new features, writing new code, creating components, or implementing functionality to ensure standards are followed from the start. |
Maintain Coding Standards Skill
This skill helps you write code that follows project standards as you build, preventing anti-patterns before they're written.
What This Skill Does
When you're building new features, this skill:
- Applies project patterns automatically to new code
- Reminds you of required patterns (withAuth, validation, etc.)
- Suggests the right approach for common tasks
- Prevents anti-patterns before they're written
- Guides TypeScript types, React patterns, API structure, and testing
This is NOT a code review tool - it's a development guide that keeps you on track while coding.
When This Activates
This skill automatically activates when you're:
- Building a new feature or component
- Writing a new API route
- Creating a service class
- Implementing tests
- Adding new functionality
Key trigger phrases:
- "build a new [feature]"
- "create a [component/route/service]"
- "implement [functionality]"
- "add [feature]"
- "write code for [task]"
Development Patterns to Apply
1. TypeScript - Write Type-Safe Code
When writing new TypeScript code, automatically apply these patterns:
Use branded types for IDs:
// ✅ ALWAYS write new functions like this:
import { type UserId, type ProjectId } from '@/types/branded';
function getProject(projectId: ProjectId, userId: UserId): Promise<Project | null> {
// Implementation
}
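If the branded types themselves are unfamiliar, here is a minimal sketch of what they typically look like; the real definitions live in `@/types/branded` and may differ:

```typescript
// Illustrative sketch only - use the actual definitions from '@/types/branded'
type Brand<T, B extends string> = T & { readonly __brand: B };

export type UserId = Brand<string, 'UserId'>;
export type ProjectId = Brand<string, 'ProjectId'>;

// Plain strings no longer satisfy these types, so swapped arguments
// (e.g. passing a userId where a ProjectId is expected) become compile-time errors.
```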
Never use any - use unknown or generics:
// ✅ When processing unknown data:
function processData(data: unknown): ProcessedData {
// Validate and narrow type
if (typeof data !== 'object' || data === null) {
throw new ValidationError('Invalid data');
}
// Now safe to use
}
// ✅ For generic functions:
function transform<T>(items: T[]): T[] {
return items.map((item) => item); // apply your transformation to each item here
}
Always specify return types:
// ✅ Every function needs explicit return type:
async function fetchUser(id: UserId): Promise<User | null> {
// Implementation
}
function calculate(x: number): number {
return x * 2;
}
2. React Components - Write Composable Components
When creating new React components, automatically structure them like this:
Reusable UI components with forwardRef:
// ✅ ALWAYS use forwardRef for reusable components:
import React from 'react';
interface ButtonProps extends React.ButtonHTMLAttributes<HTMLButtonElement> {
variant?: 'primary' | 'secondary';
size?: 'sm' | 'md' | 'lg';
}
const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
({ variant = 'primary', size = 'md', ...props }, ref) => {
return (
<button ref={ref} className={cn(buttonVariants({ variant, size }))} {...props} />
);
}
);
Button.displayName = 'Button';
export { Button };
Hook order - ALWAYS follow this sequence:
function MyComponent() {
// 1. Context hooks
const auth = useAuth();
const theme = useTheme();
// 2. State hooks
const [count, setCount] = useState(0);
const [isLoading, setIsLoading] = useState(false);
// 3. Ref hooks
const inputRef = useRef<HTMLInputElement>(null);
// 4. Effect hooks
useEffect(() => {
// Side effects
}, []);
// 5. Custom hooks
const debouncedValue = useDebounce(count, 500);
return <div>...</div>;
}
Extract complex logic to custom hooks:
// ✅ When logic gets complex, extract to hook:
function useProjectData(projectId: string) {
const [project, setProject] = useState<Project | null>(null);
const [isLoading, setIsLoading] = useState(true);
useEffect(() => {
// Fetch logic
}, [projectId]);
return { project, isLoading };
}
// Then use in component:
function ProjectView({ projectId }: Props) {
const { project, isLoading } = useProjectData(projectId);
// Clean component code
}
Accessibility - WCAG 2.1 AA Compliance:
When building components, ALWAYS ensure accessibility:
// ✅ Keyboard accessible interactive elements (native buttons handle Enter/Space automatically)
<button onClick={handleClick}>
Action
</button>
// ✅ Proper ARIA labels for form inputs
<label htmlFor="email">Email</label>
<input id="email" type="email" aria-required="true" />
// ✅ Alt text for images
<img src={url} alt="User profile photo" />
<img src={decorative} alt="" /> {/* Decorative images */}
// ✅ ARIA roles for custom controls
<div
role="button"
tabIndex={0}
onClick={handleClick}
onKeyDown={(e) => e.key === 'Enter' && handleClick()}
aria-label="Close dialog"
>
×
</div>
// ✅ Modal dialogs with proper ARIA and focus trapping
<div role="dialog" aria-modal="true" aria-labelledby="dialog-title">
<h2 id="dialog-title">Confirm Action</h2>
{/* Content */}
</div>
// ✅ Live regions for dynamic content
<div role="status" aria-live="polite">
{message}
</div>
// ✅ Respect reduced motion preference
const prefersReducedMotion = window.matchMedia('(prefers-reduced-motion: reduce)').matches;
<motion.div
animate={{ opacity: 1 }}
transition={{ duration: prefersReducedMotion ? 0 : 0.3 }}
/>
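The modal example above calls for focus trapping but does not show it. A minimal sketch of one way to trap focus is below; the useFocusTrap hook is hypothetical (not a known project utility), so prefer an existing implementation if the codebase has one:

```typescript
import { useEffect, type RefObject } from 'react';

function useFocusTrap(dialogRef: RefObject<HTMLElement>, isOpen: boolean) {
  useEffect(() => {
    if (!isOpen || !dialogRef.current) return;
    const dialog = dialogRef.current;
    const selector =
      'a[href], button:not([disabled]), textarea, input, select, [tabindex]:not([tabindex="-1"])';
    const previouslyFocused = document.activeElement as HTMLElement | null;

    // Move focus into the dialog when it opens
    dialog.querySelectorAll<HTMLElement>(selector)[0]?.focus();

    function handleKeyDown(event: KeyboardEvent) {
      if (event.key !== 'Tab') return;
      const items = dialog.querySelectorAll<HTMLElement>(selector);
      if (items.length === 0) return;
      const first = items[0];
      const last = items[items.length - 1];
      // Wrap focus at the edges of the dialog
      if (event.shiftKey && document.activeElement === first) {
        event.preventDefault();
        last.focus();
      } else if (!event.shiftKey && document.activeElement === last) {
        event.preventDefault();
        first.focus();
      }
    }

    dialog.addEventListener('keydown', handleKeyDown);
    return () => {
      dialog.removeEventListener('keydown', handleKeyDown);
      // Restore focus to the element that opened the dialog
      previouslyFocused?.focus();
    };
  }, [dialogRef, isOpen]);
}
```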
Accessibility Checklist:
- All interactive elements are keyboard accessible (Tab, Enter, Space, Escape)
- Form inputs have proper labels (htmlFor + id, or aria-label)
- Custom controls have ARIA roles (role="button", role="slider")
- Images have meaningful alt text (or alt="" for decorative)
- Focus indicators are visible (outline or box-shadow)
- Color contrast meets WCAG AA (4.5:1 for text, 3:1 for UI)
- Modals have role="dialog", aria-modal="true", and trap focus
- Dynamic content uses ARIA live regions (role="status", role="alert")
- Reduced motion preference is respected
Reference: See /docs/ACCESSIBILITY.md for complete guidelines.
3. Asynchronous Generation - Queue-Based Architecture
CRITICAL: For ANY generation operation (images, video, upscaling, image editing, audio), ALWAYS use Supabase queues for persistence.
Generation operations include:
- Image generation (DALL-E, Stable Diffusion, etc.)
- Video generation (Runway, Pika, Veo, etc.)
- Video upscaling (Topaz, etc.)
- Image editing (background removal, filters, etc.)
- Audio generation (ElevenLabs, etc.)
- Any AI model inference that takes >2 seconds
Why use queues:
- ✅ Persists job state across server restarts
- ✅ Handles timeouts gracefully (API routes have limits)
- ✅ Enables retry logic for failures
- ✅ Allows polling for status updates
- ✅ Tracks job history and results
- ✅ Prevents duplicate submissions
Architecture Pattern:
// ❌ NEVER do this - direct API call in route
export const POST = withAuth(async (request, context) => {
const result = await generateVideo(prompt); // Blocks for minutes!
return successResponse(result);
});
// ✅ ALWAYS do this - queue-based pattern
export const POST = withAuth(async (request, context) => {
const { user, supabase } = context;
// 1. Create job in queue
const { data: job } = await supabase
.from('generation_queue')
.insert({
user_id: user.id,
type: 'video_generation',
status: 'pending',
input_data: { prompt, model, duration },
})
.select()
.single();
// 2. Return job ID immediately
return successResponse({ jobId: job.id, status: 'pending' });
});
When to Use Queue Pattern:
✅ ALWAYS use for:
- Video generation (any duration)
- Image generation (DALL-E, Stable Diffusion)
- Video upscaling
- Audio generation (>10 seconds)
- Batch processing
- Any operation taking >5 seconds
❌ Don't use for:
- Simple database queries
- Fetching existing data
- Real-time chat responses
- Lightweight transformations (<1 second)
For complete queue implementation patterns (schema, service layer, worker, polling), see /docs/CODING_BEST_PRACTICES.md or existing queue implementations in the codebase.
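For orientation, here is a minimal sketch of the status-polling half of the pattern. The route path and selected response fields are illustrative; the table name matches the generation_queue example above and the helper signatures follow the API route template in this document:

```typescript
// app/api/generation/status/route.ts (illustrative path)
import { NextRequest } from 'next/server';
import { withAuth, type AuthContext } from '@/lib/api/withAuth';
import { RATE_LIMITS } from '@/lib/rateLimit';
import { errorResponse, successResponse } from '@/lib/api/errorResponse';
import { validateAll, validateUUID } from '@/lib/validation';

async function handleGenerationStatus(request: NextRequest, context: AuthContext) {
  const { user, supabase } = context;
  const jobId = request.nextUrl.searchParams.get('jobId');

  const validation = validateAll([validateUUID(jobId, 'jobId')]);
  if (!validation.valid) {
    return errorResponse(validation.errors[0]?.message ?? 'Invalid input', 400);
  }

  // Only return jobs owned by the requesting user
  const { data: job, error } = await supabase
    .from('generation_queue')
    .select('*')
    .eq('id', jobId)
    .eq('user_id', user.id)
    .single();

  if (error || !job) {
    return errorResponse('Job not found', 404);
  }
  return successResponse({ jobId: job.id, status: job.status });
}

export const GET = withAuth(handleGenerationStatus, {
  route: '/api/generation/status',
  rateLimit: RATE_LIMITS.tier3_status_read, // polling is a status read
});
```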
4. API Routes - Write Secure, Validated Routes
When creating new API routes, ALWAYS structure them exactly like this:
Template for ALL new API routes:
// app/api/my-feature/route.ts
import { NextRequest } from 'next/server';
import { withAuth, type AuthContext } from '@/lib/api/withAuth';
import { RATE_LIMITS } from '@/lib/rateLimit';
import { errorResponse, successResponse } from '@/lib/api/errorResponse';
import { validateAll, validateString, validateUUID } from '@/lib/validation';
import { MyService } from '@/lib/services/myService';
async function handleMyFeature(request: NextRequest, context: AuthContext) {
const { user, supabase } = context;
// 1. Parse request
const body = await request.json();
// 2. ALWAYS validate input
const validation = validateAll([
validateString(body.name, 'name', { minLength: 1, maxLength: 100 }),
validateUUID(body.projectId, 'projectId'),
]);
if (!validation.valid) {
return errorResponse(validation.errors[0]?.message ?? 'Invalid input', 400);
}
// 3. Use service layer for business logic
const service = new MyService(supabase);
const result = await service.doSomething(user.id, body);
return successResponse(result);
}
// 4. ALWAYS export with withAuth and rate limiting
export const POST = withAuth(handleMyFeature, {
route: '/api/my-feature',
rateLimit: RATE_LIMITS.tier2_resource_creation, // Choose appropriate tier
});
Rate limit tier guide:
- tier1_auth_payment (5/min): Authentication, payments, admin
- tier2_resource_creation (10/min): AI generation, uploads, exports
- tier3_status_read (30/min): Status checks, reads
- tier4_general (60/min): Logging, general ops
Validation patterns:
// ✅ ALWAYS validate EVERY input:
import { validateAll, validateString, validateUUID, validateEnum } from '@/lib/validation';
const validation = validateAll([
validateString(body.title, 'title', { minLength: 1, maxLength: 200 }),
validateUUID(body.userId, 'userId'),
validateEnum(body.status, 'status', ['active', 'inactive']),
]);
if (!validation.valid) {
return errorResponse(validation.errors[0]?.message ?? 'Invalid input', 400);
}
5. Service Layer - Separate Business Logic
When implementing business logic, ALWAYS create a service class:
Service class template:
// lib/services/myService.ts
import { type SupabaseClient } from '@supabase/supabase-js';
import { trackError, ErrorCategory, ErrorSeverity } from '@/lib/errorTracking';
import { cache, CacheKeys, CacheTTL } from '@/lib/cache';
import { invalidateMyCache } from '@/lib/cacheInvalidation';
export class MyService {
// ✅ ALWAYS use dependency injection
constructor(private supabase: SupabaseClient) {}
async createThing(userId: string, data: CreateData): Promise<Thing> {
try {
// Business logic here
const { data: thing, error } = await this.supabase
.from('things')
.insert({ user_id: userId, ...data })
.select()
.single();
if (error) {
trackError(error, {
category: ErrorCategory.DATABASE,
severity: ErrorSeverity.HIGH,
context: { userId, operation: 'createThing' },
});
throw new Error(`Failed to create thing: ${error.message}`);
}
// Invalidate cache after mutation
await invalidateMyCache(userId);
return thing;
} catch (error) {
trackError(error, {
category: ErrorCategory.DATABASE,
severity: ErrorSeverity.HIGH,
context: { userId },
});
throw error;
}
}
async getThings(userId: string): Promise<Thing[]> {
// ✅ ALWAYS implement caching for reads
const cacheKey = CacheKeys.userThings(userId);
const cached = await cache.get<Thing[]>(cacheKey);
if (cached) return cached;
const { data, error } = await this.supabase.from('things').select('*').eq('user_id', userId);
if (error) throw new Error(`Failed to fetch: ${error.message}`);
await cache.set(cacheKey, data || [], CacheTTL.userThings);
return data || [];
}
}
6. Storage - User-Scoped and Secure Storage
CRITICAL: ALL persistent data must use Supabase Storage with user-scoped paths. Local storage is NOT sustainable.
When implementing file uploads, AI generations, or data persistence, ALWAYS follow these patterns:
Supabase Storage upload pattern:
// ✅ ALWAYS use user-scoped paths
import { createServerSupabaseClient } from '@/lib/supabase';
async function uploadAsset(
userId: string,
projectId: string,
file: File
): Promise<{ url: string; path: string }> {
const supabase = createServerSupabaseClient();
// 1. Generate user-scoped path
const timestamp = Date.now();
const fileExt = file.name.split('.').pop();
const fileName = `${timestamp}.${fileExt}`;
const storagePath = `${userId}/${projectId}/images/${fileName}`;
// 2. Upload to Supabase Storage
const { data: uploadData, error: uploadError } = await supabase.storage
.from('assets')
.upload(storagePath, file, {
contentType: file.type,
upsert: false,
});
if (uploadError) {
trackError(uploadError, {
category: ErrorCategory.STORAGE,
context: { userId, projectId, fileName },
});
throw new Error(`Upload failed: ${uploadError.message}`);
}
// 3. Create database record
const { data: asset, error: dbError } = await supabase
.from('assets')
.insert({
user_id: userId,
project_id: projectId,
storage_url: `supabase://assets/${storagePath}`,
type: 'image',
})
.select()
.single();
// 4. CRITICAL: Clean up storage if database insert fails
if (dbError) {
await supabase.storage.from('assets').remove([storagePath]);
trackError(dbError, {
category: ErrorCategory.DATABASE,
context: { userId, projectId, storagePath },
});
throw new Error(`Database insert failed: ${dbError.message}`);
}
return { url: asset.storage_url, path: storagePath };
}
Temporary file handling:
// ✅ ALWAYS use os.tmpdir() with cleanup
import os from 'os';
import fs from 'fs/promises';
import path from 'path';
async function processVideo(videoUrl: string): Promise<Result> {
const tempDir = os.tmpdir();
const tempFilePath = path.join(tempDir, `video-${Date.now()}.mp4`);
try {
// 1. Download to temp file
const response = await fetch(videoUrl);
const buffer = await response.arrayBuffer();
await fs.writeFile(tempFilePath, Buffer.from(buffer));
// 2. Process the file
const result = await processFile(tempFilePath);
return result;
} finally {
// 3. CRITICAL: Always clean up in finally block
try {
await fs.unlink(tempFilePath);
serverLogger.info({ tempFilePath }, 'Cleaned up temporary file');
} catch (cleanupError) {
serverLogger.error({ cleanupError, tempFilePath }, 'Failed to clean up temp file');
}
}
}
GCS usage (only for AI processing):
// ✅ ONLY use GCS when required by Google Vertex AI
import { Storage } from '@google-cloud/storage';
async function uploadForVertexAI(
userId: string,
projectId: string,
assetId: string,
videoBuffer: Buffer
): Promise<string> {
const storage = new Storage({ credentials });
const bucket = storage.bucket(process.env.GCS_BUCKET_NAME!);
// 1. User-scoped path with metadata
const timestamp = Date.now();
const gcsFilePath = `video-analysis/${userId}/${projectId}/${assetId}-${timestamp}.mp4`;
const gcsFile = bucket.file(gcsFilePath);
// 2. Upload with user metadata
await gcsFile.save(videoBuffer, {
metadata: {
contentType: 'video/mp4',
metadata: {
assetId,
projectId,
userId, // CRITICAL: Track ownership
uploadedAt: new Date().toISOString(),
},
},
});
// 3. CRITICAL: Clean up after processing
try {
const result = await processWithVertexAI(`gs://${bucket.name}/${gcsFilePath}`);
return result;
} finally {
// Always delete GCS file after processing
try {
await gcsFile.delete();
serverLogger.info({ gcsFilePath }, 'Cleaned up GCS file');
} catch (deleteError) {
serverLogger.error({ deleteError, gcsFilePath }, 'Failed to clean up GCS file');
}
}
}
localStorage validation (limited use):
// ✅ ONLY use localStorage for non-persistent UI state (theme, hints, discovered eggs)
// ALWAYS validate and handle corruption
function loadDiscoveredEggs(): Set<string> {
if (typeof window === 'undefined') return new Set();
const stored = localStorage.getItem('discoveredEasterEggs');
if (!stored) return new Set();
try {
const eggs = JSON.parse(stored);
if (!Array.isArray(eggs)) {
// Clean up invalid data
localStorage.removeItem('discoveredEasterEggs');
return new Set();
}
return new Set(eggs);
} catch (error) {
trackError(error, {
category: ErrorCategory.CLIENT,
context: { operation: 'loadDiscoveredEggs' },
});
localStorage.removeItem('discoveredEasterEggs');
return new Set();
}
}
function loadTimestamp(key: string): number | null {
const stored = localStorage.getItem(key);
if (!stored) return null;
const parsed = parseInt(stored, 10);
if (isNaN(parsed)) {
// CRITICAL: Clean up corrupt timestamps
localStorage.removeItem(key);
return null;
}
return parsed;
}
Storage patterns checklist:
- User-scoped paths: {user_id}/{project_id}/{type}/{filename}
- Storage upload BEFORE database insert
- Cleanup storage on database insert failure (in catch block)
- Temporary files use os.tmpdir() with finally block cleanup
- GCS only for AI processing, with cleanup after
- GCS paths include userId for audit trail
- NO localStorage for persistent data (use Supabase Database)
- localStorage validation: try/catch + NaN checks + cleanup on corruption
- RLS policies enforce (storage.foldername(name))[1] = auth.uid()::text
- Signed URLs for retrieval (1-hour expiration)
Reference: See /docs/STORAGE_GUIDE.md for complete patterns and examples.
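The signed-URL item on the checklist is not shown above. A minimal sketch using supabase-js follows; the bucket name and 1-hour expiry mirror the upload example, and error handling is simplified:

```typescript
// ✅ Retrieve private assets via short-lived signed URLs, never public URLs
import { createServerSupabaseClient } from '@/lib/supabase';

async function getSignedAssetUrl(storagePath: string): Promise<string> {
  const supabase = createServerSupabaseClient();

  const { data, error } = await supabase.storage
    .from('assets')
    .createSignedUrl(storagePath, 3600); // 1-hour expiration, per the checklist

  if (error || !data?.signedUrl) {
    throw new Error(`Failed to create signed URL: ${error?.message ?? 'unknown error'}`);
  }
  return data.signedUrl;
}
```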
7. Testing - Write Clear, Maintainable Tests
When writing tests, ALWAYS use this structure:
Test template (AAA pattern):
// __tests__/services/myService.test.ts
import { createMockSupabaseClient, mockAuthenticatedUser } from '@/test-utils';
import { MyService } from '@/lib/services/myService';
describe('MyService', () => {
let service: MyService;
let mockSupabase: ReturnType<typeof createMockSupabaseClient>;
// ✅ ALWAYS reset mocks before each test
beforeEach(() => {
mockSupabase = createMockSupabaseClient();
service = new MyService(mockSupabase);
});
afterEach(() => {
jest.clearAllMocks();
});
// ✅ ALWAYS write descriptive test names
it('creates thing with valid data', async () => {
// ARRANGE - Set up test data and mocks
const userId = 'user-123';
const data = { name: 'Test Thing' };
mockSupabase.mockResolvedValue({
data: { id: 'thing-123', name: 'Test Thing' },
error: null,
});
// ACT - Perform the action
const result = await service.createThing(userId, data);
// ASSERT - Verify the outcome
expect(result).toEqual({ id: 'thing-123', name: 'Test Thing' });
expect(mockSupabase.from).toHaveBeenCalledWith('things');
});
});
Component test template:
// __tests__/components/MyComponent.test.tsx
import { render, screen, waitFor } from '@/test-utils';
import userEvent from '@testing-library/user-event';
import { MyComponent } from '@/components/MyComponent';
describe('MyComponent', () => {
it('submits form when user clicks submit', async () => {
// ARRANGE
const user = userEvent.setup();
const onSubmit = jest.fn();
render(<MyComponent onSubmit={onSubmit} />);
// ACT
await user.type(screen.getByLabelText('Name'), 'John');
await user.click(screen.getByRole('button', { name: 'Submit' }));
// ASSERT
await waitFor(() => {
expect(onSubmit).toHaveBeenCalledWith({ name: 'John' });
});
});
});
API route test template:
// __tests__/api/my-route/route.test.ts
import { NextRequest } from 'next/server';
import { POST } from '@/app/api/my-route/route';
import { createMockSupabaseClient, mockAuthenticatedUser } from '@/test-utils';
import { mockWithAuth } from '@/test-utils/mockWithAuth';
jest.mock('@/lib/api/withAuth', () => ({ withAuth: mockWithAuth }));
jest.mock('@/lib/supabase', () => ({
createServerSupabaseClient: jest.fn(),
}));
describe('POST /api/my-route', () => {
let mockSupabase: ReturnType<typeof createMockSupabaseClient>;
beforeEach(() => {
mockSupabase = createMockSupabaseClient();
require('@/lib/supabase').createServerSupabaseClient.mockResolvedValue(mockSupabase);
});
it('returns 200 with valid data', async () => {
// ARRANGE
mockAuthenticatedUser(mockSupabase);
mockSupabase.mockResolvedValue({ data: { id: '123' }, error: null });
// ACT
const request = new NextRequest('http://localhost/api/my-route', {
method: 'POST',
body: JSON.stringify({ name: 'Test' }),
});
const response = await POST(request);
// ASSERT
expect(response.status).toBe(200);
const data = await response.json();
expect(data.id).toBe('123');
});
});
8. Style & Naming - Consistent Code Style
Naming conventions to apply:
// ✅ Variables and functions: camelCase
const userName = 'John';
function calculateTotal() {}
// ✅ Components and types: PascalCase
interface UserProfile {}
function UserCard() {}
// ✅ Constants: SCREAMING_SNAKE_CASE
const MAX_RETRIES = 3;
const API_TIMEOUT_MS = 5000;
// ✅ Booleans: is/has/should prefix
const isLoading = true;
const hasError = false;
const shouldRetry = true;
Import organization:
// ✅ ALWAYS organize imports in this order:
// 1. React/Next.js
import { NextRequest } from 'next/server';
import { useState, useEffect } from 'react';
// 2. Third-party libraries
import { create } from 'zustand';
// 3. Absolute imports (@/)
import { withAuth } from '@/lib/api/withAuth';
import { Button } from '@/components/ui/Button';
// 4. Relative imports
import { helper } from './helper';
// 5. Type imports (separate)
import type { User } from '@/types/api';
How to Use This Skill
This skill automatically applies when you're building. Just describe what you want to build:
Example 1: Building an API Route
You say:
"Create an API route for updating user profiles"
This skill ensures:
- Uses withAuth middleware
- Includes input validation
- Applies appropriate rate limiting
- Uses service layer for business logic
- Follows error handling patterns
- Includes proper TypeScript types
Example 2: Creating a Component
You say:
"Build a reusable Modal component"
This skill ensures:
- Uses forwardRef pattern
- Props interface with proper extends
- Correct hook ordering
- Type-safe implementation
- Accessible ARIA attributes
Example 3: Writing Tests
You say:
"Write tests for the ProfileService"
This skill ensures:
- AAA pattern (Arrange-Act-Assert)
- Descriptive test names
- Proper mock setup/teardown
- Uses project test utilities
- Async handling with waitFor
Example 4: Integrating with an API
You say:
"Add Stripe payment processing to checkout"
The skill FIRST does:
- Fetches Stripe API docs using Firecrawl:
mcp__firecrawl__firecrawl_scrape({
url: 'https://stripe.com/docs/api/checkout/sessions',
formats: ['markdown'],
});
- Checks local Stripe docs:
Glob: "docs/**/*stripe*.md"
Grep: "stripe" in docs/
Read: any found documentation files
- Cross-references patterns to determine correct implementation
- Then builds following verified patterns from official + local docs
This skill ensures:
- Latest API patterns from official docs
- Consistency with existing project integrations
- Proper authentication method
- Correct error handling for that API
- Appropriate rate limiting
- Version compatibility
What Makes This Different
This skill is PRESCRIPTIVE with VALIDATION:
- ✅ Guides you WHILE coding (prevents mistakes)
- ✅ Applies patterns to NEW code automatically
- ✅ Keeps you on track as you build
- ✅ Spawns validation subagent after completion
- ✅ Verifies quality, best practices, and API compliance
Two-Phase Approach:
- Development Phase: Guides with templates and patterns
- Validation Phase: Subagent checks quality automatically
This is NOT a full codebase audit tool:
- ❌ Doesn't scan the entire codebase
- ❌ Doesn't generate historical reports
- ✅ DOES validate the specific feature you just built
Automatic Quality Validation
CRITICAL: After completing any code implementation, ALWAYS spawn a validation subagent using the Task tool.
Validation Process
After writing code, spawn a quality check subagent:
Task({
subagent_type: 'general-purpose',
description: 'Quality validation: [feature name]',
prompt: `
Validate the newly created [feature] for quality and standards compliance.
Files to check: [list files]
Perform these checks:
1. **Code Quality**:
- TypeScript: Branded types, no 'any', explicit return types
- Error handling: try/catch, trackError, user-friendly messages
- Functions: Focused, single responsibility, proper naming
2. **Architecture Patterns**:
- API routes: withAuth + rate limiting + validation
- Services: Dependency injection, error tracking, caching
- Components: forwardRef, hook order, accessibility
- Generation: Supabase queues (if applicable)
3. **Best Practices**:
- Import organization
- Naming conventions
- Testing patterns (AAA, descriptive names)
4. **API Integration** (if external API used):
- Use Firecrawl to fetch official docs: [API docs URL]
- Check local docs: Grep "[api-name]" in docs/
- Compare our patterns against official
- Verify authentication, error handling, rate limiting
Report findings as:
- ✅ Compliant: Correctly followed
- ⚠️ Warning: Minor issue, suggest fix
- ❌ Critical: Must fix before deployment
`,
});
The subagent will:
- Read the files you created
- Check against all project standards
- Fetch official API docs if APIs were used
- Compare patterns against documentation
- Report issues with severity and fixes
You must then:
- Fix ❌ critical issues immediately
- Address ⚠️ warnings if time permits
- Document any intentional deviations
Validation Examples
Example 1: API Route
After creating an API route, validate it:
Task({
subagent_type: 'general-purpose',
description: 'Validate projects API route',
prompt: `
Validate: app/api/projects/route.ts
Check:
- withAuth middleware present
- Rate limiting (tier2 for creation)
- Input validation (validateAll)
- Uses ProjectService (not inline logic)
- Error/success response helpers
- TypeScript return types
- trackError for errors
API docs check:
- Fetch Supabase docs with Firecrawl
- Compare our Supabase usage against official
- Verify error handling patterns
Report compliance and issues.
`,
});
Example 2: Generation Feature
After implementing video generation:
Task({
subagent_type: 'general-purpose',
description: 'Validate video generation feature',
prompt: `
Validate video generation across:
- app/api/video-gen/route.ts
- lib/services/videoService.ts
CRITICAL checks:
1. Uses Supabase queue (generation_queue table)
2. Returns job ID immediately (doesn't wait)
3. Has separate status polling endpoint
API integration:
- Fetch official docs: https://runwayml.com/docs/api
- Compare our integration vs official
- Verify auth method matches
- Check error handling conventions
Report all deviations.
`,
});
Example 3: React Component
After creating a component:
Task({
subagent_type: 'general-purpose',
description: 'Validate VideoPlayer component',
prompt: `
Validate: components/VideoPlayer.tsx
Check:
- forwardRef pattern
- Props extend HTML attributes
- Hook order: context → state → refs → effects → custom
- Accessibility: keyboard, ARIA, focus, contrast
- Error handling
- TypeScript types
Report compliance.
`,
});
Why Automatic Validation Matters
- ✅ Catches anti-patterns before code review
- ✅ Verifies API integrations match official docs
- ✅ Ensures security patterns applied
- ✅ Maintains consistent quality
- ✅ Documents decisions and deviations
- ✅ Reduces bug risk
API Integration Requirements
CRITICAL: When building features that connect to ANY API (Supabase, third-party services, external APIs), you MUST verify patterns before implementing.
Workflow: Building API Integrations
Step 1: Identify the API
Determine which external service is being integrated:
- Supabase (database operations)
- Stripe (payments)
- Google Cloud (AI, storage, etc.)
- OpenAI, Anthropic (AI models)
- Third-party APIs
Step 2: Fetch Official Documentation
Use Firecrawl to get the latest API documentation:
// ALWAYS do this FIRST when building API features
mcp__firecrawl__firecrawl_scrape({
url: '[official-api-docs-url]',
formats: ['markdown'],
onlyMainContent: true,
});
What to look for in official docs:
- Authentication methods (API keys, OAuth, tokens)
- Request/response format and types
- Error handling patterns
- Rate limiting requirements
- SDK usage examples
- Latest API version and changes
- Best practices and warnings
Example URLs to fetch:
- Supabase JavaScript SDK: https://supabase.com/docs/reference/javascript/[method]
- Stripe API: https://stripe.com/docs/api/[resource]
- Google Cloud: https://cloud.google.com/[service]/docs
- OpenAI: https://platform.openai.com/docs/api-reference
- Anthropic: https://docs.anthropic.com/en/api
Step 3: Check Local Documentation
Search for existing project-specific patterns:
# Search for API-specific documentation
Glob: "docs/**/*[api-name]*.md"
Glob: "docs/integrations/**/*.md"
# Search for existing usage in code
Grep: "[api-name]" in lib/services/
Grep: "import.*[api-name]" in lib/
# Check API guide
Read: docs/api/API_GUIDE.md
Read: docs/api/API_REFERENCE.md
What to look for in local docs:
- Existing integration patterns
- Project-specific wrappers or utilities
- Environment variable naming
- Error handling conventions
- Caching strategies
- Rate limiting implementations
- Known issues or workarounds
Step 4: Cross-Reference and Reconcile
Compare official docs with local patterns:
- Official docs are the source of truth for API usage
- Local docs override for project-specific patterns
- Document deviations if local pattern differs from official
- Update local docs if official API has changed
Decision matrix:
- Official doc pattern exists + Local doc exists → Use local pattern (assumes project-specific customization)
- Official doc pattern exists + No local doc → Use official pattern and create local doc
- Official pattern changed + Local pattern outdated → Use new official pattern and update local doc
- No clear pattern in either → Use official doc pattern and document as new local pattern
Step 5: Implementation Checklist
Before writing code, verify:
- Fetched official API docs with Firecrawl
- Checked for local API documentation
- Reviewed existing code using this API (Grep in lib/)
- Verified authentication method
- Confirmed error handling approach
- Checked rate limiting requirements
- Reviewed API version compatibility
- Identified environment variables needed
- Documented any deviations from official patterns
Step 6: Document New Integrations
If this is a new API integration:
- Create integration doc at docs/integrations/[api-name].md
- Document authentication method used
- Document error patterns specific to this API
- Document rate limiting strategy
- Add environment variables to .env.example
- Add to API_REFERENCE.md if it's a project API endpoint
Common API Integration Patterns
Supabase Database Operations:
// ALWAYS check: https://supabase.com/docs/reference/javascript/[method]
const { data, error } = await supabase.from('table').select('*');
if (error) {
trackError(error, {
category: ErrorCategory.DATABASE,
context: { operation: 'select', table: 'table' },
});
}
Stripe Payment Processing:
// ALWAYS check: https://stripe.com/docs/api/checkout/sessions
const session = await stripe.checkout.sessions.create({
// Configuration from official docs
});
Third-Party API Calls:
// Check official docs for authentication and request format
const response = await fetch('https://api.service.com/endpoint', {
headers: {
Authorization: `Bearer ${process.env.SERVICE_API_KEY}`,
'Content-Type': 'application/json',
},
});
When to Skip API Verification
You can skip the full verification workflow for:
- Internal project APIs (your own /api/* routes)
- Well-established patterns already used 10+ times in the codebase
- Minor variations of existing integrations (same API, different endpoint)
But you should ALWAYS verify for:
- New API integrations (first time using this service)
- Major API version updates (v1 → v2, breaking changes)
- Changed authentication methods
- Deprecated endpoints or methods
Key Principles
When building new features, always remember:
- Type Safety First: Use branded types for IDs, never any, always specify return types
- Security by Default: Every protected route needs withAuth + validation + rate limiting
- Asynchronous Generation: ALL generation operations (images, video, upscaling, audio) MUST use Supabase queues for persistence
- Separate Concerns: Business logic goes in services, not API routes
- Test What You Build: Write tests as you code using AAA pattern
- Follow Conventions: Naming, imports, and style must be consistent
- Accessibility Matters: All components must meet WCAG 2.1 AA standards (keyboard nav, ARIA, alt text, focus, contrast)
- Keep Code Small and Refactorable: Keep functions focused, components small, and logic extracted
- Verify API Patterns: Always check official docs (Firecrawl) and local docs before implementing API integrations
Quick Reference
When creating an API route:
withAuth(handler, { route, rateLimit });
validateAll([validateString(), validateUUID()]);
new MyService(supabase);
When creating a component:
React.forwardRef<ElementType, PropsInterface>()
interface Props extends React.HTMLAttributes
Hook order: context → state → refs → effects → custom
Accessibility: keyboard nav, ARIA, alt text, focus, contrast (WCAG AA)
When writing a test:
describe() → beforeEach() → afterEach()
it('descriptive name', async () => {
// ARRANGE, ACT, ASSERT
})
References
- /docs/CODING_BEST_PRACTICES.md - Complete patterns and examples
- /docs/TESTING_GUIDE.md - Testing standards and utilities
- /docs/STYLE_GUIDE.md - Formatting and naming rules
- /CLAUDE.md - Documentation and workflow guidelines