Claude Code Plugins

Community-maintained marketplace


linear-bug-summary

@harshfolio/claude-orbit

Track bug metrics for a period - new bugs, bugs resolved, open bugs, bug trends, resolution rates.

Install Skill

1. Download skill

2. Enable skills in Claude

Open claude.ai/settings/capabilities and find the "Skills" section

3. Upload to Claude

Click "Upload skill" and select the downloaded ZIP file

Note: Please verify the skill by reviewing its instructions before using it.

SKILL.md

name: linear-bug-summary
description: Track bug metrics for a period - new bugs, bugs resolved, open bugs, bug trends, resolution rates.

Linear Bug Summary

Usage

linear-bug-summary for last 7 days
linear-bug-summary for last 30 days

Process

1. Load Context

board = read('../data/board.json')

2. Parse Period & Query

Calculate start/end dates from user request.
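A minimal period parser for requests like "last 7 days" could look like the sketch below. The regex and the 7-day default are assumptions for illustration, not part of the skill spec:

```javascript
// Sketch: derive start/end dates from a "last N days" request.
// Defaulting to 7 days when no period is given is an assumption.
function parsePeriod(request, now = new Date()) {
  const match = request.match(/last\s+(\d+)\s+days?/i);
  const days = match ? parseInt(match[1], 10) : 7; // default: last 7 days
  const start = new Date(now.getTime() - days * 24 * 60 * 60 * 1000);
  return { start, end: now, days };
}
```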

PAGINATION CRITICAL: Linear MCP returns max 250 issues per call. For each mcp__linear__list_issues() query:

  • Always check pageInfo.hasNextPage after every call
  • If true, capture pageInfo.endCursor and call again with after: cursor
  • Continue sequentially until hasNextPage is false
  • Merge results by issue ID to avoid duplicates
  • Only proceed to analysis once ALL pages are fetched
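The pagination loop above can be sketched as follows. Here `listIssues` is a hypothetical stand-in for the mcp__linear__list_issues tool, assumed to return `{ nodes, pageInfo }`:

```javascript
// Pagination sketch: fetch every page, merging results by issue ID.
function fetchAllIssues(listIssues, params) {
  const byId = new Map();                    // merge by issue ID to avoid duplicates
  let cursor;
  while (true) {
    const page = listIssues(cursor ? { ...params, after: cursor } : params);
    for (const issue of page.nodes) byId.set(issue.id, issue);
    if (!page.pageInfo.hasNextPage) break;   // stop once the last page is reached
    cursor = page.pageInfo.endCursor;        // otherwise continue from the cursor
  }
  return [...byId.values()];                 // analyze only after ALL pages are in
}
```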

RATE LIMITING: The Linear API may return "Too many subrequests" errors under high query volume:

  • Request only the fields you actually need for analysis; avoid deeply nested queries
  • Process sequentially: execute queries one at a time, don't parallelize
  • Reduce page size if errors persist (e.g., 100 instead of 250)
  • If errors continue, simplify queries further or split them into smaller ones

// New bugs created
newBugs = mcp__linear__list_issues({
  label: "Bug",
  createdAt: startDate
})

// Bugs resolved (updatedAt serves as a proxy for resolution date)
resolvedBugs = mcp__linear__list_issues({
  label: "Bug",
  updatedAt: startDate,
  state: "Done" OR "Released"  // pseudocode: query each state separately if OR isn't supported
})

// All open bugs
openBugs = mcp__linear__list_issues({
  label: "Bug",
  includeArchived: false
})

Filter open bugs to exclude Done/Released.
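That filter can be sketched like this. The `state` shape (a plain string on each issue) is an assumption; adapt it to whatever mcp__linear__list_issues actually returns:

```javascript
// Sketch: drop resolved bugs so only truly open ones remain.
function filterOpen(bugs) {
  const resolvedStates = new Set(["Done", "Released"]);
  return bugs.filter(bug => !resolvedStates.has(bug.state));
}
```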

3. Resolve Unknowns

If assignee/creator not in board.json:

  • Use resolveUser() helper
  • See ../utils/sync-helpers.md

4. Categorize

Group bugs by:

  • Team: See ../../docs/concepts/team-structure.md (check subteam labels first)
  • Platform: From platform labels (Web/Desktop, Mobile/Android, etc.)
  • Feature: From feature labels (Checkout, PDP, Login, etc.)
  • Severity: From priority field (0=None, 1=Urgent, 2=High, 3=Medium, 4=Low)
  • Status: Current workflow state
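The grouping step can be sketched with a generic helper, reusable for team, platform, feature, and severity alike. SEVERITY mirrors the priority mapping above:

```javascript
// Severity labels from Linear's numeric priority field (0=None ... 4=Low).
const SEVERITY = { 0: "None", 1: "Urgent", 2: "High", 3: "Medium", 4: "Low" };

// Generic grouping helper: bucket issues by any key function.
function groupBy(issues, keyFn) {
  const groups = {};
  for (const issue of issues) {
    const key = keyFn(issue);
    (groups[key] ??= []).push(issue);  // create the bucket on first sight
  }
  return groups;
}
```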

5. Calculate Metrics

  • New bugs in period
  • Bugs resolved in period
  • Resolution rate % (resolved / created)
  • Open bugs by status
  • Average resolution time
  • Critical bugs open
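Two of these metrics can be sketched as below. `completedAt` is a hypothetical field name; the real resolution timestamp depends on what the Linear MCP returns:

```javascript
// Resolution rate: resolved / created, as a whole percentage.
function resolutionRate(created, resolved) {
  return created === 0 ? 0 : Math.round((resolved / created) * 100);
}

// Average resolution time in days, rounded to one decimal place.
function avgResolutionDays(resolvedBugs) {
  if (resolvedBugs.length === 0) return 0;
  const totalDays = resolvedBugs.reduce((sum, bug) =>
    sum + (new Date(bug.completedAt) - new Date(bug.createdAt)) / 86_400_000, 0);
  return +(totalDays / resolvedBugs.length).toFixed(1);
}
```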

6. Generate Report

Use this exact structure for bug summary reports:

# Bug Summary Report

**Period:** [start date] to [end date] ([X] days)
**Generated:** [timestamp]

## Executive Summary

| Metric | Value | Trend |
|--------|-------|-------|
| New Bugs Created | X | [↑/↓ Y% vs last period] |
| Bugs Resolved | X | [↑/↓ Y% vs last period] |
| Resolution Rate | Y% | [Target: >80%] |
| Currently Open | X | [↑/↓ Y from last period] |
| Critical Open | X | [Priority: Urgent + High] |

**Health Score:** [🟢 Green / 🟡 Yellow / 🔴 Red] - [Brief explanation]
**Avg Resolution Time:** [X] days (Target: <7 days)
**Bug Creation Trend:** [X] bugs/week, [↑/↓ Y% from previous period]

## New Bugs Created (X)

### By Team
| Team | Count | % |
|------|-------|---|
| Backend Team | X | Y% |
| Mobile Team | X | Y% |
| WebDev Team | X | Y% |
| DevOps & Infra | X | Y% |
| [Other teams] | X | Y% |

### By Platform
| Platform | Count | Critical |
|----------|-------|----------|
| Mobile/Android | X | X |
| Web/Desktop | X | X |
| Unspecified ⚠️ | X | X |

### By Feature
| Feature | Count | Critical |
|---------|-------|----------|
| Checkout | X | X |
| [Others] | X | X |

### By Severity
| Priority | Count | Avg Age |
|----------|-------|---------|
| 🔴 Urgent | X | Y days |
| 🟠 High | X | Y days |
| ⚪ None ⚠️ | X | Y days |

### Top Bug Reporters
| Reporter | Bugs Raised | Notes |
|----------|-------------|-------|
| Arti (QA) | X | ✅ Expected - QA role |
| Anurag (QA) | X | ✅ Expected - QA role |
| [Others] | X | |

## Bugs Resolved (X)

### By Team
| Team | Count | Avg Time |
|------|-------|----------|
| Engineering | X | Y days |

### Top Fixers
| Developer | Fixed | Avg Time |
|-----------|-------|----------|
| [name] | X | Y days |

## Open Bugs (X)

### By Status
| Status | Count | Oldest |
|--------|-------|--------|
| Backlog | X | Y days |
| In Progress | X | Y days |

### Critical Open
**Urgent (X):**
- TICKET-ID: Title - [assignee] - [X]d old

## Insights

**Key Findings:**
1. [Bug concentration observation]
2. [Resolution performance]
3. [Critical bug handling]

**Recommendations:**
1. [Immediate action]
2. [Process improvement]

7. Save & Cleanup

If unknown_cache was modified, call saveUnknownCache()

Offer to save report (user specifies location).


Rules

  • Skip AI & Research entirely: Excluded from all reports
  • QA creating bugs is EXPECTED: Don't flag QA (Anurag, Arti) for raising many bugs - that's their job
  • Show "Top Bug Reporters" with QA marked as "✅ Expected - QA role"
  • Use "Bug" label as primary filter
  • Track both created and resolved for health picture
  • Categorize by team, platform, feature, severity
  • Show ticket ID + TITLE not just ID
  • Flag quality issues (missing labels/priority)
  • Location agnostic - user chooses save path