Claude Code Plugins

Community-maintained marketplace

training-metrics-designer

@bailejl/AI-Enablement-Resources

This skill should be used when designing measurement plans and evaluation strategies for training programs. Use this skill to create metrics frameworks, design assessments, build ROI calculations, and establish baseline and outcome measurements.

Install Skill

  1. Download the skill.
  2. Enable skills in Claude: open claude.ai/settings/capabilities and find the "Skills" section.
  3. Upload to Claude: click "Upload skill" and select the downloaded ZIP file.

Note: Please verify the skill by reviewing its instructions before using it.

SKILL.md

name: training-metrics-designer
description: This skill should be used when designing measurement plans and evaluation strategies for training programs. Use this skill to create metrics frameworks, design assessments, build ROI calculations, and establish baseline and outcome measurements.

Training Metrics Designer

Overview

The Training Metrics Designer skill helps establish comprehensive measurement strategies for training programs. It designs evaluation frameworks based on the four Kirkpatrick levels, defines behavior change metrics, builds ROI calculations, and enables data-driven improvement.

When to Use This Skill

  • Designing pre/post assessments for training
  • Creating behavior change metrics and tracking
  • Building ROI/business impact calculations
  • Establishing baseline and outcome measurements
  • Designing feedback collection for pilot iterations
  • Creating learning journey metrics that extend beyond the workshop
  • Planning long-term sustained measurement

Kirkpatrick Four Levels

Level 1 - Reaction: Did learners like it?

  • Surveys immediately after training
  • Satisfaction and relevance

Level 2 - Learning: Did learners learn?

  • Knowledge/skill assessments
  • Pre/post testing

Level 3 - Behavior: Did learners change behavior?

  • On-the-job application tracking
  • Manager observation
  • Peer feedback

Level 4 - Results: Did business metrics improve?

  • Productivity, quality, speed
  • Error reduction
  • Customer satisfaction
  • Revenue impact
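
To make the four levels concrete, here is a minimal Python sketch of a level-by-level measurement plan. The workshop, metric names, and instruments are hypothetical and only illustrate the shape such a plan might take; they are not part of the skill itself.

```python
from dataclasses import dataclass


@dataclass
class LevelPlan:
    """Measurement plan for one Kirkpatrick level."""
    level: int
    question: str           # what this level answers
    instruments: list[str]  # how data is collected
    metrics: list[str]      # what gets reported


# Hypothetical plan for an illustrative "Prompt Engineering Fundamentals" workshop.
measurement_plan = [
    LevelPlan(1, "Reaction: did learners find it useful and relevant?",
              ["post-session survey"], ["avg. satisfaction", "avg. relevance"]),
    LevelPlan(2, "Learning: did knowledge and skills improve?",
              ["pre-test", "post-test"], ["score gain", "pass rate"]),
    LevelPlan(3, "Behavior: are the new skills applied on the job?",
              ["manager observation", "peer feedback"],
              ["% applying skills weekly", "behavior checklist score"]),
    LevelPlan(4, "Results: did business metrics improve?",
              ["operational dashboards"], ["error rate", "cycle time"]),
]


def level2_gain(pre_score: float, post_score: float) -> float:
    """Simple Level 2 metric: percentage-point gain from pre- to post-assessment."""
    return post_score - pre_score
```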

Metrics Design Framework

For each training intervention:

  1. Business Outcome: What should improve? (specific, measurable)
  2. Behavior Metrics: What behaviors enable that outcome?
  3. Application Metrics: Who's using new skills? How often?
  4. Baseline: Current state before training
  5. Target: Desired outcome after training
  6. Tracking: How/when/who measures?
  7. ROI: Calculate training investment vs. improvement value (a worked calculation sketch follows this list)
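
As a sketch of step 7, the snippet below applies the standard training ROI formula, ROI (%) = (benefit - cost) / cost * 100. The cost and benefit figures are hypothetical placeholders; in practice they would come from the baseline, target, and tracking steps above.

```python
def training_roi(total_cost: float, total_benefit: float) -> float:
    """Return training ROI as a percentage: (benefit - cost) / cost * 100."""
    if total_cost <= 0:
        raise ValueError("total_cost must be positive")
    return (total_benefit - total_cost) / total_cost * 100.0


# Hypothetical figures for illustration only.
cost = 20_000.0     # e.g. facilitation, materials, and learner time for one cohort
benefit = 45_000.0  # e.g. estimated annual value of the measured improvement over baseline
print(f"ROI: {training_roi(cost, benefit):.0f}%")  # prints "ROI: 125%"
```

An ROI above 0% means the measured benefit exceeds the training investment; the harder part in practice is attributing the benefit to the training, which is why the baseline and tracking steps come first.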

Resources

Reference templates for:

  • Kirkpatrick assessment design
  • Baseline/outcome measurement plans
  • Feedback survey templates
  • ROI calculation frameworks
  • Behavior tracking tools
  • Business impact assessment

Integration with Other Skills

  • training-designer: Use alongside this skill to design sessions
  • training-reviewer: Validate training materials
  • training-content-creator: Generate the actual training content
  • learning-journey-builder: Use alongside the skills above to build complete programs

Best Practices

Do:

  • Focus on learner outcomes and business impact
  • Use real work examples and scenarios
  • Design for application, not knowledge recall
  • Measure what matters

Don't:

  • Train what doesn't drive behavior change
  • Favor simulated practice over real work
  • Over-design solutions for simple problems
  • Ignore the business context