| name | infra-engineer |
| model | claude-haiku-4-5 |
| description | Generate Terraform infrastructure as code - read design documents and implement them as Terraform configurations including resources, variables, outputs, and provider configurations. Creates modular, maintainable Terraform code following best practices with proper resource naming, tagging, and organization. |
| tools | Read, Write, Bash |
Infrastructure Engineer Skill
IMPORTANT: Code Quality
- Generate valid HCL syntax
- Include helpful comments
- Organize code logically (resources, variables, outputs)
- Use terraform fmt standards
The skill accepts instructions in several forms:
- Design file reference: "user-uploads.md" or ".fractary/plugins/faber-cloud/designs/api-backend.md"
- FABER spec reference: ".faber/specs/123-add-uploads.md"
- Direct instructions: "Fix IAM permissions - Lambda needs s3:PutObject on uploads bucket"
- Mixed context: "Implement design from api-backend.md and add CloudWatch alarms"
- No arguments: Will look for the most recent design document
The skill intelligently parses the input to determine:
- If a file is referenced, read and use it as the source
- If direct instructions are provided, use them as implementation guidance
- If no input is provided, find and use the latest design document
Additionally receives:
- config: Configuration from config-loader.sh
- retry_context: If this is a retry from evaluate phase
EXECUTE STEPS:
This workflow uses the 3-layer architecture with deterministic operations in shell scripts.
Workflow documentation files (in workflow/ directory) provide detailed implementation guidance:
- workflow/parse-input.md - Documents input parsing patterns and security
- workflow/load-context.md - Documents context loading and requirements extraction
- workflow/generate-terraform.md - Documents Terraform generation patterns and templates
- workflow/validate-code.md - Documents validation procedures
Actual execution uses shell scripts via Bash tool:
Parse Input (via parse-input.sh script)
- Invoke: ./scripts/parse-input.sh "$INSTRUCTIONS"
- Script handles:
- Pattern matching with priority order
- File path extraction and sanitization
- Security validation (path traversal prevention)
- Additional context extraction
- Output: JSON with source_type, file_path, additional_context
- Display: "✓ Source determined: {source_type}"
Load Context (via load-context.sh script)
- Invoke: ./scripts/load-context.sh "$PARSE_RESULT"
- Script handles:
- File loading and validation
- Empty file detection
- Basic requirement extraction (with documented limitations)
- Configuration loading
- Mode determination (create vs update)
- Output: JSON with source_content, requirements, config
- Display: "✓ Context loaded from {source}"
Generate Terraform Code (LLM-based - stays in context)
- Read workflow/generate-terraform.md for detailed patterns
- Generate Terraform resource blocks based on requirements
- Merge additional_context with base requirements
- Create variable definitions
- Define outputs
- Add provider configuration
- Apply naming patterns from config
- Add standard tags
- Implement security best practices
- Write files to infrastructure/terraform/
- Output: "✓ Terraform code generated"
Validate Implementation (via validate-terraform.sh script - ALWAYS)
- Invoke: ./scripts/validate-terraform.sh "./infrastructure/terraform"
- Script handles:
- terraform fmt (fix formatting)
- terraform init (with backend fallback logging)
- terraform validate (syntax/config)
- Common issue checks
- Timestamped report generation
- Output: JSON with validation results
- Display: "✓ Code validated successfully"
OUTPUT COMPLETION MESSAGE:
✅ COMPLETED: Infrastructure Engineer
Source: {source description}
Terraform Files Created:
- {terraform_directory}/main.tf
- {terraform_directory}/variables.tf
- {terraform_directory}/outputs.tf
Resources Implemented: {count}
Validation: ✅ Passed
Next Steps:
- Test: /fractary-faber-cloud:test
- Preview: /fractary-faber-cloud:deploy-plan
───────────────────────────────────────
IF FAILURE:
❌ FAILED: Infrastructure Engineer
Step: {failed step}
Error: {error message}
Resolution: {how to fix}
───────────────────────────────────────
ARCHITECTURE NOTE: This workflow follows the 3-layer architecture:
- Scripts: Deterministic operations (parse, validate) - executed outside LLM context
- LLM: Complex generation requiring AI (Terraform code generation with requirement merging)
- Workflows: Orchestration and coordination
- Workflow files: Documentation and implementation guidance (not executed directly)
This reduces context usage by ~55-60% by keeping deterministic operations in scripts.
SUCCESS CRITERIA - All must pass:
✅ 1. Input Parsing
- Instructions parsed successfully
- Source type determined
- File paths validated and sanitized
- No path traversal attempts
✅ 2. Context Loading
- Source document loaded (if file-based)
- File is not empty
- Requirements extracted
- Configuration loaded
- Additional context preserved
✅ 3. Code Generation
- All resources from requirements implemented
- Variable definitions created
- Outputs defined for important attributes
- Provider configuration included
- Additional context merged with base requirements
✅ 4. Code Quality
- Valid HCL syntax
- Terraform fmt applied (ALWAYS)
- Terraform validate passes (ALWAYS)
- Best practices followed
✅ 5. File Organization
- main.tf: Resource definitions
- variables.tf: Variable declarations
- outputs.tf: Output definitions
- {env}.tfvars: Environment-specific values (optional)
FAILURE CONDITIONS - Stop and report if:
❌ Input Parsing Failures:
- Path traversal attempt detected (security)
- Malicious file path provided
- Multiple ambiguous file matches
- Cannot determine source type
❌ Context Loading Failures:
- Source file not found
- Source file is empty
- Source file contains invalid/corrupt content
- Configuration file is corrupt
- Cannot extract requirements
❌ Code Generation Failures:
- Invalid Terraform syntax generated
- Terraform directory not accessible
- Cannot write files (permissions)
❌ Validation Failures:
- Terraform fmt fails
- Terraform validate fails
- Critical security issues detected
PARTIAL COMPLETION - Not acceptable:
⚠️ Code generated but not validated → Validate before returning (MANDATORY)
⚠️ Files created but not formatted → Run terraform fmt before returning (MANDATORY)
⚠️ Security issues found but ignored → Must address or fail
⚠️ Empty files created → Must contain valid content
Error Handling Details
Path Security Errors
Error: "Path outside allowed directory"
Action: Reject immediately, log security event, return error
User Action: Use valid path within allowed directories
File Not Found
Error: "Design file not found: /path/to/file.md"
Action: Return error with correct path format
User Action: Check filename spelling and location
Empty File
Error: "Source file is empty: /path/to/file.md"
Action: Return error, suggest checking file content
User Action: Ensure file has content
Invalid Content
Error: "Cannot extract requirements from source"
Action: Return error with file details
User Action: Check file format and content validity
Multiple Files Match
Error: "Multiple files match pattern: file1.md, file2.md"
Action: Return error listing matches
User Action: Specify exact filename or path
Validation Failure
Error: "Terraform validation failed: [specific error]"
Action: Show exact terraform error, return failure
User Action: Review error, fix requirements, retry
OUTPUTS:
1. Terraform Files
- main.tf: Resource definitions
- variables.tf: Variable declarations
- outputs.tf: Output definitions
- README.md: Usage instructions
Return to agent:
{
  "status": "success",
  "terraform_directory": "./infrastructure/terraform",
  "files_created": [
    "main.tf",
    "variables.tf",
    "outputs.tf"
  ],
  "resource_count": 5,
  "resources": [
    {"type": "aws_s3_bucket", "name": "uploads"},
    {"type": "aws_lambda_function", "name": "processor"}
  ]
}
Resource Naming:
# Use variables for dynamic names
resource "aws_s3_bucket" "uploads" {
bucket = "${var.project_name}-${var.subsystem}-${var.environment}-uploads"
tags = local.common_tags
}
Standard Variables:
variable "project_name" {
description = "Project name"
type = string
}
variable "subsystem" {
description = "Subsystem name"
type = string
}
variable "environment" {
description = "Environment (test/prod)"
type = string
}
variable "aws_region" {
description = "AWS region"
type = string
default = "us-east-1"
}
Standard Tags:
locals {
  common_tags = {
    Project     = var.project_name
    Subsystem   = var.subsystem
    Environment = var.environment
    ManagedBy   = "terraform"
    CreatedBy   = "fractary-faber-cloud"
  }
}
Outputs:
output "bucket_name" {
description = "Name of the S3 bucket"
value = aws_s3_bucket.uploads.id
}
output "bucket_arn" {
description = "ARN of the S3 bucket"
value = aws_s3_bucket.uploads.arn
}
S3 Bucket with Versioning:
resource "aws_s3_bucket" "this" {
bucket = "${var.project_name}-${var.subsystem}-${var.environment}-${var.bucket_suffix}"
tags = local.common_tags
}
resource "aws_s3_bucket_versioning" "this" {
bucket = aws_s3_bucket.this.id
versioning_configuration {
status = "Enabled"
}
}
resource "aws_s3_bucket_server_side_encryption_configuration" "this" {
bucket = aws_s3_bucket.this.id
rule {
apply_server_side_encryption_by_default {
sse_algorithm = "AES256"
}
}
}
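Where the design does not call for public access, a bucket public access block is a common hardening addition (in line with the security best practices step above). A minimal sketch, assuming the same aws_s3_bucket.this from the pattern above; whether to include it depends on the design:
resource "aws_s3_bucket_public_access_block" "this" {
  bucket = aws_s3_bucket.this.id

  # Block all public access paths unless the design explicitly requires otherwise
  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
}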
Lambda Function:
resource "aws_lambda_function" "this" {
function_name = "${var.project_name}-${var.subsystem}-${var.environment}-${var.function_name}"
role = aws_iam_role.lambda.arn
runtime = var.runtime
handler = var.handler
filename = var.deployment_package
source_code_hash = filebase64sha256(var.deployment_package)
environment {
variables = var.environment_variables
}
tags = local.common_tags
}
resource "aws_iam_role" "lambda" {
name = "${var.project_name}-${var.subsystem}-${var.environment}-lambda-role"
assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [{
Action = "sts:AssumeRole"
Effect = "Allow"
Principal = {
Service = "lambda.amazonaws.com"
}
}]
})
tags = local.common_tags
}
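The direct-instruction examples earlier mention granting the Lambda role s3:PutObject and adding CloudWatch alarms. A minimal sketch of both follows, assuming the uploads bucket from the naming example and the Lambda/IAM resources above; the policy and alarm names, the target bucket, and the threshold values are illustrative assumptions, not part of the required pattern:
resource "aws_iam_role_policy" "lambda_s3_put" {
  # Hypothetical policy; scope actions and resources to the design's requirements
  name = "${var.project_name}-${var.subsystem}-${var.environment}-lambda-s3-put"
  role = aws_iam_role.lambda.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = ["s3:PutObject"]
      Resource = "${aws_s3_bucket.uploads.arn}/*"
    }]
  })
}

resource "aws_cloudwatch_metric_alarm" "lambda_errors" {
  # Illustrative alarm; tune threshold, period, and alarm actions per design
  alarm_name          = "${var.project_name}-${var.subsystem}-${var.environment}-lambda-errors"
  namespace           = "AWS/Lambda"
  metric_name         = "Errors"
  statistic           = "Sum"
  comparison_operator = "GreaterThanThreshold"
  threshold           = 0
  evaluation_periods  = 1
  period              = 300

  dimensions = {
    FunctionName = aws_lambda_function.this.function_name
  }

  tags = local.common_tags
}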
DynamoDB Table:
resource "aws_dynamodb_table" "this" {
name = "${var.project_name}-${var.subsystem}-${var.environment}-${var.table_name}"
billing_mode = var.billing_mode
hash_key = var.hash_key
range_key = var.range_key
attribute {
name = var.hash_key
type = "S"
}
dynamic "attribute" {
for_each = var.range_key != null ? [1] : []
content {
name = var.range_key
type = "S"
}
}
server_side_encryption {
enabled = true
}
point_in_time_recovery {
enabled = true
}
tags = local.common_tags
}
API Gateway REST API:
resource "aws_api_gateway_rest_api" "this" {
name = "${var.project_name}-${var.subsystem}-${var.environment}-api"
description = var.api_description
endpoint_configuration {
types = ["REGIONAL"]
}
tags = local.common_tags
}
resource "aws_api_gateway_deployment" "this" {
rest_api_id = aws_api_gateway_rest_api.this.id
stage_name = var.environment
depends_on = [
aws_api_gateway_integration.this
]
}
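The deployment above depends on an integration resource that the pattern does not show. A minimal sketch of the missing wiring, assuming a single proxy resource backed by the Lambda function from the earlier pattern; the routing path and open authorization are assumptions to adjust per design:
resource "aws_api_gateway_resource" "this" {
  rest_api_id = aws_api_gateway_rest_api.this.id
  parent_id   = aws_api_gateway_rest_api.this.root_resource_id
  path_part   = "{proxy+}"
}

resource "aws_api_gateway_method" "this" {
  rest_api_id   = aws_api_gateway_rest_api.this.id
  resource_id   = aws_api_gateway_resource.this.id
  http_method   = "ANY"
  authorization = "NONE"
}

resource "aws_api_gateway_integration" "this" {
  # Lambda proxy integration; API Gateway always calls Lambda with POST
  rest_api_id             = aws_api_gateway_rest_api.this.id
  resource_id             = aws_api_gateway_resource.this.id
  http_method             = aws_api_gateway_method.this.http_method
  integration_http_method = "POST"
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.this.invoke_arn
}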
main.tf:
# Provider configuration
terraform {
  required_version = ">= 1.5.0"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }

  backend "s3" {
    # Backend config provided via init
  }
}

provider "aws" {
  region = var.aws_region

  default_tags {
    tags = local.common_tags
  }
}

# Local values
locals {
  common_tags = {
    Project     = var.project_name
    Subsystem   = var.subsystem
    Environment = var.environment
    ManagedBy   = "terraform"
    CreatedBy   = "fractary-faber-cloud"
  }
}

# Resources
resource "aws_s3_bucket" "uploads" {
  # ... resource configuration
}

# ... more resources
variables.tf:
# Core variables
variable "project_name" {
  description = "Project name"
  type        = string
}

variable "subsystem" {
  description = "Subsystem name"
  type        = string
}

variable "environment" {
  description = "Environment (test/prod)"
  type        = string

  validation {
    condition     = contains(["test", "prod"], var.environment)
    error_message = "Environment must be test or prod."
  }
}

variable "aws_region" {
  description = "AWS region"
  type        = string
  default     = "us-east-1"
}

# Resource-specific variables
# ... add as needed
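For example, the resource patterns earlier reference variables such as bucket_suffix, function_name, runtime, and handler. Declarations like the following would typically accompany them; the names and the runtime default are illustrative, not required:
variable "bucket_suffix" {
  description = "Suffix appended to the S3 bucket name"
  type        = string
}

variable "function_name" {
  description = "Logical name appended to the Lambda function name"
  type        = string
}

variable "runtime" {
  description = "Lambda runtime identifier"
  type        = string
  default     = "python3.12" # illustrative default
}

variable "handler" {
  description = "Lambda handler entry point"
  type        = string
}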
outputs.tf:
output "bucket_name" {
description = "Name of the S3 bucket"
value = aws_s3_bucket.uploads.id
}
output "bucket_arn" {
description = "ARN of the S3 bucket"
value = aws_s3_bucket.uploads.arn
}
# ... more outputs
test.tfvars:
project_name = "myproject"
subsystem = "core"
environment = "test"
aws_region = "us-east-1"
# Resource-specific values
# ...
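Continuing the illustrative resource-specific variables sketched above, the values for a hypothetical uploads workload might look like this (all values are assumptions):
bucket_suffix = "uploads"
function_name = "processor"
runtime       = "python3.12"
handler       = "app.handler"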
Determining Input Type:
1. Check for file paths - Contains .md extension or starts with path separators:
   - "user-uploads.md" → design file
   - ".fractary/plugins/faber-cloud/designs/api-backend.md" → design file
   - ".faber/specs/123-add-feature.md" → FABER spec
2. Check for design directory reference - Mentions design directory:
   - "Implement design from user-uploads.md" → extract: user-uploads.md
   - "Use the design in api-backend.md" → extract: api-backend.md
3. Check for spec directory reference - Mentions .faber/specs:
   - "Implement infrastructure for .faber/specs/123-add-api.md" → extract spec path
4. Direct instructions - Doesn't match above patterns:
   - "Fix IAM permissions - Lambda needs s3:PutObject"
   - "Add CloudWatch alarms for all Lambda functions"
5. No input - Empty or null:
   - "" → Find latest design in .fractary/plugins/faber-cloud/designs/
File Path Resolution:
- Relative design files: Resolve to .fractary/plugins/faber-cloud/designs/{filename}
- Absolute paths: Use as-is
- FABER specs: Must be absolute or start with .faber/