Claude Code Plugins

Community-maintained marketplace


Fetch data from URLs. Use when asked to download content, fetch remote files, or read web data.

Install Skill

1. Download skill

2. Enable skills in Claude

   Open claude.ai/settings/capabilities and find the "Skills" section

3. Upload to Claude

   Click "Upload skill" and select the downloaded ZIP file

Note: Please verify the skill by reviewing its instructions before using it.

SKILL.md

name: wget-reader
description: Fetch data from URLs. Use when asked to download content, fetch remote files, or read web data.
version: 1.0.0

Wget URL Reader

Overview

Fetches content from URLs using the wget command-line tool. Supports downloading files, reading web pages, and retrieving API responses.

Instructions

  1. When the user provides a URL to read or fetch:

    • Validate the URL format
    • Use wget with appropriate flags based on content type
  2. For reading content to stdout (display):

    wget -qO- "<URL>"
    
  3. For downloading files:

    wget -O "<filename>" "<URL>"
    
  4. For JSON API responses:

    wget -qO- --header="Accept: application/json" "<URL>"
    
  5. Common wget flags:

    • -q: Quiet mode (no progress output)
    • -O-: Output to stdout
    • -O <file>: Output to specific file
    • --header: Add custom HTTP header
    • --timeout=<seconds>: Set timeout
    • --tries=<n>: Number of retries
    • --user-agent=<agent>: Set user agent

Examples

Example: Read webpage content

Input: "Read the content from https://example.com" Command:

wget -qO- "https://example.com"

Example: Download a file

Input: "Download the file from https://example.com/data.json" Command:

wget -O "data.json" "https://example.com/data.json"

Example: Fetch API with headers

Input: "Fetch JSON from https://api.example.com/data" Command:

wget -qO- --header="Accept: application/json" "https://api.example.com/data"

Example: Download with timeout and retries

Input: "Download with 30 second timeout" Command:

wget --timeout=30 --tries=3 -O "output.txt" "<URL>"
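After any of the commands above, wget's exit status distinguishes failure modes. The following sketch maps the common codes documented in the wget manual (0 = success, 4 = network failure, 8 = server issued an error response such as 404) to messages:

```shell
# Map a wget exit status to a human-readable message.
# Codes taken from the wget manual's "Exit Status" section.
report_wget_status() {
  case "$1" in
    0) echo "fetched" ;;
    4) echo "network failure" ;;
    8) echo "server error (e.g. 404)" ;;
    *) echo "wget failed with status $1" ;;
  esac
}

# Typical usage (network call not run here):
# wget -qO "output.html" "https://example.com"
# report_wget_status $?
```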

Guidelines

  • Always quote URLs to handle special characters
  • Use -q flag to suppress progress bars in scripts
  • For large files, consider adding --show-progress for user feedback
  • Respect robots.txt and rate limits when fetching multiple URLs
  • Use --no-check-certificate only when necessary (self-signed certs)
  • For authentication, use --user and --password or --header="Authorization: Bearer <token>"
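The authentication guideline above can be wrapped in a small helper. This is a sketch under assumptions: `fetch_json` is a hypothetical function name, and `TOKEN` is a hypothetical environment variable holding a bearer token; adapt both to your setup.

```shell
# Sketch: fetch JSON from a URL, adding a bearer token header when
# the (hypothetical) TOKEN environment variable is set.
fetch_json() {
  url="$1"
  # Build the argument list so the Authorization header is optional.
  set -- -qO- --header="Accept: application/json"
  if [ -n "${TOKEN:-}" ]; then
    set -- "$@" --header="Authorization: Bearer ${TOKEN}"
  fi
  wget "$@" "$url"
}

# Usage:
# TOKEN=... fetch_json "https://api.example.com/data"
```

Building the arguments with `set --` keeps each header a single word, so tokens containing spaces are passed intact.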