---
name: ftpc-storage
description: Read files from remote storage backends (local, FTP, SFTP, S3, Azure). List directories, download files, inspect metadata. Use for reading data from cloud storage, FTP servers, or remote filesystems without making changes.
allowed-tools: Read, Grep, Glob, Bash(python3:*)
---
# FTPC Storage (Read-Only)

Use the `ftpc` library to read files from various storage backends.

## Supported Backends
| Protocol | URL Format | Example |
|---|---|---|
| Local | `file:///path` or `/path` | `/home/user/data` |
| FTP | `ftp://[user:pass@]host[:port]/path` | `ftp://ftp.example.com/pub` |
| FTPS | `ftps://[user:pass@]host[:port]/path` | `ftps://secure.example.com` |
| SFTP | `sftp://[user:pass@]host[:port]/path` | `sftp://user:pass@host/data` |
| S3 | `s3://bucket/path` | `s3://my-bucket/folder` |
| Azure Data Lake | `azure://account.dfs.core.windows.net/fs/path` | `azure://myacct.dfs.core.windows.net/data` |
| Azure Blob | `blob://account.blob.core.windows.net/container/path` | `blob://myacct.blob.core.windows.net/files` |
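The same API covers every backend, so a local directory is an easy way to try things out. A minimal sketch (the directory path is a placeholder):

```python
from ftpc import connect_sync

# Local paths use the same interface as remote backends;
# "/home/user/data" is a placeholder directory.
with connect_sync("/home/user/data") as store:
    for f in store.list():
        print(f.path)
```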
## Quick Start
```python
from ftpc import connect_sync

# Connect using a URL and list files
with connect_sync("s3://my-bucket") as store:
    files = store.list("/")
    for f in files:
        print(f"{f.name} {'DIR' if f.is_directory else f.size}")
```
## Available Operations

### List Directory
```python
from ftpc import connect_sync

with connect_sync("ftp://ftp.example.com") as store:
    # List the root (base path from the URL)
    files = store.list()

    # List a specific path
    files = store.list("/documents")

    # Each entry is a FileDescriptor with:
    # - name: str (filename only)
    # - path: PurePath (full path)
    # - is_file: bool
    # - is_directory: bool
    # - size: Optional[int] (bytes, None for directories)
    # - modified_time: Optional[datetime]
```
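Because each entry carries size and modification metadata, listings can be filtered client-side. A minimal sketch, assuming the backend returns timezone-aware `modified_time` values (adjust the comparison if yours are naive):

```python
from datetime import datetime, timedelta, timezone
from ftpc import connect_sync

cutoff = datetime.now(timezone.utc) - timedelta(days=7)

with connect_sync("ftp://ftp.example.com") as store:
    # Keep only regular files modified within the last week.
    recent = [
        f for f in store.list("/documents")
        if f.is_file and f.modified_time and f.modified_time > cutoff
    ]
    for f in recent:
        print(f"{f.name}\t{f.size} bytes")
```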
### Download File
```python
from ftpc import connect_sync

with connect_sync("sftp://user:pass@host") as store:
    # Download to a local path
    store.download("/remote/file.csv", "local_file.csv")

    # With progress tracking
    def progress(bytes_done: int) -> bool:
        print(f"Downloaded {bytes_done} bytes")
        return True  # Return False to cancel

    store.download("/large_file.zip", "output.zip", progress)
```
## Using Named Remotes from Config

If `~/.ftpcconf.toml` exists with configured remotes:
```toml
# ~/.ftpcconf.toml

[my-s3]
type = "s3"
bucket = "my-bucket"
region = "us-east-1"

[work-ftp]
type = "ftp"
url = "ftp.company.com"
username = "user"
password = "secret"
```
Load the config and create a client:
```python
from ftpc.config import Config
from ftpc.clients.s3client import S3Client

config = Config.from_file()  # Loads ~/.ftpcconf.toml
remote = config.remotes["my-s3"]

# Create a client from the config (constructor varies by type)
with S3Client(bucket_name=remote.bucket, region_name=remote.region) as client:
    files = client.ls("/")
```
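To see what is configured without hard-coding a remote name, iterate the mapping. A sketch, assuming each remote record exposes its TOML `type` field as an attribute (attribute access mirrors the `remote.bucket` usage above, but the exact field set is an assumption):

```python
from ftpc.config import Config

config = Config.from_file()

# List every remote defined in ~/.ftpcconf.toml.
# NOTE: `remote.type` is assumed to mirror the TOML "type" key.
for name, remote in config.remotes.items():
    print(f"{name}: {remote.type}")
```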
## Async Usage
```python
import asyncio
from ftpc import Storage

async def main():
    async with Storage.connect("s3://bucket") as store:
        files = await store.list("/")
        await store.download("/data.csv", "local.csv")

asyncio.run(main())
```
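The async interface composes with `asyncio.gather`, so independent transfers can overlap. A minimal sketch, assuming the underlying client tolerates concurrent calls on one connection (object-store backends like S3 generally do):

```python
import asyncio
from ftpc import Storage

async def main():
    async with Storage.connect("s3://bucket") as store:
        # Placeholder object keys; the downloads run concurrently.
        await asyncio.gather(
            store.download("/a.csv", "a.csv"),
            store.download("/b.csv", "b.csv"),
        )

asyncio.run(main())
```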
## Named Constructors (Explicit Configuration)
```python
from ftpc import Storage

# S3 with explicit credentials
with Storage.s3(
    bucket="my-bucket",
    region="us-east-1",
    access_key_id="AKIAIOSFODNN7EXAMPLE",
    secret_access_key="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
).sync() as store:
    files = store.list("/")

# FTP with explicit config
with Storage.ftp(
    host="ftp.example.com",
    username="user",
    password="pass",
    tls=True,  # Use FTPS
).sync() as store:
    store.download("/file.txt", "local.txt")

# SFTP with a key file
with Storage.sftp(
    host="server.example.com",
    username="deploy",
    key_filename="/home/user/.ssh/id_rsa",
).sync() as store:
    files = store.list("/var/data")
```
## Dependencies

Install the backends you need:
```bash
pip install ftpc            # Core (local + FTP)
pip install "ftpc[sftp]"    # + SFTP (paramiko)
pip install "ftpc[s3]"      # + S3 (boto3)
pip install "ftpc[azure]"   # + Azure (azure-storage-*)
pip install "ftpc[all]"     # All backends
```
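If you are unsure which extras are present, a quick import probe shows which backends will work. This is a generic Python check, not an `ftpc` API; the Azure module name is a guess based on the `azure-storage-*` packages above:

```python
import importlib.util

def installed(module: str) -> bool:
    try:
        return importlib.util.find_spec(module) is not None
    except ModuleNotFoundError:  # a parent package is missing
        return False

# Optional backend -> the third-party package that powers it.
for extra, module in {
    "sftp": "paramiko",
    "s3": "boto3",
    "azure": "azure.storage.blob",  # assumed module name
}.items():
    print(f"{extra}: {'available' if installed(module) else 'missing'}")
```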