multiprocessing-python

@benchflow-ai/skillsbench

Use Python's multiprocessing module for parallel execution. Use when parallelizing CPU-bound tasks, creating process pools, sharing data between processes, and avoiding the GIL.

Install Skill

1. Download the skill
2. Enable skills in Claude: open claude.ai/settings/capabilities and find the "Skills" section
3. Upload to Claude: click "Upload skill" and select the downloaded ZIP file

Note: Review the skill's instructions before using it.

SKILL.md

name: multiprocessing-python
description: Use Python's multiprocessing module for parallel execution. Use when parallelizing CPU-bound tasks, creating process pools, sharing data between processes, and avoiding the GIL.

Python Multiprocessing

Process Pool

from multiprocessing import Pool

def square(x):
    return x * x

def multiply(a, b):
    return a * b

# Map a function over an iterable across 4 worker processes
if __name__ == "__main__":
    with Pool(4) as pool:
        results = pool.map(square, range(100))
        results = pool.starmap(multiply, [(1, 2), (3, 4)])  # Multiple args
        async_result = pool.map_async(square, range(100))   # Non-blocking
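map_async returns an AsyncResult instead of blocking, so the parent can keep working while the pool computes. A minimal sketch of collecting the values later, continuing from the async_result above:

async_result.wait()                     # Optional: block until all tasks finish
results = async_result.get(timeout=30)  # Raises multiprocessing.TimeoutError if not done in time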

Basic Process

from multiprocessing import Process

def worker(name, count):
    print(f"worker {name} processing {count} items")

if __name__ == "__main__":
    p = Process(target=worker, args=("A", 10))
    p.start()  # Launch the child process
    p.join()   # Block until it finishes

Data Sharing

from multiprocessing import Value, Array, Manager

# Shared primitives ('i' = signed int, 'd' = double; see the array module typecodes)
counter = Value('i', 0)
shared_array = Array('d', [0.0] * 10)

def worker(counter, arr):
    with counter.get_lock():  # Value carries its own lock
        counter.value += 1
    arr[0] = 42

# Complex objects (dicts, lists) proxied via a Manager process
with Manager() as m:
    shared_dict = m.dict()
    shared_list = m.list()
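A minimal end-to-end sketch of this pattern, reusing the worker above: the Value and Array are passed to each child as arguments, and the parent reads the results after join().

from multiprocessing import Process, Value, Array

def worker(counter, arr):
    with counter.get_lock():
        counter.value += 1
    arr[0] = 42

if __name__ == "__main__":
    counter = Value('i', 0)
    arr = Array('d', [0.0] * 10)
    procs = [Process(target=worker, args=(counter, arr)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(counter.value, arr[0])  # 4 42.0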

Communication

from multiprocessing import Queue, Pipe

# Queue: many-producer, many-consumer FIFO
queue = Queue()
queue.put("task")   # Any picklable object
item = queue.get()  # Blocks until an item is available

# Pipe: a two-ended connection; both ends can send and receive
parent, child = Pipe()
parent.send("msg")
msg = child.recv()
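A minimal producer/consumer sketch built on Queue (the function names and the None sentinel are illustrative, not part of the skill):

from multiprocessing import Process, Queue

def producer(q):
    for i in range(5):
        q.put(i)
    q.put(None)  # Sentinel: tells the consumer to stop

def consumer(q):
    while True:
        item = q.get()
        if item is None:  # Sentinel received
            break
        print("got", item)

if __name__ == "__main__":
    q = Queue()
    producer_p = Process(target=producer, args=(q,))
    consumer_p = Process(target=consumer, args=(q,))
    producer_p.start()
    consumer_p.start()
    producer_p.join()
    consumer_p.join()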

Synchronization

from multiprocessing import Lock, Barrier

# Lock: mutual exclusion across processes
lock = Lock()
with lock:
    # Critical section
    pass

# Barrier: blocks until all 4 parties have called wait()
barrier = Barrier(4)
barrier.wait()
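Since wait() only returns once every party has arrived, the Barrier is typically created in the parent and passed to each worker. A minimal sketch, with a hypothetical phase() worker for illustration:

from multiprocessing import Barrier, Process

def phase(barrier, name):
    print(name, "ready")
    barrier.wait()          # All 4 processes pause here...
    print(name, "running")  # ...then proceed together

if __name__ == "__main__":
    barrier = Barrier(4)
    procs = [Process(target=phase, args=(barrier, f"p{i}")) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()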

Pool Initialization

from multiprocessing import Pool

def init_worker(shared_data):
    global worker_data
    worker_data = shared_data  # Runs once per worker process

def process_item(x):
    return x + worker_data

if __name__ == "__main__":
    with Pool(4, initializer=init_worker, initargs=(100,)) as pool:
        results = pool.map(process_item, range(10))  # [100, 101, ..., 109]