dill
Extended Python pickle for serializing complex Python objects — extends pickle to handle lambdas, closures, generator functions, classes, and functions defined in __main__. dill features: dill.dumps()/dill.loads() for bytes, dill.dump()/dill.load() for files, serialization of lambda functions, nested functions, closures with free variables, class definitions, interactive session objects, generator functions (though not running generator objects), class instances with unpicklable attributes, and a pickle compatibility mode. Used in multiprocessing, Pathos parallel processing, and distributed computing where standard pickle fails.
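A minimal round-trip sketch of the bytes API; make_adder is an illustrative helper, not part of dill:

```python
import dill

def make_adder(offset):
    # Returns a closure over `offset` -- standard pickle raises
    # PicklingError on lambdas and nested functions like this one.
    return lambda x: x * 2 + offset

fn = make_adder(10)
payload = dill.dumps(fn)        # serialize to bytes
restored = dill.loads(payload)  # rebuild the lambda, closure included
print(restored(5))  # 20
```

The same pair of calls works for dicts, class instances, and interactively defined functions that plain pickle rejects.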
Score Breakdown
⚙ Agent Friendliness
🔒 Security
CRITICAL SECURITY WARNING: dill.loads() from untrusted data executes arbitrary code — more powerful and dangerous than pickle. Only deserialize dill from trusted sources. Use HMAC signatures to verify dill bytes before loading. Never expose dill deserialization as a network service. dill can serialize objects containing secrets — protect serialized bytes with encryption at rest and in transit.
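One way to apply the HMAC advice is to tag every payload at serialization time and verify before loading. This is a sketch, not a dill API: sign, verified_loads, and the key handling are illustrative, and a real deployment would load the key from a secret store.

```python
import hashlib
import hmac

import dill

SECRET_KEY = b"replace-with-a-real-key"  # hypothetical; never hardcode in production

def sign(payload: bytes) -> bytes:
    """Prepend an HMAC-SHA256 tag so the receiver can verify authenticity."""
    tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    return tag + payload

def verified_loads(blob: bytes):
    """Deserialize only if the 32-byte HMAC tag matches; otherwise refuse."""
    tag, payload = blob[:32], blob[32:]
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("HMAC mismatch -- refusing to deserialize")
    return dill.loads(payload)

blob = sign(dill.dumps({"step": 3}))
print(verified_loads(blob))  # {'step': 3}
```

Note this only proves the bytes came from a key holder; it does nothing to make dill.loads() itself safe, so the key must never be shared with untrusted parties.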
⚡ Reliability
Best When
Extending pickle to serialize Python objects that standard pickle cannot handle — lambdas, closures, and interactively-defined functions for multiprocessing, distributed computing, and session state.
Avoid When
Untrusted data deserialization (code execution risk), cross-language (use JSON/protobuf), or when standard pickle works.
Use Cases
- • Agent lambda serialization — import dill; offset = 3; fn = lambda x: x * 2 + offset; payload = dill.dumps(fn); restored = dill.loads(payload); restored(5) — serialize a lambda; agent passes lambda functions to worker processes; alternative to cloudpickle; dill handles closures with captured variables
- • Agent multiprocessing with lambdas — from multiprocessing import Pool; import dill; def run_in_pool(fn_bytes, arg): fn = dill.loads(fn_bytes); return fn(arg); fn = lambda x: x**2; args = [(dill.dumps(fn), i) for i in range(10)]; with Pool() as p: results = p.starmap(run_in_pool, args) — serialize functions for Pool; standard Pool.map() can't pickle lambdas; dill bytes workaround
- • Agent session state save — import dill; state = {'model': my_model, 'history': [(q, a) for q, a in conversation], 'config': lambda x: x * scale}; with open('agent_state.pkl', 'wb') as f: dill.dump(state, f) — checkpoint including lambdas; agent saves complete session state; restore with dill.load()
- • Agent interactive function — def make_handler(threshold): return lambda x: 'high' if x > threshold else 'low'; handler = make_handler(0.8); serialized = dill.dumps(handler); restored = dill.loads(serialized); restored(0.9) — 'high' — closure serialization; agent configures handlers at runtime with captured config values
- • Agent inspect serialization — dill.detect.badobjects(fn) — detect what makes a function unpicklable; dill.detect.errors(fn) — show the underlying pickling error; agent debugging serialization failures identifies which parts of complex objects are unpicklable
Not For
- • Untrusted data — NEVER deserialize dill bytes from untrusted sources — arbitrary code execution; same warning as pickle but deserves emphasis
- • Cross-language serialization — dill is Python-only; for cross-language use JSON, protobuf, or MessagePack
- • Production APIs — dill serialization is not stable across Python versions; for APIs use JSON or protobuf
Interface
Authentication
No auth — local serialization library.
Pricing
dill is BSD licensed. Free for all use.
Agent Metadata
Known Gotchas
- ⚠ CRITICAL: NEVER deserialize dill from untrusted sources — dill.loads(untrusted_bytes) executes arbitrary Python code; same security risk as pickle but stronger because dill can serialize more dangerous objects; agent receiving dill bytes from external sources must validate source authenticity via HMAC before deserializing
- ⚠ Serialized bytes not stable across Python versions — dill bytes from Python 3.10 may fail to load in Python 3.11 or 3.12 due to internal changes; agent using dill for persistent storage must use same Python version to load; for cross-version storage use JSON or protobuf
- ⚠ C extension objects still fail — dill cannot serialize C extension types (e.g., compiled Cython, ctypes objects, some PyTorch internals); dill.dumps(cuda_tensor) may fail; agent storing ML models must use framework-specific serialization (torch.save) not dill
- ⚠ Closures capture variables by reference not value — lambda: use_later(x) captures the variable x, not its current value; if x changes before serialization, the serialized closure sees the changed value; agent code with shared mutable state in closures gets unexpected serialized values; use default args to capture by value: lambda x=x: use_later(x)
- ⚠ dill.dumps() may fail silently for some objects — some objects serialize without error but produce bytes that cannot be deserialized; test round-trip: assert dill.loads(dill.dumps(obj)) == obj; agent code should verify round-trip for objects before relying on dill for state persistence
- ⚠ multiprocessing.Pool uses pickle not dill — Python's Pool.map(lambda: ...) raises PicklingError because Pool uses standard pickle; work around: serialize with dill manually and pass bytes, or use pathos.multiprocessing.Pool which uses dill: from pathos.multiprocessing import ProcessingPool as Pool
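The capture-by-reference gotcha shows up clearly with the classic loop-variable case; the default-argument idiom freezes each value at definition time. Helper names here are illustrative:

```python
import dill

def make_handlers():
    handlers = []
    for i in range(3):
        handlers.append(lambda: i)      # all three share one `i` cell
    return handlers

def make_handlers_fixed():
    handlers = []
    for i in range(3):
        handlers.append(lambda i=i: i)  # default arg copies the value now
    return handlers

# Round-trip each handler through dill and call it.
late = [dill.loads(dill.dumps(f))() for f in make_handlers()]
safe = [dill.loads(dill.dumps(f))() for f in make_handlers_fixed()]
print(late)  # [2, 2, 2] -- every closure serialized the final loop value
print(safe)  # [0, 1, 2] -- values frozen at definition time
```

The same pattern applies to any mutable state a closure references: dill faithfully serializes whatever the cell holds at dumps() time, not what it held when the lambda was written.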
Alternatives
Scores are editorial opinions as of 2026-03-06.