DEAP

Distributed Evolutionary Algorithms in Python — toolkit for implementing genetic algorithms, genetic programming, evolution strategies, and swarm intelligence. DEAP features: creator for custom individual/fitness classes, toolbox for operator registration, algorithms.eaSimple/eaMuPlusLambda for full EA loops, genetic operators (crossover, mutation, selection), Pareto front multi-objective optimization (NSGA-II), genetic programming tree evaluation, Hall of Fame for best individuals, Statistics tracking, pickle-based parallelism with scoop, and flexible fitness design. Used for feature selection, hyperparameter optimization, NAS, game playing, and prompt evolution.

Evaluated Mar 06, 2026 · v1.4.x
Homepage ↗ Repo ↗ AI & Machine Learning python deap genetic-algorithm evolutionary optimization GP ES neuroevolution
⚙ Agent Friendliness
60
/ 100
Can an agent use this?
🔒 Security
88
/ 100
Is it safe for agents?
⚡ Reliability
76
/ 100
Does it work consistently?

Score Breakdown

⚙ Agent Friendliness

MCP Quality
--
Documentation
75
Error Messages
68
Auth Simplicity
98
Rate Limits
98

🔒 Security

TLS Enforcement
90
Auth Strength
90
Scope Granularity
88
Dep. Hygiene
82
Secret Handling
90

Local computation — no network calls. Multiprocessing parallelism uses pickle serialization — ensure evaluate() doesn't serialize sensitive data in individual representation. No security concerns for standard optimization use.

⚡ Reliability

Uptime/SLA
75
Version Stability
78
Breaking Changes
78
Error Recovery
72

Best When

Highly customized evolutionary algorithms, multi-objective optimization with NSGA-II, genetic programming for symbolic regression, or neuroevolution — DEAP provides full control over evolutionary operators that Optuna and Nevergrad don't expose.

Avoid When

You want simple hyperparameter tuning (use Optuna), problems where gradients are available (use a gradient-based optimizer such as Adam/SGD via PyTorch), or convex problems (use CVXPY).

Use Cases

  • Agent hyperparameter evolution — creator.create('FitnessMax', base.Fitness, weights=(1.0,)); creator.create('Individual', list, fitness=creator.FitnessMax); toolbox = base.Toolbox(); toolbox.register('individual', tools.initRepeat, creator.Individual, toolbox.attr_float, 5); pop, log = algorithms.eaSimple(toolbox.population(n=50), toolbox, cxpb=0.5, mutpb=0.2, ngen=20) — evolve real-valued hyperparameters; agent genetic hyperparameter search finds configurations without gradients
  • Agent feature selection GA — individual represents feature mask (binary list); evaluate(individual) trains model on selected features, returns accuracy; tournament selection + one-point crossover + bit-flip mutation; agent automatically selects informative feature subset from 1000 candidate features
  • Agent multi-objective NSGA-II — creator.create('FitnessMulti', base.Fitness, weights=(-1.0, -1.0)); toolbox.register('select', tools.selNSGA2); algorithms.eaMuPlusLambda(pop, toolbox, mu=50, lambda_=100, cxpb=0.7, mutpb=0.2, ngen=100) — Pareto-optimal multi-objective optimization; agent finds trade-off frontier between model accuracy and inference speed
  • Agent genetic programming — pset = gp.PrimitiveSet('main', 2); pset.addPrimitive(operator.add, 2); pset.addPrimitive(operator.mul, 2); toolbox.register('expr', gp.genHalfAndHalf, pset=pset, min_=1, max_=3) — symbolic regression via genetic programming; agent discovers mathematical formula fitting observed data by evolving expression trees
  • Agent neuroevolution — individual encodes neural network weights as float list; evaluate trains individual on task for N episodes; CMA-ES mutation operator; agent evolves neural network weights for RL environment without backpropagation; DEAP provides evolutionary search over weight space

Not For

  • Gradient-based optimization — DEAP is gradient-free evolutionary computation; for gradient-available problems use Adam/SGD via PyTorch
  • Convex optimization — use CVXPY for guaranteed-optimal convex solutions; evolutionary search is approximate
  • Simple hyperparameter tuning — use Optuna or Nevergrad for simpler API; DEAP requires more setup but is more customizable

Interface

REST API
No
GraphQL
No
gRPC
No
MCP Server
No
SDK
Yes
Webhooks
No

Authentication

Methods: none
OAuth: No Scopes: No

No auth — local evolutionary computation library.

Pricing

Model: open_source
Free tier: Yes
Requires CC: No

DEAP is LGPL-3.0 licensed. Free for all use.

Agent Metadata

Pagination
none
Idempotent
Partial
Retry Guidance
Not documented

Known Gotchas

  • creator classes persist across module imports — creator.create('Individual', list, ...) registers the class globally on the creator module; re-running the call (reimporting the module or re-executing a Jupyter cell) emits a RuntimeWarning that the class already exists and overwrites it; agent notebooks must guard: if 'Individual' not in creator.__dict__: creator.create('Individual', ...) before creating
  • Individual fitness must be tuple not scalar — FitnessMax weights=(1.0,) is single objective; evaluate() must return (score,) not score; toolbox.register('evaluate', lambda ind: (score,)); agent code returning plain float causes TypeError in fitness assignment; always return tuple even for single objective
  • eaSimple mutates individuals in place — algorithms.eaSimple modifies population; agent code that needs original population must copy before: original = [toolbox.clone(ind) for ind in pop]; DEAP operators work in-place for efficiency; clone() deep-copies individual including fitness
  • Parallelism requires picklable evaluate function — toolbox.register('map', pool.map) enables parallel evaluation; evaluate() must be picklable (top-level function, no lambda, no closure over non-picklable objects); agent code with class methods or closures as evaluate fails with pickle error
  • Invalid fitness must be cleared before re-evaluation — after crossover/mutation, offspring fitness is stale; the built-in algorithms handle this, but custom loops must del child.fitness.values on each modified child before evaluating; otherwise fitness inherited from the parent marks the offspring as valid and it skips re-evaluation
  • GP tree depth control prevents bloat — genetic programming trees grow unboundedly without depth limit; toolbox.decorate('mate', gp.staticLimit(key=operator.attrgetter('height'), max_value=17)) adds depth constraint; agent GP without bloat control generates unreadable thousand-node expression trees after many generations

Full Evaluation Report

Detailed scoring breakdown, competitive positioning, security analysis, and improvement recommendations for DEAP.

$99

Scores are editorial opinions as of 2026-03-06.
