aiocache

Async caching library for Python with pluggable backends — provides async cache operations for Redis, Memcached, and in-memory caching. aiocache features: the @cached() decorator for async function memoization, @multi_cached() for batch caching, Cache.get()/set()/delete()/exists(), TTL support, serializers (JSON, pickle, msgpack), BaseCache for custom backends, RedisCache (aioredis), MemcachedCache (aiomcache), SimpleMemoryCache, namespace support, and plugins (HitMissRatioPlugin, TimingPlugin). Part of the aio-libs ecosystem. Suited to async web frameworks (FastAPI, aiohttp), where synchronous caching clients would block the event loop.

Evaluated Mar 06, 2026 (v0.12.x)
⚙ Agent Friendliness
58
/ 100
Can an agent use this?
🔒 Security
79
/ 100
Is it safe for agents?
⚡ Reliability
70
/ 100
Does it work consistently?

Score Breakdown

⚙ Agent Friendliness

MCP Quality
--
Documentation
75
Error Messages
72
Auth Simplicity
88
Rate Limits
90

🔒 Security

TLS Enforcement
82
Auth Strength
78
Scope Granularity
75
Dep. Hygiene
80
Secret Handling
82

Redis connection should use password and TLS in production. Cached data in Redis is accessible to all clients with Redis access — do not cache secrets or PII in shared Redis without encryption. SimpleMemoryCache is process-local — safer for sensitive temporary data.

⚡ Reliability

Uptime/SLA
72
Version Stability
68
Breaking Changes
70
Error Recovery
72

Best When

Async Python services (FastAPI, aiohttp) needing Redis or Memcached caching without blocking the event loop — aiocache's @cached decorator integrates caching into async functions with one line.

Avoid When

Your codebase is synchronous (use cachetools or diskcache), you need persistent disk caching, or you need complex cache topologies.

Use Cases

  • Agent async function caching — from aiocache import cached; from aiocache.serializers import JsonSerializer; @cached(ttl=300, serializer=JsonSerializer()) applied to async def get_agent_config(agent_id: str): return await db.fetch_config(agent_id) — result cached for 5 minutes; agent config lookups hit the cache on subsequent calls; caching added with a single decorator, no change to the function body
  • Agent Redis caching — from aiocache import Cache; cache = Cache(Cache.REDIS, endpoint='redis', port=6379, namespace='agent'); await cache.set('session:123', session_data, ttl=3600); data = await cache.get('session:123') — async Redis cache for agent session state; a FastAPI agent shares the session cache across workers via Redis
  • Agent expensive computation caching — @cached(ttl=600, key_builder=lambda f, *args, **kwargs: f'embedding:{args[0]}') applied to async def compute_embedding(text: str): return await embedding_model.encode(text) — caches expensive embedding computation; repeated text embeddings are served from cache for 10 minutes, cutting embedding API calls for repeated queries
  • Agent multi-key caching — @multi_cached(keys_from_attr='user_ids') applied to async def get_users(user_ids: list): return await db.bulk_fetch(user_ids) — multi_cached returns cached values and fetches only uncached IDs from the DB; agent batch processing retrieves cached users and queries only new ones, reducing DB load for repeated ID sets
  • Agent cache invalidation — cache = Cache(Cache.REDIS, namespace='products'); await cache.delete('product:456'); await cache.clear(namespace='products') — targeted and bulk cache invalidation; the agent's product catalog invalidates a specific product's cache entry on update, while clear() wipes the entire products namespace on bulk import

Not For

  • Sync Python codebases — use cachetools or diskcache for sync caching; aiocache requires asyncio event loop
  • Persistent disk caching — aiocache is in-memory or network cache; for disk persistence use diskcache or shelve
  • Complex cache topologies — aiocache is simple get/set/delete; for complex caching strategies (L1/L2, write-behind) implement custom logic

Interface

REST API
No
GraphQL
No
gRPC
No
MCP Server
No
SDK
Yes
Webhooks
No

Authentication

Methods: none, password
OAuth: No Scopes: No

No auth for SimpleMemoryCache. Redis authenticates via the password parameter. Memcached typically runs without authentication. Network caches in production should be isolated behind a VPC or firewall.
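Password and TLS for the Redis backend are enforced server-side. A minimal redis.conf fragment for that setup might look like the following (the certificate paths and password are placeholders):

```conf
# Require a password from all clients (aiocache passes it via password=)
requirepass change-me

# Serve TLS only: disable the plaintext port
port 0
tls-port 6379
tls-cert-file /etc/redis/tls/redis.crt
tls-key-file /etc/redis/tls/redis.key
tls-ca-cert-file /etc/redis/tls/ca.crt
```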

Pricing

Model: open_source
Free tier: Yes
Requires CC: No

aiocache is BSD licensed. Cache backend hosting costs depend on provider.

Agent Metadata

Pagination
none
Idempotent
Full
Retry Guidance
Not documented

Known Gotchas

  • Backend dependency not installed by default — Cache(Cache.REDIS) requires pip install aiocache[redis]; Cache(Cache.MEMCACHED) requires pip install aiocache[memcached]; agent code failing with ImportError on RedisCache must install extras; base aiocache only includes SimpleMemoryCache
  • SimpleMemoryCache does not share across workers — Cache(Cache.MEMORY) is per-process; FastAPI with multiple workers has separate cache per worker; agent expecting shared cache across workers must use Cache(Cache.REDIS) with shared Redis instance; in-process cache is only useful for single-worker deployments
  • TTL of None means forever — cache.set(key, value) without ttl stores indefinitely, and cache.set(key, value, ttl=None) behaves the same; agent code caching API responses without a TTL fills Redis with stale data; always set an explicit TTL when caching external API responses
  • Serializer must match on get and set — @cached(serializer=JsonSerializer()) caches as JSON; reading back with a different serializer fails to deserialize (raising a serializer-specific error); agent code switching from pickle to JSON must flush the old cache with await cache.clear(); the cached data format must be consistent across deployments
  • Key collision between different functions — @cached without custom key_builder uses function name + args as key; two functions with same name in different modules collide in shared Redis; agent code must use: key_builder=lambda f, *args: f'{f.__module__}.{f.__name__}:{args}' for unique keys across agent components
  • Exceptions are not cached — @cached only caches successful results; a function that raises always invokes the underlying function; agent functions that raise on API error bypass the cache on every error; implement negative caching manually if error responses need it
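To avoid the key-collision gotcha above, a module-qualified key builder can be passed as key_builder=. This is a sketch — the exact key format is a choice, not aiocache's default:

```python
def module_qualified_key(func, *args, **kwargs):
    """Build a cache key unique across modules:
    'pkg.mod.func:arg1,arg2:k=v' instead of bare function name + args."""
    arg_part = ",".join(repr(a) for a in args)
    kwarg_part = ",".join(f"{k}={v!r}" for k, v in sorted(kwargs.items()))
    return f"{func.__module__}.{func.__qualname__}:{arg_part}:{kwarg_part}"

# Two same-named functions in different modules now get distinct keys,
# because the module path is part of the key.
def get_user(user_id):  # imagine this lives in module `billing`
    ...

key = module_qualified_key(get_user, 42, active=True)
```

Used as @cached(key_builder=module_qualified_key), keys for a billing.get_user and an auth.get_user no longer collide in a shared Redis.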


Scores are editorial opinions as of 2026-03-06.
