{"id":"aiming-lab-simplemem","name":"SimpleMem","af_score":52.2,"security_score":48.8,"reliability_score":31.2,"what_it_does":"SimpleMem is a Python memory framework for LLM agents that stores, compresses (via semantic lossless compression), and retrieves long-term memories using semantic, lexical, and symbolic indexing. It supports cross-session (persistent) memory and can be used via MCP (cloud-hosted or run locally with Docker) or via Python integration with OpenAI-compatible LLM/embedding backends.","best_when":"You want long-term memory for LLM agents behind an MCP-accessible memory service and are willing to configure an OpenAI-compatible API for generation and embeddings.","avoid_when":"You cannot provide an OpenAI-compatible API key or cannot operate the required services (cloud MCP or a local Docker stack). If you need strong, explicitly documented privacy/compliance guarantees, evaluate beyond the marketing/README content.","last_evaluated":"2026-03-29T18:05:20.509229+00:00","has_mcp":true,"has_api":true,"auth_methods":["MCP token query parameter (SSE endpoint)","OpenAI-compatible API key for LLM/embeddings (config.py)"],"has_free_tier":false,"known_gotchas":["The MCP auth example passes the token as a URL query parameter; agents should avoid logging URLs that contain tokens.","Requires an OpenAI-compatible API key and a correct base URL/model configuration; misconfiguration can cause initialization failures.","When using the Docker/local service, configure persistent storage volumes if cross-session memory is required.","Pagination, idempotency, and retry semantics for memory write/retrieve operations were not evident in the provided README excerpt."],"error_quality":0.0}