aiomysql

Async MySQL client library for Python asyncio — a wrapper around PyMySQL that provides non-blocking MySQL connections. aiomysql features: aiomysql.connect() for single connections, aiomysql.create_pool() for connection pooling, cursor.execute()/executemany() for queries, cursor.fetchall()/fetchone()/fetchmany() for results, DictCursor for dict rows, SSCursor for server-side cursors (large result sets), transactions (conn.begin(), commit(), rollback()), autocommit mode, SQLAlchemy Core integration via aiomysql.sa, and aio-libs ecosystem integration. Part of aio-libs, like aiohttp.

Evaluated Mar 06, 2026 (0d ago) v0.2.x
Homepage ↗ Repo ↗ Developer Tools python aiomysql mysql async asyncio mariadb database aio-libs
⚙ Agent Friendliness
61
/ 100
Can an agent use this?
🔒 Security
80
/ 100
Is it safe for agents?
⚡ Reliability
74
/ 100
Does it work consistently?

Score Breakdown

⚙ Agent Friendliness

MCP Quality
--
Documentation
78
Error Messages
78
Auth Simplicity
88
Rate Limits
92

🔒 Security

TLS Enforcement
82
Auth Strength
80
Scope Granularity
78
Dep. Hygiene
80
Secret Handling
80

CRITICAL: Always use parameterized queries — string interpolation into SQL enables SQL injection. Use TLS for MySQL connections in production: ssl={'ca': '/path/to/ca.pem'}. Use least-privilege MySQL users — an agent should not have DROP or CREATE privileges. Store MySQL credentials in environment variables, not source code. Use connection pooling to cap concurrent connections to the MySQL server.
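The credential and TLS points above can be folded into one connect-time helper — a minimal sketch; the MYSQL_* environment variable names are assumptions for this example, not an aiomysql convention:

```python
import os

def mysql_connect_kwargs() -> dict:
    """Build keyword arguments for aiomysql.connect()/create_pool()
    from environment variables, so credentials never live in source.
    The MYSQL_* variable names are assumptions for this sketch."""
    return {
        "host": os.environ["MYSQL_HOST"],
        "user": os.environ["MYSQL_USER"],  # least-privilege account: no DROP/CREATE
        "password": os.environ["MYSQL_PASSWORD"],
        "db": os.environ["MYSQL_DB"],
        # TLS: point at the CA bundle that signed the server certificate
        "ssl": {"ca": os.environ.get("MYSQL_SSL_CA", "/etc/ssl/mysql-ca.pem")},
    }
```

Usable for either a single connection or a pool, e.g. await aiomysql.create_pool(minsize=5, maxsize=20, **mysql_connect_kwargs()).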

⚡ Reliability

Uptime/SLA
75
Version Stability
72
Breaking Changes
72
Error Recovery
78

Best When

Async Python services (FastAPI, aiohttp) connecting to MySQL or MariaDB — aiomysql provides non-blocking MySQL access in asyncio without blocking the event loop.

Avoid When

Your code is synchronous (use PyMySQL), you need PostgreSQL (use asyncpg), or you want SQLAlchemy ORM (use SQLAlchemy 2.x async).

Use Cases

  • Agent async MySQL query — import aiomysql; async with aiomysql.connect(host='db', user='agent', password='pass', db='agentdb') as conn: async with conn.cursor(aiomysql.DictCursor) as cur: await cur.execute('SELECT * FROM sessions WHERE agent_id=%s', (agent_id,)); rows = await cur.fetchall() — async MySQL; agent FastAPI reads session data without blocking event loop; DictCursor returns list of dicts instead of tuples
  • Agent connection pool — pool = await aiomysql.create_pool(host='db', user='u', password='p', db='db', minsize=5, maxsize=20); async with pool.acquire() as conn: async with conn.cursor() as cur: await cur.execute(sql, args) — connection pool for multi-request agent; pool.acquire() reuses connections; pool closed with pool.close(); await pool.wait_closed()
  • Agent transaction — async with conn.cursor() as cur: await conn.begin(); try: await cur.execute('UPDATE balance SET amt=amt-%s WHERE id=%s', (amount, sender)); await cur.execute('UPDATE balance SET amt=amt+%s WHERE id=%s', (amount, receiver)); await conn.commit(); except: await conn.rollback(); raise — explicit transaction; agent financial operations stay atomic, with rollback on error
  • Agent bulk insert — async with conn.cursor() as cur: data = [(f'agent_{i}', score) for i, score in enumerate(scores)]; await cur.executemany('INSERT INTO results (name, score) VALUES (%s, %s)', data); await conn.commit() — bulk insert with executemany; agent batch data loading inserts 10K rows in one roundtrip; much faster than individual execute calls
  • Agent large result streaming — async with conn.cursor(aiomysql.SSCursor) as cur: await cur.execute('SELECT * FROM large_table'); while True: row = await cur.fetchone(); if row is None: break; process(row) — SSCursor fetches row-by-row from server; agent avoids loading 10M-row result into memory; memory-efficient for large dataset processing
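The SSCursor pattern in the last bullet can be wrapped in a small async generator — a sketch that works with any object exposing an async fetchone() that returns None at end-of-results, which is how aiomysql cursors behave:

```python
import asyncio
from typing import Any, AsyncIterator

async def stream_rows(cursor: Any) -> AsyncIterator[tuple]:
    """Yield rows one at a time from any cursor with an async fetchone(),
    e.g. an aiomysql SSCursor, so a large result set never sits in memory."""
    while True:
        row = await cursor.fetchone()
        if row is None:  # aiomysql cursors return None when exhausted
            return
        yield row
```

With aiomysql this would be used as: async with conn.cursor(aiomysql.SSCursor) as cur: await cur.execute('SELECT * FROM large_table'); async for row in stream_rows(cur): process(row).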

Not For

  • Sync Python code — aiomysql requires asyncio; for sync MySQL use PyMySQL or mysqlclient
  • PostgreSQL — aiomysql is MySQL/MariaDB only; for PostgreSQL use asyncpg or psycopg3
  • SQLAlchemy 2.x ORM — aiomysql.sa is for SQLAlchemy Core not ORM; for async SQLAlchemy ORM use SQLAlchemy 2.x async engine with aiomysql dialect

Interface

REST API
No
GraphQL
No
gRPC
No
MCP Server
No
SDK
Yes
Webhooks
No

Authentication

Methods: password tls
OAuth: No Scopes: No

MySQL username/password auth. SSL/TLS via ssl parameter (dict with ca, cert, key paths). No OAuth — database-level auth only. Use least-privilege MySQL users for agent connections.

Pricing

Model: open_source
Free tier: Yes
Requires CC: No

aiomysql is MIT licensed. MySQL server licensing varies — community edition is GPL.

Agent Metadata

Pagination
none
Idempotent
Partial
Retry Guidance
Not documented
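Since retry behavior is not documented, a conservative client-side sketch is exponential backoff around idempotent operations; which exception types are safe to retry (e.g. connection-loss errors) is an assumption left to the caller, and writes should only be retried when the statement is idempotent:

```python
import asyncio

async def with_retries(op, *, attempts=3, base_delay=0.1, retry_on=(Exception,)):
    """Retry an async callable with exponential backoff. aiomysql does not
    document retry semantics, so the retry_on exception tuple is an
    assumption the caller must supply (e.g. connection-loss errors)."""
    for attempt in range(attempts):
        try:
            return await op()
        except retry_on:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            await asyncio.sleep(base_delay * 2 ** attempt)
```

Usage would look like: rows = await with_retries(lambda: run_query(pool, sql, args), retry_on=(OperationalError,)).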

Known Gotchas

  • autocommit=False by default — aiomysql.connect() starts with autocommit=False, so every execute() runs inside an implicit transaction; agent code without an explicit commit() (or begin()/commit()) never persists data; either set autocommit=True in connect() or call await conn.commit() after writes; reads work without a commit, but uncommitted writes are silently rolled back when the connection closes
  • Connection pool acquire() context manager required — pool.acquire() returns a connection, but it must be used as async with pool.acquire() as conn so the connection is returned to the pool; a bare conn = await pool.acquire() without a matching pool.release(conn) leaks connections until the pool exhausts
  • Parameterized queries use %s not ? — aiomysql uses the %s placeholder: execute('SELECT * FROM t WHERE id=%s', (id,)); NOT ? as in sqlite3 or $1 as in asyncpg; agent code migrating from other drivers must change every placeholder; a stray ? typically fails with "not all arguments converted during string formatting"
  • cursor.execute() args should be a sequence — await cur.execute(sql, (arg,)) with a tuple (a list also works); passing a bare scalar may work for a single %s but is fragile and breaks for multi-parameter queries; always wrap single args in a tuple: (value,), not value
  • fetchall() after large queries loads all into memory — await cursor.fetchall() on million-row query loads all rows into Python list; agent processing large MySQL tables must use SSCursor for server-side cursor and iterate with fetchone(); or use LIMIT/OFFSET pagination; regular cursor with fetchall() on large tables causes OOM
  • Pool must be closed gracefully — pool = await aiomysql.create_pool(...); app_shutdown must call: pool.close(); await pool.wait_closed(); FastAPI agent must add shutdown event: @app.on_event('shutdown') async def shutdown(): pool.close(); await pool.wait_closed(); without cleanup, pool connections leak and MySQL server accumulates dead connections

Alternatives

Full Evaluation Report

Detailed scoring breakdown, competitive positioning, security analysis, and improvement recommendations for aiomysql.

$99

Scores are editorial opinions as of 2026-03-06.
