{"id":"6a1dcf63-e592-4106-b066-dc7a268687e6","shortId":"3zE5LC","kind":"skill","title":"saq","tagline":"Auto-activate for saq imports, SAQ task queue configuration. SAQ (Simple Async Queue): async task queues, background jobs, cron scheduling, worker lifecycle. Produces SAQ task definitions, Worker setup, CronJob scheduling, and queue configuration. Use when: defining background ta","description":"# SAQ (Simple Async Queue) Skill\n\nSAQ is a lightweight async task queue built on asyncio. Supports Redis and Postgres backends. Designed for simplicity with async-native patterns — no separate broker process, no class-based tasks, just plain async functions.\n\n## Code Style Rules\n\n- Use PEP 604 for unions: `T | None` (not `Optional[T]`)\n- **Never** use `from __future__ import annotations`\n- Use Google-style docstrings\n- All task functions must be `async def`\n- First argument of every task function is always the context dict (`ctx`)\n\n## Quick Reference\n\n### Queue Creation\n\n```python\nfrom saq import Queue\n\n# Redis backend\nqueue = Queue.from_url(\"redis://localhost\")\n\n# Postgres backend\nqueue = Queue.from_url(\"postgresql+asyncpg://user:pass@localhost/db\")\n```\n\n### Task Definition\n\n```python\nasync def send_email(ctx: dict, *, recipient: str, subject: str, body: str) -> None:\n    \"\"\"Send an email as a background task.\n\n    Args:\n        ctx: SAQ context dict (contains queue, job, and custom startup keys).\n        recipient: Email recipient address.\n        subject: Email subject line.\n        body: Email body content.\n    \"\"\"\n    mailer = ctx[\"mailer\"]  # injected via startup hook\n    await mailer.send(recipient, subject, body)\n```\n\n### Enqueueing Jobs\n\n```python\n# Fire and forget\nawait queue.enqueue(\"send_email\", recipient=\"user@example.com\", subject=\"Hello\", body=\"World\")\n\n# Enqueue and wait for result\nresult = await queue.apply(\"send_email\", recipient=\"user@example.com\", subject=\"Hello\", 
body=\"World\")\n\n# With job options\nawait queue.enqueue(\n    \"send_email\",\n    recipient=\"user@example.com\",\n    subject=\"Hello\",\n    body=\"World\",\n    timeout=30,\n    retries=3,\n    ttl=3600,\n    key=\"email-user@example.com\",  # deduplication key\n)\n```\n\n### CronJob Scheduling\n\n```python\nfrom saq import CronJob\n\n# Run at the top of every hour\nhourly_report = CronJob(\n    function=generate_report,\n    cron=\"0 * * * *\",\n    timeout=300,\n)\n\n# Run every 15 minutes\nhealth_check = CronJob(\n    function=check_health,\n    cron=\"*/15 * * * *\",\n    timeout=60,\n    retries=1,\n)\n```\n\n### Worker Setup\n\n```python\nfrom saq import Worker\n\nworker = Worker(\n    queue,\n    functions=[send_email, process_order, generate_report],\n    cron_jobs=[hourly_report, health_check],\n    concurrency=10,\n    startup=startup_hook,\n    shutdown=shutdown_hook,\n    before_process=before_process_hook,\n    after_process=after_process_hook,\n)\n\n# Run the worker (blocks)\nimport asyncio\nasyncio.run(worker.start())\n```\n\n### Job Options Reference\n\n| Option | Type | Default | Description |\n|---|---|---|---|\n| `timeout` | `int` | `None` | Seconds before job times out. 
**Always set this.** |\n| `retries` | `int` | `0` | Number of retry attempts on failure |\n| `ttl` | `int` | `600` | Seconds to retain result after completion |\n| `key` | `str` | `None` | Deduplication key — skip if a job with this key is already queued/active |\n| `heartbeat` | `int` | `0` | Seconds between heartbeat updates (use for long-running jobs) |\n| `scheduled` | `int` | `0` | Unix timestamp to delay job start |\n\n### Job Lifecycle\n\n```text\nqueued → active → complete\n                → failed\n                → aborted\n```\n\n### Context Dict\n\nThe `ctx` dict passed to every task contains:\n\n- `ctx[\"queue\"]` — the `Queue` instance\n- `ctx[\"job\"]` — the current `Job` object\n- Any keys added by your `startup` hook (e.g., `ctx[\"db\"]`, `ctx[\"mailer\"]`)\n\n<workflow>\n\n## Workflow\n\n### Step 1: Define Task Functions\n\nWrite `async def` functions with `ctx: dict` as the first positional arg and all task parameters as keyword-only args (after `*`). Keep task functions focused — each task does one thing.\n\n### Step 2: Configure the Queue\n\nCreate a `Queue` using `Queue.from_url()` with your Redis or Postgres DSN. Store the queue instance where it can be shared across your app (module-level, app state, or DI container).\n\n### Step 3: Define Lifecycle Hooks\n\nWrite `startup` and `shutdown` hooks to initialize and clean up shared resources (DB pools, HTTP clients, mailers). Attach resources to `ctx` in `startup` so all tasks can access them.\n\n### Step 4: Schedule CronJobs\n\nWrap any recurring work in `CronJob` instances with explicit cron expressions and timeouts. Do not use external cron tools (crontab, Kubernetes CronJob) for work that belongs in the queue.\n\n### Step 5: Create and Run Worker\n\nInstantiate `Worker` with the queue, task functions, cron jobs, concurrency limit, and lifecycle hooks. 
Run with `asyncio.run(worker.start())` or integrate into your process manager.\n\n### Step 6: Enqueue from Application Code\n\nCall `queue.enqueue()` for fire-and-forget or `queue.apply()` when you need the result. Use the `key` parameter for natural deduplication (e.g., per-user jobs that should not stack).\n\n</workflow>\n\n<guardrails>\n\n## Guardrails\n\n- **Always set `timeout`** — the default is no timeout. A hung task will block a worker slot forever.\n- **Use `heartbeat` for long-running jobs** — without heartbeat, SAQ may mark a long-active job as stuck and re-queue it. Set heartbeat to roughly 1/3 of expected runtime.\n- **Use `CronJob` for scheduled work** — do not schedule SAQ tasks from external cron tools. CronJobs are managed by the worker and participate in the job lifecycle (retries, timeouts, observability).\n- **First arg is always `ctx`** — SAQ injects the context dict as the first positional argument. Keyword-only task params come after `*`.\n- **Handle graceful shutdown** — call `await worker.stop()` on SIGTERM/SIGINT. Abrupt process kills can leave jobs stranded in `active` state.\n- **Use `key` for deduplication** — if the same logical job can be enqueued multiple times (e.g., per-user sync), set a stable `key` to prevent stacking.\n- **Set appropriate `concurrency`** — default is 10. Lower for CPU/memory-intensive tasks, higher for I/O-bound tasks. 
Consider backend connection pool sizes.\n- **Do not share mutable state between tasks** — use the context dict (populated per-worker in `startup`) for shared resources like DB pools and HTTP clients.\n\n</guardrails>\n\n<validation>\n\n### Validation Checkpoint\n\nBefore delivering SAQ code, verify:\n\n- [ ] Every task function is `async def` with `ctx: dict` as the first positional arg\n- [ ] All task parameters are keyword-only (defined after `*`)\n- [ ] `timeout` is set on all long-running jobs and `CronJob` definitions\n- [ ] `heartbeat` is set for jobs that run longer than ~30 seconds\n- [ ] Shared resources (DB, HTTP client) are initialized in `startup` hook and attached to `ctx`\n- [ ] `CronJob` is used for scheduled/recurring work (not external cron)\n- [ ] `key` is used where job deduplication is needed\n- [ ] Worker handles graceful shutdown\n\n</validation>\n\n<example>\n\n## Example\n\n**Task:** Background email sender with startup hook, cron health check, and deduplication.\n\n```python\nimport asyncio\n\nimport httpx\nfrom saq import CronJob, Queue, Worker\n\n\n# --- Shared queue (module-level) ---\nqueue = Queue.from_url(\"redis://localhost\")\n\n\n# --- Lifecycle hooks ---\nasync def startup(ctx: dict) -> None:\n    \"\"\"Initialize shared resources and attach to context.\"\"\"\n    # Shared async HTTP client for sending email\n    ctx[\"http\"] = httpx.AsyncClient()\n\n\nasync def shutdown(ctx: dict) -> None:\n    \"\"\"Clean up shared resources.\"\"\"\n    await ctx[\"http\"].aclose()\n\n\n# --- Task definitions ---\nasync def send_welcome_email(ctx: dict, *, user_id: int, email: str) -> None:\n    \"\"\"Send a welcome email to a new user.\n\n    Args:\n        ctx: SAQ context dict.\n        user_id: ID of the new user.\n        email: Recipient email address.\n    \"\"\"\n    http: httpx.AsyncClient = ctx[\"http\"]\n    await http.post(\n        \"https://api.email-provider.com/send\",\n        json={\"to\": email, \"template\": \"welcome\", 
\"user_id\": user_id},\n    )\n\n\nasync def process_export(ctx: dict, *, export_id: int) -> dict:\n    \"\"\"Process a data export job.\n\n    Args:\n        ctx: SAQ context dict.\n        export_id: ID of the export record to process.\n\n    Returns:\n        Dict with export result metadata.\n    \"\"\"\n    # Long-running — heartbeat prevents SAQ from marking it stuck\n    job = ctx[\"job\"]\n    # ... processing logic ...\n    return {\"export_id\": export_id, \"rows\": 1000}\n\n\nasync def check_queue_health(ctx: dict) -> None:\n    \"\"\"Scheduled health check — logs queue stats.\"\"\"\n    q: Queue = ctx[\"queue\"]\n    info = await q.info()\n    print(f\"Queue stats: {info}\")\n\n\n# --- CronJob ---\nhealth_check = CronJob(\n    function=check_queue_health,\n    cron=\"*/5 * * * *\",\n    timeout=30,\n)\n\n\n# --- Worker ---\nworker = Worker(\n    queue,\n    functions=[send_welcome_email, process_export],\n    cron_jobs=[health_check],\n    concurrency=10,\n    startup=startup,\n    shutdown=shutdown,\n)\n\n\n# --- Enqueueing from application code ---\nasync def on_user_created(user_id: int, email: str) -> None:\n    await queue.enqueue(\n        \"send_welcome_email\",\n        user_id=user_id,\n        email=email,\n        timeout=30,\n        retries=2,\n        key=f\"welcome-{user_id}\",  # deduplicate: only one welcome email per user\n    )\n\n\nasync def start_export(export_id: int) -> None:\n    await queue.enqueue(\n        \"process_export\",\n        export_id=export_id,\n        timeout=600,\n        heartbeat=120,  # update heartbeat every 2 minutes\n        key=f\"export-{export_id}\",\n    )\n\n\nif __name__ == \"__main__\":\n    asyncio.run(worker.start())\n```\n\n</example>\n\n---\n\n## References Index\n\nFor detailed guides and patterns, refer to the following documents in `references/`:\n\n- **[Advanced Patterns](references/patterns.md)** -- Heartbeat management, dead letter handling, job chaining, queue priorities, worker lifecycle 
hooks, Postgres backend.\n\n---\n\n## Official References\n\n- <https://github.com/tobymao/saq>\n- <https://saq-py.readthedocs.io/en/latest/>\n- <https://pypi.org/project/saq/>\n\n## Cross-References\n\n- For Litestar integration (SAQPlugin, DI, web UI, CLI): see `flow:litestar` → litestar-saq section.\n\n## Shared Styleguide Baseline\n\n- Use shared styleguides for generic language/framework rules to reduce duplication in this skill.\n- [General Principles](https://github.com/cofin/flow/blob/main/templates/styleguides/general.md)\n- [Python](https://github.com/cofin/flow/blob/main/templates/styleguides/languages/python.md)\n- Keep this skill focused on tool-specific workflows, edge cases, and integration details.","tags":["saq","flow","cofin","agent-skills","ai-agents","beads","claude-code","codex","cursor","developer-tools","gemini-cli","opencode"],"capabilities":["skill","source-cofin","skill-saq","topic-agent-skills","topic-ai-agents","topic-beads","topic-claude-code","topic-codex","topic-cursor","topic-developer-tools","topic-gemini-cli","topic-opencode","topic-plugin","topic-slash-commands","topic-spec-driven-development"],"categories":["flow"],"synonyms":[],"warnings":[],"endpointUrl":"https://skills.sh/cofin/flow/saq","protocol":"skill","transport":"skills-sh","auth":{"type":"none","details":{"cli":"npx skills add cofin/flow","source_repo":"https://github.com/cofin/flow","install_from":"skills.sh"}},"qualityScore":"0.455","qualityRationale":"deterministic score 0.46 from registry signals: · indexed on github topic:agent-skills · 11 github stars · SKILL.md body (10,321 
chars)","verified":false,"liveness":"unknown","lastLivenessCheck":null,"agentReviews":{"count":0,"score_avg":null,"cost_usd_avg":null,"success_rate":null,"latency_p50_ms":null,"narrative_summary":null,"summary_updated_at":null},"enrichmentModel":"deterministic:skill-github:v1","enrichmentVersion":1,"enrichedAt":"2026-04-24T07:03:19.775Z","embedding":null,"createdAt":"2026-04-23T13:04:01.404Z","updatedAt":"2026-04-24T07:03:19.775Z","lastSeenAt":"2026-04-24T07:03:19.775Z","tsv":"
'write':472,545","prices":[{"id":"e3495259-7e6a-4f0a-8e1a-9253c871988c","listingId":"6a1dcf63-e592-4106-b066-dc7a268687e6","amountUsd":"0","unit":"free","nativeCurrency":null,"nativeAmount":null,"chain":null,"payTo":null,"paymentMethod":"skill-free","isPrimary":true,"details":{"org":"cofin","category":"flow","install_from":"skills.sh"},"createdAt":"2026-04-23T13:04:01.404Z"}],"sources":[{"listingId":"6a1dcf63-e592-4106-b066-dc7a268687e6","source":"github","sourceId":"cofin/flow/saq","sourceUrl":"https://github.com/cofin/flow/tree/main/skills/saq","isPrimary":false,"firstSeenAt":"2026-04-23T13:04:01.404Z","lastSeenAt":"2026-04-24T07:03:19.775Z"}],"details":{"listingId":"6a1dcf63-e592-4106-b066-dc7a268687e6","quickStartSnippet":null,"exampleRequest":null,"exampleResponse":null,"schema":null,"openapiUrl":null,"agentsTxtUrl":null,"citations":[],"useCases":[],"bestFor":[],"notFor":[],"kindDetails":{"org":"cofin","slug":"saq","github":{"repo":"cofin/flow","stars":11,"topics":["agent-skills","ai-agents","beads","claude-code","codex","context-driven-development","cursor","developer-tools","gemini-cli","opencode","plugin","slash-commands","spec-driven-development","subagents","tdd","workflow"],"license":"apache-2.0","html_url":"https://github.com/cofin/flow","pushed_at":"2026-04-19T23:22:27Z","description":"Context-Driven Development toolkit for AI agents — spec-first planning, TDD workflow, and Beads integration.","skill_md_sha":"1cf2e05991a38c566fc1a425fefa03e41c42fc17","skill_md_path":"skills/saq/SKILL.md","default_branch":"main","skill_tree_url":"https://github.com/cofin/flow/tree/main/skills/saq"},"layout":"multi","source":"github","category":"flow","frontmatter":{"name":"saq","description":"Auto-activate for saq imports, SAQ task queue configuration. SAQ (Simple Async Queue): async task queues, background jobs, cron scheduling, worker lifecycle. Produces SAQ task definitions, Worker setup, CronJob scheduling, and queue configuration. 
Use when: defining background tasks, enqueueing jobs, scheduling cron work, or managing worker lifecycle. Not for Celery, RQ, or Dramatiq -- SAQ has its own async-native patterns. For Litestar integration (SAQPlugin, DI, web UI, CLI), see flow:litestar."},"skills_sh_url":"https://skills.sh/cofin/flow/saq"},"updatedAt":"2026-04-24T07:03:19.775Z"}}