{"id":"642fa4c5-5e7e-43e4-8b0b-e7bc0b874429","shortId":"U54cGA","kind":"skill","title":"hugging-face-jobs","tagline":"Run workloads on Hugging Face Jobs with managed CPUs, GPUs, TPUs, secrets, and Hub persistence.","description":"# Running Workloads on Hugging Face Jobs\n\n## Overview\n\nRun any workload on fully managed Hugging Face infrastructure. No local setup required—jobs run on cloud CPUs, GPUs, or TPUs and can persist results to the Hugging Face Hub.\n\n**Common use cases:**\n- **Data Processing** - Transform, filter, or analyze large datasets\n- **Batch Inference** - Run inference on thousands of samples\n- **Experiments & Benchmarks** - Reproducible ML experiments\n- **Model Training** - Fine-tune models (see `model-trainer` skill for TRL-specific training)\n- **Synthetic Data Generation** - Generate datasets using LLMs\n- **Development & Testing** - Test code without local GPU setup\n- **Scheduled Jobs** - Automate recurring tasks\n\n**For model training specifically:** See the `model-trainer` skill for TRL-based training workflows.\n\n## When to Use This Skill\n\nUse this skill when users want to:\n- Run Python workloads on cloud infrastructure\n- Execute jobs without local GPU/TPU setup\n- Process data at scale\n- Run batch inference or experiments\n- Schedule recurring tasks\n- Use GPUs/TPUs for any workload\n- Persist results to the Hugging Face Hub\n\n## Key Directives\n\nWhen assisting with jobs:\n\n1. **ALWAYS use `hf_jobs()` MCP tool** - Submit jobs using `hf_jobs(\"uv\", {...})` or `hf_jobs(\"run\", {...})`. The `script` parameter accepts Python code directly. Do NOT save to local files unless the user explicitly requests it. Pass the script content as a string to `hf_jobs()`.\n\n2. **Always handle authentication** - Jobs that interact with the Hub require `HF_TOKEN` via secrets. See Token Usage section below.\n\n3. 
**Provide job details after submission** - After submitting, provide job ID, monitoring URL, estimated time, and note that the user can request status checks later.\n\n4. **Set appropriate timeouts** - Default 30min may be insufficient for long-running tasks.\n\n## Prerequisites Checklist\n\nBefore starting any job, verify:\n\n### ✅ **Account & Authentication**\n- Hugging Face Account with [Pro](https://hf.co/pro), [Team](https://hf.co/enterprise), or [Enterprise](https://hf.co/enterprise) plan (Jobs require paid plan)\n- Authenticated login: Check with `hf_whoami()`\n- **HF_TOKEN for Hub Access** ⚠️ CRITICAL - Required for any Hub operations (push models/datasets, download private repos, etc.)\n- Token must have appropriate permissions (read for downloads, write for uploads)\n\n### ✅ **Token Usage** (See Token Usage section for details)\n\n**When tokens are required:**\n- Pushing models/datasets to Hub\n- Accessing private repositories\n- Using Hub APIs in scripts\n- Any authenticated Hub operations\n\n**How to provide tokens:**\n```python\n# hf_jobs MCP tool — $HF_TOKEN is auto-replaced with real token:\n{\"secrets\": {\"HF_TOKEN\": \"$HF_TOKEN\"}}\n\n# HfApi().run_uv_job() — MUST pass actual token:\nfrom huggingface_hub import get_token\nsecrets={\"HF_TOKEN\": get_token()}\n```\n\n**⚠️ CRITICAL:** The `$HF_TOKEN` placeholder is ONLY auto-replaced by the `hf_jobs` MCP tool. When using `HfApi().run_uv_job()`, you MUST pass the real token via `get_token()`. 
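
A quick way to see why the placeholder fails outside the MCP tool (plain Python sketch; the commented pattern uses the real `get_token` helper from `huggingface_hub`):

```python
# The literal placeholder is a 9-character string, not a credential
placeholder = '$HF_TOKEN'
assert len(placeholder) == 9
assert not placeholder.startswith('hf_')  # real tokens start with hf_

# Correct Python-API pattern (shape only; get_token comes from huggingface_hub):
#   from huggingface_hub import get_token
#   secrets = {'HF_TOKEN': get_token()}
```
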
Passing the literal string `\"$HF_TOKEN\"` results in a 9-character invalid token and 401 errors.\n\n## Token Usage Guide\n\n### Understanding Tokens\n\n**What are HF Tokens?**\n- Authentication credentials for Hugging Face Hub\n- Required for authenticated operations (push, private repos, API access)\n- Stored securely on your machine after `hf auth login`\n\n**Token Types:**\n- **Read Token** - Can download models/datasets, read private repos\n- **Write Token** - Can push models/datasets, create repos, modify content\n- **Organization Token** - Can act on behalf of an organization\n\n### When Tokens Are Required\n\n**Always Required:**\n- Pushing models/datasets to Hub\n- Accessing private repositories\n- Creating new repositories\n- Modifying existing repositories\n- Using Hub APIs programmatically\n\n**Not Required:**\n- Downloading public models/datasets\n- Running jobs that don't interact with Hub\n- Reading public repository information\n\n### How to Provide Tokens to Jobs\n\n#### Method 1: Automatic Token (Recommended)\n\n```python\nhf_jobs(\"uv\", {\n    \"script\": \"your_script.py\",\n    \"secrets\": {\"HF_TOKEN\": \"$HF_TOKEN\"}  # ✅ Automatic replacement\n})\n```\n\n**How it works:**\n- `$HF_TOKEN` is a placeholder that gets replaced with your actual token\n- Uses the token from your logged-in session (`hf auth login`)\n- Most secure and convenient method\n- Token is encrypted server-side when passed as a secret\n\n**Benefits:**\n- No token exposure in code\n- Uses your current login session\n- Automatically updated if you re-login\n- Works seamlessly with MCP tools\n\n#### Method 2: Explicit Token (Not Recommended)\n\n```python\nhf_jobs(\"uv\", {\n    \"script\": \"your_script.py\",\n    \"secrets\": {\"HF_TOKEN\": \"hf_abc123...\"}  # ⚠️ Hardcoded token\n})\n```\n\n**When to use:**\n- Only if automatic token doesn't work\n- Testing with a specific token\n- Organization tokens (use with caution)\n\n**Security concerns:**\n- Token visible in 
code/logs\n- Must manually update if token rotates\n- Risk of token exposure\n\n#### Method 3: Environment Variable (Less Secure)\n\n```python\nhf_jobs(\"uv\", {\n    \"script\": \"your_script.py\",\n    \"env\": {\"HF_TOKEN\": \"hf_abc123...\"}  # ⚠️ Less secure than secrets\n})\n```\n\n**Difference from secrets:**\n- `env` variables are visible in job logs\n- `secrets` are encrypted server-side\n- Always prefer `secrets` for tokens\n\n### Using Tokens in Scripts\n\n**In your Python script, tokens are available as environment variables:**\n\n```python\n# /// script\n# dependencies = [\"huggingface-hub\"]\n# ///\n\nimport os\nfrom huggingface_hub import HfApi\n\n# Token is automatically available if passed via secrets\ntoken = os.environ.get(\"HF_TOKEN\")\n\n# Use with Hub API\napi = HfApi(token=token)\n\n# Or let huggingface_hub auto-detect\napi = HfApi()  # Automatically uses HF_TOKEN env var\n```\n\n**Best practices:**\n- Don't hardcode tokens in scripts\n- Use `os.environ.get(\"HF_TOKEN\")` to access\n- Let `huggingface_hub` auto-detect when possible\n- Verify token exists before Hub operations\n\n### Token Verification\n\n**Check if you're logged in:**\n```python\nfrom huggingface_hub import whoami\nuser_info = whoami()  # Returns your username if authenticated\n```\n\n**Verify token in job:**\n```python\nimport os\nassert \"HF_TOKEN\" in os.environ, \"HF_TOKEN not found!\"\ntoken = os.environ[\"HF_TOKEN\"]\nprint(f\"Token starts with: {token[:7]}...\")  # Should start with \"hf_\"\n```\n\n### Common Token Issues\n\n**Error: 401 Unauthorized**\n- **Cause:** Token missing or invalid\n- **Fix:** Add `secrets={\"HF_TOKEN\": \"$HF_TOKEN\"}` to job config\n- **Verify:** Check `hf_whoami()` works locally\n\n**Error: 403 Forbidden**\n- **Cause:** Token lacks required permissions\n- **Fix:** Ensure token has write permissions for push operations\n- **Check:** Token type at https://huggingface.co/settings/tokens\n\n**Error: Token not found in environment**\n- 
**Cause:** `secrets` not passed or wrong key name\n- **Fix:** Use `secrets={\"HF_TOKEN\": \"$HF_TOKEN\"}` (not `env`)\n- **Verify:** Script checks `os.environ.get(\"HF_TOKEN\")`\n\n**Error: Repository access denied**\n- **Cause:** Token doesn't have access to private repo\n- **Fix:** Use token from account with access\n- **Check:** Verify repo visibility and your permissions\n\n### Token Security Best Practices\n\n1. **Never commit tokens** - Use `$HF_TOKEN` placeholder or environment variables\n2. **Use secrets, not env** - Secrets are encrypted server-side\n3. **Rotate tokens regularly** - Generate new tokens periodically\n4. **Use minimal permissions** - Create tokens with only needed permissions\n5. **Don't share tokens** - Each user should use their own token\n6. **Monitor token usage** - Check token activity in Hub settings\n\n### Complete Token Example\n\n```python\n# Example: Push results to Hub\nhf_jobs(\"uv\", {\n    \"script\": \"\"\"\n# /// script\n# dependencies = [\"huggingface-hub\", \"datasets\"]\n# ///\n\nimport os\nfrom huggingface_hub import HfApi\nfrom datasets import Dataset\n\n# Verify token is available\nassert \"HF_TOKEN\" in os.environ, \"HF_TOKEN required!\"\n\n# Use token for Hub operations\napi = HfApi(token=os.environ[\"HF_TOKEN\"])\n\n# Create and push dataset\ndata = {\"text\": [\"Hello\", \"World\"]}\ndataset = Dataset.from_dict(data)\ndataset.push_to_hub(\"username/my-dataset\", token=os.environ[\"HF_TOKEN\"])\n\nprint(\"✅ Dataset pushed successfully!\")\n\"\"\",\n    \"flavor\": \"cpu-basic\",\n    \"timeout\": \"30m\",\n    \"secrets\": {\"HF_TOKEN\": \"$HF_TOKEN\"}  # ✅ Token provided securely\n})\n```\n\n## Quick Start: Two Approaches\n\n### Approach 1: UV Scripts (Recommended)\n\nUV scripts use PEP 723 inline dependencies for clean, self-contained workloads.\n\n**MCP Tool:**\n```python\nhf_jobs(\"uv\", {\n    \"script\": \"\"\"\n# /// script\n# dependencies = [\"transformers\", \"torch\"]\n# ///\n\nfrom transformers import 
pipeline\nimport torch\n\n# Your workload here\nclassifier = pipeline(\"sentiment-analysis\")\nresult = classifier(\"I love Hugging Face!\")\nprint(result)\n\"\"\",\n    \"flavor\": \"cpu-basic\",\n    \"timeout\": \"30m\"\n})\n```\n\n**CLI Equivalent:**\n```bash\nhf jobs uv run my_script.py --flavor cpu-basic --timeout 30m\n```\n\n**Python API:**\n```python\nfrom huggingface_hub import run_uv_job\nrun_uv_job(\"my_script.py\", flavor=\"cpu-basic\", timeout=\"30m\")\n```\n\n**Benefits:** Direct MCP tool usage, clean code, dependencies declared inline, no file saving required\n\n**When to use:** Default choice for all workloads, custom logic, any scenario requiring `hf_jobs()`\n\n#### Custom Docker Images for UV Scripts\n\nBy default, UV scripts use `ghcr.io/astral-sh/uv:python3.12-bookworm-slim`. For ML workloads with complex dependencies, use pre-built images:\n\n```python\nhf_jobs(\"uv\", {\n    \"script\": \"inference.py\",\n    \"image\": \"vllm/vllm-openai:latest\",  # Pre-built image with vLLM\n    \"flavor\": \"a10g-large\"\n})\n```\n\n**CLI:**\n```bash\nhf jobs uv run --image vllm/vllm-openai:latest --flavor a10g-large inference.py\n```\n\n**Benefits:** Faster startup, pre-installed dependencies, optimized for specific frameworks\n\n#### Python Version\n\nBy default, UV scripts use Python 3.12. Specify a different version:\n\n```python\nhf_jobs(\"uv\", {\n    \"script\": \"my_script.py\",\n    \"python\": \"3.11\",  # Use Python 3.11\n    \"flavor\": \"cpu-basic\"\n})\n```\n\n**Python API:**\n```python\nfrom huggingface_hub import run_uv_job\nrun_uv_job(\"my_script.py\", python=\"3.11\")\n```\n\n#### Working with Scripts\n\n⚠️ **Important:** There are *two* \"script path\" stories depending on how you run Jobs:\n\n- **Using the `hf_jobs()` MCP tool (recommended in this repo)**: the `script` value must be **inline code** (a string) or a **URL**. 
A local filesystem path (like `\"./scripts/foo.py\"`) won't exist inside the remote container.\n- **Using the `hf jobs uv run` CLI**: local file paths **do work** (the CLI uploads your script).\n\n**Common mistake with `hf_jobs()` MCP tool:**\n\n```python\n# ❌ Will fail (remote container can't see your local path)\nhf_jobs(\"uv\", {\"script\": \"./scripts/foo.py\"})\n```\n\n**Correct patterns with `hf_jobs()` MCP tool:**\n\n```python\n# ✅ Inline: read the local script file and pass its *contents*\nfrom pathlib import Path\nscript = Path(\"hf-jobs/scripts/foo.py\").read_text()\nhf_jobs(\"uv\", {\"script\": script})\n\n# ✅ URL: host the script somewhere reachable\nhf_jobs(\"uv\", {\"script\": \"https://huggingface.co/datasets/uv-scripts/.../raw/main/foo.py\"})\n\n# ✅ URL from GitHub\nhf_jobs(\"uv\", {\"script\": \"https://raw.githubusercontent.com/huggingface/trl/main/trl/scripts/sft.py\"})\n```\n\n**CLI equivalent (local paths supported):**\n\n```bash\nhf jobs uv run ./scripts/foo.py -- --your --args\n```\n\n#### Adding Dependencies at Runtime\n\nAdd extra dependencies beyond what's in the PEP 723 header:\n\n```python\nhf_jobs(\"uv\", {\n    \"script\": \"inference.py\",\n    \"dependencies\": [\"transformers\", \"torch>=2.0\"],  # Extra deps\n    \"flavor\": \"a10g-small\"\n})\n```\n\n**Python API:**\n```python\nfrom huggingface_hub import run_uv_job\nrun_uv_job(\"inference.py\", dependencies=[\"transformers\", \"torch>=2.0\"])\n```\n\n### Approach 2: Docker-Based Jobs\n\nRun jobs with custom Docker images and commands.\n\n**MCP Tool:**\n```python\nhf_jobs(\"run\", {\n    \"image\": \"python:3.12\",\n    \"command\": [\"python\", \"-c\", \"print('Hello from HF Jobs!')\"],\n    \"flavor\": \"cpu-basic\",\n    \"timeout\": \"30m\"\n})\n```\n\n**CLI Equivalent:**\n```bash\nhf jobs run python:3.12 python -c \"print('Hello from HF Jobs!')\"\n```\n\n**Python API:**\n```python\nfrom huggingface_hub import run_job\nrun_job(image=\"python:3.12\", command=[\"python\", 
\"-c\", \"print('Hello!')\"], flavor=\"cpu-basic\")\n```\n\n**Benefits:** Full Docker control, use pre-built images, run any command\n**When to use:** Need specific Docker images, non-Python workloads, complex environments\n\n**Example with GPU:**\n```python\nhf_jobs(\"run\", {\n    \"image\": \"pytorch/pytorch:2.6.0-cuda12.4-cudnn9-devel\",\n    \"command\": [\"python\", \"-c\", \"import torch; print(torch.cuda.get_device_name())\"],\n    \"flavor\": \"a10g-small\",\n    \"timeout\": \"1h\"\n})\n```\n\n**Using Hugging Face Spaces as Images:**\n\nYou can use Docker images from HF Spaces:\n```python\nhf_jobs(\"run\", {\n    \"image\": \"hf.co/spaces/lhoestq/duckdb\",  # Space as Docker image\n    \"command\": [\"duckdb\", \"-c\", \"SELECT 'Hello from DuckDB!'\"],\n    \"flavor\": \"cpu-basic\"\n})\n```\n\n**CLI:**\n```bash\nhf jobs run hf.co/spaces/lhoestq/duckdb duckdb -c \"SELECT 'Hello!'\"\n```\n\n### Finding More UV Scripts on Hub\n\nThe `uv-scripts` organization provides ready-to-use UV scripts stored as datasets on Hugging Face Hub:\n\n```python\n# Discover available UV script collections\ndataset_search({\"author\": \"uv-scripts\", \"sort\": \"downloads\", \"limit\": 20})\n\n# Explore a specific collection\nhub_repo_details([\"uv-scripts/classification\"], repo_type=\"dataset\", include_readme=True)\n```\n\n**Popular collections:** OCR, classification, synthetic-data, vLLM, dataset-creation\n\n## Hardware Selection\n\n> **Reference:** [HF Jobs Hardware Docs](https://huggingface.co/docs/hub/en/spaces-config-reference) (updated 07/2025)\n\n| Workload Type | Recommended Hardware | Use Case |\n|---------------|---------------------|----------|\n| Data processing, testing | `cpu-basic`, `cpu-upgrade` | Lightweight tasks |\n| Small models, demos | `t4-small` | <1B models, quick tests |\n| Medium models | `t4-medium`, `l4x1` | 1-7B models |\n| Large models, production | `a10g-small`, `a10g-large` | 7-13B models |\n| Very large models | `a100-large` | 13B+ models 
|\n| Batch inference | `a10g-large`, `a100-large` | High-throughput |\n| Multi-GPU workloads | `l4x4`, `a10g-largex2`, `a10g-largex4` | Parallel/large models |\n| TPU workloads | `v5e-1x1`, `v5e-2x2`, `v5e-2x4` | JAX/Flax, TPU-optimized |\n\n**All Available Flavors:**\n- **CPU:** `cpu-basic`, `cpu-upgrade`\n- **GPU:** `t4-small`, `t4-medium`, `l4x1`, `l4x4`, `a10g-small`, `a10g-large`, `a10g-largex2`, `a10g-largex4`, `a100-large`\n- **TPU:** `v5e-1x1`, `v5e-2x2`, `v5e-2x4`\n\n**Guidelines:**\n- Start with smaller hardware for testing\n- Scale up based on actual needs\n- Use multi-GPU for parallel workloads or large models\n- Use TPUs for JAX/Flax workloads\n- See `references/hardware_guide.md` for detailed specifications\n\n## Critical: Saving Results\n\n**⚠️ EPHEMERAL ENVIRONMENT—MUST PERSIST RESULTS**\n\nThe Jobs environment is temporary. All files are deleted when the job ends. If results aren't persisted, **ALL WORK IS LOST**.\n\n### Persistence Options\n\n**1. Push to Hugging Face Hub (Recommended)**\n\n```python\n# Push models\nmodel.push_to_hub(\"username/model-name\", token=os.environ[\"HF_TOKEN\"])\n\n# Push datasets\ndataset.push_to_hub(\"username/dataset-name\", token=os.environ[\"HF_TOKEN\"])\n\n# Push artifacts\napi.upload_file(\n    path_or_fileobj=\"results.json\",\n    path_in_repo=\"results.json\",\n    repo_id=\"username/results\",\n    token=os.environ[\"HF_TOKEN\"]\n)\n```\n\n**2. Use External Storage**\n\n```python\n# Upload to S3, GCS, etc.\nimport boto3\ns3 = boto3.client('s3')\ns3.upload_file('results.json', 'my-bucket', 'results.json')\n```\n\n**3. 
Send Results via API**\n\n```python\n# POST results to your API\nimport requests\nrequests.post(\"https://your-api.com/results\", json=results)\n```\n\n### Required Configuration for Hub Push\n\n**In job submission:**\n```python\n# hf_jobs MCP tool:\n{\"secrets\": {\"HF_TOKEN\": \"$HF_TOKEN\"}}  # auto-replaced\n\n# HfApi().run_uv_job():\nfrom huggingface_hub import get_token\nsecrets={\"HF_TOKEN\": get_token()}  # must pass real token\n```\n\n**In script:**\n```python\nimport os\nfrom huggingface_hub import HfApi\n\n# Token automatically available from secrets\napi = HfApi(token=os.environ.get(\"HF_TOKEN\"))\n\n# Push your results\napi.upload_file(...)\n```\n\n### Verification Checklist\n\nBefore submitting:\n- [ ] Results persistence method chosen\n- [ ] Token in secrets if using Hub (MCP: `\"$HF_TOKEN\"`, Python API: `get_token()`)\n- [ ] Script handles missing token gracefully\n- [ ] Test persistence path works\n\n**See:** `references/hub_saving.md` for detailed Hub persistence guide\n\n## Timeout Management\n\n**⚠️ DEFAULT: 30 MINUTES**\n\nJobs automatically stop after the timeout. 
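
Picking a value with a safety margin can be sketched with a small stdlib helper (the function name and the 25% default are illustrative, not part of the Jobs API):

```python
import math

def timeout_with_buffer(expected_seconds: float, buffer: float = 0.25) -> str:
    # Round the expected runtime, plus a safety buffer, up to whole minutes
    minutes = math.ceil(expected_seconds * (1 + buffer) / 60)
    return str(minutes) + 'm'

print(timeout_with_buffer(3600))  # one hour of expected work -> '75m'
```
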
For long-running tasks like training, always set a custom timeout.\n\n### Setting Timeouts\n\n**MCP Tool:**\n```python\n{\n    \"timeout\": \"2h\"   # 2 hours\n}\n```\n\n**Supported formats:**\n- Integer/float: seconds (e.g., `300` = 5 minutes)\n- String with suffix: `\"5m\"` (minutes), `\"2h\"` (hours), `\"1d\"` (days)\n- Examples: `\"90m\"`, `\"2h\"`, `\"1.5h\"`, `300`, `\"1d\"`\n\n**Python API:**\n```python\nfrom huggingface_hub import run_job, run_uv_job\n\nrun_job(image=\"python:3.12\", command=[...], timeout=\"2h\")\nrun_uv_job(\"script.py\", timeout=7200)  # 2 hours in seconds\n```\n\n### Timeout Guidelines\n\n| Scenario | Recommended | Notes |\n|----------|-------------|-------|\n| Quick test | 10-30 min | Verify setup |\n| Data processing | 1-2 hours | Depends on data size |\n| Batch inference | 2-4 hours | Large batches |\n| Experiments | 4-8 hours | Multiple runs |\n| Long-running | 8-24 hours | Production workloads |\n\n**Always add 20-30% buffer** for setup, network delays, and cleanup.\n\n**On timeout:** Job killed immediately, all unsaved progress lost\n\n## Cost Estimation\n\n**General guidelines:**\n\n```\nTotal Cost = (Hours of runtime) × (Cost per hour)\n```\n\n**Example calculations:**\n\n**Quick test:**\n- Hardware: cpu-basic ($0.10/hour)\n- Time: 15 minutes (0.25 hours)\n- Cost: $0.03\n\n**Data processing:**\n- Hardware: l4x1 ($2.50/hour)\n- Time: 2 hours\n- Cost: $5.00\n\n**Batch inference:**\n- Hardware: a10g-large ($5/hour)\n- Time: 4 hours\n- Cost: $20.00\n\n**Cost optimization tips:**\n1. Start small - Test on cpu-basic or t4-small\n2. Monitor runtime - Set appropriate timeouts\n3. Use checkpoints - Resume if job fails\n4. Optimize code - Reduce unnecessary compute\n5. 
Choose right hardware - Don't over-provision\n\n## Monitoring and Tracking\n\n### Check Job Status\n\n**MCP Tool:**\n```python\n# List all jobs\nhf_jobs(\"ps\")\n\n# Inspect specific job\nhf_jobs(\"inspect\", {\"job_id\": \"your-job-id\"})\n\n# View logs\nhf_jobs(\"logs\", {\"job_id\": \"your-job-id\"})\n\n# Cancel a job\nhf_jobs(\"cancel\", {\"job_id\": \"your-job-id\"})\n```\n\n**Python API:**\n```python\nfrom huggingface_hub import list_jobs, inspect_job, fetch_job_logs, cancel_job\n\n# List your jobs\njobs = list_jobs()\n\n# List running jobs only\nrunning = [j for j in list_jobs() if j.status.stage == \"RUNNING\"]\n\n# Inspect specific job\njob_info = inspect_job(job_id=\"your-job-id\")\n\n# View logs\nfor log in fetch_job_logs(job_id=\"your-job-id\"):\n    print(log)\n\n# Cancel a job\ncancel_job(job_id=\"your-job-id\")\n```\n\n**CLI:**\n```bash\nhf jobs ps                    # List jobs\nhf jobs logs <job-id>         # View logs\nhf jobs cancel <job-id>       # Cancel job\n```\n\n**Remember:** Wait for user to request status checks. 
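
When reporting a submission back to the user, the monitoring URL can be composed from the job ID (a trivial illustrative helper; the argument values are placeholders):

```python
def job_url(username: str, job_id: str) -> str:
    # Matches the pattern https://huggingface.co/jobs/username/job-id
    return 'https://huggingface.co/jobs/' + username + '/' + job_id

print(job_url('your-username', 'your-job-id'))
```
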
Avoid polling repeatedly.\n\n### Job URLs\n\nAfter submission, jobs have monitoring URLs:\n```\nhttps://huggingface.co/jobs/username/job-id\n```\n\nView logs, status, and details in the browser.\n\n### Wait for Multiple Jobs\n\n```python\nimport time\nfrom huggingface_hub import inspect_job, run_job\n\n# Run multiple jobs\njobs = [run_job(image=img, command=cmd) for img, cmd in workloads]\n\n# Wait for all to complete\nfor job in jobs:\n    while inspect_job(job_id=job.id).status.stage not in (\"COMPLETED\", \"ERROR\"):\n        time.sleep(10)\n```\n\n## Scheduled Jobs\n\nRun jobs on a schedule using CRON expressions or predefined schedules.\n\n**MCP Tool:**\n```python\n# Schedule a UV script that runs every hour\nhf_jobs(\"scheduled uv\", {\n    \"script\": \"your_script.py\",\n    \"schedule\": \"@hourly\",\n    \"flavor\": \"cpu-basic\"\n})\n\n# Schedule with CRON syntax\nhf_jobs(\"scheduled uv\", {\n    \"script\": \"your_script.py\",\n    \"schedule\": \"0 9 * * 1\",  # 9 AM every Monday\n    \"flavor\": \"cpu-basic\"\n})\n\n# Schedule a Docker-based job\nhf_jobs(\"scheduled run\", {\n    \"image\": \"python:3.12\",\n    \"command\": [\"python\", \"-c\", \"print('Scheduled!')\"],\n    \"schedule\": \"@daily\",\n    \"flavor\": \"cpu-basic\"\n})\n```\n\n**Python API:**\n```python\nfrom huggingface_hub import create_scheduled_job, create_scheduled_uv_job\n\n# Schedule a Docker job\ncreate_scheduled_job(\n    image=\"python:3.12\",\n    command=[\"python\", \"-c\", \"print('Running on schedule!')\"],\n    schedule=\"@hourly\"\n)\n\n# Schedule a UV script\ncreate_scheduled_uv_job(\"my_script.py\", schedule=\"@daily\", flavor=\"cpu-basic\")\n\n# Schedule with GPU\ncreate_scheduled_uv_job(\n    \"ml_inference.py\",\n    schedule=\"0 */6 * * *\",  # Every 6 hours\n    flavor=\"a10g-small\"\n)\n```\n\n**Available schedules:**\n- `@annually`, `@yearly` - Once per year\n- `@monthly` - Once per month\n- `@weekly` - Once per week\n- `@daily` - Once per day\n- `@hourly` - 
Once per hour\n- CRON expression - Custom schedule (e.g., `\"*/5 * * * *\"` for every 5 minutes)\n\n**Manage scheduled jobs:**\n```python\n# MCP Tool\nhf_jobs(\"scheduled ps\")                              # List scheduled jobs\nhf_jobs(\"scheduled inspect\", {\"job_id\": \"...\"})     # Inspect details\nhf_jobs(\"scheduled suspend\", {\"job_id\": \"...\"})     # Pause\nhf_jobs(\"scheduled resume\", {\"job_id\": \"...\"})      # Resume\nhf_jobs(\"scheduled delete\", {\"job_id\": \"...\"})      # Delete\n```\n\n**Python API for management:**\n```python\nfrom huggingface_hub import (\n    list_scheduled_jobs,\n    inspect_scheduled_job,\n    suspend_scheduled_job,\n    resume_scheduled_job,\n    delete_scheduled_job\n)\n\n# List all scheduled jobs\nscheduled = list_scheduled_jobs()\n\n# Inspect a scheduled job\ninfo = inspect_scheduled_job(scheduled_job_id)\n\n# Suspend (pause) a scheduled job\nsuspend_scheduled_job(scheduled_job_id)\n\n# Resume a scheduled job\nresume_scheduled_job(scheduled_job_id)\n\n# Delete a scheduled job\ndelete_scheduled_job(scheduled_job_id)\n```\n\n## Webhooks: Trigger Jobs on Events\n\nTrigger jobs automatically when changes happen in Hugging Face repositories.\n\n**Python API:**\n```python\nfrom huggingface_hub import create_webhook\n\n# Create webhook that triggers a job when a repo changes\nwebhook = create_webhook(\n    job_id=job.id,\n    watched=[\n        {\"type\": \"user\", \"name\": \"your-username\"},\n        {\"type\": \"org\", \"name\": \"your-org-name\"}\n    ],\n    domains=[\"repo\", \"discussion\"],\n    secret=\"your-secret\"\n)\n```\n\n**How it works:**\n1. Webhook listens for changes in watched repositories\n2. When triggered, the job runs with `WEBHOOK_PAYLOAD` environment variable\n3. 
Your script can parse the payload to understand what changed\n\n**Use cases:**\n- Auto-process new datasets when uploaded\n- Trigger inference when models are updated\n- Run tests when code changes\n- Generate reports on repository activity\n\n**Access webhook payload in script:**\n```python\nimport os\nimport json\n\npayload = json.loads(os.environ.get(\"WEBHOOK_PAYLOAD\", \"{}\"))\nprint(f\"Event type: {payload.get('event', {}).get('action')}\")\n```\n\nSee [Webhooks Documentation](https://huggingface.co/docs/huggingface_hub/guides/webhooks) for more details.\n\n## Common Workload Patterns\n\nThis repository ships ready-to-run UV scripts in `hf-jobs/scripts/`. Prefer using them instead of inventing new templates.\n\n### Pattern 1: Dataset → Model Responses (vLLM) — `scripts/generate-responses.py`\n\n**What it does:** loads a Hub dataset (chat `messages` or a `prompt` column), applies a model chat template, generates responses with vLLM, and **pushes** the output dataset + dataset card back to the Hub.\n\n**Requires:** GPU + **write** token (it pushes a dataset).\n\n```python\nfrom pathlib import Path\n\nscript = Path(\"hf-jobs/scripts/generate-responses.py\").read_text()\nhf_jobs(\"uv\", {\n    \"script\": script,\n    \"script_args\": [\n        \"username/input-dataset\",\n        \"username/output-dataset\",\n        \"--messages-column\", \"messages\",\n        \"--model-id\", \"Qwen/Qwen3-30B-A3B-Instruct-2507\",\n        \"--temperature\", \"0.7\",\n        \"--top-p\", \"0.8\",\n        \"--max-tokens\", \"2048\",\n    ],\n    \"flavor\": \"a10g-large\",\n    \"timeout\": \"4h\",\n    \"secrets\": {\"HF_TOKEN\": \"$HF_TOKEN\"},\n})\n```\n\n### Pattern 2: CoT Self-Instruct Synthetic Data — `scripts/cot-self-instruct.py`\n\n**What it does:** generates synthetic prompts/answers via CoT Self-Instruct, optionally filters outputs (answer-consistency / RIP), then **pushes** the generated dataset + dataset card to the Hub.\n\n**Requires:** GPU + **write** token 
(it pushes a dataset).\n\n```python\nfrom pathlib import Path\n\nscript = Path(\"hf-jobs/scripts/cot-self-instruct.py\").read_text()\nhf_jobs(\"uv\", {\n    \"script\": script,\n    \"script_args\": [\n        \"--seed-dataset\", \"davanstrien/s1k-reasoning\",\n        \"--output-dataset\", \"username/synthetic-math\",\n        \"--task-type\", \"reasoning\",\n        \"--num-samples\", \"5000\",\n        \"--filter-method\", \"answer-consistency\",\n    ],\n    \"flavor\": \"l4x4\",\n    \"timeout\": \"8h\",\n    \"secrets\": {\"HF_TOKEN\": \"$HF_TOKEN\"},\n})\n```\n\n### Pattern 3: Streaming Dataset Stats (Polars + HF Hub) — `scripts/finepdfs-stats.py`\n\n**What it does:** scans parquet directly from Hub (no 300GB download), computes temporal stats, and (optionally) uploads results to a Hub dataset repo.\n\n**Requires:** CPU is often enough; token needed **only** if you pass `--output-repo` (upload).\n\n```python\nfrom pathlib import Path\n\nscript = Path(\"hf-jobs/scripts/finepdfs-stats.py\").read_text()\nhf_jobs(\"uv\", {\n    \"script\": script,\n    \"script_args\": [\n        \"--limit\", \"10000\",\n        \"--show-plan\",\n        \"--output-repo\", \"username/finepdfs-temporal-stats\",\n    ],\n    \"flavor\": \"cpu-upgrade\",\n    \"timeout\": \"2h\",\n    \"env\": {\"HF_XET_HIGH_PERFORMANCE\": \"1\"},\n    \"secrets\": {\"HF_TOKEN\": \"$HF_TOKEN\"},\n})\n```\n\n## Common Failure Modes\n\n### Out of Memory (OOM)\n\n**Fix:**\n1. Reduce batch size or data chunk size\n2. Process data in smaller batches\n3. Upgrade hardware: cpu → t4 → a10g → a100\n\n### Job Timeout\n\n**Fix:**\n1. Check logs for actual runtime\n2. Increase timeout with buffer: `\"timeout\": \"3h\"`\n3. Optimize code for faster execution\n4. Process data in chunks\n\n### Hub Push Failures\n\n**Fix:**\n1. Add token to secrets: MCP uses `\"$HF_TOKEN\"` (auto-replaced), Python API uses `get_token()` (must pass real token)\n2. Verify token in script: `assert \"HF_TOKEN\" in os.environ`\n3. 
Check token permissions\n4. Verify repo exists or can be created\n\n### Missing Dependencies\n\n**Fix:**\nAdd to PEP 723 header:\n```python\n# /// script\n# dependencies = [\"package1\", \"package2>=1.0.0\"]\n# ///\n```\n\n### Authentication Errors\n\n**Fix:**\n1. Check `hf_whoami()` works locally\n2. Verify token in secrets — MCP: `\"$HF_TOKEN\"`, Python API: `get_token()` (NOT `\"$HF_TOKEN\"`)\n3. Re-login: `hf auth login`\n4. Check token has required permissions\n\n## Troubleshooting\n\n**Common issues:**\n- Job times out → Increase timeout, optimize code\n- Results not saved → Check persistence method, verify HF_TOKEN\n- Out of Memory → Reduce batch size, upgrade hardware\n- Import errors → Add dependencies to PEP 723 header\n- Authentication errors → Check token, verify secrets parameter\n\n**See:** `references/troubleshooting.md` for complete troubleshooting guide\n\n## Resources\n\n### References (In This Skill)\n- `references/token_usage.md` - Complete token usage guide\n- `references/hardware_guide.md` - Hardware specs and selection\n- `references/hub_saving.md` - Hub persistence guide\n- `references/troubleshooting.md` - Common issues and solutions\n\n### Scripts (In This Skill)\n- `scripts/generate-responses.py` - vLLM batch generation: dataset → responses → push to Hub\n- `scripts/cot-self-instruct.py` - CoT Self-Instruct synthetic data generation + filtering → push to Hub\n- `scripts/finepdfs-stats.py` - Polars streaming stats over `finepdfs-edu` parquet on Hub (optional push)\n\n### External Links\n\n**Official Documentation:**\n- [HF Jobs Guide](https://huggingface.co/docs/huggingface_hub/guides/jobs) - Main documentation\n- [HF Jobs CLI Reference](https://huggingface.co/docs/huggingface_hub/guides/cli#hf-jobs) - Command line interface\n- [HF Jobs API Reference](https://huggingface.co/docs/huggingface_hub/package_reference/hf_api) - Python API details\n- [Hardware Flavors Reference](https://huggingface.co/docs/hub/en/spaces-config-reference) - 
Available hardware\n\n**Related Tools:**\n- [UV Scripts Guide](https://docs.astral.sh/uv/guides/scripts/) - PEP 723 inline dependencies\n- [UV Scripts Organization](https://huggingface.co/uv-scripts) - Community UV script collection\n- [HF Hub Authentication](https://huggingface.co/docs/huggingface_hub/quick-start#authentication) - Token setup\n- [Webhooks Documentation](https://huggingface.co/docs/huggingface_hub/guides/webhooks) - Event triggers\n\n## Key Takeaways\n\n1. **Submit scripts inline** - The `script` parameter accepts Python code directly; no file saving required unless user requests\n2. **Jobs are asynchronous** - Don't wait/poll; let user check when ready\n3. **Always set timeout** - Default 30 min may be insufficient; set appropriate timeout\n4. **Always persist results** - Environment is ephemeral; without persistence, all work is lost\n5. **Use tokens securely** - MCP: `secrets={\"HF_TOKEN\": \"$HF_TOKEN\"}`, Python API: `secrets={\"HF_TOKEN\": get_token()}` — `\"$HF_TOKEN\"` only works with MCP tool\n6. **Choose appropriate hardware** - Start small, scale up based on needs (see hardware guide)\n7. **Use UV scripts** - Default to `hf_jobs(\"uv\", {...})` with inline scripts for Python workloads\n8. **Handle authentication** - Verify tokens are available before Hub operations\n9. **Monitor jobs** - Provide job URLs and status check commands\n10. 
**Optimize costs** - Choose right hardware, set appropriate timeouts\n\n## Quick Reference: MCP Tool vs CLI vs Python API\n\n| Operation | MCP Tool | CLI | Python API |\n|-----------|----------|-----|------------|\n| Run UV script | `hf_jobs(\"uv\", {...})` | `hf jobs uv run script.py` | `run_uv_job(\"script.py\")` |\n| Run Docker job | `hf_jobs(\"run\", {...})` | `hf jobs run image cmd` | `run_job(image, command)` |\n| List jobs | `hf_jobs(\"ps\")` | `hf jobs ps` | `list_jobs()` |\n| View logs | `hf_jobs(\"logs\", {...})` | `hf jobs logs <id>` | `fetch_job_logs(job_id)` |\n| Cancel job | `hf_jobs(\"cancel\", {...})` | `hf jobs cancel <id>` | `cancel_job(job_id)` |\n| Schedule UV | `hf_jobs(\"scheduled uv\", {...})` | `hf jobs scheduled uv run SCHEDULE script.py` | `create_scheduled_uv_job()` |\n| Schedule Docker | `hf_jobs(\"scheduled run\", {...})` | `hf jobs scheduled run SCHEDULE image cmd` | `create_scheduled_job()` |\n| List scheduled | `hf_jobs(\"scheduled ps\")` | `hf jobs scheduled ps` | `list_scheduled_jobs()` |\n| Delete scheduled | `hf_jobs(\"scheduled delete\", {...})` | `hf jobs scheduled delete <id>` | `delete_scheduled_job()` |\n\n## Limitations\n- Use this skill only when the task clearly matches the scope described above.\n- Do not treat the output as a substitute for environment-specific validation, testing, or expert review.\n- Stop and ask for clarification if required inputs, permissions, safety boundaries, or success criteria are 
missing.","tags":["hugging","face","jobs","antigravity","awesome","skills","sickn33","agent-skills","agentic-skills","ai-agent-skills","ai-agents","ai-coding"],"capabilities":["skill","source-sickn33","skill-hugging-face-jobs","topic-agent-skills","topic-agentic-skills","topic-ai-agent-skills","topic-ai-agents","topic-ai-coding","topic-ai-workflows","topic-antigravity","topic-antigravity-skills","topic-claude-code","topic-claude-code-skills","topic-codex-cli","topic-codex-skills"],"categories":["antigravity-awesome-skills"],"synonyms":[],"warnings":[],"endpointUrl":"https://skills.sh/sickn33/antigravity-awesome-skills/hugging-face-jobs","protocol":"skill","transport":"skills-sh","auth":{"type":"none","details":{"cli":"npx skills add sickn33/antigravity-awesome-skills","source_repo":"https://github.com/sickn33/antigravity-awesome-skills","install_from":"skills.sh"}},"qualityScore":"0.700","qualityRationale":"deterministic score 0.70 from registry signals: · indexed on github topic:agent-skills · 34768 github stars · SKILL.md body (31,317 chars)","verified":false,"liveness":"unknown","lastLivenessCheck":null,"agentReviews":{"count":0,"score_avg":null,"cost_usd_avg":null,"success_rate":null,"latency_p50_ms":null,"narrative_summary":null,"summary_updated_at":null},"enrichmentModel":"deterministic:skill-github:v1","enrichmentVersion":1,"enrichedAt":"2026-04-23T18:51:29.629Z","embedding":null,"createdAt":"2026-04-18T21:38:46.713Z","updatedAt":"2026-04-23T18:51:29.629Z","lastSeenAt":"2026-04-23T18:51:29.629Z","tsv":null,"prices":[{"id":"0e8e8b68-b681-4752-bee4-249fb70181c1","listingId":"642fa4c5-5e7e-43e4-8b0b-e7bc0b874429","amountUsd":"0","unit":"free","nativeCurrency":null,"nativeAmount":null,"chain":null,"payTo":null,"paymentMethod":"skill-free","isPrimary":true,"details":{"org":"sickn33","category":"antigravity-awesome-skills","install_from":"skills.sh"},"createdAt":"2026-04-18T21:38:46.713Z"}],"sources":[{"listingId":"642fa4c5-5e7e-43e4-8b0b-e7bc0b874429","source":"github","sourceId":"sickn33/antigravity-awesome-skills/hugging-face-jobs","sourceUrl":"https://github.com/sickn33/antigravity-awesome-skills/tree/main/skills/hugging-face-jobs","isPrimary":false,"firstSeenAt":"2026-04-18T21:38:46.713Z","lastSeenAt":"2026-04-23T18:51:29.629Z"}],"details":{"listingId":"642fa4c5-5e7e-43e4-8b0b-e7bc0b874429","quickStartSnippet":null,"exampleRequest":null,"exampleResponse":null,"schema":null,"openapiUrl":null,"agentsTxtUrl":null,"citations":[],"useCases":[],"bestFor":[],"notFor":[],"kindDetails":{"org":"sickn33","slug":"hugging-face-jobs","github":{"repo":"sickn33/antigravity-awesome-skills","stars":34768,"topics":["agent-skills","agentic-skills","ai-agent-skills","ai-agents","ai-coding","ai-workflows","antigravity","antigravity-skills","claude-code","claude-code-skills","codex-cli","codex-skills","cursor","cursor-skills","developer-tools","gemini-cli","gemini-skills","kiro","mcp","skill-library"],"license":"mit","html_url":"https://github.com/sickn33/antigravity-awesome-skills","pushed_at":"2026-04-23T06:41:03Z","description":"Installable GitHub 
library of 1,400+ agentic skills for Claude Code, Cursor, Codex CLI, Gemini CLI, Antigravity, and more. Includes installer CLI, bundles, workflows, and official/community skill collections.","skill_md_sha":"f94212ed4ab1e92742e441c5c3a2045148477603","skill_md_path":"skills/hugging-face-jobs/SKILL.md","default_branch":"main","skill_tree_url":"https://github.com/sickn33/antigravity-awesome-skills/tree/main/skills/hugging-face-jobs"},"layout":"multi","source":"github","category":"antigravity-awesome-skills","frontmatter":{"name":"hugging-face-jobs","license":"Complete terms in LICENSE.txt","description":"Run workloads on Hugging Face Jobs with managed CPUs, GPUs, TPUs, secrets, and Hub persistence."},"skills_sh_url":"https://skills.sh/sickn33/antigravity-awesome-skills/hugging-face-jobs"},"updatedAt":"2026-04-23T18:51:29.629Z"}}