{"id":"bdc2e0fd-3c28-4c16-a193-f85cc488b283","shortId":"3cVnzt","kind":"skill","title":"robotics-design-patterns","tagline":"Architecture patterns, design principles, and proven recipes for building robust robotics software. Use this skill when designing robot software architectures, choosing between behavioral frameworks, structuring perception-planning-control pipelines, and implementing state machines.","description":"# Robotics Design Patterns\n\n## When to Use This Skill\n- Designing robot software architecture from scratch\n- Choosing between behavior trees, FSMs, or hybrid approaches\n- Structuring perception → planning → control pipelines\n- Implementing safety systems and watchdogs\n- Building hardware abstraction layers (HAL)\n- Designing for sim-to-real transfer\n- Architecting multi-robot / fleet systems\n- Making real-time vs. non-real-time tradeoffs\n\n## Pattern 1: The Robot Software Stack\n\nEvery robot system follows this layered architecture, regardless of complexity:\n\n```\n┌─────────────────────────────────────────────┐\n│               APPLICATION LAYER              │\n│    Mission planning, task allocation, UI     │\n├─────────────────────────────────────────────┤\n│              BEHAVIORAL LAYER                │\n│  Behavior trees, FSMs, decision-making       │\n├─────────────────────────────────────────────┤\n│             FUNCTIONAL LAYER                 │\n│  Perception, Planning, Control, Estimation   │\n├─────────────────────────────────────────────┤\n│           COMMUNICATION LAYER                │\n│     ROS2, DDS, shared memory, IPC            │\n├─────────────────────────────────────────────┤\n│          HARDWARE ABSTRACTION LAYER          │\n│    Drivers, sensor interfaces, actuators     │\n├─────────────────────────────────────────────┤\n│              HARDWARE LAYER                  │\n│    Cameras, LiDARs, motors, grippers, IMUs   │\n└─────────────────────────────────────────────┘\n```\n\n**Design Rule**: Information flows UP 
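
The layering and the up/down rule can be made concrete with a minimal sketch. Everything here is hypothetical (`MotorHAL`, `Controller`, and `Mission` are illustrative names, not a real API); the point is that each layer holds a reference only to the layer directly beneath it, so the application layer has no path to the hardware except through the stack.

```python
class MotorHAL:
    """Hardware abstraction layer: the only code that touches the motor."""

    def __init__(self):
        self.last_command = 0.0

    def set_velocity(self, v: float) -> None:
        # A real driver would write to the motor bus here.
        self.last_command = v


class Controller:
    """Functional layer: turns targets into hardware commands via the HAL."""

    def __init__(self, hal: MotorHAL):
        self.hal = hal

    def track(self, target_v: float) -> None:
        self.hal.set_velocity(target_v)


class Mission:
    """Application layer: issues goals; never sees the HAL."""

    def __init__(self, controller: Controller):
        self.controller = controller

    def run(self) -> None:
        # Decisions flow DOWN: mission -> controller -> HAL -> hardware.
        self.controller.track(0.5)


hal = MotorHAL()
Mission(Controller(hal)).run()
print(hal.last_command)  # 0.5
```

Because `Mission` only knows `Controller`, swapping the HAL for a simulated one cannot affect application code, which is the same property the HAL and sim-to-real patterns later in this document rely on.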
through perception, decisions flow DOWN through control. Never let the application layer directly command hardware.\n\n## Pattern 2: Behavior Trees (BT)\n\nBehavior trees are the **recommended default** for robot decision-making. They're modular, reusable, and easier to debug than FSMs for complex behaviors.\n\n### Core Node Types\n\n```\nSequence (→)     : Execute children left-to-right, FAIL on first failure\nFallback (?)     : Execute children left-to-right, SUCCEED on first success\nParallel (⇉)     : Execute all children simultaneously\nDecorator        : Modify a single child's behavior\nAction (leaf)    : Execute a robot action\nCondition (leaf) : Check a condition (no side effects)\n```\n\n### Example: Pick-and-Place BT\n\n```\n                           → Sequence\n         ┌─────────────────────┼─────────────────────┐\n      → Check               → Pick                → Place\n     ┌───┴────┐     ┌─────────┼─────────┐  ┌─────────┼────────┐\n  Battery    Obj   Open     MoveTo    Close MoveTo  Open   Release\n    OK?    Found?  Gripper  Object  Gripper  Goal   Gripper\n```\n\n### Implementation Pattern\n\n```python\nimport py_trees\n\nclass MoveToTarget(py_trees.behaviour.Behaviour):\n    \"\"\"Action node: Move robot to a target pose\"\"\"\n\n    def __init__(self, name, target_key=\"target_pose\"):\n        super().__init__(name)\n        self.target_key = target_key\n        self.action_client = None\n\n    def setup(self, **kwargs):\n        \"\"\"Called once when tree is set up — initialize resources\"\"\"\n        self.node = kwargs.get('node')  # ROS2 node\n        self.action_client = ActionClient(\n            self.node, MoveBase, 'move_base')\n\n    def initialise(self):\n        \"\"\"Called when this node first ticks — send the goal\"\"\"\n        bb = self.blackboard\n        target = bb.get(self.target_key)\n        self.goal_handle = self.action_client.send_goal(target)\n        self.logger.info(f\"Moving to {target}\")\n\n    def update(self):\n        \"\"\"Called every 
tick — check progress\"\"\"\n        if self.goal_handle is None:\n            return py_trees.common.Status.FAILURE\n\n        status = self.goal_handle.status\n        if status == GoalStatus.STATUS_SUCCEEDED:\n            return py_trees.common.Status.SUCCESS\n        elif status == GoalStatus.STATUS_ABORTED:\n            return py_trees.common.Status.FAILURE\n        else:\n            return py_trees.common.Status.RUNNING\n\n    def terminate(self, new_status):\n        \"\"\"Called when node exits — cancel if preempted\"\"\"\n        if new_status == py_trees.common.Status.INVALID:\n            if self.goal_handle:\n                self.goal_handle.cancel_goal()\n                self.logger.info(\"Movement cancelled\")\n\n# Build the tree\ndef create_pick_place_tree():\n    root = py_trees.composites.Sequence(\"PickAndPlace\", memory=True)\n\n    # Safety checks (Sequence: if any check fails, the whole tree aborts)\n    safety = py_trees.composites.Sequence(\"SafetyChecks\", memory=False)\n    safety.add_children([\n        CheckBattery(\"BatteryOK\", threshold=20.0),\n        CheckEStop(\"EStopClear\"),\n    ])\n\n    pick = py_trees.composites.Sequence(\"Pick\", memory=True)\n    pick.add_children([\n        DetectObject(\"FindObject\"),\n        MoveToTarget(\"ApproachObject\", target_key=\"object_pose\"),\n        GripperCommand(\"CloseGripper\", action=\"close\"),\n    ])\n\n    place = py_trees.composites.Sequence(\"Place\", memory=True)\n    place.add_children([\n        MoveToTarget(\"MoveToPlace\", target_key=\"place_pose\"),\n        GripperCommand(\"OpenGripper\", action=\"open\"),\n    ])\n\n    root.add_children([safety, pick, place])\n    return root\n```\n\n### Blackboard Pattern\n\n```python\n# The Blackboard is the shared memory for BT nodes\nbb = py_trees.blackboard.Blackboard()\n\n# Perception nodes WRITE to blackboard\nclass DetectObject(py_trees.behaviour.Behaviour):\n    def update(self):\n        detections = self.perception.detect()\n        if 
detections:\n            self.blackboard.set(\"object_pose\", detections[0].pose)\n            self.blackboard.set(\"object_class\", detections[0].label)\n            return Status.SUCCESS\n        return Status.FAILURE\n\n# Action nodes READ from blackboard\nclass MoveToTarget(py_trees.behaviour.Behaviour):\n    def initialise(self):\n        target = self.blackboard.get(\"object_pose\")\n        self.send_goal(target)\n```\n\n## Pattern 3: Finite State Machines (FSM)\n\nUse FSMs for **simple, well-defined sequential behaviors** with clear states. Prefer BTs for anything complex.\n\n```python\nfrom enum import Enum, auto\nimport smach  # ROS state machine library\n\nclass RobotState(Enum):\n    IDLE = auto()\n    NAVIGATING = auto()\n    PICKING = auto()\n    PLACING = auto()\n    ERROR = auto()\n    CHARGING = auto()\n\n# SMACH implementation\nclass NavigateState(smach.State):\n    def __init__(self):\n        smach.State.__init__(self,\n            outcomes=['succeeded', 'aborted', 'preempted'],\n            input_keys=['target_pose'],\n            output_keys=['final_pose'])\n\n    def execute(self, userdata):\n        # Navigation logic\n        result = navigate_to(userdata.target_pose)\n        if self.preempt_requested():\n            self.service_preempt()\n            return 'preempted'\n        if result.success:\n            userdata.final_pose = result.pose\n            return 'succeeded'\n        return 'aborted'\n\n# Build state machine\nsm = smach.StateMachine(outcomes=['done', 'failed'])\nwith sm:\n    smach.StateMachine.add('NAVIGATE', NavigateState(),\n        transitions={'succeeded': 'PICK', 'aborted': 'ERROR',\n                     'preempted': 'failed'})  # every declared outcome needs a mapping\n    smach.StateMachine.add('PICK', PickState(),\n        transitions={'succeeded': 'PLACE', 'aborted': 'ERROR'})\n    smach.StateMachine.add('PLACE', PlaceState(),\n        transitions={'succeeded': 'done', 'aborted': 'ERROR'})\n    smach.StateMachine.add('ERROR', ErrorRecovery(),\n        transitions={'recovered': 'NAVIGATE', 'fatal': 'failed'})\n```\n\n**When to use FSM vs BT**:\n- FSM: Linear workflows, simple devices, UI states, 
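
For workflows this linear, the machine can also shrink to a plain transition table, which is often all an FSM needs. This is a hedged sketch in standard-library Python (no smach; the state and outcome names are illustrative):

```python
from enum import Enum, auto


class State(Enum):
    NAVIGATE = auto()
    PICK = auto()
    PLACE = auto()
    DONE = auto()
    ERROR = auto()


# Transition table: (current state, outcome) -> next state.
TRANSITIONS = {
    (State.NAVIGATE, 'succeeded'): State.PICK,
    (State.NAVIGATE, 'aborted'):   State.ERROR,
    (State.PICK,     'succeeded'): State.PLACE,
    (State.PICK,     'aborted'):   State.ERROR,
    (State.PLACE,    'succeeded'): State.DONE,
}


def step(state: State, outcome: str) -> State:
    # Any unmapped (state, outcome) pair falls through to ERROR.
    return TRANSITIONS.get((state, outcome), State.ERROR)


s = State.NAVIGATE
for outcome in ('succeeded', 'succeeded', 'succeeded'):
    s = step(s, outcome)
print(s)  # State.DONE
```

The table also makes the FSM's main weakness visible: every new conditional branch is another row, which is exactly where behavior trees start to pay off.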
protocol implementations\n- BT: Complex robots, reactive behaviors, many conditional branches, reusable sub-behaviors\n\n## Pattern 4: Perception Pipeline\n\n```\nRaw Sensors → Preprocessing → Detection/Estimation → Fusion → World Model\n```\n\n### Sensor Fusion Architecture\n\n```python\nclass SensorFusion:\n    \"\"\"Multi-sensor fusion using a central world model\"\"\"\n\n    def __init__(self):\n        self.world_model = WorldModel()\n        self.last_imu_t = None  # set on the first IMU update\n        self.filters = {\n            'pose': ExtendedKalmanFilter(state_dim=6),\n            'objects': MultiObjectTracker(),\n        }\n\n    def update_from_camera(self, detections, timestamp):\n        \"\"\"Camera provides object detections with high latency\"\"\"\n        for det in detections:\n            self.filters['objects'].update(\n                det, sensor='camera',\n                uncertainty=1.0 - det.confidence,  # high confidence = low uncertainty\n                timestamp=timestamp\n            )\n\n    def update_from_lidar(self, points, timestamp):\n        \"\"\"LiDAR provides precise geometry with lower latency\"\"\"\n        clusters = self.segment_points(points)\n        for cluster in clusters:\n            self.filters['objects'].update(\n                cluster, sensor='lidar',\n                uncertainty=0.02,  # 2cm typical LiDAR accuracy\n                timestamp=timestamp\n            )\n\n    def update_from_imu(self, imu_data, timestamp):\n        \"\"\"IMU provides high-frequency attitude estimates\"\"\"\n        if self.last_imu_t is not None:\n            self.filters['pose'].predict(imu_data, dt=timestamp - self.last_imu_t)\n        self.last_imu_t = timestamp\n\n    def get_world_state(self):\n        \"\"\"Query the fused world model\"\"\"\n        return WorldState(\n            robot_pose=self.filters['pose'].state,\n            objects=self.filters['objects'].get_tracked_objects(),\n            confidence=self.filters['objects'].get_confidence_map()\n        )\n```\n\n### The Perception-Action Loop Timing\n\n```\nCamera (30Hz)  ─┐\nLiDAR (10Hz)   ─┼──→ Fusion (50Hz) 
──→ Planner (10Hz) ──→ Controller (100Hz+)\nIMU (200Hz)    ─┘\n\nRULE: Controller frequency > Planner frequency: inner control loops run\n      faster than the outer planning loops that feed them.\n      This ensures smooth execution despite variable perception latency.\n```\n\n## Pattern 5: Hardware Abstraction Layer (HAL)\n\n**Never let application code talk directly to hardware.** Always go through an abstraction layer.\n\n```python\nfrom abc import ABC, abstractmethod\n\nimport serial  # pyserial, used by the concrete Robotiq driver below\n\nclass GripperInterface(ABC):\n    \"\"\"Abstract gripper interface — implement for each hardware type\"\"\"\n\n    @abstractmethod\n    def open(self, width: float = 1.0) -> bool: ...\n\n    @abstractmethod\n    def close(self, force: float = 0.5) -> bool: ...\n\n    @abstractmethod\n    def get_state(self) -> GripperState: ...\n\n    @abstractmethod\n    def get_width(self) -> float: ...\n\n\nclass RobotiqGripper(GripperInterface):\n    \"\"\"Concrete implementation for Robotiq 2F-85\"\"\"\n    def __init__(self, port='/dev/ttyUSB0'):\n        self.serial = serial.Serial(port, 115200)\n        # ... 
Modbus RTU setup\n\n    def close(self, force=0.5):\n        cmd = self._build_modbus_cmd(force=int(force * 255))\n        self.serial.write(cmd)\n        return self._wait_for_completion()\n\n\nclass SimulatedGripper(GripperInterface):\n    \"\"\"Simulation gripper for testing\"\"\"\n    def __init__(self):\n        self.width = 0.085  # 85mm open\n        self.state = GripperState.OPEN\n\n    def close(self, force=0.5):\n        self.width = 0.0\n        self.state = GripperState.CLOSED\n        return True\n\n\n# Factory pattern for hardware instantiation\ndef create_gripper(config: dict) -> GripperInterface:\n    gripper_type = config.get('type', 'simulated')\n    if gripper_type == 'robotiq':\n        return RobotiqGripper(port=config['port'])\n    elif gripper_type == 'simulated':\n        return SimulatedGripper()\n    else:\n        raise ValueError(f\"Unknown gripper type: {gripper_type}\")\n```\n\n## Pattern 6: Safety Systems\n\n### The Safety Hierarchy\n\n```\nLevel 0: Hardware E-Stop (physical button, cuts power)\nLevel 1: Safety-rated controller (SIL2/SIL3, hardware watchdog)\nLevel 2: Software watchdog (monitors heartbeats, enforces limits)\nLevel 3: Application safety (collision avoidance, workspace limits)\n```\n\n### Software Watchdog Pattern\n\n```python\nimport threading\nimport time\n\nclass SafetyWatchdog:\n    \"\"\"Monitors system health and triggers safe stop on failures\"\"\"\n\n    def __init__(self, timeout_ms=500):\n        self.timeout = timeout_ms / 1000.0\n        self.heartbeats = {}\n        self.lock = threading.Lock()\n        self.safe_stop_triggered = False\n\n        # Start monitoring thread\n        self.monitor_thread = threading.Thread(\n            target=self._monitor_loop, daemon=True)\n        self.monitor_thread.start()\n\n    def register_component(self, name: str, critical: bool = True):\n        \"\"\"Register a component that must send heartbeats\"\"\"\n        with self.lock:\n            self.heartbeats[name] = 
{\n                'last_beat': time.monotonic(),\n                'critical': critical,\n                'alive': True\n            }\n\n    def heartbeat(self, name: str):\n        \"\"\"Called by components to signal they're alive\"\"\"\n        with self.lock:\n            if name in self.heartbeats:\n                self.heartbeats[name]['last_beat'] = time.monotonic()\n                self.heartbeats[name]['alive'] = True\n\n    def _monitor_loop(self):\n        while True:\n            now = time.monotonic()\n            with self.lock:\n                for name, info in self.heartbeats.items():\n                    elapsed = now - info['last_beat']\n                    if elapsed > self.timeout and info['alive']:\n                        info['alive'] = False\n                        if info['critical']:\n                            self._trigger_safe_stop(\n                                f\"Critical component '{name}' \"\n                                f\"timed out ({elapsed:.1f}s)\")\n            time.sleep(self.timeout / 4)\n\n    def _trigger_safe_stop(self, reason: str):\n        if not self.safe_stop_triggered:\n            self.safe_stop_triggered = True\n            logger.critical(f\"SAFE STOP: {reason}\")\n            self._execute_safe_stop()\n\n    def _execute_safe_stop(self):\n        \"\"\"Bring robot to a safe state\"\"\"\n        # 1. Stop all motion (zero velocity command)\n        # 2. Engage brakes\n        # 3. Publish emergency state to all nodes\n        # 4. 
Log the event\n        pass\n```\n\n### Workspace Limits\n\n```python\nclass WorkspaceMonitor:\n    \"\"\"Enforce that robot stays within safe operational bounds\"\"\"\n\n    def __init__(self, limits: dict):\n        self.joint_limits = limits['joints']    # {joint: (min, max)}\n        self.cartesian_bounds = limits['cartesian']  # AABB or convex hull\n        self.velocity_limits = limits['velocity']\n        self.force_limits = limits['force']\n\n    def check_command(self, command) -> SafetyResult:\n        \"\"\"Validate a command BEFORE sending to hardware\"\"\"\n        violations = []\n\n        # Joint limit check\n        for joint, value in command.joint_positions.items():\n            lo, hi = self.joint_limits[joint]\n            if not (lo <= value <= hi):\n                violations.append(\n                    f\"Joint {joint}={value:.3f} outside [{lo:.3f}, {hi:.3f}]\")\n\n        # Velocity check\n        for joint, vel in command.joint_velocities.items():\n            if abs(vel) > self.velocity_limits[joint]:\n                violations.append(\n                    f\"Joint {joint} velocity {vel:.3f} exceeds limit\")\n\n        if violations:\n            return SafetyResult(safe=False, violations=violations)\n        return SafetyResult(safe=True)\n```\n\n## Pattern 7: Sim-to-Real Architecture\n\n```\n┌────────────────────────────────────┐\n│         Application Code           │\n│  (Same code runs in sim AND real)  │\n├──────────────┬─────────────────────┤\n│   Sim HAL    │     Real HAL        │\n│  (MuJoCo/    │  (Hardware          │\n│   Gazebo/    │   drivers)          │\n│   Isaac)     │                     │\n└──────────────┴─────────────────────┘\n```\n\n**Key Principles**:\n1. Application code NEVER knows if it's in sim or real\n2. Same message types, same topic names, same interfaces\n3. Use `use_sim_time` parameter to switch clock sources\n4. Domain randomization happens INSIDE the sim HAL\n5. 
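
Principle 4 can be illustrated with a small sketch (a hypothetical `SimulatedCamera`; the jittered exposure stands in for any randomized sim parameter). Randomization lives inside the sim HAL, so callers see only the ordinary camera interface:

```python
import random


class SimulatedCamera:
    """Sim-HAL camera: domain randomization happens here, invisible to callers."""

    def __init__(self, randomize=True, seed=None):
        self.randomize = randomize
        self.rng = random.Random(seed)

    def exposure_time(self) -> float:
        nominal = 1.0  # hypothetical nominal exposure (ms)
        if self.randomize:
            # Per-call jitter: application code never knows this happens.
            return nominal * self.rng.uniform(0.8, 1.2)
        return nominal


cam = SimulatedCamera(seed=0)
print(0.8 <= cam.exposure_time() <= 1.2)  # True
```

The real-HAL camera would expose the identical `exposure_time()` signature with no randomization branch at all.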
Transfer learning adapters sit at the HAL boundary\n\n```python\n# Config-driven sim/real switching\nclass RobotDriver:\n    def __init__(self, config):\n        if config['mode'] == 'simulation':\n            self.arm = SimulatedArm(config['sim'])\n            self.camera = SimulatedCamera(config['sim'])\n        elif config['mode'] == 'real':\n            self.arm = UR5Driver(config['real']['arm_ip'])\n            self.camera = RealSenseDriver(config['real']['camera_serial'])\n\n        # Application code uses the same interface regardless\n        self.perception = PerceptionPipeline(self.camera)\n        self.planner = MotionPlanner(self.arm)\n```\n\n## Pattern 8: Data Recording Architecture\n\n**Critical for learning-based robotics** — designed for the ForgeIR ecosystem:\n\n```\n┌─────────────────────────────────────────────┐\n│              Event-Based Recorder            │\n│  Triggers: action boundaries, anomalies,     │\n│  task completions, operator signals           │\n├─────────────────────────────────────────────┤\n│           Multimodal Data Streams            │\n│  Camera (30Hz) | Joint State (100Hz) |       │\n│  Force/Torque (1kHz) | Language Annotations  │\n├─────────────────────────────────────────────┤\n│            Storage Layer                     │\n│  Episode-based structure with metadata       │\n│  Format: MCAP / Zarr / HDF5 / RLDS          │\n├─────────────────────────────────────────────┤\n│           Quality Assessment                 │\n│  Completeness checks, trajectory validation  │\n│  Anomaly detection, diversity analysis       │\n└─────────────────────────────────────────────┘\n```\n\n```python\nclass EpisodeRecorder:\n    \"\"\"Records robot episodes with event-based boundaries\"\"\"\n\n    def __init__(self, config):\n        self.streams = {}\n        self.episode_active = False\n        self.current_episode = None\n        self.storage = StorageBackend(config['format'])  # Zarr, MCAP, etc.\n\n    def register_stream(self, 
name, msg_type, frequency_hz):\n        self.streams[name] = StreamConfig(\n            name=name, type=msg_type, freq=frequency_hz)\n\n    def start_episode(self, metadata: dict):\n        \"\"\"Begin recording an episode with metadata\"\"\"\n        self.current_episode = Episode(\n            id=uuid4(),\n            start_time=time.monotonic(),\n            metadata=metadata,  # task, operator, environment, etc.\n            streams={name: [] for name in self.streams}\n        )\n        self.episode_active = True\n\n    def record_step(self, stream_name, data, timestamp):\n        if self.episode_active:\n            self.current_episode.streams[stream_name].append(\n                DataPoint(data=data, timestamp=timestamp))\n\n    def end_episode(self, outcome: str, annotations: dict = None):\n        \"\"\"Finalize and store the episode\"\"\"\n        self.episode_active = False\n        self.current_episode.end_time = time.monotonic()\n        self.current_episode.outcome = outcome\n        self.current_episode.annotations = annotations\n\n        # Validate before saving\n        quality = self.validate_episode(self.current_episode)\n        self.current_episode.quality_score = quality\n\n        self.storage.save(self.current_episode)\n        return self.current_episode.id\n```\n\n## Anti-Patterns to Avoid\n\n### 1. God Node\n**Problem**: One node does everything — perception, planning, control, logging.\n**Fix**: Single responsibility. One node, one job. Connect via topics.\n\n### 2. Hardcoded Magic Numbers\n**Problem**: `if distance < 0.35:` scattered everywhere.\n**Fix**: Parameters with descriptive names, loaded from config files.\n\n### 3. Polling Instead of Events\n**Problem**: `while True: check_sensor(); sleep(0.01)`\n**Fix**: Use callbacks, subscribers, event-driven architecture.\n\n### 4. No Error Recovery\n**Problem**: Robot stops forever on first error.\n**Fix**: Every action node needs a failure mode. Behavior trees with fallbacks.\n\n### 5. 
Sim-Only Code\n**Problem**: Code works perfectly in simulation, crashes on real hardware.\n**Fix**: HAL pattern. Test with hardware-in-the-loop early and often.\n\n### 6. No Timestamps\n**Problem**: Sensor data without timestamps — impossible to fuse or replay.\n**Fix**: Timestamp EVERYTHING at the source. Use monotonic clocks for control.\n\n### 7. Blocking the Control Loop\n**Problem**: Perception computation blocks the 100Hz control loop.\n**Fix**: Separate processes/threads. Control loop must NEVER be blocked.\n\n### 8. No Data Logging\n**Problem**: Can't reproduce bugs, can't train models, can't audit behavior.\n**Fix**: Always record. Event-based recording is cheap. Use MCAP format.\n\n## Architecture Decision Checklist\n\nWhen designing a new robot system, answer these questions:\n\n1. **What's the safety architecture?** (E-stop, watchdog, workspace limits)\n2. **What are the real-time requirements?** (Control at 100Hz+, perception at 10-30Hz)\n3. **What's the behavioral framework?** (BT for complex, FSM for simple)\n4. **How does sim-to-real work?** (HAL pattern, same interfaces)\n5. **How is data recorded?** (Episode-based, event-triggered, with metadata)\n6. **How are failures handled?** (Graceful degradation, recovery behaviors)\n7. **What's the communication middleware?** (ROS2 for most cases)\n8. **How is the system deployed?** (Docker, snap, direct install)\n9. **How is it tested?** (Unit, integration, hardware-in-the-loop, field)\n10. 
**How is it monitored?** (Heartbeats, metrics, dashboards)","tags":["robotics","design","patterns","agent","skills","arpitg1304","agent-skills","ai-coding-assistant","claude-skills"],"capabilities":["skill","source-arpitg1304","skill-robotics-design-patterns","topic-agent-skills","topic-ai-coding-assistant","topic-claude-skills","topic-robotics"],"categories":["robotics-agent-skills"],"synonyms":[],"warnings":[],"endpointUrl":"https://skills.sh/arpitg1304/robotics-agent-skills/robotics-design-patterns","protocol":"skill","transport":"skills-sh","auth":{"type":"none","details":{"cli":"npx skills add arpitg1304/robotics-agent-skills","source_repo":"https://github.com/arpitg1304/robotics-agent-skills","install_from":"skills.sh"}},"qualityScore":"0.544","qualityRationale":"deterministic score 0.54 from registry signals: · indexed on github topic:agent-skills · 189 github stars · SKILL.md body (20,786 chars)","verified":false,"liveness":"unknown","lastLivenessCheck":null,"agentReviews":{"count":0,"score_avg":null,"cost_usd_avg":null,"success_rate":null,"latency_p50_ms":null,"narrative_summary":null,"summary_updated_at":null},"enrichmentModel":"deterministic:skill-github:v1","enrichmentVersion":1,"enrichedAt":"2026-05-02T18:54:20.922Z","embedding":null,"createdAt":"2026-04-18T22:05:35.147Z","updatedAt":"2026-05-02T18:54:20.922Z","lastSeenAt":"2026-05-02T18:54:20.922Z"}
'trigger':1150,1170,1285,1295,1298,1605,2058 'true':442,467,486,1053,1181,1191,1209,1237,1243,1299,1456,1731,1845 'type':207,962,1066,1068,1072,1081,1091,1093,1499,1683,1691,1693 'typic':830 'ui':120,715 'uncertainti':795,827 'unit':2095 'unknown':1089 'updat':375,529,772,791,800,823,836 'ur5driver':1561 'use':17,43,575,706,752,1506,1507,1574,1851,1928,1981 'userdata':645 'userdata.final':655 'userdata.target':651 'uuid4':1713 'valid':1386,1643,1776 'valu':1399,1410,1416 'valueerror':1087 'variabl':923 'vel':1427,1432,1441 'veloc':1322,1375,1423,1440 'via':1817 'violat':1393,1446,1451,1452 'violations.append':1412,1436 'vs':92,708 'watchdog':69,1119,1123,1137,2005 'well':580 'well-defin':579 'width':967,988 'within':1348 'without':1915 'work':1888,2043 'workflow':712 'workspac':1134,1339,2006 'workspacemonitor':1343 'world':740,755,866,872 'worldmodel':762 'worldstat':875 'write':522 'zarr':1635,1674 'zero':1321","prices":[{"id":"94fb9134-601c-4f5f-a496-3eb288fb4a34","listingId":"bdc2e0fd-3c28-4c16-a193-f85cc488b283","amountUsd":"0","unit":"free","nativeCurrency":null,"nativeAmount":null,"chain":null,"payTo":null,"paymentMethod":"skill-free","isPrimary":true,"details":{"org":"arpitg1304","category":"robotics-agent-skills","install_from":"skills.sh"},"createdAt":"2026-04-18T22:05:35.147Z"}],"sources":[{"listingId":"bdc2e0fd-3c28-4c16-a193-f85cc488b283","source":"github","sourceId":"arpitg1304/robotics-agent-skills/robotics-design-patterns","sourceUrl":"https://github.com/arpitg1304/robotics-agent-skills/tree/main/skills/robotics-design-patterns","isPrimary":false,"firstSeenAt":"2026-04-18T22:05:35.147Z","lastSeenAt":"2026-05-02T18:54:20.922Z"}],"details":{"listingId":"bdc2e0fd-3c28-4c16-a193-f85cc488b283","quickStartSnippet":null,"exampleRequest":null,"exampleResponse":null,"schema":null,"openapiUrl":null,"agentsTxtUrl":null,"citations":[],"useCases":[],"bestFor":[],"notFor":[],"kindDetails":{"org":"arpitg1304","slug":"robotics-design-patterns","github":{"repo":"arpi
tg1304/robotics-agent-skills","stars":189,"topics":["agent-skills","ai-coding-assistant","claude-skills","robotics"],"license":"apache-2.0","html_url":"https://github.com/arpitg1304/robotics-agent-skills","pushed_at":"2026-03-25T03:44:12Z","description":"Agent skills that make AI coding assistants write production-grade robotics software. ROS1, ROS2, design patterns, SOLID principles, and testing — for Claude Code, Cursor, Copilot, and any SKILL.md-compatible agent.","skill_md_sha":"981e71b0635dcdb8dba96e5c9de6cbcee642a95a","skill_md_path":"skills/robotics-design-patterns/SKILL.md","default_branch":"main","skill_tree_url":"https://github.com/arpitg1304/robotics-agent-skills/tree/main/skills/robotics-design-patterns"},"layout":"multi","source":"github","category":"robotics-agent-skills","frontmatter":{"name":"robotics-design-patterns","description":"Architecture patterns, design principles, and proven recipes for building robust robotics software. Use this skill when designing robot software architectures, choosing between behavioral frameworks, structuring perception-planning-control pipelines, implementing state machines, designing safety systems, or architecting multi-robot systems. Trigger whenever the user mentions behavior trees, finite state machines, subsumption architecture, sensor fusion, robot safety, watchdogs, heartbeats, graceful degradation, hardware abstraction layers, real-time constraints, or software architecture for robots. Also applies to sim-to-real transfer, digital twins, and robot fleet management."},"skills_sh_url":"https://skills.sh/arpitg1304/robotics-agent-skills/robotics-design-patterns"},"updatedAt":"2026-05-02T18:54:20.922Z"}}