{"id":"bf389d32-0618-4946-bfe2-ea505477c3f1","shortId":"TGDfQY","kind":"skill","title":"robotics-software-principles","tagline":"Foundational software design principles applied specifically to robotics module development. Use this skill when designing robot software modules, structuring codebases, making architecture decisions, reviewing robotics code, or building reusable robotics libraries.","description":"# Robotics Software Design Principles\n\n## Why Robotics Software Is Different\n\nRobotics code operates under constraints that most software never faces:\n\n1. **Physical consequences** — A bug doesn't just crash a process; it crashes a robot into a wall\n2. **Real-time deadlines** — Missing a 1ms control-loop deadline can cause oscillation or damage\n3. **Sensor uncertainty** — All inputs are noisy, delayed, and occasionally wrong\n4. **Hardware diversity** — The same algorithm must work on 10 different grippers from 5 vendors\n5. **Sim-to-real gap** — Code must run identically in simulation and on real hardware\n6. **Long-running operation** — Robots run for hours or days; memory leaks and drift matter\n7. **Safety criticality** — Some failures must NEVER happen, regardless of software state\n\nThese constraints demand disciplined design. Below are principles that account for them.\n\n---\n\n## Principle 1: Single Responsibility — One Module, One Job\n\nEvery module (node, class, function) should have exactly ONE reason to change.\n\n**Why it matters in robotics**: A perception module that also does control means a camera driver update can break your arm controller. 
In safety-critical systems, this coupling is unacceptable.\n\n```python\n# ❌ BAD: God module — perception + planning + control + logging\nclass RobotController:\n    def __init__(self):\n        self.camera = RealSenseCamera()\n        self.detector = YOLODetector()\n        self.planner = RRTPlanner()\n        self.arm = UR5Driver()\n        self.logger = DataLogger()\n\n    def run(self):\n        image = self.camera.capture()\n        objects = self.detector.detect(image)\n        path = self.planner.plan(objects[0].pose)\n        self.arm.execute(path)\n        self.logger.log(image, objects, path)\n        # If ANY of these changes, you touch this class\n\n# ✅ GOOD: Separated responsibilities with clear interfaces\nclass PerceptionModule:\n    \"\"\"ONLY responsibility: raw sensor data → detected objects\"\"\"\n    def __init__(self, camera: CameraInterface, detector: DetectorInterface):\n        self.camera = camera\n        self.detector = detector\n\n    def get_detections(self) -> List[Detection]:\n        image = self.camera.capture()\n        return self.detector.detect(image)\n\nclass PlanningModule:\n    \"\"\"ONLY responsibility: goal + world state → trajectory\"\"\"\n    def __init__(self, planner: PlannerInterface):\n        self.planner = planner\n\n    def plan_to(self, target: Pose, obstacles: List[Obstacle]) -> Trajectory:\n        return self.planner.plan(target, obstacles)\n\nclass ExecutionModule:\n    \"\"\"ONLY responsibility: trajectory → hardware commands\"\"\"\n    def __init__(self, arm: ArmInterface):\n        self.arm = arm\n\n    def execute(self, trajectory: Trajectory) -> ExecutionResult:\n        return self.arm.follow_trajectory(trajectory)\n```\n\n**Test**: Can you describe what a module does WITHOUT using \"and\"? If not, split it.\n\n---\n\n## Principle 2: Dependency Inversion — Depend on Abstractions, Not Hardware\n\nHigh-level modules (planning, behavior) should never depend on low-level modules (drivers, hardware). 
Both should depend on abstractions.\n\n**Why it matters in robotics**: This is the foundation of sim-to-real. If your planner imports `UR5Driver` directly, it can't run in simulation. If it depends on `ArmInterface`, you swap implementations freely.\n\n```python\nfrom abc import ABC, abstractmethod\nfrom dataclasses import dataclass\nfrom typing import List, Optional\nimport numpy as np\n\n# ─── ABSTRACTIONS (the contracts) ────────────────────────────\n\nclass ArmInterface(ABC):\n    \"\"\"Abstract arm — every arm implementation must honor this contract\"\"\"\n\n    @abstractmethod\n    def get_joint_positions(self) -> np.ndarray:\n        \"\"\"Returns current joint positions in radians\"\"\"\n        ...\n\n    @abstractmethod\n    def get_ee_pose(self) -> Pose:\n        \"\"\"Returns current end-effector pose\"\"\"\n        ...\n\n    @abstractmethod\n    def move_to_joints(self, positions: np.ndarray,\n                        velocity: float = 0.5) -> bool:\n        \"\"\"Move to joint positions. 
Returns True on success.\"\"\"\n        ...\n\n    @abstractmethod\n    def stop(self) -> None:\n        \"\"\"Immediately stop all motion\"\"\"\n        ...\n\n    @property\n    @abstractmethod\n    def joint_limits(self) -> List[tuple]:\n        \"\"\"Returns [(min, max)] for each joint\"\"\"\n        ...\n\n\nclass CameraInterface(ABC):\n    \"\"\"Abstract camera — any RGB camera must honor this\"\"\"\n\n    @abstractmethod\n    def capture(self) -> np.ndarray:\n        \"\"\"Returns (H, W, 3) uint8 RGB image\"\"\"\n        ...\n\n    @abstractmethod\n    def get_intrinsics(self) -> CameraIntrinsics:\n        \"\"\"Returns camera intrinsic parameters\"\"\"\n        ...\n\n    @property\n    @abstractmethod\n    def resolution(self) -> tuple:\n        \"\"\"Returns (width, height)\"\"\"\n        ...\n\n\nclass GripperInterface(ABC):\n    @abstractmethod\n    def open(self, width: float = 1.0) -> bool: ...\n\n    @abstractmethod\n    def close(self, force: float = 0.5) -> bool: ...\n\n    @abstractmethod\n    def get_width(self) -> float: ...\n\n    @abstractmethod\n    def is_grasping(self) -> bool: ...\n\n\n# ─── CONCRETE IMPLEMENTATIONS ────────────────────────────────\n\nclass UR5Arm(ArmInterface):\n    \"\"\"Real UR5 via RTDE protocol\"\"\"\n    def __init__(self, ip: str):\n        self.rtde = RTDEControl(ip)\n        self.rtde_receive = RTDEReceive(ip)\n\n    def get_joint_positions(self) -> np.ndarray:\n        return np.array(self.rtde_receive.getActualQ())\n\n    def move_to_joints(self, positions, velocity=0.5):\n        self.rtde.moveJ(positions.tolist(), velocity)\n        return True\n\n    def stop(self):\n        self.rtde.stopScript()\n\n    @property\n    def joint_limits(self):\n        return [(-2*np.pi, 2*np.pi)] * 6\n\n\nclass MuJoCoArm(ArmInterface):\n    \"\"\"Simulated arm in MuJoCo — SAME interface\"\"\"\n    def __init__(self, model_path: str, joint_names: List[str]):\n        self.model = mujoco.MjModel.from_xml_path(model_path)\n      
  self.data = mujoco.MjData(self.model)\n        self.joint_ids = [mujoco.mj_name2id(self.model, mujoco.mjtObj.mjOBJ_JOINT, n)\n                          for n in joint_names]\n\n    def get_joint_positions(self) -> np.ndarray:\n        return np.array([self.data.qpos[jid] for jid in self.joint_ids])\n\n    def move_to_joints(self, positions, velocity=0.5):\n        # Simulate motion with position control\n        self.data.ctrl[:len(positions)] = positions\n        for _ in range(100):\n            mujoco.mj_step(self.model, self.data)\n        return True\n\n    def stop(self):\n        self.data.ctrl[:] = 0\n\n    # (get_ee_pose and joint_limits omitted here for brevity)\n\n\n# ─── HIGH-LEVEL CODE DEPENDS ONLY ON ABSTRACTIONS ────────────\n\nclass PickPlaceTask:\n    \"\"\"This class works with ANY arm + gripper + camera.\n    It never knows or cares if it's sim or real.\"\"\"\n\n    def __init__(self, arm: ArmInterface, gripper: GripperInterface,\n                 camera: CameraInterface, detector: DetectorInterface):\n        self.arm = arm\n        self.gripper = gripper\n        self.camera = camera\n        self.detector = detector\n\n    def execute(self, target_class: str) -> bool:\n        image = self.camera.capture()\n        detections = self.detector.detect(image)\n        target = next((d for d in detections if d.label == target_class), None)\n        if target is None:\n            return False\n\n        self.arm.move_to_joints(self.ik(target.pose))  # self.ik: IK solver, omitted for brevity\n        self.gripper.close()\n        self.arm.move_to_joints(self.place_joints)  # self.place_joints: fixed drop-off configuration\n        self.gripper.open()\n        return True\n```\n\n**The Dependency Rule in Robotics**:\n```\nApplication / Tasks\n    ↓ depends on\nInterfaces (ABC)\n    ↑ implements\nHardware Drivers / Simulators\n```\n\nArrows point inward. High-level policy never imports low-level drivers.\n\n---\n\n## Principle 3: Open-Closed — Extend Without Modifying\n\nModules should be open for extension but closed for modification. 
Add new capabilities by adding new code, not changing existing code.\n\n**Why it matters in robotics**: You constantly add new sensors, new robots, new tasks. If adding a new camera requires modifying your perception pipeline, you'll break existing deployments.\n\n```python\n# ❌ BAD: Adding a new sensor requires modifying existing code\nclass PerceptionPipeline:\n    def process(self, sensor_type: str, data):\n        if sensor_type == 'realsense':\n            return self._process_realsense(data)\n        elif sensor_type == 'zed':\n            return self._process_zed(data)\n        elif sensor_type == 'oakd':    # New sensor = modify this class\n            return self._process_oakd(data)\n\n# ✅ GOOD: Plugin architecture — add sensors without touching core\nclass SensorPlugin(ABC):\n    \"\"\"Base class for all sensor plugins\"\"\"\n    @abstractmethod\n    def name(self) -> str: ...\n\n    @abstractmethod\n    def process(self, raw_data) -> ProcessedData: ...\n\n    @abstractmethod\n    def get_intrinsics(self) -> dict: ...\n\n\nclass RealSensePlugin(SensorPlugin):\n    def name(self): return 'realsense'\n    def process(self, raw_data):\n        # RealSense-specific processing\n        return ProcessedData(...)\n\n\nclass ZEDPlugin(SensorPlugin):\n    def name(self): return 'zed'\n    def process(self, raw_data):\n        # ZED-specific processing\n        return ProcessedData(...)\n\n\n# Core pipeline never changes when you add sensors\nclass PerceptionPipeline:\n    def __init__(self):\n        self._plugins: dict[str, SensorPlugin] = {}\n\n    def register_sensor(self, plugin: SensorPlugin):\n        \"\"\"Extend the pipeline without modifying it\"\"\"\n        self._plugins[plugin.name()] = plugin\n\n    def process(self, sensor_name: str, data):\n        if sensor_name not in self._plugins:\n            raise ValueError(f\"Unknown sensor: {sensor_name}\")\n        return self._plugins[sensor_name].process(data)\n\n\n# Adding OAK-D = add a file, register 
at startup. Zero changes to core.\nclass OAKDPlugin(SensorPlugin):\n    def name(self): return 'oakd'\n    def process(self, raw_data):\n        return ProcessedData(...)\n\npipeline = PerceptionPipeline()\npipeline.register_sensor(RealSensePlugin())\npipeline.register_sensor(OAKDPlugin())  # No core code changed\n```\n\n---\n\n## Principle 4: Interface Segregation — Small, Focused Interfaces\n\nDon't force modules to depend on interfaces they don't use. Many small interfaces beat one large one.\n\n**Why it matters in robotics**: A simple 1-DOF gripper shouldn't implement a 6-DOF dexterous hand interface. A fixed camera shouldn't implement pan-tilt methods.\n\n```python\n# ❌ BAD: Fat interface — every camera must implement ALL of these\nclass CameraInterface(ABC):\n    @abstractmethod\n    def capture_rgb(self) -> np.ndarray: ...\n    @abstractmethod\n    def capture_depth(self) -> np.ndarray: ...\n    @abstractmethod\n    def capture_pointcloud(self) -> np.ndarray: ...\n    @abstractmethod\n    def set_exposure(self, value: float): ...\n    @abstractmethod\n    def set_pan_tilt(self, pan: float, tilt: float): ...\n    @abstractmethod\n    def stream_video(self) -> Iterator[np.ndarray]: ...\n    # A simple USB webcam can't do half of these!\n\n# ✅ GOOD: Segregated interfaces — implement only what you support\nclass RGBCamera(ABC):\n    \"\"\"Any camera that produces RGB images\"\"\"\n    @abstractmethod\n    def capture_rgb(self) -> np.ndarray: ...\n\n    @property\n    @abstractmethod\n    def resolution(self) -> tuple: ...\n\nclass DepthCamera(ABC):\n    \"\"\"Cameras that also produce depth\"\"\"\n    @abstractmethod\n    def capture_depth(self) -> np.ndarray: ...\n\n    @abstractmethod\n    def get_depth_intrinsics(self) -> DepthIntrinsics: ...\n\nclass ControllableCamera(ABC):\n    \"\"\"Cameras with adjustable settings\"\"\"\n    @abstractmethod\n    def set_exposure(self, value: float): ...\n\n    @abstractmethod\n    def set_white_balance(self, value: 
float): ...\n\nclass PTZCamera(ABC):\n    \"\"\"Pan-tilt-zoom cameras\"\"\"\n    @abstractmethod\n    def set_pan_tilt(self, pan: float, tilt: float): ...\n\n    @abstractmethod\n    def set_zoom(self, level: float): ...\n\n\n# A RealSense implements RGB + Depth, but not PTZ\nclass RealSenseD435(RGBCamera, DepthCamera, ControllableCamera):\n    def capture_rgb(self): ...\n    def capture_depth(self): ...\n    def set_exposure(self, value): ...\n    def set_white_balance(self, value): ...\n    # No PTZ methods — it's not a PTZ camera!\n\n# A webcam implements only RGB\nclass USBWebcam(RGBCamera):\n    def capture_rgb(self): ...\n    # Nothing else required\n\n# Perception code that only needs RGB doesn't pull in depth dependencies\nclass ObjectDetector:\n    def __init__(self, camera: RGBCamera, model):  # Only needs RGB\n        self.camera = camera\n        self.model = model\n\n    def detect(self) -> List[Detection]:\n        image = self.camera.capture_rgb()\n        return self.model.predict(image)\n```\n\n---\n\n## Principle 5: Liskov Substitution — Replaceable Implementations\n\nAny implementation of an interface must be substitutable without the caller knowing. 
If your code works with `ArmInterface`, it must work with ANY arm that implements it.\n\n**Why it matters in robotics**: Sim-to-real transfer, hardware swaps, and multi-robot support all depend on this.\n\n```python\n# ❌ BAD: Violates substitution — caller must know the implementation\nclass FrankaArm(ArmInterface):\n    def move_to_joints(self, positions, velocity=0.5):\n        if len(positions) != 7:\n            raise ValueError(\"Franka has 7 joints!\")  # Franka-specific\n        # ...\n\nclass UR5Arm(ArmInterface):\n    def move_to_joints(self, positions, velocity=0.5):\n        if len(positions) != 6:\n            raise ValueError(\"UR5 has 6 joints!\")  # UR5-specific\n        # ...\n\n# Caller must know which arm it's using to pass correct joint count!\n# This breaks substitutability.\n\n# ✅ GOOD: Self-describing implementations\nclass ArmInterface(ABC):\n    @property\n    @abstractmethod\n    def num_joints(self) -> int: ...\n\n    @property\n    @abstractmethod\n    def joint_limits(self) -> List[tuple]: ...\n\n    @abstractmethod\n    def move_to_joints(self, positions: np.ndarray, velocity: float = 0.5) -> bool:\n        \"\"\"Positions must have length == self.num_joints\"\"\"\n        ...\n\nclass FrankaArm(ArmInterface):\n    @property\n    def num_joints(self): return 7\n\n    def move_to_joints(self, positions, velocity=0.5):\n        assert len(positions) == self.num_joints\n        # ...\n\n# Caller code is generic — works with any arm\ndef move_to_home(arm: ArmInterface):\n    home = np.zeros(arm.num_joints)  # Queries the arm, doesn't assume\n    arm.move_to_joints(home)\n```\n\n**Substitution test**: Take every line of caller code. Replace `UR5` with `Franka` with `SimArm`. Does it still work? If not, your abstraction leaks.\n\n---\n\n## Principle 6: Separation of Rates — Respect Timing Boundaries\n\nDifferent subsystems run at different rates. 
Never couple them.\n\n```\nComponent          Typical Rate     Criticality\n─────────────────────────────────────────────────\nSafety monitor     1000 Hz          HARD real-time\nJoint controller   500-1000 Hz      HARD real-time\nTrajectory exec    100-200 Hz       Firm real-time\nState estimation   50-200 Hz        Firm real-time\nPerception         10-30 Hz         Soft real-time\nPlanning           1-10 Hz          Best effort\nTask management    0.1-1 Hz         Best effort\nLogging            1-30 Hz          Best effort\nUI/Dashboard       1-10 Hz          Best effort\n```\n\n```python\n# ❌ BAD: Perception blocks the control loop\nclass Robot:\n    def control_loop(self):  # Must run at 100Hz = 10ms budget\n        image = self.camera.capture()           # 5ms\n        objects = self.detector.detect(image)    # 200ms ← BLOCKS!\n        pose = self.estimate_pose(objects)       # 2ms\n        cmd = self.controller.compute(pose)      # 0.1ms\n        self.arm.send_command(cmd)               # 0.5ms\n        # Total: 207ms. 
Control runs at 5Hz instead of 100Hz!\n\n# ✅ GOOD: Decoupled rates with async boundaries\nimport threading\n\nclass Robot:\n    def __init__(self):\n        self.running = True\n        self.latest_detections = []\n        self.detection_lock = threading.Lock()\n\n        # Perception runs in its own thread at its own rate\n        self.perception_thread = threading.Thread(\n            target=self._perception_loop, daemon=True)\n        self.perception_thread.start()\n\n    def _perception_loop(self):\n        \"\"\"Runs at ~10Hz — as fast as the detector allows\"\"\"\n        while self.running:\n            image = self.camera.capture()\n            detections = self.detector.detect(image)\n            with self.detection_lock:\n                self.latest_detections = detections\n\n    def control_loop(self):\n        \"\"\"Runs at 100Hz — NEVER blocked by perception\"\"\"\n        rate = Rate(100)  # 10ms period (Rate: a rospy.Rate-style fixed-period sleeper)\n        while self.running:\n            with self.detection_lock:\n                detections = self.latest_detections  # Latest available\n\n            pose = self.estimate_pose(detections)\n            cmd = self.controller.compute(pose)\n            self.arm.send_command(cmd)\n            rate.sleep()\n```\n\n**Rule**: If subsystem A is slower than subsystem B, A must communicate to B via a buffer (topic, shared variable, queue) — never by direct call.\n\n---\n\n## Principle 7: Fail-Safe Defaults — Safe Until Proven Otherwise\n\nEvery module should default to the safest possible behavior. 
Safety is not a feature you add — it's the default you degrade from.\n\n```python\n# ❌ BAD: Unsafe defaults\nclass ArmController:\n    def __init__(self):\n        self.max_velocity = 3.14       # Full speed by default!\n        self.collision_check = False    # Off by default!\n        self.workspace_limits = None    # No limits by default!\n\n# ✅ GOOD: Safe defaults — must explicitly opt into danger\nclass ArmController:\n    def __init__(self):\n        self.max_velocity = 0.1              # Crawl speed by default\n        self.collision_check = True           # Always on\n        self.workspace_limits = DEFAULT_SAFE_WORKSPACE  # Conservative box\n        self.require_enable = True            # Must be explicitly enabled\n        self._enabled = False\n        self.is_simulation = False            # Real hardware unless proven otherwise\n\n    def enable(self, operator_confirmed: bool = False):\n        \"\"\"Explicit enable step — requires operator confirmation for real hardware\"\"\"\n        if not operator_confirmed and not self.is_simulation:\n            raise SafetyError(\n                \"Real hardware requires operator confirmation to enable\")\n        self._enabled = True\n\n    def move_to(self, target: np.ndarray, velocity: float = None):\n        if not self._enabled:\n            raise SafetyError(\"Controller not enabled\")\n\n        # Default to the safe maximum, then clamp.\n        # (`velocity or self.max_velocity` would silently turn a requested\n        # velocity of 0.0 into full speed: exactly the unsafe default this\n        # principle warns against.)\n        if velocity is None:\n            velocity = self.max_velocity\n        velocity = min(velocity, self.max_velocity)\n\n        # Check workspace limits BEFORE moving\n        if not self.workspace_limits.contains(target):\n            raise WorkspaceViolation(f\"Target {target} outside safe workspace\")\n\n        # Check for collisions BEFORE moving\n        if self.collision_check:\n            if self.collision_detector.would_collide(target):\n                raise CollisionRisk(f\"Collision predicted for target {target}\")\n\n        return self._execute_move(target, velocity)\n```\n\n**The rule**: What happens when a module receives no input, invalid input, or loses 
communication? It should stop safely, not continue blindly.\n\n```python\nclass SafetyDefaults:\n    \"\"\"Centralized safe defaults for the entire system\"\"\"\n\n    # Communication loss → stop\n    HEARTBEAT_TIMEOUT_MS = 500\n    ACTION_ON_TIMEOUT = 'stop'           # Not 'continue_last_command'\n\n    # Unknown state → stop\n    ACTION_ON_UNKNOWN_STATE = 'stop'     # Not 'assume_safe'\n\n    # Sensor failure → stop\n    ACTION_ON_SENSOR_LOSS = 'stop'       # Not 'use_last_reading'\n\n    # Joint limit approach → slow down\n    JOINT_LIMIT_MARGIN_RAD = 0.05        # Stop 0.05 rad before limit\n    VELOCITY_NEAR_LIMITS = 0.05          # Crawl near limits\n\n    # Default workspace (conservative bounding box)\n    WORKSPACE_MIN = np.array([-0.5, -0.5, 0.0])   # meters\n    WORKSPACE_MAX = np.array([0.5, 0.5, 0.8])      # meters\n```\n\n---\n\n## Principle 8: Configuration Over Code — Externalize Everything That Changes\n\nAnything that might differ between deployments, robots, or environments should be in configuration, not code.\n\n```python\n# ❌ BAD: Hardcoded values scattered across files\nclass GraspPlanner:\n    def plan(self, object_pose):\n        approach_height = 0.15          # Magic number\n        grasp_depth = 0.02              # Magic number\n        if object_pose.z < 0.05:        # Magic number\n            return None\n\n# ✅ GOOD: Configuration-driven\n# config/grasp_planner.yaml\n# grasp_planner:\n#   approach_height_m: 0.15\n#   grasp_depth_m: 0.02\n#   min_object_height_m: 0.05\n#   max_grasp_width_m: 0.08\n#   approach_velocity: 0.1\n#   grasp_force_n: 10.0\n\nimport yaml\nfrom dataclasses import dataclass\n\n@dataclass\nclass GraspConfig:\n    approach_height_m: float = 0.15\n    grasp_depth_m: float = 0.02\n    min_object_height_m: float = 0.05\n    max_grasp_width_m: float = 0.08\n    approach_velocity: float = 0.1\n    grasp_force_n: float = 10.0\n\n    @classmethod\n    def from_yaml(cls, path: str) -> 'GraspConfig':\n        with open(path) as f:\n            data = yaml.safe_load(f)\n  
      return cls(**data.get('grasp_planner', {}))\n\n    def validate(self):\n        assert self.approach_height_m > 0, \"Approach height must be positive\"\n        assert 0 < self.grasp_force_n < 100, \"Force out of safe range\"\n\n\nclass GraspPlanner:\n    def __init__(self, config: GraspConfig):\n        config.validate()\n        self.config = config\n\n    def plan(self, object_pose):\n        if object_pose.z < self.config.min_object_height_m:\n            return None\n        # ...\n```\n\n**What goes in config**: robot IP addresses, joint limits, sensor parameters, safety thresholds, workspace boundaries, task-specific constants, file paths, feature flags.\n\n**What stays in code**: algorithms, control logic, data structures, interface definitions, error handling.\n\n---\n\n## Principle 9: Idempotent Operations — Safe to Retry\n\nEvery command should be safe to send twice. Network drops, message duplicates, and retries are facts of life in robotics.\n\n```python\n# ❌ BAD: Non-idempotent — sending twice moves the robot twice as far\ndef move_relative(self, delta: np.ndarray):\n    current = self.get_position()\n    self.move_to(current + delta)\n    # If this message is sent twice due to a retry,\n    # the robot moves 2x the intended distance!\n\n# ✅ GOOD: Idempotent — sending twice has the same effect as once\ndef move_to_absolute(self, target: np.ndarray, command_id: str):\n    if command_id == self._last_executed_command:\n        return  # Already executed this command, skip\n    self._last_executed_command = command_id\n    self.move_to(target)\n    # Sending this twice is harmless — same target, same result\n\n# ✅ GOOD: Idempotent gripper commands\ndef set_gripper(self, width: float):\n    \"\"\"Set gripper to absolute width — not open/close toggle\"\"\"\n    self.gripper.move_to_width(width)\n    # Calling set_gripper(0.04) ten times still results in 0.04m width\n```\n\n---\n\n## Principle 10: Observe Everything — You Can't Debug What You Can't 
See\n\nEvery module should emit structured telemetry. When a robot behaves unexpectedly at 2 AM, logs are all you have.\n\n```python\nimport time\nfrom dataclasses import dataclass, asdict\n\nimport structlog\n\nlogger = structlog.get_logger()\n\n@dataclass\nclass PerceptionEvent:\n    timestamp: float\n    num_detections: int\n    processing_time_ms: float\n    frame_id: str\n    detector_confidence: float\n\nclass PerceptionModule:\n    def process(self, image):  # image: a stamped frame message (.header, .data)\n        t_start = time.monotonic()\n        detections = self.detector.detect(image)\n        t_elapsed = (time.monotonic() - t_start) * 1000\n\n        # Structured logging — machine-parseable\n        event = PerceptionEvent(\n            timestamp=time.time(),\n            num_detections=len(detections),\n            processing_time_ms=t_elapsed,\n            frame_id=image.header.frame_id,\n            detector_confidence=max(\n                (d.confidence for d in detections), default=0.0),\n        )\n        logger.info(\"perception.processed\", **asdict(event))\n\n        # Performance warnings\n        if t_elapsed > 100:\n            logger.warning(\"perception.slow\",\n                processing_time_ms=t_elapsed,\n                threshold_ms=100)\n\n        # Anomaly detection\n        if len(detections) == 0 and self._expected_objects > 0:\n            logger.warning(\"perception.no_detections\",\n                expected=self._expected_objects,\n                image_mean=float(image.data.mean()))\n\n        return detections\n```\n\n**What to log**: state transitions, command executions, safety events, performance metrics, sensor health, error conditions, configuration changes.\n\n**How to log**: structured key-value pairs (not printf strings), with timestamps, severity levels, and module identifiers.\n\n---\n\n## Principle 11: Composability — Build Complex Behaviors from Simple Ones\n\nDesign modules as composable building blocks. 
Complex robot behaviors should emerge from combining simple, tested primitives.\n\n```python\n# Primitive skills — simple, tested, reusable\nclass MoveTo(Skill):\n    \"\"\"Move end-effector to a target pose\"\"\"\n    def execute(self, target: Pose) -> bool: ...\n\nclass Grasp(Skill):\n    \"\"\"Close gripper with force control\"\"\"\n    def execute(self, force: float = 10.0) -> bool: ...\n\nclass Release(Skill):\n    \"\"\"Open gripper\"\"\"\n    def execute(self) -> bool: ...\n\nclass LookAt(Skill):\n    \"\"\"Point camera at a target\"\"\"\n    def execute(self, target: Point) -> bool: ...\n\nclass Detect(Skill):\n    \"\"\"Detect objects of a given class\"\"\"\n    def execute(self, target_class: str) -> List[Detection]: ...\n\n\n# Composite skills — built from primitives\nclass Pick(CompositeSkill):\n    \"\"\"Pick = Detect + MoveTo + Grasp\"\"\"\n    def __init__(self, detect: Detect, move: MoveTo, grasp: Grasp):\n        self.detect = detect\n        self.move = move\n        self.grasp = grasp\n\n    def execute(self, object_class: str) -> bool:\n        detections = self.detect.execute(object_class)\n        if not detections:\n            return False\n        approach = compute_approach_pose(detections[0].pose)\n        if not self.move.execute(approach):\n            return False\n        if not self.move.execute(detections[0].pose):\n            return False\n        return self.grasp.execute()\n\n\nclass Place(CompositeSkill):\n    \"\"\"Place = MoveTo + Release\"\"\"\n    def __init__(self, move: MoveTo, release: Release):\n        self.move = move\n        self.release = release\n\n    def execute(self, target: Pose) -> bool:\n        if not self.move.execute(target):\n            return False\n        return self.release.execute()\n\n\nclass PickAndPlace(CompositeSkill):\n    \"\"\"PickAndPlace = Pick + Place — composed from compositions\"\"\"\n    def __init__(self, pick: Pick, place: Place):\n        self.pick = pick\n        self.place = 
place\n\n    def execute(self, object_class: str, target: Pose) -> bool:\n        if not self.pick.execute(object_class):\n            return False\n        return self.place.execute(target)\n\n\n# Dependency injection wires everything together at startup\ndef build_skill_library(arm, gripper, camera, detector):\n    move = MoveTo(arm)\n    grasp = Grasp(gripper)\n    release = Release(gripper)\n    look = LookAt(arm)\n    detect = Detect(camera, detector)\n    pick = Pick(detect, move, grasp)\n    place = Place(move, release)\n    pick_and_place = PickAndPlace(pick, place)\n    return {\n        'move': move, 'grasp': grasp, 'release': release, 'look': look,\n        'pick': pick, 'place': place,\n        'pick_and_place': pick_and_place,\n    }\n```\n\n---\n\n## Principle 12: Graceful Degradation — Work With What You Have\n\nWhen components fail, the robot should degrade gracefully rather than stop entirely.\n\n```python\nclass DegradedModeManager:\n    \"\"\"Manages capability degradation as components fail\"\"\"\n\n    def __init__(self):\n        # Ordered best-first: _update_mode picks the first satisfiable mode\n        self.capabilities = {\n            'full_autonomy': {'requires': ['camera', 'lidar', 'arm', 'gripper']},\n            'blind_manipulation': {'requires': ['arm', 'gripper']},\n            'perception_only': {'requires': ['camera', 'lidar']},\n            'safe_stop': {'requires': []},\n        }\n        self.active_components = set()\n        self.current_mode = 'safe_stop'  # Safe until components report in\n\n    def component_online(self, name: str):\n        self.active_components.add(name)\n        self._update_mode()\n\n    def component_offline(self, name: str):\n        self.active_components.discard(name)\n        logger.warning(f\"Component offline: {name}\")\n        self._update_mode()\n\n    def _update_mode(self):\n        \"\"\"Find the best mode we can support with available components\"\"\"\n        for mode, spec in self.capabilities.items():\n            if set(spec['requires']).issubset(self.active_components):\n                if mode != self.current_mode:\n                    
logger.info(f\"Mode change: {self.current_mode} → {mode}\")\n                    self.current_mode = mode\n                return\n        self.current_mode = 'safe_stop'\n        self._execute_safe_stop()\n```\n\n---\n\n## Quick Reference: Principle Checklist\n\nUse this during code reviews:\n\n| # | Principle | Check |\n|---|-----------|-------|\n| 1 | Single Responsibility | Can you describe the module without \"and\"? |\n| 2 | Dependency Inversion | Does high-level code import hardware drivers? |\n| 3 | Open-Closed | Does adding a new sensor require modifying existing code? |\n| 4 | Interface Segregation | Are implementations forced to stub out unused methods? |\n| 5 | Liskov Substitution | Can you swap sim/real without changing caller code? |\n| 6 | Separation of Rates | Does perception block the control loop? |\n| 7 | Fail-Safe Defaults | What happens on communication loss? |\n| 8 | Configuration Over Code | Are there magic numbers in the source? |\n| 9 | Idempotent Operations | Is it safe to send every command twice? |\n| 10 | Observe Everything | Can you diagnose a 2 AM failure from logs alone? |\n| 11 | Composability | Can you build new tasks from existing skills? |\n| 12 | Graceful Degradation | What's the robot's behavior when a sensor fails? 
|","tags":["robotics","software","principles","agent","skills","arpitg1304","agent-skills","ai-coding-assistant","claude-skills"],"capabilities":["skill","source-arpitg1304","skill-robotics-software-principles","topic-agent-skills","topic-ai-coding-assistant","topic-claude-skills","topic-robotics"],"categories":["robotics-agent-skills"],"synonyms":[],"warnings":[],"endpointUrl":"https://skills.sh/arpitg1304/robotics-agent-skills/robotics-software-principles","protocol":"skill","transport":"skills-sh","auth":{"type":"none","details":{"cli":"npx skills add arpitg1304/robotics-agent-skills","source_repo":"https://github.com/arpitg1304/robotics-agent-skills","install_from":"skills.sh"}},"qualityScore":"0.544","qualityRationale":"deterministic score 0.54 from registry signals: · indexed on github topic:agent-skills · 189 github stars · SKILL.md body (29,681 chars)","verified":false,"liveness":"unknown","lastLivenessCheck":null,"agentReviews":{"count":0,"score_avg":null,"cost_usd_avg":null,"success_rate":null,"latency_p50_ms":null,"narrative_summary":null,"summary_updated_at":null},"enrichmentModel":"deterministic:skill-github:v1","enrichmentVersion":1,"enrichedAt":"2026-05-02T18:54:21.054Z","embedding":null,"createdAt":"2026-04-18T22:05:36.783Z","updatedAt":"2026-05-02T18:54:21.054Z","lastSeenAt":"2026-05-02T18:54:21.054Z","prices":[{"id":"bac96802-161e-4707-be73-837932cb6b15","listingId":"bf389d32-0618-4946-bfe2-ea505477c3f1","amountUsd":"0","unit":"free","nativeCurrency":null,"nativeAmount":null,"chain":null,"payTo":null,"paymentMethod":"skill-free","isPrimary":true,"details":{"org":"arpitg1304","category":"robotics-agent-skills","install_from":"skills.sh"},"createdAt":"2026-04-18T22:05:36.783Z"}],"sources":[{"listingId":"bf389d32-0618-4946-bfe2-ea505477c3f1","source":"github","sourceId":"arpitg1304/robotics-agent-skills/robotics-software-principles","sourceUrl":"https://github.com/arpitg1304/robotics-agent-skills/tree/main/skills/robotics-software-principles","isPrimary":false,"firstSeenAt":"2026-04-18T22:05:36.783Z","lastSeenAt":"2026-05-02T18:54:21.054Z"}],"details":{"listingId":"bf389d32-0618-4946-bfe2-ea505477c3f1","quickStartSnippet":null,"exampleRequest":null,"exampleResponse":null,"schema":null,"openapiUrl":null,"agentsTxtUrl":null,"citations":[],"useCases":[],"bestFor":[],"notFor":[],"kindDetails":{"org"
:"arpitg1304","slug":"robotics-software-principles","github":{"repo":"arpitg1304/robotics-agent-skills","stars":189,"topics":["agent-skills","ai-coding-assistant","claude-skills","robotics"],"license":"apache-2.0","html_url":"https://github.com/arpitg1304/robotics-agent-skills","pushed_at":"2026-03-25T03:44:12Z","description":"Agent skills that make AI coding assistants write production-grade robotics software. ROS1, ROS2, design patterns, SOLID principles, and testing — for Claude Code, Cursor, Copilot, and any SKILL.md-compatible agent.","skill_md_sha":"7b5e1057a8339038fade0b3b9dbb27f026bfe81e","skill_md_path":"skills/robotics-software-principles/SKILL.md","default_branch":"main","skill_tree_url":"https://github.com/arpitg1304/robotics-agent-skills/tree/main/skills/robotics-software-principles"},"layout":"multi","source":"github","category":"robotics-agent-skills","frontmatter":{"name":"robotics-software-principles","description":"Foundational software design principles applied specifically to robotics module development. Use this skill when designing robot software modules, structuring codebases, making architecture decisions, reviewing robotics code, or building reusable robotics libraries. Trigger whenever the user mentions SOLID principles for robots, modular robotics software, clean architecture for robots, dependency injection in robotics, interface design for hardware, real-time design constraints, error handling strategies for robots, configuration management, separation of concerns in perception-planning-control, composability of robot behaviors, or any discussion of software craftsmanship in a robotics context. Also trigger for code reviews of robotics code, refactoring robot software, or designing APIs for robotics libraries."},"skills_sh_url":"https://skills.sh/arpitg1304/robotics-agent-skills/robotics-software-principles"},"updatedAt":"2026-05-02T18:54:21.054Z"}}