Sunday, 22 March 2026

Update, 22 March: new automated updates

In an effort to keep track of changes in Robbie's code, we will post a summary of activity and modifications here.

March 2026: Swerve Drive Gets Smarter

The swerve drive looked right on paper but had an annoying judder in practice: wheels would oscillate back and forth when trying to hold a near-zero velocity. The root cause was a missing hysteresis band around the wheel flip decision. When a wheel needs to go, say, 91° from its current position, it can either rotate 91°, or rotate 89° the other way and reverse its drive direction; the two are mechanically equivalent. Without a proper dead-band, the controller would keep flip-flopping between the two options near the boundary. Adding a 45°/90° hysteresis band eliminated the oscillation.
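A minimal sketch of that flip decision with a Schmitt-trigger-style band (the 45°/90° thresholds come from the change above; the class and its state handling are ours, not the actual firmware):

```python
import math

FLIP_HI = math.radians(90)  # flip once the steering error grows past 90°
FLIP_LO = math.radians(45)  # only un-flip once the error shrinks below 45°

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

class FlipDecider:
    """Hysteresis ('45/90') band around the swerve wheel-flip choice."""
    def __init__(self):
        self.flipped = False

    def steer(self, target, current):
        # Error to the un-flipped steering target.
        err = abs(wrap(target - current))
        if not self.flipped and err > FLIP_HI:
            self.flipped = True    # shorter to flip 180° and reverse the drive
        elif self.flipped and err < FLIP_LO:
            self.flipped = False   # well inside the band: steer directly again
        if self.flipped:
            return wrap(target + math.pi), -1.0  # steering angle, drive sign
        return target, 1.0
```

Between 45° and 90° the controller simply keeps whichever choice it already made, which is exactly what kills the flip-flopping at the boundary.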

Two other quality-of-life improvements went in at the same time: a hardware steering limit (MAX_STEERING_RAD) that makes the wheel flip rather than clamp when it hits the physical range, and first-order velocity smoothing (MOTOR_RAMP_TAU) to take the edge off acceleration. The robot moves noticeably more smoothly as a result.

Tunable Firmware Over Serial

One thing worth highlighting: we added a maxrate parameter to the ESP32 steering firmware. Previously, adjusting the maximum stepper speed meant changing a constant, recompiling, and reflashing — a five-minute interruption that breaks your testing flow. Now you send a single serial command:

echo -e "S 800 200 1000\n" > /dev/esp

That sets max rate to 800 steps/s, stall detection timeout to 200ms, and watchdog to 1000ms. The setting is saved to NVS flash and survives a power cycle. To query the current config:

echo -e "q\n" > /dev/esp

This sounds like a small thing but makes a real difference when tuning — try a value, drive the robot, adjust, repeat without stopping anything.
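For scripted tuning sweeps, the same two commands can be built programmatically and written to /dev/esp (via pyserial or a plain file write). This is a sketch of the command grammar only; the helper names are ours:

```python
def set_steering_config(max_rate, stall_ms, watchdog_ms):
    """Build the 'S' command the ESP32 firmware parses:
    max step rate (steps/s), stall timeout (ms), watchdog timeout (ms)."""
    return f"S {max_rate} {stall_ms} {watchdog_ms}\n"

def query_config():
    """Build the query command; the firmware replies with the saved NVS values."""
    return "q\n"
```

With pyserial this would look something like `serial.Serial("/dev/esp").write(set_steering_config(800, 200, 1000).encode())`, though the baud rate and port settings are whatever your setup uses.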

Splitting the Arm's Hardware Interface

The SCARA arm had a different kind of conflict. The arm's vertical axis (joint 0) is a stepper motor driven by a separate ESP32-based controller, while all the other joints use Feetech/ST serial servos. Originally both were lumped into a single ros2_control hardware block, which isn't how ros2_control is designed to work — different hardware interfaces need separate blocks. Splitting them into _j0 (stepper, via stepper_ros2_driver) and _servos (Feetech, via feetech_ros2_driver) cleaned this up properly.

We also added a use_mock_hardware:=true xacro argument so the arm can be launched in simulation mode without any physical hardware attached — useful for testing MoveIt2 motion planning on the desk before trusting the real arm.
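Roughly what the split looks like in the xacro. This is a sketch only: the `_j0`/`_servos` split, the driver packages, and the `use_mock_hardware` argument are from the change above, but the plugin class names, joint names, and interfaces here are hypothetical (mock handling is shown for the first block only, for brevity):

```xml
<xacro:arg name="use_mock_hardware" default="false"/>

<!-- Vertical axis (joint 0): stepper on its own ESP32-based controller -->
<ros2_control name="arm_j0" type="system">
  <hardware>
    <xacro:if value="$(arg use_mock_hardware)">
      <plugin>mock_components/GenericSystem</plugin>
    </xacro:if>
    <xacro:unless value="$(arg use_mock_hardware)">
      <plugin>stepper_ros2_driver/StepperSystem</plugin>  <!-- hypothetical class name -->
    </xacro:unless>
  </hardware>
  <joint name="arm_joint_0">
    <command_interface name="position"/>
    <state_interface name="position"/>
  </joint>
</ros2_control>

<!-- Remaining joints: Feetech/ST serial servos -->
<ros2_control name="arm_servos" type="system">
  <hardware>
    <plugin>feetech_ros2_driver/FeetechSystem</plugin>  <!-- hypothetical class name -->
  </hardware>
  <joint name="arm_joint_1">
    <command_interface name="position"/>
    <state_interface name="position"/>
  </joint>
</ros2_control>
```

The point of the split is that each `<ros2_control>` block gets exactly one hardware plugin, which is how ros2_control expects mixed hardware to be described.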


March 2026: Teaching the Robot to See Low Obstacles

This is the big one from this week, and the one that required the most debugging.

The Problem: The Lidar Is Blind Below 390mm

Robbie's RPLidar is mounted at 390mm above the floor. Its beam is horizontal, so it simply cannot detect anything shorter than that — a 305mm drinks can sitting on the floor is completely invisible to it from a navigation standpoint. This isn't a configuration issue; it's physics.

The RealSense D435 at the front of the robot is mounted at 300mm and has a 57° vertical field of view, so it does see low obstacles clearly. The question was getting that data into the Nav2 costmap properly.
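A rough geometry check, assuming the camera's optical axis is horizontal at its 300mm mount height: the lower edge of the frustum points down at half the vertical FOV, so the floor (and anything sitting on it) enters view beyond roughly h / tan(vfov/2):

```python
import math

CAM_HEIGHT_M = 0.30   # D435 mount height from the post above
VFOV_DEG = 57.0       # vertical field of view

# Range at which the bottom edge of the view frustum reaches the floor.
ground_range = CAM_HEIGHT_M / math.tan(math.radians(VFOV_DEG / 2))
print(f"floor visible beyond ~{ground_range:.2f} m")
```

That puts the blind zone directly in front of the robot at only about half a metre, comfortably inside normal obstacle-avoidance distances; any tilt in the actual mount changes this figure.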

Ghost Obstacles: A 3D Raycasting Problem

We switched the local costmap from a 2D ObstacleLayer to a VoxelLayer, which can ingest 3D pointcloud data and mark obstacles at the correct height. The obstacle marking worked fine — slide a box in front of the camera and it appeared in the costmap immediately. The problem was clearing: when you removed the box, the mark stayed. Move someone's foot through the space and the footprint remained permanently. The robot would start navigating around nothing.

The cause was subtle. The VoxelLayer stores a 3D grid of voxels (cubes). We had it configured with 16 height slices of 5cm each, covering 0–80cm. The lidar clears free space by raycasting — sweeping a beam horizontally at 390mm. That beam only intersects voxel slice index 7 (0.35–0.40m). Obstacle points from the RealSense were marking slices 2 through 6 (0.10–0.30m, the height of a typical low obstacle). The lidar's clearing rays passed right over those slices and never touched them. Once marked, they stayed forever.
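The slice arithmetic that causes the mismatch is simple enough to check directly (illustrative only; the costmap's internal indexing code differs):

```python
VOXEL_SIZE = 0.05   # 16 slices of 5 cm covering 0–0.8 m

def slice_index(z):
    """Which 5 cm voxel slice a point at height z (metres) falls into."""
    return int(z / VOXEL_SIZE)

lidar_slice = slice_index(0.39)   # the horizontal lidar beam at 390 mm
# Sample RealSense returns off a low box, 10–30 cm up.
obstacle_slices = {slice_index(z) for z in (0.12, 0.18, 0.27)}
```

The clearing ray only ever visits slice 7; the marked slices below it are never revisited, so the marks persist.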

The fix is almost embarrassingly simple in hindsight: collapse all 16 slices into one single 80cm-tall voxel.

z_voxels: 1
z_resolution: 0.8   # one voxel covering 0–0.8m

Now the lidar ray at any height within 0–80cm intersects the single voxel and clears the whole column. The 3D height discrimination is gone, but for a local costmap whose only job is to answer "is there something in the way", that trade-off is fine.

The Wrong Topic Name

While digging through the config we found that the pointcloud topic in nav2_params.yaml was set to /front_depth_camera/depth/color/points but the RealSense launch file sets camera_name: front_camera, making the actual published topic /front_camera/depth/color/points. The costmap had been silently failing to subscribe to any pointcloud data at all — which explains why obstacles weren't appearing in the first place. A frustrating but instructive bug: always verify what topics are actually being published before assuming your configuration is working.


Fixing the Crab-Walking

With the costmap now working, a different problem became obvious: the robot was navigating like a crab.

On a route that required turning (go forward, turn left, continue), the robot cut diagonally across the corner instead of following the path and rotating to face the next leg. Because the RealSense only covers the forward 86°, a robot strafing sideways is effectively navigation-blind in its actual direction of travel: the voxel layer it had just been given was undermined whenever the robot pointed the camera away from where it was going.

This came down to two settings in the MPPI (Model Predictive Path Integral) controller:

  1. PathAngleCritic.max_angle_to_furthest: 1.0 — this critic only fires when the robot's heading is more than 1.0 radian (57°) off the path direction. So on a 45° diagonal approach the critic was completely silent, and the robot was free to strafe without penalty. Dropping this to 0.4 radians (23°) means the critic activates much earlier and pushes the robot to align with the path.
  2. PathAlignCritic.use_path_orientations: false — the path planner knows what direction the robot should be facing at every point along the path. With this set to false, MPPI was ignoring that information entirely — just trying to stay near the path line without caring about heading. Flipping it to true gives the controller the directional information it needs.

We also raised PathAngleCritic.cost_weight from 5.0 to 12.0 to make it the dominant force when the robot is heading the wrong way.
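Pulled together, the three changes as they would sit in nav2_params.yaml (the surrounding structure is abbreviated, and `FollowPath` is the conventional controller plugin name rather than one verified against our file):

```yaml
controller_server:
  ros__parameters:
    FollowPath:
      plugin: "nav2_mppi_controller::MPPIController"
      PathAngleCritic:
        cost_weight: 12.0              # was 5.0: dominant when heading is wrong
        max_angle_to_furthest: 0.4     # was 1.0 rad: now fires at ~23° off-path
      PathAlignCritic:
        use_path_orientations: true    # was false: respect planned headings
```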