Building a Robot on Basis 02 - Software
Kyle Franz
Nov 24, 2024

I’ve spent the past few weeks working on a small robot, both to give demos with and to exercise our code. This post is about the current software architecture for the robot.
- Part 01 - Hardware
- Part 02 - Software (You’re here!)
- Part 03 - tf2 support and LiDAR
After getting the hardware working, I moved on to the software. This required a few small changes to the core framework (mostly fixing CMake technicalities), but nothing crazy.
I can now move the robot around with a wireless controller! The left stick and bumpers control the wheels and the right stick controls the servos.
The Architecture
---
config:
  flowchart:
    nodeSpacing: 5
    subGraphTitleMargin:
      bottom: 10
    defaultRenderer: elk
  elk:
    mergeEdges: True
---
graph LR
%%{init: {"flowchart": {"defaultRenderer": "elk"}} }%%
subgraph unit_/freenove/rpi_freenove_mecanum_driver["/freenove/rpi_freenove_mecanum_driver"]
  handler_/freenove/rpi_freenove_mecanum_driver::Update[["Update() 100Hz"]]
end
subgraph unit_/freenove/rpi_freenove_servo_driver["/freenove/rpi_freenove_servo_driver"]
  handler_/freenove/rpi_freenove_servo_driver::Update[["Update() 100Hz"]]
end
subgraph unit_/freenove/rpi_libcamera_driver["/freenove/rpi_libcamera_driver"]
  handler_/freenove/rpi_libcamera_driver::OnCameraImage[["OnCameraImage() 30Hz"]]
end
subgraph unit_/freenove/joystick_driver["/freenove/joystick_driver"]
  handler_/freenove/joystick_driver::Tick[["Tick() 20Hz"]]
end
subgraph unit_/foxglove/foxglove["/foxglove/foxglove"]
  handler_/foxglove/foxglove:::hidden
end
handler_/freenove/rpi_freenove_servo_driver::Update::/servo/1/request_degrees:::hidden x--/servo/1/request_degrees--> handler_/freenove/rpi_freenove_servo_driver::Update
handler_/freenove/rpi_freenove_servo_driver::Update::/servo/0/request_degrees:::hidden x--/servo/0/request_degrees--> handler_/freenove/rpi_freenove_servo_driver::Update
handler_/freenove/joystick_driver::Tick --/user_inputs--> handler_/freenove/rpi_freenove_mecanum_driver::Update
handler_/freenove/joystick_driver::Tick --/user_inputs--> handler_/freenove/rpi_freenove_servo_driver::Update
handler_/freenove/rpi_libcamera_driver::OnCameraImage --/camera/rgb--x /camera/rgb::handler_/freenove/rpi_libcamera_driver::OnCameraImage:::hidden
handler_/freenove/rpi_freenove_servo_driver::Update --/servo/1/current_degrees--x /servo/1/current_degrees::handler_/freenove/rpi_freenove_servo_driver::Update:::hidden
handler_/freenove/rpi_freenove_servo_driver::Update --/servo/0/current_degrees--x /servo/0/current_degrees::handler_/freenove/rpi_freenove_servo_driver::Update:::hidden
handler_/freenove/rpi_freenove_mecanum_driver::Update --/motor_state--x /motor_state::handler_/freenove/rpi_freenove_mecanum_driver::Update:::hidden
This is a pretty straightforward architecture, for now. We run joystick input, letting it control both the servos the camera is mounted on and the wheels. Later, the joystick input to the wheels will instead feed some sort of planning stack.
This graph was generated with `basis launch --mermaid` - it does a dry run, outputting information about the launch in mermaid. This is really useful - I can copy/paste directly into a blog post or GitHub markdown document. The PR for this will be merged soon.
The code
rpi_libcamera_driver
This unit implements libcamera support. libcamera on Raspberry Pi does have a v4l2 interface, but I sadly wasn’t able to get it working. There’s nothing special about this code, you can see it here. It’s based off of the libcamera tutorial in their docs. The only odd point was that `libcamera::formats::RGB888` appears to be BGR - I didn’t bother to track down why, I instead just swapped to requesting BGR.
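For reference, a minimal sketch of requesting the swapped format through the libcamera API (this is not the unit’s actual code - the function name, stream role, and resolution are assumptions):

```cpp
#include <libcamera/formats.h>
#include <libcamera/libcamera.h>

#include <memory>

// Configure the first stream for BGR-ordered frames. Assumes the caller has
// already started a libcamera::CameraManager and acquired the camera.
std::unique_ptr<libcamera::CameraConfiguration> ConfigureBgrStream(
    const std::shared_ptr<libcamera::Camera>& camera) {
  auto config =
      camera->generateConfiguration({libcamera::StreamRole::Viewfinder});
  // RGB888 came out byte-swapped on the Pi, so request BGR888 to get the
  // byte order we actually want.
  config->at(0).pixelFormat = libcamera::formats::BGR888;
  config->at(0).size = {640, 480};
  config->validate();  // may adjust the request to something supported
  camera->configure(config.get());
  return config;
}
```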
Note: it’s not perfect that we run this unit at a fixed 30 Hz - ideally, this unit (and other driver-like units) would update freely on a thread and publish at will. This requires a change to Basis (`sync: type: external`), which will be made in the next month.
joystick_driver
Again, very straightforward. I initially implemented this using `ioctl` and then switched to `libevdev`. This really simplified controller access. See here.
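As an illustration, here’s roughly what draining a controller with libevdev looks like (a minimal sketch, not the driver’s actual code - the device path and axis choice are assumptions):

```cpp
#include <fcntl.h>
#include <libevdev/libevdev.h>
#include <unistd.h>

#include <cstdio>

int main() {
  // Non-blocking so we can drain pending events each tick and return.
  int fd = open("/dev/input/event0", O_RDONLY | O_NONBLOCK);
  libevdev* dev = nullptr;
  if (fd < 0 || libevdev_new_from_fd(fd, &dev) < 0) {
    return 1;
  }
  int rc = 0;
  do {
    input_event ev{};
    rc = libevdev_next_event(dev, LIBEVDEV_READ_FLAG_NORMAL, &ev);
    if (rc == LIBEVDEV_READ_STATUS_SUCCESS && ev.type == EV_ABS &&
        ev.code == ABS_X) {
      std::printf("left stick x: %d\n", ev.value);
    }
  } while (rc >= 0);  // -EAGAIN means no more events are queued
  libevdev_free(dev);
  close(fd);
  return 0;
}
```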
rpi_freenove_servo_driver
Finally - a little bit of complexity. This unit runs at 100 Hz, and picks up any inputs that were published since the last tick. We cache `/user_inputs` - it’s published at a lower rate (the joystick driver ticks at 20 Hz), and we’re okay with reusing the last joystick input for 5 ticks - nobody will notice the difference.
Doing things the hard way
The initial version of the servo driver looked something like this:
---
config:
  flowchart:
    nodeSpacing: 5
    subGraphTitleMargin:
      bottom: 10
    defaultRenderer: elk
  elk:
    mergeEdges: True
---
graph LR
%%{init: {"flowchart": {"defaultRenderer": "elk"}} }%%
subgraph unit_/freenove/rpi_freenove_servo_driver["/freenove/rpi_freenove_servo_driver"]
  handler_/freenove/rpi_freenove_servo_driver::OnInputs[["OnInputs()"]]
  handler_/freenove/rpi_freenove_servo_driver::RequestState0[["RequestState0()"]]
  handler_/freenove/rpi_freenove_servo_driver::RequestState1[["RequestState1()"]]
  handler_/freenove/rpi_freenove_servo_driver::Update[["Update() 100Hz"]]
end
handler_/freenove/rpi_freenove_servo_driver::RequestState1::/servo/1/request_degrees:::hidden x--/servo/1/request_degrees--> handler_/freenove/rpi_freenove_servo_driver::RequestState1
handler_/freenove/rpi_freenove_servo_driver::RequestState0::/servo/0/request_degrees:::hidden x--/servo/0/request_degrees--> handler_/freenove/rpi_freenove_servo_driver::RequestState0
handler_/freenove/rpi_freenove_servo_driver::OnInputs::/user_inputs:::hidden x--/user_inputs--> handler_/freenove/rpi_freenove_servo_driver::OnInputs
handler_/freenove/rpi_freenove_servo_driver::Update --/servo/1/current_degrees--x /servo/1/current_degrees::handler_/freenove/rpi_freenove_servo_driver::Update:::hidden
handler_/freenove/rpi_freenove_servo_driver::Update --/servo/0/current_degrees--x /servo/0/current_degrees::handler_/freenove/rpi_freenove_servo_driver::Update:::hidden
This required manually storing each input on the unit, and is less performant than letting the framework handle it.
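To make the contrast concrete, here’s a rough sketch of the bookkeeping that version implied (plain C++ for illustration - the message type and member names are made up, and this isn’t how a Basis handler is actually declared):

```cpp
#include <optional>

// Hypothetical message type standing in for the real /user_inputs message.
struct UserInputs {
  float right_stick_x = 0.0f;
  float right_stick_y = 0.0f;
};

class HardWayServoDriver {
 public:
  // Each subscription needs its own handler whose only job is to stash the
  // latest message...
  void OnInputs(const UserInputs& msg) { last_inputs_ = msg; }
  void RequestState0(float degrees) { requested_degrees_[0] = degrees; }
  void RequestState1(float degrees) { requested_degrees_[1] = degrees; }

  // ...and Update() has to re-read that stashed state every tick.
  void Update() {
    if (last_inputs_) {
      // apply the cached joystick input to requested_degrees_, then command
      // the servos
    }
  }

 private:
  std::optional<UserInputs> last_inputs_;
  float requested_degrees_[2] = {90.0f, 90.0f};
};
```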
Optional and Cached
With `optional` and `cached`, the code is straightforward. `optional` lets a handler run without the tagged input. `cached` keeps an input around for future executions of the handler. You can use it to work around differences in publish rates while still having a single handler. In this example we use it for input, but another use might be for something like loading a map and publishing it at a low rate. The less often messages need to be published, the better.
---
config:
  flowchart:
    nodeSpacing: 5
    subGraphTitleMargin:
      bottom: 10
    defaultRenderer: elk
  elk:
    mergeEdges: True
---
graph LR
%%{init: {"flowchart": {"defaultRenderer": "elk"}} }%%
subgraph unit_/freenove/rpi_freenove_servo_driver["/freenove/rpi_freenove_servo_driver"]
  handler_/freenove/rpi_freenove_servo_driver::Update[["Update() 100Hz"]]
end
handler_/freenove/rpi_freenove_servo_driver::Update::/user_inputs:::hidden x--/user_inputs--> handler_/freenove/rpi_freenove_servo_driver::Update
handler_/freenove/rpi_freenove_servo_driver::Update::/servo/1/request_degrees:::hidden x--/servo/1/request_degrees--> handler_/freenove/rpi_freenove_servo_driver::Update
handler_/freenove/rpi_freenove_servo_driver::Update::/servo/0/request_degrees:::hidden x--/servo/0/request_degrees--> handler_/freenove/rpi_freenove_servo_driver::Update
handler_/freenove/rpi_freenove_servo_driver::Update --/servo/1/current_degrees--x /servo/1/current_degrees::handler_/freenove/rpi_freenove_servo_driver::Update:::hidden
handler_/freenove/rpi_freenove_servo_driver::Update --/servo/0/current_degrees--x /servo/0/current_degrees::handler_/freenove/rpi_freenove_servo_driver::Update:::hidden
I’ll show the code here…
The declaration for our Unit is nothing special. We store the PCA9685 interface, along with the current and requested state for each servo.
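Roughly, the stored state boils down to something like this (illustrative only - the type and member names here are assumptions, not the actual Basis-generated unit class):

```cpp
#include <array>
#include <memory>

// Stand-in for the I2C interface to the PCA9685 PWM chip that drives the
// servos (the real class lives in the driver code).
class Pca9685;

// Per-servo bookkeeping: where the servo is and where we want it to be.
struct ServoState {
  float current_degrees = 90.0f;
  float requested_degrees = 90.0f;
};

class RpiFreenoveServoDriver {
  std::unique_ptr<Pca9685> pca9685_;  // PWM chip interface
  std::array<ServoState, 2> servos_;  // the two servos the camera is mounted on
};
```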
Notice - we don’t have to store any messages, we don’t have to write any synchronizer code. Very straightforward.
rpi_freenove_mecanum_driver
Same story as `rpi_freenove_servo_driver`. The only code of note is this:
Mecanum wheel control is really simple. This function takes in the x/y joystick input and the sum of the triggers (theta), and outputs the power to each motor to satisfy those inputs.
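A minimal sketch of that mixing, using the standard mecanum equations (the function name, normalization, and wheel ordering here are assumptions, not necessarily the driver’s exact code):

```cpp
#include <algorithm>
#include <array>
#include <cmath>

// x = strafe input, y = forward input, theta = rotation input, all in [-1, 1].
// Returns power for {front-left, front-right, rear-left, rear-right} wheels.
std::array<float, 4> MecanumMix(float x, float y, float theta) {
  // Classic mecanum kinematics: each wheel command is a signed sum of the
  // translation and rotation inputs.
  float front_left = y + x + theta;
  float front_right = y - x - theta;
  float rear_left = y - x + theta;
  float rear_right = y + x - theta;
  // Normalize so no wheel command exceeds full power.
  float max_mag =
      std::max({std::abs(front_left), std::abs(front_right),
                std::abs(rear_left), std::abs(rear_right), 1.0f});
  return {front_left / max_mag, front_right / max_mag, rear_left / max_mag,
          rear_right / max_mag};
}
```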
Final thoughts
This was pretty simple to do - helped of course by the availability of other libraries out there. I’m looking forward to getting LiDAR working - and then either SLAM (with an IMU?) or a simple planning+controls stack.