Sensors are critical in robotics. A robot relies on its sensor package to perform its programmed duties. If sensors are damaged or non-functional, the robot can perform unpredictably, or even fail entirely. [Dheera Venkatraman] has been working to make debugging sensor issues easier with the rosshow package for Robot Operating System.
Normally, if you want to be certain a camera feed is working on a robot, you’d have to connect a monitor and other peripherals, check manually, then put everything away again when you’re finished. [Dheera] considered this altogether too much of a pain for basic sensor checks.
Instead, rosshow uses the power of SSH to speed things along. Log in to the robot, fire off a few command line instructions, and rosshow will start displaying sensor data in the terminal on your remote machine. It’s achieved through the use of Unicode Braille art in the terminal. Sure, you won’t get a full-resolution feed from your high-definition camera, and the display from the laser scanner isn’t exactly perfect. But it’s enough to provide an instant verification that sensors are connected and working, and will speed up those routine is-it-connected checks by an order of magnitude.
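The Braille trick works because each Braille character encodes a 4×2 grid of dots, so a terminal line of Braille glyphs becomes a low-resolution bitmap. Here’s a minimal sketch of that rendering idea — the function name and structure are illustrative, not rosshow’s actual API:

```python
BRAILLE_BASE = 0x2800  # U+2800, the blank Braille pattern
# Bit assigned to each (row, col) position in a 4x2 pixel cell,
# following the Unicode Braille Patterns dot numbering.
DOT_BITS = {
    (0, 0): 0x01, (1, 0): 0x02, (2, 0): 0x04, (3, 0): 0x40,
    (0, 1): 0x08, (1, 1): 0x10, (2, 1): 0x20, (3, 1): 0x80,
}

def braille_render(pixels):
    """Render a 2-D array of 0/1 pixels as lines of Braille characters.

    Each output character packs a 4-row by 2-column block of pixels,
    which is how a terminal can show a rough image preview over SSH.
    """
    rows = len(pixels)
    cols = len(pixels[0]) if rows else 0
    lines = []
    for top in range(0, rows, 4):        # one text line per 4 pixel rows
        line = []
        for left in range(0, cols, 2):   # one character per 2 pixel columns
            bits = 0
            for (dr, dc), bit in DOT_BITS.items():
                r, c = top + dr, left + dc
                if r < rows and c < cols and pixels[r][c]:
                    bits |= bit
            line.append(chr(BRAILLE_BASE + bits))
        lines.append("".join(line))
    return "\n".join(lines)
```

A fully lit 4×2 cell comes out as ⣿, while a single top-left pixel is ⠁ — string enough of these together and you get a recognizable camera or laser-scan preview in plain text.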
Robot Operating System is particularly worth a look if you’re choosing the software platform for your next build. If you do put something together, be sure to let us know.
6 thoughts on “ROS Gets Quick Sensor Debugging In The Terminal”
rostopic hz /left_camera/image_color/compressed ?
This will give you the publishing frequency of the (compressed) topic, but not the contents themselves.
New to this topic, need more information
“Sensors are critical in robotics. A robot relies on its sensor package to perform its programmed duties. If sensors are damaged or non-functional, the robot can perform unpredictably, or even fail entirely. ”
Same with biological. Where’s our debug console?
I think the post is missing the point of it, mainly here:
“Normally, if you want to be certain a camera feed is working on a robot, normally you’d have to connect a monitor and other peripherals, check manually, then put everything away again when you’re finished”
Well, if you are connected through ssh you are probably in front of a monitor with your hands on a keyboard, so you wouldn’t need any other hardware. Since the system would be using ROS and your client computer would be on the same network, you could visualise data directly using rviz or image_view, subscribing to the camera, LiDAR, or whatever topic the robot publishes, if you have ROS on your computer.
The point of this tool, as I understand it, is that you don’t need to have ROS installed on the client to visually check the data; as a side effect, you can use OSs other than Unix-based ones. And if you do have ROS set up, you can avoid starting up rviz and probably needing to configure the view, which can be handy sometimes.
Anyway, good work!