JeVois is a small, open-source, smart machine vision camera that was funded on Kickstarter in early 2017. I backed it because cameras that embed machine vision elements are steadily growing more capable, and JeVois boasts an impressive range of features. It runs embedded Linux and can process video at high frame rates using OpenCV algorithms. It can run standalone, or as a USB camera streaming raw or pre-processed video to a host computer for further action. In either case it can communicate with (and be controlled by) other devices via a serial port.
But none of that is what really struck me about the camera when I received my unit. What really stood out was the demo mode. The team behind JeVois nailed an effective demo mode for a complex device. That didn’t happen by accident, and the results are worth sharing.
The Importance of a Good Demo
When it comes to complex systems, a good demo mode is essentially an elevator pitch for the unit’s capabilities. To a user, it answers “what can this do, and what possibilities does it open for me?”
The JeVois camera’s demo mode succeeded in this by doing several things:
- Make the demo self-contained and easy to start. Require a minimum of parts or setup from the user to get started. After putting the system image onto the included SD card, I only needed to plug it into my laptop and start a camera viewer.
- Make it interactive. Respond to user input immediately, and show the processes at work as much as possible.
- Keep it simple. The demo isn’t the device’s one and only opportunity to explain everything! Leave the user free to focus on absorbing what is being shown; avoid bogging the user down with figuring out an interface or troubleshooting issues.
Demo mode on hardware is frequently an afterthought, if it exists at all, but it deserves attention and polish if for no other reason than that it is the one element of a product that virtually every user is certain to engage with.
Setup and Demo of JeVois
I had to copy a system image to the micro SD card to ensure I had the latest version of the JeVois software, but after that all I needed to do was plug it into a USB port (where my computer recognized it as a USB camera device) and open a webcam viewer. Once that was done, the demo started automatically. The hardware streams a looping “tour” video blended with live camera images, with a number of vision processes running in parallel. To the host OS, it’s just a video stream from a USB camera like any other.
Not every element is flawless; the handwritten number recognition in particular is hit or miss, although the demo acknowledges this. The demo video below was recorded in a white room with sunlight, but things still look on the dim side. Finally, the camera and lens clearly aren’t intended to be changed or adjusted. I’d have preferred a module with an M12 (S-mount) lens, but the camera on JeVois at least makes for a very small package. Regardless, the demo succeeds in easily and concisely showing off what’s possible.
The default camera system image comes with a variety of bundled machine vision modules and demos, each anchored to a specific camera setting. For example, one of these modules recognizes ArUco markers; setting the camera to “YUV 640×500” activates that module immediately. A recording of this mode is below.
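Because a module is selected purely by the video format the host requests, a host-side script can switch modes programmatically. The sketch below uses OpenCV’s `VideoCapture` to request a resolution; note that the module-to-resolution table here is illustrative (the authoritative mapping lives in the camera’s `videomappings.cfg` and varies by firmware), and the device index is an assumption.

```python
# Illustrative mapping from JeVois module name to the video mode that
# activates it. These entries are examples only; the real table is defined
# in the camera's videomappings.cfg and may differ by firmware version.
MODULE_MODES = {
    "DemoArUco": (640, 500),     # resolution from the article's ArUco demo
    "DemoSaliency": (640, 300),  # hypothetical entry, for illustration
}

def mode_for(module):
    """Return the (width, height) a host should request to select a module."""
    return MODULE_MODES[module]

def open_module(module, device=0):
    """Open the JeVois as a UVC camera at the resolution that selects `module`.

    Requires OpenCV and an attached camera, so the import is kept out of the
    importable top level.
    """
    import cv2  # hardware-dependent; imported lazily
    w, h = mode_for(module)
    cap = cv2.VideoCapture(device)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, w)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, h)
    return cap

if __name__ == "__main__":
    cap = open_module("DemoArUco")
    ok, frame = cap.read()
    print("got frame:", ok)
```

Any viewer or capture library that lets you pick the stream resolution works the same way; the key idea is that the resolution request is the mode switch.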
Details on detected markers are sent over the serial port. In this way an Arduino or other microcontroller can interface with, and even control, the camera. There are many other bundled modes in the default image, and of course users are free to develop their own.
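As a sketch of what host-side handling of those serial messages might look like, here is a small parser for a hypothetical one-detection-per-line message of the form `N2 U<id> <x> <y> <w> <h>`. The field layout is an assumption for illustration; consult the JeVois serial-message documentation for the actual formats your firmware emits.

```python
from collections import namedtuple

Marker = namedtuple("Marker", "id x y w h")

def parse_marker_line(line):
    """Parse a hypothetical JeVois-style 2D detection message.

    Assumed format (illustrative): 'N2 U<id> <x> <y> <w> <h>', e.g.
    'N2 U42 -10 20 30 30'. Returns a Marker, or None for any other traffic.
    """
    parts = line.strip().split()
    if len(parts) != 6 or parts[0] != "N2":
        return None
    ident = parts[1]
    x, y, w, h = (int(p) for p in parts[2:])
    return Marker(ident, x, y, w, h)

if __name__ == "__main__":
    # On real hardware the lines would come from the serial port, e.g. via
    # pyserial: serial.Serial("/dev/ttyACM0", 115200).readline()
    print(parse_marker_line("N2 U42 -10 20 30 30"))
```

A microcontroller-side version would do the same split-and-convert on each received line before acting on the marker position.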
Other Embedded Machine Vision Cameras
The CMUcam5 “Pixy” (also funded on Kickstarter), which shipped in 2014, had a clever demo mode. Out of the box, it had the ability to recognize colors that were held in front of it, and even had headers for hobby servos to be used in a pan/tilt configuration. This allowed the camera to be trained to track a color, then move to follow that color, all without involving a computer. It was also possible to view raw or processed video, but that required connecting to a laptop or desktop by USB and running a program unique to the Pixy in order to access video and configuration. The ability to “see what the camera sees” was a great feature that made setup and troubleshooting much easier, even if it was limited.
The OpenMV project is an open source, Python-enabled machine vision module that was a semifinalist for the Hackaday Prize in 2014, and has grown since then. OpenMV has just started shipping its newest M7 cameras, which can even emulate a Pixy in UART mode for compatibility. We’d love to see a good demo to show off its capabilities.
Are there any other new offerings in the realm of embedded machine vision cameras, or particularly good demos? Let us know in the comments. And speaking of the Hackaday Prize, don’t forget that 2017’s iteration is currently underway.