OpenMV Promises “Flyby” Imaging Of Components For Pick And Place Project

[iforce2d] has an interesting video exploring whether the OpenMV H7 board is viable as a flyby camera for pick and place, able to quickly snap a shot of a moving part instead of requiring the part to be held still in front of the camera. The answer seems to be yes!

The OpenMV camera module handles image capture, blob detection, LCD output, and more.

The H7 is OpenMV’s most recent device, and it supports a variety of useful add-ons, such as the global shutter camera sensor [iforce2d] is using here. OpenMV has some absolutely fantastic hardware: it can snap the image, do blob detection (and other image processing), display the result on a small LCD, and send all the relevant data over the UART while accepting commands on what to look for, all in one neat package.
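To give a feel for how compact that workflow is on the camera side, here is a minimal MicroPython sketch in the style of the stock OpenMV examples. It is an illustration only: the grayscale threshold, UART number, and message format are assumptions, not [iforce2d]’s actual code.

```python
# Illustrative OpenMV flyby sketch: find bright blobs and stream the results
# over UART. Thresholds and message format are placeholders.
import sensor, time
from pyb import UART

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)     # the global shutter module is monochrome
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

uart = UART(3, 115200)                     # UART 3 sits on P4/P5 on OpenMV Cam boards
clock = time.clock()

while True:
    clock.tick()
    img = sensor.snapshot()
    # Bright blobs: a ring-lit component against a dark background
    for blob in img.find_blobs([(200, 255)], pixels_threshold=50, merge=True):
        img.draw_rectangle(blob.rect())    # debug overlays for the frame buffer view
        img.draw_cross(blob.cx(), blob.cy())
        # Centroid and rotation (radians) go to the pick and place controller
        uart.write("%d,%d,%.2f\n" % (blob.cx(), blob.cy(), blob.rotation()))
    print(clock.fps())
```

Everything runs on the camera board itself; the pick and place controller only ever sees the short coordinate strings coming out of the UART.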

It used to be that global shutter cameras were pretty specialized pieces of equipment, but they’re much more common now. There’s even a Raspberry Pi global shutter camera module, and it’s just so much nicer for machine vision applications.

Watch the test setup as [iforce2d] demonstrates and explains an early proof of concept. The metal fixture on the motor swings over the camera’s lens with a ring light for even illumination, and despite the moving object, the H7 gets an awfully nice image. Check it out in the video, embedded below.


Shopping Cart Does The Tedious Work For You

Thanks to modern microcontrollers, basic home automation jobs such as turning lights on and off, opening blinds, and various other simple tasks have become common DIY projects. But with the advent of artificial intelligence and machine learning, the number of tasks that can be offloaded to computers has skyrocketed. This shopping cart that automates away the checkout lines at grocery stores certainly fits into this category.

The project was inspired by the cashierless Amazon stores where customers simply walk into a store, grab what they want, and leave. This is made possible by computers that monitor their purchases and charge them automatically, but creator [kutluhan_aktar] wanted to explore a way of doing this without a fleet of sensors and cameras all over a store. By mounting the hardware to a shopping cart instead, he has the sensors travel with the shopper and monitor what’s placed in the cart rather than what’s taken from a shelf. It’s built around the OpenMV Cam H7, a microcontroller paired with a camera specifically designed for these kinds of tasks, and the custom circuitry inside the case also includes WiFi connectivity to make sure the shopping cart can report its findings properly.

[kutluhan_aktar] also built the entire software stack from the ground up and trained the model on a set of common products as a proof-of-concept. The idea was to allow smaller stores to operate more efficiently without needing a full suite of Amazon hardware and software backing it up, and this prototype seems to work pretty well to that end. If you want to develop a machine vision project on your own with more common hardware, take a look at this project which uses the Raspberry Pi instead.
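As a rough idea of what the on-cart recognition loop can look like, here is a hedged sketch using the tf module that older OpenMV firmware ships for running TensorFlow Lite models. The model file, label list, and confidence threshold are placeholders; [kutluhan_aktar]’s actual software stack is his own.

```python
# Illustrative only: a generic "classify what just entered the frame" loop
# on an OpenMV Cam. Model, labels, and threshold are made up.
import sensor, time, tf

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

labels = ["water", "chips", "chocolate", "soda"]   # hypothetical product classes
net = tf.load("trained.tflite", load_to_fb=True)   # hypothetical model on the SD card

clock = time.clock()
while True:
    clock.tick()
    img = sensor.snapshot()
    for obj in net.classify(img):
        scores = obj.output()
        best = max(range(len(scores)), key=lambda i: scores[i])
        if scores[best] > 0.8:                     # only act on confident detections
            # In the real cart this is where the item would be logged and
            # reported over the WiFi link for checkout.
            print("Added to cart:", labels[best], scores[best])
    print(clock.fps())
```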

The OpenMV board inside a security camera shell on the left, an AprilTag on a smartphone’s screen on the right

Use AprilTags To Let Guests Open Your Front Gate

[Herb Peyerl] is part of a robotics team, and in his robotics endeavours he learned about AprilTags: small, QR-code-like printable patterns that are easily recognizable by even primitive machine vision. Later on, when thinking about good ways to let his guests through his property’s front gate, AprilTags turned out to be a wonderful solution. Now all he needs to do is send his guest a picture of the appropriate AprilTag, which they can present to the camera at his front gate using their smartphone.

He used an OpenMV board for this – thanks to its wide variety of available libraries, the AprilTag recognition is already baked in, and the entire script is merely a hundred lines of MicroPython. An old surveillance camera gave up its dome-shaped housing, and now the OpenMV board is doing guest access duty on a post in front of his property’s front gate. He’s shared the code with us, and says he’s personally running a slightly modified version for security reasons — not that a random burglar is likely to stumble upon this post anyway. Besides, it looks like the gate would be easy enough for a burglar to simply jump over without any need for a security bypass, and the convenience benefits of this hack are undeniable.
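The heart of such a script really is just a few calls. The sketch below is a compressed, hypothetical version of the idea, not [Herb Peyerl]’s code: it watches for a single tag ID and pulses a GPIO that would drive the gate opener, with the tag family, allowed ID, and pin assignment all made up for illustration.

```python
# Hypothetical AprilTag gate opener: open on one specific tag ID.
import sensor, image
from pyb import Pin, delay

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)      # find_apriltags wants small frames to fit in RAM
sensor.skip_frames(time=2000)

gate = Pin("P0", Pin.OUT_PP)            # assumed: a relay on P0 triggers the gate opener
ALLOWED_ID = 42                         # assumed: the tag ID texted to the guest

while True:
    img = sensor.snapshot()
    for tag in img.find_apriltags(families=image.TAG36H11):
        if tag.id() == ALLOWED_ID:
            gate.high()
            delay(500)                  # half-second pulse to the opener
            gate.low()
```

Rotating the allowed ID (or checking a time window) is the obvious place to add the extra security the real script has.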

In the unlikely event a burglar is reading this, however, don’t be sad. We do happen to have a bunch of hacks for you, too. There are far less secure systems out there, from building RFID keyfobs to gated community access control systems; sometimes all you need is a 12 V battery. If you’re not into burglary, that’s okay too — we’ve covered other guest access hacks before, for instance this ESP8266-powered one.

The Tiniest Computer Vision Platform Just Got Better

The future, if you believe the ad copy, is a world filled with cameras backed by intelligence, neural nets, and computer vision. Despite the hype, this may actually turn out to be true: drones are getting intelligent cameras, self-driving cars are loaded with them, and in any event it makes a great toy.

That’s what makes this Kickstarter so exciting. It’s a camera module, yes, but there are also some smarts behind it. The OpenMV is a MicroPython-powered machine vision camera that gives your project the power of computer vision without the need to haul a laptop or GPU along for the ride.

The OpenMV actually got its start as a Hackaday Prize entry focused on one simple idea. There are cheap camera modules everywhere, so why not attach a processor to that camera that allows for on-board image processing? The first version of the OpenMV could do face detection at 25 fps, color detection at more than 30 fps, and became the basis for hundreds of different robots loaded up with computer vision.
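That first-generation face detection boils down to a Haar cascade that ships with the firmware. Here is a short sketch modeled on the stock OpenMV face-detection example (frame size and threshold chosen for illustration, not taken from the original Prize entry):

```python
# Face detection with the frontalface Haar cascade built into OpenMV firmware.
import sensor, time, image

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)    # Haar cascades run on grayscale frames
sensor.set_framesize(sensor.QQVGA)        # small frames keep the frame rate up
sensor.skip_frames(time=2000)

face_cascade = image.HaarCascade("frontalface", stages=25)

clock = time.clock()
while True:
    clock.tick()
    img = sensor.snapshot()
    for face in img.find_features(face_cascade, threshold=0.75):
        img.draw_rectangle(face)          # box every detected face
    print(clock.fps())                    # roughly the frame rates quoted above
```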

This crowdfunding campaign is financing the latest version of the OpenMV camera, and there are a lot of changes. The camera module is now removable, meaning the OpenMV now supports global shutter and thermal vision in addition to the usual color/rolling shutter sensor. Since this camera has a faster microcontroller, this latest version can support multi-blob color tracking at 80 fps. With the addition of a FLIR Lepton sensor, this camera does thermal sensing, and thanks to a new library, the OpenMV also does number detection with the help of neural networks.

We’ve seen a lot of builds using the OpenMV camera, and it’s getting to the point where you can’t compete in an autonomous car race without this hardware. This new version has all the bells and whistles, making it one of the best ways we’ve seen to add computer vision to any hardware project.

The Story Of Kickstarting The OpenMV

Robots are the ‘it’ thing right now, computer vision is a hot topic, and microcontrollers have never been faster. These facts lead inexorably to the OpenMV, an embedded computer vision module that bills itself as the ‘Arduino of Machine Vision.’

The original OpenMV was an entry for the first Hackaday Prize, and since then the project has had a lot of success. There are tons of followers, plenty of users, and the project even had a successful Kickstarter. That last bit of info is fairly contentious — while the Kickstarter did meet the minimum funding level, there were a lot of problems bringing this very cool product to market. Issues with suppliers and community management were the biggest problems, but the team behind OpenMV eventually pulled it off.

At the 2016 Hackaday SuperConference, Kwabena Agyeman, one of the project leads for the OpenMV, told the story about bringing the OpenMV to market:


Hackaday Links: December 25th, 2016

You should be watching the Doctor Who Christmas special right now. Does anyone know when the Restaurant at the End of the Universe spinoff is airing?

We have a contest going on right now. It’s the 1 kB Challenge, a contest that challenges you to do the most with a kilobyte of machine code. The deadline is January 5th, so get cracking.

A few years ago, [Kwabena] created the OpenMV, a Python-powered machine vision module that doesn’t require a separate computer. It’s awesome, and we’re going to have his talk from the Hackaday SuperConference up shortly. Now the OpenMV is getting an upgrade. The upgrades include an ARM Cortex-M7, more RAM, and more heap, all for less money. Here’s a link to preorder.

There ain’t no demoscene party like an Amtrak demoscene party because an Amtrak demoscene party lasts ten hours.

E-paper displays are fancy, cool, and low-power. Putting them in a project, however, is difficult. You need to acquire these display modules, and this has usually been a pain. Now E Ink has a web shop where you can peruse and purchase e-paper display modules and drivers.

[Kris] built a pair of STM32L4 dev boards that are easily programmed in the Arduino IDE. Now he’s putting these boards up on Kickstarter. The prices are reasonable – $15 for the smaller of the pair, and $25 for the bigger one. Remember, kids: ARM is the future, at least until RISC-V takes over.

This is how you do holiday greeting cards.

Didn’t get what you wanted for Christmas? Don’t worry, Amazon still has A Million Random Digits with 100,000 Normal Deviates in stock. It’s also available on audible dot com. Sometimes we don’t have time to sit down and read a million random digits, but with audible dot com you can listen to a million random digits in audiobook format. That’s audible dot com. Please give us money.

This is the last Hackaday Links post of the year, which means it’s time for one of our most cherished traditions: reviewing our readership in North Korea.

It’s been a banner year for Hackaday in the Democratic People’s Republic of North Korea. The readership has exploded in 2016, with a gain of nearly 300%. To put that in perspective, in 2015 we had thirty-six views from North Korea across every page on Hackaday. In 2016, that number increased to one hundred and forty.

That’s a phenomenal increase and a yearly growth that is unheard of in the publishing industry. We’d like to tip our hat to all our North Korean readers, and we’re looking forward to serving you in 2017.

Hacklet 114 – Python Powered Projects

Python is one of today’s most popular programming languages. It quite literally put the “Pi” in Raspberry Pi. Python’s history stretches back to the late 1980s, when it was first written by Guido van Rossum. [van Rossum] created Python as a hobby project over the 1989 Christmas holiday. He wanted a language that would appeal to Unix/C hackers. I’d say he was pretty successful in that endeavor. Hackers embraced Python, making it a top choice in their projects. This week’s Hacklet focuses on some of the best Python-powered projects on Hackaday.io.

We start with [Jithin] and Python Powered Scientific Instrumentation tool, his entry in the 2015 Hackaday Prize. [Jithin] has created an “electronics lab in a box” style tool that can compete with commercial products carrying price tags in the thousands. Python Powered Scientific Instrumentation tool uses simple microcontroller-powered hardware to create programmable gain amplifiers, waveform generators, LCR meters, constant current (CC) sources, and more. The microcontroller handles all the real-time operations, while data processing happens on a connected PC running Python scripts. Popular Python libraries like SciPy make signal processing and waveform displays easy.
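As a generic taste of that PC-side division of labor (this is not the project’s own code), a few lines of NumPy/SciPy are enough to turn a block of streamed ADC samples into a frequency measurement:

```python
# Toy example: pull the dominant frequency out of a captured waveform.
# The sample rate and the fake sine input are assumptions for illustration.
import numpy as np
from scipy.signal import periodogram

SAMPLE_RATE = 256_000          # assumed ADC sample rate, in Hz

def dominant_frequency(samples):
    """Return the strongest frequency component of a captured waveform."""
    freqs, power = periodogram(samples, fs=SAMPLE_RATE)
    return freqs[np.argmax(power[1:]) + 1]        # skip the DC bin

# Fake a 1 kHz sine to exercise the function; in the real tool the samples
# would arrive from the instrument over the serial link.
t = np.arange(2048) / SAMPLE_RATE
print(dominant_frequency(np.sin(2 * np.pi * 1000 * t)))   # 1000.0
```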


Next up is [Bill Peterson] with jamPi. [Bill] loves his music keyboard, but hates having to lug around a laptop, audio interface, and all the associated cables. He needed a device as flexible as a PC-based synthesizer, but as simple and compact as a MIDI sound module. JamPi does all this and more. [Bill] is using FluidSynth to generate sound. The control and interface software is handled in Python with the help of the fluidsynth.py module. All this functionality is wrapped up in a simple box with a two-line character LCD. Now [Bill] is ready to jam anytime, anywhere.
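For a sense of how little glue code that takes, here is a hedged sketch of driving FluidSynth through the pyFluidSynth bindings; the SoundFont path, audio driver, and hard-coded notes are assumptions rather than anything from jamPi itself.

```python
# Minimal FluidSynth-from-Python example (pyFluidSynth bindings).
import time
import fluidsynth

fs = fluidsynth.Synth()
fs.start(driver="alsa")            # assumed: audio out through the Pi's ALSA device

# Any General MIDI SoundFont works; this path is just a common Debian default.
sfid = fs.sfload("/usr/share/sounds/sf2/FluidR3_GM.sf2")
fs.program_select(0, sfid, 0, 0)   # channel 0, bank 0, preset 0 (acoustic piano)

# Play a C major chord for a second; in jamPi the note on/off events would
# come from the attached MIDI keyboard rather than being hard-coded.
for note in (60, 64, 67):
    fs.noteon(0, note, 100)
time.sleep(1.0)
for note in (60, 64, 67):
    fs.noteoff(0, note)

fs.delete()
```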

Next is [i.abdalkader] with OpenMV, his entry in the 2014 Hackaday Prize. [i.abdalkader’s] goal was to create “the Arduino of machine vision”. He’s well on his way to accomplishing that. In 2015, OpenMV had a successful Kickstarter campaign. After a few manufacturing glitches, customers are now receiving their devices. OpenMV is a low-cost Python-powered machine vision device. An ARM microcontroller coupled to a simple image sensor makes up the core of the device. The camera is programmed in MicroPython, with the help of many image processing libraries created by the OpenMV team. [i.abdalkader] even created his own IDE using Glade and PyGTK.

Finally we have [osannolik] with Calibration and Measurement Tool. Have you ever wanted to display a few debug parameters from your embedded project, but didn’t have the display real estate (or any display at all)? What about changing a parameter without pulling out your JTAG setup and firing up your debugger? [osannolik] has created a simple Python-powered, PC-based front end that can be used as a Swiss army knife for developing embedded systems. Variables can be displayed in real time and parameters changed on the fly. Even graphs are available, thanks to pyqtgraph.
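The live-plotting side of such a front end can be surprisingly small. Below is a toy pyqtgraph example that stands in for the real tool: the serial parsing and parameter editing are left out, and the data source is just random numbers.

```python
# Toy live plot: a scrolling curve updated by a Qt timer (recent pyqtgraph assumed).
import numpy as np
import pyqtgraph as pg
from pyqtgraph.Qt import QtCore

app = pg.mkQApp()
plot = pg.plot(title="debug_variable")     # one window, one scrolling curve
curve = plot.plot(pen="y")
data = np.zeros(500)

def update():
    # In the real tool a fresh sample would arrive from the target over serial;
    # here we just push random numbers through the same plumbing.
    global data
    data = np.roll(data, -1)
    data[-1] = np.random.normal()
    curve.setData(data)

timer = QtCore.QTimer()
timer.timeout.connect(update)
timer.start(20)                            # redraw every 20 ms

if __name__ == "__main__":
    pg.exec()                              # requires pyqtgraph 0.12 or newer
```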

If you want more Python-powered goodness, check out our new Python-powered project list! Did I miss your project? Don’t be shy, just drop me a message on Hackaday.io. That’s it for this week’s Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!