
Hackaday Prize Entry: Detecting Adulterated Food Using AI

Adulterated food is food that has a substance added to it to save on manufacturing costs. The added substance can be harmful, it can reduce the food’s potency, or it can have no effect at all; in many cases the adulteration is done illegally. It’s also a widespread problem, one which [G. Vignesh] has decided to take on as his entry for the 2017 Hackaday Prize, an AI Based Adulteration Detector.

On his hackaday.io Project Details page he outlines some existing methods for testing food, some of which you can do at home: adulterated sugar may have chalk added to it, so put it in water and the sugar will dissolve while the chalk will not. His approach is instead to take high-definition photos of the food and, on a Raspberry Pi, apply filters to them to reveal various properties such as density, size, color, and texture. He also mentions doing image analysis using a deep learning neural network. This is a problem that touches us all, and we’ll be watching the project with interest.
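
We can imagine the filtering stage looking something like the sketch below, which uses OpenCV to pull color, texture, and particle-size numbers out of a sample photo. To be clear, the filter choices, filename, and thresholds here are our own guesses for illustration, not [G. Vignesh]’s actual pipeline.

```python
import cv2
import numpy as np

# Load a photo of the food sample; "sample.jpg" is a placeholder name
img = cv2.imread("sample.jpg")

# Color: mean and spread per channel in HSV space
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
color_mean, color_std = cv2.meanStdDev(hsv)

# Texture: variance of the Laplacian as a rough surface-detail measure
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
texture = cv2.Laplacian(gray, cv2.CV_64F).var()

# Particle size: Otsu threshold, then measure contour areas
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
areas = [cv2.contourArea(c) for c in contours]

print("color mean:", color_mean.ravel())
print("texture score:", texture)
print("median particle area:", np.median(areas) if areas else 0)
```

Feature vectors like these could then be handed off to the deep learning classifier the project mentions.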

If all this talk of adulterated food makes you nervous about your food supply, then consider growing your own, hacker style. One such project we’ve seen here on Hackaday is FarmBot, an open-source CNC farming robot. Another is MIT’s OpenAg Food Computer, a robotically controlled and monitored growing chamber.

The ‘All-Seeing Pi’ Aids Low-Vision Adventurer

Adventure travel can be pretty grueling, what with the exotic locations and potential for disaster that the typical tourist destinations don’t offer. One might find oneself dangling over a cliff for that near-death-experience selfie or ziplining through a rainforest canopy. All this is significantly complicated by being blind, of course, so a tool like this Raspberry Pi low-vision system would be a welcome addition to the nearly-blind adventurer’s well-worn rucksack.

[Dan] has had vision problems since childhood, but one look at his YouTube channel shows that he doesn’t let that slow him down. When [Dan] met [Ben] in Scotland, [Ben] noticed that [Dan] was using his smartphone as a vision aid, holding the display up close and zooming in to get as much detail as possible from his remaining vision. [Ben] thought he could help, so he whipped up a heads-up display from a Raspberry Pi and a Pi Camera. Mounted to a 3D-printed frame holding a 5″ HDMI display and worn on a GoPro head mount, the camera provides enough detail to help [Dan] navigate, as seen in the video below.
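
[Ben]’s software isn’t published, but the camera-to-display path is simple enough to sketch. Assuming the legacy picamera library, something like this puts a digitally zoomed live view on the 5″ panel; the resolution and zoom factor are our own illustrative choices, not [Ben]’s.

```python
from picamera import PiCamera

# Match the preview to a 5" 800x480 HDMI panel
camera = PiCamera(resolution=(800, 480))

# The fullscreen preview renders straight to the display, no X required
camera.start_preview()

# Digital zoom: show the center quarter of the frame (roughly 2x),
# mimicking the pinch-to-zoom [Dan] was doing on his phone
camera.zoom = (0.25, 0.25, 0.5, 0.5)

input("Press Enter to stop...")
camera.stop_preview()
```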

The rig is a bit unwieldy right now, but as proof of concept (and proof of friendship), it’s a solid start. We think a slimmer profile design might help, in which case [Ben] might want to look into this Google Glass-like display for a multimeter for inspiration on version 2.0.

Continue reading “The ‘All-Seeing Pi’ Aids Low-Vision Adventurer”

This 3D Printed Microscope Bends For 50nm Precision

Exploiting the flexibility of plastic, a group of researchers has created a 3D printable microscope with sub-micron accuracy. By bending the supports of the microscope stage, they can manipulate a sample with surprising precision. Coupled with commonly available M3 bolts and stepper motors with gear reduction, they have reported a precision of up to 50nm in translational movement. We’ve seen functionality derived from flexibility before but not at this scale. And while it’s not a scanning electron microscope, 50nm is the size of a small virus (no, not that kind of virus).
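
Some back-of-the-envelope math shows why that 50nm figure is plausible. Assuming a standard 0.5 mm M3 thread pitch, a cheap geared stepper, and some extra reduction from the flexure’s lever geometry (the latter two numbers are our guesses, not measured values), the step size lands in the right neighborhood:

```python
# Rough step-size estimate for an M3-bolt-driven flexure stage.
# Gear ratio and lever reduction are illustrative assumptions.
PITCH_NM = 500_000        # M3 thread pitch: 0.5 mm per revolution
STEPS_PER_REV = 4096      # e.g. a 28BYJ-48 geared stepper, half-stepped
LEVER_REDUCTION = 2       # assumed extra reduction from the flexure geometry

nm_per_step = PITCH_NM / (STEPS_PER_REV * LEVER_REDUCTION)
print(f"{nm_per_step:.0f} nm per step")  # ~61 nm, in the right neighborhood
```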

OpenFlexure has a viewing area of 8×8×4 mm, which is impressive when the supports only flex 6°. But if 256 mm³ isn’t enough for you, fret not: the designs are all Open Source and are modeled in OpenSCAD, just begging for modification. With only one file for printing, no support material, a wonderful assembly guide, and a focus on PLA and ABS, OpenFlexure is clearly designed for ease of manufacturing. The optics are equally interesting. Using a Raspberry Pi Camera Module with the lens reversed, they achieve a resolution where one pixel corresponds to 120nm.
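
That pixel scale implies a usefully large field of view. As a quick sanity check, and assuming the 3280×2464 sensor of a Pi Camera Module v2 (the module version is our assumption):

```python
# Field of view implied by 120 nm per pixel, assuming the 3280x2464
# sensor of a Pi Camera Module v2 (the module version is our guess)
NM_PER_PIXEL = 120
WIDTH_PX, HEIGHT_PX = 3280, 2464

fov_w_um = WIDTH_PX * NM_PER_PIXEL / 1000
fov_h_um = HEIGHT_PX * NM_PER_PIXEL / 1000
print(f"field of view: {fov_w_um:.0f} x {fov_h_um:.0f} um")  # ~394 x 296 um
```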

The group hopes that their microscopes will reach low-resource parts of the world, and it seems that the design has already started to spread. If you’d like to make one for yourself, you can find all the necessary files up on GitHub.

Continue reading “This 3D Printed Microscope Bends For 50nm Precision”

Objectifier: Director Of Domestic Technology

[Bjørn Karmann]’s Objectifier is a device that lets you control domestic objects by allowing them to respond to unique actions or behaviour, using machine learning and computer vision. The Objectifier can turn on a table lamp when you open a book, and turn it off when you close the book. Switch on the coffee maker when you place the mug next to the pot, and switch it off when the mug is removed. Turn on the belt sander when you put on the safety glasses, and stop it when you remove the glasses. Charge the phone when you put a banana in front of it, and stop charging it when you place an apple in front of it. You get the drift — the possibilities are endless. Hopefully, sometime in the (near) future, we will be able to interact with inanimate objects in this fashion. We can get them to learn from our actions rather than us learning how to program them.

The device uses computer vision and a neural network to learn complex behaviours associated with your trigger commands. A training mode, using a phone app, allows you to train it for the On and Off actions. Some actions require more human effort in training it — such as detecting an open and closed book — but eventually, the neural network does a fairly good job.
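
For a feel of how such a training mode might work under the hood, here’s a toy stand-in in Python: collect labeled frames while the user demonstrates each state, then fit a simple classifier. [Bjørn]’s actual code isn’t published, so everything here, from the feature vector to the k-NN choice, is purely illustrative.

```python
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def features(frame):
    """Downsample a frame into a small grayscale feature vector."""
    small = cv2.resize(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (16, 16))
    return small.flatten().astype(np.float32) / 255.0

cap = cv2.VideoCapture(0)
X, y = [], []
for label in (1, 0):  # 1 = "on" demonstration, 0 = "off"
    input(f"Show the '{'on' if label else 'off'}' scene, then press Enter...")
    for _ in range(30):  # collect 30 frames per class
        ok, frame = cap.read()
        if ok:
            X.append(features(frame))
            y.append(label)

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# Inference: classify a live frame into on/off
ok, frame = cap.read()
if ok:
    print("switch on" if clf.predict([features(frame)])[0] else "switch off")
```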

The current version is the sixth prototype in the series, and [Bjørn] has put in quite a lot of work refining the project at each stage. In its latest avatar, the device hardware consists of a Pi Zero, a Raspberry Pi camera module, an SMPS power brick, a relay block to switch the output, a 230 V plug for input power, and a 230 V socket outlet for the final output. All the parts are put together rather neatly using laser-cut acrylic support pieces, and then further enclosed in a nice wooden enclosure.

On the software side, all of the machine learning is handled by Wekinator, a free, open-source tool for building musical instruments, gestural game controllers, and computer vision or computer listening systems using machine learning. The computer vision itself is handled via Processing. All the code is wrapped up in openFrameworks, with ml4a providing the apps for working with machine learning.

All of the above is what we could deduce from the pictures and information in his blog post. There isn’t much detail about the hardware, but the pictures tell us plenty. The software isn’t available yet, but maybe this could spur some of you hackers into action to build another version of the Objectifier. Check out the video after the break, showing humans teaching the Objectifier its tricks.
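
If you do take a crack at it, one plausible way to close the loop is to listen for Wekinator’s OSC output (by default it sends to port 12000 on the address /wek/outputs) and let that drive the relay. The GPIO pin and threshold below are our own illustrative assumptions, not details from [Bjørn]’s build.

```python
import RPi.GPIO as GPIO
from pythonosc import dispatcher, osc_server

RELAY_PIN = 17  # BCM pin driving the relay block; our choice, not [Bjørn]'s

GPIO.setmode(GPIO.BCM)
GPIO.setup(RELAY_PIN, GPIO.OUT)

def on_output(address, *values):
    # Treat Wekinator's first output value as the on/off decision
    GPIO.output(RELAY_PIN, GPIO.HIGH if values[0] > 0.5 else GPIO.LOW)

d = dispatcher.Dispatcher()
d.map("/wek/outputs", on_output)
osc_server.BlockingOSCUDPServer(("0.0.0.0", 12000), d).serve_forever()
```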

Continue reading “Objectifier: Director Of Domestic Technology”

Squirrel Café To Predict The Weather From Customer Data

Physicist and squirrel gastronomer [Carsten Dannat] is trying to correlate two critical socio-economic factors: how many summer days we have left, and when we will run out of nuts. His research project, the Squirrel Café, invites squirrels to grab some free nuts and collects interesting bits of customer data in return.

Continue reading “Squirrel Café To Predict The Weather From Customer Data”

Hackaday Prize Entry: An Internet Of Things Microscope

For their entry into the Citizen Scientist portion of the Hackaday Prize, the folks at Arch Reactor, the St. Louis hackerspace, are building a microscope. Not just any microscope: this one is low-cost, digital, and has surprisingly high magnification and pretty good optics. It’s the Internet of Things Microscope, and like all good Citizen Scientist apparatus, it’s a remarkable tool for classrooms and developing countries.

When you think of ‘classroom microscope’, you’re probably thinking about a pile of old optics sitting in the back of a storage closet. These microscopes are purely optical, without the ability to take digital pictures. The glass is good, but you’re not going to get a scanning stage when you’re dealing with 30-year-old gear made for a classroom full of sticky-handed eighth graders.

The Internet of Things Microscope includes a scanning stage that moves across the specimen on the X and Y axes, stitching digital images together to create a very large image. That’s a killer feature for a cheap digital microscope, and the folks at Arch Reactor are doing this with a few cheap stepper motors and stepper motor drivers.
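
A rough sketch of that scan-and-stitch loop might look like this, with OpenCV’s stitcher doing the heavy lifting. The move_stage() helper is hypothetical, standing in for whatever stepper driver code Arch Reactor actually uses, and the grid size and step counts are illustrative.

```python
import cv2

def move_stage(x_steps, y_steps):
    """Hypothetical helper: pulse the X/Y stepper drivers."""
    pass

cap = cv2.VideoCapture(0)
tiles = []
for gy in range(3):                      # 3x3 grid of overlapping tiles
    for gx in range(3):
        move_stage(gx * 200, gy * 200)   # step counts picked for ~30% overlap
        ok, frame = cap.read()
        if ok:
            tiles.append(frame)

# SCANS mode suits flat subjects imaged by a translating stage
stitcher = cv2.Stitcher.create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(tiles)
if status == 0:                          # 0 == Stitcher::OK
    cv2.imwrite("specimen_mosaic.jpg", mosaic)
```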

The rest of the electronics are built around a Raspberry Pi, a Raspberry Pi camera (which recently got a nice resolution upgrade), and some microscope eyepieces and objectives. Everything else is 3D printed, making this a very cheap and very accessible microscope with some killer features.

Raspberry Pi Zero Now With Camera Support, Still Only $5

The latest version (1.3) of everyone’s favorite $5 computer now sports a frequently requested feature: a camera connector. The Pi Zero will now use the same economical camera modules available for the full-sized Raspberry Pi units.

The price of the Pi Zero is unchanged at $5, but there is a small catch. While the Raspberry Pi camera modules themselves will work just fine on the Pi Zero, the usual camera cable they come with will not: the Pi Zero’s camera connector is a little smaller than the one on the full-sized Pi, so a special cable is needed to interface the camera modules to it.

This should be good news. The new connector has appeared because another production run is ramping up. Logic points to greater availability of the $5 wonder board, but we’re still not holding our breath.

Pi Zero with camera module connector cable. [Image source: Adafruit]
With the Pi Zero now able to use camera modules, perhaps camera-based Pi projects like these digital binoculars or time-lapse camera rigs can get even smaller.

[via Engadget]