Many people find being around insects uncomfortable. That's understandable, and it's only likely to get worse as technology gives these multi-legged critters augmented bodies to roam around with. [tech_support], for one, welcomes our new arthropod overlords, and has even built them a sweet new ride to get around in.
The build follows the usual hallmarks of a self-balancing bot, with a couple of interesting twists. There are twin brushed motors for drive, and an Arduino Uno running the show. Instead of the more pedestrian IMUs typically seen, however, this rig employs the Bosch BNO055 Absolute Orientation Sensor. This combines a magnetometer, gyroscope, and accelerometer on a single die, and handles all the complicated sensor fusion maths onboard. That allows it to output simpler, more readily usable orientation data.
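The project's firmware isn't published in detail, but because the BNO055 hands back a fused pitch angle directly, the balancing logic boils down to a plain PID loop on that angle. Here's a minimal sketch of that idea; the class name, gains, and units are our own illustrative guesses, not taken from the build:

```python
class BalancePID:
    """Toy PID controller for a self-balancer. The fused pitch angle
    (degrees) comes straight from the IMU; the output is a signed
    motor command. Gains here are placeholders, not tuned values."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, pitch_deg, setpoint_deg=0.0):
        # Error is how far we've tipped from the upright setpoint
        error = setpoint_deg - pitch_deg
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Proportional + integral + derivative terms drive the motors
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

On the real robot the same loop would run at a fixed rate on the Arduino, with the output clamped and fed to the brushed motor driver as a PWM duty cycle.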
The real party piece is even more interesting, however. Rather than radio control or a line-following algorithm, this self-balancer gets its very own insect pilot. The insect is placed in a small chamber where ultrasonic sensors determine its position, and it steers the bot simply by moving around inside the chamber. The team has even developed a variety of code tweaks to dial in the sensor system for different types of insect.
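The mapping from insect position to robot motion is simple in principle: the ultrasonic readings locate the insect relative to the chamber's centre, and that offset becomes a forward/turn command. A minimal sketch of one plausible mapping, with every name, dimension, and threshold being our own assumption rather than the project's actual code:

```python
def insect_to_drive(dist_front_cm, dist_left_cm,
                    chamber_len_cm=10.0, chamber_wid_cm=10.0,
                    deadband=0.15):
    """Map ultrasonic distances to the insect (from the chamber's front
    and left walls) onto forward-speed and turn commands in [-1, 1].
    An insect at the chamber centre means stop; offsets steer the bot.
    Dimensions and deadband are illustrative placeholders."""
    def clamp(x):
        return max(-1.0, min(1.0, x))

    # Normalise position to [-1, 1] about the chamber centre
    fwd = 1.0 - 2.0 * dist_front_cm / chamber_len_cm
    turn = 1.0 - 2.0 * dist_left_cm / chamber_wid_cm

    # Deadband so a briefly stationary insect doesn't jitter the robot
    fwd = 0.0 if abs(fwd) < deadband else clamp(fwd)
    turn = 0.0 if abs(turn) < deadband else clamp(turn)
    return fwd, turn
```

Per-species "dialling in" would then amount to adjusting the chamber dimensions, deadband, and scaling for insects of different sizes and walking speeds.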
It’s not the first time we’ve seen insects augmented with robotic hardware, and we doubt it will be the last. If you’re working on a mad science project of your own, drop us a line. Video after the break.
Continue reading “Augmented Arthropod Gets A Self-Balancing Ride”
Some plants react quickly enough for our senses to notice, such as a Venus flytrap or Mimosa pudica. Most of the time, though, we need time-lapse photography at a minimum to notice anything, while more exotic sensors can measure things like microscopic pores opening and closing. As with any sensor reading, those measurements can be turned into action through a little trick we call automation. [Harpreet Sareen] and [Pattie Maes] at MIT brought these two ideas together in a way we haven't seen before: a plant takes the driver's seat in a project called Elowan. Details are sparse, but the concept is easy enough to grasp.
We are not sure if this qualifies as a full-fledged cyborg or if this is a case of a robot using biological sensors. Maybe it all depends on which angle you present this mixture of plant and machine from. Perhaps it truly is the symbiotic relationship the project claims it to be. The robot would not receive any instructions without the plant, and the plant would receive sub-optimal light without the robot. What other ways could plants be integrated into robotics to make a bona fide cyborg?
Continue reading “Cyborg, Or Leafy Sensor Array?”
You step out of the audience onto a stage, and a hypnotist hands you a potato chip. The chip is salty and crunchy and you are convinced the chip is genuine. Now, replace the ordinary potato chip with a low-sodium version and replace the hypnotist with an Arduino. [Nimesha Ranasinghe] at the University of Maine’s Multisensory Interactive Media Lab wants to trick people into eating food with less salt by telling our tongues that we taste more salt than the recipe calls for with the help of electrical pulses controlled by everyone’s (least) favorite microcontroller.
Eating Cheetos with chopsticks is a famous lifehack, but eating unsalted popcorn could join the list if these chopsticks take hold and people want to reduce their blood pressure. Salt is a flavor enhancer, so in a way, this approach can supplement any savory dish.
Smelling is another popular machine hack in the kitchen, and naturally, touch is popular beyond phone screens. You have probably heard some good audio hacks here, and we are always seeing fascinating stuff with video.