Turning That Old Hoverboard Into A Learning Platform

[Isabelle Simova] is building Hoverbot, a flexible robotics platform made from Ikea plastic trays, JavaScript running on a Raspberry Pi, and parts scavenged from commonly available hoverboards.

Self-balancing scooters, a.k.a. hoverboards, are a great source of parts for such a project. Their high-torque, direct-drive brushless motors can haul loads of 100 kg or more, and you also get a matching motor controller board, a rechargeable battery, and its charging circuit. Most hoverboard controllers use the STM32F103, so flashing them with your own firmware is easy with an ST-Link V2 programmer.

The next items on the parts list are sensors. Some are cheap and easily available, such as microphones, contact switches, or LDRs, while others, such as ultrasonic distance sensors or LiDARs, can cost a lot more. One source of cheap sensors is aftermarket car parking-assist kits, which usually consist of four ultrasonic transducers, a control box, cables, and a display. Using a logic analyzer, [Isabelle] shows how you can poke around the output port of the control box to reverse engineer the data stream and decipher the sensor data. Once the data structure is decoded, some SPI bit-banging and voltage translation gets it talking to the Raspberry Pi. Using the Pi also makes it easy to add a cheap web camera, microphone, and speakers to the Hoverbot.
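If you want to try the same trick yourself, here is a minimal sketch of what the Pi side of that sniffing could look like in Node.js. It assumes the control box clocks data out over a two-wire serial link; the GPIO pins, bit order, and byte framing are all assumptions to replace with whatever your logic analyzer reveals.

```javascript
// Minimal sketch: sniff a clocked serial stream from a parking-sensor
// control box on two GPIO pins. Pin numbers, bit order, and the 8-bit
// framing are assumptions -- check your kit's protocol with a logic
// analyzer first. Needs the 'onoff' npm package and a level shifter
// between the 5 V control box and the 3.3 V Pi pins. Note that this
// userspace approach only keeps up with fairly slow clock rates.
const { Gpio } = require('onoff');

const clock = new Gpio(17, 'in', 'rising'); // sample on each rising clock edge
const data = new Gpio(27, 'in');

let bits = [];
clock.watch((err) => {
  if (err) throw err;
  bits.push(data.readSync());
  if (bits.length === 8) {
    // assemble one byte, MSB first (assumed bit order)
    const byte = bits.reduce((acc, bit) => (acc << 1) | bit, 0);
    console.log('0x' + byte.toString(16).padStart(2, '0'));
    bits = [];
  }
});

process.on('SIGINT', () => {
  clock.unexport();
  data.unexport();
  process.exit();
});
```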

Ikea is a hacker’s favourite, offering a wide selection of fine, Swedish-engineered products that work nicely as enclosures for robot builds. [Isabelle] zeroed in on a deep, circular plastic tray from a storage table set, stiffened with some plywood reinforcement. The tray offers ample space to mount the two motors, two castor wheels, the battery, and the rest of the electronics, and most of the original hoverboard hardware comes in handy while putting it all together.

The software glue that holds all this together is JavaScript. The event-driven architecture of Node.js makes it a good fit for Hoverbot. [Isabelle] has built a basic application allowing remote control of the robot. It includes a dashboard showing live video and audio streams from the robot, buttons for movement control, an input box for text-to-speech conversion, an ultrasonic sensor visualization, LED lighting controls, a message log, and a motor status display. This makes the dashboard a useful debugging tool and a starting point for building more interesting applications. Check out the build log for all the juicy details. Which other products from the Ikea catalog could be used to build the Hoverbot? How about a robotic chair?
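To give a flavor of the approach, here is a minimal sketch of such a remote-control server. The package choices (express and ws) and the command format are our assumptions, not necessarily what [Isabelle]’s build log uses.

```javascript
// Minimal sketch of the remote-control idea: an Express server hosts the
// dashboard page while a WebSocket carries movement commands down and
// motor status back up.
const express = require('express');
const { WebSocketServer } = require('ws');

const app = express();
app.use(express.static('dashboard')); // serves the dashboard HTML/JS

const server = app.listen(8080, () => console.log('Hoverbot UI on :8080'));

const wss = new WebSocketServer({ server });
wss.on('connection', (ws) => {
  ws.on('message', (msg) => {
    const cmd = JSON.parse(msg); // e.g. { left: 0.5, right: -0.5 }
    setMotorSpeeds(cmd.left, cmd.right);
  });
  // push motor status to the dashboard once a second
  const timer = setInterval(() => {
    ws.send(JSON.stringify({ status: readMotorStatus() }));
  }, 1000);
  ws.on('close', () => clearInterval(timer));
});

// Hypothetical helpers standing in for the real motor controller interface.
function setMotorSpeeds(left, right) { /* write to motor controller */ }
function readMotorStatus() { return { left: 0, right: 0 }; }
```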

Continue reading “Turning That Old Hoverboard Into A Learning Platform”

Hackaday Prize Entry: Automated Wildlife Recognition

Trail and wildlife cameras are commonly available nowadays, but the Wild Eye project aims to go beyond simply taking digital snapshots of critters. [Brenda Armour] uses a Raspberry Pi to not only photograph wildlife that wanders into the camera’s field of view, but also to automatically identify and categorize the animals seen, using a visual recognition API from IBM via the Node-RED infrastructure. The result is a system that captures an image when motion is detected, sends the image to the visual recognition API, and attempts to identify any wildlife based on the returned data.
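Boiled down to plain Node.js (the actual project wires this flow up in Node-RED), the capture-and-classify loop might look something like the sketch below. The endpoint URL, query parameter, and multipart field name are placeholders standing in for IBM’s visual recognition service, not its exact API.

```javascript
// Rough sketch of the capture-and-classify flow: grab a frame with the
// Pi camera when motion is detected, then POST it to a visual recognition
// endpoint. Requires Node 18+ (global fetch/FormData/Blob) and raspistill.
const { execFile } = require('child_process');
const fs = require('fs');

const ENDPOINT = 'https://example.com/v3/classify'; // hypothetical URL
const API_KEY = process.env.API_KEY;

function captureAndClassify() {
  execFile('raspistill', ['-o', '/tmp/capture.jpg', '-w', '1280', '-h', '720'], (err) => {
    if (err) throw err;
    const form = new FormData();
    form.append('images_file', new Blob([fs.readFileSync('/tmp/capture.jpg')]), 'capture.jpg');
    fetch(`${ENDPOINT}?api_key=${API_KEY}`, { method: 'POST', body: form })
      .then((res) => res.json())
      .then((result) => console.log('classes:', JSON.stringify(result)));
  });
}

captureAndClassify(); // in Wild Eye, a motion event would trigger this
```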

The visual recognition isn’t flawless, but a recent proof of concept shows promising results: crows, a cat, and a dog have all been successfully identified. Perhaps when the project is ready to move deeper into the woods, elements from these solar-powered networked birdhouses (which also use the Raspberry Pi) could help cut some cords.

Sticking With The Script For Cheap Plane Tickets

When [Zeke Gabrielse] needed to book a flight, the Internet hive-mind recommended that he look into flying Southwest Airlines, thanks to a drop in fares late on Thursday nights. Not one to stay up all night refreshing a web page, he opted to write a script to take care of the tedium for him.

Settling on Node.js for his web scraper, [Gabrielse] found that numerous approaches to pulling the flight pricing directly failed before he finally cobbled together a script that fills out and submits the search form for him. With the numbers coming in, he set up a Twilio account to text him once fares dropped below a certain price point — because, again, why not automate?
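The fare-watching loop boils down to something like this sketch. The getLowestFare() helper is a hypothetical stand-in for the form-filling scraper; the Twilio client calls are the library’s real API.

```javascript
// Poll the fares, text when they drop below a threshold.
const twilio = require('twilio');

const client = twilio(process.env.TWILIO_SID, process.env.TWILIO_TOKEN);
const THRESHOLD = 150; // dollars

// Hypothetical stand-in: fill out and submit the search form, parse the result.
async function getLowestFare(from, to, date) {
  return 149;
}

async function checkFares() {
  const fare = await getLowestFare('HOU', 'MDW', '2016-03-01');
  if (fare < THRESHOLD) {
    await client.messages.create({
      body: `Southwest fare dropped to $${fare}!`,
      from: process.env.TWILIO_FROM,
      to: process.env.MY_PHONE,
    });
  }
}

setInterval(checkFares, 30 * 60 * 1000); // poll every 30 minutes
```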

Continue reading “Sticking With The Script For Cheap Plane Tickets”

SDR and Node.js Remote-Controlled Monster Drift

Most old-school remote-controlled cars broadcast their controls on 27 MHz. Some software-defined radio (SDR) units will go that low. The rest, as we hardware folks like to say, is a simple matter of coding.

So kudos to [watson] for actually doing the coding. His monster drift project starts with the basics — sine and cosine waves of the right frequency — and combines them in just the right durations to spit out to an SDR, in this case a HackRF. Watch the smile on his face as he hits the enter key and the car pulls off an epic office-table 180 (video embedded below).
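Stripped down to a sketch, the idea is to synthesize interleaved 8-bit I/Q samples (cosine on I, sine on Q) in on/off bursts and then play the file out through the HackRF. The pulse timing, tone offset, and pulse count below are made-up placeholders; a real car expects its own specific pattern per command.

```javascript
// Synthesize an on/off-keyed tone as signed 8-bit interleaved I/Q samples,
// the raw format hackrf_transfer expects, and write it to a file.
const fs = require('fs');

const SAMPLE_RATE = 2e6; // 2 Msps
const TONE = 50e3;       // 50 kHz offset from the tuned frequency (assumed)
const PULSE_MS = 0.5;    // assumed pulse and gap length

function burst(samples, on) {
  const buf = Buffer.alloc(samples * 2); // two int8s (I, Q) per sample
  for (let n = 0; n < samples; n++) {
    const phase = 2 * Math.PI * TONE * n / SAMPLE_RATE;
    buf.writeInt8(on ? Math.round(127 * Math.cos(phase)) : 0, 2 * n);
    buf.writeInt8(on ? Math.round(127 * Math.sin(phase)) : 0, 2 * n + 1);
  }
  return buf;
}

const pulseSamples = Math.round(SAMPLE_RATE * PULSE_MS / 1000);
const chunks = [];
for (let i = 0; i < 10; i++) {      // ten pulses, e.g. one "forward" command
  chunks.push(burst(pulseSamples, true));
  chunks.push(burst(pulseSamples, false));
}
fs.writeFileSync('forward.iq', Buffer.concat(chunks));
// then transmit: hackrf_transfer -t forward.iq -f 27000000 -s 2000000 -x 20
```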

Continue reading “SDR and Node.js Remote-Controlled Monster Drift”

Build Your Own P-Brain

I don’t think we’ll call virtual assistants done until we can say, “Make me a sandwich” (without adding “sudo”) and have a sandwich made and delivered to us while sitting in front of our televisions. However, they are not completely without use as they currently stand: they can tell you the time, weather, and traffic, schedule meetings or remind you about them, and even order things from Amazon. [Pat AI] was interested in building an open-source, extensible virtual assistant, so he built P-Brain.

Think of P-Brain as the base for a more complex virtual assistant. It is designed from the beginning to have more skills added on, growing the number of things it can do. P-Brain is written in Node.js; using a Node package called Natural, it parses your request and matches it to a ‘skill.’ At the moment, P-Brain can tell you the time, date, and weather, fetch facts from the internet, find and play music, and flip a virtual coin for you. Currently, P-Brain only runs in Chrome, but [Pat AI] has plans to remove that as a dependency. After the break, [Pat AI] goes into some detail about P-Brain and shows off its capabilities. In an upcoming video, he’s going to go over more details about how to add new skills.

Continue reading “Build Your Own P-Brain”
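For a taste of the skill-matching approach, here is a tiny example using the natural package that P-Brain builds on. The skills and training phrases are our own examples, not P-Brain’s actual skill registry.

```javascript
// Train a Bayes classifier on a few phrases per skill, then route an
// incoming request to whichever skill it matches best.
const natural = require('natural');

const classifier = new natural.BayesClassifier();
classifier.addDocument('what time is it', 'time');
classifier.addDocument('tell me the time', 'time');
classifier.addDocument('what is the weather like', 'weather');
classifier.addDocument('will it rain today', 'weather');
classifier.addDocument('flip a coin', 'coin');
classifier.train();

const skills = {
  time: () => new Date().toLocaleTimeString(),
  weather: () => 'Cloudy with a chance of hacks.', // stub
  coin: () => (Math.random() < 0.5 ? 'Heads!' : 'Tails!'),
};

const request = 'hey, what time is it?';
const skill = classifier.classify(request); // -> 'time'
console.log(skills[skill]());
```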

Objectifier: Director of Domestic Technology

[Bjørn Karmann]’s Objectifier is a device that lets you control domestic objects by allowing them to respond to unique actions or behaviour, using machine learning and computer vision. The Objectifier can turn on a table lamp when you open a book, and turn it off when you close the book. Switch on the coffee maker when you place the mug next to the pot, and switch it off when the mug is removed. Turn on the belt sander when you put on the safety glasses, and stop it when you remove the glasses. Charge the phone when you put a banana in front of it, and stop charging it when you place an apple in front of it. You get the drift — the possibilities are endless. Hopefully, sometime in the (near) future, we will be able to interact with inanimate objects in this fashion. We can get them to learn from our actions rather than us learning how to program them.

The device uses computer vision and a neural network to learn complex behaviours associated with your trigger commands. A training mode, using a phone app, allows you to train it for the On and Off actions. Some actions require more human effort to train — such as distinguishing an open book from a closed one — but eventually, the neural network does a fairly good job.

The current version is the sixth prototype in the series, and [Bjørn] has put in quite a lot of work refining the project at each stage. In its latest avatar, the hardware consists of a Pi Zero, a Raspberry Pi camera module, an SMPS power brick, a relay block to switch the output, a 230 V plug for input power, and a 230 V socket outlet for the final output. All the parts are put together rather neatly using laser-cut acrylic support pieces and then housed in a nice wooden enclosure.

On the software side, all of the machine learning is handled by Wekinator, a free, open-source tool for building musical instruments, gestural game controllers, and computer vision or computer listening systems using machine learning. The computer vision itself is handled via Processing, and all the code is wrapped using openFrameworks, with ml4a providing apps for working with machine learning.
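To get a feel for how the pieces talk to each other: Wekinator takes feature vectors in over OSC and sends its outputs back the same way. Below is a minimal Node.js sketch using the node-osc package that fakes a one-value feature stream; the ports and OSC addresses are Wekinator’s documented defaults, while the on/off mapping is our own assumption. In the Objectifier, the features would come from the camera pipeline instead.

```javascript
// Feed a dummy feature stream into Wekinator over OSC and react to the
// classifications it sends back.
const { Client, Server } = require('node-osc');

const toWekinator = new Client('127.0.0.1', 6448);  // Wekinator input port
const fromWekinator = new Server(12000, '127.0.0.1'); // Wekinator output port

// send a dummy one-float feature vector ten times a second
setInterval(() => {
  toWekinator.send('/wek/inputs', Math.random());
}, 100);

fromWekinator.on('message', (msg) => {
  const [address, output] = msg;
  if (address === '/wek/outputs') {
    console.log(output >= 0.5 ? 'lamp ON' : 'lamp OFF'); // assumed mapping
  }
});
```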

All of the above is what we could deduce from the pictures and information in his blog post. There isn’t much detail about the hardware, but the pictures tell most of the story. The software isn’t available either, but maybe this could spur some of you hackers into action to build your own version of the Objectifier. Check out the video after the break, showing humans teaching the Objectifier its tricks.

Continue reading “Objectifier: Director of Domestic Technology”

Solving IoT Problems with Node.js for Hardware

Tod Kurt knows a thing or two about IoT devices. As the creator of blink(1), he’s shipped over 30,000 units that are now out in the wild, in use for custom signaling on everything from compile status to those emotionally important social media indicators. His talk at the 2016 Hackaday SuperConference covers the last mile that bridges your Internet of Things device with its intended use. This is where IoT actually happens, and of course where it usually goes astray.
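For a taste of what Node.js-for-hardware looks like in practice, here is a minimal status-light sketch against the node-blink1 package; the mapping of colors to build states is our own invention.

```javascript
// Drive a blink(1) as a compile-status indicator.
const Blink1 = require('node-blink1');

const blink1 = new Blink1(); // grabs the first blink(1) on the USB bus

function showStatus(status) {
  if (status === 'pass') {
    blink1.fadeToRGB(500, 0, 255, 0);   // fade to green over 500 ms
  } else if (status === 'fail') {
    blink1.fadeToRGB(500, 255, 0, 0);   // red
  } else {
    blink1.fadeToRGB(500, 255, 180, 0); // amber: build in progress
  }
}

showStatus('pass');
```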

Continue reading “Solving IoT Problems with Node.js for Hardware”