The current wave of excitement around machine learning kicked off when graphics processors were repurposed to make training deep neural networks practical. Nvidia found itself the engine of a new revolution and seized the opportunity to help push the frontiers of research. Its new research lab in Seattle will focus on one such field: making robots smart enough to work alongside humans in an IKEA kitchen.
Today’s robots are mostly industrial machines that require workspaces designed for robots. They run day and night, performing repetitive tasks, usually inside cages to keep squishy humans out of harm’s way. Robots will need to be a lot smarter about their surroundings before we can safely dismantle those cages. A few industrial robots are making a start in this arena, but they have a hard time justifying their price premium; see the financial difficulties of Rethink Robotics, maker of the Baxter and Sawyer robots.
So there’s a lot of room for improvement in this field, and this evolution will need a training environment offering tasks across the whole range of difficulty: anywhere from the rigorously structured environments where robots work well today to the dynamic, unstructured environments where robots are hopelessly lost. Lab lead Dr. Dieter Fox explained why a kitchen is ideal. A meticulously cleaned and organized kitchen is very similar to an industrial setting, and from there we can gradually make it more challenging for a robot. For example, today’s robots can easily pick up a can, with its rigid, regular shape, but what about a half-full bag of flour? And from there, a robot could learn to pick up a piece of fresh fruit without bruising it. These tasks share challenges with many other tasks outside of a kitchen.
This isn’t about building a must-have home cooking robot; it’s about working through the range of challenges shared with common kitchen tasks. The lab has a lot of neat hardware, but its success will be measured by the software, and like all research, published results should be reproducible by other labs. You don’t have a high-end robotics lab in your house, but you do have a kitchen. That’s why it’s not just any kitchen, but an IKEA kitchen: they are standardized, affordable, and available around the world for other robotics researchers to benchmark against.
Most of us can experiment in a kitchen, IKEA or not. We have access to all the other tools we need: affordable AI hardware from Google, from BeagleBone, and from Nvidia. And we certainly have no shortage of robot arms and manipulators on these pages, ranging from a small laser-cut MeArm to our 2018 Hackaday Prize winner Dexter.
It’s taken a while, but thanks to devices like the Amazon Kindle, the cost of e-ink displays is finally at the point where mere mortals like us can actually start using them in our projects. Now we’ve just got to figure out how to utilize them properly. Sure, you can just hook up an e-ink display to a Raspberry Pi to get started, but to truly realize the potential of the technology, you need hardware designed with it in mind.
To that end, [Mahesh Venkitachalam] has created Papyr, an open hardware wireless display built with the energy efficiency of e-ink in mind. This means not only offering support for low-energy communication protocols like BLE and Zigbee, but keeping the firmware as concise as possible. According to the documentation, the end result is that Papyr only draws 22 uA in its idle state.
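For a sense of what a 22 uA idle draw buys you, here’s some napkin math. The coin cell capacity is our assumption for illustration, not a spec from the Papyr project, and it ignores the much larger current spikes during radio and display activity:

```python
# Rough battery-life estimate for a device idling at 22 uA,
# assuming a typical 225 mAh CR2032 coin cell (our assumption,
# not a Papyr spec). Radio/display activity will eat into this.
IDLE_CURRENT_MA = 0.022   # 22 uA expressed in mA
BATTERY_MAH = 225         # nominal CR2032 capacity

hours = BATTERY_MAH / IDLE_CURRENT_MA
days = hours / 24
print(round(days))        # over a year of standby on idle current alone
```

In practice the duty cycle of BLE advertising and screen refreshes dominates, but it shows why squeezing the idle figure down matters so much.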
So what do you do with this energy-sipping Bluetooth e-ink gadget? Well, that part is up to you. The obvious application is signage, but unless you’re operating a particularly well organized hackerspace, you probably don’t need wireless dynamic labels on your part bins (though please let us know if you actually do). More likely, you’d use Papyr as a general purpose display, showing sensor data or the status of your 3D printer.
The 1.54 inch 200×200 resolution e-ink panel is capable of showing red in addition to the standard grayscale, and the whole thing is powered by a Nordic nRF52840 SoC. Everything’s provided for you to build your own, but if you’d rather jump right in and get experimenting, you can buy the assembled version for $39 USD on Tindie.
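If you want to push pixels to a panel like this yourself, the first job is packing a framebuffer. Tri-color e-ink controllers commonly take two 1-bit planes, one for black/white and one for red; the plane layout and bit order below are assumptions for illustration, so check the panel datasheet and the Papyr firmware for the real format:

```python
# Sketch: packing a 200x200 framebuffer for a black/white/red e-ink
# panel into two 1-bit planes, 8 pixels per byte, MSB first.
# Plane layout and bit order are assumptions; consult the datasheet.

WIDTH, HEIGHT = 200, 200

def pack_plane(pixels):
    """Pack a flat list of 0/1 pixel values into bytes, MSB first."""
    assert len(pixels) == WIDTH * HEIGHT
    out = bytearray()
    for i in range(0, len(pixels), 8):
        byte = 0
        for bit in pixels[i:i + 8]:
            byte = (byte << 1) | (bit & 1)
        out.append(byte)
    return bytes(out)

# A blank (all-white) frame: 200 * 200 / 8 = 5000 bytes per plane.
bw_plane = pack_plane([0] * (WIDTH * HEIGHT))
red_plane = pack_plane([0] * (WIDTH * HEIGHT))
print(len(bw_plane), len(red_plane))  # 5000 5000
```

Ten kilobytes for a full red-and-black frame also hints at why the nRF52840’s generous RAM is a comfortable fit here.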
The future of the musical instrument industry is in tiny, cheap, handheld synthesizers. They’re sold as ‘musical toys’. They bleep and bloop, and that’s about it. Korg may have just released the minimum viable product for this category, and thus the most popular product for this category. On the surface, the Korg Nu:Tekt doesn’t look like much, just a box with three knobs, a speaker, a (crappy) keyboard, and a few buttons. I/O includes MIDI in, Sync in and out, audio in, and headphones out. What’s inside is what counts. There’s a high-powered ARM core (STM32F446, a Cortex-M4 running at 180 MHz) and a ton of RAM. What’s the play here? It’s compatible with the Korg Prologue/Minilogue SDK, so you can put the same sounds from the flagship synthesizer on a tiny box that fits in your pocket. Things are starting to get weird, man. This is a toy, with the same sounds as the ‘pro’ level synth. Let it be known that the synth market is the most interesting segment of consumer electronics right now.
Bird, that ride share scooter startup, is now selling their scooters. It costs thirteen hundred dollars. Alternatively, you can pick some up for cheap at your city’s impound lot. Or for the low, low price of free.
Razer, the company that makes garish computer peripherals aimed at ‘gamers’ and other people who are sucked deep into the existential turmoil of disempowerment, depression, and playing video games all day, are building a toaster. Gamers aren’t known for eating food that isn’t prepared by their mom, but the Razer consumer community has been clamoring for a professional gaming toaster since it was first teased on April Fool’s Day three years ago. You only eat so many cold Pop Tarts straight out of the box, I guess.
Everyone loves cupcake cars, and this year we’re in for a treat! We’re ringing the bell this weekend with the 6th annual Hackaday x Tindie meetup for the Bay Area Maker Faire. We’ve got a few things going on here. Next Thursday we’ll be greeted with talks by The Only Makers That You Want To Meet. That’s HDDG, the monthly San Francisco meetup happening at the Supplyframe office, and it’s going to be packed to the gills this month. Don’t miss it. Next Saturday, we’re renting a bar close to the Faire. The 6th Annual Hackaday x Tindie MFBA Meetup w/ Kickstarter is usually at an Irish pub in San Mateo, but we’re getting a bigger venue this year. You’ll actually be able to move around in this one.
Purdue’s Bio-Robotics lab has been working on a robotic hummingbird and, as you can see in the videos below, has had a lot of success. What’s more, they’ve shared that success on GitHub. If you want to make a flapping-wing robot, this is definitely where you start.
If you’ve ever watched a hummingbird, you know their flight capability is nothing short of spectacular. The Purdue robot flies in a similar fashion (although on a tether to get both power and control information) and relies on each wing having its own motor. The motors not only propel the wings but also act as sensors. For example, they can detect if a wing is damaged, has made contact with something, or has changed performance due to atmospheric conditions.
In addition to the tethered control system, the hummingbird requires a motion capture sensor external to itself and some machine learning. Researchers note that there is sufficient payload capacity to put batteries onboard, but totally free flight would also require additional sensors. It is amazing when you realize that a real hummingbird manages all this with a little bitty brain.
The published code is in Python and is part of three presentations later this month at a technical conference (the IEEE International Conference on Robotics and Automation). If you don’t want to wait on the papers, there’s a post on IEEE Spectrum about the robotic beast available now, and that article contains preprint versions of the papers. The Python code requires a fair amount of processing power to run, so expect a significant flight computer.
The last hummingbird bot we saw was a spy. We’ve also seen robots that were like bees — sort of.
Continue reading “Robot Hummingbird Imitates Nature”
Eyes are windows into the soul, the old saying goes. They are also pathways into the mind, as much of our brain is involved in processing visual input. This dedication to vision is partly why much of AI research is likewise focused on machine vision. But do artificial neural networks (ANN) actually work like the gray matter that inspired them? A recently published research paper (DOI: 10.1126/science.aav9436) builds a convincing argument for “yes”.
Neural nets get their name from the biological neurons that inspired their organization. But as we learned more and more about how biological neurons work, we also discovered that artificial neurons aren’t very faithful digital copies of the original. This cast doubt on whether machine vision neural nets actually function like their natural inspiration, or work in an entirely different way.
This experiment took a trained machine vision network and analyzed its internals. Armed with this knowledge, images were created and tailored for the purpose of triggering high activity in specific neurons. These responses were far stronger than what occurs when processing normal visual input. These tailored images were then shown to three macaque monkeys fitted with electrodes monitoring their neuron activity, which picked up similarly strong neural responses atypical of normal vision.
Manipulating neural activity beyond its normal operating range via tailored imagery is the Hollywood portrayal of mind control, but we’re not at risk of input injection attacks on our brains. This data point gives machine learning researchers confidence their work still has relevance to its biological source material, and neuroscientists are excited about the possibility of exploring brain functions without invasive surgical implants. Artificial neural networks could end up helping us better understand what happens inside our brains, bringing the process full circle.
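If you’re curious what “tailoring an image to trigger a neuron” looks like in practice, here’s a toy sketch of the gradient-ascent idea. The researchers worked with a deep vision network; this single linear-plus-ReLU neuron is a stand-in of our own devising, just to show the mechanic:

```python
# Toy sketch of activation maximization: gradient ascent on the input
# to drive one artificial neuron's activation as high as possible.
# This is an illustration, not the paper's actual pipeline.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=64)           # the neuron's weights (think 8x8 image, flattened)
x = rng.normal(size=64) * 0.01    # start from a faint random "image"

def activation(x):
    return max(0.0, float(w @ x))

start = activation(x)
for _ in range(100):
    # Gradient of max(0, w.x) w.r.t. x is w while the neuron is active;
    # when inactive we give a small push anyway, a common trick to
    # escape the ReLU dead zone.
    grad = w if w @ x > 0 else 0.1 * w
    x = x + 0.1 * grad
    x = np.clip(x, -1.0, 1.0)     # keep "pixel" values bounded

print(start, activation(x))       # the tailored input fires far harder
```

The real tailored images do the same climb through millions of parameters, which is why they provoke responses far stronger than any natural photo.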
[via Science News]
For anyone who’s seen a 1970s-era microcomputer like the Altair 8800 doing its thing, you’ll know the centerpiece of these behemoths is the array of LEDs and toggle switches used as input and output. Sure, computers today are exponentially more capable, but there’s something undeniably satisfying about developing software with pen, paper, and the patience to key it all in.
If you’d like to get a taste of old school visceral programming, but aren’t quite ready to invest in a 40 year old computer, then [GClown25] might have the answer for you. He’s developed a pocket-sized “computer” he’s calling the BIT4 that can be programmed with just three tactile switches. In reality it’s an ATmega4809 running C code, but it does give you an idea of how the machines of yesteryear were programmed.
In the video after the break, [GClown25] demonstrates the BIT4 by entering a simple binary counter program. With a hand-written copy of the program as a reference, he steps through the memory addresses, entering the command and then the value he wishes to operate on. After a few seconds of frantic button pushing, he puts the BIT4 into run mode and you can see the output on the array of LEDs along the top edge of the PCB.
All of the hardware and software is open source for anyone who’s interested in building their own copy, or perhaps just wants to take a peek at how [GClown25] re-imagined the classic microcomputer experience with modern technology. Conceptually, this project reminds us of the Digirule2, but we’ve got to admit the fact this version isn’t a foot long is pretty compelling.
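To get a feel for what keying in a program like that involves, here’s a sketch of a hypothetical four-bit machine in the same spirit. The instruction set below is entirely made up for illustration; the BIT4’s real one lives in [GClown25]’s repository:

```python
# Hedged sketch of a BIT4-style machine: 4-bit opcodes and operands
# toggled into successive memory addresses, then executed. The opcodes
# here are hypothetical, not [GClown25]'s actual instruction set.

MEM_SIZE = 16

def run(program, steps=64):
    mem = (list(program) + [0] * MEM_SIZE)[:MEM_SIZE]
    acc, pc, outputs = 0, 0, []
    for _ in range(steps):
        op, arg = mem[pc], mem[(pc + 1) % MEM_SIZE]
        pc = (pc + 2) % MEM_SIZE
        if op == 0x1:   acc = arg & 0xF            # LDI: load immediate
        elif op == 0x2: acc = (acc + arg) & 0xF    # ADD: 4-bit add, wraps
        elif op == 0x3: outputs.append(acc)        # OUT: latch acc to the LEDs
        elif op == 0x4: pc = arg                   # JMP: jump to address
        else: break                                # anything else halts
    return outputs

# A binary counter, like the demo: load 0, then output/increment/loop.
counter = [0x1, 0x0,   # addr 0: LDI 0
           0x3, 0x0,   # addr 2: OUT
           0x2, 0x1,   # addr 4: ADD 1
           0x4, 0x2]   # addr 6: JMP 2 (back to OUT)
print(run(counter, steps=32))
```

Keying those sixteen nibbles in by hand, two button-presses of meaning per address, is exactly the kind of patient front-panel ritual the Altair crowd remembers.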
Continue reading “Bare Metal Programming with Only Three Buttons”
CNC builds come in all shapes and sizes. There are delta manipulators, experimental polar rigs, and all manner of cartesian builds, large and small. After completing their first CNC build, [jtaggard] took what they learned and applied it to the development of a new machine.
It’s a desk-sized cartesian design, with a frame built from V-slot extrusion cut to size by circular saw. This is a great way to get quality extrusion for a custom build, and is readily available and easy to work with. The gantry rides on wheels, with the X and Y axes being belt driven, plus a screw drive for Z. A couple of NEMA 17s and a NEMA 23 provide motive power, and an Arduino Uno with stepper drivers is the brains of the operation. 1/4″ thick PLA plates are used to assemble everything, and while [jtaggard] intended to replace these with aluminium down the track, so far the plastic has proved plenty rigid enough for early tests of both machining and engraving wood.
It’s a great entry-level CNC build, which has proved usable with both a 500W spindle and a 2.5W laser for engraving. Being modular in nature, it would be easy to add other tools, such as a pen plotter or vinyl cutting blade for further versatility.
DIY CNC builds are always popular, as you end up with a useful tool as a reward for your hard work. Video after the break.
Continue reading “Modular CNC Build Gets You Both a Mill and a Laser Cutter”