We’ve been watching the development of the ESP32 chip for the last year, but honestly we’ve been a little bit cautious to throw all of our friendly ESP8266s away just yet. Earlier this month, Espressif released version 2.0 of their IoT Development Framework (ESP-IDF), and if you haven’t been following along, you’ve missed a lot.
We last took a serious look at the IDF when the chips were brand-new, and the framework was still taking its first baby steps. There was no support for niceties like I2C at the time, but you could get both cores up and running and the thing connected to the network. We wanted to test out the power-save modes, but that wasn’t implemented yet either. In short, we were watching the construction of a firmware skyscraper from day one, and only the foundation had been poured.
But what a difference eight months make! Look through the GitHub changelog for the release, and it’s a totally new ballgame. Not only are there drivers for I2C, I2S, SPI, the DAC and ADCs, etc, but there are working examples and documentation for all of the above. Naturally, there are a ton of bugfixes as well, especially in the complex WiFi and Bluetooth Low Energy stacks. There’s still work left to do, naturally, but Espressif seems to think that the framework is now mature enough that they’ve opened up their security bug bounty program on the chip. Time to get hacking!
Continue reading “ESP32’s Dev Framework Reaches 2.0”
‘Member StarCraft? Ooooh, I ‘member StarCraft. The original game and the Brood War expansion are now free. A new patch fixes most of the problems of getting a 20-year-old game working and vastly improves playing over LAN (‘member when you could play video games over a LAN?) And you thought you were going to have free time this week.
About a year ago, [Mark Chepurny] built a dust boot for his Shapeoko CNC router. The SuckIt (not the best possible name, by the way) is an easy, simple way to add dust collection to an X-Carve or Shapeoko 2. The folks at Inventables reached out to [Mark] and made a few improvements; the result is the renamed X-Carve Dust Control System, a proper vacuum attachment for the X-Carve with grounding and a neat brush shoe.
I don’t know if this is a joke or not. It’s certainly possible, but I seriously doubt anyone would have the patience to turn PowerPoint into a Turing Machine. That’s what [Tom Wildenhain] did for a lightning talk at SIGBOVIK 2017 at CMU. There’s a paper (PDF), and the actual PowerPoint / Turing Machine file is available.
System76 builds computers. Their focus is on computers that run Linux well, and they’ve garnered a following in the Open Source world. System76 is moving manufacturing in-house. Previously, they outsourced their design and hardware work to outside companies. They’re going to work on desktops first (laptops are much harder and will come later), but with any luck, we’ll see a good, serviceable, Open laptop in a few years’ time.
Remember last week when a company tried to trademark the word ‘makerspace’? That company quickly came to their senses after some feedback from the community. That’s not all, because they also had a trademark application for the word ‘FabLab’. No worries, because this was also sorted out in short order.
Snap, Inc., the company behind Snapchat, is branding itself as a hardware company. What hardware does Snap make? Spectacles, or a camera attached to a pair of sunglasses. Snap, Inc. has a market value of around $30 Billion USD.
For his Hackaday Prize entry, [William Glover] is building a device that’s easily worth $100 Billion. It’s called SnappCat, and it’s a machine learning, AI, augmented reality, buzzword-laden camera that adds memes to pictures of cats. Better get in on the Series A now because this is 🔥🔥🔥.
Here’s the use case for SnappCat. Place a small device containing a camera and some sort of WiFi chip. During the day, this device will take pictures. If the device recognizes your cat in a picture, it adds a meme (we assume this means text, probably using the Impact typeface), and sends it to your mobile device. Just imagine sitting in a meeting at work. Your phone buzzes, you look at the message, and you laugh uproariously. Yes, you can has fud Mr. Pibbles, you can has so much fud.
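The use case above boils down to a simple detect-caption-notify loop. Here’s a minimal Python sketch of that pipeline; the recognizer and caption renderer are stand-in stubs, since the project’s actual model and messaging mechanism aren’t published.

```python
# Hypothetical sketch of the SnappCat pipeline: detect the cat, slap on
# a meme caption, push it to a phone. Both helper functions below are
# placeholders, not the project's real implementation.

def looks_like_my_cat(image_bytes):
    """Stand-in for the on-device cat recognizer."""
    return b"cat" in image_bytes  # placeholder heuristic, not real ML

def add_meme_caption(image_bytes, caption):
    """Stand-in for rendering Impact-typeface text onto the photo."""
    return image_bytes + caption.encode()

def process_snapshot(image_bytes, notify):
    """Run one frame through the detect -> caption -> push pipeline."""
    if not looks_like_my_cat(image_bytes):
        return False
    notify(add_meme_caption(image_bytes, "I CAN HAS FUD?"))
    return True
```

The hard part, of course, is the first stub: doing real image recognition on a small, cheap device.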
This is the height of technology. That’s not to say landing on the moon or building a civilization on Mars isn’t a superior technological achievement. SnappCat is simply the best technology humanity will ever produce because it’s all downhill from here.
That said, this is a pretty interesting problem. A small, cheap device that does image recognition is hard, and adding memes is just the cherry on top. We can’t wait to see where this project goes, and it’s a great entry to the Best Product portion of the Hackaday Prize.
A few months ago, [wermy] built the mintyPi, a Raspberry Pi-based gaming console that fits inside an Altoids tin. It’s amazing — there’s a composite LCD, an audio DAC, and a chopped up Nintendo controller all connected to a Raspberry Pi for vintage gaming goodness on the road. Now, there’s a new mintyPi. The mintyPi 2.0 vastly improves over the earlier generation of this groundbreaking mint-based gaming console with a better screen, better buttons, customized 3D printed bezels, and better audio. Truly, we live in a Golden Age.
Version two of mintyPi uses 3D printed parts and includes a real hinge to keep the display propped up when the Altoids tin is open. Instead of a DAC-based audio solution, [wermy] is using a USB sound card for clearer, crisper sound. This version also uses the new, wireless version of the Raspberry Pi Zero. The Raspberry Pi Zero W allows this Altoids tin to connect to the Internet or, alternatively, gives the user the ability to dump ROMs on this thing without having to connect it to a computer.
For the software, this retro Altoids video game machine is running RetroPie, a very popular way to get retro video games running on low-power Linux machines. Everything is in there, from the NES to Amstrad to the Sega Master System.
Right now, there aren’t a whole lot of details on how [wermy] created the mintyPi 2.0, but he promises a guide soon. Until then, we’ll just have to drool over the video embedded below.
Continue reading “This Hacker Fit An Entire RetroPie In An Altoids Tin”
[Jason] has a Sonos home sound system, with a bunch of speakers connected via WiFi. [Jason] also has a universal remote designed and manufactured in a universe where WiFi doesn’t exist. The Sonos cannot be controlled via infrared. There’s an obvious problem here, but luckily tiny Linux computers with WiFi cost $10, and IR receivers cost $2. The result is an IR to WiFi bridge to control all those ‘smart’ home audio solutions.
The only thing [Jason] needed to control his Sonos from a universal remote is an IR receiver and a Raspberry Pi Zero W. The circuit is simple – just connect the power and ground of the IR receiver to the Pi, and plug the third pin of the receiver into a GPIO pin. The new, fancy official Raspberry Pi Zero enclosure is perfect for this build, allowing a little IR-transparent piece of epoxy poking out of a hole designed for the Pi camera.
For the software, [Jason] turned to Node JS and LIRC, a piece of software that decodes IR signals. With the GPIO pin defined, [Jason] set up the driver and used the Sonos HTTP API to send commands to his audio unit. There’s a lot of futzing about with text files for this build, but the results speak for themselves: [Jason] can now use a universal remote with everything in his home stereo.
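The bridge itself is conceptually tiny: read decoded button names from the lircd socket, translate them into HTTP requests. [Jason]’s build does this in Node JS; here’s a rough Python sketch of the same idea. The socket path, zone name, and endpoint paths are assumptions modeled on common node-sonos-http-api conventions, not [Jason]’s actual configuration.

```python
# Sketch of an IR-to-Sonos bridge: lircd decodes the IR pulses into
# button names, and we forward them as HTTP requests. Paths and the
# "Living Room" zone name are hypothetical placeholders.
import socket
import urllib.request

SONOS_API = "http://localhost:5005"   # assumed Sonos HTTP API address
ROOM = "Living%20Room"                # hypothetical zone name

# Map LIRC button names (as defined in lircd.conf) to API actions.
BUTTON_ACTIONS = {
    "KEY_PLAY": "play",
    "KEY_PAUSE": "pause",
    "KEY_VOLUMEUP": "volume/+2",
    "KEY_VOLUMEDOWN": "volume/-2",
}

def button_to_url(button):
    """Translate a decoded IR button press into a Sonos HTTP API URL."""
    action = BUTTON_ACTIONS.get(button)
    return f"{SONOS_API}/{ROOM}/{action}" if action else None

def run_bridge(lirc_socket="/var/run/lirc/lircd"):
    """Listen for lircd events and forward them as HTTP requests."""
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as sock:
        sock.connect(lirc_socket)
        for line in sock.makefile():
            # lircd lines look like: "<code> <repeat> <button> <remote>"
            fields = line.split()
            if len(fields) >= 3:
                url = button_to_url(fields[2])
                if url:
                    urllib.request.urlopen(url)
```

Most of the real futzing lives in the lircd.conf text files that define those `KEY_*` names for a given remote.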
While sick with the flu a few months ago, [CroMagnon] had a vision. A face with eyes that would follow you – no matter where you walked in the room. He brought this vision to life in the form of Gawkerbot. This is no static piece of art. Gawkerbot’s eyes slowly follow you as you walk through its field of vision. Once the robot has fixed its gaze upon you, the eyes glow blue. It makes one wonder if this is an art piece, or if the rest of the robot is about to pop through the wall and attack.
Gawkerbot’s sensing system is rather simple. A PIR sensor detects motion in the room. If any motion is detected, two ultrasonic sensors which make up the robot’s pupils start taking data. Code running on an ATmega328 determines if a person is detected on the left or right, and moves the eyes appropriately.
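The left-or-right decision can be as simple as comparing the two range readings. The real firmware is C on the ATmega328, but the logic looks something like this Python sketch; the threshold margin is a made-up example value.

```python
# Sketch of the gaze-direction logic: each ultrasonic "pupil" reports
# the distance to the nearest echo, and whichever side sees a markedly
# closer object wins. The margin below is a hypothetical value, not
# taken from [CroMagnon]'s firmware.

CLOSER_BY_CM = 10  # margin before the eyes commit to one side

def gaze_direction(left_cm, right_cm):
    """Decide where the eyes should point from two ultrasonic ranges."""
    if left_cm + CLOSER_BY_CM < right_cm:
        return "left"
    if right_cm + CLOSER_BY_CM < left_cm:
        return "right"
    return "center"
```

The dead band keeps the eyes from twitching when both sensors see roughly the same distance.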
[CroMagnon] used an old CD-ROM drive optics sled to move Gawkerbot’s eyes. While the motor is small, the worm drive has plenty of power to move the 3D-printed eyes and linkages. Gawkerbot’s main face is a 3D-printed version of a firefighter’s smoke helmet.
The ultrasonic sensors work, but it took quite a bit of software to tame the jitter in their noisy data stream. [CroMagnon] is thinking of using PIR sensors on Gawkerbot 2.0. Ultrasonic transducers aren’t just for sensing. Given enough power, you can solder with them. Ultrasonics even work for wireless communications.
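One common way to tame jittery ultrasonic readings — not necessarily what [CroMagnon] did — is a sliding median filter, which lets single-sample spikes vanish instead of yanking the eyes around.

```python
# A sliding median filter over the last few range readings. Single
# outlier samples (a classic ultrasonic failure mode) never reach the
# output, unlike with a simple moving average.
from collections import deque
from statistics import median

class MedianFilter:
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)

    def update(self, reading_cm):
        """Add a raw reading and return the median of the recent window."""
        self.samples.append(reading_cm)
        return median(self.samples)
```

A moving average would smear a 999 cm glitch across several frames; the median simply discards it.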
Check out the video after the break to see Gawkerbot in action.
Continue reading “Gawkerbot is Watching You”
This one is both wild enough to be confused as a conspiracy theory and common sense enough to be the big solution staring us in the face which nobody realized. Until now. Oak Ridge National Laboratory and General Electric (GE), working on a grant from the US Department of Energy (DOE), have been playing around with new clothes dryer technology since 2014 and have come up with something new and exciting: clothes dryers that use ultrasonic transducers to remove moisture from garments instead of using heat.
If you’ve ever seen a cool mist humidifier you’ll know how this works. A piezo element generates ultrasonic waves that atomize water and humidify the air. This is exactly the same except the water is stored in clothing, rather than a reservoir. Once it’s atomized it can be removed with traditional air movement.
This is a totally obvious application of the simple and inexpensive technology — when the garment is laying flat on a bed of transducers. This can be implemented in a press drying system where a garment is laid flat on a bed of transducers and another bed hinges down from above. Poof, your shirt is dry in a few seconds.
But individual households don’t have these kinds of dryers. They have what are called drum dryers that spin the clothes. Reading closely, this piece of the puzzle is still to come:
They play [sic] to scale-up the technoloogy [sic] to press drying and eventually a clothes dryer drum in the next five months.
We look at this as having a similar technological hurdle as wireless electricity. There must be an inverse-square law on the effect of the ultrasonic waves to atomize water as the water moves further away from the transducers. If that’s the case, transducers on the circumference of a drum would be inefficient at drying the clothing toward the center. This slide deck hints that that problem is being addressed. It talks about only running the transducers when the fabric is physically coupled with the elements. It’s an interesting application and we hope that it could work in conjunction with traditional drying methods to boost energy savings, even if this doesn’t pan out as a total replacement.
With a vast population, cost adds up fast. There are roughly 125 million households in the United States and the overwhelming majority of them use clothes dryers (while many other parts of the world have a higher percentage who hang-dry their clothing). The DOE estimates $9 billion a year is spent on drying clothes in the US. Reducing that number by even 1/10th of 1% would pay off the $880,000 research budget that went into this more than tenfold. Of course, you have to outfit those households with new equipment which will take at least 8-12 years through natural attrition, even if ultrasonics hit the market as soon as possible.
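The back-of-the-envelope math checks out, using only the figures quoted above:

```python
# Sanity check of the savings claim: 1/10th of 1% of the DOE's $9B
# annual US drying cost, versus the $880k research budget.
annual_drying_cost = 9_000_000_000   # DOE estimate, USD per year
research_budget = 880_000            # USD

savings = annual_drying_cost * 0.001      # 1/10th of 1%
payoff_ratio = savings / research_budget

assert savings == 9_000_000     # $9 million in savings per year
assert payoff_ratio > 10        # more than tenfold the budget
```

That’s roughly a 10x annual return on the research money from even a sliver of improvement.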
Continue reading “A Cool Mist that Dries Your Clothes”