[jwcrawley] is busy planning for the Makevention coming up in Bloomington, Indiana in late August. One problem when working any con is manning the door; it’s a good idea to know how many people are there, and you can’t double count people. Previously, the volunteers used dead trees to estimate how many people have turned up. This year they might go with a more technological solution: face recognition and tracking.
The project is called uWho, and it uses the FaceRecognizer class in OpenCV. The whole point of the project is to identify whether someone has already appeared in previous frames. If your face is unknown to the program, your likeness – rather, a few points of data – is added to the database of faces. It’s simple, and according to [jwcrawley], it works.
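uWho itself leans on OpenCV’s FaceRecognizer, but the enroll-or-match logic at the heart of a head count like this can be sketched in plain Python. Everything below — the `FaceDB` class, the feature vectors, the distance threshold — is a hypothetical stand-in for illustration, not the project’s actual code:

```python
import math

class FaceDB:
    """Toy stand-in for an enroll-or-match face counter: each face is
    reduced to a small feature vector; a new vector either matches a
    known face or is enrolled as a brand-new one."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold   # max distance that counts as a match
        self.known = []              # list of (face_id, feature_vector)

    def identify(self, features):
        """Return the id of the closest known face, enrolling a new
        entry when nothing falls within the threshold."""
        best_id, best_dist = None, float("inf")
        for face_id, ref in self.known:
            dist = math.dist(features, ref)
            if dist < best_dist:
                best_id, best_dist = face_id, dist
        if best_id is not None and best_dist <= self.threshold:
            return best_id           # seen this person before
        new_id = len(self.known)     # unknown face: add it to the database
        self.known.append((new_id, features))
        return new_id

    def unique_visitors(self):
        return len(self.known)
```

Counting attendees is then just a matter of running every detected face through `identify()` and reading off `unique_visitors()` at the end of the day.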
While this is technically the best way to count how many unique people show up to Makevention, there will be some discussions to see if this solution is appropriate. The program only saves unique data from a face locally, and does nothing online. It’s less evil than whatever Facebook does, but there are obvious privacy implications here.
On Thursday night Hackaday hosted an event in San Francisco to celebrate the launch of the 2014 Hackaday Omnibus. Our first print edition, which collects some of the finest original content we published last year, should begin shipping as early as today. To mark the occasion, we were graced by a full house of amazing guests. Is it lame to say some of the people I respect most in the world were there?
Whenever you get a lot of people together, a good rule of thumb is to seize the opportunity to have them speak about what they’re doing. It’s not a big “ask” either; eight minutes on what you’re passionate about is pretty simple.
[Jonathan Foote] gave a talk on generating RGBY colors from Hue. The project is ongoing but explores the concept of mixing colors of light with one additional source added to traditional red, green, and blue. [Priya Kuber] recently moved to San Francisco after concluding more than a year of standing up the Arduino office in India (relevant but unrelated video). Her talk covered the emerging maker/hacker hardware scene in India, which is showing amazing growth. [Chris McCoy] demonstrated his Raver Rings, which launched on Kickstarter the same day. [Elecia White] of embedded.fm spoke about the educational opportunities that podcasts and other delivery media provide, and the responsibility we all have to guide our continued learning. [Emile Petrone] talked about an upcoming feature for his site Tindie which will add manufacturer information and ratings to the mix. And rounding things out, [Dave Grossman] gave a talk on his Virtual Carl project, which used video footage of his grandfather, combined with a Raspberry Pi and peripherals, to create a remembrance of the man in virtual form.
He also showed off a lens that can be focused electronically. Nothing mechanical here; there are zero moving parts. Instead, a droplet of oil floating in water acts as the lens. A 75 V AC power supply pulls on the droplet, altering the meniscus to change the focus. He didn’t fabricate the device from scratch, but the concept is completely new to us and quite interesting.
Othermill is located in the SF area. They produce a desktop milling machine which is spectacular at routing PCBs. The little wonder isn’t limited to that, though. Above you can see [Brian] holding up a wooden plaque with milled mother-of-pearl inlays. The table is also strewn with other examples in wax, metal, wood, and more.
The rest of the evening was devoted to conversation on all topics. Get enough hardware geeks in one room and they’ll solve the world’s problems, right? That’s a conversation for another post.
Couldn’t make it to this one but still in the San Francisco area at least occasionally? We held this at the Supplyframe office. They host a ton of great events like the Hardware Developers Didactic Galactic.
This past weekend, I had the chance to visit this year’s Tangible, Embedded, and Embodied Interaction Conference (TEI) and catch up with a number of designers in the human-computer interaction space. The conference brings together a unique collection of artists, computer scientists, industrial designers, and grad students to discuss computer interactivity in today’s world. Over the span of five days (two for workshops and three for paper presentations), not only did I witness a number of today’s current models for computer interactivity (haptics, physical computing with sensors), I also witnessed a number of excellent projects: some developed just to prove a concept, others to present a well-refined system or workflow. It’s hard to believe, but our computer mouse has sat beneath our fingertips since 1963; this conference is the first place I would start looking to find new ways of “mousing” with tomorrow’s technology.
Over the next few days, I’ll be shedding more light on a few projects from TEI. (Some have already seen the light of day.) For this first post, though, I decided to highlight two projects tied directly to the conference culture itself.
Before each lunch break, the audience was invited to take part in an audience-driven interactive game of “Collective” Pong. With some image processing running in the background, players held up pink cards to increase the height of their respective paddle, albeit by a minuscule amount each. Which paddle an audience member contributed to was determined by the location of their marker on the screen (left or right). It turns out that this trick is a respectful nod back to its original performance by [Loren Carpenter] at SIGGRAPH in 1991. With each audience member performing their own visual servoing to bring the paddle to the right height, we were able to give the ball a good whack for 15 minutes while lunch was being prepared.
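The card-counting side of a setup like this can be sketched without any computer-vision library: classify each pixel as “pink” or not, tally the pink pixels per screen half, and nudge each half’s paddle proportionally. The RGB threshold and the gain below are made-up values for illustration, not the conference’s actual pipeline:

```python
def paddle_speeds(frame, gain=0.001):
    """frame: 2D list of (r, g, b) tuples, one per pixel.
    Returns (left_speed, right_speed): how far each half's paddle
    moves this frame, proportional to the pink cards held up."""
    def is_pink(r, g, b):
        # crude color threshold: strong red, weak green, moderate blue
        return r > 200 and g < 120 and b > 120

    width = len(frame[0])
    left = right = 0
    for row in frame:
        for x, (r, g, b) in enumerate(row):
            if is_pink(r, g, b):
                if x < width // 2:
                    left += 1
                else:
                    right += 1
    # each pink pixel contributes a tiny nudge, like each card did
    return left * gain, right * gain
```

Run per camera frame, this gives exactly the “many small contributions” feel of the original demo: one card barely moves the paddle, a whole audience moves it fast.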
Next up, the conference’s interactivity spread far beyond the main conference room. During our lunch breaks we had the pleasure of discarding our scraps in a remotely operated trash bin. Happily accepting our refuse, this bin did a quick jiggle when users placed items inside. Upon closer inspection, a Roomba and a Logitech camera gave its master a way of navigating the environment from inside some remote secret lair.
Overall, the conference was an excellent opportunity to explore the design space of tinkerers constantly re-imagining how we interact with today’s computers and data. Stay tuned for more upcoming projects on their way. If you’re curious about the papers presented or the layout of the conference, have a look at this year’s website.
LVBots, a club for robot building enthusiasts in Las Vegas, held an open house the week of CES. This was the only trip [Sophi] and I took away from the conference halls of The Strip and it was a blast! The group holds meetings twice a month in a space provided by Pololu — a well-known robotics and electronics manufacturer headquartered just south of McCarran International Airport.
[Claire] demonstrating the robotic closet and its app
The mechanized racks of the automated closet
Line-following robot demonstration
Sumo robots about to rumble
Before the formal part of the gathering started there were several builds being shown off. [Claire] and [Brian] recently participated in an AT&T-sponsored hackathon. Their creation is a robotic closet. The system involves moving racks of clothing which are tracked by a smartphone app. Interesting features discussed for the software include monitoring when each garment was last worn, when it was last washed, and whether it is appropriate for current weather conditions. Dig into the code in their repo.
In other parts of the room a pair of line-following robots did their thing, and a couple of sumo-bots competed to push each other out of the ring. A large group was gathered around the projector watching videos of robots of all types, brainstorming about the difficult parts, how they were overcome, and how these methods may be applied to their own builds. I can attest that hanging with a group of people who are trying to cue up the most amazing robot demonstrations makes for spectacular viewing!
As the organized part of the meeting began I was delighted to hear about a standing challenge from the LVBots group. The Tabletop Challenge has multiple phases that serve to encourage builders to start modestly and then iterate to achieve new goals:
Phase 0: bring a robot to LVBots
Phase 1: travel back and forth without falling off
Phase 2: find an object and push it off
Phase 3: push object into a goal
[Nathan Bryant] and robot
[Joe Carson] and robot
[Nathan Bryant] was one of the two robot builders trying out the challenge on this night. He built this hexapod from balsa wood and three servo motors and was testing Phase 1. The bot includes a sensor dangling out in front of the robot to detect when the table surface is no longer below. At that point it backs up a few steps, turns in place, and proceeds in the opposite direction. [Nathan] mentions that he worked out all the movements in a spreadsheet and that future firmware upgrades will dramatically increase the speed at which the bot moves. We love the audible cadence of the bot, which is easily observed in the video above. At one point a leg dangles over the edge and it looks like [Nathan] pushed the bot back, but I don’t remember him actually touching it, so I’m calling this a trick of the camera angle.
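The back-up-and-turn behavior [Nathan] described is simple enough to sketch as a little state machine. The mode names, the three-step retreat, and the action strings below are hypothetical placeholders, not his actual firmware:

```python
def tabletop_step(surface_below, state):
    """One control tick for a Phase 1 tabletop bot.
    surface_below: reading from the edge sensor hanging off the front.
    state: dict with 'mode' ("walk" or "retreat") and 'steps_left'.
    Returns the action to perform this tick."""
    if state["mode"] == "walk":
        if surface_below:
            return "step_forward"
        # edge detected: back up a few steps before turning around
        state["mode"] = "retreat"
        state["steps_left"] = 3
    if state["mode"] == "retreat":
        if state["steps_left"] > 0:
            state["steps_left"] -= 1
            return "step_backward"
        state["mode"] = "walk"      # after the about-face, resume walking
        return "turn_in_place"
```

Calling this once per gait cycle gives the same pattern seen in the video: walk to the edge, a few steps back, an about-face, and off in the opposite direction.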
One phase further in the Tabletop Challenge is [Joe Carson]. He exhibited a wheeled robot he’s been working on that includes a gripper arm on the front. The robot looks around the table for a predefined color, in this case provided by a highlighting marker. When found, the bot approaches, grips, and then proceeds to move the marker over the void, where it is dropped out of existence, at least from the robot’s point of view.
A couple of years ago [Paul] worked on a standalone chemical sensor and had a few extra boards sitting around after the project was done. As any resourceful hacker will do, he reached for them as the closest and easiest solution when he needed to log data for a quick test. It was quite some time before he went back to try out commercially available loggers, and he found a problem in doing so.
The performance of off-the-shelf data loggers wasn’t doing it for [Paul’s] team. They kept having issues with the noise level found in the samples. Since he had been patching into the chemical sensor PCBs and getting better results, the impetus for a new product appeared.
The flagship 24-bit, 8-channel Sigzig samples 0-5 V with less than 1 µV of noise. A less expensive 4-channel differential unit offers 18-bit resolution with 10-12 µV of noise. They are targeting $199 and $399 price points for the two units. We asked about the sample rate in the video below. The smaller version shown here captures up to 240 samples per second. The big guy has the hardware potential to sample 30,000 times per second, but since the data streams continuously over USB that rate is currently limited to much less.
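Those noise figures are easy to put in perspective with a little arithmetic: on a 0-5 V range, one step of a 24-bit converter is about 0.3 µV, so a noise floor of 1 µV works out to roughly 22 noise-free bits. A quick sanity check (the 5 V span and 1 µV figure are from above; the rest is just math):

```python
import math

full_scale = 5.0          # volts, the 0-5 V input range
bits = 24                 # converter resolution
noise = 1e-6              # volts, the claimed <1 uV of noise

lsb = full_scale / 2**bits                 # size of one code step
noise_free_bits = math.log2(full_scale / noise)

print(f"LSB = {lsb * 1e6:.2f} uV")                 # ~0.30 uV per step
print(f"noise-free bits = {noise_free_bits:.1f}")  # ~22.3
```

Which is why sub-microvolt noise, not the 24-bit label, is the headline spec here.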
Update: It has been pointed out in the comments that USB may not be the choke point for sample rate.
[Gaurav Taneja] was showing off his projection clock add-on for iPhone, called Clockety, at this year’s Consumer Electronics Show. The concept is pretty neat: a clip-on clock which uses the iPhone’s flash LED as the light source. It may sound a little gimmicky until you see the functionality of the accompanying app, which is shown off in the video after the break. Once the clip is on the phone, you lay it face down on your nightstand and a gentle tap on the furniture will turn the projection on or off. This is a killer feature when you’re staying someplace without an illuminated bedside clock.
Flip-dot displays are grand, especially this one, which boasts 74,088 pixels! I once heard the hardware compared to e-ink. That’s actually a pretty good description since both show pixels that are either black or white, hold their state with no power applied, and only use electricity when flipping those bits.
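That only-pay-to-flip property shapes how you drive one of these: rather than refreshing the whole wall every frame, a controller can diff the incoming frame against the last one and pulse only the coils whose dots actually change. A toy sketch of that diffing step (the frame format and function name are made up for illustration):

```python
def dots_to_flip(current, target):
    """current/target: 2D lists of booleans (True = light side showing).
    Returns (x, y, new_state) for every dot whose coil needs a pulse;
    unchanged dots cost no energy at all."""
    flips = []
    for y, (cur_row, new_row) in enumerate(zip(current, target)):
        for x, (cur, new) in enumerate(zip(cur_row, new_row)):
            if cur != new:
                flips.append((x, y, new))
    return flips
```

On a 74,088-dot wall, a mostly static image might need only a handful of pulses per frame, which is part of what makes installations this size practical.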
What’s remarkable about this is the size of the installation. It occupied a huge curving wall on the ooVoo booth at 2015 CES. We wanted to hear more about the hardware so we reached out to them, and they didn’t disappoint. The ooVoo crew made time for a conference call which included [Pat Murray], who coordinated the build effort. That’s right, they built this thing — we had assumed it was a rental. [Matt Farrell] recounts that during conception, the team had asked themselves how a company whose business is HD video chat on mobile could show off display technology when juxtaposed with cutting-edge 4K and 8K displays. We think the flip-dot was a perfect tack — I know I spent more time looking at this than at the televisions.
Join us after the break for the skinny on how it was built, including pictures of the back side of the installation and video clips that you have to hear to believe.