Open Hybrid Gives You The Knobs And Buttons To Your Digital Kingdom

With the sweeping wave of complexity that comes with new appliance tech, it’s easy to start grumbling about having to pull your phone out every time you want to turn on the kitchen lights. [Valentin] realized that our new interfaces aren’t making our lives much simpler, and he and the folks at the MIT Media Lab have developed a solution.

Open Hybrid takes the interface out of the phone app and superimposes it directly onto the items we want to operate in real life. The Open Hybrid interface is viewed through the lens of a tablet or other smart mobile device. Over a real-time video stream, an interactive set of knobs and buttons superimposes itself on the objects it controls. In one example, holding a tablet up to a light brings up a color palette for color control. In another, sliders superimposed on a Mindstorms tank-drive toy become the control panel for driving the vehicle around the floor. Object behaviors can even be tied together so that an action applied to one object, such as turning off one light, is applied to the others as well, in this case putting all the other lights out.
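That linked-behavior idea is easy to picture in code. Here is a minimal, hypothetical sketch of the propagation logic — this is not Open Hybrid’s actual API; the object names and the `link`/`apply` methods are invented purely for illustration:

```python
# Hypothetical sketch (not Open Hybrid's real API): linking object
# behaviors so an action applied to one object propagates to its peers.

class SmartObject:
    def __init__(self, name):
        self.name = name
        self.state = {"power": True}
        self.links = []                    # peer objects that mirror our actions

    def link(self, other):
        self.links.append(other)

    def apply(self, action, value, _seen=None):
        # _seen guards against infinite recursion when links form a cycle
        seen = _seen if _seen is not None else set()
        if self.name in seen:
            return
        seen.add(self.name)
        self.state[action] = value
        for peer in self.links:
            peer.apply(action, value, seen)

kitchen = SmartObject("kitchen")
hallway = SmartObject("hallway")
porch = SmartObject("porch")
kitchen.link(hallway)
kitchen.link(porch)

kitchen.apply("power", False)   # turning off one light puts the others out too
```

The cycle guard matters because real installations would likely link lights to each other in both directions.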

Beneath the surface, Open Hybrid is built on OpenFrameworks, with the hardware interface handled by an Arduino Yún running custom firmware. Creating a new application, though, has been simplified to the point of being achievable with web-friendly languages (HTML, JavaScript, and CSS). The net result is a toolchain that removes the need for extensive graphics programming knowledge when developing a new control panel.

If you can spare a few minutes, check out [Valentin’s] SolidCon talk on the drive to design new digital interfaces that echo those we’ve already been using for hundreds of years.

Last but not least, Open Hybrid may have been born in the Labs, but its evolution is up to the community as the entire project is both platform independent and open source.

Sure, it’s not mustaches, but it’s definitely more user-friendly.


Meetup In Boston This Thursday

Hackaday is headed to Boston this week. Meet up with us on Thursday at 6pm to show off your projects and meet other hackers in the area. Admission is free, just tell us you’re coming.

We’re hosting a Hackaday meetup at the Artisan’s Asylum hackerspace. That name should sound familiar: this is the group that decided to throw down the robot gauntlet with Japan. We can’t wait to see what that’s all about firsthand!

While in town we’ll also be stopping by the MIT Media Lab, a legendary den of cutting-edge research that springs forth wave after wave of awesome inspiration. If you know of any projects going on there that we just shouldn’t miss, please let us know below. We’re also looking for suggestions of other places to check out while we’re in town.

See you Thursday!

State-Aware Foldable Electronics Enters The Third Dimension

Still working with PCBs in 2D? Not [Yoav]. With some clever twists on the way we fab PCBs, he’s managed to create a state-aware foldable circuit board that responds to different configurations.

In his paper [PDF warning], [Yoav] discusses two techniques for developing foldable circuits that can be used repeatedly. The first method involves printing the circuit onto a flexible circuit board material and then binding it front-and-back between two sheets of acrylic. Valid fold lines are defined by the edges of the individual acrylic pieces. The second method involves laying out circuits manually with conductive copper tape and then exposing pads whose open or closed state identifies the current fold configuration.
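The second method boils down to reading which pads the current fold bridges. A hypothetical sketch of that state-detection step — the pad names and the configuration table are invented for illustration, not taken from [Yoav’s] paper:

```python
# Hypothetical sketch: map the set of fold-sensing pads that read
# "closed" to a named configuration of the foldable board. Pad names
# and the configuration table are invented for illustration.

CONFIGURATIONS = {
    frozenset(): "flat",
    frozenset({"pad_a"}): "half_folded",
    frozenset({"pad_a", "pad_b"}): "fully_folded",
}

def detect_configuration(pad_states):
    """pad_states maps pad name -> True if folding closes that contact."""
    closed = frozenset(n for n, c in pad_states.items() if c)
    return CONFIGURATIONS.get(closed, "unknown")

print(detect_configuration({"pad_a": True, "pad_b": False}))  # half_folded
print(detect_configuration({"pad_a": False, "pad_b": False})) # flat
```

On real hardware the `pad_states` dictionary would come from digital input reads rather than being passed in directly.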

Reconfigurable foldable objects may open the door for many creative avenues; in the video (after the break), [Yoav] demonstrates the project’s state-awareness with a simple onscreen rendering that echoes its physical counterpart.

While these circuits are fabbed from a custom material stack rather than FR1 or FR4, don’t let that note hold your imagination back. In fact, if you’re interested in using FR4 PCBs as structural elements, check out [Voja’s] comprehensive guide on the subject.


What You See Is What You (Laser) Cut

WYSIWYG editors revolutionized content management systems; will WYSIWYC interfaces do the same for laser cutters? Unlikely, but we still appreciate the concepts shown here. Chalkaat uses computer vision to trace lines drawn in ink with the cutting power of a laser.

At its core, you simply draw on your workpiece with a colored marker and the camera system ensures the laser traces that line exactly. There is even a proof of concept here for triggering different behaviors based on line color, and the technique is not limited to white paper: it can also identify and cut printed materials.

This is a spin on [Anirudh’s] first version, which used computer vision and a projector to create a virtual interface for a laser cutter. This time around we can think of a few different uses for it. The obvious one is the ability for anyone to use a laser cutter by drawing their designs by hand. Imagine introducing grade-school children to this type of technology by having them draw paper puppets and scenery in advance and having them cut in shop class for use in art projects.

A red line indicates a cut, while a pink arrow indicates positioning on the workpiece. The example shows a design for a cellphone etched next to a positioning marker, but we could see this used to position expensive things (like a MacBook) for etching. We also think the red marker could be used to make slight adjustments to cut pieces by scribing a workpiece with the marker and having the laser cut the marked material away.
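The color-keyed behavior comes down to deciding, for each detected stroke, which marker drew it. A toy sketch of that classification step — the reference RGB values are invented, and a real pipeline (Chalkaat appears to use a proper computer vision stack) would threshold in HSV and trace whole contours rather than single pixels:

```python
# Hypothetical sketch of the color-keyed step: classify a marker
# stroke's pixel as "cut" (red) or "position" (pink) by finding the
# nearest reference color. Reference RGB values are invented.

REFERENCE = {
    "cut": (200, 30, 30),         # red marker -> cut this line
    "position": (240, 120, 170),  # pink marker -> positioning only
}

def classify_pixel(rgb):
    """Return the label whose reference color is closest in RGB space."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(REFERENCE, key=lambda label: dist2(rgb, REFERENCE[label]))

print(classify_pixel((210, 40, 25)))    # cut
print(classify_pixel((235, 130, 180)))  # position
```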

This concept is a product of [Nitesh Kadyan] and [Anirudh Sharma] at the Fluid Interfaces group at the MIT Media Lab and is something we could see being built into future laser cutter models. What do you think?


Open Hardware For Open Science – Interview With Charles Fracchia

Open Science has been a long-standing ideal for many researchers and practitioners around the world. It advocates the open sharing of scientific research, data, processes, and tools and encourages open collaboration. While not without challenges, this mode of scientific research has the potential to change the entire course of science, allowing for more rigorous peer review and large-scale scientific projects, accelerating progress, and enabling otherwise unimaginable discoveries.

As with any great idea, there are a number of obstacles to such a thing going mainstream. The biggest one is certainly the existing incentive system that lies at the foundation of the academic world. A limited number of opportunities, relentless competition, and pressure to “publish or perish” usually end up incentivizing exactly the opposite: keeping results closed and doing everything to gain a competitive edge. Still, against all odds, a number of successful Open Science projects are out there in the wild, making profound impacts on their respective fields. The HapMap Project, OpenWorm, the Sloan Digital Sky Survey, and the Polymath Project are just a few of them. And the whole movement is just getting started.

While some of these challenges are universal, when it comes to biology and biomedical engineering, the road to Open Science is paved with problems that go beyond crafting proper incentives for researchers and academic institutions.

It will require building hardware.


Mediated Matter At The MIT Media Lab

Few things have managed to capture the imagination of hackers and engineers around the world the way Synthetic Biology has over the last couple of years. The promise of “applying engineering principles to designing new biological devices and systems” just seemed way too sci-fi to miss out on, and everyone jumped on the bandwagon. All of a sudden, the field, which used to be restricted to traditional research organizations and startups, found itself crowded with all sorts of enthusiasts, biohackers, and weirdos alike. Competitions such as the International Genetically Engineered Machine (iGEM) paved the way, and the emergence of community spaces like GenSpace and BioCurious finally made DNA experimentation accessible to anyone who dares to try. As often happens, sci-fi itself did not go untouched: a whole new genre called “biopunk” emerged, further fueling people’s imaginations and extrapolating worlds to come.


Smile Meter Reacts To Your Expressions With Pharrell’s Happy

Here’s a clever use of a webcam and some facial recognition software. They call it Happy++, and it will DJ [Pharrell’s] “Happy” according to how much you’re smiling (or not at all!).

It’s another project to come out of MIT’s Media Lab, built for a spring event this year by [Rob, Dan & Javier]. The facial tracking software was reused from an older project, the MIT Mood Meter, a clever installation with several zones on campus tracking the apparent “happiness” of the students walking by.

To create the program they split the song into its component stems: drums, vocals, band, and the full mix. As the webcam recognizes a smile, it records the intensity, which in turn brings up the vocals and band. If no smile is present, there is only a drum beat.
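The mixing rule described above fits in a few lines. A hypothetical sketch — the stem names and the simple linear mapping are assumptions for illustration, not Happy++’s actual code:

```python
# Hypothetical sketch of the mixing rule: smile intensity in [0, 1]
# sets the gain of the vocal and band stems, while the drum beat
# always plays. The linear mapping is an assumption for illustration.

def stem_gains(smile_intensity):
    """Return per-stem gains for a given smile intensity."""
    s = max(0.0, min(1.0, smile_intensity))  # clamp to [0, 1]
    return {
        "drums": 1.0,   # always present, even with no smile at all
        "vocals": s,    # fade in as the smile gets bigger
        "band": s,
    }

print(stem_gains(0.0))  # only the drum beat
print(stem_gains(1.0))  # the full mix
```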