RPi Show and Tell Saturday and NYC Meetup on Monday

Join Hackaday for the vanguard of cool emerging technologies next week at our meetup in New York.

Like all our meetups, we’ve gathered some of the neatest technologists to spill the beans on what they’re doing and how they’re doing it. Madison Maxey, founder of Loomia and designer of soft, blinky circuits, will be there. Dr. Ellen Jorgensen, co-founder and executive director of Genspace, the citizen science biotech ‘hackerspace’ in the heart of New York, will be there too. Kari Love & Matthew Borgatti of Super-Releaser, most famous for their super cute pneumatic soft robots, will also be there. It’s still up in the air whether we’ll be racing those robots. Of course, there will also be opportunities for you to present a lightning talk at the meetup.

The meetup will be at Pivotal Labs, 625 Ave of the Americas, on Monday, October 24, starting at 6:30 PM. An RSVP is required, so if you’re coming, head on over to the Meetup page.

Live Video Show and Tell on Saturday

This Saturday, join us online for a special show and tell all about Raspberry Pi projects from 7-8 PM EDT (UTC-4), hosted by Limor Fried of Adafruit and Sophi Kravitz of Hackaday. The live show streams on our YouTube channel and will feature projects from our giant collection of Raspberry Pi projects on Hackaday.io as well as entries in the Enlightened Raspberry Pi contest.

A lot of people have already signed up for the Show and Tell, but we still have some time left for your project. Email sophi@hackaday.com to get on the list.

Retrotechtacular: Piano Rolls, Made By Apple ][

Piano rolls are the world’s longest-lasting recording medium, and its first digital one. They were mass-produced from 1896 to 2008, and you can still get some made today, although they’re a specialty item. The technology behind them, both on the player and the recorder side, is simply wonderful.

[lwalkera] sent us in this marvelous video (embedded below) that provides a late-80s peek inside the works of QRS Records, and the presenter seems to be loving every minute of it.

Player pianos are cool enough, with their “draw bar” pulling air through the holes in the paper roll as it goes by, and pneumatically activating the keys. But did you ever think of how the rolls are made?

Continue reading “Retrotechtacular: Piano Rolls, Made By Apple ][“

Interactive Dynamic Video

If a picture is worth a thousand words, a video must be worth millions. However, computers still aren’t very good at analyzing video. Machine vision software like OpenCV can do certain tasks like facial recognition quite well. But current software isn’t good at determining the physical nature of the objects being filmed. [Abe Davis, Justin G. Chen, and Fredo Durand] are members of the MIT Computer Science and Artificial Intelligence Laboratory. They’re working toward a method of determining the structure of an object based upon the object’s motion in a video.

The technique relies on vibrations, which can be captured by a typical 30 or 60 frames-per-second (fps) camera. Here’s how it works: a locked-down camera images an object, and the object is moved by wind, someone banging on it, or any other mechanical means. This movement is captured on video. The team’s software then analyzes the video to see exactly where the object moved, and by how much. Complex objects can have many vibration modes. The wire-frame figure used in the video is a great example: the hands of the figure vibrate more than its feet. The software uses this information to construct a rudimentary model of the object being filmed. It then lets the user interact with the object by clicking and dragging with a mouse. Dragging the hands produces more movement than dragging the feet.
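To make the pipeline concrete, here’s a minimal Python sketch of the general idea, using OpenCV’s dense optical flow and an FFT to pick out a dominant vibration frequency. This is only an illustration of the approach, not the MIT team’s actual phase-based method, and the file name and frame rate are assumptions:

```python
import cv2
import numpy as np

# Track how each frame deviates from a reference frame, then inspect the
# motion signal in the frequency domain to find a dominant vibration mode.
cap = cv2.VideoCapture("wireframe.mp4")   # hypothetical input clip
ok, first = cap.read()
ref = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

displacements = []                         # mean (dx, dy) flow per frame
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow between the reference frame and this frame
    flow = cv2.calcOpticalFlowFarneback(ref, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    displacements.append(flow.mean(axis=(0, 1)))
cap.release()

disp = np.array(displacements)             # shape: (frames, 2)
fps = 30.0                                 # assumed camera frame rate
x = disp[:, 0] - disp[:, 0].mean()         # horizontal motion, zero-mean
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
print(f"Dominant vibration frequency: {freqs[spectrum.argmax()]:.2f} Hz")
```

The real system goes much further, estimating per-pixel mode shapes so that different parts of the object (the hands versus the feet) respond differently to a simulated force.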

The results aren’t perfect – they remind us of computer animated objects from just a few years ago. However, this is very promising. These aren’t textured wire frames created in 3D modeling software. The models and skeletons were created automatically using software analysis. The team’s research paper (PDF link) contains all the details of their research. Check it out, and check out the video after the break.

Continue reading “Interactive Dynamic Video”

Retrotechtacular: How Solidarity Hacked Polish TV

In the 1980s, Poland was under the grip of martial law as the Communist government of General Wojciech Jaruzelski attempted to repress the independent Solidarity trade union. In Western Europe our TV screens featured as much coverage of the events as could be gleaned through the Iron Curtain, but Polish state TV remained oblivious and restricted itself to wholesome Communist fare.

In September 1985, TV viewers in the city of Toruń sat down to watch an action adventure film and were treated to an unexpected bonus: the screen had a brief overlay with the messages “Solidarity Toruń: Boycotting the election is our duty,” and “Solidarity Toruń: Enough price hikes, lies, repression”. Sadly for the perpetrators, they were caught by the authorities after their second transmission a few days later when they repeated the performance over the evening news bulletin, and they were jailed for four months.

The transmission had been made by a group of dissident radio astronomers and scientists who had successfully developed a video transmitter that could synchronise itself with the official broadcast to produce an overlay that would be visible on every set within its limited transmission radius. This was a significant achievement using 1980s technology in a state in which electronic components were hard to come by. Our description comes via [Maciej Cegłowski], who was able to track down one of the people involved in building the transmitter and received an in-depth description of it.

Transmission equipment seized by the Polish police.

The synchronisation came courtesy of the international effort at the time on Very Long Baseline Interferometry, in which multiple radio telescopes across the world are combined to achieve the effect of a single, much larger instrument. Before GPS made a constant timing signal available, the different groups participating in the experiment had used the sync pulses of TV transmitters to stay in time, establishing a network that spanned the political divide of the Iron Curtain. This expertise allowed them to create a transmitter capable of overlaying the official broadcasts. The police file on the event shows some of their equipment, including a Sinclair ZX Spectrum home computer from the West that was presumably used to generate the graphics.

There is no surviving recording of the overlay transmission; however, a reconstruction has been put on YouTube that you can see below the break, complete with very period Communist TV footage.

Continue reading “Retrotechtacular: How Solidarity Hacked Polish TV”

HDMI Extender Reverse Engineered

[danman] has been playing around with various HDMI video streaming options, and he’s hit on a great low-cost solution. A $40 “HDMI extender” turns out to actually be an HDMI-to-RTP converter under the hood.

He’d done work previously on a similar extender that turned out to use a quirky method to send the video, which he naturally reversed and made to do his bidding. But non-standard formats are a pain. So when he was given a newer version of the same device, and started peeking into the packets with Wireshark, he was pleasantly surprised to find that the output was just MPEG-encoded video over RTP. No hacking necessary.
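If you want to see for yourself what plain MPEG-over-RTP looks like, a rough Python sketch like this can capture the payload. The port number is an assumption (check your own Wireshark capture for the real one), and it ignores RTP header extensions for simplicity:

```python
import socket

RTP_PORT = 5004   # assumed port; confirm with a packet capture
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", RTP_PORT))

with open("capture.mpg", "wb") as out:
    for _ in range(10000):                  # grab a few seconds of video
        packet, _addr = sock.recvfrom(2048)
        if len(packet) < 12:
            continue                        # too short to be a valid RTP packet
        # The fixed RTP header is 12 bytes; the low 4 bits of the first
        # byte give the CSRC count, each CSRC adding another 4 bytes.
        csrc_count = packet[0] & 0x0F
        header_len = 12 + 4 * csrc_count
        out.write(packet[header_len:])      # raw MPEG payload
```

Run it while the extender is transmitting, and a player like VLC can usually make sense of the resulting dump.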

Until now, streaming video over an IP network from an arbitrary HDMI output has been tricky, and [danman] has been more than a little obsessed with getting it working on the cheap. In addition to the previous version of this extender, he also managed to get a stream out of a rooted Android set-top box. That costs a bit more, but can also record at the same time, should you need to.

None of this solves the HDMI HDCP encryption problem, though. You’re on your own for that one.

(Those of you Wireshark wizards out there will note that we just swiped the headline image from the previous version of the project. There were no good images for this one. Sorry about that.)

1575 Bottles of Beer on the (LED) Wall

Say hello to my little friend, lovingly named Flaschen Taschen by the members of Noisebridge in San Francisco. It is a testament to their determination to get more members involved in building big displays each year for the Bay Area Maker Faire (and, perhaps, to drink plenty of Corona beer along the way). I pulled aside a couple of the builders for an interview despite their very busy booth. When you have a huge full-color display standing nine feet tall and ten feet wide, it’s no surprise the booth was packed with people.

Check out the video and then join me after the break for more specifics on how they pulled this off.

Continue reading “1575 Bottles of Beer on the (LED) Wall”

Using An FPGA To Generate Ambient Color From Video

We should all be familiar with TV ambient lighting systems such as Philips’ Ambilight, a ring of LED lights around the periphery of a TV that extends the colors at the edge of the screen to the surrounding lighting. [Shiva Rajagopal] was inspired by his tutor to look at the mechanics of generating a more accurate color representation from video frames, and produced a project using an FPGA to perform the task in real time. It’s not an Ambilight clone; instead, it is intended to produce as accurate a color representation as possible, to give the impression of a TV being on for security purposes in an otherwise empty house.

The concern was that simply averaging the pixel color values would deliver a color, but not necessarily the color a human eye would perceive. He goes into detail about the difference between the RGB and HSL color spaces, and arrives at an equation that gives an importance rating to each pixel, taking into account its saturation and thus how strongly the human eye perceives it. As a result, he can derive his final overall color from these important pixels rather than from the too-dark or too-saturated pixels whose color the viewer’s eye will not register.
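The exact equation is in his report; as an illustration of the general technique, here’s a hedged NumPy sketch that weights each pixel by an assumed function of its HSL saturation and lightness (not [Shiva Rajagopal]’s actual formula):

```python
import numpy as np

def dominant_color(frame_rgb):
    """frame_rgb: HxWx3 uint8 array; returns a weighted-average RGB triple."""
    rgb = frame_rgb.astype(np.float32) / 255.0
    maxc = rgb.max(axis=2)
    minc = rgb.min(axis=2)
    lightness = (maxc + minc) / 2.0
    # HSL saturation, guarding against the gray (maxc == minc) case
    denom = 1.0 - np.abs(2.0 * lightness - 1.0)
    saturation = np.where(denom > 1e-6,
                          (maxc - minc) / np.maximum(denom, 1e-6), 0.0)
    # Assumed importance: favor pixels whose color the eye registers
    # strongly (saturated, mid-lightness) and down-weight the rest.
    weight = saturation * denom
    total = weight.sum()
    if total < 1e-6:                        # e.g. an all-gray frame
        return rgb.reshape(-1, 3).mean(axis=0)
    return (rgb * weight[..., None]).sum(axis=(0, 1)) / total
```

The FPGA version performs this class of per-pixel arithmetic on the live video stream in hardware instead, but the weighting idea is the same.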

The whole project was produced on an Altera DE2-115 FPGA development and education board, and makes use of its NTSC and VGA decoding example code. All his code is available for your perusal in his appendices, and he’s produced a demo video, shown below the break.

Continue reading “Using An FPGA To Generate Ambient Color From Video”