Hackaday Prize Entry: Augmented Reality Historical Reenactments

Go to a pier, a boardwalk, the tip of Manhattan, or a battlefield, and you’ll see beautifully crafted coin-operated binoculars. Drop a coin in, and you’ll see the Statue of Liberty, a container ship rolling coal, or a beautiful pasture that was once the site of terrific horrors. For just a quarter, these binoculars let you take in the sights, but simply by virtue of where these machines are placed, you’re standing in the midst of history. There’s so much more there. If only there were a way to experience it.

This is why [Ben Sax] is building the Perceptoscope. It’s a pair of augmented reality binoculars. Drop in a quarter, and you’ll be able to view the entirety of history for an area. Drop this in Battery Park, and you’ll be able to see the growth of Manhattan from New Amsterdam to the present day. Drop this in Gettysburg, and you’ll see a tiny town surrounded by farms become a horrorscape and turn back into a tiny town surrounded by a National Park.

This is a long-term project, with any installations hopefully lasting for decades. That means these Perceptoscopes need to be tough, both in hardware and software. For the software, [Ben] is using WebVR, which renders virtual reality inside a browser. This means the electronics can be just a tablet that can be swapped in and out.

The hardware, though, isn’t as simple. This is going to be a device running in the rain, snow, and freezing weather for decades. Everything must be overbuilt, and already [Ben] has spent far too much time working on the bearing blocks.

Although this is an entry for The Hackaday Prize, it was ‘pulled out’, so to speak, to be a part of the Supplyframe DesignLab inaugural class. The DesignLab is a shop filled with the best tools you can imagine, and exists for only one goal: we’re getting the best designers in there to build cool stuff. The Perceptoscope has been the subject of a few videos coming out of the DesignLab; you can check those out below.

Continue reading “Hackaday Prize Entry: Augmented Reality Historical Reenactments”

Porting NES to the ESP32

There’s an elephant in the room when it comes to the Raspberry Pi Zero. The Pi Zero is an immensely popular single board computer, but the out-of-stock issues for the first year may be due to one simple fact: you can run a Nintendo emulator on it. Instead of cool projects like clusters, CNC controllers, and Linux-based throwies, all the potential of the Pi Zero was initially wasted on rescuing the princess.

Espressif has a new chip coming out, the ESP32, and it’s a miraculous Internet of Things thing. It’s cheap, exceptionally powerful, and although we expect the stock issues to be fixed faster than the Pi Zero, there’s still a danger: if the ESP32 can emulate an NES, it may be too popular. This was the hypothetical supply issue I posited in this week’s Hackaday Links post just twenty-four hours ago.

Hackaday fellow, Hackaday Supercon speaker, Espressif employee, and generally awesome dude [Sprite_tm] just ported an NES emulator to the ESP32. It seems Espressif really knows how to sell chips: just give one of your engineers a YouTube channel.

This build began when [Sprite] walked into his office yesterday and found a new board waiting for him to test. This board features the ESP-WROOM-32 module and breaks out a few of the pins to a microSD card, an FT2232 USB/UART module, JTAG support, a bunch of GPIOs, and a 320×240 LCD on the back. [Sprite]’s job for the day was to test this board, but he reads Hackaday with a cup of coffee every morning (like any civilized hacker) and took the links post as a challenge. The result is porting an NES emulator to the ESP32.

The ESP-32-NESEMU is built on the Nofrendo emulator, and when it comes to emulation, the ESP32 is more than capable of keeping the frame rate up. According to [Sprite], the display is the bottleneck; the SPI-connected display doesn’t quite update fast enough. [Sprite] didn’t have enough time to work on the sound, either, but the source for the project is available, even if this dev board isn’t.
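A bit of back-of-the-envelope arithmetic shows why a plain SPI link struggles here. The numbers below are assumptions for illustration — a 16-bit-per-pixel format and a 40 MHz SPI clock are typical for these small LCD modules, but [Sprite]’s board may differ:

```python
def max_fps(width, height, bits_per_pixel, spi_clock_hz):
    """Upper bound on full-frame refresh rate over a plain SPI link,
    ignoring command overhead and inter-byte gaps."""
    bits_per_frame = width * height * bits_per_pixel
    return spi_clock_hz / bits_per_frame

# A 320x240 LCD at 16 bpp needs 1,228,800 bits per frame.
# Even at a brisk (assumed) 40 MHz SPI clock, the display caps out
# well below the NES's native ~60 Hz output:
fps = max_fps(320, 240, 16, 40_000_000)
print(round(fps, 1))  # roughly 32.6 full frames per second
```

Partial-frame updates and DMA help in practice, but the raw link budget makes it clear why the CPU isn’t the limiting factor.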

Right now, you can order an ESP32; mine are stuck on a container ship a few miles from the port of Long Beach. Supply is still an issue, and now [Sprite] has ensured the ESP32 will be the most popular embedded development platform in recent memory. All of this happened in the space of 24 hours. This is awesome.

Continue reading “Porting NES to the ESP32”

[NE555]’s SMD Prototyping is a Work of Art

One of [NE555]’s boards from the 90s.
Over on Twitter, [NE555] has been posting beautiful SMD prototypes.

Back in the 90s when surface mount components gained widespread adoption, the quick and cheap PCB prototyping services of today were unavailable. This led many to develop their own approaches. In Japan, a particularly novel and beautiful approach was, and still is, somewhat popular. [NE555]’s work is an excellent example of this technique, using fine enameled wire (you can find this on eBay as “magnet wire”), wire-wrap board, and careful hand soldering. [NE555] has made a great video on the process (which you can watch below).

Continue reading “[NE555]’s SMD Prototyping is a Work of Art”

The People, Talks, and Swag of Open Hardware Summit

Friday was the 2016 Open Hardware Summit, a yearly gathering of people who believe in the power of open design. The use of the term “summit” rather than “conference” is telling. This gathering brings together a critical mass of people running hardware companies that adhere to the ideal of “open”, but not to the exclusion of anyone — all are welcome to attend. Hackaday has built the world’s largest repository of Open Hardware projects. We didn’t just want to be there — we sponsored, sent a team of people, and thoroughly enjoyed ourselves in the process.

Join me after the break for a look at the talks, a walk through the swag bags, and a feel for what this wonderful day held.

Continue reading “The People, Talks, and Swag of Open Hardware Summit”

Beware Common Sense Engineering

I am always torn about the title of “engineer.” When I talk to school kids about engineering, I tell them that an engineer is a person who uses science and math to solve or analyze practical problems. However, these days you hear a lot of engineering titles thrown around for anyone who does any sort of technical (and sometimes non-technical) work. “Software engineers” don’t have to be licensed to practice, while civil engineers do. What’s in a name, and does any of this matter?

Continue reading “Beware Common Sense Engineering”

Self-Driving R/C Car Uses An Intel NUC

Self-driving cars are something we are continually told will be the Next Big Thing. It’s nothing new; we’ve seen several decades of periodic demonstrations of the technology as it has evolved. Now we have real prototype cars on real roads rather than test tracks, and though they are billion-dollar research vehicles from organisations with deep pockets and a long view, it is starting to seem that this is a technology we have a real chance of seeing at a consumer level.

A self-driving car may seem as though it is beyond the abilities of a Hackaday reader, but while it might be difficult to produce safe collision avoidance for a full-sized car on public roads, it’s certainly not impossible to produce something with rather more modest capabilities. [Jaimyn Mayer] and [Kendrick Tan] have done just that, creating a self-driving R/C car that can follow a complex road pattern without human intervention.

The NUC’s-eye view. The green line is a human’s steering, the blue line the computed steering.

Unexpectedly, they have eschewed the many ARM-based boards, instead going for an Intel NUC mini-PC powered by a Core i5 as the brains of the unit. It’s powered by a laptop battery bank, and takes input from a webcam. Direction and throttle are computed by the NUC and sent to an Arduino, which handles the car control. There is also a radio control channel allowing the car to be switched between autonomous, human-controlled, and emergency stop modes.
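The NUC-to-Arduino link is a classic split: the PC does the heavy vision work and streams small steering/throttle commands to a microcontroller that talks to the servos. The framing below — a start byte, two signed payload bytes, and an XOR checksum — is an illustrative assumption, not the protocol from their write-up:

```python
import struct

def pack_command(steering, throttle):
    """NUC side: pack steering and throttle (each -1.0..1.0) into a
    4-byte frame: start byte 0xAA, two int8 payload bytes, XOR checksum.
    This framing is a hypothetical sketch, not the authors' protocol."""
    s = max(-127, min(127, int(steering * 127)))
    t = max(-127, min(127, int(throttle * 127)))
    payload = struct.pack('bb', s, t)
    checksum = payload[0] ^ payload[1]
    return b'\xAA' + payload + bytes([checksum])

def unpack_command(frame):
    """Arduino-side decode (shown in Python for brevity): validate the
    start byte and checksum, return (steering, throttle) or None on a
    corrupt frame so the car can fail safe rather than act on noise."""
    if len(frame) != 4 or frame[0] != 0xAA:
        return None
    if (frame[1] ^ frame[2]) & 0xFF != frame[3]:
        return None
    s, t = struct.unpack('bb', frame[1:3])
    return s / 127, t / 127
```

Rejecting malformed frames matters here: a glitched byte on the serial line should be dropped, not turned into a hard-left command.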

They go into detail on the polarizing and neutral density filters they used with their webcam, something that may make interesting reading for anyone working in machine vision. All their code is open source, and can be found linked from their write-up. Meanwhile, the video below the break shows their machine on their test circuit, completing it with varying levels of success.
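The core idea behind vision-based steering of this sort is simple enough to sketch in a few lines: threshold the camera image so the lane markings stand out, then steer toward their centroid. This toy one-row version is a stand-in for their actual pipeline (which works on whole frames and is linked from their write-up), not a reproduction of it:

```python
def steering_from_mask(mask_row):
    """Given one row of a thresholded camera image (1 = lane marking,
    0 = background), return a steering value in -1.0..1.0 proportional
    to how far the lane centroid sits from the image centre."""
    marked = [x for x, v in enumerate(mask_row) if v]
    if not marked:
        return 0.0  # no lane visible: hold straight (or trigger the e-stop)
    centroid = sum(marked) / len(marked)
    half_width = (len(mask_row) - 1) / 2
    return (centroid - half_width) / half_width

# Lane markings bunched to the right of an 11-pixel-wide row:
row = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0]
print(steering_from_mask(row))  # 0.6 — steer right
```

This proportional-only approach is exactly the kind of thing their polarizing filter helps: glare on wet tarmac shows up in the thresholded mask and drags the centroid off the real lane.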

Continue reading “Self-Driving R/C Car Uses An Intel NUC”