Supercon: Alex Hornstein’s Adventures In Hacking The Lightfield

We are all familiar with the idea of a hologram, either from the monochromatic laser holographic images you’ll find on your bank card or from fictional depictions such as Princess Leia’s distress message from Star Wars. And we’ve probably read about how laser holograms work, with a split beam of coherent light recombined to fall upon a photographic plate. They require no special glasses or headsets and possess both stereoscopic and spatial 3D rendering, in that you can view both the 3D Princess Leia and your bank’s logo, or whatever is on your card, as 3D objects from multiple angles. So we’re all familiar with that holographic end product, but what we probably aren’t so familiar with is what it represents: the capture of a light field.

In his Hackaday Superconference talk, Alex Hornstein, co-founder and CTO of holographic display startup Looking Glass Factory, introduced us to the idea of the light field, and how its capture is key to understanding the mechanics of a hologram.

Capturing the light field with a row of GoPro cameras.

His first point is an important one: he expands the definition of a hologram from its conventional form, one of those monochromatic laser-interference photographic images, into any technology that captures a light field. This is, he concedes, a contentious barrier to overcome. To do that, he first has to explain what a light field is.

When we take a 2D photograph, we capture all the rays of light that are incident upon something that is a good approximation to a single point: the lens of the camera involved. The scene before us of course contains countless other rays, incident upon other points or reflected from surfaces invisible from the single vantage point of the 2D camera. It is this complex array of light rays that makes up the light field of the scene, and capturing it in its entirety is key to manipulating the result, no matter the technology used to bring it to the viewer. A light field capture can be used to generate variable-focus 2D images after the fact, as is the case with the Lytro cameras, or it can be used to generate a hologram in the way that he describes.
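To make that concrete, here is a minimal sketch of the kind of after-the-fact refocusing a Lytro-style capture allows, assuming the light field has already been recorded as a row of 2D views taken at evenly spaced positions along a line; the file names and the pixel-shift factor are illustrative, not figures from the talk. Shift each view in proportion to its offset from the center view and average the stack, and objects at the depth corresponding to that shift snap into focus while everything else blurs.

import cv2
import numpy as np

def refocus(views, shift_per_view):
    # Synthetic-aperture refocus: translate each view in proportion to its
    # offset from the central view, then average the whole stack.
    h, w = views[0].shape[:2]
    center = (len(views) - 1) / 2.0
    acc = np.zeros((h, w, 3), dtype=np.float64)
    for i, view in enumerate(views):
        dx = (i - center) * shift_per_view        # horizontal shift in pixels
        m = np.float32([[1, 0, dx], [0, 1, 0]])   # pure translation matrix
        acc += cv2.warpAffine(view, m, (w, h)).astype(np.float64)
    return (acc / len(views)).astype(np.uint8)

# Hypothetical capture: one frame per stop of a camera slid along a rail.
views = [cv2.imread(f"view_{i:02d}.png") for i in range(16)]
cv2.imwrite("refocused.png", refocus(views, shift_per_view=2.5))

Changing shift_per_view refocuses the result to a different depth, which is exactly the trick the Lytro performs computationally after the shot has been taken.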

One possible future use of the technology, a virtual holographic aquarium.

The point of his talk is that complex sorcery isn’t required to capture a light field, something he demonstrates in front of the audience with a volunteer and a standard webcam on a sliding rail. Multiple 2D images are taken at different points along the rail and combined to form a light field. That not every component of the light field has been captured matters less than having enough of it to create the holographic image from the point of view of the display. And since he happens to be head honcho at a holographic display company, he can show us the result. Looking Glass Factory’s display panel uses a lenticular lens to combine the multiple images into a hologram, and is probably one of the least expensive practical ways to display this type of image.
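For anyone wanting to replicate the rail-and-webcam demonstration, here is a minimal sketch of the capture step. It assumes a webcam that OpenCV can open and a rail you advance by hand between shots; the frame count, the key press, and the grid layout are my assumptions rather than details from the talk. It grabs one frame per stop and tiles them into a single grid image of the sort lenticular-display pipelines typically consume.

import cv2
import numpy as np

VIEWS = 16           # number of stops along the rail (assumed)
COLS, ROWS = 4, 4    # layout of the combined grid image

cap = cv2.VideoCapture(0)    # the webcam riding on the sliding rail
frames = []
while len(frames) < VIEWS:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("live view: press space to capture this position", frame)
    if cv2.waitKey(1) & 0xFF == ord(' '):
        frames.append(frame)     # capture, then slide the camera to its next stop
cap.release()
cv2.destroyAllWindows()

# Tile the captured views row by row into one grid image.
h, w = frames[0].shape[:2]
grid = np.zeros((ROWS * h, COLS * w, 3), dtype=np.uint8)
for i, f in enumerate(frames):
    r, c = divmod(i, COLS)
    grid[r * h:(r + 1) * h, c * w:(c + 1) * w] = f
cv2.imwrite("light_field_grid.png", grid)

Looking Glass’s own tooling expects a particular tiled layout it calls a quilt, so treat the simple row-by-row arrangement above as an illustration of the principle rather than a drop-in input for their display.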

The concept of a light field has been in the air since the arrival of the Lytro cameras a year or two ago, but it has more often than not been surrounded by proprietary marketing woo. This talk breaks through that to deliver a clear explanation of the subject, and is a fascinating watch. Alex leaves us with news of some of the first light-field-derived video content being put online, and with some decidedly science-fiction possible futures for the technology. Even if you aren’t planning to work in this field, you will almost certainly encounter it over the next few years.

Continue reading “Supercon: Alex Hornstein’s Adventures In Hacking The Lightfield”

HOPE XII: Make Your Own Holograms

Prior to this weekend I had assumed making holograms to be beyond the average hacker’s reach, either in skill or treasure. I was proven wrong by a Club-Mate box full of electronics, and an acrylic jig perched atop an automotive inner tube. At the Hope Conference, Tommy Johnson was sharing his hacker holography in a workshop that let a few lucky attendees make their own holograms on site!

The technique used here relies on interference between a single beam and its own reflection, rather than on a separately split reference beam. A diffused laser beam is projected through holographic film onto the subject of the hologram, say a bouquet of flowers as in the video below. Photons from that beam reflect off the bouquet and pass back through the film a second time. Since light is a form of electromagnetic radiation that travels as a wave, anywhere that two peaks (one from the incoming beam, the other from the reflected light) align on the film, exposure occurs. After just a half-second exposure the film is ready to be developed, and if everything went right you have created a hologram.
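As a rough illustration of why those aligned peaks matter, here is a minimal sketch of two-wave interference; it is not the workshop’s actual optics, and the wavelength and angle are arbitrary stand-ins. It sums a reference wave and a reflected wave arriving at a slight angle and plots the resulting intensity, which is the fringe pattern the film records.

import numpy as np
import matplotlib.pyplot as plt

wavelength = 650e-9             # a red laser line, in meters (assumed)
k = 2 * np.pi / wavelength      # wavenumber
theta = np.radians(2.0)         # small angle between the two waves (assumed)
x = np.linspace(0, 50e-6, 2000)                  # 50 micrometers across the film
reference = np.exp(1j * k * x * np.sin(0.0))     # wave arriving head-on
reflected = np.exp(1j * k * x * np.sin(theta))   # wave scattered back by the subject
intensity = np.abs(reference + reflected) ** 2   # what the film actually records
plt.plot(x * 1e6, intensity)
plt.xlabel("position across film (µm)")
plt.ylabel("relative intensity")
plt.title("Interference fringes recorded by the film")
plt.show()

For these numbers the fringe spacing works out to roughly 19 µm, and a real hologram records far finer detail than that, which is presumably what the automotive inner tube mentioned above is for: keeping vibration from smearing the fringes during the exposure.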

Simple, right? In theory, at least. In practice Tommy’s been doing this for nearly 30 years and has picked up numerous tips along the way. Let’s take a look at the hardware he brought for the workshop.

Continue reading “HOPE XII: Make Your Own Holograms”

A Telepresence System That’s Starting To Feel Like A Holodeck

[Dr. Roel Vertegaal] has led a team of collaborators from [Queen’s University] to build TeleHuman 2 — a telepresence setup that aims to project your actual-size likeness in 3D.

Developed primarily for business videoconferencing, the setup requires a bit of space on both ends of the call. A ring of stereoscopic z-cameras captures the subject from all angles, and the corresponding projectors on the other end display what they see. Those projectors are arranged in a similar halo above a human-sized, retro-reflective cylindrical screen which can be walked around, letting you view the image from any angle in real time without a VR headset or glasses!

Continue reading “A Telepresence System That’s Starting To Feel Like A Holodeck”

Inexpensive Display Jumps to Life

If you’ve ever been to a local fair or amusement park, chances are you’ve seen an illusion known as Pepper’s Ghost. To perform the illusion, essentially all that’s needed is a thin sheet of plastic or a one-way mirror and a light source. Get it right, and you’ll have apparitions popping up in all kinds of interesting places. With just the right software, though, one of those places could be in your own 3D display.

Using just a tablet and a sheet of plastic rolled into a cone, a three-person team was able to create a 3D display based on the Pepper’s Ghost illusion. Software the team developed alters an image so that when it reflects off the plastic cone it appears as a 3D rendering of the original picture. The rendering is perspective-correct and offers a novel way to interact with a 3D model without needing expensive equipment or special glasses.

If you do have some fancy equipment sitting around, like a computer monitor and some plexiglass, similar 3D displays have been made using the same effect. The team that developed this one hasn’t made its code public yet, but has promised to release it soon so that others can build their own displays.
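While we wait for that release, here is a minimal sketch of the simpler, four-sided ‘pyramid’ variant of the illusion that works with a flat screen and four angled reflectors. It is not the team’s perspective-correct cone renderer, just the classic compositor that places four rotated copies of an image around a black background so that each face of the pyramid reflects its own view.

import cv2
import numpy as np

def pyramid_composite(img, canvas_size=1080):
    # Place four rotated copies of img around a square black canvas,
    # one copy per face of a reflective pyramid sitting on the screen.
    side = canvas_size // 3
    img = cv2.resize(img, (side, side))
    canvas = np.zeros((canvas_size, canvas_size, 3), dtype=np.uint8)
    mid = (canvas_size - side) // 2
    canvas[0:side, mid:mid + side] = cv2.rotate(img, cv2.ROTATE_180)           # far face
    canvas[canvas_size - side:, mid:mid + side] = img                          # near face
    canvas[mid:mid + side, 0:side] = cv2.rotate(img, cv2.ROTATE_90_CLOCKWISE)  # left face
    canvas[mid:mid + side, canvas_size - side:] = cv2.rotate(img, cv2.ROTATE_90_COUNTERCLOCKWISE)  # right face
    return canvas

frame = cv2.imread("model_render.png")    # any rendering of your 3D model (hypothetical file name)
cv2.imwrite("pepper_ghost_screen.png", pyramid_composite(frame))

Which copy needs which rotation depends on how the reflector sits on the screen, so expect to tweak the orientations for your particular build.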

Thanks to [bmsleight] for the tip!

Review: New 3G and Cat-M1 Cellular Hardware from Hologram

In July we reported on the launch of the Hologram developer program that offered a free SIM card and a small amount of monthly cellular data for those who wanted to build connectivity into their prototypes. Today, Hologram has launched some new hardware to go along with that program.

Nova is a cellular modem in a USB thumb drive form factor. It ships in a little box with a PCB that hosts the u-blox cellular module, two different antennas, a plastic enclosure, and a SIM card. The product is aimed at those building connected devices around single-board computers, making it easy to plug Nova in and get connected quickly.
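To give a sense of what “plug in and get connected quickly” looks like from a single-board computer, here is a minimal sketch using Hologram’s open-source hologram-python SDK. The device key is a placeholder, and the method names follow the SDK’s published examples as of this writing, so treat it as an illustration rather than gospel.

# Assumes the Nova is plugged into a Raspberry Pi (or similar single-board
# computer) and the SDK is installed: pip install hologram-python
from Hologram.HologramCloud import HologramCloud

credentials = {'devicekey': 'YOUR-DEVICE-KEY'}    # placeholder, from the Hologram dashboard
hologram = HologramCloud(credentials, network='cellular')
if not hologram.network.connect():                # bring up the cellular link
    print('Failed to connect to the cellular network')
else:
    result = hologram.sendMessage('Hello from the Nova!', topics=['test'])
    print(hologram.getResultString(result))       # human-readable send status
    hologram.network.disconnect()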

This device that Hologram sent me is a 3G modem. They have something like 1,000 of them available to ship starting today, but what I find really exciting is that there is another flavor of Nova that looks the same but hosts a Cat-M1 version of the u-blox module. This is a Low Power Wide Area Network technology built on the LTE network. We’ve seen 2G and 3G modems available for some time now, but if you go that route you’re building a product around a network which has an end-of-life concern.

Cat-M1 will be around for much longer, and it is designed to be low power, using a narrower bandwidth for less radio-on time. I asked Hologram for some power comparison estimates between the two technologies:

AVERAGE current consumption comparisons:

Cat-M1: as low as 100 mA while transmitting and never more than 190 mA
Equivalent 3G: as high as 680 mA while transmitting

PEAK current consumption comparisons (these are typically filtered through capacitors so the power supply doesn’t ever witness these values, and they are only momentary):

Cat-M1: Less than 490 mA
Equivalent 3G: As high as 1550 mA

This is an exciting development, because we haven’t yet seen LTE radios available for devices like these; there are hotspots, of course, but those are certainly not optimized for low power or for inclusion in a product. And if you know your ESP8266 WiFi specs, you know that the figures above put Cat-M1 on a similar power budget and in the realm of battery-operated devices.
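To put those numbers in perspective, here is a back-of-envelope sketch of battery life using the average currents quoted above. The duty cycle, sleep current, and battery capacity are assumptions of mine, not figures from Hologram.

# Rough battery-life estimate for a duty-cycled cellular sensor node.
BATTERY_MAH = 2000.0        # assumed battery capacity
SLEEP_MA = 0.5              # assumed whole-node sleep current
TX_SECONDS_PER_HOUR = 10    # assumed: ten seconds of transmit time per hour

def days_of_life(tx_ma):
    duty = TX_SECONDS_PER_HOUR / 3600.0
    avg_ma = duty * tx_ma + (1 - duty) * SLEEP_MA
    return BATTERY_MAH / avg_ma / 24.0

print(f"Cat-M1 (190 mA worst-case tx): {days_of_life(190):.0f} days")
print(f"3G     (680 mA tx):            {days_of_life(680):.0f} days")

With these particular assumptions the Cat-M1 node lasts a little over twice as long as its 3G equivalent, and the gap only widens as the transmit schedule gets heavier.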

The Cat-M1 Nova can be ordered beginning today and should ship in limited quantities within weeks, with wider availability by the end of the year. If you can’t get one in the first wave, the 3G Nova is a direct stand-in from the software side of things.

I suspect we’ll see a lot of interest in Cat-M1 technology moving forward, simply because the technology promises lower power and longer support. (I’m trying to avoid using the term IoT… oops, there it is.) For today, let’s take a look at the 3G version of the new hardware and the service that supports it.

Continue reading “Review: New 3G and Cat-M1 Cellular Hardware from Hologram”

Shapes Made From Light, Smoke, and A Lot of Mirrors

Part lightshow, part art piece, part exploratory technology, Light Barrier (third edition) by South Korean duo [Kimchi and Chips] crafts a visual and aural experience of ephemeral light structures using projectors, mirrors, and a light fog.

Presently installed at the ACT Center of Asia Culture Complex in Gwangju, South Korea, Light Barrier coordinates eight projectors, directing their light onto a concave cluster of 630 mirrors. As a result, an astounding 16 million ‘pixel beams’ of refocused light simulate shapes above the array. The array itself was designed in simulation, using an algorithm that made subtle adjustments to each mirror to “grow” the display so that the reflected vectors line up. Upon setup, final calibration of the display used Rulr to treat each ‘pixel beam’ as a ray in 3D space, ensuring image accuracy once the show began. Check out a preview after the break!

Continue reading “Shapes Made From Light, Smoke, and A Lot of Mirrors”

Helix Display Brings Snake Into Three Dimensions

Any time anyone finds a cool way to display in 3D — is there an uncool way? — we’re on board. Instructables user [Gelstronic]’s method involves an array of spinning props to play the game Snake in 3D.

The helix display consists of twelve props, precisely spaced and angled using 3D-printed parts, each carrying twelve individually addressable LEDs. Four control groups of 36 LEDs each are driven by the P8XBlade2 Propeller microcontroller, and the resultant 17,280 voxels per rotation are plenty to produce an identifiable image.
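To give a sense of how such a frame might be addressed, here is a minimal sketch that maps a point in cylindrical coordinates onto (angular step, prop, LED) indices. The 120 angular steps are one way to arrive at the quoted 17,280 voxels (12 props × 12 LEDs × 120 steps), but the height and radius mapping are assumptions of mine rather than details from [Gelstronic]’s write-up.

import math

PROPS = 12          # blades in the helix
LEDS_PER_PROP = 12  # individually addressable LEDs on each blade
ANGLE_STEPS = 120   # 12 * 12 * 120 = 17,280 voxels per rotation

def voxel_index(angle_rad, height_frac, radius_frac):
    # Map a cylindrical point to (angle step, prop, LED) indices.
    # height_frac and radius_frac run 0..1; the real helix staggers its
    # props in height, so this flat mapping is a simplifying assumption.
    step = int(angle_rad / (2 * math.pi) * ANGLE_STEPS) % ANGLE_STEPS
    prop = min(int(height_frac * PROPS), PROPS - 1)
    led = min(int(radius_frac * LEDS_PER_PROP), LEDS_PER_PROP - 1)
    return step, prop, led

# Example: a point a quarter turn around, halfway up, near the rim.
print(voxel_index(math.pi / 2, 0.5, 0.9))   # -> (30, 6, 10)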

In order to power the LEDs, [Gelstronic] used wireless charging coils normally found in cell phones, transferring 10 W of power to the helix array. A brushless motor keeps things spinning, while an Arduino controls speed and position via an encoder. All the links to the code used are found on the project page, and the video of the display in action is after the break.

Continue reading “Helix Display Brings Snake Into Three Dimensions”