Improve Your Vision With Computer Generated Glasses

[Vitor Pamplona] sent in a project presented at this year’s SIGGRAPH. It’s a piece of hardware that corrects vision without the need for lenses. Yep, software-defined eyeglasses now exist, even if the project is a bit bulky for daily wear.

[Vitor] et al. came up with two versions of hardware for this project. The first is a dual stack of high-resolution LCD displays, while the second revision is an LCD with a lenticular overlay. With this hardware, the team can change the focal plane of an entire image, or just subsets of an image, allowing customized vision correction for anyone with nearsightedness, farsightedness, astigmatism, presbyopia, or even cataracts.
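
For a rough sense of the geometry involved, here’s a back-of-the-envelope sketch of why a layered or lenticular display can stand in for a lens. This is our own simplified illustration, not the authors’ algorithm: for a viewer with a residual defocus error of ΔD diopters looking at a display at distance d, the view aimed at a pupil sub-aperture offset x_p needs its image pre-shifted by roughly x_p · d · ΔD so that rays entering different parts of the pupil all land on the same retinal spot. The dual LCD stack or lenticular overlay is what physically steers each of those per-sub-aperture views. All names and numbers below are ours, purely for illustration.

```python
# Back-of-the-envelope sketch (NOT the paper's algorithm): how far each
# sub-aperture view on a light-field display must be pre-shifted so that
# rays entering different parts of the pupil converge on the same retinal
# point for an eye with uncorrected defocus.

def view_shift_mm(pupil_offset_mm, viewing_distance_m, defocus_diopters):
    """Approximate lateral pre-shift (in mm, in the display plane) for the
    view that enters the pupil at `pupil_offset_mm` from its center."""
    # Small-angle approximation: shift ~ x_p * d * delta_D
    return pupil_offset_mm * viewing_distance_m * defocus_diopters

if __name__ == "__main__":
    # Hypothetical example: a 2.5 D defocus error, display 30 cm away,
    # a 4 mm pupil sampled at five sub-apertures.
    for x_p in (-2.0, -1.0, 0.0, 1.0, 2.0):   # mm from pupil center
        shift = view_shift_mm(x_p, 0.30, 2.5)
        print(f"sub-aperture {x_p:+.1f} mm -> pre-shift {shift:+.2f} mm")
```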

With plenty of head-mounted augmented reality platforms coming down the pike, such as Google’s Project Glass and a few retina displays, we could see this type of software-defined vision correction being very useful for the 75% of adults who use some form of vision correction. It may just be a small step towards a real-life VISOR, but we glasses-wearing folk will take what we can get.

You can check out the PDF of the paper here, or watch the video after the break.

Continue reading “Improve Your Vision With Computer Generated Glasses”

Tracking Small Changes In Video To See Someone’s Pulse

[Gil] sent in an awesome paper from this year’s SIGGRAPH. It’s a way to detect subtle changes in a video feed, from [Hao-Yu Wu] et al. at the MIT CS and AI Lab and Quanta Research. To get a feel for what this paper is about, check out the video and come back when you’ve picked your jaw up off the floor.

The project works by detecting and amplifying very small changes in color across successive frames of video. In the demo, the researchers were able to detect someone’s pulse by noting the minute changes in the color of their skin as blood is pumped through their face.
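
As a rough illustration, here’s our own minimal sketch of the idea (not the MIT code): pool each frame into coarse blocks so per-pixel noise averages out, band-pass filter every block’s brightness over time around plausible heart rates, amplify that tiny signal, and add it back. The band limits, gain, and block size below are guesses for illustration only.

```python
# Minimal, simplified sketch of the idea (not the researchers' code):
# amplify tiny temporal color changes in a video by band-pass filtering
# each coarse region's intensity over time around heart-rate frequencies.
import numpy as np
from scipy.signal import butter, filtfilt

def magnify_pulse(frames, fps, lo=0.8, hi=3.0, alpha=50.0, block=16):
    """frames: (T, H, W) float array of grayscale frames in [0, 1]."""
    T, H, W = frames.shape
    h, w = H // block, W // block
    # Spatially pool into coarse blocks so per-pixel noise averages out.
    pooled = frames[:, :h * block, :w * block] \
        .reshape(T, h, block, w, block).mean(axis=(2, 4))   # (T, h, w)
    # Temporal band-pass around plausible pulse rates (0.8-3 Hz ~ 48-180 bpm).
    b, a = butter(2, [lo / (fps / 2), hi / (fps / 2)], btype="band")
    bandpassed = filtfilt(b, a, pooled, axis=0)
    # Amplify the filtered signal and add it back to the pooled video.
    return np.clip(pooled + alpha * bandpassed, 0.0, 1.0)

# Usage (hypothetical): frames = grayscale video as a (T, H, W) numpy array
# out = magnify_pulse(frames, fps=30.0)
```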

A neat side effect of detecting small changes in color is the ability to detect small motions as well. In the video, there’s an example of picking out someone’s pulse by exaggerating the expanding artery in their wrist, and of exaggerating the change in a shadow cast by the sun over the course of 15 seconds. This is Batman-level tech here, and we can’t wait to see an OpenCV library for this.

Even though the researchers have shown an extremely limited use case – just pulses and breathing – we’re seeing a whole lot of potential applications. We’d love to see an open source version of this tech turned into a lie detector for the upcoming US presidential debates, and the motion exaggeration is perfect for showing why every sports referee is blind as a bat.

If you want to read the actual paper, here’s the PDF. As always, video after the break.

Continue reading “Tracking Small Changes In Video To See Someone’s Pulse”

Now Pictures On The Internet Can Be Faked

We know it’s shopped, but we can’t tell because of the pixels. PhD student [Kevin Karsch], along with a few collaborators, will be presenting a method for rendering objects into preexisting photos at SIGGRAPH Asia next month.

The paper (PDF…) covers how [Kevin] et al. go about putting impossible objects into photos. The user first defines the geometry of the picture: the legs of a table are marked, and the tabletop is extruded from them. The lights are then defined by drawing a bounding box, and with a little bit of algorithmic trickery a 3D object is inserted into the scene.
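
The final blend back into the photo is typically done with differential rendering: render the reconstructed scene twice, once with and once without the inserted object, and add the difference (shadows, bounce light, reflections) on top of the original picture. The snippet below is our own sketch of that classic composite, not code from the paper.

```python
# Our own sketch of the classic differential-rendering composite
# (Debevec-style), the usual way a rendered object and its shadows
# get blended back into an existing photograph.
import numpy as np

def differential_composite(photo, render_with, render_without, obj_mask):
    """All image inputs are float (H, W, 3) arrays in [0, 1]; obj_mask is
    (H, W) with 1 where the inserted object covers the photo."""
    mask = obj_mask[..., None]
    # Where the object is visible, show the rendered object directly.
    # Elsewhere, add the *difference* the object causes (shadows, bounce
    # light, reflections) on top of the original photo.
    composite = mask * render_with + \
        (1.0 - mask) * (photo + render_with - render_without)
    return np.clip(composite, 0.0, 1.0)
```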

Comparing the results to the original picture is jaw-dropping. For us, photoshopping a bunch of billiard balls onto a pool table would take hours, and it would never look quite right. [Kevin]’s work for SIGGRAPH can do the whole scene in minutes and produces results we couldn’t dream of.

There’s no downloadable software yet, but the algorithms are there. Check out the video demo of the techniques and results after the break.

Continue reading “Now Pictures On The Internet Can Be Faked”

DIY 3D Gets A Nod At SIGGRAPH


Among the courses at this year’s SIGGRAPH (an annual technical conference and showcase of the latest in computer graphics research) was an introduction to 3D scanning that covers all the bases: mathematical foundations, two different build-your-own hardware approaches, and how to process and render the resulting datasets. The presenters have assembled all the course materials on a top-notch web site featuring slide shows, complete source code, and an extensive round-up of links to both commercial and homebrew 3D scanning gear. The simplest of these methods requires nothing more than a webcam, a halogen light source, and a stick!
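
The stick method is desktop shadow scanning: sweep the stick’s shadow across the object under the halogen lamp, let the webcam record when the shadow edge crosses each pixel, and recover each pixel’s 3D point by intersecting its camera ray with the calibrated shadow plane for that moment. Here’s a hedged sketch of just the per-pixel shadow-timing step; the calibration and ray/plane intersection that follow are left to the course materials.

```python
# Rough sketch of the first step of stick-shadow scanning: for every pixel,
# find the frame at which the sweeping shadow edge darkens it. (Calibrating
# the light and shadow planes, and the final ray/plane intersection, are
# left to the course materials -- this is just the timing estimate.)
import numpy as np

def shadow_crossing_times(frames):
    """frames: (T, H, W) float grayscale stack of the shadow sweep.
    Returns an (H, W) array of frame indices where each pixel first drops
    below the midpoint between its brightest and darkest observed value."""
    bright = frames.max(axis=0)
    dark = frames.min(axis=0)
    threshold = 0.5 * (bright + dark)          # per-pixel shadow threshold
    below = frames < threshold[None, :, :]     # (T, H, W) boolean
    # argmax returns the first True along the time axis (it returns 0 for
    # pixels the shadow never reaches, so mask those out in real use).
    return below.argmax(axis=0)
```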

SIGGRAPH and 3D scanning have been highlighted many times on Hack a Day, but we’re swelling with pride at seeing an academic venue give a favorable nod to the DIY hacking community (on their links page). Okay, so Hack a Day isn’t called out by name, but just wait’ll next year!

[Thanks Fahrzin]

SIGGRAPH 2008: The Quest For More Pixels


Long before we started reporting on [Dan Kaminsky]’s DNS chicanery, he contributed a guest post about one of our favorite sources of new technology: SIGGRAPH. The stars have aligned again and we’re happy to bring you his analysis of this year’s convention. [photo: Phong Nguyen]

So, last week, I had the pleasure of being stabbed, scanned, physically simulated, and synthetically defocused. Clearly, I must have been at SIGGRAPH 2008, the world’s biggest computer graphics conference. While it usually conflicts with Black Hat, this year I actually got to stop by, though a bit of a cold kept me from enjoying as much of it as I’d have liked. Still, I did get to walk the exhibition floor, and the papers (and videos) are all online, so I do get to write this (blissfully DNS- and security-unrelated) report.

Continue reading “SIGGRAPH 2008: The Quest For More Pixels”

Hackit: The Bronco Table


While attending LA SIGGRAPH Maker Night, we got to talk to [Brett Doar] about his Bronco Table. The table is meant to make life more difficult by bucking off anything that’s set on top of it. Right now, it uses a tiny piezo mic to listen for the impact and then drives three leg motors in a random pattern. He envisions later generations either running away or following you intently when something is set on them.
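
In spirit, the current trigger is just a threshold on the piezo signal followed by a burst of random motor commands, something like the toy sketch below. This is our guess at the logic, not [Brett]’s actual firmware; `read_piezo()` and `drive_leg()` are hypothetical stand-ins for whatever ADC and motor-driver calls the real table uses.

```python
# Toy sketch (our guess at the logic, not [Brett]'s firmware): watch a piezo
# signal for an impact spike, then fire the three leg motors in a random
# pattern for a moment.
import random
import time

IMPACT_THRESHOLD = 0.2   # normalized piezo level; gentle placements stay below this
BUCK_DURATION_S = 2.0

def buck(read_piezo, drive_leg):
    """Poll the piezo; on a loud enough impact, thrash the legs randomly."""
    while True:
        if abs(read_piezo()) > IMPACT_THRESHOLD:
            stop_at = time.time() + BUCK_DURATION_S
            while time.time() < stop_at:
                # Pick one of the three legs and drive it at a random speed.
                drive_leg(random.randrange(3), random.uniform(-1.0, 1.0))
                time.sleep(0.05)
        time.sleep(0.005)
```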

The main problem with the current design is that you have to set something down hard enough to make a noise the mic can pick up. The ideal solution would detect anything, no matter the material or how gently it was set down. How would you detect objects being placed on the surface (the table doesn’t have to be wood)?

LA SIGGRAPH Maker Night


We coaxed our friends at Mahalo Daily into coming along with us to LA SIGGRAPH’s Maker Night. There were a handful of interesting projects there. [Univac] was showing a circuit-bent Teletubby and his CellularRecombomat. [Brett Doar] brought his Bronco Table. Tired of engineers building items that made life easier, he decided to make something that made life more difficult. The table uses a piezo to detect the sound of something being set on top. It then starts twitching and bucking to shake the item free. The motors look like they’re salvaged window motors. Finally, we talked to [Mark Frauenfelder] from BoingBoing/Make about how he got into the DIY culture.