Camera And Code Team Up To Make Impossible Hovering Laser Effect

Right off the bat, we’ll say that this video showing a laser beam stopping in mid-air is nothing but a camera trick. But it’s the trick that’s the hack, and you’ve got to admit that it looks really cool.

It starts with the [Tom Scott] video, the first one after the break. [Tom] is great at presenting fascinating topics in a polished and engaging way, and he certainly does that here. In a darkened room, a begoggled [Tom] poses with what appears to be a slow-moving beam of light, just like in the million sci-fi movies where laser weapons always seem to disregard the laws of physics. He even manages to pull a [Kylo Ren] on the slo-mo photons with a “Force Stop”, as well as a slightly awkward Matrix-style bullet-time shot. It’s entertaining stuff, and it’s all courtesy of the rolling shutter: the laser beam is rapidly modulated in sync with the camera’s shutter, and with the camera turned 90 degrees, the beam appears to slow down or even stop.
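
The article doesn’t spell out [Seb]’s exact modulation scheme, but the principle is simple enough to sketch. The hypothetical Arduino-style example below fires one short laser pulse per camera frame and shifts its timing slightly each frame; because a rolling shutter exposes sensor lines one after another, the pulse lands on a slightly later band of lines every frame, and the recorded beam appears to creep along or freeze. The pin number, frame period, pulse width, and phase step are illustrative assumptions, and a real rig would lock the pulses to the camera’s shutter timing rather than free-running as this sketch does.

```
// Hypothetical illustration only: pulse a laser so that a rolling-shutter
// camera records a beam segment that drifts slowly from frame to frame.
// Pin, frame period, pulse width, and phase step are all assumptions.

const int LASER_PIN = 9;                        // TTL enable of the laser driver (assumed)
const unsigned long FRAME_PERIOD_US = 40000UL;  // assumed 25 fps camera
const unsigned long PULSE_US = 200;             // beam on-time per frame (assumed)
const unsigned long PHASE_STEP_US = 150;        // how much later the pulse fires each frame

unsigned long nextFrame = 0;
unsigned long phase = 0;

void setup() {
  pinMode(LASER_PIN, OUTPUT);
  nextFrame = micros();
}

void loop() {
  // Fire one short pulse per frame, a little later each time. The rolling
  // shutter exposes lines sequentially, so the pulse illuminates a slightly
  // later band of lines every frame and the beam appears to crawl (or to
  // freeze, if PHASE_STEP_US is zero and the timing is locked to the camera).
  while ((long)(micros() - (nextFrame + phase)) < 0) {
    // busy-wait until this frame's pulse time
  }
  digitalWrite(LASER_PIN, HIGH);
  delayMicroseconds(PULSE_US);
  digitalWrite(LASER_PIN, LOW);

  nextFrame += FRAME_PERIOD_US;
  phase = (phase + PHASE_STEP_US) % (FRAME_PERIOD_US - PULSE_US);
}
```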

The tricky part of the hack is the laser stuff, which is the handiwork of [Seb Lee-Delisle]. The second video below goes into detail on his end of the effect. We’ve seen [Seb]’s work before, with a giant laser Asteroids game and a trick NES laser blaster that rivals this effect.

20 thoughts on “Camera And Code Team Up To Make Impossible Hovering Laser Effect”

    1. Looks like it’s an aid for framing the shot. They’re cropping the video to get a horizontal result even though the camera is rotated 90 degrees to get the shutter roll going in the right direction.

    2. Just showing the size of the frame that will be displayed, I think. Looks like they’re cutting a portrait capture down to landscape by editing out the parts behind the tape.

    1. Unfortunately, no. This is essentially an in-camera special effect (and only works with cameras that have a rolling shutter and not a global shutter.) A person looking at the scene with their eyes and not through the camera only sees a continuous beam.

  1. FWIW, I always abstracted the “slo-mo sci-fi lasers” bit as the blasts not actually being packets of photons but rather bundles of plasma somehow created by lasers and propelled toward the target at great, but finite, velocity. In my imagination, there would be a reaction chamber inside the weapon that would inject some sort of gas into a high-energy laser beam, and the rapid heating would propel the ionized gas out to do its business. Could explain the pew-pew, as well – the expanding gas would make noise, at least when not in the vacuum of space. The same kind of thinking could explain the light saber – the plasma is formed into a blade shape by some sort of containment field.

    1. ‘Laser’ guns, cannons, and the like are described in Star Wars canon as actually firing plasma: they use the fictional Tibanna gas, convert it to a plasma, and fire it as a particle beam.
      Lando made his money mining the gas from Bespin.

  2. Oooh. That’s a pretty neat trick. Wish I’d thought of that. I’ve got a rather mundane Thorlabs DCC1545 camera (MT9M001 sensor) that’s at least pretty versatile: I’ve run it at a synthetic 50 kfps by exploiting the rolling shutter, very much like an oscilloscope’s “equivalent time sampling” mode. I never thought to try it with a modulated beam like this. Now I already know where next weekend’s free time is going.

      1. It’s the same method used for oscilloscope equivalent time sampling. It requires the camera to be synchronized with a repetitive action (or the action to be synced to the camera frame rate).

        The DCC1545 (and many others) has both a trigger input (to sync the camera to a repetitive action) and a strobe output (to sync an external device or light source to the camera). Unlike its more expensive brothers, the low-end DCC1545 doesn’t provide a connector for these signals, but both are available on a connector footprint on the board, and it’s easy to add your own connector with a bit of Dremel work on the case.

        The camera can run at up to 50,000 lines per second (more is possible, but the field of view shrinks), with an integration time of one line period: one line happens at time 0, the next at 20 µs, then 40 µs. The whole frame might take 20 milliseconds, but each line is a 20-microsecond time slice. Do this a thousand times, delaying the event onset by 20 µs each time, and you have a complete image. It’s handy to have a pulse generator for the delays, but I’m sure a competently-coded Arduino is up to the task (see the example sketch after the comments).

        There are lots of ways to tweak this: you can even randomly sample (or free-run) and re-sort later, if you can reliably time stamp the event and camera frame trigger (‘strobe’) times.

        You also need a ton of light. 20 microsecond exposure times don’t catch a lot of photons. A kilowatt of tungsten halogen spotlights on a 50 cm field of view is just barely enough.
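
For anyone tempted to try the sweep described in the comment above, here is a minimal, hypothetical Arduino sketch of the idea: it watches the camera’s strobe output and triggers the repetitive event a little later on every frame, stepping through one 20 µs line period at a time. The pin numbers, signal polarity, and trigger pulse width are assumptions for illustration, not a documented DCC1545 hookup.

```
// Hypothetical equivalent-time-sampling trigger sweep.
// Assumes the camera's frame strobe is wired to pin 2 (active high) and the
// device that starts the repetitive event is triggered by a short high pulse
// on pin 3. These wiring details are assumptions, not a DCC1545 app note.

const int STROBE_IN_PIN = 2;        // camera frame strobe input
const int EVENT_TRIG_PIN = 3;       // starts the repetitive event
const unsigned long STEP_US = 20;   // one sensor line period
const unsigned int STEPS = 1000;    // number of lines (time slices) to cover

unsigned int frameCount = 0;

void setup() {
  pinMode(STROBE_IN_PIN, INPUT);
  pinMode(EVENT_TRIG_PIN, OUTPUT);
  digitalWrite(EVENT_TRIG_PIN, LOW);
}

void loop() {
  // Wait for the camera to signal the start of a frame.
  while (digitalRead(STROBE_IN_PIN) == LOW) { }

  // Delay the event onset by one extra line period per frame, so successive
  // frames sample successive 20 µs phases of the repetitive event.
  unsigned long offsetUs = (unsigned long)(frameCount % STEPS) * STEP_US;
  if (offsetUs >= 1000) delay(offsetUs / 1000);        // whole milliseconds
  if (offsetUs % 1000)  delayMicroseconds(offsetUs % 1000);

  // Fire the event trigger.
  digitalWrite(EVENT_TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(EVENT_TRIG_PIN, LOW);
  frameCount++;

  // Wait for the strobe to drop before arming for the next frame.
  while (digitalRead(STROBE_IN_PIN) == HIGH) { }
}
```

The free-running variant mentioned in the same comment works the other way around: let the event and the camera run unsynchronized, timestamp both, and sort the captured lines by their phase afterwards.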
