Screenshot from the demonstration video, showing the desktop being unlocked with face recognition alongside a camera feed and a terminal that shows how the software works.

Open-Source FaceID With RealSense

RealSense cameras have been a fascinating piece of tech from Intel — we’ve seen a number of cool applications in the hacker world, from robots to smart appliances. Unfortunately, Intel discontinued parts of the RealSense lineup at one point, specifically the LiDAR and the models tailored for face tracking. Apparently, these haven’t been popular, and we haven’t seen them in hacks either. Until now, that is. [Lina] brings us a real-world application for the RealSense face-tracking cameras, a FaceID application for Linux.

The project is as simple as it sounds: if the camera’s built-in face recognition module recognizes you, your lockscreen is unlocked. With the target being Linux, it has to tie into the Pluggable Authentication Modules (PAM) subsystem for authentication, and of course, there’s a PAM module for RealSense to go with it, aptly named pam_sauron. This module is written in Zig, a modern C-like language, so it’s both a good example of how to create your own PAM integrations and a path towards doing that in a different language for once. As usual, there are TODOs, like improving the UX and taking advantage of some security features RealSense cameras have, but it’s nevertheless a fun and self-sufficient application for one of the F4XX-series RealSense cameras, in case you happen to own one.
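To give a rough idea of what a PAM integration involves, here is a minimal sketch of an authenticate hook following the pam_python module’s conventions. pam_sauron itself is written in Zig, so this is only an illustration of the control flow, and check_face() is a hypothetical stand-in for the camera query.

# Sketch of a PAM authenticate hook in the style of the pam_python
# module. pam_sauron is written in Zig; this only mirrors the shape of
# the integration. check_face() is a hypothetical placeholder.

def check_face(user):
    # Hypothetical: ask the camera's on-board face recognition whether
    # the face enrolled for `user` is currently in front of it.
    return False

def pam_sm_authenticate(pamh, flags, argv):
    try:
        user = pamh.get_user(None)
    except pamh.exception:
        return pamh.PAM_USER_UNKNOWN
    if user is None:
        return pamh.PAM_USER_UNKNOWN
    # Succeed only on a camera match; otherwise fail so the rest of the
    # PAM stack (e.g. the usual password prompt) can take over.
    return pamh.PAM_SUCCESS if check_face(user) else pamh.PAM_AUTH_ERR

def pam_sm_setcred(pamh, flags, argv):
    # No credential handling needed for this sketch.
    return pamh.PAM_SUCCESS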

Ever since the introduction of RealSense we’ve seen these cameras used in robotics and 3D scanning, thanks at least in part to their ability to be used in Linux. Thankfully, Intel only discontinued the less popular RealSense cameras, which didn’t affect the main RealSense lineup, and the hacker-beloved depth cameras are still available for all of our projects. Wondering about the tech behind it? Here’s a teardown of a RealSense camera module intended for laptop use.

Surfsonar shows the depth of water while surfing

Surf Sensor Adds Depth To Finding The Ultimate Wave

To say that the ocean is a dynamic environment would be a gross understatement, especially when coastlines are involved. Waves crash, tides go in and out, and countless variables make even the usual conditions a guessing game. When [foobarbecue] goes surfing, he tries to take into account all of these things. The best waves at his local beach are directly over an ever-moving sand bar, and their dynamics are affected by depth, another constantly changing variable. [foobarbecue]’s brilliant solution to understanding current conditions? Build a depth finder directly into his surfboard!

At the heart of the “surfsonar” is the Ping Sonar Echosounder, a sonar transducer designed for AUVs and ROVs. [foobarbecue] embedded the transducer directly into the board. Data is fed to a Raspberry Pi 4B, which displays depth and confidence (a percentage of how sure it is of the measurement) on a 2.13-inch e-Paper display HAT.
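Blue Robotics publishes a Python library for the Ping sonar, so reading depth and confidence from a Raspberry Pi can be as simple as the sketch below. The serial port and polling rate here are assumptions for illustration, not details of [foobarbecue]’s build.

# Rough sketch of polling a Blue Robotics Ping sonar with the brping
# library (pip install bluerobotics-ping). Port and baud rate are
# assumptions; adjust for your own wiring.
import time
from brping import Ping1D

ping = Ping1D()
ping.connect_serial("/dev/ttyUSB0", 115200)

if not ping.initialize():
    raise RuntimeError("Failed to initialize the Ping sonar")

while True:
    data = ping.get_distance()
    if data:
        depth_m = data["distance"] / 1000.0   # reported in millimetres
        confidence = data["confidence"]       # 0-100 %
        print(f"Depth: {depth_m:.2f} m (confidence {confidence}%)")
    time.sleep(1)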

Power is provided by a PiSugar. Charging is done wirelessly, which we’d say is pretty important considering that the whole device is sealed inside a modified surfboard.

While it’s not a low-budget build, and there’s still room for improvement, early reports are positive. Once away from the breaking waves, the device confidently shows the depth. More testing will show whether the surfsonar will help [foobarbecue] find that ever-moving sandbar!

Surf hacks are always welcome; we’ve featured the LED Strip Lit Surfboard as well as the Surf Window, which tells its owner if the surf is up. Be sure to let us know about any cool hacks you find when you’re out surfing the ‘net via our Tips Line!

Kinect Gave Us A Preview Of The Future, Though Not The One It Intended

This holiday season, the video game industry hype machine is focused on building excitement for new PlayStation and Xbox consoles. Ten years ago, a similar chorus of hype reached a crescendo with the release of Xbox Kinect, promising to revolutionize how we play. That vision never panned out, but as [Daniel Cooper] of Engadget pointed out in a Kinect retrospective, it premiered consumer technologies that impacted fields far beyond gaming.

The Kinect has since been withdrawn from the gaming market because, as it turns out, gamers are quite content with handheld controllers. This year’s new controllers for a PlayStation or Xbox would be immediately familiar to gamers from ten years ago. Even Nintendo, whose Wii is frequently credited as the motivation for Microsoft to develop the Kinect, has arguably taken a step back with the Joy-Cons of its Switch.

But the Kinect’s success at bringing a depth camera to consumer price levels paved the way to explore many ideas that were previously impossible. The flurry of enthusiastic Kinect hacking proved there is a market for depth camera peripherals, leading to plug-and-play devices like the Intel RealSense that make depth-sensing projects easier. The original PrimeSense technology has since been simplified and miniaturized into the Face ID hardware that unlocks Apple phones. Kinect itself found another job with Microsoft’s HoloLens AR headset. And let’s not forget the upcoming wave of autonomous cars and drones, many of which will see their worlds via depth sensors of some kind. Some might even be equipped with the latest sensor to wear the Kinect name.

Inside the Kinect was also one of the earliest microphone arrays sold to consumers, enabling it to figure out which direction a voice is coming from and isolate it from other noises in the room. Such technology was previously the exclusive domain of expensive corporate conference-room speakerphones, but now it forms the core of inexpensive home assistants like the Amazon Echo Dot, raising the bar so much that hacks needed many more microphones just to stand out.

With the technology more easily available elsewhere, the attrition of a discontinued device is reflected in the dwindling number of recent Kinect hacks on these pages. We still see a cool project every now and then, though. As the classic sensor bar itself recedes into history, others will take its place to give us depth sensing and smart audio. But for many of us, the Kinect was the ambitious videogame peripheral that gave us our first experience with these technologies.

Handheld 3D Scanning, Using Raspberry Pi 4 And Intel RealSense Camera

Raspberry Pi 4 (with USB 3.0) and Intel RealSense D415 depth-sensing camera.

When the Raspberry Pi 4 came out, [Frank Zhao] saw the potential to make a realtime 3D scanner that was completely handheld and self-contained. The main sensor is an Intel RealSense D415 depth-sensing camera, which combines two IR cameras and an RGB camera, paired with the Raspberry Pi 4. The Pi runs a piece of software called RTAB-Map — intended for robotic applications — which uses the camera data to map the environment in 3D and localize itself within that 3D space. Everything gets recorded in realtime.
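RTAB-Map does the heavy lifting in this build, but if you’re curious what the raw stream from a D415 looks like, librealsense’s Python bindings make it easy to pull depth frames. This is a generic sketch, not [Frank Zhao]’s code; the resolution and frame rate are illustrative guesses.

# Minimal sketch of grabbing depth frames from a RealSense D415 with
# pyrealsense2 (librealsense's Python bindings). Resolution and frame
# rate are conservative, Pi-friendly choices.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

try:
    for _ in range(300):                      # roughly ten seconds of frames
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        # Distance, in metres, to whatever sits at the centre pixel.
        print(f"Centre of frame is {depth.get_distance(320, 240):.2f} m away")
finally:
    pipeline.stop()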

This handheld device can act as a 3D scanner because the data gathered by RTAB-Map consists of a point cloud of an area as well as depth information. When combined with the origin of the sensing unit (i.e. the location of the camera within that area), it can export the point cloud as a mesh and even apply a texture derived from the camera footage. An example is shown below the break.
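RTAB-Map handles the export in this project, but as a general illustration of the point-cloud-to-mesh step, Open3D’s Poisson surface reconstruction does much the same job. The file names here are placeholders.

# Sketch of turning an exported point cloud into a mesh with Open3D's
# Poisson surface reconstruction. File names are placeholders, and this
# is a generic approach rather than RTAB-Map's own exporter.
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.ply")   # point cloud exported from the scan
pcd.estimate_normals()                      # Poisson needs per-point normals

mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9
)
o3d.io.write_triangle_mesh("scan_mesh.ply", mesh)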