2024 Home Sweet Home Automation: [HEX]POD – Climate Tracker And Digital Nose

[eBender] was travelling through India with friends when one of them got sick. Unable to find a thermometer anywhere during COVID, they finally ended up in a hospital. After being evacuated back home, [eBender] hatched an idea to create a portable gadget featuring a few travel essentials: the ability to measure body temperature and heart rate, a power bank and an illumination source. The scope evolved quite a lot, with the concept being to create a learning platform for environmental multi-sensor fusion. The current cut-down development kit hosts just the air quality measurement components, but expansion from this base shouldn’t be too hard.

This project’s execution is excellent, with a hexagon-shaped enclosure and PCBs stacked within. As everyone knows, hexagons are the bestagons. The platform currently hosts SCD41 and SGP41 sensors for air quality, a BME688 for gas detection, LTR-308 for ambient light and motion, and many temperature sensors.

On top sits a 1.69-inch IPS LCD, with an OLED display on the side for always-on visualization. The user interface is completed with a joystick and a couple of buttons. An internal blower fan is ducted around the sensor array to pull not-so-fresh air from outside for evaluation. Control is courtesy of an ESP32 module, with the gory details buried deep in the extensive project logs, which show sensors and other parts being swapped in and out.
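
To give a flavour of what the firmware has to juggle, here is a minimal sketch of polling the SCD41 over I2C. This is our own MicroPython illustration rather than the [HEX]POD firmware, the pin assignments are assumptions, and the CRC byte Sensirion appends to every data word is simply skipped.

```python
# Minimal MicroPython sketch (illustrative only, not the [HEX]POD firmware):
# poll an SCD41 CO2/temperature/humidity sensor over I2C from an ESP32.
from machine import I2C, Pin
import struct
import time

SCD41_ADDR = 0x62                        # fixed SCD4x I2C address
i2c = I2C(0, scl=Pin(22), sda=Pin(21))   # assumed ESP32 pins

i2c.writeto(SCD41_ADDR, b'\x21\xb1')     # start_periodic_measurement command
time.sleep(5)                            # first reading is ready after ~5 s

i2c.writeto(SCD41_ADDR, b'\xec\x05')     # read_measurement command
time.sleep_ms(1)
raw = i2c.readfrom(SCD41_ADDR, 9)        # CO2, T, RH: 2 data bytes + 1 CRC each

co2_ppm = struct.unpack('>H', raw[0:2])[0]              # CRC at raw[2] ignored
temp_c = -45 + 175 * struct.unpack('>H', raw[3:5])[0] / 65535.0
rh_pct = 100 * struct.unpack('>H', raw[6:8])[0] / 65535.0
print(co2_ppm, temp_c, rh_pct)
```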

On the software side, some preliminary work is being done on training a TensorFlow model to make sense of the fused sensor inputs. This is no simple task. Finally, we would have a complete package if [eBender] could source a hexagonal LCD to showcase that hexagon-orientated GUI. However, we doubt such a thing exists, which is a shame.
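
There is nothing public to crib from yet, but the general shape of such a model is not hard to picture. The sketch below is ours, not [eBender]'s: the channel count, class names and training data are all made up, and a real build would presumably log sensor data, train on a PC, then shrink the result down for the ESP32.

```python
# Hypothetical sketch: a tiny Keras classifier mapping normalized sensor
# readings (CO2, VOC index, NOx index, gas resistance, temperature, humidity)
# to a handful of "smell" classes. Not the [HEX]POD's actual model.
import numpy as np
import tensorflow as tf

N_FEATURES = 6   # assumed: one column per sensor channel
N_CLASSES = 4    # assumed: e.g. clean air, cooking, solvent, smoke

# Placeholder arrays standing in for logged sensor readings and labels.
x_train = np.random.rand(256, N_FEATURES).astype("float32")
y_train = np.random.randint(0, N_CLASSES, size=256)

# Per-channel normalization learned from the logged data.
norm = tf.keras.layers.Normalization(input_shape=(N_FEATURES,))
norm.adapt(x_train)

model = tf.keras.Sequential([
    norm,
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
```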

There are many air quality sensors on the market now, so we see a few hacks based on them, like this simple AQ sensor hub. Let’s not forget the importance of environmental CO2 detection; here’s something to get you started.

D-POINT: A Digital Pen With Optical-Inertial Tracking

[Jcparkyn] clearly had an interesting topic for their thesis project, and was conscientious enough to write up a chunk of it and release it into the wild. The project in question is a digital pen that uses some neat sensor fusion to combine the inputs from a pen-mounted gyro/accelerometer with data from an optical tracking system provided by an off-the-shelf webcam.

A six degrees of freedom (6DOF) tracking system is achieved as a result, with the pen-mounted hardware tracking orientation and the webcam tracking the 3D position. The pen itself is quite neat, with an ALPS/Alpine HSFPAR003A load sensor measuring the contact pressure transmitted to it from the stylus tip. A Seeed Xiao nRF52840 Sense is on duty for Bluetooth and hosting the needed IMU. This handy little module deals with all the details needed for such a high-integration project and even manages the charging of a single 10440 lithium cell via a USB-C connector.

Positional tracking uses Visual Pose Estimation (VPE) assisted with ArUco markers mounted on the end of the stylus. A consumer-grade (i.e. uncalibrated) webcam is all that is required on the hardware side. The software utilizes the familiar OpenCV stack to unroll the effects of the webcam rolling shutter, followed by Perspective-n-Point (PnP) to estimate the pose from the corrected image stream. Finally, a coordinate space conversion is performed to determine the stylus tip position relative to the drawing surface.
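
The broad strokes of that marker-based pose step look something like the sketch below. It is our own illustration rather than D-POINT's code: the camera matrix is a rough guess in place of a proper calibration, the marker geometry is made up, and the rolling-shutter correction is skipped entirely.

```python
# Illustrative sketch: detect ArUco markers in a webcam frame and recover a
# pose with solvePnP. Camera intrinsics and marker size are assumptions.
import cv2
import numpy as np

camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])   # rough guess for a 640x480 webcam
dist_coeffs = np.zeros(5)                     # uncalibrated: assume no distortion

MARKER_SIZE = 0.02                            # assumed 20 mm markers
# 3D corners of one marker in its own frame (TL, TR, BR, BL), centred on origin.
object_points = np.array([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]],
                         dtype=np.float32) * MARKER_SIZE / 2

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        # Pose of the first detected marker relative to the camera.
        ok, rvec, tvec = cv2.solvePnP(object_points, corners[0].reshape(4, 2),
                                      camera_matrix, dist_coeffs)
        print("rotation (Rodrigues):", rvec.ravel(), "translation (m):", tvec.ravel())
cap.release()
```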

The sensor fusion is taken care of with a Kalman filter, smoothed with the usual Rauch-Tung-Striebel (RTS) algorithm before being passed on to the final application. This process runs in Python using the NumPy module, as you would expect, but is accelerated using the Numba JIT compiler.
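
For a feel of that filter-then-smooth step, here is a toy one-dimensional, constant-velocity version in plain NumPy. It is our sketch of the general RTS recipe, not code from D-POINT; the real project fuses a much richer state and JIT-compiles its loops with Numba.

```python
# Toy sketch: Kalman filter over noisy 1D position measurements, followed by a
# Rauch-Tung-Striebel backward pass. State is [position, velocity].
import numpy as np

def kalman_rts(z, dt=0.01, q=1.0, r=0.05):
    F = np.array([[1.0, dt], [0.0, 1.0]])          # constant-velocity model
    H = np.array([[1.0, 0.0]])                     # we only observe position
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])            # process noise
    R = np.array([[r**2]])                         # measurement noise

    n = len(z)
    x = np.zeros((n, 2)); P = np.zeros((n, 2, 2))      # filtered estimates
    xp = np.zeros((n, 2)); Pp = np.zeros((n, 2, 2))    # one-step predictions
    xf, Pf = np.zeros(2), np.eye(2)

    for k in range(n):                                 # forward (filter) pass
        xp[k] = F @ xf
        Pp[k] = F @ Pf @ F.T + Q
        y = z[k] - H @ xp[k]                           # innovation
        S = H @ Pp[k] @ H.T + R
        K = Pp[k] @ H.T @ np.linalg.inv(S)             # Kalman gain
        xf = xp[k] + K @ y
        Pf = (np.eye(2) - K @ H) @ Pp[k]
        x[k], P[k] = xf, Pf

    xs, Ps = x.copy(), P.copy()                        # backward (RTS) pass
    for k in range(n - 2, -1, -1):
        C = P[k] @ F.T @ np.linalg.inv(Pp[k + 1])
        xs[k] = x[k] + C @ (xs[k + 1] - xp[k + 1])
        Ps[k] = P[k] + C @ (Ps[k + 1] - Pp[k + 1]) @ C.T
    return xs

# Noisy measurements of a pen tip moving at constant speed.
t = np.arange(0.0, 1.0, 0.01)
z = 0.3 * t + np.random.normal(0.0, 0.05, t.shape)
print(kalman_rts(z)[:3])                               # smoothed [pos, vel] rows
```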

Motion tracking is not news to us; we’ve seen many an implementation over the years, such as this one. But digital input pens? Why aren’t they more of a thing?

Thanks to [Oliver] for the tip!

Alfred Jones Talks About The Challenges Of Designing Fully Self-Driving Vehicles

The leap to self-driving cars could be as game-changing as the one from horse power to engine power. If cars prove able to drive themselves better than humans do, the safety gains could be enormous: auto accidents were the #8 cause of death worldwide in 2016. And who doesn’t want to turn travel time into something either truly restful or alternatively productive?

But getting there is a big challenge, as Alfred Jones knows all too well. As Head of Mechanical Engineering at Lyft’s Level 5 self-driving division, he leads the team building the roof racks and other gear that give the vehicles their sensors and computational hardware. In his keynote talk at Hackaday Remoticon, Alfred Jones walks us through what each level of self-driving means, how the problem is being approached, and where the sticking points are found between what’s being tested now and a truly steering-wheel-free future.

Check out the video below, and take a deeper dive into the details of his talk.

Continue reading “Alfred Jones Talks About The Challenges Of Designing Fully Self-Driving Vehicles”

Christal Gordon: Sensors, Fusion, And Neurobiology

Some things don’t sound like they should go together, but they do. Peanut butter and chocolate. Twinkies and deep frying. Bacon and maple syrup. Sometimes mixing things up can produce great results. [Dr. Christal Gordon’s] expertise falls into that category. She’s an electrical engineer, but she also studies neuroscience. This can lead to some interesting intellectual Reese’s peanut butter cups.

At the 2017 Hackaday Superconference, [Christal] spoke about sensor fusion. If you’ve built systems with multiple sensors, you’ve probably run into it before, even if you didn’t call it that. However, [Christal] brings the perspective of how biological systems fuse sensory data, contrasted with how electronic systems perform similar tasks. You can see a replay of her talk in the video below.

Continue reading “Christal Gordon: Sensors, Fusion, And Neurobiology”