DME With A Twist Of LimeSDR

Navigating aircraft today isn’t like the old days. No more arrows painted on a barn roof or rotating airway beacons. Now, there are a host of radio navigation aids. GPS, of course, is available. But planes often use VOR to determine a bearing to a known point and DME — distance measuring equipment — to measure the distance to that point. DME operates around 1000 MHz and is little more than a repeater. An airplane sends a pair of pulses, and times how long it takes for the DME to repeat them. [Daniel Estévez] has been monitoring these transmissions with a LimeSDR.
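To put numbers on it (these figures come from the DME standard, not from the linked write-up): the ground transponder waits a fixed 50 µs before replying, and each nautical mile of slant range adds roughly 12.36 µs to the round trip. A minimal sketch of the ranging arithmetic:

```python
# Sketch of the DME ranging math (illustrative, not code from the project).
# The ground transponder replies after a fixed 50 us turnaround delay;
# one nautical mile of slant range costs ~12.36 us of round-trip time.

REPLY_DELAY_US = 50.0         # fixed transponder turnaround delay
US_PER_NM_ROUND_TRIP = 12.36  # ~2 NM of light travel, in microseconds

def slant_range_nm(elapsed_us: float) -> float:
    """Slant range in nautical miles, from interrogation-to-reply time."""
    return (elapsed_us - REPLY_DELAY_US) / US_PER_NM_ROUND_TRIP

print(slant_range_nm(173.6))  # ~10 NM from the station
```

Note this gives slant range, not ground distance: an aircraft directly over the station still reads its own altitude as distance.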

Like most repeaters, the DME transponders listen on one frequency and transmit on another. Those frequencies are 63 MHz apart, which poses a challenge for some SDRs: their bandwidth isn’t wide enough to capture both at once.

Continue reading “DME With A Twist Of LimeSDR”

Playing Rock, Paper, Scissors With A Time-Of-Flight Sensor

You can do all kinds of wonderful things with cameras and image recognition. However, sometimes spatial data is useful, too. As [madmcu] demonstrates, you can use depth data from a time-of-flight sensor for gesture recognition, as seen in this rock-paper-scissors demo.

If you’re unfamiliar with time-of-flight sensors, they’re easy enough to understand. They measure distance by determining the time it takes photons to travel from one place to another. For example, by shooting out light from the sensor and measuring how long it takes to bounce back, the sensor can determine how far away an object is. Take an array of time-of-flight measurements, and you can get simple spatial data for further analysis.
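As a back-of-the-envelope sketch (not code from the project), the core relation is just distance = speed of light × round-trip time ÷ 2:

```python
# Toy time-of-flight calculation: distance from a photon's round trip.
C = 299_792_458.0  # speed of light, m/s

def distance_m(round_trip_s: float) -> float:
    # Light travels out and back, so halve the total path length.
    return C * round_trip_s / 2

# A return after ~6.67 nanoseconds puts the target about a metre away.
print(distance_m(6.67e-9))  # ~1.0 m
```

The nanosecond timescales involved are exactly why these sensors use dedicated silicon rather than a microcontroller timer.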

The build uses an Arduino Uno R4 Minima, paired with a demo board for the VL53L5CX time-of-flight sensor. The software is developed using NanoEdge AI Studio. In a basic sense, the system uses a machine learning model to classify data captured by the time-of-flight sensor into gestures matching rock, paper, or scissors—or nothing, if no hand is present. If you don’t find [madmcu]’s tutorial enough, you can take a look at the original version from STMicroelectronics, too.

It takes some training, and it only works in the right lighting conditions, but this is a functional system that can recognize real hand signs and play the game. We’ve seen similar techniques help more advanced robots cheat at this game before, too! What a time to be alive.

CUDA, But Make It AMD

Compute Unified Device Architecture, or CUDA, is a software platform for doing big parallel calculation tasks on NVIDIA GPUs. It’s been a big part of the push to use GPUs for general-purpose computing, and competitor AMD has, in some ways, been left out in the cold as a result. However, with more demand for GPU computation than ever, there’s been a breakthrough. SCALE from [Spectral Compute] will let you compile CUDA applications for AMD GPUs.

SCALE allows CUDA programs to run as-is on AMD GPUs, without modification. The SCALE compiler is also intended as a drop-in replacement for nvcc, right down to the command-line options. For maximum ease of use, it acts as if you’ve installed the NVIDIA CUDA Toolkit, so you can build with CMake just as you would for a normal NVIDIA setup. Currently, Navi 21 and Navi 31 (RDNA 2 and RDNA 3) targets are supported, while a number of other GPUs are undergoing testing and development.

The basic aim is to allow developers to use AMD hardware without having to maintain an entirely separate codebase. It’s still a work in progress, but it’s a promising tool that could help break NVIDIA’s stranglehold on parts of the GPGPU market.

Show Us Your Minimalist Games, And Win

Sometimes the tightest constraints inspire the highest creativity. The 2024 Tiny Games Challenge invites you to have the most fun with the most minimal setup. Whether that’s tiny size, tiny parts count, or tiny code, we want you to show us that big fun can come in small packages.

The Tiny Games Challenge starts now and runs through September 10th, with the top three entries receiving a $150 gift certificate courtesy of DigiKey.

Continue reading “Show Us Your Minimalist Games, And Win”

Giving People An Owl-like Visual Field Via VR Feels Surprisingly Natural

We love hearing about a good experiment, and here’s a pretty neat one: researchers used a VR headset, an off-the-shelf VR360 camera, and some custom software to glue them together. The result? Owl-Vision squashes a full 360° of undistorted horizontal visual perception into 90° of neck travel to either side. One can see all around oneself without needing to physically turn one’s head any further than is natural.
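The exact mapping the researchers use isn’t detailed here, but a plausible sketch (our assumption, not their published method) is a fixed head-yaw gain of two, so that ±90° of physical neck rotation sweeps the full ±180° of virtual view:

```python
# Hypothetical sketch of head-yaw amplification, assuming a simple
# fixed gain: +/-90 degrees of physical neck travel is stretched to
# cover the full +/-180 degrees of the camera's horizontal view.
GAIN = 180.0 / 90.0  # virtual degrees per physical degree

def virtual_yaw_deg(head_yaw_deg: float) -> float:
    """Map physical head yaw to the yaw rendered in the headset."""
    return max(-180.0, min(180.0, GAIN * head_yaw_deg))

print(virtual_yaw_deg(90.0))   # 180.0: looking directly behind
print(virtual_yaw_deg(-45.0))  # -90.0: looking over the left shoulder
```

A real implementation would presumably smooth this mapping to avoid nausea, since any mismatch between vestibular motion and rendered motion is a classic VR comfort problem.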

It’s still a work in progress, and the paper is currently paywalled, but the demonstration video at that link (also embedded below) gives a solid overview of what’s going on.

Continue reading “Giving People An Owl-like Visual Field Via VR Feels Surprisingly Natural”

Smart Ball Technology Has Reached Football, But The Euros Show Us It’s Not Necessarily For The Better

Adidas brought smart balls to Euro 2024, for better or worse. Credit: Adidas

The good old fashioned game of football used to be a simple affair. Two teams of eleven, plus a few subs, who were all wrangled by a referee and a couple of helpful linesmen. Long ago, these disparate groups lived together in harmony. Then, everything changed when VAR attacked.

Suddenly, technology was being used to adjudicate all kinds of decisions, and fans were cheering or in uproar depending on how the hammer fell. That’s only become more prevalent in recent times, with smart balls the latest controversial addition to the world game. With their starring role in the Euro 2024 championship more than evident, let’s take a look at what’s going on with this new generation of intelligent footballs.

Continue reading “Smart Ball Technology Has Reached Football, But The Euros Show Us It’s Not Necessarily For The Better”

Time’s Up For Mbed

ARM has announced in a forum post that Mbed, its accessible microcontroller development platform, is to reach end-of-life in July 2026. This means that the online platform and OS will no longer be supported by ARM, though the latter will remain an open source project. The website will be shuttered, and no new projects can be created using ARM infrastructure after that date.

Mbed was originally launched back in 2009 as a competitor to the Arduino IDE for ARM’s chips. Its ease of development made it attractive, and an array of boards from different manufacturers soon supported it, but perhaps because it supported only the one architecture, it never quite found success. It’s certainly not the first time a single-architecture microcontroller development platform has been discontinued; we need only look to the Intel Edison for that. But given the success of ARM platforms in general, it’s still something of a surprise. Perhaps it’s best to take the press release’s explanation at face value: other platforms such as Arduino have simply been much more popular.

Will a community form around an open source Mbed? Given that it’s been a definite minority among Hackaday projects over the years, we hope it does, but we’re not so sure.

Mbed board image: Viswesr, CC BY-SA 3.0.