Playing Rock, Paper, Scissors With A Time-Of-Flight Sensor

You can do all kinds of wonderful things with cameras and image recognition. However, sometimes spatial data is useful, too. As [madmcu] demonstrates, you can use depth data from a time-of-flight sensor for gesture recognition, as seen in this rock-paper-scissors demo.

If you’re unfamiliar with time-of-flight sensors, they’re easy enough to understand. They measure distance by determining the time it takes photons to travel from one place to another. For example, by shooting out light from the sensor and measuring how long it takes to bounce back, the sensor can determine how far away an object is. Take an array of time-of-flight measurements, and you can get simple spatial data for further analysis.
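To put rough numbers on it: the distance is just the speed of light multiplied by the measured round-trip time, then halved because the light travels out and back. Here's a minimal sketch of that arithmetic in C++ (the 3.3 ns round-trip figure is purely illustrative, not something from [madmcu]'s project):

```cpp
#include <cstdio>

int main() {
    // Light covers roughly 299.8 mm every nanosecond (c converted to mm/ns).
    const double c_mm_per_ns = 299792458.0 * 1e3 * 1e-9;

    // Made-up round-trip time, for illustration only, in nanoseconds.
    const double round_trip_ns = 3.3;

    // The photons travel to the target and back, so halve the round trip
    // to get the one-way distance.
    const double distance_mm = c_mm_per_ns * round_trip_ns / 2.0;

    std::printf("Target is roughly %.0f mm away\n", distance_mm); // ~495 mm
    return 0;
}
```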

The build uses an Arduino Uno R4 Minima, paired with a demo board for the VL53L5CX time-of-flight sensor. The software is developed using NanoEdge AI Studio. In a basic sense, the system uses a machine learning model to classify data captured by the time-of-flight sensor into gestures matching rock, paper, or scissors—or nothing, if no hand is present. If [madmcu]’s tutorial isn’t enough for you, you can take a look at the original version from STMicroelectronics, too.
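The data flow is simpler than the model inside it: grab a grid of distances each frame, flatten it into a feature vector, and hand it to the trained classifier, which spits out one of four labels. The sketch below mimics that loop in plain C++; `read_tof_frame()` and `classify()` are hypothetical stand-ins for the VL53L5CX driver and the NanoEdge AI library, not their real APIs.

```cpp
#include <array>
#include <cstdio>
#include <string>

// One 8x8 depth frame from the sensor, flattened to 64 distances in mm.
using ToFFrame = std::array<float, 64>;

// Hypothetical stand-in for the real VL53L5CX driver call.
ToFFrame read_tof_frame() {
    ToFFrame frame{};
    frame.fill(400.0f);  // pretend a flat hand sits about 40 cm away
    return frame;
}

// Hypothetical stand-in for the trained classifier: returns a class index.
int classify(const ToFFrame& frame) {
    // A real model would look at the depth pattern; here we just
    // return "paper" to keep the example self-contained.
    (void)frame;
    return 2;
}

int main() {
    const std::array<std::string, 4> labels = {"nothing", "rock", "paper", "scissors"};
    const ToFFrame frame = read_tof_frame();
    const int cls = classify(frame);
    std::printf("Detected gesture: %s\n", labels[cls].c_str());
    return 0;
}
```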

It takes some training, and it only works in the right lighting conditions, but this is a functional system that can recognize real hand signs and play the game. We’ve seen similar techniques help more advanced robots cheat at this game before, too! What a time to be alive.

CUDA, But Make It AMD

Compute Unified Device Architecture, or CUDA, is a software platform for doing big parallel calculation tasks on NVIDIA GPUs. It’s been a big part of the push to use GPUs for general-purpose computing, and in some ways, competitor AMD has been left out in the cold as a result. However, with more demand for GPU computation than ever, there’s been a breakthrough. SCALE from [Spectral Compute] will let you compile CUDA applications for AMD GPUs.

SCALE allows CUDA programs to run on AMD GPUs as-is, with no modification required. The SCALE compiler is also intended as a drop-in swap for nvcc, right down to the command-line options. For maximum ease of use, it acts like you’ve installed the NVIDIA CUDA Toolkit, so you can build with CMake just like you would for a normal NVIDIA setup. Currently, Navi 21 and Navi 31 (RDNA 2.0 and RDNA 3.0) targets are supported, while a number of other GPUs are undergoing testing and development.
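To make the pitch concrete, here's a completely generic CUDA C++ vector-add (our own example, not taken from SCALE's documentation); the claim is that source like this builds unchanged whether you point NVIDIA's nvcc at it or SCALE's nvcc-compatible compiler with an AMD target.

```cuda
// vector_add.cu -- plain CUDA C++; SCALE's pitch is that code like this
// builds unchanged for supported AMD GPUs with its nvcc-compatible compiler.
#include <cstdio>
#include <vector>

__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    float *da, *db, *dc;
    cudaMalloc((void**)&da, n * sizeof(float));
    cudaMalloc((void**)&db, n * sizeof(float));
    cudaMalloc((void**)&dc, n * sizeof(float));
    cudaMemcpy(da, ha.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // One thread per element, 256 threads per block.
    vector_add<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    cudaMemcpy(hc.data(), dc, n * sizeof(float), cudaMemcpyDeviceToHost);
    std::printf("c[0] = %f\n", hc[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}
```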

The basic aim is to allow developers to use AMD hardware without having to maintain an entirely separate codebase. It’s still a work in progress, but it’s a promising tool that could help break NVIDIA’s stranglehold on parts of the GPGPU market.


Show Us Your Minimalist Games, And Win

Sometimes the tightest constraints inspire the highest creativity. The 2024 Tiny Games Challenge invites you to have the most fun with the most minimal setup. Whether that’s tiny size, tiny parts count, or tiny code, we want you to show us that big fun can come in small packages.

The Tiny Games Challenge starts now and runs through September 10th, with the top three entries receiving a $150 gift certificate courtesy of DigiKey.

Continue reading “Show Us Your Minimalist Games, And Win”

Giving People An Owl-like Visual Field Via VR Feels Surprisingly Natural

We love hearing about a good experiment, and here’s a pretty neat one: researchers used a VR headset, an off-the-shelf VR360 camera, and some custom software to glue them together. The result? Owl-Vision squashes a full 360° of undistorted horizontal visual perception into 90° of neck travel to either side. One can see all around oneself, without needing to physically turn one’s head any further than is natural.
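One way to picture the remapping: the virtual view's yaw is the head's physical yaw multiplied by a gain, so a 90° turn of the neck swings the view a full 180° around. A minimal sketch of that idea follows; the gain of 2 and the function name are our own illustration, not code or parameters from the Owl-Vision paper.

```cpp
#include <cstdio>

// Illustrative gain only, not taken from the Owl-Vision paper: a 90-degree
// physical head turn yields a 180-degree view rotation, so the full 360
// degrees is covered within +/-90 degrees of neck travel.
constexpr float kYawGain = 2.0f;

float camera_yaw_deg(float head_yaw_deg) {
    return kYawGain * head_yaw_deg;
}

int main() {
    for (float head = -90.0f; head <= 90.0f; head += 45.0f) {
        std::printf("head %+6.1f deg -> view %+6.1f deg\n", head, camera_yaw_deg(head));
    }
    return 0;
}
```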

It’s still a work in progress, and there’s currently no free way to access the paper, but the demonstration video at that link (also embedded below) gives a solid overview of what’s going on.

Continue reading “Giving People An Owl-like Visual Field Via VR Feels Surprisingly Natural”

Smart Ball Technology Has Reached Football, But The Euros Show Us It’s Not Necessarily For The Better

Adidas brought smart balls to Euro 2024, for better or worse. Credit: Adidas

The good old-fashioned game of football used to be a simple affair. Two teams of eleven, plus a few subs, who were all wrangled by a referee and a couple of helpful linesmen. Long ago, these disparate groups lived together in harmony. Then, everything changed when VAR attacked.

Suddenly, technology was being used to adjudicate all kinds of decisions, and fans were cheering or in uproar depending on how the hammer fell. That’s only become more prevalent in recent times, with smart balls the latest controversial addition to the world game. With their starring role in the Euro 2024 championship more than evident, let’s take a look at what’s going on with this new generation of intelligent footballs.

Continue reading “Smart Ball Technology Has Reached Football, But The Euros Show Us It’s Not Necessarily For The Better”

Time’s Up For Mbed

The announcement has come in a forum post that Mbed, ARM’s accessible microcontroller development platform, is to reach end-of-life in July 2026. This means that the online platform and OS will no longer be supported by ARM, though the latter will remain an open-source project. The website will be shuttered, and no new projects can be created on ARM’s infrastructure after that date.

Mbed was originally launched back in 2009 as a competitor to the Arduino IDE for ARM’s chips. Its easy development made it attractive, and there were soon an array of boards from different manufacturers supporting it, but perhaps due to its support for only the one architecture, it failed to find success. It’s certainly not the first time a single-architecture microcontroller development platform has been discontinued; we need only look to the Intel Edison for that. But given the success of ARM platforms in general, it’s still something of a surprise. Perhaps it’s best to take the press release’s explanation at face value: other platforms such as Arduino have simply been much more popular.

Will a community form around an open-source Mbed? Given that it’s been a definite minority among Hackaday projects over the years, we hope it does, but we’re not so sure.

Mbed board image: Viswesr, CC BY-SA 3.0.

Flexures Make Robotic Fingers Simpler To Print

Designing an anthropomorphic robotic hand seems to make a lot of sense — right up until the point that you realize just how complex the human hand is. What works well in bone and sinew often doesn’t translate well to servos and sensors, and even building a single mechanical finger can require dozens of parts.

Or, if you’re as clever about things as [Adrian Perez] is, only one part. His print-in-place robotic finger, adorably dubbed “Fingie,” is a huge step toward simplifying anthropomorphic manipulators. Fingie is printed in PLA and uses flexures for the three main joints of the finger, each of which consists of two separate and opposed coil springs. The flexures allow the phalanges to bend relative to each other in response to the motion of three separate tendons that extend through a channel on the palmar aspect of the finger, very much like the real thing.

The flexures eliminate the need for bearings at each joint and greatly decrease the complexity of the finger, but the model isn’t perfect. As [Adrian] points out, the off-center attachment for the tendons makes the finger tend to curl when the joints are in flexion, which isn’t how real fingers work. That should be a pretty easy fix, though. And while we appreciate the “one and done” nature of this print, we’d almost like to see the strap-like print-in-place tendons replaced with pieces of PLA filament added as a post-processing step, to make the finger more compact and perhaps easier to control.

Despite the shortcomings, and keeping in mind that this is clearly a proof of concept, we really like where [Adrian] is going with this, and we’re looking forward to seeing a hand with five Fingies, or four Fingies and a Thumbie. It stands to be vastly simpler than something like [Will Cogley]’s biomimetic hand, which while an absolute masterpiece of design, is pretty daunting for most of us to reproduce.

Continue reading “Flexures Make Robotic Fingers Simpler To Print”