
Hackaday Links: February 18, 2024

So it turns out that walking around with $4,000 worth of hardware on your head isn’t quite the peak technology experience that some people thought it would be. We’re talking about the recently released Apple Vision Pro headset, which early adopters are lining up in droves to return. Complaints run the gamut from totally foreseeable episodes of motion sickness to neck pain from supporting the heavy headset. Any eyeglass wearer can certainly attest to even lightweight frames and lenses becoming a burden by the end of the day. We can’t imagine what it would be like to wear a headset like that all day. Ergonomic woes aside, some people are feeling buyer’s remorse thanks to a lack of apps that do anything to justify the hefty price tag. The evidence for a wave of returns is mostly gleaned from social media posts, so it has to be taken with a grain of salt. We wouldn’t expect Apple to be too forthcoming with official return figures, though, so the ultimate proof of uptake will probably be how often you spot one in the wild. Apart from a few cities and only for the next few weeks, we suspect sightings will be few and far between.



Hackaday Links: February 11, 2024

Apple’s Vision Pro augmented reality goggles made a big splash in the news this week, and try as we might to resist the urge to dunk on them, early adopters spotted in the wild are making it way too easy. Granted, we’re not sure how many of these people are actually early adopters as opposed to paid influencers, but there was still quite a bit of silliness to be had, most of it on X/Twitter. We’d love to say that peak idiocy was achieved by those who showed themselves behind the wheels of their Teslas while wearing their goggles, with one aiming for an early adopter perfecta, but alas, most of these stories appear to be at least partially contrived. Some people were spotted doing their best to get themselves killed, while others were content to just look foolish, especially since we’ve heard that the virtual keyboard is currently too slow for anything but hunt-and-peck typing, which Casey Neistat seemed to confirm with his field testing. After seeing all this, we’re still unsure why someone would strap $4,000 worth of peripheral-vision-restricting and easily fenced hardware to their heads, but hey — different strokes. And for those of you wondering why these things are so expensive, we’ve got you covered.


Beautifully Rebuilding A VR Headset To Add AR Features

[PyottDesign] recently wrapped up a personal project to create a custom headset that could serve as an augmented reality (AR) platform and make it easier to develop new applications, all in one device that could do everything he needed. He succeeded wonderfully, and published a video showcase of the finished project.

Getting a headset with the features he wanted wasn’t possible by buying off the shelf, so he accomplished his goals with a skillful custom repackaging of a Quest 2 VR headset, integrating a Stereolabs Zed Mini stereo camera (aimed at mixed reality applications) and an Ultraleap IR 170 hand tracking module. These hardware modules have tons of software support and are not very big, but when sticking something onto a human face, every millimeter and gram counts.


Simple Cubes Show Off AI-Driven Runtime Changes In VR

AR and VR developer [Skarredghost] got pretty excited about a virtual blue cube, and for a very good reason. It marked a successful prototype of an augmented reality experience in which the logic underlying the cube as a virtual object was changed by AI in response to verbal direction by the user. Saying “make it blue” did indeed turn the cube blue! (After a little thinking time, of course.)

It didn’t stop there, of course, and the blue cube proof-of-concept led to a number of simple demos. The first shows off a row of cubes changing color from red to green in response to musical volume, then a bundle of cubes changes size in response to microphone volume, and cubes even start moving around in space.

The program accepts spoken input from the user, converts it to text, and sends it to a natural language AI model, which then creates the necessary modifications and loads them into the environment to make runtime changes in Unity. The workflow is a bit cumbersome and highlights many of the challenges involved, but it works and that’s pretty nifty.
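In rough outline, the pipeline looks something like the toy sketch below. To be clear, every function name here is an illustrative stand-in, not the project’s actual API: a real build would call a speech-to-text service and a natural language model where this sketch uses a trivial keyword parser, and the “scene” would live in Unity rather than a Python dictionary.

```python
# Toy sketch of the voice -> text -> model -> runtime-change pipeline.
# All names are hypothetical stand-ins for the real STT/LLM/Unity pieces.

COLORS = {"red", "green", "blue", "yellow"}

def speech_to_text(audio_bytes: bytes) -> str:
    """Stand-in for a real speech-to-text engine."""
    # Pretend the audio has already been transcribed to a command string.
    return audio_bytes.decode("utf-8")

def command_to_modification(text: str) -> dict:
    """Stand-in for the natural language model that emits a scene change."""
    words = text.lower().split()
    for color in COLORS:
        if color in words:
            return {"action": "set_color", "value": color}
    if "bigger" in words:
        return {"action": "scale", "value": 2.0}
    return {"action": "noop", "value": None}

def apply_to_scene(scene: dict, mod: dict) -> dict:
    """Stand-in for the runtime step that mutates the virtual object."""
    if mod["action"] == "set_color":
        scene["cube_color"] = mod["value"]
    elif mod["action"] == "scale":
        scene["cube_scale"] = scene.get("cube_scale", 1.0) * mod["value"]
    return scene

scene = {"cube_color": "white", "cube_scale": 1.0}
mod = command_to_modification(speech_to_text(b"make it blue"))
scene = apply_to_scene(scene, mod)
print(scene["cube_color"])  # blue
```

The real system’s “thinking time” comes from the model round-trip in the middle step; the rest of the plumbing is comparatively cheap.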

The GitHub repository is here and a good demonstration video is embedded just under the page break. There’s also a video with a much more in-depth discussion of what’s going on and a frank exploration of the technical challenges.

If you’re interested in this direction, it seems [Skarredghost] has rounded up the relevant details. And should you have a prototype idea that isn’t necessarily AR or VR but would benefit from AI-assisted speech recognition that can run locally? This project has what you need.


Inspect The RF Realm With Augmented Reality

Intellectually, we all know that we exist in a complex soup of RF energy. Cellular, WiFi, TV, public service radio, radar, ISM-band transmissions from everything from thermometers to garage door openers — it’s all around us. It would be great to see these transmissions, but alas, most of us don’t come from the factory with the correct equipment.

Luckily, aftermarket accessories like RadioFieldAR by [Manahiyo] make it possible to visualize RF signals. As the name suggests, this is an augmented reality system that lets you inspect the RF world around you. The core of the system is a tinySA, a pocket-sized spectrum analyzer that acts as a broadband receiver. A special antenna is connected to the tinySA; unfortunately, there are no specifics on the antenna other than that it needs to have a label with an image of the Earth attached to it, for antenna tracking purposes. The tinySA is connected to an Android phone — one that supports Google’s ARCore — by a USB OTG cable, and a special app on the phone runs the show.

By slowly moving the antenna around in the field of view of the phone’s camera, a heat map of signal strength at a particular frequency is slowly built up. The video below shows it in action, and the results are pretty cool. If you don’t have a tinySA, fear not — [Manahiyo] has a version of the app that supports a plain old RTL-SDR dongle too. That should make it easy for just about anyone to try this out.
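The accumulation step can be sketched roughly as follows: each sample pairs an antenna position (recovered by tracking the Earth-image label in the camera frame) with a signal-strength reading at the selected frequency, and readings falling in the same cell are averaged. The grid resolution, coordinate convention, and averaging scheme here are assumptions for illustration, not details of [Manahiyo]’s app.

```python
# Sketch: build a heat map of average RSSI from (position, reading) samples.
import numpy as np

GRID = 32  # assumed heat-map resolution over the camera's field of view

def accumulate(samples, grid=GRID):
    """samples: iterable of (x, y, rssi_dbm) with x, y normalized to [0, 1)."""
    total = np.zeros((grid, grid))
    count = np.zeros((grid, grid))
    for x, y, rssi in samples:
        i, j = int(y * grid), int(x * grid)  # map position to a grid cell
        total[i, j] += rssi
        count[i, j] += 1
    # Average dBm per cell; NaN marks cells the antenna never visited.
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

samples = [(0.10, 0.10, -80.0), (0.10, 0.12, -78.0), (0.90, 0.90, -40.0)]
heat = accumulate(samples)
print(np.nanmax(heat))  # strongest cell: -40.0
```

Sweeping slowly matters because each cell only gets the average of whatever readings happen to land in it; move too fast and the map stays mostly NaN.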

And if you’re feeling deja vu about this, you’re probably remembering [Manahiyo]’s VR spectrum analyzer, upon which this project is based.


Hackaday Prize 2022: Hedge Watcher Aims To Save Precious Bird Life

Hedges aren’t just a pretty garden decoration. They’re also a major habitat for many species of insects, birds, and other wildlife. In some areas, a lot of hedge trimming goes on during the time that local birds are raising their fledglings, which causes harm at a crucial moment. Thus, [Johann Elias Stoetzer] and fellow students were inspired to create Hedge Watcher.

Birds can easily blend in with their surroundings, but thermal cameras are a great way to spot them.

The concept is simple – using thermal vision to spot birds inside a hedge when they may not otherwise be easily visible. Many species blend in visually with their surroundings, so thermal imaging is a great way to get around this. It can help to avoid destroying nests or otherwise harming birds when trimming back hedges. The idea was sourced from large-scale agricultural operations, which regularly use thermal cameras mounted on drones to look for wildlife before harvesting a field.
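The principle is easy to see in miniature: a warm body shows up as a cluster of above-threshold pixels against cooler foliage in the thermal frame. The temperatures and threshold below are illustrative guesses, not values from the students’ build.

```python
# Toy illustration of thermal spotting: a warm bird against a cool hedge.
import numpy as np

AMBIENT_C = 12.0    # assumed hedge/foliage temperature
BIRD_C = 38.0       # assumed bird body-surface temperature
THRESHOLD_C = 25.0  # anything this warm inside a hedge merits a closer look

frame = np.full((16, 16), AMBIENT_C)  # simulated thermal frame
frame[6:9, 7:10] = BIRD_C             # a small warm body hidden in the hedge

warm = frame > THRESHOLD_C
print(warm.any(), int(warm.sum()))  # True 9
```

A real system faces messier frames (sun-warmed branches, partial occlusion), which is why keeping a human in the loop via the AR feed makes sense.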

However, staring at a thermal camera readout every few seconds while trimming hedges isn’t exactly practical. Instead, the students created an augmented reality (AR) monocular to allow the user to trim hedges at the same time as keeping an eye on the thermal camera feed. Further work involved testing a binocular AR headset, as well as a VR headset. The AR setups proved most useful as they allowed for better situational awareness while working.

It’s a creative solution to protecting the local birdlife, and is to be applauded. There’s plenty of hype around potential uses for augmented reality, but this is a great example of a real and practical one. And, if you’re keen to experiment with AR yourself, note that it doesn’t have to break the bank either!



Hackaday Links: May 15, 2022

It may be blurry and blotchy, but it’s ours. The first images of the supermassive black hole at the center of the Milky Way galaxy were revealed this week, and they caused quite a stir. You may recall the first images of the supermassive black hole at the center of the M87 galaxy from a couple of years ago: spectacular images that captured exactly what all the theories said a black hole should look like, or more precisely, what the accretion disk and event horizon should look like, since black holes themselves aren’t much to look at. That black hole, dubbed M87*, is over 55 million light-years away, but is so huge and so active that it was relatively easy to image. The black hole at the center of our own galaxy, Sagittarius A*, is comparatively tiny (its event horizon would fit inside the orbit of Mercury) but much closer, at only 26,000 light-years or so. Our black hole is much less active and obscured by dust, however, so imaging it was far more difficult. It’s a stunning technical achievement, and the images are certainly worth checking out.

Another one from the “Why didn’t I think of that?” files — contactless haptic feedback using the mouth is now a thing. This comes from the Future Interfaces Group at Carnegie Mellon and is intended to provide an alternative to what ends up being about the only practical haptic device for VR and AR applications — vibrations from off-balance motors. Instead, this uses an array of ultrasonic transducers positioned on a VR visor and directed at the user’s mouth. By properly driving the array, pressure waves can be directed at the lips, teeth, and tongue of the wearer, providing feedback for in-world events. The mock game demonstrated in the video below is a little creepy — not sure how many people enjoyed the feeling of cobwebs brushing against the face or the splatter of spider guts in the mouth. Still, it’s a pretty cool idea, and we’d like to see how far it can go.
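“Properly driving the array” boils down to classic phased-array focusing: delay each transducer so that every wavefront arrives at the target point in phase, concentrating pressure there. The sketch below computes those per-element delays for a generic linear array; the geometry and 40 kHz frequency are common ultrasonic-transducer assumptions, not details of the CMU hardware.

```python
# Sketch: per-element delays to focus a linear ultrasonic array on a point.
import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # Hz, a typical ultrasonic transducer frequency

def focus_delays(element_xs, focal_point):
    """Time delays (s) so all elements' waves reach focal_point together."""
    fx, fy = focal_point
    dists = [math.hypot(fx - x, fy) for x in element_xs]
    farthest = max(dists)
    # Elements closer to the focus fire later, so all arrivals coincide.
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]

# 8-element array at 5 mm pitch, focusing 10 cm straight ahead of center
xs = [i * 0.005 for i in range(8)]
center = (len(xs) - 1) * 0.005 / 2
delays = focus_delays(xs, (center, 0.10))
print(max(delays) - min(delays))  # spread on the order of microseconds
```

Steering the focus around the mouth is then just recomputing the delays for a new focal point, fast enough to track in-world events.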
