Adding Space Music To The Astronomy Toolbox

Astronomy fans were recently treated to the Great Conjunction, when Jupiter and Saturn appeared close together from the perspective of our planet Earth. Astronomy has given us this and many other magnificent sights, but we can get other senses involved. Science News tells of explorations into adapting our sense of hearing into a tool for astronomical data analysis.

Data visualization has long been a part of astronomy, but it isn’t restricted to charts and graphs that require a trained background to interpret. Every “image” generated using data from radio telescopes (like the recently-lost Arecibo facility) is a visualization of data from outside the visible spectrum. Visualizations also include crowd-pleasing false-color images such as The Pillars of Creation published by NASA, where interstellar emissions captured by science instruments are remapped to colors in the visible spectrum. The results are equal parts art and science, and can be appreciated from either perspective.
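The remapping step itself is conceptually simple. Here is a minimal Python sketch of the idea, with a synthetic intensity map standing in for instrument data; real releases like the Pillars of Creation combine several narrowband exposures per color channel rather than applying a single colormap:

```python
# Minimal false-color sketch: map a single-channel intensity array (which could
# come from any non-visible band) onto visible colors with a colormap.
# The "data" here is synthetic and purely illustrative.
import numpy as np
import matplotlib.pyplot as plt

# Fake "emission intensity" map standing in for instrument data.
y, x = np.mgrid[-2:2:200j, -2:2:200j]
intensity = np.exp(-(x**2 + y**2)) + 0.3 * np.exp(-((x - 1)**2 + (y + 1)**2) * 4)

plt.imshow(intensity, cmap="inferno")   # remap intensities to visible colors
plt.colorbar(label="relative intensity (arbitrary units)")
plt.title("False-color rendering of synthetic data")
plt.savefig("false_color.png", dpi=150)
```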

Data sonification is a whole other toolset with different strengths. Our visual system evolved the ability to pick out edges and patterns in spatial plots, which we exploit for data visualization. In contrast, our aural system evolved the ability to process data in the frequency domain, and the challenge is to figure out how to use those abilities to gain scientifically relevant insight into the data. For now this field of work is more art than science, but it does open another avenue for the visually impaired, some of whom are already active contributors in astronomy and are interested in applying their well-developed sense of hearing to their work.
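To get a feel for what sonification means in practice, here is a minimal Python sketch: map each value in a data series to a pitch and render the result as a WAV file. The data, frequency range, and note length are arbitrary choices for illustration, not any published project’s method.

```python
# Minimal data sonification sketch: map a 1-D data series to pitch.
# Illustrative only -- not the method used by any particular astronomy project.
import numpy as np
import wave

SAMPLE_RATE = 44100          # samples per second
NOTE_SECONDS = 0.2           # duration of each data point's tone

# Pretend this is a series of brightness measurements (made-up data).
data = np.sin(np.linspace(0, 6 * np.pi, 60)) + np.random.normal(0, 0.1, 60)

# Map data values linearly onto a frequency range the ear resolves well.
lo_hz, hi_hz = 220.0, 880.0  # A3 to A5
norm = (data - data.min()) / (data.max() - data.min())
freqs = lo_hz + norm * (hi_hz - lo_hz)

# Render each data point as a short sine tone and concatenate them.
t = np.linspace(0, NOTE_SECONDS, int(SAMPLE_RATE * NOTE_SECONDS), endpoint=False)
tones = np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])
pcm = (tones * 32767 * 0.5).astype(np.int16)  # 16-bit PCM at half volume

with wave.open("sonified.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(pcm.tobytes())
```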

Of course there’s no reason this has to be restricted to astronomy. A few months ago we covered a project for sonification of DNA data. It doesn’t take much to get started, as shown in this student sonification project. We certainly have no shortage of projects that make interesting sounds on this site; perhaps one of them will be the key.

Tracking Drone Flight Path Via Video, Using Cameras We Can Get

Calculating a three-dimensional position from two-dimensional projections is a literal textbook example in geometry, but such examples are the “assume a spherical cow” type of simplification, applicable only in an ideal world where the projections are made with mathematically perfect cameras at precisely known locations with infinite resolution. Making things work in the real world is a lot harder. But not only have [Jingtong Li, Jesse Murray et al.] worked through the math of tracking a drone’s 3D flight from 2D video, they’ve released their MultiViewUnsynch software on GitHub so we can all play with it.
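For reference, here is that idealized textbook version in Python: two perfectly calibrated, perfectly synchronized cameras with known projection matrices, triangulating a single point via the direct linear transform. The matrices, point, and pixel coordinates below are made up for illustration, and everything MultiViewUnsynch actually has to estimate, such as camera poses, clock offsets, and rolling shutter, is simply assumed known here.

```python
# Textbook two-view triangulation (direct linear transform), for illustration only.
# MultiViewUnsynch solves a much harder problem: unknown camera poses,
# unsynchronized clocks, mixed frame rates, and rolling shutter.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its pixel coordinates in two calibrated views.

    P1, P2 : 3x4 camera projection matrices (assumed known here).
    x1, x2 : (u, v) pixel coordinates of the same point in each view.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The 3D point is the null vector of A, found via SVD.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Toy example: two cameras observing a point at (1, 2, 10).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # camera at origin
P2 = np.hstack([np.eye(3), np.array([[-2.0], [0], [0]])])   # shifted 2 units on X
X_true = np.array([1.0, 2.0, 10.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, x1, x2))  # ~[1. 2. 10.]
```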

Instead of laboratory-grade optical instruments, the cameras used in these experiments are available at our local consumer electronics store. A table in their paper Reconstruction of 3D Flight Trajectories from Ad-Hoc Camera Networks (arXiv:2003.04784) lists several Huawei cell phone cameras, a few Sony digital cameras, and a GoPro 3. The video cameras don’t need to be placed in any particular arrangement, because their positions are calculated from the footage itself. Correlating overlapping footage from dissimilar cameras is a challenge in itself, since these cameras record at varying framerates ranging from 25 to 59.94 frames per second. Furthermore, they all have rolling shutters, which adds an extra variable because scanlines within a frame are captured at slightly different times. This is not an easy problem.

There is a lot of interest in tracking drone flights, especially those flying where they are not welcome. And not everyone has the budget for high-end equipment or the permission to emit electromagnetic signals. MultiViewUnsynch is not quite there yet, as it tracks a single target and processes video files after the fact. The eventual goal is to evolve this capability to track multiple targets on live video, and hopefully help reduce frustrating public embarrassments.

[IROS 2020 Presentation video (duration 14:45) requires free registration, available until at least Nov. 25th 2020.]

Illuminating The Inner Workings Of A Venus Flytrap

As carnivorous plants, Venus flytraps have always been a fascinating subject of study. One of their many mysteries is how they differentiate an insect visit from less nutritious stimuli such as a windblown pebble. Now scientists are one step closer to deciphering the underlying mechanism, assisted by a new ability to visualize calcium changes in real time.

Calcium has long been suspected to play an important part in a Venus flytrap’s close/no-close decision process, but scientists couldn’t verify their hypothesis before. Standard chemical tests for calcium would require cutting the plant apart, which would only yield a static snapshot. The software analogy would be killing a process for a memory dump instead of debugging it at runtime. There were tantalizing hints of a biological calcium-based analog computer at work, but mother nature had no reason to evolve JTAG test points on it.

Lacking in-circuit debug headers, scientists turned to the next best thing: adding diagnostic indicator lights. But instead of blinking LEDs, genes were added to produce a protein that glows in the presence of calcium. Once the modification succeeded, they could work with the engineered plants and get immediate visual feedback, watching calcium levels change and propagate in response to various stimuli over different time periods. This confirmed that the trap snaps shut only in response to patterns of stimuli that push calcium levels beyond a threshold.

With these glowing proteins in place, researchers found that calcium explained some of the behavior but was not the whole picture. There’s something else, suspected to be a fast electrical network, that senses prey movement and triggers the calcium release. That’ll be something to dig into, but at least we have more experience working with electrical impulses, and not just in plants, either.


Kinect Gave Us A Preview Of The Future, Though Not The One It Intended

This holiday season, the video game industry hype machine is focused on building excitement for new PlayStation and Xbox consoles. Ten years ago, a similar chorus of hype reached a crescendo with the release of Xbox Kinect, promising to revolutionize how we play. That vision never panned out, but as [Daniel Cooper] of Engadget pointed out in a Kinect retrospective, it premiered consumer technologies that impacted fields far beyond gaming.

Kinect has since been withdrawn from the gaming market, because as it turns out, gamers are quite content with handheld controllers. This year’s new controllers for a PlayStation or Xbox would be immediately familiar to gamers from ten years ago. Even Nintendo, whose Wii is frequently credited as the motivation for Microsoft to develop the Kinect, has arguably taken a step back with the Joy-Cons of its Switch.

But the Kinect’s success at bringing a depth camera to consumer price levels paved the way to explore many ideas that were previously impossible. The flurry of enthusiastic Kinect hacking proved there is a market for depth camera peripherals, leading to plug-and-play devices like Intel RealSense that make depth-sensing projects easier. The original PrimeSense technology has since been simplified and miniaturized into the Face ID system unlocking Apple phones. Kinect itself found another job with Microsoft’s HoloLens AR headset. And let’s not forget the upcoming wave of autonomous cars and drones, many of which will see their worlds via depth sensors of some kind. Some might even be equipped with the latest sensor to wear the Kinect name.

Inside the Kinect was also one of the earliest microphone arrays sold to consumers, enabling the Kinect to figure out which direction a voice is coming from and isolate it from other noises in the room. Such technology was previously the exclusive domain of expensive corporate conference room speakerphones, but now it forms the core of inexpensive home assistants like an Amazon Echo Dot, raising the bar so much that hacks needed many more microphones just to stand out.
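The core trick behind a microphone array is time difference of arrival: a sound reaches each microphone at a slightly different instant, and comparing those delays reveals its direction. Here is a minimal two-microphone sketch in Python; the spacing, sample rate, and synthetic “voice” are made-up illustration values, and the Kinect’s real pipeline is considerably more sophisticated.

```python
# Minimal direction-of-arrival sketch for a two-microphone array, illustrating
# the time-difference-of-arrival idea behind mic-array voice localization.
# Hypothetical numbers for illustration -- not the Kinect's actual pipeline.
import numpy as np

FS = 48000            # sample rate, Hz
MIC_SPACING = 0.2     # meters between the two microphones
SPEED_OF_SOUND = 343.0

# Synthesize a "voice" arriving from 30 degrees off-axis: mic 2 hears it later.
true_angle = np.deg2rad(30)
delay_samples = int(round(MIC_SPACING * np.sin(true_angle) / SPEED_OF_SOUND * FS))

rng = np.random.default_rng(0)
voice = rng.normal(size=FS // 10)                  # 0.1 s of wideband "speech"
mic1 = voice + 0.05 * rng.normal(size=voice.size)
mic2 = np.roll(voice, delay_samples) + 0.05 * rng.normal(size=voice.size)

# Cross-correlate to find the lag at which the two signals line up best.
corr = np.correlate(mic2, mic1, mode="full")
lag = np.argmax(corr) - (len(mic1) - 1)            # delay in samples
tau = lag / FS                                     # delay in seconds

sin_theta = np.clip(tau * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
print(f"estimated direction: {np.degrees(np.arcsin(sin_theta)):.1f} degrees")  # ~30
```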

With the technology more easily available elsewhere, the attrition of a discontinued device is reflected in the dwindling number of recent Kinect hacks on these pages. We still see a cool project every now and then, though. As the classic sensor bar itself recedes into history, others will take its place to give us depth sensing and smart audio. But for many of us, Kinect was the ambitious videogame peripheral that gave us our first taste of them.

Teardown Experts Sing Praise Of Stretch-Release Adhesives

Anyone who enjoys opening up consumer electronics knows iFixit to be a valuable resource, full of reference pictures and repair procedures to help revive devices and keep them out of electronic waste. Champions of repairability, they’ve been watching in dismay as the quest for thinner and lighter devices also made them harder to fix. But they wanted to cheer a bright spot in this bleak landscape: the increasing use of stretch-release adhesives.

[Image: Nokia BL-50J battery. An elegant battery, for a more civilized age.]

Once upon a time, batteries were designed to be user-replaceable. But that required access mechanisms, electrical connectors, and protective shells around fragile battery cells. Eliminating such overhead allowed slimmer devices, but didn’t change the fact that the battery is still likely to need replacement. We thus entered a dark age where battery pouches were glued into devices and replacement meant fighting clingy blobs and cleaning sticky residue, something the teardown experts at iFixit are all too familiar with.

This is why they are happy to see pull tabs whenever they peer inside something, for those tabs signify that the device was blessed with stretch-release adhesives. All we have to do is apply a firm and steady pull on those tabs to release their hold, leaving no residue behind. We get an overview of how this magic works, with the caveat that implementation details are well into the land of patents and trade secrets.

But we do get tips on how to best remove them, and how to reapply new strips, both important to iFixit’s mission. There’s also a detour into their impact on the interior design of a device: the tabs have to be accessible, and they need room to stretch. These concerns aren’t just for design engineers; they also apply to the stretch-release adhesives sold to consumers. The advertising push by 3M Command and its competitors has already begun, reminding people that stretch-release adhesive strips are ideal for temporary holiday decorations. They would also work well to hold batteries in our own projects, even if we aren’t their advertised targets.

Our end-of-year gift-giving traditions will mean a new wave of gadgets. And while not all of them will be easily repairable, we’re happy that this tiny bit of repairability exists. Every bit helps to stem the flow of electronic waste.

Fail Of The Week: Roboracer Meets Wall

There comes a moment when our project sees the light of day, publicly presented to people who are curious to see the results of all our hard work, only for it to fail in a spectacularly embarrassing way. This is the dreaded “Demo Curse” and it recently befell the SIT Acronis Autonomous team. Their Roborace car gained social media infamy as it was seen launching off the starting line and immediately into a wall. A team member explained what happened.

A few explanations had started circulating, but only in the vague terms of a “steering lock” without much technical detail, until this account emerged. Steering lock? You mean like The Club? Well, sort of. While there was no steel bar immobilizing the steering wheel, a software equivalent did take hold within the car’s systems. During initialization, while a human driver was at the controls, one of the modules sent out NaN (Not a Number) instead of a valid numeric value. This was never seen in testing, and it wreaked havoc at the worst possible time.

A module whose job was to ensure numbers stay within expected bounds said “not a number, not my problem!” That NaN value propagated through to the vehicle’s CAN data bus, which didn’t define how NaN should be handled, so it was arbitrarily translated into a very large number, causing further problems. This cascade of events resulted in a steering control system locked to full right before the algorithm was given permission to start driving. It desperately tried to steer the car back on course, without effect, for the few short seconds until it met the wall.
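The first link in that chain, a range check that waves NaN through, is easy to reproduce. In most languages every comparison against NaN evaluates to false, so a check written as “reject if below minimum or above maximum” never fires. A hypothetical Python sketch, not the team’s actual code:

```python
# Why NaN slips past a naive range check: every comparison with NaN is False.
# Hypothetical sketch -- not the team's actual code.
import math

STEER_MIN, STEER_MAX = -1.0, 1.0

def naive_range_check(cmd):
    # Intended to reject out-of-range steering commands...
    if cmd < STEER_MIN or cmd > STEER_MAX:
        raise ValueError("steering command out of range")
    return cmd

def safe_range_check(cmd):
    # ...but NaN compares False against everything, so it must be tested explicitly.
    if math.isnan(cmd) or not (STEER_MIN <= cmd <= STEER_MAX):
        raise ValueError("steering command invalid")
    return cmd

nan_cmd = float("nan")
print(naive_range_check(nan_cmd))   # prints nan -- sails straight through
try:
    safe_range_check(nan_cmd)
except ValueError as err:
    print("caught:", err)
```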

While embarrassing and not the kind of publicity the Schaffhausen Institute of Technology or their sponsor Acronis was hoping for, the team dug through logs to understand what happened and taught their car to handle NaN properly. With a backup car, round two went very well and the team took second place, so they had a happy ending after all. Congratulations! We’re very happy this problem was found and fixed on a closed track and not on public roads.

[via Engadget]

One Wheel Is All We Need To Roll Into Better Multirotor Efficiency

Multirotor aircraft enjoy many intrinsic advantages, but as machines that fight gravity with brute force, they are not known for energy efficiency. In the interest of stretching range, several air-ground hybrid designs have been explored: flying cars, basically, built to run on the ground when it isn’t strictly necessary to be airborne. But they all share the same challenge: components that make a car work well on the ground are range-sapping dead weight while in the air. [Youming Qin et al.] explored cutting that dead weight as much as possible and came up with Hybrid Aerial-Ground Locomotion with a Single Passive Wheel.

As the paper’s title makes clear, they went full minimalist with this design. Gone are the driveshaft, brakes, steering, even other wheels. All that remains is a single unpowered wheel bolted to the bottom of their dual-rotor flying machine. Minimizing the impact on flight characteristics is great, but how would that work on the ground? As a tradeoff, the rotors have to keep spinning even while in “ground mode”. They are responsible for keeping the machine upright, and they also have to handle tasks like steering. These and other control algorithm problems had to be sorted out before evaluating whether such a compromised ground vehicle is worth the trouble.

Happily, the result is a resounding “yes”. Even though the rotors have to continue running to do different jobs while on the ground, that still takes far less effort than hovering in the air. Power consumption measurements indicate savings of up to 77%, and there are a lot of potential avenues for tuning still awaiting future exploration. Among them is a better understanding of the interaction with ground effect, which is something we’ve seen enable novel designs. This isn’t exactly the flying car we were promised, but its development will still be interesting to watch among all the other neat ideas under development to keep multirotors in the air longer.

[IROS 2020 Presentation video (duration 10:49) requires no-cost registration, available until at least Nov. 25th 2020. Forty-two second summary embedded below]
