Quantum Inspired Algorithm Going Back To The Source

Recently, [Jabrils] set out to accomplish a difficult task: porting a quantum-inspired algorithm to run on a (simulated) quantum computer. Algorithms are often inspired by all sorts of natural phenomena. For example, one solution to the traveling salesman problem models ants and their pheromone trails. Another famous example is neural nets, which are inspired by the neurons in your brain. However, attempting to run a machine learning algorithm on your own neurons, even with the assistance of pen and paper, would be a nearly impossible exercise.

The quantum-inspired algorithm in question is known as wave function collapse. In a nutshell, you have a cube of voxels, a graph of nodes, or simply a grid of tiles, along with a list of detailed rules that determine which states a node or tile may take. At the start of the algorithm, every node or tile is in a state of superposition, meaning it could still be any of its possible states. Working from the list of rules, the algorithm then begins to collapse those states. Unlike on a quantum computer, superposition isn't an intrinsic part of a classical computer, so the solving must be done iteratively. To reduce the chance of conflicts and contradictions later down the line, the nodes with the least entropy (the smallest number of remaining states) are solved first: a random state is picked from the possibilities, and the consequences propagate through the system. This process continues until the wave function has collapsed to a stable state or a contradiction is reached.
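The core loop is easier to see in code than in prose. Here's a minimal, illustrative sketch in Python; the grid, tile names, and `allowed` adjacency table are assumptions for the example, not [Jabrils]' actual implementation.

```python
import random

def collapse(grid_w, grid_h, tiles, allowed):
    """Minimal wave-function-collapse loop over a 2D tile grid.

    tiles   -- list of tile identifiers
    allowed -- dict mapping (tile, (dx, dy)) -> set of tiles permitted
               in the neighbouring cell at that offset
    Returns a dict of (x, y) -> tile, or None on contradiction.
    """
    # Every cell starts in superposition: all tiles are still possible.
    wave = {(x, y): set(tiles) for x in range(grid_w) for y in range(grid_h)}

    def neighbours(x, y):
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if (x + dx, y + dy) in wave:
                yield (x + dx, y + dy), (dx, dy)

    def propagate(start):
        stack = [start]
        while stack:
            cell = stack.pop()
            for ncell, direction in neighbours(*cell):
                # A neighbour may only keep options compatible with at least
                # one of this cell's remaining options.
                keep = {t for t in wave[ncell]
                        if any(t in allowed.get((s, direction), ())
                               for s in wave[cell])}
                if keep != wave[ncell]:
                    if not keep:
                        return False          # contradiction
                    wave[ncell] = keep
                    stack.append(ncell)
        return True

    while True:
        undecided = [c for c, opts in wave.items() if len(opts) > 1]
        if not undecided:
            return {c: next(iter(opts)) for c, opts in wave.items()}
        # Lowest entropy first: the cell with the fewest remaining options.
        cell = min(undecided, key=lambda c: len(wave[c]))
        wave[cell] = {random.choice(sorted(wave[cell]))}
        if not propagate(cell):
            return None                       # caller can retry or backtrack
```

On a contradiction, a real implementation would backtrack or simply restart with a different random seed.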

What’s interesting is that the rule set doesn’t need to be hand-coded; it can be inferred from an example. A classic use case of this algorithm is 2D pixel-art level design. Given a small sample level, the algorithm churns away and produces similar but wholly unique output. This makes it easy to generate thousands of unique and beautiful levels from a single source image. However, it comes at a price: even a small level can take hours to fully collapse. In theory, a quantum computer should be able to do this much faster, since, after all, it was the inspiration for this algorithm in the first place.
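To give a flavour of how that inference works, here's a small sketch that reads adjacency rules straight off a tiny example level. The sample tiles and layout are invented for illustration, and the result plugs into the `allowed` table of the sketch above.

```python
from collections import defaultdict

def rules_from_sample(sample):
    """Infer adjacency rules from a small example level.

    sample is a 2D list of tile identifiers. For every pair of adjacent
    tiles we record "tile B may appear at offset (dx, dy) from tile A".
    """
    allowed = defaultdict(set)
    h, w = len(sample), len(sample[0])
    for y in range(h):
        for x in range(w):
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h:
                    allowed[(sample[y][x], (dx, dy))].add(sample[ny][nx])
    return allowed

# A tiny hand-drawn sample: sea ('~'), beach ('.'), grass ('#').
sample = [list("~~..##"),
          list("~~..##"),
          list("~....#")]
tiles = sorted({t for row in sample for t in row})
allowed = rules_from_sample(sample)   # feed this to collapse() above
```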

[Jabrils] spent weeks trying to get things running but ultimately didn’t succeed. However, his efforts give us a peek into the world of quantum computing and this amazing algorithm. We look forward to hearing more about this project from [Jabrils], who is continuing to work on it in his spare time. Maybe give it a shot yourself by learning the basics of quantum computing.

Continue reading “Quantum Inspired Algorithm Going Back To The Source”

Community Testing Suggests Bias In Twitter’s Cropping Algorithm

Social media and online services are now such a huge part of daily life that our entire world is being shaped by algorithms. Arcane in their workings, they are responsible for the content we see and the adverts we’re shown. Just as importantly, they decide what is hidden from view as well.

Important: Much of this post discusses the performance of a live website algorithm. Some of the links in this post may not perform as reported if viewed at a later date. 

The initial Zoom problem that brought Twitter’s issues to light.

Recently, [Colin Madland] posted some screenshots of a Zoom meeting to Twitter, pointing out how Zoom’s background detection algorithm had improperly erased the head of a colleague with darker skin. In doing so, [Colin] noticed a strange effect — although the screenshot he submitted shows both of their faces, Twitter would always crop the image to show just his light-skinned face, no matter the image orientation. The Twitter community raced to explore the problem, and the fallout was swift.

Continue reading “Community Testing Suggests Bias In Twitter’s Cropping Algorithm”

Sort The Rainbow With An Algorithm Machine

When you’re trying to learn how an algorithm works, it’s not always easy to visualize what’s going on. Well, except for maybe binary search, thanks to the phone book. Professor [thatguyer] is a computer science teacher who wanted a way to help his students visualize the process of algorithms and, at the same time, get a grasp on their resource cost.

The Algorithm Machine can demonstrate 8 different search and sort algorithms using two 100-count strips of RGB LEDs — one to represent an array of integers, and one to create indicators pointing to the integers under scrutiny.
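The firmware itself isn't shown here, but the mapping from algorithm to LEDs is easy to picture: each step of a sort yields the current array plus the indices being compared, and those two pieces of state drive the two strips. A rough Python sketch of that idea (not [thatguyer]'s code) might look like this:

```python
def bubble_sort_frames(values):
    """Yield (array_state, highlighted_indices) after every comparison,
    mirroring the two LED strips: one for the values, one for the
    'under scrutiny' indicators."""
    a = list(values)
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            yield list(a), (j, j + 1)         # frame: comparing a[j], a[j+1]
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                yield list(a), (j, j + 1)     # frame: after the swap
    yield list(a), ()                          # final, sorted frame

# Driving a NeoPixel strip would just map each value to a colour and light
# the indicator LEDs at the highlighted indices, one frame per tick.
for frame, cursors in bubble_sort_frames([5, 2, 9, 1]):
    print(frame, cursors)
```

Slowing down or pausing the playback of those frames is exactly what the rotary encoder and play button do on the real machine.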

This functional beauty is totally interactive, too. Once the user chooses the values and the algorithm and starts the process, they can speed it up or slow it down with the rotary encoder, or pause to discuss and start again with that slick triangular play button. We particularly like the control button wiring harness [thatguyer] created to keep everything neat and hot-swappable.

This iteration uses 3D printed face plates to give the LEDs shape, but in an early version, [thatguyer] cut and sanded a ton of circles out of brass tubing, and folded as many triangles cut from disposable baking pans. The world could use more teachers as committed as [thatguyer]. This really seems like a handy teaching aid for these concepts, and we wish we’d had one in class to play around with. Here’s your algorithm for watching the demo: click break, press play, enjoy.

If you’re still confused, there are other ways to understand algorithms through visualization. Failing all that, just watch these Hungarian folk dancers work out various algo-rhythms.

Continue reading “Sort The Rainbow With An Algorithm Machine”

Sara Adkins Is Jamming Out With Machines

Asking machines to make music by themselves is kind of a strange notion. They’re machines, after all. They don’t feel happy or hurt, and as far as we know, they don’t long for the affections of other machines. Humans like to think of music as being a strictly human thing, a passionate undertaking so nuanced and emotion-based that a machine could never begin to understand the feeling that goes into the process of making music, or even the simple enjoyment of it.

The idea of humans and machines having a jam session together is even stranger. But oddly enough, the principles of the jam session may be exactly what machines need to begin to understand musical expression. As Sara Adkins explains in her enlightening 2019 Hackaday Superconference talk, Creating with the Machine, humans and machines have a lot to learn from each other.

To a human musician, a machine’s speed and accuracy are enviable. So is its ability to make instant transitions between notes and chords. Humans are slow to learn these transitions and have to practice going back and forth repeatedly to build muscle memory. If the machine were capable of envy, it would likely covet the human’s passionate performance and musical expression.

Continue reading “Sara Adkins Is Jamming Out With Machines”

Safety Systems For Stopping An Uncontrolled Drone Crash

We spend a lot of time here at Hackaday talking about drone incidents, and today we’re looking into the hazard of operating in areas where people are present. Accidents happen, and whether it’s a catastrophic failure or just a dead battery pack, the chance of a multi-rotor aircraft crashing down onto people below is a real and persistent hazard. For amateur fliers, operating over crowds of people is simply banned, but there are cases where professionally-piloted drones are flying near crowds, and other safety measures need to be considered.

We saw a skier narrowly missed by a falling camera drone in 2015, and a couple of weeks back there was news of a postal drone trial in Switzerland being halted after a parachute system failed. When a multirotor somehow fails in flight, it becomes a multi-kilogram widow-maker equipped with spinning blades. How does it make it to the ground in as safe a manner as possible? Does it fall in uncontrolled flight, or does it activate a failsafe technology and retain some form of control as it descends?

Continue reading “Safety Systems For Stopping An Uncontrolled Drone Crash”

Elegant Drum Machine From Teensy

Playing the drums is pretty hard, especially for the uncoordinated. Doing four things at the same time, all while keeping an even tempo, isn’t reasonable for most of us. Rather than hiring a drummer for your band who is well versed in this art, though, you might opt to outsource the job to a machine instead. It’s cheaper and also less likely to result in spontaneous combustion.

This drum machine is actually a MIDI Euclidean sequencer. Euclidean rhythms are interesting in their own right: the basic idea is that a given number of beats is spread as evenly as possible across a given number of steps, using the Euclidean algorithm to space them, which automatically generates surprisingly musical, complicated patterns. This particular unit is running on a Teensy 3.5 and consists of four RGB rotary encoders, an SSD1306 OLED display, four momentary buttons, and four 16-LED NeoPixel rings. Turning each of the dials increases the number of beats for that particular channel, and it can be configured for an almost limitless combination of beats and patterns.
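The even spacing itself takes only a couple of lines. Here's an illustrative Python version (not the Teensy firmware), using a Bresenham-style formulation that produces the same spacing as Bjorklund's algorithm, possibly rotated:

```python
def euclidean_rhythm(pulses, steps):
    """Spread `pulses` beats as evenly as possible over `steps` slots.
    Returns a list of booleans, True where a beat falls."""
    return [(i * pulses) % steps < pulses for i in range(steps)]

# E(3, 8) gives the familiar tresillo pattern: x . . x . . x .
print(''.join('x' if hit else '.' for hit in euclidean_rhythm(3, 8)))
```

Run four of these with different pulse counts and step lengths, and you get the kind of interlocking patterns each encoder on the sequencer is dialling in.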

To really get a feel for what’s going on here, it’s worth checking out the video after the break. MIDI is also a fascinating standard; beyond being one of the few standards created in the ’80s that still enjoys active use, it can be used to build all kinds of interesting instruments, like one that whacks wine glasses with mallets, or custom synthesizers.

Thanks to [baldpower] for the tip!

Continue reading “Elegant Drum Machine From Teensy”

Simple Quadcopter Testbed Clears The Air For Easy Algorithm Development

We don’t have to tell you that drones are all the rage. But while new commercial models come out all the time and new parts keep arriving for makers, the basic technology in the hardware hasn’t changed in the last few years. Sure, we’ve added more sensors, increased computing power, and improved efficiency, but the key developments now come in the software: you only have to look at the latest models on the market, or the frequency of Git commits to Betaflight, Butterflight, Cleanflight, etc.

With this in mind, for a Hackaday Prize entry [int-smart] is working on a quadcopter testbed for developing algorithms, specifically for localization and mapping. The aim of the project is to eventually make it as easy as possible to get off the ground and start writing code, as well as to integrate mapping algorithms with Ardupilot through ROS.

The initial idea was to use a BeagleBone Blue and some cheap hobby hardware that is fairly standard for a drone of this size: 1250 KV motors and SimonK ESCs, mounted on an F450 Flame Wheel-style frame. However, it looks like an off-the-shelf solution might be even simpler if it can be made to work with ROS. A Scanse Sweep LIDAR sensor provides point cloud data, which is then munched with some Iterative Closest Point (ICP) processing. If you like math, it’s definitely worth reading the project logs, as some of the algorithms are explained there.
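For a rough idea of what that ICP step does, here's a minimal point-to-point ICP sketch in Python using NumPy and SciPy's KD-tree. It's an illustration of the general technique rather than code from the project: match each scan point to its nearest neighbour in the reference cloud, solve for the best rigid transform with an SVD, apply it, and repeat.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=20):
    """Rigidly align `source` (N x 2 points) onto `target` (M x 2 points)
    with plain point-to-point ICP."""
    src = np.asarray(source, dtype=float).copy()
    tgt = np.asarray(target, dtype=float)
    tree = cKDTree(tgt)
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iterations):
        # 1. Correspondences: nearest target point for every source point.
        _, idx = tree.query(src)
        matched = tgt[idx]
        # 2. Best-fit rigid transform (Kabsch / Procrustes via SVD).
        src_c, tgt_c = src.mean(axis=0), matched.mean(axis=0)
        H = (src - src_c).T @ (matched - tgt_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:              # avoid a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = tgt_c - R @ src_c
        # 3. Apply the transform and accumulate it.
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Real scan matchers add outlier rejection, weighting, and convergence checks on top of this core loop, but the idea of iteratively refining correspondences and a rigid transform is the same.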

It might be fun to add FPV to this system to see how the mapping algorithms are performing from the perspective of the drone. And just because it’s awesome. FPV is also a fertile area for hacking: we particularly love this FPV tracker which rotates itself to get the best signal, and this 3D FPV setup using two cameras.