Quantum Inspired Algorithm Going Back To The Source

Recently, [Jabrils] set out to accomplish a difficult task: porting a quantum-inspired algorithm to run on a (simulated) quantum computer. Algorithms are often inspired by all sorts of natural phenomena. For example, one solution to the traveling salesman problem models ants and their pheromone trails. Another famous example is neural nets, which are inspired by the neurons in your brain. However, attempting to run a machine learning algorithm on your actual neurons, even with the assistance of pen and paper, would be a nearly impossible exercise.

The quantum-inspired algorithm in question is known as the wave function collapse algorithm. In a nutshell, you have a cube of voxels, a graph of nodes, or simply a grid of tiles, as well as a list of detailed rules that determine the state of a node or tile. At the start of the algorithm, each node or tile is in a state of superposition, meaning it is considered to be in every possible state at once. Looking at the list of rules, the algorithm then begins to collapse the states. Unlike on a quantum computer, superposition is not an intrinsic part of a classical computer, so this solving must be done iteratively. In order to reduce possible conflicts and contradictions later down the line, the nodes with the least entropy (the smallest number of possible states) are solved first. Random states are assigned to those nodes, with the changes propagating through the system. This process continues until the wave function has collapsed to a stable state or a contradiction is reached.
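If you want a feel for how that loop works, here is a minimal Python sketch. The tiles, adjacency rules, and grid size are made up for illustration (this is not [Jabrils]'s code or a real level generator), but it shows the collapse-lowest-entropy-then-propagate cycle described above.

```python
import random

# Toy wave function collapse: every cell starts as the full set of tiles
# ("superposition"), and RULES says which tiles may sit next to each other.
TILES = {"sea", "coast", "land"}
RULES = {
    "sea":   {"sea", "coast"},
    "coast": {"sea", "coast", "land"},
    "land":  {"coast", "land"},
}
W, H = 8, 8
grid = {(x, y): set(TILES) for x in range(W) for y in range(H)}

def neighbors(x, y):
    for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
        if (nx, ny) in grid:
            yield nx, ny

def propagate(start):
    # Ripple a change outward: a neighbor may only keep options that are
    # still compatible with something left in the changed cell.
    stack = [start]
    while stack:
        x, y = stack.pop()
        allowed = {t for option in grid[(x, y)] for t in RULES[option]}
        for nx, ny in neighbors(x, y):
            reduced = grid[(nx, ny)] & allowed
            if reduced != grid[(nx, ny)]:
                grid[(nx, ny)] = reduced
                stack.append((nx, ny))

while any(len(options) > 1 for options in grid.values()):
    # Collapse the lowest-entropy cell (fewest remaining possibilities).
    cell = min((c for c in grid if len(grid[c]) > 1), key=lambda c: len(grid[c]))
    grid[cell] = {random.choice(sorted(grid[cell]))}
    propagate(cell)
    if any(not options for options in grid.values()):
        raise RuntimeError("contradiction reached; restart or backtrack")

for y in range(H):
    print(" ".join(next(iter(grid[(x, y)]))[0] for x in range(W)))
```

Even this toy version shows where the cost comes from on classical hardware: every collapse triggers another round of constraint propagation across the whole grid.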

What’s interesting is that the ruleset doesn’t need to be hand-coded; it can be inferred from an example. A classic use case of this algorithm is 2D pixel-art level design. Given a small sample level, the algorithm churns away and produces similar but wholly unique output. This makes it easy to generate thousands of unique and beautiful levels from a single source image, but it comes at a price: even a small level can take hours to fully collapse. In theory, a quantum computer should be able to do this much faster, since, after all, it was the inspiration for this algorithm in the first place.

[Jabrils] spent weeks trying to get things running but ultimately didn’t succeed. However, his efforts give us a peek into the world of quantum computing and this amazing algorithm. We look forward to hearing more about this project from [Jabrils], who is continuing to work on it in his spare time. Maybe give it a shot yourself by learning the basics of quantum computing.

Continue reading “Quantum Inspired Algorithm Going Back To The Source”

Jetson Emulator Gives Students A Free AI Lesson

With the Jetson Nano, NVIDIA has done a fantastic job of bringing GPU-accelerated machine learning to the masses. For less than the cost of a used graphics card, you get a turn-key Linux computer that’s ready and able to handle whatever AI code you throw at it. But if you’re trying to set up a lab for 30 students, the cost of even relatively affordable development boards can really add up.

Spoiler: These things don’t exist.

Which is why [Tea Vui Huang] has developed jetson-emulator. This Python library provides a work-alike environment to NVIDIA’s own “Hello AI World” tutorials designed for the Jetson family of devices, with one big difference: you don’t need the actual hardware. In fact, it doesn’t matter what kind of computer you’ve got; with this library, anything that can run Python 3.7.9 or better can take you through NVIDIA’s getting started tutorial.

So what’s the trick? Well, if you haven’t guessed already, it’s all fake. Obviously it can’t actually run GPU-accelerated code without a GPU, so the library [Tea] has developed simply pretends. It provides virtual images and even “live” camera feeds to which randomly generated objects have been assigned.

The original NVIDIA functions have been rewritten to work with these feeds, so when you call something like net.Classify(img) against one of them you’ll get a report of what faux objects were detected. The output will look just like it would if you were running on a real Jetson, down to providing fictitious dimensions and positions for the bounding boxes.
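For a sense of what that looks like in practice, here is a sketch of the classification step from the “Hello AI World” flow. The imageNet, Classify(), and GetClassDesc() calls mirror NVIDIA’s jetson.inference tutorial API; exactly how jetson-emulator is imported in its place is an assumption here, so treat the import line as illustrative.

```python
# Sketch of the "Hello AI World" classification flow against the emulator.
# The import path is an assumption; the imageNet/Classify calls follow the
# NVIDIA tutorial API that the library reimplements.
from jetson_emulator import inference, utils  # hypothetical import path

net = inference.imageNet("googlenet")        # "load" a classification network
img = utils.loadImage("example.jpg")         # the emulator hands back a virtual image
class_idx, confidence = net.Classify(img)    # classify the faux image
print(f"class #{class_idx} ({net.GetClassDesc(class_idx)}) "
      f"at {confidence * 100:.1f}% confidence")
```

The point is that the code a student writes and the results they read back look the same as on real hardware, even though everything behind the calls is simulated.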

If you’re a hacker looking to dive into machine learning and computer vision, you’d be better off getting a $59 Jetson Nano and a webcam. But if you’re putting together a workshop that shows a dozen people the basics of NVIDIA’s AI workflow, jetson-emulator will allow everyone in attendance to run code and get results back regardless of what they’ve got under the hood.

Should You Build For Windows, Mac, IOS, Android, Or Linux? Yes!

The holy grail of computer languages is to write code once and have it deploy effortlessly everywhere. Java likes to take credit for the idea, but UCSD P-Code was way before that, and you could argue that mainframes had I/O abstraction like Fortran unit numbers even earlier. More modern efforts include Qt, GTK, and others. Naturally, all of these fall short in some way. Now Google enters the fray with Flutter.

Flutter isn’t new, but in the past, it only handled Android and iOS. Now it can target desktop platforms and can even produce JavaScript. We haven’t played with the system enough to say how successful it is, but you can try it in your browser if you want some first-hand experience.

Continue reading “Should You Build For Windows, Mac, IOS, Android, Or Linux? Yes!”

Even More Firmware In Your Firmware

There are many ways to update an embedded system in the field. Images can fly through the air one at a time, travel by sneaker, or hitch a ride on other passing data. OK, maybe that’s a stretch, but there are certainly a plethora of ways to get those sweet update bytes into a target system. How are those bytes assembled, and what are the tools that do the assembly? This is the problem I needed to solve.

Recall that my system wasn’t a particularly novel one (see the block diagram below). Just a few computers asking each other for an update over some serial buses. I had chosen to bundle the payload firmware images into the binary for the intermediate microcontroller, which was to carry out the update process. The additional constraint was that the blending of the three firmware images (one carrier and two payload) needed to happen long after compile time, on a different system with a separate toolchain. There were ultimately two options that fit the bill.

The system thirsty for an update
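To make the idea concrete, here is a rough Python sketch of what a post-compile blend like that can look like: appending the payload images to the carrier binary behind a small descriptor table. The layout, file names, and header format are invented for illustration and are not the format this project actually uses.

```python
import struct
import zlib
from pathlib import Path

# Hypothetical bundling step: append the payload images to the carrier
# binary behind a table of (offset, length, CRC32) entries so the
# intermediate micro can locate and verify them at update time.
def bundle(carrier_path, payload_paths, out_path):
    carrier = Path(carrier_path).read_bytes()
    payloads = [Path(p).read_bytes() for p in payload_paths]

    header = struct.pack("<I", len(payloads))          # payload count
    offset = len(carrier) + 4 + 12 * len(payloads)     # where the first payload lands
    for blob in payloads:
        header += struct.pack("<III", offset, len(blob), zlib.crc32(blob))
        offset += len(blob)

    Path(out_path).write_bytes(carrier + header + b"".join(payloads))

bundle("carrier.bin", ["payload_a.bin", "payload_b.bin"], "combined.bin")
```

Because the bundling works on finished binaries, it can run on a completely different machine, long after the individual images were compiled.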

Continue reading “Even More Firmware In Your Firmware”

Hyper Links And Hyperfunctional Text CAD

Strong opinions exist on both sides about OpenSCAD. The lightweight program takes megabytes of space, not gigabytes, so many people have a copy, even if they’ve never written a shape. Some people adore the text-only modeling language, and some people abhor the minimal function list. [Johnathon ‘Zalo’ Selstad] appreciates the idea but wants to see something more robust, and he wants to see it in your browser. His project CascadeStudio has a GitHub repo and a live link so you can start tinkering in a new window straight away.

Continue reading “Hyper Links And Hyperfunctional Text CAD”

Boot Sector Pong As A Crash Course In Assembly

Have you ever wanted to develop a playable game small enough to fit into a disk’s 512 byte boot sector? How about watching somebody develop a program in assembly for nearly two hours? If you answered yes to either of those questions, or ideally both of them, you’re going to love this project from [Queso Fuego].

Whether you just want to check out the public domain source code or watch along as he literally starts from a blank file and codes every line for your viewing pleasure, chances are good that you’ll pick up a trick or two from this project. For example, he explains how all of the “graphics” in the game are done in 80 x 25 text mode simply by setting the background color of character cells without printing any text to them.
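The trick relies on how the 80 x 25 text buffer is laid out, and a quick sketch makes it clear why no visible text is needed. This is a Python illustration of the addressing, not the assembly from the video, and the coordinates and colors are made up.

```python
# Each cell in the 80x25 VGA text buffer (segment 0xB800) is two bytes:
# the character code, then an attribute byte whose high nibble is the
# background color and low nibble the foreground color.
def cell_offset(col, row, width=80):
    return (row * width + col) * 2

def attribute(foreground, background):
    return ((background & 0x7) << 4) | (foreground & 0xF)

# Paint a "pixel" at column 10, row 5: a space character on a red (0x4)
# background shows up as a solid colored block with nothing printed in it.
offset = cell_offset(10, 5)
attr = attribute(foreground=0x0, background=0x4)
print(f"write 0x20 (space) at B800:{offset:04X}")
print(f"write 0x{attr:02X} (attribute) at B800:{offset + 1:04X}")
```

In the boot sector version the same arithmetic is done in a handful of instructions against the video segment, which helps keep the whole game within 512 bytes.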

We really like the presentation in the video after the break, which was recorded over the course of multiple days, judging by the changing light levels in the background. As he types out each line of code, he explains what its function is and gives any background information necessary to explain how it will fit into the larger program. If you’ve ever wondered if you had what it takes to program in ASM, watching this video is a great way to decide.

[Queso Fuego] mentions that this project, and his research into this sort of low-level programming, came about due to the social distancing boredom that many of us are feeling. While we’re certainly not advocating for him to be kept locked in his home permanently, with projects like this, you’ve got to admit it seems like a win for the rest of us.

Continue reading “Boot Sector Pong As A Crash Course In Assembly”

Parsing Math In Python

Programming computers used to be harder. Don’t get us wrong — today, people tend to solve harder problems with computers, but the fundamental act of programming is easier. We have high-level languages, toolkits, and even help from our operating systems. Most people never have to figure out how to directly read from a disk drive, deblock the data into records, and perform multiplication using nothing but shifts and adds. While that’s a good thing, sometimes it is good to study the basics. That was [gnebehay’s] thought when his university studies felt too high-level, so he decided to write an arithmetic expression parser in Python. It came out in about 100 lines of code.

Interpreting math expressions is one of those things that seems simple until you get into it. The first problem is correctly lexing the input — a term that means splitting it into tokens. To a human, it seems obvious that 5-3 is three tokens, {5, -, 3}, and that’s easy to figure out. But what about 5+-3? That’s also three tokens: {5, +, -3}. Tricky.
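Here is a minimal lexer sketch in that spirit (a handful of lines, not [gnebehay]’s actual code) that folds a unary minus into the number that follows it:

```python
import re

def tokenize(expr):
    """Split an arithmetic expression into tokens, merging a unary minus
    with the number that follows it."""
    raw = re.findall(r"\d+(?:\.\d+)?|[()+\-*/]", expr.replace(" ", ""))
    tokens = []
    i = 0
    while i < len(raw):
        tok = raw[i]
        prev = tokens[-1] if tokens else None
        # A '-' is unary when it begins the expression or follows an
        # operator or '('; glue it onto the number that comes next.
        if (tok == "-" and (prev is None or prev in "+-*/(")
                and i + 1 < len(raw) and raw[i + 1][0].isdigit()):
            tokens.append("-" + raw[i + 1])
            i += 2
        else:
            tokens.append(tok)
            i += 1
    return tokens

print(tokenize("5-3"))   # ['5', '-', '3']
print(tokenize("5+-3"))  # ['5', '+', '-3']
```

With a clean token stream like this, the parser proper only has to worry about operator precedence and grouping.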

Continue reading “Parsing Math In Python”