Genetic Algorithm Runs On Atari 800 XL

For the last few years, the story in artificial intelligence that was accepted without question was that the big names in the field needed more compute, more resources, more energy, and more money to build better models. But simply throwing money and GPUs at these companies led to them getting complacent, ripe to be upset by an underdog with a fraction of the computing resources and funding. Perhaps that should have been more obvious from the start, since people have long been building machine learning algorithms on extremely limited computing platforms, like this one built on the Atari 800 XL.

Unlike models that rely on memory-intensive methods like gradient descent to train their neural networks, [Jean Michel Sellier] is using a genetic algorithm to work within the confines of the platform. Genetic algorithms evaluate potential solutions by evolving them over many generations, keeping the ones that work best each time. The changes made to the survivors before the next round of evolution can be introduced in many ways, but for a limited system like this a quick approach is to make small random mutations. [Jean]’s program, written in BASIC, performs 32 generations of evolution to predict the points that lie on a simple mathematical function.
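That mutate-and-select loop is simple enough to sketch in a few lines. Here’s a minimal Python illustration of the idea – fitting a line to sample points over 32 generations – not [Jean]’s actual BASIC listing; the target function, population size, and mutation rate are our own stand-ins:

```python
import random

def target(x):
    return 0.5 * x + 0.2          # function to approximate (our stand-in)

XS = [i / 10 for i in range(10)]  # sample points along the curve

def fitness(genome):
    a, b = genome
    # Sum of squared errors against the target; lower is better
    return sum((a * x + b - target(x)) ** 2 for x in XS)

# Start with small random genomes: one slope and one intercept each
population = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(16)]

for generation in range(32):      # 32 generations, as in [Jean]'s program
    population.sort(key=fitness)
    survivors = population[:4]    # keep the ones that work best
    # Refill the pool with small random changes to the survivors
    population = [[g + random.gauss(0, 0.05) for g in random.choice(survivors)]
                  for _ in range(16)]

best = min(population, key=fitness)
print(f"best fit: y = {best[0]:.3f}x + {best[1]:.3f}")
```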

While it is true that the BASIC program relies on stochastic methods to train, it does work, and it proves that certain machine learning models can be built effectively on limited hardware, in this case an 8-bit Atari running BASIC. In previous projects he’s also shown how similar computers can tackle other complex mathematical tasks. Of course an 8-bit machine like this won’t challenge OpenAI or Anthropic anytime soon, but finding more efficient ways of running complex computations is always a more challenging and rewarding problem to solve than buying more computing resources.

Continue reading “Genetic Algorithm Runs On Atari 800 XL”

Mobile Coffee Table Uses Legs To Get Around

For getting around on most surfaces, it’s hard to beat the utility of the wheel. Versatility, low cost, and the ability to be made from a wide array of materials have kept it a cornerstone technology for the past ten thousand years or so. But with that much history it can seem a little played out. To change up the locomotion game, you might want to consider robotic legs instead. That’s what [Giliam] designed into this mobile coffee table, which uses custom linkages to move its legs and get itself from place to place around the living room.

Continue reading “Mobile Coffee Table Uses Legs To Get Around”

[Image: cutaway of a 3D printed copper aerospike engine showing the intricate, organic-looking channels inside, vaguely reminiscent of a human torso and lungs.]

3D Printed Aerospike Was Designed By AI

We’re still in the early days of generatively designed objects, but when combined with the capabilities of 3D printing, we’re already seeing some interesting results. One example is this new copper aerospike engine. [via Fabbaloo]

A collaboration between the startups Hyperganic (generative AI CAD) and AMCM (additive manufacturing), this 800 mm long aerospike engine may be the most complicated 3D print yet, and it continues the exciting work being done with 3D printing for aerospace applications. The complicated geometry of a rocket nozzle of any type lets additive manufacturing really shine, so the combination of generative algorithms and 3D printed nozzles could produce some big leaps in the coming years.

Aerospikes are interesting because their performance, unlike that of the more typical bell-shaped rocket nozzle, isn’t tied to a single ambient pressure. One engine can stay efficient across the entire flight profile, instead of the traditional approach of switching between differently-optimized engines mid-flight. A linear aerospike engine was one of the main selling points of the cancelled VentureStar Space Shuttle replacement.
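The reason falls out of the textbook thrust relation for a nozzle, where a pressure-mismatch term penalizes any fixed exit geometry operating away from its design altitude:

```latex
% Nozzle thrust: momentum flux plus a pressure-mismatch term
F = \dot{m}\,v_e + \left(p_e - p_a\right)A_e
```

A bell’s expansion ratio fixes the exit pressure \(p_e\), so the \((p_e - p_a)\) term drifts off-design as the ambient pressure \(p_a\) falls during the climb; an aerospike’s exhaust expands against the surrounding air itself, so the plume adjusts on its own and keeps that loss small at every altitude.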

This isn’t the only generative design headed to space, and we’ve covered a few projects if you’re interested in building your own 3D printed rocket nozzles or aerospike engines. Just make sure you get clearance from your local aviation regulator before your project goes to space!

[Image: the underside of the Gen5X’s rotational base. A belt connects a pulley on the bottom of the stage to a stepper motor on the right. The stage rotates within an organic-looking carriage printed in bright orange PLA, which rides on two stainless steel rods mounted at the ends of the X-axis.]

5-Axis Printer Wants To Design Itself

RepRap 3D printers were designed with the ultimate goal of self-replicating machines. The generatively designed Gen5X printer by [Ric Real] brings the design step of that process closer to reality.

While 5-axis printing is old hat in CNC land, it remains relatively rare in the world of additive manufacturing. Starting with “a set of primitives… and geometric relationships,” [Real] ran the system through multiple generations to arrive at the current design. Since the design is generative, future variants could look quite different depending on which parameters you have the computer optimize.

The Gen5X uses the 5 Axis Slicer from DotX for slicing files and runs a RepRap Duet board with Duex expansion. Since the generative algorithm takes parametric inputs, it should be possible to have a Gen5X generated around the vitamins you already have on hand. With how fast AI is evolving, perhaps soon this printer will be able to completely design itself? For now, you’ll have to download the files and try it yourself.
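To make “generated around your vitamins” concrete, here’s a hypothetical Python sketch of parametric generation; the part names, dimensions, and formulas are ours for illustration, not from [Ric Real]’s actual files:

```python
# Hypothetical sketch: deriving printable geometry from the fixed hardware
# ("vitamins") on hand. All names and formulas here are illustrative.
from dataclasses import dataclass

@dataclass
class Vitamins:
    rod_diameter_mm: float   # smooth rod stock on hand
    rod_length_mm: float
    stepper: str             # e.g. "NEMA17"

def generate_frame(v: Vitamins) -> dict:
    # Reserve 20 mm at each rod end for a mounting block
    x_travel = v.rod_length_mm - 2 * 20.0
    return {
        "rod_bore_mm": v.rod_diameter_mm + 0.2,  # clearance fit for a printed bore
        "x_travel_mm": x_travel,
        "motor_mount": v.stepper,
    }

print(generate_frame(Vitamins(rod_diameter_mm=8.0, rod_length_mm=400.0,
                              stepper="NEMA17")))
```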

If you want to see more printers with more than three axes, check out the RotBot or Open5X.

Continue reading “5-Axis Printer Wants To Design Itself”

AI Learns To Drive Trackmania

Machine learning has long been a topic of interest for humanity, but only in recent years has the average person had broad access to enough computing power to dive in. [Yosh] recently decided to put an AI to work learning how to race in Trackmania.

After early experiments with supervised learning, [Yosh] decided to implement a genetic algorithm to produce an AI to drive in the game. The AI takes distances from the track walls as inputs, and produces steering and accelerator values as outputs. Starting with 100 AIs in generation 1, [Yosh] iterated by selecting the AIs that covered the longest distance in 13 seconds. Once the AIs started to get the hang of the first few corners, he changed the training to instead prioritize the lowest time taken to reach each checkpoint along the track.
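In outline, the setup might look something like this Python sketch; the sensor count, network shape, and mutation rate are guesses on our part, and the real fitness scores come from actually running each driver in the game:

```python
import math
import random

N_SENSORS = 8    # wall-distance rays; the real count is our assumption
POP_SIZE = 100   # 100 AIs in generation 1, per the article

def new_genome():
    # Weights plus a bias for each of the two outputs: steering and accelerator
    return [random.uniform(-1, 1) for _ in range(2 * (N_SENSORS + 1))]

def act(genome, distances):
    k = N_SENSORS + 1
    steer = math.tanh(sum(w * d for w, d in zip(genome[:N_SENSORS], distances))
                      + genome[N_SENSORS])
    accel = math.tanh(sum(w * d for w, d in zip(genome[k:k + N_SENSORS], distances))
                      + genome[2 * N_SENSORS + 1])
    return steer, accel          # both in [-1, 1]

def next_generation(population, fitnesses, keep=10):
    # Rank by distance covered (later: by checkpoint times), mutate the best
    ranked = sorted(zip(fitnesses, population), key=lambda t: t[0], reverse=True)
    elites = [g for _, g in ranked[:keep]]
    children = [[w + random.gauss(0, 0.1) for w in random.choice(elites)]
                for _ in range(POP_SIZE - len(elites))]
    return elites + children
```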

The AI improved over time; across 100 generations it got down to a 23.48s time on the test track, versus 19.63s for [Trabadia], a talented human. We’d love to see how much better the AI could do with more training. [Yosh] is trying more experiments, like providing extra feedback in the AI fitness function to keep it from hitting the walls. It’s not the first time we’ve seen a genetic algorithm used to train a racing AI, either. Video after the break.

Continue reading “AI Learns To Drive Trackmania”

Neural Networks And MarI/O

Minecraft wizard and Super Mario World speedrun record holder [SethBling] is experimenting with machine learning. He built a program that gets Mario through an entire level of Super Mario World – Donut Plains 1 – using neural networks and genetic algorithms.

A neural network simply takes an input, in this case a small grid representing the sprites around Mario in the game it’s playing, sends that input through a series of artificial neurons, and turns the result into commands for the controller. It’s an exceedingly simple neural network – the network that can get Mario through an entire level is fewer than a dozen neurons – but with enough training, even simple networks can accomplish very complex tasks.
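For a sense of scale, here’s a toy Python version of that idea: a single layer of threshold neurons mapping a sprite grid to buttons. The grid size and button set are our assumptions, and MarI/O evolves its network wiring rather than using a fixed layer like this:

```python
import random

GRID = 13 * 13                     # downsampled sprite view around Mario (assumed)
BUTTONS = ["left", "right", "A", "B"]

# One weight vector per button; MarI/O evolves these instead of training them
weights = [[random.uniform(-1, 1) for _ in range(GRID)] for _ in BUTTONS]

def forward(sprites):
    # sprites: GRID values, e.g. -1 for an enemy, 0 empty, 1 for a solid tile
    pressed = []
    for button, w in zip(BUTTONS, weights):
        activation = sum(wi * si for wi, si in zip(w, sprites))
        if activation > 0:         # threshold neuron: press if net input is positive
            pressed.append(button)
    return pressed

print(forward([random.choice([-1, 0, 1]) for _ in range(GRID)]))
```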

To train the network – that is, to weight the connections between inputs, neurons, and outputs – [SethBling] is using an evolutionary algorithm. This algorithm first generates a few random neural networks, watches Mario’s progress across Donut Plains 1, and assigns a fitness value to each net. The best networks of each generation are combined, and the process continues for the next generation. It took 34 generations before MarI/O could finish the level without dying.
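That “combined” step is crossover. A minimal Python version, assuming each network has been flattened to a plain list of weights (the real thing also evolves the network topology itself), could look like this:

```python
import random

def crossover(parent_a, parent_b):
    # Child inherits each connection weight from one parent or the other at random
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]

def breed(elites, pop_size=20, mutation=0.05):
    # Combine the best networks of the generation, then jitter the offspring
    children = []
    while len(children) < pop_size:
        a, b = random.sample(elites, 2)
        children.append([w + random.gauss(0, mutation) for w in crossover(a, b)])
    return children
```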

A few members of the Internet’s peanut gallery have pointed to a paper and YouTube video by [Tom Murphy] that generalized a completely different technique to play a whole bunch of NES games. While both [SethBling]’s and [Tom Murphy]’s algorithms use certain variables to measure their own success, [Tom Murphy]’s technique works nearly automatically: it will play about as well as the training data it is given. [SethBling]’s algorithm requires no training data – that’s the entire point of using a genetic algorithm.

Continue reading “Neural Networks And MarI/O”

Genetic Algorithm Programmer Gets Functions

[Kory] has been writing genetic algorithms for a few months now. This in itself isn’t anything unique or exceptional, except for what he’s getting these genetic algorithms to do. [Kory] has been using genetic algorithms to write programs in Brainfuck. Yes, it’s a computer programming a computer. Be thankful Skynet is 18 years late.

When we first saw [Kory]’s work, he had programmed a computer to write and run its own programs in Brainfuck. Although the name of the language [Kory] chose could use some work, it’s actually the ideal language for computer-generated programs. With only eight commands, each consisting of a single character, it greatly reduces the overhead of what any genetic algorithm must produce and what a fitness function must evaluate.
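To see just how small the language is, the whole thing fits in a screenful of interpreter. Here’s a quick Python version of the eight commands, using the common convention of byte cells that wrap at 255:

```python
def run_brainfuck(code, stdin=""):
    """Interpret Brainfuck's eight single-character commands."""
    tape, ptr, pc, out = [0] * 30000, 0, 0, []
    inp = list(stdin)
    jumps, stack = {}, []
    for i, c in enumerate(code):              # pre-match the [ ] loop brackets
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(code):
        c = code[pc]
        if c == ">":
            ptr += 1                           # move the data pointer right
        elif c == "<":
            ptr -= 1                           # move the data pointer left
        elif c == "+":
            tape[ptr] = (tape[ptr] + 1) % 256  # byte cells wrap at 255
        elif c == "-":
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".":
            out.append(chr(tape[ptr]))         # output the current cell
        elif c == ",":
            tape[ptr] = ord(inp.pop(0)) if inp else 0
        elif c == "[" and tape[ptr] == 0:
            pc = jumps[pc]                     # zero: skip past the loop
        elif c == "]" and tape[ptr] != 0:
            pc = jumps[pc]                     # nonzero: back to the loop start
        pc += 1
    return "".join(out)

# Prints "A": 8 * 8 + 1 = 65, the ASCII code for "A"
print(run_brainfuck("++++++++[>++++++++<-]>+."))
```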

There was one shortcoming to [Kory]’s initial efforts: functions. It’s relatively easy to get a program to say Hello World, but to do something complex, you’re going to need something like a macro or a function. Brainfuck, in its simplest form, doesn’t support functions. This throws a wrench in [Kory]’s plan to have his computer-programming computer grow smarter and escape local optima in its genetic algorithms.

The solution to this problem was the creation of a new dialect of Brainfuck [Kory] calls BrainPlus. This takes the best parts of Extended Brainfuck and adds a command that basically serves as a break statement.

With this, [Kory]’s self-programming computer can develop more complex programs. Already it has created a program to generate the first few numbers of the Fibonacci sequence. It only goes up to 233 because the next term, 377, won’t fit in a byte whose maximum value is 255, and the program itself took seven hours to generate. It does, however, work. Other programs generated with the new BrainPlus functions include reciting “99 Bottles of Beer on the Wall” and multiplying two values.

Even though [Kory]’s computer spends a long time generating these programs, given enough time there’s really not much it can’t do. Brainfuck, and with it [Kory]’s BrainPlus, is Turing complete, so given infinite memory and time it can compute anything computable. With the new addition of functions, it can simply compute it faster.

All the code for [Kory]’s GA is available on GitHub.