A 3D printed copper aerospike engine cutaway showing the intricate, organic-looking channels inside. It is vaguely reminiscent of a human torso and lungs.

3D Printed Aerospike Was Designed By AI

We’re still in the early days of generative design, but combined with the capabilities of 3D printing, it’s already producing some interesting results. One example is this new copper aerospike engine. [via Fabbaloo]

A collaboration between the startups Hyperganic (generative AI CAD) and AMCM (additive manufacturing), this 800 mm long aerospike engine may be the most complicated 3D print yet. It continues the exciting work being done with 3D printing for aerospace applications. The intricate geometries of rocket nozzles of any type are where additive manufacturing really shines, so the combination of generative algorithms and 3D printed nozzles could result in some big leaps in the coming years.

Aerospikes are interesting because, unlike the more typical bell-shaped rocket nozzle, their geometry isn’t tuned to a single ambient pressure, meaning you only need one engine for your entire flight profile instead of the traditional mid-flight switch to a vacuum-optimized engine. A linear aerospike engine was one of the main selling points of the cancelled VentureStar Space Shuttle replacement.
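
The pressure dependence falls right out of the textbook rocket thrust equation (standard notation, nothing specific to this engine):

```latex
F = \dot{m}\,v_e + (p_e - p_a)\,A_e
```

Here m-dot is the propellant mass flow, v_e the exhaust velocity, p_e the nozzle exit pressure, p_a the ambient pressure, and A_e the nozzle exit area. A bell fixes A_e, so the (p_e − p_a) term only vanishes at one design altitude; an aerospike’s plume has a free outer boundary, which lets p_e track p_a as the rocket climbs.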

This isn’t the only generative design headed to space, and we’ve covered a few projects if you’re interested in building your own 3D printed rocket nozzles or aerospike engines. Just make sure you get clearance from your local aviation regulator before your project goes to space!

The underside of the rotational base of the Gen5X 3D printer. A belt connects a pulley on the bottom of the stage to a stepper motor on the right side. The carriage for the stage looks organic in nature and is printed in bright orange PLA. The stage can rotate within the carriage which is mounted on two stainless steel rods connected to teal mounting points on either side of the printer (ends of the X-axis).

5-Axis Printer Wants To Design Itself

RepRap 3D printers were designed with the ultimate goal of self-replicating machines. The generatively-designed Gen5X printer by [Ric Real] brings the design step of that process closer to reality.

While 5-axis printing is old hat in CNC land, it remains relatively rare in the world of additive manufacturing. Starting with “a set of primitives… and geometric relationships,” [Real] ran the system through multiple generations to arrive at its current design. Since this is a generative design, future variants could look different depending on which parameters you have the computer optimize.

The Gen5X uses the 5 Axis Slicer from DotX for slicing files and runs a RepRap Duet board with a Duex expansion. Since the generative algorithm uses parametric inputs, it should be possible to have a Gen5X generated based on the vitamins you already have on hand. With how fast AI is evolving, perhaps this printer will soon be able to completely design itself? For now, you’ll have to download the files and try it yourself.
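
As a rough illustration of what “parametric inputs” means here (a minimal sketch with hypothetical names, not [Real]’s actual code), the generator is essentially a function from the parts you already own to printable geometry:

```python
from dataclasses import dataclass

@dataclass
class Vitamins:
    """The non-printed parts on hand; hypothetical parameter names."""
    rod_diameter_mm: float = 8.0   # smooth rod for the X-axis
    stepper_nema: int = 17         # NEMA frame size of spare steppers
    belt_pitch_mm: float = 2.0     # GT2 belt

def generate_carriage(v: Vitamins) -> dict:
    """Derive key carriage dimensions from the available vitamins."""
    faceplate = {14: 35.2, 17: 42.3, 23: 56.4}[v.stepper_nema]  # approx. mm
    return {
        "rod_bore_mm": v.rod_diameter_mm + 0.4,  # clearance fit on the rods
        "motor_pocket_mm": faceplate + 1.0,      # room around the stepper
        "pulley_pitch_mm": v.belt_pitch_mm,
    }

print(generate_carriage(Vitamins(stepper_nema=17)))
```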

If you want to see more printers with more than three axes, check out the RotBot or Open5X.

AI Learns To Drive Trackmania

Machine learning has long been a topic of interest for humanity, but only in recent years have we had broad access to enough computing power to let the average person dive in. [Yosh] recently decided to put an AI to work learning how to race in Trackmania.

After early experiments with supervised learning, [Yosh] decided to implement a genetic algorithm to produce an AI to drive in the game. The AI takes the distances to the track walls as inputs and produces steering and accelerator values as outputs. Starting with 100 AIs in generation 1, [Yosh] iterated by choosing the AIs that covered the longest distance in 13 seconds. Once the AIs started to get the hang of the first few corners, he changed the training to instead prioritize the lowest time taken to traverse each of the checkpoints along the track.
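
A minimal sketch of that loop, with the game itself stubbed out (in [Yosh]’s real setup the fitness run happens inside Trackmania, not in Python):

```python
import random

POP, RAYS = 100, 8  # 100 drivers per generation, 8 wall-distance sensors

def random_genome():
    # One weight per (sensor, output) pair; outputs are steering and throttle
    return [[random.uniform(-1, 1) for _ in range(RAYS)] for _ in range(2)]

def drive(genome, distances):
    steer = sum(w * d for w, d in zip(genome[0], distances))
    throttle = sum(w * d for w, d in zip(genome[1], distances))
    return steer, throttle

def fitness(genome):
    # Stub: the real version runs the car for 13 seconds and returns the
    # distance covered (later, the time taken between checkpoints instead)
    return sum(sum(row) for row in genome)  # placeholder score

def mutate(genome, sigma=0.1):
    return [[w + random.gauss(0, sigma) for w in row] for row in genome]

population = [random_genome() for _ in range(POP)]
for generation in range(100):
    ranked = sorted(population, key=fitness, reverse=True)
    elite = ranked[:POP // 10]  # the ten best drivers survive
    population = elite + [mutate(random.choice(elite))
                          for _ in range(POP - len(elite))]
```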

The AI improved over time, and over 100 generations got down to a 23.48 s time on the test track, versus 19.63 s for [Trabadia], a talented human. We’d love to see how much better the AI could do with more training. [Yosh] is trying more experiments, like providing extra feedback in the AI’s fitness function to keep it from hitting the walls. It’s not the first time we’ve seen a genetic algorithm used to train a racing AI, either. Video after the break.

Neural Networks And MarI/O

Minecraft wizard and Super Mario World speedrun record holder [SethBling] is experimenting with machine learning. He built a program that gets Mario through an entire level of Super Mario World – Donut Plains 1 – using neural networks and genetic algorithms.

A neural network simply takes an input, in this case a small graphic representing the sprites in the game it’s playing, sends that input through a series of artificial neurons, and turns the result into commands for the controller. It’s an exceedingly simple neural network – the one that gets Mario through an entire level uses fewer than a dozen neurons – but with enough training, even simple networks can accomplish very complex tasks.
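
At that scale the network barely deserves the name; here’s a hedged sketch of a single artificial neuron turning tile inputs into a button press (the weights are made up for illustration, where MarI/O would evolve them):

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum squashed into (0, 1); treat > 0.5 as "press the button"
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

# A strip of tiles near Mario: 1 = solid block, -1 = enemy sprite, 0 = empty
tiles = [0, 1, 1, 0, -1, 0]

# Hand-picked weights for illustration only; MarI/O evolves these instead
right = neuron(tiles, [0.5, 0.2, 0.2, 0.1, -0.9, 0.0], bias=0.3)
jump = neuron(tiles, [0.0, 0.6, 0.6, 0.0, 1.2, 0.0], bias=-0.5)

print({"right": right > 0.5, "A": jump > 0.5})
```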

To train the network, that is, to weight the connections between inputs, neurons, and outputs, [SethBling] is using an evolutionary algorithm. This algorithm first generates a few random neural networks, watches Mario’s progress across Donut Plains 1, and assigns a fitness value to each net. The best networks of each generation are combined, and the process continues for the next generation. It took 34 generations before MarI/O could finish the level without dying.
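
A simplified picture of one generation of that process (note that [SethBling]’s MarI/O is built on the NEAT algorithm, which evolves the network’s topology as well as its weights; this sketch only evolves a flat weight vector):

```python
import random

def mutate(weights, sigma=0.05, rate=0.1):
    # Nudge a random subset of the connection weights
    return [w + random.gauss(0, sigma) if random.random() < rate else w
            for w in weights]

def crossover(a, b):
    # Child inherits each connection weight from one parent at random
    return [wa if random.random() < 0.5 else wb for wa, wb in zip(a, b)]

def next_generation(population, fitness, keep=10):
    # fitness(net) would measure Mario's progress across Donut Plains 1
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:keep]
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(len(population) - keep)]
    return parents + children
```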

A few members of the Internet’s peanut gallery have pointed to a paper/YouTube video by [Tom Murphy] that generalized a completely different technique to play a whole bunch of different NES games. While both [SethBling]’s and [Tom Murphy]’s algorithms use certain variables to determine their own success, [Tom Murphy]’s technique works nearly automatically; it will play about as well as the training data it is given. [SethBling]’s algorithm requires no training data – that’s the entire point of using a genetic algorithm.

Genetic Algorithm Programmer Gets Functions

[Kory] has been writing genetic algorithms for a few months now. This in itself isn’t anything unique or exceptional, except for what he’s getting these genetic algorithms to do. [Kory] has been using genetic algorithms to write programs in Brainfuck. Yes, it’s a computer programming a computer. Be thankful Skynet is 18 years late.

When we first saw [Kory]’s work, he had programmed a computer to write and run its own programs in Brainfuck. Although the name of the language [Kory] chose could use some work, it’s actually the ideal language for computer-generated programs. With only eight commands, each consisting of a single character, it greatly reduces the overhead of what any genetic algorithm must produce and what a fitness function must evaluate.
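
The whole language fits in a screenful of interpreter, which is why a fitness function can afford to execute thousands of candidate programs per generation. A minimal sketch (the ‘,’ input command is omitted, and execution is step-limited so evolved infinite loops can’t hang the run):

```python
def run_bf(code, max_steps=10000):
    """Minimal Brainfuck interpreter: byte cells, no ',' input, and a step
    limit so broken evolved programs can't loop forever. Assumes the
    brackets in `code` are balanced."""
    tape, ptr, pc, out = [0] * 30000, 0, 0, []
    jumps, stack = {}, []  # pre-match brackets so jumps are O(1)
    for i, c in enumerate(code):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    steps = 0
    while pc < len(code) and steps < max_steps:
        c = code[pc]
        if c == '>':
            ptr += 1
        elif c == '<':
            ptr -= 1
        elif c == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.':
            out.append(chr(tape[ptr]))
        elif c == '[' and tape[ptr] == 0:
            pc = jumps[pc]
        elif c == ']' and tape[ptr] != 0:
            pc = jumps[pc]
        pc += 1
        steps += 1
    return ''.join(out)

print(run_bf('++++++++[>++++++++<-]>+.'))  # prints 'A' (cell reaches 65)
```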

There was one shortcoming to [Kory]’s initial efforts: functions. It’s relatively easy to get a program to say Hello World, but to do something complex, you’re going to need something like a macro or a function. Brainfuck, in its most simple form, doesn’t support functions. This throws a wrench in [Kory]’s plan to have his computer-programming computer grow smarter and get over local minima in its genetic algorithms.

The solution to this problem was the creation of a new dialect of Brainfuck [Kory] calls BrainPlus. This takes the best parts of Extended Brainfuck and adds a command that basically serves as a break statement.

With this, [Kory]’s self-programming computer can develop more complex programs. Already it has created a program to generate the first few numbers of the Fibonacci sequence. It only goes up to 233 because 255 is the maximum value for a byte, and the program itself took seven hours to generate. It does, however, work. Other programs generated with the new BrainPlus functions include reciting ‘99 Bottles of Beer on the Wall’ and a program that multiplies two values.

Even though [Kory]’s computer spends a long time generating these programs, given enough time, there’s really not much it can’t do. Brainfuck, and [Kory]’s BrainPlus, are Turing complete, so given infinite memory and time they can compute anything. With the new addition of functions, they can simply compute it faster.

All the code for [Kory]’s GA is available on GitHub.

Genetic Algorithms Become Programmers Themselves

[Kory] has been experimenting with genetic algorithms. Normally we’d expect his experiments to deal with tuning the variables in a control system or something, but he’s doing something much cooler. [Kory] is using genetic algorithms to write computer programs, and in the process bringing us one step closer to the Singularity.

The first experiments with genetic algorithms generating applications did so in BASIC, C, and other human-readable languages. While these programs nearly worked, there were far too many limitations on what could be produced with a GA. A simpler language was needed, and after turning to assembly for a hot second, [Kory] ended up using brainfuck, an extremely minimal but still Turing-complete language.

The use of brainfuck for creating programs from a genetic algorithm may seem a bit strange, but there’s a method to [Kory]’s madness. It’s relatively simple to write an interpreter that the GA’s fitness function can hook into, scoring each candidate to decide which programs should breed and which should die. The simplicity of brainfuck also means the computer barely has to learn any syntax or grammar at all.
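
A minimal sketch of such a fitness function, using byte distance to a target string (the scoring is illustrative, not [Kory]’s exact formula; `run_bf` stands for any step-limited brainfuck interpreter like the one sketched earlier):

```python
TARGET = "hello world"

def fitness(program, run_bf):
    """Higher is better; a perfect program scores 256 * len(TARGET)."""
    out = run_bf(program)  # step-limited, so broken programs just score badly
    score = 0
    for i, want in enumerate(TARGET):
        got = out[i] if i < len(out) else '\0'
        score += 256 - abs(ord(want) - ord(got))  # closer bytes score higher
    return score

def select(population, run_bf):
    # The top half breeds; the bottom half dies
    ranked = sorted(population, key=lambda p: fitness(p, run_bf), reverse=True)
    return ranked[:len(ranked) // 2]
```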

Right now, [Kory]’s self-programming computer only writes simple ‘hello world’ programs. It should be possible, though, for this AI to create programs that take user input and generate an output, whatever that may be. Once [Kory] is able to have the computer generate its own fitness functions, the sky is the limit and the Singularity will be fast approaching.

On Not Designing Circuits With Evolutionary Algorithms

[Henrik] has been working on a program to design electronic circuits using evolutionary algorithms. It’s still very much a work in progress, but he’s gotten to the point of generating a decent BJT inverter after 78 generations (9 minutes of compute time), as shown in the .gif above.

To evolve these circuits, [Henrik] told a SPICE simulation to generate an inverter with a 5 V power supply, 2N3904 and 2N3906 transistors, and whatever resistors were needed. The first dozen or so generations didn’t actually do anything, but after 2000 generations the algorithm produced a circuit nearly identical to the complementary transistor inverter (the BJT equivalent of the CMOS inverter) you’d find in a circuit textbook.
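
The shape of the loop is easy to sketch even without [Henrik]’s code: render a candidate into a netlist, hand it to SPICE, and score how inverter-like the simulated output is. A rough sketch, assuming ngspice is installed and run in batch mode (the netlist template and the scoring stub are illustrative, not his actual setup):

```python
import random
import subprocess
import tempfile

def netlist(r1, r2):
    # Toy template: complementary 2N3904/2N3906 pair, evolvable base resistors
    return f"""* evolved inverter candidate
V1 vcc 0 5
Vin in 0 DC 0
R1 in b1 {r1:.0f}
R2 in b2 {r2:.0f}
Q1 out b1 vcc PNPGEN
Q2 out b2 0 NPNGEN
.model NPNGEN NPN
.model PNPGEN PNP
.dc Vin 0 5 0.5
.print dc v(out)
.end
"""

def fitness(candidate):
    r1, r2 = candidate
    with tempfile.NamedTemporaryFile('w', suffix='.cir', delete=False) as f:
        f.write(netlist(r1, r2))
        path = f.name
    result = subprocess.run(['ngspice', '-b', path],
                            capture_output=True, text=True)
    # Real scoring would parse v(out) from the sweep and reward ~5 V out at
    # 0 V in and ~0 V out at 5 V in; the parsing is stubbed out here.
    return 1.0 if result.returncode == 0 else 0.0

population = [(random.uniform(1e3, 1e5), random.uniform(1e3, 1e5))
              for _ in range(20)]
best = max(population, key=fitness)
```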

Using evolution to guide electronic design is nothing new; an evolutionary algorithm and a few bits of Verilog can turn an FPGA into a chip that can tell the difference between a 1 kHz and a 10 kHz tone with extremely minimal hardware requirements. Some very, very strange stuff happened in that experiment, too; the evolutionary algorithm exploited behavior that would be impossible for a human to program deliberately, relying on parasitic analog effects inside the FPGA.

[Henrik] says his algorithm didn’t test for how much current goes through the transistors, so implementing this circuit outside of a simulation will destroy the transistors and emit a puff of blue smoke. If you’d like to design your own circuits using evolution, [Henrik] has put all the code in a Git repository for your perusal. It’s damn cool as it stands now, and once [Henrik] includes checks for the current and voltage in each component, his project may actually be useful.