Here’s Code For That AI-Generated Minecraft Clone

A little while ago Oasis was showcased on social media, billing itself as the world’s first playable “AI video game” that responds to complex user input in real time. Code is available on GitHub for a down-scaled local version if you’d like to take a look. There’s a bit more detail and background in the accompanying project write-up, which covers both the potential and the numerous limitations.

We suspect the focus on supporting complex user input (such as mouse look and an item inventory) is what the creators feel distinguishes it meaningfully from AI-generated DOOM. The latter was a concept that demonstrated AI image generators could (kinda) function as real-time game engines.

Image generators are, in a sense, prediction machines. The idea is that, given a short history of what just happened plus the user’s input as context, a trained model can generate a pretty usable prediction of what should happen next, and do it quickly enough to be interactive. Run that in a loop, and you get some pretty impressive clips to put on social media.
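To make that loop concrete, here’s a minimal sketch in C. Every name in it is a hypothetical stand-in (the real Oasis model is a trained neural network, not a C API), but the control flow is the interesting part: there is no world state and no physics, just a rolling window of recent frames fed back into a predictor.

```c
/* A minimal sketch of a frame-prediction game loop. All names here are
 * hypothetical stand-ins: the real Oasis model is a trained neural
 * network, not a C API. Note what's missing: there is no world state,
 * no physics, only recent frames plus the latest input. */
#include <stdint.h>
#include <string.h>

#define HISTORY 16  /* frames of context the model sees */

typedef struct { uint8_t rgb[256 * 256 * 3]; } Frame;
typedef struct { float mouse_dx, mouse_dy; uint32_t buttons; } Input;

/* Hypothetical model and platform layer, assumed to live elsewhere. */
extern void model_predict(const Frame history[HISTORY], Input in, Frame *out);
extern Input poll_input(void);
extern void present(const Frame *f);

void game_loop(void)
{
    static Frame history[HISTORY];  /* rolling context window */
    static Frame next;
    for (;;) {
        Input in = poll_input();
        /* Predict the next frame from recent history plus user input. */
        model_predict(history, in, &next);
        present(&next);
        /* Slide the window forward: the oldest frame falls off. */
        memmove(&history[0], &history[1], sizeof(Frame) * (HISTORY - 1));
        history[HISTORY - 1] = next;
    }
}
```

That sliding window is also the engine’s only memory, which goes a long way toward explaining the failure mode described below.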

It is a neat idea, and we certainly applaud the creativity of bending an image generator to this kind of application, but we can’t help but notice the limitations. Sit and stare at something, or walk through a dark or repetitive area, and the system loses its grip and things rapidly go into a downward spiral we can only describe as “dreamily broken”.

It may be more a demonstration of a concept than a properly functioning game, but it’s still a very clever way to leverage image generation technology. Although, if you’d prefer AI to keep the game itself untouched take a look at neural networks trained to use the DOOM level creator tools.

VR Technology Helps Bring A Galaxy Far, Far Away To Our TV

Virtual reality is usually an isolated individual experience very different from the shared group experience of a movie screen or even a living room TV. But those worlds of entertainment are more closely intertwined than most audiences are aware. Video game engines have been taking a growing role in film and television production behind the scenes, and now they’re stepping out in front of the camera in a big way in the making of The Mandalorian TV series.

Big in this case is a three-quarters cylindrical LED array 75 ft (23 m) in diameter and 20 ft (6 m) high. But the LEDs covering its walls and ceiling aren’t pointing outwards like some installation in Times Square. This setup, called the Volume, points inward to display background images for the camera and crew working within. It’s an immersive LED backdrop and stage environment.

Incorporating projected imagery on stage is a technique going back at least as far as 1933’s King Kong, but it is very limited. Lighting and camera motion have to be tightly constrained to avoid breaking the fragile illusion. More recently, productions have favored green screens replaced with computer imagery in post-production. That removes most of the camera motion and lighting constraints, but costs a lot of money and time. It is also more difficult for actors to perform their roles convincingly against big blank slabs of green. The Volume solves all of those problems by putting computer-generated imagery on set, rendered in real time by the Unreal video game engine.


The Arduboy, Ported To Desktop And Back Again

A neat little hacker project that’s been flying off the workbenches recently is the Arduboy. This tiny game console looks like a miniaturized version of the O.G. Game Boy, but it is explicitly designed to be hacked. It’s basically an Arduino board with a display and a few buttons, anyway.

[rv6502] got their hands on an Arduboy and realized that while there were some 3D games, there was nothing that had filled polygons, or really anything resembling a modern 3D engine. This had to be rectified, and the result is pretty close to Star Fox on a microcontroller.

This project began with a simple test on the Arduboy to see if it would even be possible to render 3D objects at any reasonable speed. That test was just a rotating cube, and everything looked good. Then began a long process of figuring out how fast the engine could go, what kind of display mode would suit the OLED best, and how to interact with a 3D world using the limited controls.
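For a feel of what that first test involves, here’s a stand-alone spinning-cube check in C. This is our illustration rather than [rv6502]’s actual code; it uses plain floating point and printf output for readability, where an engine on an 8-bit AVR would want fixed-point math to stay fast.

```c
/* Rotate a unit cube's vertices about the Y axis and project them onto
 * a 128x64 screen. An illustrative desktop-side check, not code from
 * the actual engine. */
#include <math.h>
#include <stdio.h>

static const float cube[8][3] = {
    {-1,-1,-1}, { 1,-1,-1}, { 1, 1,-1}, {-1, 1,-1},
    {-1,-1, 1}, { 1,-1, 1}, { 1, 1, 1}, {-1, 1, 1},
};

int main(void)
{
    float angle = 0.3f;                    /* rotation about Y, radians */
    float s = sinf(angle), c = cosf(angle);
    for (int i = 0; i < 8; i++) {
        float x =  cube[i][0] * c + cube[i][2] * s;
        float y =  cube[i][1];
        float z = -cube[i][0] * s + cube[i][2] * c + 4.0f; /* push into view */
        /* Simple perspective divide onto the 128x64 OLED's coordinates. */
        int sx = 64 + (int)(64.0f * x / z);
        int sy = 32 + (int)(32.0f * y / z);
        printf("vertex %d -> (%3d, %3d)\n", i, sx, sy);
    }
    return 0;
}
```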

Considering this is a fairly significant engineering project, the fastest way to produce working code isn’t to debug it on a microcontroller. This project demanded a native PC port, so all the testing could happen on the PC without having to reflash the microcontroller every time. That allowed [rv] to throw out the Arduino IDE and USB library; if you’re writing everything on a PC and only uploading a hex file to a microcontroller at the end, you simply don’t need them.
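The usual shape of such a port is a thin platform layer selected at compile time. Here’s a sketch of the idea; the function names are our reconstruction of the technique, not [rv6502]’s actual code.

```c
/* Sketch of a compile-time platform layer: the engine calls one
 * display_present(), and the build decides whether that hits real
 * hardware or a desktop stand-in. Names are illustrative only. */
#include <stdint.h>

#ifdef __AVR__
/* Hardware build: hand the 128x64 1-bpp framebuffer to the SPI driver
 * that talks to the OLED (provided elsewhere in the firmware). */
extern void ssd1306_send_framebuffer(const uint8_t *fb);
void display_present(const uint8_t *fb) { ssd1306_send_framebuffer(fb); }
#else
/* Native build: dump the framebuffer as ASCII art so the engine can be
 * run, tested, and debugged on a PC with ordinary tools. */
#include <stdio.h>
void display_present(const uint8_t *fb)
{
    for (int y = 0; y < 64; y++) {
        for (int x = 0; x < 128; x++)
            putchar((fb[(y >> 3) * 128 + x] >> (y & 7)) & 1 ? '#' : '.');
        putchar('\n');
    }
}
#endif
```

With the display, input, and timing hidden behind an interface like this, the Arduino IDE and its USB stack drop out of the build entirely; the hardware target only needs a cross-compiler and a hex upload at the very end.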

One of the significant advances in the Arduboy’s graphics capability comes from exploring the addressing modes of the OLED. By default, the display is in a ‘horizontal mode’ which works for 2D blitting, but not for rasterizing polygons. The ‘vertical addressing mode’, on the other hand, allows for a block of memory, 8 × 128 bytes, that maps directly to the display. Shove those bytes over, and there’s no math necessary to display an image.
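Here’s roughly what that trick looks like, with the mode-select command values taken from the SSD1306 datasheet (the controller driving the Arduboy’s OLED) and the surrounding function names our own:

```c
/* Put the SSD1306 into vertical addressing mode so a 128x8-byte buffer
 * streams straight out with no remapping. Command values are from the
 * SSD1306 datasheet; the I/O helpers are assumed to exist elsewhere. */
#include <stdint.h>

extern void ssd1306_command(uint8_t c);  /* platform-specific */
extern void ssd1306_data(uint8_t d);

static uint8_t fb[128 * 8];              /* 128x64 pixels, 1 bit each */

void display_init_vertical(void)
{
    ssd1306_command(0x20);               /* Set Memory Addressing Mode */
    ssd1306_command(0x01);               /* 0x01 = vertical addressing */
}

/* In this layout every screen column is 8 contiguous bytes, which is
 * exactly the shape a column-walking polygon rasterizer wants. */
static inline void set_pixel(uint8_t x, uint8_t y)
{
    fb[x * 8 + (y >> 3)] |= (uint8_t)(1 << (y & 7));
}

void display_flush(void)
{
    for (uint16_t i = 0; i < sizeof fb; i++)
        ssd1306_data(fb[i]);             /* no coordinate math needed */
}
```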

This is, quite simply, one of the best software development projects we’ve seen. It’s full of clever tricks (like not doing math if you’ll never need the result) and stuffing animations into far fewer bytes than you would expect. You can check out the demo video below.
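The write-up has the full list of tricks, but one classic instance of “not doing math you’ll never need” in a 3D engine is backface culling: a couple of multiplies on a triangle’s projected vertices can save the entire cost of rasterizing it. A sketch, with names of our own invention rather than anything from the actual engine:

```c
/* Backface culling: skip a triangle entirely if its projected winding
 * says it faces away from the camera. Illustrative names only; which
 * sign counts as "front" depends on your winding convention. */
#include <stdint.h>

typedef struct { int16_t x, y; } P2;

static inline int is_front_facing(P2 a, P2 b, P2 c)
{
    /* Twice the signed area of the projected triangle. */
    int32_t area = (int32_t)(b.x - a.x) * (c.y - a.y)
                 - (int32_t)(b.y - a.y) * (c.x - a.x);
    return area > 0;
}

/* Usage in a rasterizer:
 *   if (!is_front_facing(a, b, c))
 *       return;   // no spans walked, no pixels touched, no further math
 */
```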


FiSSION 3D Game Engine For Wii Homebrew

Video demo: http://www.youtube.com/watch?v=Xd1DrN2PDt4

[PunMaster] wrote in to tell us that he has just released the first public demo of the FiSSION Project, a homebrew 3D game engine for the Wii. He’s hoping it will make development easier for other people who want to get into the Wii hacking scene. The project was originally spun out of similar work he was doing with XNA for the Xbox 360. This is just a demo to generate interest in the project and hopefully gather some feedback on what’s needed to make a full release possible.