Decompiling Sonic Runners

Usually, when you hear about games being decompiled and rebuilt, they're decades-old relics, lovingly saved from the ravages of time. [MattKC] recently set out to decompile the much more recent 2015 game Sonic Runners.

The game was a 2D endless runner released on mobile platforms. Despite praise for the gameplay, it received mixed reviews for its pop-up ads and pay-to-play elements. A little over a year later, the game was discontinued. Since it required a constant online connection, taking the servers offline rendered its more than five million downloads unplayable.

A team of developers worked to reverse engineer the server, and with a little binary hacking, the client could be patched to connect to a community-hosted server instead. However, as phones with notched displays came out and suggestions for improvements stacked up, the community realized a new client would bring immense benefits. Compared to many decompilation projects, Sonic Runners was a relatively easy target because it uses Unity, which means most of the code is in C#. Unfortunately, the game's build of Unity dates from 2012, so many of the tools designed for much later versions of Unity simply wouldn't work.
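To give a flavor of why a rebuilt client beats binary patching, here is a purely illustrative sketch of the kind of change that becomes a one-liner once you have decompiled C# source. The class name and URLs are invented for this example and are not taken from the actual game.

```csharp
// Hypothetical example only: the names below are invented for
// illustration and do not come from the Sonic Runners code.
public static class ServerConfig
{
    // Original client: hardcoded official endpoint, long since offline.
    // public const string BaseUrl = "https://runners.example-official.com/";

    // Rebuilt client: point at a community-hosted server instead.
    public const string BaseUrl = "https://runners.example-community.net/";
}
```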

However, one native code library, called UnmanagedProcess, was designed to confuse reverse engineering efforts. The library handled AES encryption and communication with the server. Luckily, the library was a later addition, and earlier versions of its functions still lingered in the C# code. Since an open source server already existed, validating the changes was trivial. Additionally, all the shaders were in OpenGL Shading Language (GLSL), which meant rewriting them in High-Level Shading Language (HLSL) and checking that they matched the original GLSL when building for Android.
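For reference, here is a minimal sketch of what client-side AES looks like in C#. The mode, padding, and key handling below are assumptions for illustration; the write-up doesn't document the parameters the game actually used.

```csharp
using System.IO;
using System.Security.Cryptography;

// Minimal AES sketch, assuming CBC mode with a pre-shared key and IV;
// the game's real scheme lived in the UnmanagedProcess native library.
public static class PayloadCrypto
{
    public static byte[] Encrypt(byte[] plaintext, byte[] key, byte[] iv)
    {
        using (var aes = Aes.Create())
        {
            aes.Key = key; // 16, 24, or 32 bytes
            aes.IV = iv;   // 16 bytes for AES
            using (var ms = new MemoryStream())
            {
                using (var cs = new CryptoStream(ms, aes.CreateEncryptor(),
                                                 CryptoStreamMode.Write))
                {
                    cs.Write(plaintext, 0, plaintext.Length);
                } // disposing the CryptoStream flushes the final block
                return ms.ToArray();
            }
        }
    }
}
```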

Now the client has new game modes, no ads, and a proper offline mode. The community continues adding new features and refining the game, which is very satisfying. If you’re curious about reverse engineering, [Matthew Alt] can help you get started.

Continue reading “Decompiling Sonic Runners”

AI Learns To Walk In 3D Training Grounds

AI agents are learning to do all kinds of interesting jobs, even the creative ones we'd rather handle ourselves. Nevertheless, technology marches on. Working in this area is YouTuber [AI Warehouse], who has been teaching an AI to walk in a simulated environment.

Albert needed some specific guidance to learn how to walk upright, something that humans tend to figure out innately.

The AI controls a vaguely humanoid creature, albeit with a heavily simplified body and limbs. It “lives” in a 3D environment created in the Unity engine, which provides the physics simulation for the work. Meanwhile, the ML-Agents package provides the brain for Albert, the AI charged with learning to walk.

The video steps through a variety of “deep reinforcement learning” tasks, in which the AI is rewarded for completing goals designed to teach it how to walk. Albert is given control of his limbs and simply charged with reaching a button some distance away on the floor. After many trials, he learns to do the worm, and achieves his goal.
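In Unity's ML-Agents framework, that kind of goal is expressed as a small C# class with a reward signal. The sketch below is a simplified illustration assuming a dense distance-based reward for approaching a button; the project's actual observations, joint-torque actions, and reward shaping are more involved.

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;
using Unity.MLAgents.Sensors;
using UnityEngine;

// Simplified "reach the button" agent; reward shaping here is a sketch.
public class WalkToButtonAgent : Agent
{
    public Transform button;
    float previousDistance;

    public override void OnEpisodeBegin()
    {
        previousDistance = Vector3.Distance(transform.position, button.position);
    }

    public override void CollectObservations(VectorSensor sensor)
    {
        // Let the policy see where the goal is relative to the body.
        sensor.AddObservation(button.position - transform.position);
    }

    public override void OnActionReceived(ActionBuffers actions)
    {
        // In the real project the continuous actions drive the limbs;
        // that part is omitted in this sketch.
        float distance = Vector3.Distance(transform.position, button.position);

        // Dense reward for closing the gap, plus a bonus for arriving.
        AddReward(previousDistance - distance);
        previousDistance = distance;

        if (distance < 0.5f)
        {
            AddReward(1.0f);
            EndEpisode();
        }
    }
}
```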

Getting Albert to walk upright took altogether more training. Lumpy ground and walls between him and his goal upped the challenge, as did rewards for alternating his feet and maintaining an upright posture. Over time, he progressed through skipping to something approximating a proper walk cycle.

One may argue that the teaching method required a lot of specific guidance, but it's a neat feat to achieve nonetheless. It's altogether more complex than learning to play Trackmania, we'd say, and that was impressive enough in itself. Video after the break.

Continue reading “AI Learns To Walk In 3D Training Grounds”

Simple Cubes Show Off AI-Driven Runtime Changes In VR

AR and VR developer [Skarredghost] got pretty excited about a virtual blue cube, and for a very good reason. It marked a successful prototype of an augmented reality experience in which AI changed the logic underlying the cube, as a virtual object, in response to the user's spoken directions. Saying “make it blue” did indeed turn the cube blue! (After a little thinking time, of course.)

It didn't stop there, either; the blue cube proof of concept led to a number of simple demos. The first shows a row of cubes changing color from red to green in response to music volume, then a bundle of cubes changes size in response to microphone volume, and cubes even start moving around in space.
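To give an idea of how small these generated behaviors are, here is a sketch of the sort of script the microphone-volume demo amounts to. The sample-window size and scaling constants are assumptions, not values from the project.

```csharp
using UnityEngine;

// Sketch: scale a cube with microphone volume (RMS of recent samples).
public class MicReactiveCube : MonoBehaviour
{
    AudioClip micClip;
    const int SampleWindow = 256; // assumed window size

    void Start()
    {
        // null selects the default microphone; loop a 1 s, 44.1 kHz buffer.
        micClip = Microphone.Start(null, true, 1, 44100);
    }

    void Update()
    {
        int pos = Microphone.GetPosition(null) - SampleWindow;
        if (pos < 0) return;

        float[] samples = new float[SampleWindow];
        micClip.GetData(samples, pos);

        // RMS loudness drives the cube's scale.
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / SampleWindow);
        transform.localScale = Vector3.one * (1f + rms * 10f);
    }
}
```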

The program accepts spoken input from the user, converts it to text, and sends it to a natural language AI model, which then creates the necessary modifications and loads them into the environment to make runtime changes in Unity. The workflow is a bit cumbersome and highlights many of the challenges involved, but it works, and that's pretty nifty.
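A rough sketch of the back half of that pipeline is shown below: recognized text goes to a language-model endpoint, and the reply is applied at runtime. The endpoint URL and JSON format are placeholders, and the ApplyReply stub stands in for the real system, which generates and hot-loads code rather than matching replies against a fixed set of actions.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Pipeline sketch: text -> model endpoint -> runtime change in Unity.
public class VoiceToCube : MonoBehaviour
{
    public Renderer cubeRenderer;
    const string ModelEndpoint = "https://example.com/v1/complete"; // placeholder

    public void OnSpeechRecognized(string userText)
    {
        StartCoroutine(AskModel(userText));
    }

    IEnumerator AskModel(string prompt)
    {
        // Naive request body; proper JSON escaping omitted for brevity.
        byte[] body = System.Text.Encoding.UTF8.GetBytes(
            "{\"prompt\": \"" + prompt + "\"}");

        using (var request = new UnityWebRequest(ModelEndpoint, "POST"))
        {
            request.uploadHandler = new UploadHandlerRaw(body);
            request.downloadHandler = new DownloadHandlerBuffer();
            request.SetRequestHeader("Content-Type", "application/json");
            yield return request.SendWebRequest();

            if (request.result == UnityWebRequest.Result.Success)
                ApplyReply(request.downloadHandler.text);
        }
    }

    void ApplyReply(string reply)
    {
        // Stand-in for runtime code generation: handle one command.
        if (reply.Contains("blue"))
            cubeRenderer.material.color = Color.blue;
    }
}
```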

The GitHub repository is here and a good demonstration video is embedded just under the page break. There’s also a video with a much more in-depth discussion of what’s going on and a frank exploration of the technical challenges.

If you're interested in this direction, it seems [Skarredghost] has rounded up the relevant details. And should you have a prototype idea that isn't necessarily AR or VR but would benefit from AI-assisted speech recognition that can run locally, this project has what you need.

Continue reading “Simple Cubes Show Off AI-Driven Runtime Changes In VR”

Dummy The Robot Arm Is Not So Dumb

[Zhihui Jun] is a name you're going to want to remember, because this Chinese maker has created quite probably one of the most complete open-source robot arms (video in Chinese with subtitles, embedded below) we've ever seen. This project has to be seen to be believed. Every aspect of the design, from concept and mechanical CAD to electronics and software (embedded, 3D GUI, and so on), is the work of one maker, in just their spare time! Sound like we're talking it up too much? Just watch the video and try to keep up!

After an initial review of toy robots versus more industrial units, it was quickly decided that servos weren't going to cut it: too little torque and not enough precision. BLDC motors offer great precision and torque when paired with a good controller, but they are tricky to make small enough, so an off-the-shelf compact harmonic drive was selected and paired with a stepper motor to get the required performance. This was multiplied by six and dropped into some slick CNC-machined aluminum parts to complete the mechanics. A custom closed-loop stepper controller mounts directly to the rear of each motor, which is a really nice touch.

Stepper controller mounts on the motor rear – smart!

Control electronics are based around an STM32, with an ESP32 for Wi-Fi connectivity, but the pace of the video is so fast it's hard to keep up with how much of the design operates. There is a brief mention that the controller runs the LiteOS kernel for Harmony OS, but no details that we can find. The project GitHub has many of the gory details to pore over; it's perhaps a bit light in places, but the promise is made to expand it. For remote control, there's a BLE-connected teaching device (called ‘Peak’) with a touch screen, again with details pending. Oh, and did we mention there's a force-feedback remote control unit (a PS5 Adaptive Trigger had to die for the cause) that uses binocular cameras to track motion, with an AHRS setup giving orientation, and that all this is powered by a Huawei Atlas edge AI processing system? This was greatly glossed over in the video as if it were just some side note not worth talking about. We hope details of that get made public soon!

Threading a needle through a grape by remote control

The dedicated GUI, written in what looks like Unity, allows robot programming and motion planning, but since those harmonic drives are back-drivable, the robot can also be moved by hand, with the movements recorded for later replay (a simple sketch of which appears below). Some work with AR has been started, but that looks to be early in the process. The features just keep on coming!
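As a simple illustration of hand-guided teach-and-replay, the sketch below samples the six joint angles while the arm is moved by hand and streams them back later. The IArm interface is invented for this example; the real project talks to its custom closed-loop stepper controllers.

```csharp
using System.Collections.Generic;

// The hardware interface is hypothetical; angles are in degrees.
public interface IArm
{
    float[] ReadJointAngles();              // current pose, six joints
    void CommandJointAngles(float[] pose);  // move toward the given pose
}

public class TeachAndReplay
{
    readonly List<float[]> recording = new List<float[]>();

    // Call at a fixed rate (say 50 Hz) while the user moves the arm.
    public void RecordSample(IArm arm)
    {
        recording.Add((float[])arm.ReadJointAngles().Clone());
    }

    // Stream the recorded poses back at the same fixed rate.
    public void Replay(IArm arm)
    {
        foreach (var pose in recording)
            arm.CommandJointAngles(pose);
    }
}
```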

Quite frankly there is so much happening that it’s hard to summarise here and do the project any sort of justice, so to that end we suggest popping over to YT and taking a look for yourselves.

We love robots ’round these parts, especially robot arms. Here's a big one by [Jeremy Fielding], and if you think stepper motors aren't necessary because servo motors can be made to work just fine, you may be right.

Continue reading “Dummy The Robot Arm Is Not So Dumb”

Open 3D Engine editor with Amazon Shader Language file and asset from the game Deadhaus Sonata open. (Credit: O3DE project)

Open 3D Engine: Amazon’s Old Clothes Or A Game Engine To Truly Get Excited About?

Recently, Amazon announced that they would be open sourcing the 3D engine and related tooling behind their Amazon Lumberyard game development effort. As Lumberyard is based on CryEngine 3.8 (~2015 vintage), this raises the question of whether this new open source engine – creatively named Open 3D Engine (O3DE) – is merely an open source version of a CryTek engine, and what it brings to those of us who like to tinker with 2D and 3D games and similar.

When reading through the marketing materials, one might be forgiven for thinking that O3DE is the best thing since sliced 3D bread, and Amazon's benevolent gift to the unwashed masses to free them from the chains imposed by proprietary engines like Unity and Unreal Engine. A closer look reveals, however, that O3DE is Lumberyard with many parts replaced, including the renderer, which is still in the process of being rewritten from the old CryEngine code.

What Makes a Good Game Engine?

My own game development attempts started with the Half-Life engine and the Valve Hammer editor, as well as the Doom map editor. This meant that some expectations were set before encountering today's game engines and their tools. The development experience with the Hammer editor in the late 1990s was pretty much WYSIWYG, and when I was just getting started with Unreal Engine 4 (UE4) a number of years back, the experience was much the same, making it relatively easy to hit the ground running. Continue reading “Open 3D Engine: Amazon’s Old Clothes Or A Game Engine To Truly Get Excited About?”

Haptic Feedback “Rifle” Lets You Take Aim In VR

There was a time when virtual reality seemed like it would remain in the realm of science fiction for the foreseeable future. Then we were blessed with products like the Power Glove and Virtual Boy, which seemed to make it more of a reality, albeit a clunky and limited one. Now, though, virtual reality is taking center stage as the technology improves and more and more games are released. We can see no greater proof of this than the fact that some gamers are building their own custom controllers to interact with the virtual world in more meaningful ways, like this game controller specifically built for first-person shooter games.

The controller is based on an airsoft gun but completely lacks the ability to fire a projectile; instead, the gun serves as a base for building the controller. In fact, the gun's operation is effectively reversed in order to immerse the player in the game with haptic feedback provided by pressurized air. The air is pumped into what would be the front of the barrel and discharged through the receiver when a trigger pull is detected, generating a recoil effect. The controller includes plenty of other features as well, including the ability to reload ammunition, change the firing mode, and track motion thanks to its pair of integrated Oculus controllers.
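The write-up doesn't detail exactly how a trigger pull reaches the air valve, but as one plausible wiring, the Unity side could read the trigger on one of the integrated Oculus controllers and pulse a valve driver over serial. Everything below (port name, baud rate, command byte) is an assumption for illustration, and System.IO.Ports requires Unity's .NET 4.x API compatibility level.

```csharp
using UnityEngine;

// Hypothetical trigger-to-valve glue; OVRInput is part of the
// Oculus Integration package.
public class RecoilTrigger : MonoBehaviour
{
    System.IO.Ports.SerialPort valvePort =
        new System.IO.Ports.SerialPort("COM3", 115200); // assumed port

    void Start()
    {
        valvePort.Open();
    }

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger))
            valvePort.Write(new byte[] { 0x01 }, 0, 1); // assumed "fire" byte
    }
}
```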

All of the parts for this controller are either 3D printed or readily available off the shelf, making this an ideal platform for customization and improvement. There's also a demo game, built in Unity, which makes setup for testing pretty easy. While the controller looks like an excellent way to enjoy an FPS virtual reality experience, if you're looking for a more general-purpose controller, we're starting to see a lot of development on that end as well.

Continue reading “Haptic Feedback “Rifle” Lets You Take Aim In VR”

Automated Sentry Turret For Your Secret Lab

There are few things more frustrating, when you're trying to get some serious hacking done, than intruders repeatedly showing up without permission. [All Parts Combined] has the solution for you: a Kinect-based robotic sentry turret to keep them at bay.

The system consists of a Microsoft Kinect V2 connected to a PC, which runs an app to do all the processing, and outputs the targeting information to an Arduino over serial. The Arduino controls a simple 2-axis servo mount with an electric airsoft gun zip-tied to it. The trigger switch is replaced with a relay, also connected to the Arduino.

The Kinect V2 comes with SDKs that really simplify tracking human movement and output the data in an easy-to-use format. [All Parts Combined] used the SDK in Unity, which allows him to choose which body parts to track. He added scripts that detect a few basic gestures, issue voice commands, and generate the serial commands for the Arduino. The servo angles are calculated with simple geometry, using the XY coordinates of the target received from the SDK and the known distance between the Kinect and the turret. When an intruder enters the Kinect's field of view, it immediately starts aiming at the intruder's heart, issues a “Hands up!” command, and tells the intruder to leave. If the intruder doesn't comply, it starts an audible countdown before firing. [All Parts Combined] also added a secret disarming gesture (double hand pistols), which turns the turret into an apologetic comrade. All it needs is a Portal-inspired enclosure.
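The aiming math reduces to two arctangents. The sketch below simplifies by assuming the turret sits at the Kinect (the real build corrects for the offset between them), and the 90-degree servo centering is an assumption for illustration.

```csharp
using UnityEngine;

// Map a target's offset (meters) at depth z to pan/tilt servo angles.
public static class TurretAim
{
    public static void Aim(float x, float y, float z,
                           out int panServo, out int tiltServo)
    {
        float panDeg  = Mathf.Atan2(x, z) * Mathf.Rad2Deg;
        float tiltDeg = Mathf.Atan2(y, z) * Mathf.Rad2Deg;

        // Center the servos at 90 degrees and clamp to their travel,
        // then send the two angles to the Arduino over serial.
        panServo  = Mathf.Clamp(Mathf.RoundToInt(90f + panDeg),  0, 180);
        tiltServo = Mathf.Clamp(Mathf.RoundToInt(90f + tiltDeg), 0, 180);
    }
}
```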

It's a fun project that illustrates how the Kinect can make complex computer vision tasks relatively simple. Unfortunately, the V2 is no longer in production, having been replaced by the more expensive, developer-focused Azure Kinect. We've covered several Kinect-based projects, including a 3D room scanner and a robotic basketball hoop.

Continue reading “Automated Sentry Turret For Your Secret Lab”