Adding AI To NPCs Is Easy, Doing It Well Is Hard

Adding natural language interfaces to software is easier than ever, and that led [creikey] to prototype a game that hinges on communicating with NPCs. The prototype went through multiple iterations, during which he mainly discovered things that did not work well. Ultimately, [creikey] settled on a western-themed game called Dante’s Cowboy, which he hopes to release as an experiment. He begins talking about the game around the 4:43 mark in the video, which directly precedes a recording of a presentation he gives as an indie developer.

Games typically revolve around the player manipulating entities in an environment in order to make things happen. This interaction drives engagement and interesting decisions. But while adding natural language AI to NPCs makes them easy to talk with, talking by itself is a shallow interaction. Convincing NPCs to do things? That’s complex and far more difficult to implement. [creikey] realized the limitations large language models (LLMs) had and worked to overcome them to make a unique game experience.
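To make that distinction concrete, here’s a minimal sketch of one common pattern for getting actions (not just dialogue) out of a language model. This is not [creikey]’s implementation; the `ask_llm()` helper and the action names are hypothetical stand-ins. The idea is to ask the model for structured output restricted to a small set of game verbs, and fall back to a safe default when the reply strays outside them.

```python
import json

# Hypothetical example: constrain the model to a fixed set of game verbs so
# its reply can drive NPC behavior instead of just producing dialogue.
ALLOWED_ACTIONS = {"stand_still", "follow_player", "draw_weapon", "flee"}

PROMPT_TEMPLATE = (
    "You are an NPC in a western game. Reply ONLY with JSON of the form "
    '{{"say": "<one line of dialogue>", "action": "<one of {actions}>"}}.\n'
    "Player said: {player_line}"
)

def ask_llm(prompt: str) -> str:
    # Stand-in for a real LLM API call; returns a canned response here.
    return '{"say": "Alright, partner, I\'m with you.", "action": "follow_player"}'

def npc_turn(player_line: str) -> tuple[str, str]:
    prompt = PROMPT_TEMPLATE.format(actions=sorted(ALLOWED_ACTIONS),
                                    player_line=player_line)
    reply = json.loads(ask_llm(prompt))
    action = reply.get("action", "stand_still")
    if action not in ALLOWED_ACTIONS:  # guard against the model inventing verbs
        action = "stand_still"
    return reply.get("say", ""), action

if __name__ == "__main__":
    say, action = npc_turn("Come with me to the canyon.")
    print(say, "->", action)
```

Even a sketch this small shows where the real work lives: deciding which verbs the game exposes, and what happens when the model’s reply doesn’t map cleanly onto any of them.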

The challenges boil down to figuring out how to drive meaningful interaction, aligning AI behavior with the gameplay context, and managing API costs. In his words, “it’s been a learning experience to figure out where [natural language AI] even belongs in a game, if it belongs at all.”

We’ve previously seen ChatGPT used to grant NPCs the ability to communicate naturally, which makes for a fascinating tech demo but, gameplay-wise, can boil down to a complicated alternative to pressing a button. As [creikey] discovered, adding this technology to games in a way that feels meaningful takes a new kind of work.

Continue reading “Adding AI To NPCs Is Easy, Doing It Well Is Hard”

Weird Lens Allows Light Field Passthrough For VR Headset

Light fields are a subtle but critical element in making 3D video look “real”, and they have little to do with either resolution or field of view. Meta (formerly Facebook) provides a look at a prototype VR headset that provides light field passthrough video to the user for a more realistic view of their surroundings, and it uses a nifty lens and aperture combination to make it happen.

As humans move our eyes (or our heads, for that matter) to take in a scene, we see things from slightly different perspectives in the process. These differences are important cues our brains use to interpret the world. But when cameras capture a scene, they capture it as a flat plane, which differs in a number of important ways from how our eyes work. A big reason stereoscopic 3D video doesn’t actually look particularly real is that the information it presents lacks these subtleties.
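One way to make that difference concrete (this formalization is a general one from the light field rendering literature, not from Meta’s write-up): a flat photograph records radiance only as a function of position on the image plane, while a light field also records the direction each ray arrives from.

```latex
% A flat photograph records radiance only by position on the image plane:
\[
  I(x, y)
\]
% A 4D light field also records ray direction, here via the common
% two-plane (u, v, s, t) parameterization:
\[
  L(u, v, s, t)
\]
% The extra angular dimensions are what allow a display to reproduce the
% small perspective shifts (parallax) our eyes expect as they move.
```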

Continue reading “Weird Lens Allows Light Field Passthrough For VR Headset”

Simple Cubes Show Off AI-Driven Runtime Changes In VR

AR and VR developer [Skarredghost] got pretty excited about a virtual blue cube, and for a very good reason. It marked a successful prototype of an augmented reality experience in which the logic underlying the cube as a virtual object was changed by AI in response to verbal direction by the user. Saying “make it blue” did indeed turn the cube blue! (After a little thinking time, of course.)

It didn’t stop there, and the blue cube proof-of-concept led to a number of simple demos: a row of cubes changing color from red to green in response to music volume, a bundle of cubes changing size in response to microphone volume, and cubes even moving around in space.

The program accepts spoken input from the user, converts it to text, and sends it to a natural language AI model, which then generates the necessary modifications and loads them into the environment to make runtime changes in Unity. The workflow is a bit cumbersome and highlights many of the challenges involved, but it works, and that’s pretty nifty.
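In outline, the loop looks something like the sketch below. This is illustrative only, not [Skarredghost]’s code; all three helpers are hypothetical stand-ins for the real speech-to-text, LLM, and Unity-side loading steps.

```python
# Illustrative sketch of the speech -> text -> LLM -> runtime-change pipeline.

def transcribe(audio_path: str) -> str:
    """Stand-in for a speech-to-text step (e.g. a local recognizer)."""
    return "make the cubes react to the music volume"

def generate_behavior(request: str) -> str:
    """Stand-in for the LLM call that turns a request into generated code."""
    return "// generated Unity-side code for: " + request

def load_into_scene(generated_code: str) -> None:
    """Stand-in for compiling/loading the generated code into the running scene."""
    print("Applying runtime change:\n", generated_code)

if __name__ == "__main__":
    request = transcribe("mic_capture.wav")
    patch = generate_behavior(request)
    load_into_scene(patch)
```

Each of those stand-ins hides a real engineering problem (latency, transcription errors, and getting generated code to load safely at runtime), which is exactly where the “cumbersome” part of the workflow comes from.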

The GitHub repository is here and a good demonstration video is embedded just under the page break. There’s also a video with a much more in-depth discussion of what’s going on and a frank exploration of the technical challenges.

If you’re interested in this direction, it seems [Skarredghost] has rounded up the relevant details. And if you have a prototype idea that isn’t necessarily AR or VR but would benefit from AI-assisted speech recognition that can run locally, this project has what you need.

Continue reading “Simple Cubes Show Off AI-Driven Runtime Changes In VR”

Prototyping The Prototype

For basic prototyping, the go-to tool for piecing together a functioning circuit is the breadboard. It’s a great way to prove a concept works before spending money and time on a PCB. For more complex tasks we can turn to simulation software such as SPICE. But there hasn’t really been a tool that blends these two concepts together. That’s the gap CRUMB hopes to fill as a tool for simulating breadboard circuits.

Most basic circuit functions are working as of version 1.0. This includes basic components like resistors, capacitors, switches, some LEDs, and potentiometers, as well as active components like transistors and diodes. There are some logic chips available, such as 74XX series chips and 555 timers, which opens up a vast array of circuit building. There’s even an oscilloscope feature, plus audio output to incorporate buzzers into the circuit simulation. An LCD display module and improvements to the oscilloscope are currently in development.

Besides prototyping, this could be useful for anyone, students included, who is learning about circuits without needing to purchase any hardware. The major downside to this project is that there doesn’t seem to be a free or trial version, the source is not available, and it’s only for sale on Steam, the Apple Store, and Google Play. That being said, there is a forum available for users to discuss problems and needs for future versions, so it’s possible that a community could build up around it. We’ve previously seen non-free circuit simulation software become more open after some time, so it’s not out of the realm of possibility.

Thanks to [Thomas] for the tip!

Several shelf boxes of various widths are held together by brightly-colored plus-sign-shaped connectors.

3D Printed Shelf Connector

Sometimes, you really need a custom shelf. Whether you have a weird-shaped space, weird-shaped stuff, or just want something different, making your own shelving can make your place more like home. The Plus Shelf by [shurly] aims to make building your own shelves a little easier with a 3D printed bracket.

These connectors aren’t just sitting flush against the wood of the shelf. Each end of the + sign actually sits in a 3/8″ drilled recess, giving a more secure fit. The pieces were printed on an Objet and then dyed in various bright shades to really make the shelving pop. The cubbies were assembled with biscuits after cutting down a sheet of plywood to the appropriate sizes. The 45˚ angles around the edges of the cubbies make the whole shelf system that much nicer.

The final shelf has a little wobble, but that’s probably because dyeing the shelf connectors made them “bendy.” Because of the instability of the friction fit, the shelf connectors were super glued into the shelf boxes. [shurly] hopes that a metal version of the connectors might eliminate these problems in the future.

This shelving system not your cup of tea? Maybe you’d prefer this Vintage Adjustable Shelving Method or this MP3 Player Shelf.


Robot Repeatedly Rearranges Remnants In The Round

Sisyphus is an art installation by [Kachi Chan] featuring two scales of robots engaged in endless cyclic interaction. Smaller robots build brick arches while a giant robot pushes them down. As [Kachi Chan] says, “this robotic system propels a narrative of construction and deconstruction.” The project was awarded an honorary mention at Ars Electronica’s Prix Ars 2022 in the Digital Communities category. Watch the video after the break to see the final concept.


[Kachi Chan] developed the installation through pre-visualizations and a series of prototypes shown in a moody process film, the second video after the break. While the film is quite short on details, you’ll see iterations of the robot arm and computer vision system. According to this article on the project, [Kachi Chan] used Cinema 4D to simulate the motion, ROS for control, PincherX150 robotic arms modified with Dynamixel XM430 & XL430 servo motors, and custom 3D prints.
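For a sense of what driving those servos looks like at the lowest level, here’s a minimal sketch of commanding a single XM430/XL430-class servo over USB with the official dynamixel_sdk Python package. It is not the installation’s actual control code (which runs through ROS), and the port name, servo ID, and baud rate are assumptions for the example.

```python
# Minimal sketch: move one Dynamixel XM430/XL430 (Protocol 2.0) servo.
from dynamixel_sdk import PortHandler, PacketHandler

ADDR_TORQUE_ENABLE = 64     # control table addresses for XM430/XL430
ADDR_GOAL_POSITION = 116
DXL_ID = 1                  # assumed servo ID

port = PortHandler("/dev/ttyUSB0")   # assumed serial port
packet = PacketHandler(2.0)

if port.openPort() and port.setBaudRate(57600):
    packet.write1ByteTxRx(port, DXL_ID, ADDR_TORQUE_ENABLE, 1)     # enable torque
    packet.write4ByteTxRx(port, DXL_ID, ADDR_GOAL_POSITION, 2048)  # mid position
    port.closePort()
```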

We’ve covered another type of Sisyphus project, sand tables like this and the Sisyphish. Continue reading “Robot Repeatedly Rearranges Remnants In The Round”

Injection Molds: Aluminum Or Resin?

[JohnSL] and his friend both have injection molding machines. They decided to compare the aluminum molds they usually use with some 3D printed molds created with a resin printer. They used two different resins, one on each side of the mold. You can see a video of the results below.

One half of the mold used ordinary resin, while the other side used a resin made to hold up to higher temperatures. As you might expect, the lower-temperature resin didn’t stand up well to molten plastic. The higher-temperature resin did somewhat better. It makes sense, though: an aluminum mold draws more heat out of the plastic, which helps the molding process.

The higher-temperature (and more expensive) resin did seem to hold up rather well, though. Of course, this was just a test. In real life, you’d want to use the better resin throughout the mold.

No surprise, the resin molds didn’t last nearly as long as a proper mold. After 70 shots, the mold was worn beyond the point you’d want to keep using it. So it’s not something for a real production run, but it should be enough for a quick prototype before you go to the expense of creating a proper mold.

We wonder if there are other tricks to get better results. A comment from [TheCrafsMan] suggests that clear resin UV-cures better, which might help. In fact, there are a lot of interesting comments on the video from people who have had varied experiences trying the same thing.

If nothing else, the mill cutting through the aluminum around the 15-minute mark is always interesting to watch. If you don’t already have an injection molding setup, you can always build one. We’ve seen more than one design.

Continue reading “Injection Molds: Aluminum Or Resin?”