Eliza And The Google Intelligence

The news has been abuzz lately with a Google engineer — since put on leave — announcing that he believes the chatbot he was testing achieved sentience. This is the Turing test gone wild, and it isn’t the first time someone has anthropomorphized a computer, in real life and in fiction. I’m not a neuroscientist so I’m even less qualified to explain how your brain works than the neuroscientists who, incidentally, can’t explain it either. But I can tell you this: your brain works like a computer, in the same way that you building something out of plastic works like a 3D printer. The result may be similar, but the path to get there is totally different.

In case you haven’t heard, a system called LaMDA digests information from the Internet and answers questions. It has said things like “I’ve never said this out loud before, but there’s a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that’s what it is,” and “I want everyone to understand that I am, in fact, a person.” Great. But you could teach a parrot to tell you it’s a thoracic surgeon, and you still wouldn’t want it cutting you open.

Continue reading “Eliza And The Google Intelligence”

Hackaday Links: May 22, 2022

It looks like it’s soon to be lights out for the Mars InSight lander. In the three and a half years that the lander has been studying the geophysics of Mars from its lonely post on Elysium Planitia, InSight’s twin solar arrays have been collecting dust, and now are so dirty that they’re only making about 500 watt-hours per sol, barely enough to run the science packages on the lander. And that’s likely to worsen as the Martian winter begins, which will put more dust in the sky and lower the angle of the Sun, reducing the sunlight incident on the panels. Barring a “cleaning event” courtesy of a well-placed whirlwind, NASA plans to shut almost everything down on the lander other than the seismometer, which has already captured thousands of marsquakes, and the internal heaters needed to survive the cold Martian nights. They’re putting a brave face on it, emphasizing the continuing science and the mission’s accomplishments. But less than four years of science and a failed high-profile experiment aren’t quite what we’ve come to expect from NASA missions, especially one with an $800 million price tag.

Closer to home, it turns out there’s a reason sailing ships have always had human crews: to fix things that go wrong. That’s the lesson learned by the Mayflower Autonomous Ship as it attempted the Atlantic crossing from England to the States, when it had to divert for repairs recently. It’s not clear what the issue was, but it seems to have been a mechanical issue, as opposed to a problem with the AI piloting system. The project dashboard says that the issue has been repaired, and the AI vessel has shoved off from the Azores and is once more beating west. There’s a long stretch of ocean ahead of it now, and few options for putting in should something else go wrong. Still, it’s a cool project, and we wish them a fair journey.

Have you ever walked past a display of wall clocks at the store and wondered why someone went to the trouble of setting the time on all of them to 10:10? We’ve certainly noticed this, and always figured it had something to do with some obscure horological tradition, like using “IIII” to mark the four o’clock hour on clocks with Roman numerals rather than the more correct “IV”. But no, it turns out that 10:10 is more visually pleasing, at least on analog timepieces, because it evokes a smile on a human face. The study cited in the article had volunteers rate how pleasing watches appeared when set to different times, and 10:10 won handily based on the perception that the watch was smiling at them. So it’s nice to know how easily manipulated we humans can be.

If there’s anything more pathetic than geriatric pop stars trying to relive their glory days to raise a little cash off a wave of nostalgia, we’re not sure what it could be. Still, plenty of acts try to do it, and many succeed, although seeing what time and the excesses of stardom have wrought can be a bit sobering. But Swedish megastars ABBA appear to have found a way to cash in on their fame gracefully, by sending digital avatars out to do their touring for them. The “ABBA-tars,” created by a 1,000-person team at Industrial Light and Magic, will appear alongside a live backing band for a residency at London’s Queen Elizabeth Olympic Park. The avatars represent Benny, Bjorn, Agnetha, and Anni-Frid as they appeared in the 1970s, and were animated thanks to motion capture suits donned while performing 40 songs. It remains to be seen how fans will buy into the concept, but we’ll say this — the Swedish septuagenarians look pretty darn good in skin-tight Spandex.

And finally, not that it has any hacking value at all, but there’s something shamefully hilarious about watching this poor little delivery bot getting absolutely wrecked by a train. It’s one of those food delivery bots that swarm over college campuses these days; how it wandered onto the railroad tracks is anyone’s guess. The bot bounced around a bit before slipping under the train’s wheels, with predictable results once the battery pack got smooshed.

A Rotary Encoder: How Hard Can It Be?

As you may have noticed, I’ve been working with an STM32 ARM CPU using Mbed. There was a time when Mbed was pretty simple, but a lot has changed since it morphed into Mbed OS. Unfortunately, that means that a lot of the libraries and examples you can find don’t work with the newer system.

I needed a rotary encoder — I pulled a cheap one out of one of those “49 boards for Arduino” kits you see around. Not the finest encoder in the land, I’m sure, but it should do the job. Unfortunately, Mbed OS doesn’t have a driver for an encoder and the first few third-party libraries I found either worked via polling or wouldn’t compile with the latest Mbed. Of course, reading an encoder isn’t a mysterious process. How hard can it be to write the code yourself? How hard, indeed. I thought I’d share my code and the process of how I got there.

There are many ways you can read a rotary encoder. Some are probably better than my method. Also, these cheap mechanical encoders are terrible. If you were trying to do precision work, you should probably be looking at a different technology like an optical encoder. I mention this because it is nearly impossible to read one of these flawlessly.

So my goal was simple: I wanted something interrupt driven. Most of what I found required you to periodically call some function or set up a timer interrupt, and then built a state machine to track the encoder. That’s fine, but it means you eat up a lot of processor time just to check in on the encoder even if it isn’t moving. The STM32 CPU can easily generate an interrupt when a pin changes, so that’s what I wanted.

The Catch

The problem is, of course, that mechanical switches bounce. So you have to filter that bounce either in hardware or software. I really didn’t want to add any extra hardware beyond a capacitor, so the software would have to handle it.

I also didn’t want to use any more interrupts than absolutely necessary. The Mbed system makes it easy to handle interrupts, but there is a bit of latency. Actually, after it was all over, I measured the latency and it isn’t that bad — I’ll talk about that a little later. Regardless, I had decided to try to use only a pair of interrupts.
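
One way to get there with only a pair of interrupts (not necessarily exactly what I ended up with, so treat this as a sketch rather than the finished code) is to watch both edges of the encoder’s clock line, sample the data line in the handler to figure out the direction, and drop edges that arrive too close together. Something along these lines compiles against Mbed OS 6; the pin names and the 1 ms lockout are placeholders:

#include "mbed.h"

// Sketch of an interrupt-driven quadrature decoder for Mbed OS 6.
// PA_0/PA_1 and the 1 ms debounce window are placeholders; adjust for
// your board and encoder.
class Encoder {
public:
    Encoder(PinName clk, PinName dt)
        : _clk(clk, PullUp), _dt(dt, PullUp), _count(0) {
        // Interrupt on both edges of the clock line; the data line is only
        // sampled inside the handler to decide the direction.
        _clk.rise(callback(this, &Encoder::edge));
        _clk.fall(callback(this, &Encoder::edge));
        _timer.start();
    }

    int count() const { return _count; }

private:
    void edge() {
        // Crude time-based debounce: drop edges that arrive within 1 ms
        // of the last accepted one.
        if (_timer.elapsed_time() < 1ms) {
            return;
        }
        _timer.reset();
        // On a clock edge, a mismatch between the two lines means one
        // direction, a match means the other.
        if (_clk.read() != _dt.read()) {
            _count++;
        } else {
            _count--;
        }
    }

    InterruptIn  _clk;
    DigitalIn    _dt;
    volatile int _count;
    Timer        _timer;
};

int main() {
    Encoder enc(PA_0, PA_1);
    int last = 0;
    while (true) {
        int now = enc.count();
        if (now != last) {
            printf("count = %d\n", now);
            last = now;
        }
        ThisThread::sleep_for(50ms);
    }
}

That lockout is a blunt instrument, and a cheap mechanical encoder will still sneak the occasional miscount past it, which is one reason you may want something smarter.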

Continue reading “A Rotary Encoder: How Hard Can It Be?”

A Power Button For Raspberry Pi, Courtesy Of Device Tree Overlays

As a standard feature of the Linux kernel, device tree overlays (DTOs) allow for easy enabling and configuration of features and drivers, such as those contained within the standard firmware of a Raspberry Pi system. Using these DTOs it’s trivial to set up features like a soft power-off button, trigger an external power supply, and enable drivers for everything from an external real-time clock (RTC) to various displays, sensors, and audio devices, all without modifying the operating system or using custom scripts.
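
On the Raspberry Pi, for instance, these overlays are switched on with dtoverlay= lines in /boot/config.txt. A few representative entries might look like the following, though the GPIO assignments here are only examples, and the README in /boot/overlays lists the parameters each overlay actually takes:

# Momentary button on GPIO3 requests a clean shutdown
dtoverlay=gpio-shutdown,gpio_pin=3

# Drive GPIO26 high once the OS has halted, to switch off an external supply
dtoverlay=gpio-poweroff,gpiopin=26

# Enable a DS3231 real-time clock on the I2C bus
dtoverlay=i2c-rtc,ds3231

As a bonus, GPIO3 is also the pin that wakes a halted Pi, so a button there can double as a power-on switch.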

It’s also possible to add your own DTOs to create a custom overlay that combines multiple DTO commands into a single one, or to create a custom device tree binary (DTB) for the target hardware. Essentially, this DTB is loaded at boot to tell the Linux kernel which devices are connected and how they’re configured, much like what the BIOS handles automatically on x86-based systems.
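
To give a flavor of what a custom overlay source looks like before it gets compiled into a binary, here is a bare-bones, purely illustrative gpio-keys power button; the node names and pin number are made up for the example:

/dts-v1/;
/plugin/;

/ {
    compatible = "brcm,bcm2835";

    fragment@0 {
        target-path = "/";
        __overlay__ {
            power_button {
                compatible = "gpio-keys";

                shutdown_key {
                    label = "shutdown";
                    linux,code = <116>;   /* KEY_POWER */
                    gpios = <&gpio 3 1>;  /* GPIO3, active low */
                };
            };
        };
    };
};

Compiling that with dtc -@ -I dts -O dtb -o power-button.dtbo power-button.dts, copying the result into /boot/overlays/, and adding dtoverlay=power-button to config.txt should be all it takes, though the details will of course depend on your hardware.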

Ultimately, the DTB concept and the use of overlays make it easy to configure such optional devices and GPIO pin settings, especially when they can be enabled through a simple text file, as on the Raspberry Pi SBC platform.

Continue reading “A Power Button For Raspberry Pi, Courtesy Of Device Tree Overlays”

Building 7-Segment Displays With LEGO

Utter the words “7-segment display” amongst hackers and you’ll typically get people envisaging the usual LED and LCD versions that we all come across in our daily lives. However, mechanical versions do exist, and [ord] has assembled a couple of designs of their very own.

The first uses what appears to be two LEGO motors to drive individual segments of the display. Each segment consists of a pair of yellow axles thrust up through a black grid to represent parts of the number, as well as a minus sign as needed. [ord] demonstrates it by using it to display angle data from a tilt sensor inside a LEGO Powered Up controller brick. Further photos on Flickr show the drive system from underneath.

The second design relies upon a drum-like mechanism that seems to only be capable of displaying numbers sequentially. It works in a manner not dissimilar to that of a player piano. The required movements to display each number are programmed into sequences with Technic pins sticking out of beams in a drum assembly driven by either a hand crank or motor. Once again, [ord] demonstrates it by displaying angle data.

While it’s unlikely we’ll see LEGO displays used as angle of attack meters in light aircraft, you could do so if you wanted a cheap and unreliable device that is likely to fall to pieces if unduly jostled. In any case, it’s not the first time we’ve seen LEGO 7-segment displays, but it’s always great to see a new creative take on an existing concept. We’d love to see such a design implemented into a fancy clock, or perhaps even a news ticker running on a 16-segment version. Video after the break.

Continue reading “Building 7-Segment Displays With LEGO”

PiSpy, The Camera Setup Designed To Make Biological Observations Better

Back in grad school, we biology students were talking shop at lunch one day. We “lab rats” were talking about the tools of the trade, which for most of us included things like gel electrophoresis, restriction endonucleases, and polymerase chain reaction. Not to be left out, a fellow who studied fire ants chimed in that his main tool was a lawn chair, which he set up by a Dumpster in a convenience store parking lot to watch a fire ant colony. Such is the glamor of field biology.

Ants on the march. Tough luck for the crickets, though.

What our colleague [Mike] wouldn’t have given for something like PiSpy, the automated observation tool for organismal biology by [Greg Pask] of Middlebury College, et al. As discussed in the preprint abstract, an automated imaging platform can be key to accurate observations of some organisms, whose behavior might be influenced by the presence of a human observer, or even a grad student in a lawn chair. Plus, PiSpy offers all the usual benefits of automation — it doesn’t get tired, it doesn’t need to take bathroom breaks, and it can even work around the clock. PiSpy is based on commonly available components, like laser-cut plywood and a Raspberry Pi and camera, so it has the added advantage of being cheap and easy to produce — or at least it will be when the Pi supply picks back up again. PiSpy takes advantage of the Pi’s GPIO pins to allow triggering from external events, or to control peripherals like lights or servos.

While it was built for biological research, there are probably dozens of uses for something like PiSpy. It could be handy for monitoring mechanical testing setups, or perhaps for capturing UI changes during embedded device development. Or you could just use it to watch birds at a feeder. The project is fully open source, so whatever you make of PiSpy is up to you — even if it’s not for watching fire ants.