Assemble Your (Virtual) Robotic Underground Exploration Team

It’s amazing how many things have managed to move online in recent weeks, many with the beneficial side effect of eliminating travel and thereby becoming accessible to everyone around the world. Some events, though, had a virtual track before it was cool, among them the DARPA Subterranean Challenge (SubT) robotics competition. Recent additions to their “Hello World” tutorials (with the promise of more to come) have continued to lower the barrier of entry for aspiring roboticists.

We all love watching physical robots explore the real world, which is why SubT’s “Systems Track” gets most of the attention. But that kind of participation is necessarily restricted to people who have the resources to build and transport bulky hardware to the competition site, a tiny subset of all the brilliant minds who could contribute. Hence the “Virtual Track”, which is accessible to anyone with a computer that meets the requirements (64-bit Ubuntu 18 with an NVIDIA GPU). The tutorials help get us up and running on SubT’s virtual testbed, which continues to evolve. With every round, the organizers work to bring the virtual and physical worlds closer together. During the recent Urban Circuit, they made high-resolution scans of both the competition course and the participating robots.

There’s a lot of other traffic on various SubT code repositories. Motivated by Bitbucket sunsetting their Mercurial support, SubT is moving from Bitbucket to GitHub and doing some housecleaning along the way. Together with the newly added tutorials, this makes now a great time to dive in and see if you want to assemble a team (both of human collaborators and virtual robots) to join the next round of virtual SubT. But if you prefer to stay an observer of the physical world, enjoy this writeup with many fun details on systems track robots.

A More Open Raspberry Pi Camera Stack With Libcamera

As open as the Raspberry Pi Foundation has been about their beloved products, they would be the first to admit there’s always more work to be done: Getting a Pi up and running still requires many closed proprietary components. But the foundation works to chip away at it bit by bit, and one of the latest steps is the release of a camera stack built on libcamera.

Most Linux applications interact with the camera via V4L2 or a similar API. These established interfaces were designed back when camera control was limited to a few simple hardware settings. Today we have far more sophisticated computational techniques for digital photography and video. Algorithms have outgrown dedicated hardware, transforming into software modules that take advantage of CPU and/or GPU processing. In practice, this trend has meant bigger and bigger opaque monolithic pieces of proprietary code, each one a mix of “secret sauce” algorithms commingled with common overhead code wastefully duplicated in every new blob.
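For a sense of how little those older interfaces expose, here is a minimal sketch of the kind of simple knob-twiddling V4L2 was designed around, setting a single hardware control. The device path and control value are arbitrary assumptions for illustration:

```cpp
// Minimal V4L2 sketch: set one simple hardware control (brightness),
// the style of limited camera knob these older APIs were built around.
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstdio>

int main() {
    int fd = open("/dev/video0", O_RDWR);  // assumed device node
    if (fd < 0) { perror("open"); return 1; }

    v4l2_control ctrl{};
    ctrl.id = V4L2_CID_BRIGHTNESS;  // one of a handful of basic knobs
    ctrl.value = 128;               // arbitrary mid-range setting
    if (ioctl(fd, VIDIOC_S_CTRL, &ctrl) < 0)
        perror("VIDIOC_S_CTRL");

    close(fd);
    return 0;
}
```

Everything beyond knobs like this (auto-exposure, white balance, denoising) is where the opaque vendor blobs take over.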

We expect camera makers will continue to devise proprietary specialties as they seek a competitive advantage. Fortunately, some of them see benefit in an open-source framework to help break those monoliths into more manageable pieces, letting them focus on just their own specialized parts. Leveraging something like libcamera for the remainder can reduce their software development workload, leading to faster time to market, lower support costs, and associated benefits to the bottom line that motivate adoption by corporations.

But like every new interface design born of a grandiose vision, there’s a chicken-and-egg problem: application developers won’t consume it if there’s no hardware, and hardware manufacturers won’t implement it if no applications use it. For the consumer side, libcamera has modules to interoperate with V4L2 and other popular interfaces. For the hardware side, it helps to have a company with wide reach that believes in opening what they can and isolating the pieces they can’t. This is where the Raspberry Pi Foundation found a fit.
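On the application side, the entry point is pleasantly small. Here is a minimal enumeration sketch patterned on libcamera’s introductory examples; the API is still young and evolving, so treat this as illustrative rather than definitive:

```cpp
// Minimal libcamera sketch: start the framework and list the cameras
// it discovered. Based on libcamera's getting-started examples.
#include <iostream>
#include <memory>
#include <libcamera/libcamera.h>

int main() {
    auto cm = std::make_unique<libcamera::CameraManager>();
    cm->start();  // brings up the framework and probes for cameras

    for (const auto& camera : cm->cameras())
        std::cout << "Found camera: " << camera->id() << '\n';

    cm->stop();
    return 0;
}
```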

The initial release doesn’t support their new High-Quality Camera Module, though that is promised soon. There is still a lot of work to be done in the short term, but we are excited about the long-term possibilities. If libcamera can indeed lower the barrier to entry, it would encourage innovation and expand the set of cameras beyond the officially supported list. We certainly have no shortage of offbeat camera sensor ideas around here, from a 1-kilopixel camera sensor to a decapped DRAM chip.

[via Hackster.io]

Under The Hood Of Second Reality, PC Demoscene Landmark

In 1993, IBM PCs and clones were a significant but not dominant fraction of the home computer market, saddled with the stigma of boring business machines: lacking the Apple Macintosh’s polish, unable to match the Apple II’s software library, and missing the Commodore Amiga’s audio/visual capabilities. The Amiga was the default platform of choice for impressive demos, but some demoscene hackers saw the PC’s potential to blow some minds. [Future Crew] was such a team, and their Second Reality accomplished exactly that. People who remember it, or are interested in a trip back in time, should take [Fabien Sanglard]’s tour of the Second Reality source code.

We recently covered another impressive PC demo executed in just 256 bytes, for which several commenters were thankful the author shared how it was done. Source code for demos isn’t necessarily released: the primary objective is to put on a show, and some authors want to keep a few tricks secret. [Future Crew] didn’t release the source for Second Reality until the 20th anniversary of its premiere, by which time it was difficult to run on a modern PC. Technically it is supported by DOSBox, but it runs rife with glitches because Second Reality uses so many nonstandard tricks. The easiest way to revisit the nostalgia is via video captures posted to YouTube (one is embedded below the break).

A PC from 1993 is primitive by modern standards. It was well before the age of GPUs, in fact before floating-point hardware was commonplace: Intel’s 80387 math co-processor was a separate add-on to the 80386 CPU. With the kind of hardware at our disposal today, it can be hard to appreciate what a technical achievement Second Reality was. But PC users of the time understood, sharing it and dropping jaws well beyond the demoscene community. Its spread was as close to “going viral” as possible at a time when “high speed data” meant anything faster than 2400 baud.

Many members of [Future Crew] went on to make an impact elsewhere in the industry, and their influence spread far and wide. But PC graphics wasn’t done blowing minds in 1993 just yet… December 10th of that year would see the public shareware release of a little thing called Doom.

Continue reading “Under The Hood Of Second Reality, PC Demoscene Landmark”

Pouring Creativity Into Musical Upcycling Of Plastic Bottles

Convenient and inexpensive, plastic beverage bottles are ubiquitous in modern society. Many of us have a collection of empties at home. We are encouraged to reduce, reuse, and recycle such plastic products, and [Kaboom Percussion] playing Disney melodies on their Bottlephone 2.0 (video embedded below) showcases an outstanding melodic creation for the “reuse” column.

Details of this project are outlined in a separate “How we made it” video (also embedded below). The caps of empty bottles are fitted with commodity TR414 air valves, and the pitch of each bottle is tuned by adjusting its internal pressure. Different beverage brands were evaluated for the pleasing tone of their bottles, with the winners listed. With pressure levels going up to 70 psi, changes in temperature and inevitable air leakage make keeping this instrument in tune a never-ending task. But that is a relatively simple mechanical procedure; what’s even more impressive on display is the musical performance talent of this team, assisted by some creative video editing. Sadly for us, such skill does not come in a bottle. Alcohol only makes us believe we are skilled without improving actual skill.
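As a toy illustration of that retuning chore, here is a sketch that interpolates the pressure needed to hit a target pitch from a few measured points. The calibration numbers are entirely made up; the real pressure-to-pitch curve would have to be measured per bottle:

```cpp
// Toy tuning helper: given measured (pressure, pitch) calibration points
// for one bottle, linearly interpolate the pressure for a target pitch.
// All numbers below are invented for illustration.
#include <vector>
#include <utility>
#include <cstdio>

// Calibration points: {pressure in psi, measured pitch in Hz}, ascending.
std::vector<std::pair<double, double>> cal = {
    {20.0, 392.0}, {40.0, 415.3}, {60.0, 440.0}, {70.0, 452.0}};

double pressureForPitch(double hz) {
    for (size_t i = 1; i < cal.size(); ++i) {
        if (hz <= cal[i].second) {
            // Linear interpolation between neighboring calibration points.
            double t = (hz - cal[i - 1].second) /
                       (cal[i].second - cal[i - 1].second);
            return cal[i - 1].first + t * (cal[i].first - cal[i - 1].first);
        }
    }
    return cal.back().first;  // clamp above the measured range
}

int main() {
    std::printf("A4 (440 Hz) needs roughly %.1f psi\n", pressureForPitch(440.0));
}
```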

But that’s OK; this is Hackaday, where we thrive on building machines to perform for us. We hope it won’t be long before someone builds a MIDI-controlled variant, perhaps incorporating an air compressor for self-tuning capabilities (a sketch of the idea follows below). We’ve featured bottles as musical instruments before, but usually as wind instruments like this bottle organ or the fipple. This is a percussion instrument, more along the lines of the wine glass organ. It’s great to see different combinations explored, and we are certain there are more yet to come.
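For the curious, a hypothetical note handler for such a MIDI variant might look something like this Arduino-flavored C++ sketch. The pin assignments, note range, and use of the Arduino MIDI Library are all assumptions, not a tested build:

```cpp
// Hypothetical MIDI bottlephone: map incoming notes to solenoid strikers,
// one per tuned bottle. Uses the Arduino MIDI Library (FortySevenEffects).
#include <MIDI.h>

MIDI_CREATE_DEFAULT_INSTANCE();

const int kFirstNote = 60;                   // assume bottles tuned C4 upward
const int kStrikerPins[] = {2, 3, 4, 5, 6};  // one solenoid driver per bottle
const int kNumBottles = sizeof(kStrikerPins) / sizeof(kStrikerPins[0]);

void handleNoteOn(byte channel, byte note, byte velocity) {
  int idx = note - kFirstNote;
  if (idx < 0 || idx >= kNumBottles) return;  // note outside our range
  digitalWrite(kStrikerPins[idx], HIGH);      // fire the striker
  delay(10);                                  // brief pulse (velocity ignored)
  digitalWrite(kStrikerPins[idx], LOW);
}

void setup() {
  for (int pin : kStrikerPins) pinMode(pin, OUTPUT);
  MIDI.setHandleNoteOn(handleNoteOn);
  MIDI.begin(MIDI_CHANNEL_OMNI);  // listen on all MIDI channels
}

void loop() {
  MIDI.read();  // dispatches to handleNoteOn as notes arrive
}
```

The self-tuning compressor would be the harder half, needing a pressure sensor and valve per bottle driven by a feedback loop against a pitch detector.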

Continue reading “Pouring Creativity Into Musical Upcycling Of Plastic Bottles”

Behind The Scenes Of Folding@Home: How Do You Fight A Virus With Distributed Computing?

A great big Thank You to everyone who answered the call to participate in Folding@Home, helping to understand the protein interactions of the SARS-CoV-2 virus that causes COVID-19. Some members of the FAH research team hosted an AMA (Ask Me Anything) session on Reddit to provide us with behind-the-scenes details. Unsurprisingly, the top two topics were “Why isn’t my computer doing anything?” and “What does this actually accomplish?”

The first is easier to answer. Thanks to people spreading the word — like the amazing growth of Team Hackaday — there has been a huge infusion of new participants. We could see this happening on the leaderboards, but in this AMA we have numbers direct from the source. Before this month there were roughly thirty thousand regular contributors. Since then, several hundred thousand more have started pitching in. This has overwhelmed their server infrastructure and resulted in what’s been termed a friendly-fire DDoS attack.

The most succinct information was posted by a folding support forum moderator.

Here’s a summary of the current Folding@Home situation:
* We know about the work unit shortage
* It’s happening because of an approximately 20x increase in demand
* We are working on it and hope to have a solution very soon.
* Keep your machines running, they will eventually fold on their own.
* Every time we double our server resources, the number of Donors trying to help goes up by a factor of 4, outstripping whatever we do.
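That last point is worth working through with numbers. A back-of-the-envelope sketch (with purely illustrative figures) shows why upgrades alone can’t close the gap: if capacity doubles per cycle while demand quadruples, the shortfall itself doubles every cycle.

```cpp
// Illustrating the moderator's scaling lament with normalized numbers:
// capacity doubles per upgrade cycle, demand quadruples, so the ratio
// of demand to capacity doubles every cycle.
#include <cstdio>

int main() {
    double capacity = 1.0, demand = 1.0;  // normalized starting point
    for (int cycle = 1; cycle <= 4; ++cycle) {
        capacity *= 2.0;  // "every time we double our server resources..."
        demand *= 4.0;    // "...the number of Donors goes up by a factor of 4"
        std::printf("cycle %d: demand is %.0fx capacity\n",
                    cycle, demand / capacity);
    }
}
```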

Why don’t they just buy more servers?

The answer can be found in the Folding@Home donation FAQ. Most of their research grants have restrictions on how that funding is spent, and these restrictions typically exclude capital equipment and infrastructure spending, meaning researchers can’t “just” buy more servers. Fortunately, they are optimistic that this recent fame has also attracted attention from donors with the right resources to help. As of this writing, their backend infrastructure has grown, though it has not yet caught up with the flood. They’re still working on it, so hang tight!

Computing hardware aside, there are human limitations on both the input and output sides of this distributed supercomputer. Folding@Home needs field experts to put together the work units sent out to our computers, and the same expertise is required to review and interpret our submitted results. The good news is that our contribution has sped up their iteration cycle tremendously. Results that used to take weeks or months now return in days, informing where the next set of work units should focus.

Continue reading “Behind The Scenes Of Folding@Home: How Do You Fight A Virus With Distributed Computing?”

Ingenious Hacks That Brought The Original Prince Of Persia To Life

For many 8-bit computing veterans, the original Prince of Persia game was our first exposure to fluid, life-like animation on screen. This groundbreaking technical achievement earned the game its place in nostalgia and history. Ars Technica invited its original creator [Jordan Mechner] to sit in front of a camera and talk through the many technical and game design challenges he had to solve. (Video embedded below. Bonus: the correct pronunciation of Karateka, straight from the creator’s mouth.)

Enjoy the journey back in time as [Jordan] breaks down the convoluted process behind Prince of Persia’s rotoscoped animation: from VCR footage, to film negatives, to tracing over them with black markers and white correction fluid to generate a high-contrast reference suitable for the (then) state-of-the-art digitizer. But generating those frames was just the beginning! They consumed the majority of an Apple II’s memory, so fighting memory constraints was a persistent headache. Fortunately for us, that limitation also motivated memorable elements such as our “Shadow Man” alter ego.
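Those high-contrast frames hint at why the memory fight was winnable at all: mostly-uniform images compress very well. As a generic illustration only, and emphatically not [Jordan]’s actual scheme, here is a toy run-length encoder for a 1-bit scanline:

```cpp
// Toy run-length encoder for a 1-bit scanline: emit (value, run) pairs.
// A generic illustration of squeezing high-contrast animation frames
// into scarce memory, not the actual Prince of Persia technique.
#include <cstdint>
#include <cstdio>
#include <vector>

std::vector<uint8_t> rleEncode(const std::vector<uint8_t>& bits) {
    std::vector<uint8_t> out;
    size_t i = 0;
    while (i < bits.size()) {
        uint8_t value = bits[i];
        uint8_t run = 0;
        while (i < bits.size() && bits[i] == value && run < 255) { ++i; ++run; }
        out.push_back(value);  // the pixel value (0 or 1)...
        out.push_back(run);    // ...and how many times it repeats
    }
    return out;
}

int main() {
    // A mostly-blank scanline (Apple II hi-res is 280 pixels wide)
    // with a 20-pixel figure in the middle: long runs pack down well.
    std::vector<uint8_t> line(280, 0);
    for (size_t i = 130; i < 150; ++i) line[i] = 1;
    auto packed = rleEncode(line);
    std::printf("%zu pixels packed into %zu bytes\n", line.size(), packed.size());
}
```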

This Prince of Persia feature is the latest episode of Ars Technica’s “War Stories” series, which invites the people behind notable games to talk about their work behind the scenes. The creators of Myst put a lot of effort into minimizing the impact of CD-ROM seek times, an entirely theoretical endeavour as they had no CD burner for verification. The creators of Crash Bandicoot paged game content in from CD in 64 kB chunks as the player progressed, allowing the creation of levels too large to fit in a PlayStation’s memory all at once. Read over these and other short synopses of the episodes so far, or go straight to their YouTube playlist.
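The Crash Bandicoot anecdote is a nice excuse for a sketch. This toy pager (file name and access pattern invented for illustration, not the actual engine code) reads a level file in fixed 64 kB chunks so the whole level never needs to be resident at once:

```cpp
// Toy streaming pager: fetch level data from storage in fixed 64 KB
// chunks as the player advances, evicting what is behind them.
#include <cstdio>
#include <cstdint>
#include <vector>

constexpr size_t kChunkSize = 64 * 1024;  // 64 KB pages, as in the anecdote

// Read chunk `index` of the level file into a reusable buffer.
bool loadChunk(std::FILE* f, size_t index, std::vector<uint8_t>& buf) {
    buf.resize(kChunkSize);
    if (std::fseek(f, static_cast<long>(index * kChunkSize), SEEK_SET) != 0)
        return false;
    size_t got = std::fread(buf.data(), 1, kChunkSize, f);
    buf.resize(got);  // final chunk may be short
    return got > 0;
}

int main() {
    std::FILE* f = std::fopen("level.bin", "rb");  // hypothetical level file
    if (!f) return 1;
    std::vector<uint8_t> chunk;
    // As the player progresses, fetch the next chunk; only one is resident.
    for (size_t i = 0; loadChunk(f, i, chunk); ++i)
        std::printf("chunk %zu: %zu bytes resident\n", i, chunk.size());
    std::fclose(f);
    return 0;
}
```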

If this talk of wrangling bits with 6502 assembly code has whetted your appetite for more, the source code for Prince of Persia is available for digging into. Don’t worry if you have long since lost track of your Apple II (or never had one), as the code can run in an emulator.

Continue reading “Ingenious Hacks That Brought The Original Prince Of Persia To Life”

DARPA Challenge Autonomous Robot Teams To Navigate Unfinished Nuclear Power Plant

Robots might be finding their footing above ground, but today’s autonomous robots have a difficult time operating underground. DARPA wanted to give the state of the art a push forward, so they are running the Subterranean (SubT) Challenge, which just wrapped up its latest round. A great review of this Urban Circuit competition (and some of the teams participating in it) has been published by IEEE Spectrum. This is the second of three underground problem subdomains presented to the participants, six months apart, preparing them for the final event, which will combine all three types.

If you missed the livestream or prefer edited highlight videos, they’re all part of DARPAtv’s Subterranean Challenge playlist. Today it starts with a compilation of Urban Circuit highlights and continues with other videos, including team profiles, video walkthroughs of the competition courses, actual competition footage, edited recap videos, and the awards ceremony. Half of the playlist is video from the Tunnels Circuit six months ago, so we can compare to see how teams performed and what they’ve learned along the way. Many more lessons were learned in the just-completed Urban Circuit, and teams will spend the next six months improving their robots. By then we’ll have the Caves Circuit competition, with teams ready to learn new lessons about operating robots underground.

Continue reading “DARPA Challenge Autonomous Robot Teams To Navigate Unfinished Nuclear Power Plant”