What Do You Want In A Programming Assistant?

The Propellerheads released a song in 1998 entitled “History Repeating.” If you don’t know it, the lyrics include: “They say the next big thing is here. That the revolution’s near. But to me, it seems quite clear. That it’s all just a little bit of history repeating.” The next big thing today seems to be the AI chatbots. We’ve heard every opinion, from the “revolutionize everything” camp to the “destroy everything” camp. But, really, isn’t it a bit of history repeating itself? We get new tech. Some oversell it. Some fear it. Then, in the end, it becomes part of the ordinary landscape and seems unremarkable in the light of the new next big thing. Dynamite, the steam engine, cars, TV, and the Internet were all predicted to “ruin everything” at some point in the past.

History really does repeat itself. After all, when X-rays were discovered, they were claimed to cure pneumonia and other infections, among other supposed miracle cures. Those claims didn’t pan out, but we still use X-rays for the things they are good at. Calculators were going to ruin math classes. There are plenty of other examples.

This came to mind because a recent post from ACM takes the contrary view that chatbots aren’t able to help real programmers. We’ve also seen that, just maybe, they can, in limited ways. We suspect it is like getting a new, larger monitor. At first, it seems huge. But in a week, it is just the normal monitor, and your old one, which had been perfectly adequate, seems tiny.

But we think there’s a larger point here. Maybe the chatbots will help programmers. Maybe they won’t. But clearly, programmers want some kind of help. We just aren’t sure what kind of help it is. Do we really want CoPilot to write our code for us? Do we want to ask Bard or ChatGPT/Bing what is the best way to balance a B-tree? Asking AI to do static code analysis seems to work pretty well.

So maybe your path to fame, and perhaps even riches, is to figure out what people actually want in an automated programming assistant, AI-based or not, and build it. The home computer idea languished until someone figured out what people actually wanted to do with one. Video cassettes didn’t make it into the home until companies figured out what people most wanted to watch on them.

How much and what kind of help do you want when you program? Or design a circuit or PCB? Or even a 3D model? Maybe AI isn’t going to take your job; it will just make it easier. We doubt, though, that it can much improve on Dame Shirley Bassey’s history lesson.

An Almost Invisible Desktop

When you’re putting together a computer workstation, what would you say is the cleanest setup? Wireless mouse and keyboard? Super-discreet cable management? How about no visible keeb, no visible mouse, and no obvious display?

That’s what [Basically Homeless] was going for. Utilizing a Flexispot E7 electronically raisable standing desk, an ASUS laptop, and some other off-the-shelf parts, this project is taking the idea of decluttering to the extreme, with no visible peripherals and no visible wires.

There was clearly a lot of learning and much painful experimentation involved, and [Basically Homeless] kind of glossed over how the keyboard was embedded in the desk surface. A thin layer of resin was formed in-plane with the desk surface, with the keyboard mounted just below it; lots of careful fettling of the openings meant the keys could still be depressed. Since they don’t stand proud of the surface, the keys are practically invisible once painted. After all, you need that tactile feedback, and a projection keeb just isn’t right.

ChatGPT-inspired machine learning mouse emulator

Moving on, never mind an ultralight gaming mouse, how about a zero-gram mouse? Well, this is a bit of a cheat, as they mounted a depth-sensing camera inside a light fitting above the desk, and built a ChatGPT-designed machine-learning model to act as a hand-tracking HID device. Nice idea, but we don’t see the code.
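Since the code isn’t published, we can only guess at the details, but the broad strokes are easy to sketch. Here’s a minimal stand-in using an ordinary webcam, MediaPipe hand landmarks, and pyautogui to push the cursor around; the original build used a ceiling-mounted depth camera and its own ChatGPT-assisted model, so treat everything below as an illustration rather than [Basically Homeless]’s implementation.

```python
# Minimal "air mouse" sketch -- not [Basically Homeless]'s code.
# Assumes an ordinary webcam, MediaPipe Hands for landmarks, and pyautogui to
# move the cursor; the original build used a depth camera and a custom model.
import cv2
import mediapipe as mp
import pyautogui

screen_w, screen_h = pyautogui.size()
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe wants RGB; OpenCV delivers BGR
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        # Landmark 8 is the index fingertip; coordinates are normalized 0..1
        tip = result.multi_hand_landmarks[0].landmark[8]
        pyautogui.moveTo(int(tip.x * screen_w), int(tip.y * screen_h))
    cv2.imshow("hand tracker", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```

Click gestures, smoothing, and the depth channel are left as an exercise, which is exactly where the real project’s model earns its keep.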

The laptop chassis had its display removed and was embedded into the bottom of the desk, along with the supporting power supplies, a couple of fans, and a projector. To create a ‘floating’ display, a piece of transparent plastic was treated to a coating of Lux Labs “ClearBright” transparent display film, which allows the image from the projector to be scattered and observed with sufficient clarity to be usable as a PC display. We have to admit, it looks a bit gimmicky, but playing Minecraft on this setup looks like a whole lotta fun.

Many of the floating displays we’ve covered have been for clocks (after all, timepieces are important), like this sweet HUD hack.


Hackaday Prize 2023: LASK4 Watches Those Finger Wiggles

What do you get when you combine an ESP32-S2, a machine-learning model, some Hall effect sensors, and a grip exercise toy? [Turfptax] did just that and created LASK4. The grip trainer’s four spring-loaded pistons each carry a tiny magnet. Hall effect sensors track each piston’s position, and since the springs are linear, the ESP32 can also estimate the force applied by each finger. This data is then streamed to a nearby computer over TCP. A small OLED screen shows the status, and a tidy 3D printed case rounds it all off into a comfortable package.
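The firmware isn’t reproduced here, but the sensing idea is simple enough to sketch. Below is a rough MicroPython-flavored version: read each Hall sensor through the ADC, convert the reading to piston travel, apply Hooke’s law, and push the result over TCP. The pin numbers, calibration constants, spring rate, and packet format are all our own guesses, not values from the LASK4 code.

```python
# Rough MicroPython-style sketch of the LASK4 sensing loop -- not the project's
# firmware. Pin numbers, calibration constants, spring rate, and the packet
# format are guesses for illustration only.
import json
import socket
import time
from machine import ADC, Pin

HALL_PINS = (1, 2, 3, 4)   # one analog Hall sensor per piston (assumed pins)
ADC_AT_REST = 9000         # raw reading with a piston fully extended (calibrate!)
COUNTS_PER_MM = 550        # raw counts per millimetre of piston travel (calibrate!)
SPRING_K = 0.9             # N/mm -- the springs are linear, so F = k * x

adcs = [ADC(Pin(p)) for p in HALL_PINS]

sock = socket.socket()
sock.connect(("192.168.1.50", 5000))   # host collecting the training data

while True:
    forces = []
    for adc in adcs:
        travel_mm = max(0, adc.read_u16() - ADC_AT_REST) / COUNTS_PER_MM
        forces.append(round(SPRING_K * travel_mm, 2))   # Hooke's law per finger
    sock.send((json.dumps(forces) + "\n").encode())
    time.sleep_ms(20)                                   # ~50 samples per second
```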

So other than being an excellent musical instrument, what is this good for? First, when combined with the muscle sensor band we covered previously, it produces well-labeled training data. The muscle band reads a set of pressure sensors arranged radially around the forearm. With just a few minutes of training data, the system can accurately predict finger movement using a random forest regression model.
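If you’re wondering what that training step looks like, it’s refreshingly mundane. Here’s a scikit-learn sketch of the idea: pressure readings from the muscle band as inputs, LASK4’s per-finger forces as labels, and a random forest regressor in between. The file name and column layout are placeholders rather than the project’s actual data format.

```python
# Sketch of the training step with scikit-learn. The CSV layout and column
# counts are placeholders, not OpenMuscle's or LASK4's actual data format.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

data = np.loadtxt("session.csv", delimiter=",")   # hypothetical paired recording
X = data[:, :12]   # 12 radial pressure readings from the muscle band
y = data[:, 12:]   # 4 per-finger forces from LASK4, used as ground truth labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

model = RandomForestRegressor(n_estimators=100)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```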

What would you use it for? As a somatosensory device, it could provide feedback during hand rehabilitation sessions in physical therapy. Or it could be used to train a controller efficiently.

It’s an exciting project, available on GitHub under a CERN Open Hardware License. The code is in MicroPython, and the PCB and STL files are included. We’re looking forward to seeing what else comes from the project. After the break, there’s a progress update video.


AI Creates Killer Drug

Researchers in Canada and the United States have used deep learning to derive an antibiotic that can attack a resistant microbe, Acinetobacter baumannii, which can infect wounds and cause pneumonia. According to the BBC, a paper in Nature Chemical Biology describes how the researchers used training data that measured known drugs' action on the tough bacterium. The learning algorithm then predicted the effectiveness of 6,680 compounds for which there was no such data.

In an hour and a half, the program reduced the list to 240 promising candidates. Testing in the lab found that nine of these were effective and that one, now called abaucin, was extremely potent. While doing lab tests on 240 compounds sounds like a lot of work, it is better than testing nearly 6,700.
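To be clear, the paper’s pipeline is far more sophisticated than anything we can show here, but the overall shape of the workflow (train on compounds with measured activity, score the untested library, keep the top of the list for the wet lab) is easy to caricature in a few lines. Everything below is a toy: random stand-in feature vectors and an off-the-shelf scikit-learn classifier instead of a deep model.

```python
# Toy illustration of "train on measured activity, then rank an untested
# library" -- NOT the pipeline from the Nature Chemical Biology paper.
# Features are random stand-ins, and a scikit-learn classifier stands in for
# the deep model; only the shape of the workflow is the point here.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Hypothetical training set: molecular feature vectors plus a measured
# inhibited / not-inhibited label from the growth assay
X_train = rng.random((7500, 256))
y_train = rng.integers(0, 2, 7500)

# Untested library to screen in silico
X_library = rng.random((6680, 256))

model = GradientBoostingClassifier().fit(X_train, y_train)
scores = model.predict_proba(X_library)[:, 1]   # predicted probability of a hit

shortlist = np.argsort(scores)[::-1][:240]      # candidates to test in the lab
print("Top candidate indices:", shortlist[:10])
```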

Interestingly, the new antibiotic seems only to be effective against the target microbe, which is a plus. It isn’t available for people yet and may not be for some time — drug testing being what it is. However, this is still a great example of how machine learning can augment human brainpower, letting scientists and others focus on what’s really important.

The WHO identified Acinetobacter baumannii as one of the major superbugs threatening the world, so a weapon against it would be very welcome. You can hope that this technique will drastically cut the time it takes to develop new drugs. It also makes you wonder whether there are other fields where AI techniques could quickly cull the alternatives, letting humans focus on the more promising candidates.

Want to catch up on machine learning algorithms? Google can help. Or dive into an even longer course.

Hackaday Prize 2023: Hearing Sirens When Drivers Can’t

[Jan Říha]’s PionEar device is a wonderful entry to the Assistive Tech portion of the 2023 Hackaday Prize. It’s a small unit intended to perch within view of the driver in a vehicle, and it has one job: flash a light whenever a siren is detected. It is intended to provide drivers with a better awareness of emergency vehicles, because they are so often heard well before they are seen, and their presence disrupts the usual flow of the road. [Jan] learned that there was a positive response in the Deaf and hard of hearing communities to a device like this; roads get safer when one has early warning.

Deaf and hard of hearing folks are perfectly capable of driving; after all, not being able to hear is no barrier to obeying the rules of the road. Even so, a device like this can improve some drivers’ awareness of their surroundings, which translates to greater safety. For those with partial hearing loss, higher frequencies tend to suffer the most attenuation, and high-pitched sirens fall right into that range.

The PionEar leverages embedded machine learning to identify sirens, which is a fantastic application of the technology. Machine learning, after all, is a good fit for the kinds of problems humans struggle to write an explicit program to solve. Singling out the presence of a siren in live environmental audio definitely qualifies.
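The project write-up doesn’t spell out the exact model or toolchain, so here’s a desktop-Python caricature of the loop such a device has to run: grab a short window of audio, turn it into a log-mel spectrogram, hand it to a small pre-trained classifier, and fire the indicator when the siren score crosses a threshold. The libraries, model file, and threshold are all our assumptions, not PionEar’s firmware.

```python
# Desktop-Python caricature of a siren-spotting loop -- not PionEar's embedded
# firmware. Audio capture, features, model file, and threshold are assumptions.
import numpy as np
import sounddevice as sd
import librosa
from tflite_runtime.interpreter import Interpreter

SR = 16000          # sample rate in Hz
WINDOW_S = 1.0      # analyze one-second chunks
THRESHOLD = 0.8     # confidence needed before we flash the light

interpreter = Interpreter(model_path="siren_classifier.tflite")  # hypothetical model
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

while True:
    # Record one second of mono audio and wait for it to finish
    audio = sd.rec(int(SR * WINDOW_S), samplerate=SR, channels=1, dtype="float32")
    sd.wait()
    # Log-mel spectrogram as the model's input features
    mel = librosa.feature.melspectrogram(y=audio.ravel(), sr=SR, n_mels=40)
    feats = librosa.power_to_db(mel).astype(np.float32)[np.newaxis, ..., np.newaxis]
    interpreter.set_tensor(inp["index"], feats)
    interpreter.invoke()
    siren_prob = float(interpreter.get_tensor(out["index"])[0][0])
    if siren_prob > THRESHOLD:
        print("Siren detected -- flash the warning LED")
```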

We also like the clever way that [Jan] embedded an LED light guide into the 3D-printed enclosure: by making a channel and pouring in a small amount of white resin intended for 3D printers. Cure the resin with a UV light, and one is left with an awfully good light guide that doubles as a diffuser. You can see it all in action in a short video, just under the page break.


Self-Driving Library For Python

Fully autonomous vehicles seem to perennially be just a few years away, sort of like the automotive equivalent of fusion power. But just because robotic vehicles haven’t made much progress on our roadways doesn’t mean we can’t play with the technology at the hobbyist level. You can embark on your own experimentation right now with this open source self-driving Python library.

Granted, this is a library built for much smaller vehicles, but it’s still quite full-featured. Known as Donkey Car, it’s mostly intended for what would otherwise be remote-controlled cars or robotics platforms. The library is built to be as minimalist as possible, with modularity as a design principle, and it includes the ability to self-drive with computer vision using machine-learning algorithms. It can also log sensor data and interface with various controllers, either physical devices or something like a browser-based interface.
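If you’re curious what a Donkey Car application actually looks like, the core pattern is a loop of “parts” wired together by named data channels; anything with a run() method can be a part. Here’s a bare-bones sketch of that pattern. The two parts are our own placeholders (a real build would use the library’s camera and a trained autopilot), and the exact constructor arguments can shift between releases.

```python
# Bare-bones sketch of Donkey Car's "parts" pattern. Only Vehicle, add(), and
# start() come from the library; both parts below are our own stand-ins, not
# classes shipped with donkeycar.
import numpy as np
import donkeycar as dk


class DummyCamera:
    """Stand-in camera part that returns a blank 160x120 frame."""
    def run(self):
        return np.zeros((120, 160, 3), dtype=np.uint8)


class DummyPilot:
    """Stand-in autopilot; a real build would drop in a trained model here."""
    def run(self, image):
        steering, throttle = 0.0, 0.2   # a trained model would predict these
        return steering, throttle


V = dk.vehicle.Vehicle()
V.add(DummyCamera(), outputs=["cam/image"])
V.add(DummyPilot(), inputs=["cam/image"], outputs=["pilot/steering", "pilot/throttle"])
V.start(rate_hz=20, max_loop_count=100)   # run the loop at 20 Hz for 100 iterations
```

Swap the placeholders for the camera, steering, and throttle parts that ship with the library and you have the skeleton of a working car.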

A complete platform costs around $250 in parts, but most of what’s needed for a Donkey Car-compatible build is easily sourced. It won’t be too long before your own RC vehicle has more “full self-driving” capability than a Tesla, and potentially less risk of a major security vulnerability as well.

Hackaday Prize 2023: Finger Tracking Via Muscle Sensors

Whether you want to build a computer interface device, or control a prosthetic hand, having some idea of a user’s finger movements can be useful. The OpenMuscle finger tracking sensor can offer the data you need, and it’s a device you can readily build in your own workshop.

The device consists of a wrist cuff that mounts twelve pressure sensors, arranged radially about the forearm. The pressure sensors are a custom design, using magnets, Hall effect sensors, and springs to detect the motion of the muscles in the vicinity of the wrist.

We first looked at this project last year, and since then, it’s advanced by leaps and bounds. The raw data from the pressure sensors now feeds into a trained machine learning model, which then predicts the user’s actual finger movements. The long-term goal is to create a device that can control prosthetic hands based on muscle contractions in the forearm. Ideally, this would be super-intuitive to use, requiring a minimum of practice and training for the end user.
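On the inference side, using the trained model is about as simple as it gets: read a frame of twelve pressure values, ask the regressor for per-finger estimates, and map those to actuator commands. The serial port, packet format, and servo scaling below are our assumptions rather than anything from the OpenMuscle repository.

```python
# Sketch of the inference side: live twelve-channel readings in, per-finger
# estimates out. The serial port, packet format, and servo scaling are our
# assumptions, not code from the OpenMuscle repository.
import json
import joblib
import serial

model = joblib.load("openmuscle_rf.joblib")   # regressor trained as sketched earlier
port = serial.Serial("/dev/ttyUSB0", 115200)  # cuff streaming JSON lines (assumed)

while True:
    pressures = json.loads(port.readline())   # twelve radial pressure values
    fingers = model.predict([pressures])[0]   # estimated force for each finger
    # Map each estimate to a 0-180 degree command for a prosthetic digit's servo
    angles = [max(0, min(180, int(f * 45))) for f in fingers]
    print(angles)
```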

It’s great to see machine learning combined with innovative mechanical design to serve a real need. We can’t wait to see where the OpenMuscle project goes next.
