Machine Learning Baby Monitor, Part 2: Learning Sleep Patterns

The first lesson a new parent learns is that the second you think you’ve finally figured out your kid’s patterns — sleeping, eating, pooping, crying endlessly in the middle of the night for no apparent reason, whatever — the kid will change it. It’s the Uncertainty Principle of kids — the mere act of observing the pattern changes it, and you’re back at square one.

As immutable as this rule seems, [Caleb Olson] is convinced he can work around it with this over-engineered sleep pattern tracker. You may recall [Caleb]’s earlier attempts to automate certain aspects of parenthood, like this machine learning system to predict when baby is hungry; and yes, he’s also strangely obsessed with automating his dog’s bathroom habits. All that preliminary work put [Caleb] in a good position to analyze his son’s sleep patterns, which he did with the feed from their baby monitor camera and Google’s MediaPipe library.

This lets him measure how much the baby’s eyes are open, calculate a wakefulness probability, and record the time he wakes up. This worked great right up until the wave function collapsed and the baby suddenly started sleeping on his side, requiring the addition of a general motion detection function to compensate for the missing eyeball data. Check out the video below for more details, although the less said about the screaming, demon-possessed owl, the better.
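For a sense of what the wakefulness math might involve, here’s a minimal sketch of the per-frame logic, assuming the eye landmarks (from MediaPipe) and a motion score are computed upstream; the smoothing factor, the threshold, and every name below are our guesses, not anything from [Caleb]’s actual code.

```cpp
// A sketch of the idea only: smooth per-frame evidence of wakefulness into a
// probability, falling back to motion when the eyes can't be seen.
#include <cstdio>

struct SleepEstimator {
    float wakefulness = 0.0f;   // smoothed 0..1 estimate
    float alpha = 0.05f;        // EMA smoothing factor (assumed value)

    // eyeOpenness: 0 (closed) to 1 (wide open), or < 0 if no eyes detected
    // motionScore: 0 (still) to 1 (squirming), e.g. from frame differencing
    void update(float eyeOpenness, float motionScore) {
        // Side-sleeping hides the eyes, so fall back to general motion
        float evidence = (eyeOpenness >= 0.0f) ? eyeOpenness : motionScore;
        wakefulness = alpha * evidence + (1.0f - alpha) * wakefulness;
    }

    bool isAwake() const { return wakefulness > 0.6f; }   // threshold is a guess
};

int main() {
    SleepEstimator est;
    // Simulated frames: eyes nearly closed, then hidden but wiggling
    float eyes[]   = {0.10f, 0.05f, -1.0f, -1.0f};
    float motion[] = {0.00f, 0.00f,  0.8f,  0.9f};
    for (int i = 0; i < 4; ++i) {
        est.update(eyes[i], motion[i]);
        std::printf("frame %d: wakefulness %.2f, awake=%d\n",
                    i, est.wakefulness, est.isAwake());
    }
    return 0;
}
```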

The data [Caleb] has collected has helped him and his wife understand the little fellow’s sleep needs and fine-tune his cycles. There’s a web app, of course, and a really nice graphical representation of total time asleep and awake. No word on naps not taken in view of the camera, though — naps in the car are an absolute godsend for many parents. We suppose those could be entered manually, but we wouldn’t be surprised if [Caleb] has a plan to cover that too.

Continue reading “Machine Learning Baby Monitor, Part 2: Learning Sleep Patterns”

Smart Bike Suspension Tunes Your Ride On The Fly

Riding a bike is a pretty simple affair, but like with many things, technology marches on and adds complications. Where once all you had to worry about was pumping the cranks and shifting the gears, now a lot of bikes have front suspensions that need to be adjusted for different riding conditions. Great for efficiency and ride comfort, but a little tough to accomplish while you’re underway.

Luckily, there’s a solution to that, in the form of this active suspension system by [Jallson S]. The active bit is a servo, which is attached to the adjustment valve on the top of the front fork of the bike. The servo moves the valve between fully locked, for smooth surfaces, and wide open, for rough terrain. There’s also a stop in between, which partially softens the suspension for moderate terrain. The 9-gram hobby servo rotates the valve with the help of a 3D printed gear train.

But that’s not all. Rather than just letting the rider control the ride stiffness from a handlebar-mounted switch, [Jallson S] added a little intelligence into the mix. Ride data from the accelerometer on an Arduino Nano 33 BLE Sense was captured on a smartphone via the Arduino Science Journal app, then processed in Edge Impulse Studio to create models for five different ride surfaces and riding styles. This allows the stiffness to be optimized for the current ride conditions — check it out in action in the video below.
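For a flavor of the actuator end of things, here’s a hedged Arduino-style sketch of how a classified surface could be mapped onto the three valve positions; the servo pin, the angles, the class indices, and the stubbed-out classifier are all our assumptions rather than [Jallson S]’s firmware, where the label would come from the Edge Impulse inferencing library generated for his model.

```cpp
// Hypothetical sketch: turn a ride-surface class into a suspension valve angle.
#include <Servo.h>

Servo valveServo;
const int SERVO_PIN = 9;        // assumed wiring

// Valve angles in degrees -- assumed calibration values
const int ANGLE_LOCKED  = 0;    // smooth surfaces
const int ANGLE_PARTIAL = 45;   // moderate terrain
const int ANGLE_OPEN    = 90;   // rough terrain

// Placeholder: in the real build this would run an accelerometer window
// through the trained Edge Impulse model and return the winning class index.
int classifySurface() {
    return 0;
}

void setup() {
    valveServo.attach(SERVO_PIN);
}

void loop() {
    switch (classifySurface()) {
        case 0: valveServo.write(ANGLE_LOCKED);  break;   // e.g. asphalt
        case 1: valveServo.write(ANGLE_PARTIAL); break;   // e.g. gravel
        default: valveServo.write(ANGLE_OPEN);   break;   // rough trail
    }
    delay(500);   // re-evaluate twice a second (arbitrary)
}
```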

[Jallson S] is quick to point out that this is a prototype, and that niceties like weatherproofing still have to be addressed. But it seems like a solid start — now let’s see it teamed up with an Arduino shifter.

Continue reading “Smart Bike Suspension Tunes Your Ride On The Fly”


Hackaday Links: January 22, 2023

The media got their collective knickers in a twist this week with the news that Wyoming is banning the sale of electric vehicles in the state. Headlines like that certainly raise eyebrows, which is the intention, of course, but even a quick glance at the proposed legislation might have revealed that the “ban” was nothing more than a non-binding resolution, making this little more than a political stunt. The bill, which would only “encourage” the phase-out of EV sales in the state by 2035, is essentially meaningless, especially since it died in committee before ever coming close to a vote. But it does present a somewhat lengthy list of the authors’ beefs with EVs, which mainly focus on the importance of the fossil fuel industry in Wyoming. It’s all pretty boneheaded, but then again, outright bans on ICE vehicle sales by some arbitrary and unrealistically soon deadline don’t seem too smart either. Couldn’t people just decide what car works best for them?

Speaking of which, a man in neighboring Colorado might have had a touch of buyer’s remorse when he learned that it would take five days to fully charge his brand-new electric Hummer at home. Granted, he bought the biggest battery pack possible — 250 kWh — and is using a standard 120-volt wall outlet and the stock Hummer charging dongle, which adds one mile (1.6 km) to the vehicle’s range every hour. The owner doesn’t actually seem all that surprised by the results, nor does he seem particularly upset by it; he appears to know enough about the realities of EVs to recognize the need for a Level 2 charger. That entails extra expense, of course, both to procure the charger and to run the 240-volt circuit needed to power it, not to mention paying for the electricity. It’s a problem that will only get worse as more chargers are added to our creaky grid; we’re not sure what the solution is, but we’re pretty sure it’ll be found closer to the engineering end of the spectrum than the political end.

Continue reading “Hackaday Links: January 22, 2023”

Giving An Old Typewriter A Mind Of Its Own With GPT-3

There was an all-too-brief period in history where typewriters went from clunky, purely mechanical beasts to streamlined, portable electromechanical devices. But when the 80s came around and the PC revolution started, the typewriting was on the wall for these machines, and by the 90s everyone had a PC, a printer, and Microsoft Word. And thus the little daisy-wheel typewriters began to populate thrift shops all over the world.

That’s fine with us, because it gave [Arvind Sanjeev] a chance to build “Ghostwriter”, an AI-powered automatic typewriter. The donor machine was a clapped-out Brother electronic typewriter, which needed a bit of TLC to bring it back to working condition. From there, [Arvind] worked out the keyboard matrix and programmed an Arduino to drive the typewriter in both directions, reading keystrokes and typing output. A Raspberry Pi running the OpenAI Python API for GPT-3 talks to the Arduino over serial, which basically means you can enter a GPT writing prompt with the keyboard and have the machine spit out a dead-tree version of the results.

To mix things up a bit, [Arvind] added a pair of pots to control the creativity and length of the response, plus an OLED screen which seems only to provide some cute animations, which we don’t hate. We also don’t hate the new paint job the typewriter got, but the jury is still out on the “poetry” that it typed up. Eye of the beholder, we suppose.
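Just to illustrate how simple the serial link between the two boards can be, here’s a hedged sketch of the Arduino end; the baud rate, pot pins, keying delay, and the stubbed matrix driver are our inventions, and the real protocol is whatever [Arvind] actually settled on.

```cpp
// Hypothetical Arduino side: report the two knobs to the Pi, and "type" any
// text the Pi sends back, one character at a time.
const unsigned long KEY_DELAY_MS = 80;   // assumed: give the daisy wheel time

// Placeholder for the hardware-specific part: close the keyboard-matrix
// crossing for one ASCII character, then release it.
void typeChar(char c) {
    (void)c;   // row/column lookup and switch drive would go here
}

void setup() {
    Serial.begin(9600);   // assumed baud rate to the Raspberry Pi
}

void loop() {
    // Forward the "creativity" and length pots; the Pi would map these onto
    // the GPT-3 temperature and max-token parameters.
    static unsigned long lastReport = 0;
    if (millis() - lastReport > 1000) {
        Serial.print("POT:");
        Serial.print(analogRead(A0));    // creativity knob (assumed pin)
        Serial.print(',');
        Serial.println(analogRead(A1));  // length knob (assumed pin)
        lastReport = millis();
    }

    // Type whatever response text comes back over serial
    while (Serial.available()) {
        typeChar((char)Serial.read());
        delay(KEY_DELAY_MS);
    }
}
```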

Whatever you think of GPT’s capabilities, this is still a neat build and a nice reuse of otherwise dead-end electronics. Need a bit more help building natural language AI into your next project? Our own [Donald Papp] will get you up to speed on that.

Continue reading “Giving An Old Typewriter A Mind Of Its Own With GPT-3”

Machine Learning Makes Sure Your LOLs Are Genuine

There was a time not too long ago when “LOL” actually meant something online. If someone went through the trouble of putting LOL into an email or text, you could be sure they were actually LOL-ing while they were typing — it was part of the social compact that made the Internet such a wholesome and inviting place. But no more — LOL has been reduced to a mere punctuation mark, with no guarantee that the sender was actually laughing, chuckling, chortling, or even snickering. What have we become?

To put an end to this madness, [Brian Moore] has come up with the LOL verifier. Like darn near every project we see these days, it uses a machine learning algorithm — Edge Impulse in this case. It detects a laugh by comparing audio input against an exhaustive model of [Brian]’s jocular outbursts — he says it took nearly three full minutes to collect the training set. A Teensy 4.1 takes care of HID duties; if a typed “LOL” correlates to some variety of laugh, the initialism is verified with a time and date stamp. If your LOL was judged insincere, well, that’s on you. See what you think of the short video below — we genuinely LOL’d. And while we’re looking forward to a ROTFL verifier, we’re not sure we want to see his take on LMAO.
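If you’re curious how the verification step might hang together on the Teensy, here’s a hedged sketch with the audio classifier and the keystroke watching stubbed out; the grace period and all of the names are ours rather than [Brian]’s firmware, and the Teensy’s USB type has to be set to include a keyboard for the HID part to work.

```cpp
// Hypothetical Teensy 4.1 sketch: if a laugh was heard within a few seconds
// of "LOL" being typed, append a verification stamp via USB HID.
#include <stdio.h>
#include <TimeLib.h>   // time/date, synced from the Teensy's built-in RTC

const unsigned long LAUGH_WINDOW_MS = 5000;   // assumed grace period
unsigned long lastLaughMs = 0;

time_t getTeensyTime() { return Teensy3Clock.get(); }

// Would be fired by the audio model when a genuine laugh is classified
void onLaughDetected() { lastLaughMs = millis(); }

// Would be fired when the typed buffer ends in "LOL"
void onLolTyped() {
    if (millis() - lastLaughMs < LAUGH_WINDOW_MS) {
        char stamp[48];
        snprintf(stamp, sizeof(stamp), " [LOL verified %02d:%02d %02d/%02d/%04d]",
                 hour(), minute(), month(), day(), year());
        Keyboard.print(stamp);   // needs USB Type set to a Keyboard variant
    }
}

void setup() {
    setSyncProvider(getTeensyTime);   // pull wall-clock time from the RTC
}

void loop() {}
```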

Hats off to [Brian] for his attempt to enforce some kind of standards online. You may recall his earlier attempt to make leaving Zoom calls a little less awkward, which we also appreciate.

Continue reading “Machine Learning Makes Sure Your LOLs Are Genuine”

Shopping Cart Does The Tedious Work For You

Thanks to modern microcontrollers, basic home automation jobs such as turning lights on and off, opening blinds, and various other simple tasks have become common DIY projects. But with the advent of artificial intelligence and machine learning, the number of tasks that can be offloaded to computers has skyrocketed. This shopping cart that automates away the checkout lines at grocery stores certainly fits into this category.

The project was inspired by the cashierless Amazon stores where customers simply walk into a store, grab what they want, and leave. This is made possible by the fact that computers monitor their purchases and charge them automatically, but creator [kutluhan_aktar] wanted to explore a way of doing this without a fleet of sensors and cameras all over a store. By mounting the hardware to a shopping cart instead, the sensors travel with the shopper and monitor what’s placed in the cart instead of what’s taken from a shelf. It’s built around the OpenMV Cam H7, a microcontroller paired with a camera specifically designed for these types of tasks, and the custom circuitry inside the case also includes WiFi connectivity to make sure the shopping cart can report its findings properly.
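The cart’s vision code runs in the OpenMV Cam’s MicroPython environment, but the bookkeeping it feeds is simple enough to sketch as a standalone program; the labels, prices, and stubbed classifier below are purely illustrative and have nothing to do with [kutluhan_aktar]’s actual model or backend.

```cpp
// Illustrative only: the running-total bookkeeping once the vision model has
// named the item just placed in the basket.
#include <cstdio>
#include <map>
#include <string>

// Made-up price list for the sketch
std::map<std::string, double> priceList = {
    {"milk", 2.49}, {"bread", 1.99}, {"coffee", 7.49}
};

// Placeholder for the on-cart classifier: returns the label of the item the
// camera just saw entering the basket.
std::string classifyNewItem() { return "coffee"; }

int main() {
    std::map<std::string, int> cart;
    double total = 0.0;

    // One "item placed in cart" event, as the camera would report it
    std::string item = classifyNewItem();
    cart[item] += 1;
    total += priceList[item];

    // This is the summary the cart would push over WiFi to the store's backend
    for (const auto &entry : cart) {
        std::printf("%s x%d\n", entry.first.c_str(), entry.second);
    }
    std::printf("total: $%.2f\n", total);
    return 0;
}
```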

[kutluhan_aktar] also built the entire software stack from the ground up and trained the model on a set of common products as a proof-of-concept. The idea was to allow smaller stores to operate more efficiently without needing a full suite of Amazon hardware and software backing it up, and this prototype seems to work pretty well to that end. If you want to develop a machine vision project on your own with more common hardware, take a look at this project which uses the Raspberry Pi instead.

Wearable Sensor Trained To Count Coughs

There are plenty of problems that are easy for humans to solve but almost impossibly difficult for computers. Even though it seems that modern computing power should make short work of them, things like identifying objects in images remain fairly difficult. Similarly, identifying specific sounds within audio samples remains problematic, and as [Eivind] found, it’s holding up a lot of medical research to boot. To solve one specific problem, he created a system for counting the coughs of medical patients.

This was built with the idea of helping people with chronic obstructive pulmonary disease (COPD). Most of the existing methods for studying the disease and treating patients with it involve manually counting the number of coughs on an audio recording. While there are some software solutions to this problem that save some time, this device seeks to identify coughs in real time, as they happen. It does this with a model trained using tinyML to identify genuine coughs and reject merely cough-like sounds. Everything runs on an Arduino Nano with BLE for communication.
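As a rough idea of the reporting half, here’s a hedged sketch that keeps a running count and exposes it as a notifying characteristic with the ArduinoBLE library; the UUIDs and names are arbitrary, and the actual classification is stubbed out since that code is generated from [Eivind]’s trained model.

```cpp
// Hypothetical sketch: count coughs and expose the tally over BLE.
#include <ArduinoBLE.h>

BLEService coughService("181C");   // arbitrary service UUID for the sketch
BLEUnsignedIntCharacteristic coughCount("2A56", BLERead | BLENotify);

unsigned int coughs = 0;

// Placeholder: the real device runs the microphone window through the
// trained tinyML model and returns true only for genuine coughs.
bool coughDetected() { return false; }

void setup() {
    BLE.begin();
    BLE.setLocalName("CoughCounter");
    BLE.setAdvertisedService(coughService);
    coughService.addCharacteristic(coughCount);
    BLE.addService(coughService);
    coughCount.writeValue(coughs);
    BLE.advertise();
}

void loop() {
    BLE.poll();   // keep the BLE stack serviced
    if (coughDetected()) {
        coughs++;
        coughCount.writeValue(coughs);   // notifies any connected central
    }
}
```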

While the only data the model has been trained on are sounds from [Eivind], the existing prototypes do seem to show promise. With more sound data this could be a powerful tool for patients with this disease. And even though this uses machine learning on a small platform, we have seen before that Arduinos are plenty capable of being effective machine learning solutions with the right tools on board.