This Week In Security: Court Orders, GlassWorm, TARmageddon, And It Was DNS

This week, a US federal court has ruled that NSO Group is no longer allowed to use Pegasus spyware against users of WhatsApp. And for their trouble, NSO was also fined $4 million. It’s unclear how much this ruling will actually change NSO’s behavior, as it intentionally stopped short of applying to foreign governments.

There may be an unexpected source of leverage for the US courts over NSO, with the news that American investors are acquiring the company. Among the requirements of the ruling are that NSO cannot reverse engineer WhatsApp code, cannot create new WhatsApp accounts, and must delete any existing WhatsApp code in their possession. Whether this actually happens remains to be seen.

Continue reading “This Week In Security: Court Orders, GlassWorm, TARmageddon, And It Was DNS”

Robot Phone Home…Or Else

We would have enjoyed [Harishankar’s] teardown of a robot vacuum cleaner, even if it didn’t have a savage twist at the end. Turns out, the company deliberately bricked his smart vacuum.

Like many of us, [Harishankar] is suspicious of devices beaming data back to their makers. He noticed his new vacuum cleaner was pinging a few IP addresses, including one that was receiving a steady stream of logging or telemetry data. Of course, he had the ability to block that IP address, which he did. End of story, right?

No. After a few days of working perfectly, the robot wouldn’t turn on. He returned it under warranty, but the company declared it worked fine. They returned it and, indeed, it was working. A few days later, it quit again. This started a cycle: he’d send the device in, it would check out fine, come home and work for a few days, then quit again.

You can probably guess where this is going, but to be fair, we gave you a big hint. The fact that it would work for days after blocking the IP address wouldn’t seem like a smoking gun in real time.

Continue reading “Robot Phone Home…Or Else”

Tommy Flowers: How An Engineer Won The War

Back in 2016, we took you to a collection of slightly dilapidated prefabricated huts in the English Home Counties, and showed you a computer. The place was the National Museum of Computing, next to the famous Bletchley Park codebreaking museum, and the machine was their reconstruction of Colossus, the world’s first fully electronic digital computer. Its designer was a telephone engineer named Tommy Flowers, and the Guardian has a piece detailing his efforts in its creation.

TNMOC’s Colossus MkII.

It’s a piece written for a non-technical audience, so you’ll have to forgive it for glossing over some of the more interesting details, but nevertheless it sets out to correct a long-held myth that the machine was instead the work of the mathematician Alan Turing. Flowers led the research department at the British Post Office, which ran the country’s telephone system, and was instrumental both in proposing the use of electronic switches in computing and in producing a working machine. The connection is obvious when you see Colossus, as its racks are the same as those used in British telephone exchanges of the era.

All in all, the article makes for an interesting read for anyone with an interest in technology. You can take a look at Colossus as we saw it in 2016 here, and if your interest extends to the only glimpse the British public had of the technology behind it in the 1950s, we’ve also taken a look at another Tommy Flowers creation, ERNIE, the UK Premium Bond computer.

Automatically Serving Up Canned Cat Food

If there’s any one benefit to having a cat as a pet instead of a dog, it’s that they’re a bit more independent and able to care for themselves for many days without human intervention. The only thing that’s really needed is a way to make sure they get food and water at regular intervals, and there are plenty of off-the-shelf options for these tasks. Assuming your cat can be fed dry food, that is. [Ben Heck]’s cat has a health problem that requires a special canned wet food, and since there aren’t automatic feeders for this, he built his own cat-feeding robot.

Unlike dry food, which can be dispensed in measured amounts from a hopper, the wet food comes in cans that need to be opened and dispensed every day. To accomplish this, his robot has a mechanism that slowly slides a wedge under the pull tab on the can, punctures the can with it, and then pulls it back to remove the lid. From there the food is ejected from the feeder down a ramp to a waiting (and sometimes startled) cat. The cans are loaded into 3D-printed cartridges and stacked into the machine on top of each other, so the machine can keep dispensing cans until it runs out. This design has space for six cans.

Although there are many benefits to having pets of any sort, one of the fun side quests of pet ownership is building things for them to enjoy or to make caring for them easier. We even had an entire Hackaday contest based on this premise. And, if biological life forms aren’t your cup of tea, there are always virtual pets to care for as well.

Thanks to [Michael C] for the tip!

Continue reading “Automatically Serving Up Canned Cat Food”

Making The Smallest And Dumbest LLM With Extreme Quantization

Turns out that training on Twitch quotes doesn’t make an LLM a math genius. (Credit: Codeically, YouTube)

The reason why large language models are called ‘large’ is not how smart they are, but their sheer size in bytes. With billions of parameters at four bytes each, they pose a serious challenge not just in terms of size on disk, but also in RAM, specifically the RAM of your video card (VRAM). Reducing this immense size, as is done routinely for the smaller pretrained models one can download for local use, involves quantization. This process is explained and demonstrated by [Codeically], who takes it to its logical extreme: shrinking what could be a GB-sized model down to a mere 63 MB by cutting the number of bits per parameter.
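
To put some rough numbers on that, here’s a quick back-of-the-envelope sizing calculation. The figures below are our own illustration, not from the video; the 1.5 billion parameter count happens to match GPT-2 XL.

```python
# Back-of-the-envelope model sizing: parameters times bytes per parameter.
def model_size_gib(num_params: int, bytes_per_param: float) -> float:
    return num_params * bytes_per_param / 1024**3

params = 1_500_000_000
print(f"32-bit floats: {model_size_gib(params, 4):.1f} GiB")    # ~5.6 GiB
print(f"8-bit weights: {model_size_gib(params, 1):.1f} GiB")    # ~1.4 GiB
print(f"4-bit weights: {model_size_gib(params, 0.5):.2f} GiB")  # ~0.70 GiB
```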

While you can offload a model, i.e. keep only part of it in VRAM and the rest in system RAM, this massively impacts performance. An alternative is to use fewer bits per weight in the model, better known as quantization, which typically involves reducing 32-bit floating point weights to 8-bit and cutting memory usage by about 75%. Going lower than this is generally deemed inadvisable.
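
To make the quantization step concrete, here is a minimal Python sketch of symmetric integer quantization with a single per-tensor scale factor. It is not [Codeically]’s code, just an illustration of how float weights get mapped onto a handful of integer levels and back:

```python
import numpy as np

def quantize(weights: np.ndarray, bits: int = 4):
    """Map float weights to small signed integers plus one scale factor."""
    qmax = 2 ** (bits - 1) - 1              # 7 for 4-bit signed values
    scale = np.abs(weights).max() / qmax    # one scale for the whole tensor
    q = np.clip(np.round(weights / scale), -qmax, qmax).astype(np.int8)
    return q, scale                         # real 4-bit formats pack two values per byte

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(8).astype(np.float32)
q, s = quantize(w)
print(w)
print(dequantize(q, s))  # roughly the original weights, squeezed into 15 levels
```

The rounding error introduced by that mapping is the price paid for the smaller footprint, and it only gets worse as the bit count drops.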

Using GPT-2 as the base, the model was trained on a pile of internet quotes, with its parameters then squeezed into a very anemic 4-bit integer format. An initial attempt at manually zeroing weights left the output too garbled, but a second attempt without the zeroing produced somewhat usable output before flying off the rails. Yet it did this with a 63 MB model running at 78 tokens a second on just the CPU, demonstrating that you can create a pocket-sized chatbot to spout nonsense even without splurging on expensive hardware.

Continue reading “Making The Smallest And Dumbest LLM With Extreme Quantization”

Keep An Eye On Your Air-Cooled Engine

There was a time, long ago, when passenger vehicles were much simpler than they are today. There were many downsides to this era, safety chief among them, but there were some perks as well. They were in general cheaper to own and maintain, and plenty could be worked on with simple tools. There’s perhaps no easier car to work on than an air-cooled Volkswagen, but for all its simplicity, there are a number of modern features owners add to help them care for these antiques. [Pegor] has created his own custom engine head temperature monitor for these vehicles.

As one could imagine with an air-cooled engine, keeping an eye on the engine temperature is critical to ensuring its longevity, but the original designs omitted this feature. There are some off-the-shelf aftermarket solutions, but this custom version has a few extra features that others don’t. It’s based on an ATMega32u4 microcontroller, works with any K-type thermocouple, and thanks to its open nature can drive a wide array of displays; [Pegor] chose one that blends in with the rest of the instrumentation on this classic VW. The largest issue that needed to be sorted out was grounding, but a DC-DC converter created an isolated power supply for the microcontroller, allowing the thermocouple to be bonded to the grounded engine without disrupting the microcontroller’s operation.
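
The firmware itself lives on the ATMega32u4, but the thermocouple math behind any gauge like this is simple enough to sketch. Below is a hypothetical Python illustration, not [Pegor]’s code, of the usual linear approximation for a K-type junction, which produces roughly 41 µV per degree Celsius and always reads relative to the temperature of the ‘cold’ junction at the connector:

```python
# Hypothetical sketch of K-type thermocouple math, not [Pegor]'s firmware.
K_TYPE_UV_PER_C = 41.276  # approximate K-type sensitivity, microvolts per degree C

def head_temperature_c(thermocouple_uv: float, cold_junction_c: float) -> float:
    """Linear approximation: fine for a dashboard gauge, not a lab instrument."""
    return cold_junction_c + thermocouple_uv / K_TYPE_UV_PER_C

# Example: 8,000 µV measured with the connector end sitting at 25 °C
print(round(head_temperature_c(8000.0, 25.0)))  # roughly 219 °C
```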

The finished product looks excellent and does indeed blend into the dashboard better than the off-the-shelf temperature monitor that was in use before. The only thing planned for future versions is a way to automatically dim the display when the headlights are on, as [Pegor] finds it a little bright at night. We also enjoy seeing anything that helps these antiques stay on the road more reliably, as their modern descendants don’t have any of the charm or engineering of these classics.

A Logical Clock That Pretends To Be Analog

[kcraske] had a simple plan for their clock build. They wanted a digital clock that was inspired by the appearance of an analog one, and they only wanted to use basic logic, with no microprocessors involved. Ultimately, they achieved just that.

Where today you might build a clock around a microcontroller and a real-time clock module, or simply query a network time server, [kcraske] is doing all the timekeeping in simpler hardware. The clock is based around a bunch of 74-series logic chips, a CD4060 binary counter IC, and a 32.768 kHz crystal, which is easy to divide down to that critical 1 Hz. Time is displayed on the rings of LEDs around the perimeter of the clock: 12 LEDs for hours, and 60 each for minutes and seconds. Inside the rings, the ICs that make up the clock are arranged in a pleasant radial configuration.
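
That ‘easy to divide’ claim is just a power of two at work; the watch-crystal frequency is chosen so that fifteen successive divide-by-two stages land exactly on one pulse per second:

$$\frac{32{,}768\ \text{Hz}}{2^{15}} = 1\ \text{Hz}$$

The CD4060 on its own divides by at most 2^14, leaving a 2 Hz signal, so a single extra flip-flop stage, presumably handled by one of the 74-series parts here, takes care of the final divide-by-two.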

It’s a nice old-school build that reminds us not everything needs to run at 200 MHz or hook up to the internet to be worthwhile. We’ve featured some other fun old-school clocks of late, too. Meanwhile, if you’re cooking up your own arcane timepieces, we’d love to hear about them on the tips line.