Smart Doorbell Focuses On Privacy

As handy as having a smart doorbell is, with its ability to remotely see who’s at the front door from anywhere with an Internet connection, the off-the-shelf units are not typically known for keeping user privacy as a top priority. Even if their cloud storage systems were perfectly secure (which is not a wise assumption to make), they have been known to give government agencies and police free rein to view the videos whenever they like. Unfortunately, if you take privacy seriously, you might need to build your own smart doorbell.

The project uses an ESP32-CAM board as the doorbell’s core, paired with a momentary push button and all housed inside a 3D-printed enclosure. [Tristam] provides a step-by-step guide, including printing the enclosure, configuring the ESP32-CAM to work with the popular open-source home automation system ESPHome, handling doorbell notifications automatically, and wiring the components. There are plenty of other optional components that can be added to this system as well, including things like LED lighting for better nighttime imaging.

[Tristam] isn’t much of a fan of having his home automation connected to the Internet, so the device eschews wireless connections and batteries in favor of a ten-meter USB cable running to a remote machine. As far as privacy goes, this is probably the best of all worlds, as long as your home network isn’t doing anything crazy like exposing ports to the broader Internet. It doesn’t need to be set up to continuously stream video, either; this implementation only takes a snapshot when the doorbell button is actually pressed. Of course, with a few upgrades to the ESP circuitry, it is certainly possible to use these chips to capture video if you prefer.
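
The software glue on the remote machine for a build like this can be just a few lines. Here’s a minimal Python sketch of the snapshot-on-press idea, not [Tristam]’s actual code: it assumes the ESP32-CAM firmware serves a JPEG snapshot over HTTP (something ESPHome can be configured to do) and that a Home Assistant automation or similar runs the script whenever the button fires. The hostname, port, and path are placeholders.

```python
"""Minimal sketch of 'take a snapshot when the doorbell is pressed'.
Assumes the ESP32-CAM exposes a JPEG over HTTP on the local network and
that something (e.g. a Home Assistant automation) invokes this script
on the button-press event. URL and paths are hypothetical."""

from datetime import datetime
from pathlib import Path

import requests  # pip install requests

SNAPSHOT_URL = "http://doorbell.local:8080/snapshot"  # placeholder address
SAVE_DIR = Path.home() / "doorbell_snapshots"


def save_snapshot() -> Path:
    """Fetch one JPEG frame from the camera and store it with a timestamp."""
    SAVE_DIR.mkdir(parents=True, exist_ok=True)
    response = requests.get(SNAPSHOT_URL, timeout=10)
    response.raise_for_status()

    out_file = SAVE_DIR / f"visitor_{datetime.now():%Y%m%d_%H%M%S}.jpg"
    out_file.write_bytes(response.content)
    return out_file


if __name__ == "__main__":
    print(f"Saved {save_snapshot()}")
```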

Thanks to [JohnU] for the tip!

The British Government Is Coming For Your Privacy

The list of bad legislation relating to the topic of encryption and privacy is long and inglorious. Usually, these legislative stinkers only affect those unfortunate enough to live in the country that passed them. Still, one upcoming law from the British government should have us all concerned. The Online Safety Bill started as the usual think-of-the-children stuff, but as the EFF notes, some of its proposed powers have the potential to undermine encryption worldwide.

At issue is the proposal that services with strong encryption incorporate government-sanctioned backdoors to give the spooks free rein to snoop on communications. We imagine that this will be of significant interest to some of the world’s less savoury regimes, a club we can’t honestly say the current UK government doesn’t seem hell-bent on joining. The Bill has had a tumultuous passage through the Lords, the UK upper house, but PM Rishi Sunak’s administration has proved unbending.

If there’s a silver lining to this legislative train wreck, it’s that many of the global tech companies are likely to pull their products from the UK market rather than comply. We understand that UK lawmakers are partial to encrypted online messaging platforms, so there would be a certain poetic justice in their once again voting for a disastrous bill whose unintended consequence is to take away something they rely on.

Header image: DaniKauf, CC BY-SA 3.0.

Disabling Intel’s Backdoors On Modern Laptops

Despite some companies making strides with ARM, for the most part, the desktop and laptop space is still dominated by x86 machines. For all their advantages, they have a glaring flaw for anyone concerned with privacy or security in the form of a hardware backdoor that can access virtually any part of the computer even with the power off. AMD calls their system the Platform Security Processor (PSP) and Intel’s is known as the Intel Management Engine (IME).

To fully disable these co-processors, a computer from before 2008 is required. If you need more modern hardware that still respects your privacy and security concerns, you’ll either need to buy an ARM device or disable the IME, as NovaCustom has managed to do with their NS51 series laptop.

NovaCustom specializes in building laptops that buyers can configure to their needs, with options for the CPU, GPU, RAM, storage, keyboard layout, and other considerations. They favor Coreboot as a bootloader, which already goes a long way toward eliminating proprietary closed-source software at a fundamental level, but not all Coreboot machines have the IME completely disabled. There are two ways to do this: the HECI method, which is better than nothing but not fully trusted, and the HAP bit, which completely disables the IME. NovaCustom uses the HAP bit approach, meaning that although the IME is not completely eliminated from the computer, it is turned off in a way that’s at least good enough for computers that the NSA uses.
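
To demystify what “setting the HAP bit” actually involves: it’s a single flag in the PCH soft straps of the flash descriptor at the start of the SPI flash image, so flipping it amounts to a one-bit edit of a firmware dump. The Python sketch below is an illustration only; the offset and bit position are placeholders, since the real location varies by platform, which is exactly why people reach for a maintained tool like me_cleaner instead of doing this by hand.

```python
"""Illustrative sketch only: 'setting the HAP bit' boils down to flipping
one bit in the flash descriptor's PCH soft straps of a dumped SPI image.
The offset and bit position below are placeholders, NOT real values --
consult a maintained tool such as me_cleaner for your platform."""

from pathlib import Path

STRAP_BYTE_OFFSET = 0x0000  # placeholder, platform dependent
HAP_BIT_POSITION = 0        # placeholder, platform dependent


def set_hap_bit(dump_in: str, dump_out: str) -> None:
    """Read a flash dump, set the chosen bit, and write a modified copy."""
    image = bytearray(Path(dump_in).read_bytes())
    image[STRAP_BYTE_OFFSET] |= 1 << HAP_BIT_POSITION
    Path(dump_out).write_bytes(image)
    print(f"Wrote {dump_out}; reflash it with an external programmer.")


if __name__ == "__main__":
    set_hap_bit("flash_dump.bin", "flash_dump_hap.bin")
```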

There are a lot of new computer manufacturers building conscientious hardware nowadays, but (with the notable exception of System76) the IME and PSP seem to be largely ignored by most computing companies we’d otherwise expect to care about an option like this. It’s certainly still an area of concern considering how much power the IME and PSP are given over their host computers, and we have seen even mainline manufacturers sometimes offer systems with the IME disabled. The only other options for solving this problem are based around specific motherboards for 8th and 9th generation Intel desktops, or going way back to hardware from 2008 and installing libreboot to eliminate, rather than merely disable, the IME.

Thanks to [Maik] for the tip!


Hackaday Links: August 28, 2022

The countdown for the first step on humanity’s return to the Moon has begun. The countdown for Artemis 1 started on Saturday morning, and if all goes well, the uncrewed Orion spacecraft atop the giant Space Launch System (SLS) booster will lift off from the storied Pad 39B at Cape Canaveral on Monday, August 29, at 8:33 AM EDT (1233 GMT). The mission is slated to last for about 42 days, which seems longish considering the longest manned Apollo missions only lasted around 12 days. But without the constraint of storing enough consumables for a crew, Artemis is free to take the scenic route to the Moon, as it were. No matter what your position is on manned space exploration, it’s hard to deny that launching a rocket as big as the SLS is something to get excited about. After all, it’s been 50 years since anything remotely as powerful as the SLS has headed to space, and it’s an event that’s expected to draw 100,000 people to watch it in person. We’ll have to stick to the NASA live stream ourselves; having seen a Space Shuttle launch in person in 1990, we can’t express how much we envy anyone who gets to experience this launch up close.

You Break It, We Fix It

Apple’s AirTags have caused a stir, but for all the wrong reasons. First, they turn all iPhones into Bluetooth LE beacon repeaters, without the owner’s permission. The phones listen for the AirTags, encrypt their location, and send the data on to iCloud, where the tag’s owner can decrypt the location and track it down. Bad people have figured out that this lets them track their targets without their knowledge, turning all iPhone users into potential accomplices to stalking, or worse.
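
The clever part of the relay scheme is that the finder phone encrypts its report against the public key the tag broadcasts, so only the tag’s owner, holding the private key, can read the location. Here’s a rough conceptual sketch of that flow in Python using the cryptography library. It is not Apple’s actual implementation (the real protocol rotates NIST P-224 keys and uses a different key-derivation scheme, as documented by the SeeMoo researchers); it only shows the ECDH-plus-symmetric-encryption shape of the system.

```python
"""Conceptual sketch of the finder-side encryption idea behind Find My,
NOT Apple's actual protocol. The real system rotates P-224 keys and uses
its own KDF; here we just show the shape: tag advertises a public key,
finder encrypts its location to it, owner decrypts."""

import json
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_key(shared_secret: bytes) -> bytes:
    """Turn an ECDH shared secret into a 32-byte AES key."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"find-my-sketch").derive(shared_secret)


# 1. The tag owner holds a key pair; the tag broadcasts the public half.
owner_private = ec.generate_private_key(ec.SECP224R1())
advertised_public = owner_private.public_key()

# 2. A finder phone that hears the beacon encrypts its own location so
#    that only the holder of the private key can read it.
ephemeral = ec.generate_private_key(ec.SECP224R1())
finder_key = derive_key(ephemeral.exchange(ec.ECDH(), advertised_public))
nonce = os.urandom(12)
report = AESGCM(finder_key).encrypt(
    nonce, json.dumps({"lat": 52.52, "lon": 13.405}).encode(), None)
# The finder uploads (ephemeral public key, nonce, report) to the cloud.

# 3. The owner fetches the report and decrypts it with their private key.
owner_key = derive_key(owner_private.exchange(ec.ECDH(), ephemeral.public_key()))
location = json.loads(AESGCM(owner_key).decrypt(nonce, report, None))
print(location)
```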

Naturally, Apple has tried to respond by implementing some privacy-protecting features. But they’re imperfect to the point of being almost useless. For instance, AirTags now beep once they’ve been out of range of their owner’s phone for a while, which would surely alert the target that they’re being tracked, right? Well, unless the evil-doer took the speaker out, or bought one with the speaker already removed — and there’s a surprising market for these online.

If you want to know whether you’re being tracked, Apple “innovated with the first-ever proactive system to alert you of unwanted tracking”, which almost helped patch up the problem they created, but it only runs on Apple phones. It’s not clear what they meant by “first-ever”, because hackers and researchers from the SeeMoo group at the Technical University of Darmstadt beat them to it by at least four months with the open-source AirGuard project, which runs on the other 75% of phones out there.

Along the way, the SeeMoo group also reverse engineered the AirTag system, allowing anything that can send BLE beacons to play along. This opened the door for [Fabian Bräunlein]’s ID-hopping “Find You” attack that breaks all of the tracker-detectors by using an ESP32 instead of an AirTag. His basic point is that most of the privacy guarantees that Apple is trying to make on the “Find My” system rely on criminals using unmodified AirTags, and that’s not very likely.

To be fair, Apple can’t win here. They want to build a tracking network where only the good people do the tracking. But the device can’t tell if you’re looking for your misplaced keys or stalking a swimsuit model. It can’t tell if you’re silencing it because you don’t want it beeping around your dog’s neck while you’re away at work, or because you’ve planted it on a luxury car that you’d like to lift when its owners are away. There’s no technological solution for that fundamental problem.

But hackers are patching up the holes they can, and making the other holes visible, so that we can at least have a reasonable discussion about the tech’s tradeoffs. Apple seems content to have naively opened up a Pandora’s box of privacy violation. Somehow it’s up to us to figure out a way to close it.

Pixelating Text Not A Good Idea

People have gotten much savvier about computer security in the last decade or so. Most people know that sending a document with sensitive information in it is a no-no, so many try to redact documents, with varying levels of success. A common strategy is to replace text with a black box, but you sometimes see sophisticated users pixelate part of an image or document they want to keep private. If you do this for text, be careful. It is possible to unredact pixelated images through software.

It appears that the algorithm is pretty straightforward. It simply guesses letters, pixelates them, and matches the result against the redacted image. You do have to estimate the size of the pixelation, but that’s usually not very hard to do. The code is built using TypeScript, and while the process does require a little manual preparation, there’s nothing that seems very difficult or that couldn’t be automated if you were sufficiently motivated.
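
The guess-pixelate-compare loop is simple enough to sketch in a few dozen lines. The Python toy below (using Pillow, not the TypeScript tool described above) assumes you already know the font, text size, and pixelation block size; it brute-forces short uppercase strings by rendering each candidate, pixelating it the same way, and keeping the closest match. A real tool also has to search over grid alignment and work character by character.

```python
"""Toy sketch of the brute-force idea behind depixelation tools: render a
guess, pixelate it the same way the redaction did, and compare. Assumes
font, size, and pixelation block size are already known."""

import itertools
import string

from PIL import Image, ImageDraw, ImageFont  # pip install pillow


def pixelate(img, block):
    """Downsample to one colour per block, then scale back up -- the same
    operation a typical 'pixelate' redaction applies."""
    small = img.resize((img.width // block, img.height // block), Image.BOX)
    return small.resize(img.size, Image.NEAREST)


def render(text, font, size=(160, 40)):
    """Render candidate text in black on a white background."""
    img = Image.new("L", size, color=255)
    ImageDraw.Draw(img).text((4, 4), text, font=font, fill=0)
    return img


def score(a, b):
    """Sum of absolute pixel differences between two same-sized images."""
    return sum(abs(x - y) for x, y in zip(a.getdata(), b.getdata()))


def best_guess(target, font, block=8, length=2):
    """Try every short uppercase string and keep the closest pixelated
    match. Real tools search per character position so they scale to
    longer strings; exhaustive search here is just for illustration."""
    candidates = ("".join(c) for c in
                  itertools.product(string.ascii_uppercase, repeat=length))
    return min(candidates,
               key=lambda s: score(pixelate(render(s, font), block), target))


if __name__ == "__main__":
    # Hypothetical usage: 'redacted.png' is the pixelated crop, and the
    # attacker guesses (or knows) the font used in the original document.
    font = ImageFont.truetype("DejaVuSans.ttf", 28)
    target = Image.open("redacted.png").convert("L").resize((160, 40))
    print(best_guess(target, font))
```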


No Privacy: Cloning The AirTag

You’ve probably heard of the infamous rule 34, but we’d like to propose a new rule — call it rule 35: anything that can be used for nefarious purposes will be, even if you can’t think of how at the moment. Case in point: apparently there has been an uptick in people using AirTags to do bad things. They have been used to stalk people or to tag cars so the cars can be found later and stolen. According to [Fabian Bräunlein], Apple’s responses to this don’t consider cases where clones or modified AirTags are in play. To prove the point, he built a clone that bypasses the current protection features and used it to track a willing experimental subject for five days with no notifications.

According to the post, Apple says that AirTags have serial numbers and beep when they have not been around their host Apple device for a certain period. [Fabian] points out that clone tags don’t have serial numbers and may also not have speakers. There is apparently a thriving market, too, for genuine tags that have been modified to remove their speakers. [Fabian’s] clone uses an ESP32 with no speaker and no serial number.

The other protection, according to Apple, is that if they notice an AirTag moving with you over some period of time without its owner, you get a notification. In other words, if your iPhone sees your own tag repeatedly, that’s fine. It also doesn’t mind seeing someone else’s tags if they are near you. But if your phone sees a tag many times and the owner isn’t around, you get a notification. That way, you can help identify random tags, but you’ll know if someone is trying to track you. [Fabian] gets around that by cycling through 2,000 pre-loaded public keys, so the tracked person’s device doesn’t realize that it is seeing the same tag over and over. Even Apple’s Android app that scans for trackers is vulnerable to this strategy.
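
The scheduling trick at the heart of the clone is easy to sketch. The Python below is not [Fabian]’s firmware (that runs in C on the ESP32); it only shows the rotation logic of stepping through a large list of pre-generated public keys so that no single identifier follows the victim for long. The build_advertisement() helper is a deliberate placeholder, since the genuine Find My broadcast packs part of the key into the BLE MAC address and the rest into manufacturer-specific advertisement data, per the SeeMoo write-ups.

```python
"""Sketch of the key-rotation idea behind the cloned tag, not real firmware.
Each pre-generated public key is advertised for a short window before the
clone hops to the next one, so tracker-detection apps never see one
identifier travelling with the victim for long."""

import itertools
import time
from typing import Iterator, List


def build_advertisement(public_key: bytes) -> bytes:
    """Placeholder packet builder -- NOT the genuine Find My byte layout."""
    return b"\x4c\x00" + public_key


def rotate_keys(keys: List[bytes], period_s: float) -> Iterator[bytes]:
    """Yield a fresh advertisement payload every `period_s` seconds,
    cycling through the pre-loaded key list forever."""
    for key in itertools.cycle(keys):
        yield build_advertisement(key)
        time.sleep(period_s)


if __name__ == "__main__":
    # 2,000 dummy 28-byte "public keys" stand in for real pre-generated ones.
    dummy_keys = [i.to_bytes(2, "big") * 14 for i in range(2000)]
    # Print the first few payloads; real firmware would hand each one to a
    # BLE advertising stack instead.
    for adv in itertools.islice(rotate_keys(dummy_keys, period_s=0.1), 5):
        print(adv.hex())
```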

Even for folks who aren’t particularly privacy-minded, it’s pretty clear that a worldwide network of mass-market devices that allows almost anyone to be tracked is a problem. But what’s the solution? Even the better strategies employed by AirGuard won’t catch everything, as [Fabian] explains.

This isn’t the first time we’ve had a look at privacy concerns around AirTags. Of course, it is always possible to build a tracker. But it is hard to get the worldwide network of Bluetooth listeners that Apple has.