The Dark Arts: Anonymity

Love him or hate him, Edward Snowden knew a thing or two about anonymity. In June of 2013, he blew the whistle on the NSA’s out-of-control programs that can target virtually anyone plugged into the digital age. The former CIA employee was working as a contractor for the NSA, where he had access to highly classified documents on many of these general-populace surveillance programs. He eventually took off to Hong Kong and released the documents to a handful of reporters. One of these documents was a PowerPoint presentation in which the NSA complained that the TAILS operating system was a major thorn in their side. Naturally, Snowden insisted that he and the reporters communicate only via the TAILS OS. He used PGP, an encryption method with the highly sophisticated title of “Pretty Good Privacy”, and asked not to be quoted at length for fear of identification via stylometry.

In this article, we’re going to go over the basics of anonymity, and introduce you to methods of staying anonymous while online.

Continue reading “The Dark Arts: Anonymity”

Apple Aftermath: Senate Entertains A New Encryption Bill

If you recall, there was a recent standoff between Apple and the U.S. Government regarding unlocking an iPhone. Senators Richard Burr and Dianne Feinstein have a “discussion draft” of a bill that appears to require companies to comply with court-ordered decryption.

Here at Hackaday, we aren’t lawyers, so maybe we aren’t the best source of legislative commentary. However, on the face of it, this seems a bit overreaching. The first part of the proposed bill is simple enough: any “covered entity” that receives a court order for information must provide it in intelligible form or provide the technical assistance necessary to get the information into intelligible form. The problem, of course, is what if you can’t? A covered entity, by the way, is anyone from a manufacturer to a software developer, a communications service, or a provider of remote computing or storage.

There are dozens of services (backup comes to mind) where only you have the decryption keys and there is nothing reasonable the provider can do to get your data if you lose your keys. That’s actually a selling point for their service. You might not be anxious to back up your hard drive if you knew the vendor could browse your data whenever they wanted to.
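
As a rough sketch of why a provider in that position can’t comply, here is the kind of client-side encryption such a service depends on. This is hypothetical Python (no particular vendor’s code), but the principle is the same: the key never leaves your machine, so ciphertext is all the provider ever stores or could ever hand over.

```python
# Hypothetical client-side backup flow -- a sketch of the concept,
# not any real service's implementation.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # generated and kept on the user's machine only
cipher = Fernet(key)

backup_blob = cipher.encrypt(b"contents of my hard drive")
# ...upload backup_blob to the provider; they never see `key`...

# Only someone holding the key can get the plaintext back.
restored = Fernet(key).decrypt(backup_blob)
assert restored == b"contents of my hard drive"
```

Lose the key and the backup is gone for good; that’s the trade-off the draft seems to ignore.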

The proposed bill has some other issues, too. One section states that nothing in the document is meant to require or prohibit a specific design or operating system. However, another clause requires that covered entities provide products and services that are capable of complying with the rule.

A broad reading of this is troubling. If this were law, entire systems that don’t allow the provider or vendor to decrypt your data could be illegal in the U.S. Whole classes of cybersecurity techniques could become illegal, too. For example, many cryptography systems achieve forward secrecy by generating unrecorded session keys. Consider an SSH session: if someone learns your SSH key, they can listen in on or interfere with your future SSH sessions. However, they can’t take recordings of your previous sessions and decode them. The mechanism is a little different between SSHv1 (which you shouldn’t be using) and SSHv2. If you are interested in the gory details for SSHv2, have a look at section 9.3.7 of RFC 4251.
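
To make the forward secrecy idea concrete, here is a minimal sketch in Python using the cryptography library’s X25519 primitives. It illustrates the concept rather than the exact handshake SSH performs: both sides generate throwaway key pairs, derive the same session key, and then discard the private halves, so a recording of the traffic can’t be decrypted later even if long-term keys leak.

```python
# Ephemeral Diffie-Hellman sketch (concept only, not SSH's actual mechanism).
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each side generates a fresh key pair for this session only.
client_eph = X25519PrivateKey.generate()
server_eph = X25519PrivateKey.generate()

# Each side combines its private key with the peer's public key and
# arrives at the same shared secret; long-term identity keys aren't involved.
shared_client = client_eph.exchange(server_eph.public_key())
shared_server = server_eph.exchange(client_eph.public_key())
assert shared_client == shared_server

# Stretch the raw secret into a session key, then forget the ephemeral
# private keys. With them gone, recorded ciphertext can't be unwound later.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"handshake demo",
).derive(shared_client)
```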

In all fairness, this isn’t a bill yet. It is a draft, and given some of the definitions in section 4, perhaps they plan to expand it so that it makes more sense, or at least is more practical. If not, then it seems to be an indication that we need legislators who understand our increasingly technical world and have some understanding of how the new economy works. After all, we’ve seen this before, right? Many countries are all too happy to enact and enforce tight banking privacy laws to encourage deposits from people who want to hide their money. What makes you think that if the U.S. weakens the ability of domestic companies to keep data private, the business of concealing data won’t just move offshore, too?

If you were living under a rock and missed the whole Apple and FBI controversy, [Elliot] can catch you up. Or, you can see what [Brian] thought about Apple’s response to the FBI’s demand.

Facebook To Slurp Oculus Rift Users’ Every Move

The web is abuzz with the news that the Facebook-owned Oculus Rift has buried in its terms of service a clause allowing the social media giant access to the “physical movements and dimensions” of its users. This is likely to be used for the purposes of directing advertising to those users and, most importantly for the advertisers, measuring the degree of interaction between user and advert. It’s a dream come true for the advertising business: instead of relying on eye-tracking or other engagement studies on limited subsets of users, they can take these metrics from their entire user base and hone their offering on an even more targeted basis for peak interaction to maximize their revenue.

Hardly a surprise, you might say, given that Facebook is no stranger to criticism on privacy matters. It does, however, represent a hitherto unseen level of intrusion into a user’s personal space, enough even to guess the nature of their activities from their movements, and this opens up fresh potential for nefarious uses of the data.

Fortunately for us there is a choice, even if our community doesn’t circumvent the data-slurping powers of these headsets: a rash of other virtual reality products is in the offing at the moment from Samsung, HTC, and Sony among others, and of course there is Google’s budget offering. Sadly though, it is likely that privacy concerns will not touch the non-tech-savvy end user, so competition alone will not stop the relentless desire of big business to get this close to you. Instead vigilance is the key: to spot such attempts when they make their way into the small print, and to shine a light on them even when the organisations in question would prefer that they remained incognito.

Oculus Rift development kit 2 image: By Ats Kurvet – Own work, CC BY-SA 4.0, via Wikimedia Commons.

Directional Booklight Invisible To Everyone But You

Consistent contributor [Ken] has cooked up another contraption with his directional booklight. Combining an LED strip and a privacy screen filter inside a wooden enclosure, this handy tool is made for someone who wants to read in bed without disturbing anyone else. The booklight sits on top of the page, the LEDs light up just the area beneath it, and because the privacy screen only lets light leave straight off the page, only the reader can see it; from any other viewing angle the page stays dark.

[Ken] thought of everything. Rather than have the light stay on while the booklight is lifted to turn the page and possibly flash an unsuspecting slumberer, a tactile switch on the underside turns the light on only when it is pressed against the page, allowing very little light to escape.

Future upgrades include another switch on top to detect when the book is closed, and an accelerometer to detect when the reader may have fallen asleep.

We’ve reported a few of [Ken]’s projects before, like his 3D popup cards, unique weather display, and semi-real-life Mario Kart.

Continue reading “Directional Booklight Invisible To Everyone But You”

The Contrarian Response To Apple’s Need For Encryption

On December 2, 2015, [Syed Rizwan Farook] and [Tashfeen Malik] opened fire at a San Bernardino County Department of Public Health training event, killing 14 and injuring 22. This was the third deadliest mass shooting in the United States in recent memory, and began a large investigation by local, state, and federal agencies. One piece of evidence recovered by the FBI was an iPhone 5C belonging to one of the shooters. In the days and months after the shooting, the FBI turned to Apple to extract data from this phone.

A few days ago, in an open letter to customers, [Tim Cook], CEO of Apple, stated that Apple will not comply with the FBI’s request to build a backdoor for the iPhone. While the issue at hand is extracting data from an iPhone recovered from the San Bernardino shooting, [Cook] says building a new version of iOS to extract this data would allow the FBI to unlock any iPhone. Needless to say, there are obvious security implications of this request.

Apple does not publish open letters to its customers often. Having one of the largest companies on the planet come out in support of privacy and encryption is nearly unprecedented. There is well-founded speculation that this open letter to the public will be exhibit A in a Supreme Court case. Needless to say, the Internet has gone a little crazy after this letter was published, and rightly so: just imagine how much better off we would be if AT&T had said no to the NSA in 2002; [Snowden] might just be another IT geek working for a government contractor.

There is a peculiar aspect of public discourse that doesn’t make any sense. In the absence of being able to say anything interesting, some people have just decided to add a contrary viewpoint. Being right, having a valid argument, or even having evidence to support assertions doesn’t matter; being contrary is far more interesting. Look at any comment thread on the Internet, and you’ll find the longest comment chain is the one refuting the parent article. Look up the ratings for a cable news channel. You’ll find the highest rated show is the one with the most bickering. When is the last time you saw something from the New York Times, Washington Post, or LA Times on Facebook or your favorite news aggregator? Chances are, it wasn’t news. It was an op-ed, most likely one that was espousing a view contrary to either public opinion or public policy.

As with any headline event on the Internet, the contrarians have come out of the woodwork. These contrarians are technically correct and exceedingly myopic.

Continue reading “The Contrarian Response To Apple’s Need For Encryption”

Hijacking Quadcopters With A MAVLink Exploit

Not many people would like a quadcopter with an HD camera hovering above their property, and until now there’s been no technical recourse to tell drone pilots to buzz off. That would require actually talking to a person. Horrors. Why be reasonable when you can use a Raspberry Pi to hijack a drone? It’s the only reasonable thing to do, really.

The folks at shellIntel have been messing around with quads for a while, and have recently stumbled upon a vulnerability in the Pixhawk flight controller and every other quadcopter that uses the MAVLink protocol. This includes the Parrot AR.drone, ArduPilot, PX4FMU, pxIMU, SmartAP, MatrixPilot, Armazila 10dM3UOP88, Hexo+, TauLabs and AutoQuad. Right now, the only requirement to make a drone fall out of the sky is a simple radio module and a computer. A Raspberry Pi was used in shellIntel’s demo.

The exploit is a consequence of MAVLink sending the channel, or NetID, used to pass commands from the transmitter to the quadcopter in every radio frame. This NetID is there so multiple transmitters don’t interfere with each other; if two transmitters use the same NetID, there will be a conflict and two very confused pilots. Unfortunately, it also means anyone with a MAVLink radio tuned to the same NetID can disarm a quadcopter remotely, tell a quad to turn off, or even emulate the DJI Phantom’s ‘Return to China’ function.
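
To get a feel for how little is involved, here is roughly what any MAVLink ground station sends to disarm its own vehicle, sketched with pymavlink (our illustration; shellIntel’s actual three lines aren’t published here). Nothing in the frame authenticates the sender: if the radio’s NetID matches, the autopilot obeys.

```python
# Sketch of a standard MAVLink disarm command via pymavlink -- the same
# frame a legitimate ground station sends to its own aircraft. The port
# and baud rate are assumptions for a typical telemetry radio.
from pymavlink import mavutil

link = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
link.wait_heartbeat()  # learn the target's system and component IDs

# MAV_CMD_COMPONENT_ARM_DISARM with param1 = 0 means "disarm".
link.mav.command_long_send(
    link.target_system,
    link.target_component,
    mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM,
    0,                      # confirmation
    0, 0, 0, 0, 0, 0, 0,    # param1 = 0 -> disarm, remaining params unused
)
```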

The only required hardware for this exploit is a $100 radio and three lines of code. It is certainly possible to build a Raspberry Pi-based box that would shut down any Pixhawk-equipped quadcopter within radio range, although the folks at shellIntel didn’t go that far just yet. Now it’s just a proof of concept to demonstrate that there’s always a technical solution to your privacy concerns. Video below.

Continue reading “Hijacking Quadcopters With A MAVLink Exploit”

Panopticlick: You Are A Beautiful And Unique Snowflake

We all like to think we’re unique, but when it comes to remaining anonymous online that’s probably not such a good idea. By now, it’s common knowledge that advertising firms, three-letter agencies, and who-knows-who-else want to know what websites you’re visiting and how often. Persistent tracking cookies, third-party cookies, and “like” buttons keep tabs on you at all times.

For whatever reason, you might want to browse anonymously and try to plug some of the obvious sources of identity leakage. The EFF and their Panopticlick project have bad news for you.

The idea behind Panopticlick is simple: to try to figure out how identifiable you are even if you’re not accepting cookies, or if you’ve disabled Flash, or if you’re using “secure” browsers. To create a fingerprint of your browser, Panopticlick takes all the other little bits of identifying information that your browser gives up, and tries to piece them together.

For a full treatment of the project, see this paper (PDF). The takeaway from the project is that the information your browser gives up to servers can, without any cookies, specifically identify you.

For instance, a server can query which plugins your browser supports, and if you’ve installed anything a tiny bit out of the ordinary, you’re fingerprinted. Your browser’s User Agent strings are often over-specific and tell which browser sub-sub-sub version you’re running on which OS platform. If you’re running Flash, it can report back which fonts you’ve got installed on your system. Any of these can easily be as rare as one-in-a-million. Combining them together (unless they’re all highly correlated) can fingerprint you uniquely.
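
A toy version of what happens server-side (our own sketch, not Panopticlick’s code): hash together a handful of attributes the browser volunteers with every visit and you end up with an identifier that behaves a lot like a cookie, without one ever being set.

```python
# Toy browser fingerprint: combine freely-volunteered attributes into
# one stable identifier. The values below are made up for illustration.
import hashlib

def fingerprint(user_agent, plugins, fonts, screen):
    material = "|".join([
        user_agent,
        ",".join(sorted(plugins)),
        ",".join(sorted(fonts)),
        screen,
    ])
    return hashlib.sha256(material.encode()).hexdigest()[:16]

print(fingerprint(
    "Mozilla/5.0 (X11; Linux x86_64; rv:45.0) Gecko/20100101 Firefox/45.0",
    ["Shockwave Flash 21.0", "IcedTea-Web Plugin"],
    ["DejaVu Sans", "Liberation Mono", "Comic Neue"],
    "1280x1024x24",
))
```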

You can’t necessarily win. If you disable Flash, the remote site doesn’t get your font list, but since only one in five browsers runs with Flash disabled, you’re still giving up two bits of information. If you run a “privacy-enhancing” niche browser, your chances of leaving a unique fingerprint go through the roof unless you’re also forging the User Agent strings.

I ran the Panopticlick experiment twice, once with a Firefox browser and once with an obscure browser that I actually use most of the time (dwb). Firefox runs a Flash blocker as standard, so they didn’t get my font list. But still, the combination of browser plugins and a relatively new Firefox on Linux alone made me unique.

It was even worse for the obscure browser test. Only one in 1.4 million hits use dwb, so that alone was bad news. I also use a 4:3 aspect-ratio monitor, with 1280×1024 pixels at 24-bit color depth, which is apparently a one-in-twenty-four occurrence. Who knew?
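
The arithmetic behind those odds is straightforward: a trait shared by one in N users is worth log2(N) bits of identifying information, and independent traits add up. Plugging in the figures above (and treating them as independent, which is a simplification):

```python
import math

def bits(one_in_n):
    """Identifying information, in bits, of a trait shared by 1-in-n users."""
    return math.log2(one_in_n)

print(round(bits(1_400_000), 1))  # ~20.4 bits just for using dwb
print(round(bits(24), 1))         # ~4.6 bits for the 1280x1024, 24-bit screen
print(round(bits(5), 1))          # ~2.3 bits for running without Flash

# Independent traits add; about 33 bits is enough to pick out one
# person among everyone on Earth.
print(round(bits(1_400_000) + bits(24) + bits(5), 1))  # ~27.3 bits
```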

Finally, I tried out the Tor browser, which not only routes your traffic through the Tor network, but also removes a lot of the specific data about your session. It fared much better, making me not uniquely identifiable: instead only one in a thousand. (Apparently a lot of people trying out the Panopticlick site ran Tor browser.)

If you’re interested in online anonymity, using something like Tor to obscure your IP address and disabling cookies is a good start. But Panopticlick points out that it may not be enough. You can never use too many layers of tinfoil when making your hat.

Try it out, and let us know in the comments how you fare.