Interplanetary Whack-A-Mole: NASA’s High-Stakes Rescue Plan For InSight Lander’s Science Mission

People rightly marvel at modern surgical techniques that let surgeons leverage the power of robotics to repair the smallest structures in the human body through wounds that can be closed with a couple of stitches. Such techniques can even be applied remotely, linking surgeon and robot through a telesurgery link. It can be risky, but it’s often a patient’s only option.

NASA has arrived at a similar inflection point, except that their patient is the Mars InSight lander, and the surgical suite is currently about 58 million kilometers away. The lander’s self-digging “mole” probe needs a little help getting started, so they’re planning a high-stakes rescue attempt that would make the most seasoned telesurgeon blanch: they want to use the lander’s robotic arm to press down on the mole to help it get back on track.

Continue reading “Interplanetary Whack-A-Mole: NASA’s High-Stakes Rescue Plan For InSight Lander’s Science Mission”

Smart Speakers “Accidentally” Listen Up To 19 Times A Day

In the spring of 2018, a couple in Portland, OR reported to a local news station that their Amazon Echo had recorded a conversation without their knowledge, and then sent that recording to someone in their contacts list. As it turned out, the commands Alexa followed were issued by television dialogue. The whole thing took a sitcom-sized string of coincidences to happen, but it happened. Good thing the conversation was only about hardwood floors.

But of course these smart speakers are listening all the time, at least locally. How else would they know that someone uttered one of their wake words, or something close enough? It would sure help if we could change the wake word to something like “rutabaga” or “supercalifragilistic”, but the hardware likely uses ASICs designed to listen for a handful of specific words. On the Echo, for example, your only choices are “Alexa”, “Amazon”, “Echo”, or “Computer”.
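That local gate can be caricatured in a few lines. Here’s a deliberately crude Python sketch of the idea; real keyword spotters match audio features with a tiny model or ASIC, not text, so treat every name here as illustrative:

```python
# Toy model of on-device wake-word gating: audio never leaves the
# speaker unless this local check fires (correctly or spuriously).
WAKE_WORDS = {"alexa", "amazon", "echo", "computer"}

def wake_word_heard(transcript: str) -> bool:
    """Return True if any wake word appears in the (pretend) transcript."""
    return any(word in WAKE_WORDS for word in transcript.lower().split())
```

The false-positive problem falls straight out of the design: “I ordered it from Amazon” trips the gate just as surely as “Amazon, set a timer” does.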

So how often are smart speakers listening when they shouldn’t? A team of researchers at Boston’s Northeastern University are conducting an ongoing study to determine just how bad the problem really is. They’ve set up an experiment to generate unexpected activation triggers and study them inside and out.

Continue reading “Smart Speakers “Accidentally” Listen Up To 19 Times A Day”

EU Duty Changes, A Whole VAT Of Trouble For Hackers?

It could be said that there are a number of factors behind the explosion of creativity in our community of hardware hackers over the last couple of decades, but one in particular is beyond doubt: the ease with which it has been possible to import small orders from China. See something on AliExpress and it can be yours for a few quid; somewhere in a warehouse on the other side of the world it’s put into a grey shipping bag, and three weeks later it’s on your doorstep. This bounty has in no small part been aided by a favourable postage and taxation environment, in which low postage costs and a lack of customs duties on packages under a certain value conspire to make getting the product in front of you a fraction of the cost of the thing itself. Continue reading “EU Duty Changes, A Whole VAT Of Trouble For Hackers?”

Dexter Robot Arm Embraces New Manufacturing With First Micro-Factory

Haddington Dynamics, the company behind the Dexter robot arm that won the 2018 Hackaday Prize, has opened its first microfactory to build robot arms for Australia and Southeast Asia.

You may remember that the combination of Dexter’s makeup and capabilities is what lets it stand out among robotics projects. The fully-articulated robot arm can be motion trained: it records how you move the arm and plays the motion back with high precision, rather than needing to be taught with code. That high precision is thanks to a clever encoder design that leverages the power of FPGAs to amplify the granularity of its optical encoders. And it embraces advanced manufacturing, combining 3D-printed and glued-up parts with mass-produced gears, belts, bearings, and motors.

It’s a versatile robot arm, for a fraction of the cost of what came before it, with immense potential for customization. And did I mention that it’s open source? Continue reading “Dexter Robot Arm Embraces New Manufacturing With First Micro-Factory”

John Deere And Nebraska’s Right To Repair, The Aftermath Of A Failed Piece Of Legislation

For the past few years now we’ve covered a long-running battle between American farmers and the manufacturers of their farm machinery over their right to repair, with particular focus on the agricultural giant John Deere. The manufacturer of the familiar green and yellow machinery at the heart and soul of American farming has attracted criticism for using restrictive DRM and closed-source embedded software to lock the repair of its products into the hands of its dealer network.

This has been a hot-button issue in our community, as it has been with the farmers, for years, but it has failed to receive much traction in the wider world. It’s very encouraging then to see some mainstream coverage from Bloomberg Businessweek on the subject, in which they follow the latest in the saga of the Nebraska farmers’ quest for a right to repair bill. Particularly handy for readers wishing to digest it while doing something else, they’ve also recorded it as an easy-to-listen-to podcast.

We last visited the Nebraska farmers a couple of years ago, when they were working towards the bill reaching their legislature. The Bloomberg piece brings the saga up to date, with the Nebraska Farm Bureau failing to advance the bill, and the consequent anger from the farmers themselves. It’s interesting both for laying bare the arguments of the manufacturer, and for examining a hidden aspect of the story: the value of the data collected by these connected machines.

It’s likely that the wider hardware hacker community and the farming community have different outlooks on many fronts, but in our shared readiness to dive in and fix things and now in our concern over right to repair we have a common purpose. Watching these stories at a distance, from the agricultural heartland of the European country where this is being written, it’s striking how much the farmers featured are the quintessential salt-of-the-earth Americans representing what much of America still likes to believe that it is at heart. If a company such as John Deere has lost those guys, something really must have gone wrong in the world of green and yellow machinery.

Header image: Nheyob / CC BY-SA 4.0

The Legacy Of One Of Science’s Brightest Stars: Freeman Dyson

Of the many well-known names in science, few have been as reluctant to stick to one particular field as Freeman John Dyson. Born in the UK in 1923, he showed a great interest in mathematics and related fields even as a child. By the time he was 15 he had won a scholarship to Trinity College, Cambridge, where he studied mathematics. Though the war interrupted his studies with a stint at the Royal Air Force’s Operational Research Section (ORS), afterwards he would return to Trinity to take his BA in mathematics.

His subsequent career saw him teaching at universities in the UK and US, including Cornell University, where he met Richard Feynman and worked on quantum electrodynamics. He eventually settled at the Institute for Advanced Study in Princeton, which he joined at the invitation of its director, J. Robert Oppenheimer.

Beyond mathematics and physics, Dyson would also express great interest in space exploration (Dyson spheres being his best-known thought experiment) and in genetics, both in the context of the first formation of life and in genetic manipulation to improve today’s crops. He also worked on the famous Project Orion, which proposed propelling spacecraft with nuclear explosions.

In this article we’ll take a look at these and other parts of Mr. Dyson’s legacy, as well as the influence of his works today.

Continue reading “The Legacy Of One Of Science’s Brightest Stars: Freeman Dyson”

This Week In Security: Let’s Encrypt Revocation, Ghostcat, And The RIDLer

Let’s Encrypt recently celebrated their one billionth certificate. That’s over 190 million websites currently secured, and thirteen full-time staff. The annual budget for Let’s Encrypt is an eye-watering $3.3+ million, covered by sponsors like Mozilla, Google, Facebook, and the EFF.

A cynic might ask if we need to rewind the counter by the three million certificates Let’s Encrypt recently announced they are revoking as a result of a temporary security bug. That bug was in the handling of the Certificate Authority Authorization (CAA) security extension. CAA is a relatively recent addition to the certificate ecosystem, implemented as a DNS record rather than as part of X.509 itself. A domain owner opts in by setting a CAA record in their DNS, specifying a particular CA that is authorized to issue certificates for their domain. When a CA issues a new certificate, it is required to check for a CAA record, and it must refuse to issue the certificate if a different authority is listed there.
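The decision a CA has to make is simple enough to sketch. This hypothetical Python fragment illustrates the rule described above; it is not any CA’s actual code, and it simplifies CAA records down to (tag, value) pairs:

```python
def issuance_allowed(caa_records, ca_identity):
    """Decide whether a CA may issue a certificate for a domain.

    caa_records: (tag, value) pairs from the domain's DNS,
    e.g. [("issue", "letsencrypt.org")]. No records means the
    domain never opted in, so any CA may issue.
    """
    authorized = [value for tag, value in caa_records if tag == "issue"]
    if not authorized:
        return True  # no CAA record: no restriction
    return ca_identity in authorized  # only a listed CA may issue
```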

The CAA specification sets eight hours as the maximum time to cache the CAA check. Let’s Encrypt uses a similar automated process to determine domain ownership, and considers those results to be valid for 30 days. That leaves a corner case where the Let’s Encrypt domain validation is still valid, but the CAA check needs to be re-performed. For certificates that cover multiple domains, that check would need to be performed for each domain before the certificate can be issued. Rather than validating each domain’s CAA record, the Let’s Encrypt validation system was checking one of those domain names multiple times. The problem was caught and fixed on the 28th.
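The shape of the bug is easy to picture as a loop error. A hypothetical sketch follows (Let’s Encrypt’s actual software is written in Go; `caa_ok` here stands in for the real DNS lookup):

```python
def recheck_caa_buggy(domains, caa_ok):
    # The reported bug, roughly: re-check one domain N times
    # instead of checking each of the N domains on the certificate.
    return all(caa_ok(domains[0]) for _ in domains)

def recheck_caa_fixed(domains, caa_ok):
    # The correct behaviour: every domain gets its own CAA check.
    return all(caa_ok(domain) for domain in domains)
```

A certificate covering one compliant domain and one with a forbidding CAA record sails straight through the buggy version.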

The original announcement gave administrators 36 hours to manually renew their affected certificates. While just over half of the three million targeted certificates have been revoked, an additional grace period has been extended to the million-plus certs that are still in use. Just to be clear, there aren’t over a million bad certificates in the wild; in fact, only 445 certificates were minted that a proper CAA check should have prevented.

Ghostcat

Apache Tomcat, the open source Java-based HTTP server, has had a vulnerability for something like 13 years. AJP, the Apache JServ Protocol, is a binary protocol designed for server-to-server communication. An example use case would be an Apache HTTP server running on the same host as Tomcat. Apache would serve static files, and use AJP to proxy dynamic requests to the Tomcat server.

Ghostcat, CVE-2020-1938, is essentially a default configuration issue. AJP was never designed to be exposed to untrusted clients, but the default Tomcat configuration enables the AJP connector and binds it to all interfaces. An attacker can craft an AJP request that lets them read the raw contents of webapp files. That means database credentials, configuration files, and more. If the application is configured to allow file uploads, and the upload location is in a folder the attacker can read, the result is a full remote code execution exploit chain.

The official recommendation is to disable AJP if you’re not using it, or bind it to localhost if you must use it. At this point, it’s negligence to leave ports exposed to the internet that aren’t being used.
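On a stock Tomcat installation the connector lives in `conf/server.xml`. Here’s a sketch of the mitigation, assuming the historical default connector line; check your own configuration rather than copying this verbatim:

```xml
<!-- The vulnerable default: AJP listening on all interfaces -->
<!-- <Connector port="8009" protocol="AJP/1.3" redirectPort="8443" /> -->

<!-- Don't use AJP? Leave the connector commented out entirely. -->
<!-- Must use AJP? Bind it to loopback so only local peers can reach it: -->
<Connector port="8009" protocol="AJP/1.3"
           address="127.0.0.1" redirectPort="8443" />
```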

Have I Been P0wned

You may remember our coverage of [Troy Hunt] over at haveibeenpwned.com. He had made the decision to sell HIBP, as a result of the strain of running the project solo for years. In a recent blog post, [Troy] reveals the one thing more exhausting than running HIBP: trying to sell it. After a potential buyer was chosen, and the deal was nearly sealed, the potential buyer went through a restructuring. At the end of the day, the purchase no longer made sense for either party, and they both walked away, leaving HIBP independent. It sounds like the process was stressful enough that HIBP will remain an independent entity for the foreseeable future.

You Were Warned

Remember the Microsoft Exchange vulnerability from last week? Attack tools have been written, and the internet-wide scans have begun.

Ridl Me This, Chrome

We’ve seen an abundance of speculative execution vulnerabilities over the last couple of years. While these problems are technically interesting, there has been a bit of a shortage of real-world attacks that leverage them. Well, thanks to a post over at Google’s Project Zero, that dearth has come to an end. This attack is a sandbox escape: it assumes the attacker already controls a renderer via a vulnerability in the Chrome JS engine, and shows how to break out from there.

To understand how Ridl plays into this picture, we have to talk about how the Chrome sandbox works. Each renderer process runs with essentially zero system privileges, and sends requests through Mojo, an inter-process communication system. Mojo uses a 128-bit numbering system to both identify and secure those IPC endpoints.
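The whole scheme rests on those endpoint numbers staying secret: the name is the capability. A Python illustration of why that normally holds (Mojo itself is C++; this just shows the scale of the number):

```python
import secrets

def new_port_name() -> str:
    # 128 random bits, hex-encoded. Anyone who knows the name can talk
    # to the port; with 2**128 possibilities, guessing is hopeless, so
    # the only practical way in is to leak the name -- enter Ridl.
    return secrets.token_hex(16)
```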

Once an attacker has taken over the unprivileged sandbox process, the next step is to figure out the port name of an un-sandboxed Mojo port. The trick is to get the privileged process to access its Mojo port name repeatedly, and then capture one of those accesses using Ridl. Once the port name is known, the attacker has essentially escaped the sandbox.

The whole read is interesting, and serves as a great example of the sorts of attacks enabled by speculative execution leaks.