This Week In Security: Leaking Partial Bits, Apple News, And Overzealous Contact Tracing

Researchers at NCC Group have been working on a five-part explanation of a Windows kernel vulnerability targeting the Kernel Transaction Manager (KTM). The vulnerability, CVE-2018-8611, is a local privilege escalation bug. There doesn’t seem to be a way to exploit it remotely, but it is an interesting bug, and NCC Group’s work on it is outstanding.

They start with a bit of background on what the KTM is, and why one might want to use it. Next is a handy guide to reverse engineering Microsoft patches. From there, they describe the race condition and how to actually exploit it. They cover a wide swath in the series, so go check it out.
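The heart of the bug is a race condition, and the general shape of that class of flaw is easy to sketch. The following is a toy, deterministic illustration of a check-then-act (TOCTOU) race in Python — it is not the actual KTM vulnerability, just a minimal model of why an unsynchronized "check, then use" sequence is exploitable. All names here are invented for the sketch:

```python
import threading

class Resource:
    def __init__(self):
        self.privileged = False

def victim_check_then_use(res, barrier, results):
    # Step 1: the security check passes while the resource is unprivileged.
    allowed = not res.privileged
    barrier.wait()          # attacker is guaranteed to win the race here
    barrier.wait()
    # Step 2: the "use" runs against state the attacker has since changed.
    if allowed:
        results.append(res.privileged)  # True => stale check was exploited

def attacker_flip(res, barrier):
    barrier.wait()
    res.privileged = True   # flip state in the window between check and use
    barrier.wait()

res, results = Resource(), []
barrier = threading.Barrier(2)
t1 = threading.Thread(target=victim_check_then_use, args=(res, barrier, results))
t2 = threading.Thread(target=attacker_flip, args=(res, barrier))
t1.start(); t2.start(); t1.join(); t2.join()
print(results)  # [True]: the use happened on attacker-modified state
```

A real exploit has to win this window probabilistically, often by running the racing threads many times; the barriers here just force the losing interleaving every time so the bug class is visible.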

Left4Dead 2

As a reminder that bugs show up where you least expect them, [Hunter Stanton] shares his story of finding a code execution bug in the popular Valve game, Left4Dead 2. Since the game’s code isn’t available to look at, he decided to go the route of fuzzing. The specific approach he took was to fuzz the navigation mesh data, part of the data contained in each game map. Letting the Basic Fuzzing Framework (BFF) run for three days turned up a few possible crashes, and the most promising turned out to have code execution potential. [Hunter] submitted the find through Valve’s HackerOne bug bounty program, and landed a cool $10k bounty for his trouble.

While it isn’t directly an RCE, [Hunter] does point out that malicious mesh data could be distributed with downloadable maps on the Steam workshop. Alternatively, it should be possible to set up a fake game server that distributes the trapped map.
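The core idea of this kind of fuzzing is simple enough to sketch: mutate bytes of a valid input and record which mutants make the parser fall over. Here is a minimal, deterministic version in Python; the `parse_mesh` format is a made-up stand-in for a nav-mesh loader, not Left4Dead 2’s actual file format:

```python
# Toy format: first byte is a vertex count, followed by that many
# one-byte vertices. A count larger than the payload "crashes".
def parse_mesh(data: bytes):
    count = data[0]
    payload = data[1:]
    if count > len(payload):
        raise ValueError("truncated vertex table")  # our stand-in crash
    return payload[:count]

def fuzz(seed: bytes):
    crashes = []
    for pos in range(len(seed)):
        for mask in (0x01, 0x80, 0xFF):      # a few bit-flip mutations
            mutant = bytearray(seed)
            mutant[pos] ^= mask
            try:
                parse_mesh(bytes(mutant))
            except Exception as e:
                crashes.append((pos, mask, repr(e)))
    return crashes

seed = bytes([3, 10, 20, 30])   # valid input: count=3, three vertices
crashes = fuzz(seed)
print(len(crashes), "crashing mutants found")
```

Real fuzzers like BFF add coverage feedback, crash triage, and smarter mutation strategies on top, but the loop above is the essence: valid seed in, mutated garbage out, watch for the target to die.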

Big Brother Apple?

There is a constant tension between security and privacy. We’re used to governments making arguments about giving up privacy for the sake of security, but the same trade-off can show up in computer security, too. In this case, Apple has implemented an online check for every executable run by a macOS Catalina system. If you’re running macOS 10.15, you might have noticed your system is a bit slower than it should be. It seems that when connected to the internet, a modern Mac will upload a hash of each binary to Apple, presumably to check it against a blacklist of known malware.

The Reddit thread discussing this issue had a few more interesting observations. First off, one user pointed out that he had observed the problem while flying, connected to terrible in-flight Wi-Fi. A second poster observed that a Mac will take an inordinate amount of time to reboot when connected to a network without internet access.

While there is likely an upside, this approach is terrible for performance and user privacy, and a breach of trust between Apple and its users. Apple now has a record of which binaries are run by which users and when, should the company ever want to monetize that data. At the very least, this sort of behavior should be documented, and come with an off switch for those who don’t wish to participate. The fact that it was discovered by internet sleuths is a black eye for Apple.
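For contrast, the check itself doesn’t require phoning home at all. A rough sketch of the same idea with a locally cached denylist — hash the binary, look the hash up locally — leaks nothing about what the user runs. The hashes and "file contents" below are invented for illustration, and this is not Apple’s actual mechanism:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical locally cached denylist of known-malware hashes,
# refreshed periodically the way antivirus signatures are.
DENYLIST = {
    sha256_of(b"known-malware-sample"),
}

def allowed_to_run(binary: bytes) -> bool:
    # Local lookup: no record of what ran ever leaves the machine.
    return sha256_of(binary) not in DENYLIST

print(allowed_to_run(b"known-malware-sample"))    # False: blocked
print(allowed_to_run(b"perfectly ordinary app"))  # True
```

The trade-off is staleness — a local list is only as fresh as its last sync — but it works offline and keeps execution history on the device, which is exactly the property users were surprised to find missing.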

LadderLeak

An interesting attack on certain ECDSA schemes was published on the 25th (PDF). This attack was specifically developed against OpenSSL, and uses a Flush+Reload cache attack to leak information from the elliptic curve operation as it is calculated. At some point we’ll do an in-depth look at elliptic curve cryptography, but for now it’s sufficient to understand that a mathematical operation is performed repeatedly in order to do key exchanges.

For each iteration, the research team was able to extract approximately one bit of information about the internal state of the key (technically less than one bit, since it is a statistical attack). After data collection, a rather CPU-intensive computation is required to recover the key. It’s not a particularly practical attack at this point, but it’s still important for the affected projects to mitigate.
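The "less than one bit" idea is worth a toy model. Suppose each side-channel measurement reveals a secret bit, but with accuracy well below 100% — then no single trace is trustworthy, but repeating the measurement and taking a majority vote recovers the bit with high confidence. The 75% accuracy figure below is arbitrary, chosen to make the statistics visible; the real attack’s per-trace accuracy and its lattice-based post-processing are far more involved:

```python
import random

random.seed(42)
SECRET_BIT = 1
ACCURACY = 0.75    # probability a single noisy measurement is correct

def noisy_measurement() -> int:
    # Returns the secret bit with probability ACCURACY, its complement otherwise.
    correct = random.random() < ACCURACY
    return SECRET_BIT if correct else SECRET_BIT ^ 1

# Collect many noisy observations and take a majority vote.
samples = [noisy_measurement() for _ in range(2001)]
recovered = int(sum(samples) > len(samples) / 2)
print("recovered bit:", recovered)
```

This is why the attack needs a large number of signatures and heavy post-processing: each trace contributes only a fraction of a bit, and the statistics (plus lattice reduction, in the real attack) do the rest.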

The math required to fully appreciate their work is pretty intense, but if that’s your thing, it’s there to be appreciated. For the rest of us, it’s just good to know that our algorithms are under such scrutiny from the good guys. We all win as a result.

iOS Jailbreak

The iOS security landscape has been in a tizzy over the last few weeks. It wasn’t long ago that an iOS exploit was the holy grail of security research, but just recently Zerodium, a zero-day vendor, has stopped accepting iOS zero-days because they have too many.

There’s been a new development: a jailbreak for any device running iOS 11 or newer. The jailbreak, named unc0ver, requires an unlocked phone and a computer. It’s quite a boon to researchers and end users alike.

COVID-19 Contact Tracing — What Could Go Wrong?

The Australian government has developed an Android and iOS app to track the spread of COVID-19, and it seems that it went wrong in all the predictable ways. For starters, once a device has the app installed, that device can be tracked even after the app has been uninstalled. A few of the issues have been fixed, but as the app is closed source, it’s impossible to fully verify that it’s well behaved. Update: The source is available, but under a bizarre license. We suspect that there are other bugs. The link above is the working document maintained by a handful of researchers working to audit the app.

10 thoughts on “This Week In Security: Leaking Partial Bits, Apple News, And Overzealous Contact Tracing”

  1. The macOS thing, it can’t be true, can it? I can’t access a couple of the referenced sites; I can’t believe someone ran this past their legal dept. and got an OK, and that nobody raised a privacy concern, if true. Is this a configurable setting? At least an opt-out configuration? I haven’t got an Apple laptop to check for myself. Just from the engineering aspect it seems like a huge oversight to fail to take into account non-networked, offline, and slow internet-connected devices.

    1. Chrome’s software_reporter_tool.exe service runs in the background and does exactly the same thing, but nobody apart from me seems to care. It’s installed in secret whenever Chrome is installed on Windows, and runs periodically even when Chrome is closed.

      I don’t use Chrome but on many occasions I’ve been called to fix computers due to abnormally high CPU usage, only to find this secretive background Google process gobbling up anywhere from 25-90% of the CPU’s resources.

      Any suggestion that installing undocumented closed-source hidden services that catalogue a user’s software for unknown purposes is unacceptable behaviour gets me banned from forums, because apparently everything Google does is for our own good, and it’s for our own protection.

      I can’t believe anybody can defend any company who behave like this, but it seems I’m in the minority. The majority also seem to love Windows 10 which is crippled by constant telemetry. Unfortunately it’s become the norm.

      1. Thanks for providing this nugget of info! I say this, as I have seen similar issues on a few computers and found it odd that I am seeing open connections to servers controlled by Google and Microsoft. I clued into this a few weeks ago as I was doing some software work on an old Win XP machine. I know, I know… XP… I have to use this old machine to maintain some legacy software/tools until the new stuff is complete. Anyway… as I was checking some networking connections via Wireshark I was floored with the number of packets being sent to IP addresses that, when reverse-looked-up, were Google/Microsoft servers. When I did some checking on the Internet there was little info on this behavior… I have to say I am getting pretty pi**ed at all this crap, as it is no wonder that systems go sideways with all the background big brother nonsense going on…

  2. The Australian tracing app is open-sourced, or at the very least the source has been released. Just with a really restrictive licence. All documented just a few pages below the point in the Google doc linked.
    Enough to see what is going on, but not following best practices in many aspects.

    1. It’s interesting, they might have open-sourced the app, but it’s missing a heap of code. If you decompile the Android APK it has a heap of code that doesn’t appear to be in the repository. Either way, something doesn’t sit right with an app like this that is controlled by the government.

  3. The Australian Covid safe app is an interesting beast.
    Firstly, it’s put out by a government department that can’t even make their departmental intranet work – there is no way they can make this work reliably.

    And storing data on a third party’s server. Major fail…

  4. Apple is rotting to the core.

    Fine, you can hash the executable ONCE, then save it with a digitally signed object so you wouldn’t need connectivity (yes, it could be reverse sandbox attacked but there are other ways).
    Apple could also digitally sign executables without any “store” simply to guard against alteration. Use something like Let’s Encrypt.
    (How do they know the hash is for the intended executable, I can make FakeyWriter and its hash would be fine – which is why much else doesn’t matter).

    Today Apple is more about virtue signalling, showing solidarity with those burning cities in the USA while censoring Hong Kong at the behest of the Chinese Communist Party, because they have more (supply) chains than Marley’s ghost. They have H1-Bs making the code cheaper. And it is not as if they would go bankrupt if they adjusted things 10% on the input end.
