Code obfuscation has been around for a long time. The obfuscated C contest first ran way back in 1984, but there are examples of natural language obfuscation from much earlier in history, namely Cockney rhyming slang: saying “Lady from Bristol” instead of “pistol”, or “lump of lead” instead of “head”. It’s speculated that Cockney was originally used to allow the criminal class to hold conversations without tipping off the police.
Code obfuscation in malware serves a similar purpose: hiding from security devices and applications. There are known code snippets and blacklisted IP addresses that anti-malware software scans for. If that known-bad code can be successfully obfuscated, it can avoid detection. It’s a constant game of cat and mouse, as the deobfuscation code itself eventually makes the blacklist, which leads to new obfuscation techniques, some of them quite off the wall. Well, this week I found a humdinger of an oddball approach: Morse code.
Yep, dots and dashes. The whole attack goes like this. You receive an email, claiming to be an invoice. It’s a .xlsx.hTML file. If you don’t notice the odd file extension, and actually let it open, you’re treated to a web page. The source of that page is a very minimal JS script that consists of a Morse code decoder, and a payload encoded in Morse. In this case, the payload is simply a pair of external scripts that ask for an Office 365 login. The novel aspect of this is definitely the Morse code. Yes, our own [Danie] covered this earlier this week, but it was too good not to mention here.
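The decoder itself takes only a few lines of logic. Here’s a rough sketch of the idea in Python rather than the attacker’s JavaScript, with an illustrative symbol table and payload that are not taken from the actual malware:

```python
# Minimal sketch of Morse-code deobfuscation, in the spirit of the
# attack described above. The table is deliberately partial and the
# payload is a placeholder, not the real malicious string.
MORSE = {
    ".-": "a", "-...": "b", "....": "h", ".": "e",
    ".-..": "l", "---": "o",
}

def morse_decode(payload: str) -> str:
    # Letters are separated by spaces; "/" marks a word break.
    return "".join(
        " " if symbol == "/" else MORSE.get(symbol, "?")
        for symbol in payload.split(" ")
    )

print(morse_decode(".... . .-.. .-.. ---"))  # -> hello
```

In the real attack the decoded string was the HTML/JS that pulled in the two external scripts. The trick works because signature scanners looking for known-bad strings see nothing but dots and dashes.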
Literal Watering Hole Attack
You may have seen the breathless news articles claiming that a cyber bioterrorism attack almost poisoned a city. Let’s talk about what actually happened. The city in question has a population of under 15,000, and a dedicated water system. The primary control computer in the treatment plant legitimately had a remote access program installed on it. An unauthorized user connected through that program from the outside early in the day, but only for a moment. This first access was assumed to be a legitimate use of the remote access system, but later in the day it became obvious that something was afoot. On the second connection, the remote user turned the sodium hydroxide level up, presumably to maximum, and logged back out.
Well, it turns out that the Windows 7 machine in question was running TeamViewer, with a single password shared among all the city’s computers. Christopher Krebs suggests that a ticked-off employee is likely to blame, which is a decent guess.
Metaphorical Watering Hole Attack
These headlines write themselves, but this watering hole attack has quite a twist. We’ve covered problems like typosquatting and malicious project takeovers on platforms like pip and npm. [Alex Birsan] came up with a related attack he calls “Dependency Confusion”. The inspiration was finding PayPal code that wasn’t intended for public release on Github. What intrigued him about this code was the list of dependencies, some of which were open source packages on npm, while some were clearly proprietary packages. This mix brings a question to mind: if a package were uploaded to npm with a name that collided with a local package, which version would be used?
As an enterprising researcher, [Alex] decided to find out. He does make the important caveat that he has authorization to do live penetration testing on all his targets, either through the terms of public bounty programs, or through private agreements. Without that authorization, trying to pull off an attack like this could land you in serious hot water.
So what happens when a build system requests a package that exists both locally and on a public repository? It depends, but in many cases, the package with the higher version number is used. [Alex] pulled information on proprietary package names from every source he could find, and carefully crafted proof-of-concept packages. To collect the list of successes, the spoofed packages exfiltrated a few bytes of data, encoded into a DNS request. The roster of companies that were vulnerable to the attack is impressive, as is the size of the payouts. Apple, PayPal, and Shopify all confirmed the efficacy of the technique, and paid out a cool $30,000 bounty each. Sometimes it’s very good to be a security researcher.
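The heart of the confusion is version resolution. Here’s a minimal sketch of that logic, with a hypothetical package name and version numbers, showing why a spoofed public package can shadow an internal one:

```python
# Sketch of the resolution logic that makes dependency confusion work.
# Many resolvers, given the same package name in an internal and a
# public index, simply pick the highest version number, and the
# attacker controls the public version number.

def parse_version(v: str) -> tuple:
    # Naive dotted-version parser; real resolvers are more elaborate.
    return tuple(int(part) for part in v.split("."))

def resolve(name: str, candidates: dict) -> str:
    # candidates maps source -> version string for the same package name.
    source, _ = max(candidates.items(), key=lambda kv: parse_version(kv[1]))
    return source

# The internal build expects the private 1.2.0, but a spoofed public
# package with an inflated version wins the comparison.
winner = resolve("acme-internal-utils", {"internal": "1.2.0", "public": "99.0.0"})
print(winner)  # -> public
```

The exfiltration side is similarly simple: the spoofed package’s install hook encodes a few identifying bytes into a hostname lookup, which quietly resolves through the victim’s own DNS infrastructure.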
All Your Barcode Are Belong to Us
It’s no great surprise that barcode scanners are popular cell phone apps. You have a handheld computer with a camera and an internet connection. Pointing at a barcode and getting instant information about the product is potentially invaluable. One of the very popular barcode scanner apps on Android, with over ten million installs, has recently started a very aggressive adware campaign. The publisher of that app was LavaBird LTD.
Once the bad behavior was noticed, the app was pulled from the Play Store. That doesn’t remove it from devices, though, so you might want to check yours for any LavaBird software. The Malwarebytes blog points out that sometimes malicious activity like this is the result of a third-party advertising library, and not the fault of the app developer. In this case, the bad code was directly in the app itself, signed by the developer’s key, and obfuscated to be hard to detect. All in all, the situation is reminiscent of the Great Suspender debacle we covered last week, and it makes me suspect that something similar happened: the original app author likely sold to a shady third party, who filled the app with malware.
Noteworthy Notes
Firefox released 85.0.1 about a week ago, and there is an interesting note in the changelog: “Prevent access to NTFS special paths that could lead to filesystem corruption.” This rather cryptic note is a reference to bug 1689598, which is still restricted as of time of writing. We can get a look at the changelog that fixes the issue. Apparently there was a way to invoke a special file handler like $MFT or $Volume when opening a file. A malicious invocation can result in filesystem corruption, hence a potentially serious denial-of-service attack.
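The defensive idea is to treat NTFS metadata files as reserved names and refuse to open paths that touch them. A hedged sketch of that check (the reserved-name list here is partial, and real sanitization on Windows needs more than this):

```python
# Partial list of NTFS metadata files that applications should never
# open directly. Touching paths like C:\$MFT\foo has been known to
# wedge or corrupt the filesystem, which is the class of problem the
# Firefox fix guards against.
NTFS_RESERVED = {"$mft", "$mftmirr", "$logfile", "$volume", "$bitmap", "$boot"}

def is_suspicious_ntfs_path(path: str) -> bool:
    # Normalize separators, then check every component against the list.
    components = path.replace("\\", "/").split("/")
    return any(c.lower() in NTFS_RESERVED for c in components)

print(is_suspicious_ntfs_path(r"C:\$MFT\123"))      # -> True
print(is_suspicious_ntfs_path(r"C:\Users\doc.txt"))  # -> False
```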
Remember the sudoedit bug from a couple weeks ago? A few vendors, namely Apple, considered themselves immune to the bug, since they didn’t have the sudoedit symlink. It didn’t take long for the internet to figure out that it’s simple enough to create the symlink yourself, and then trigger the bug.
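The workaround really is that trivial. A sketch of the idea in Python (the target path is illustrative; the point is that any unprivileged user can create a symlink named sudoedit, and symlinks don’t even require the target to exist):

```python
import os
import tempfile

# The sudo heap-overflow bug was reached through the program being
# invoked as "sudoedit". On systems that ship sudo without the
# sudoedit symlink, an attacker can simply recreate it in a directory
# they control -- no special privilege needed.
workdir = tempfile.mkdtemp()
link = os.path.join(workdir, "sudoedit")
os.symlink("/usr/bin/sudo", link)  # target path is illustrative

print(os.path.islink(link))  # -> True
print(os.readlink(link))     # -> /usr/bin/sudo
```

Invoking sudo through that link then reaches the same vulnerable code path as on systems that ship the symlink, which is why “we don’t have sudoedit” was never real immunity.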
Google seems to be already making good on their “Know, Prevent, Fix” initiative, publishing the Open Source Vulnerabilities (OSV) database. Currently, the OSS-Fuzz project is the sole data source, but more are planned. The purpose seems to be an automatable data source for tracking down vulnerabilities. Time will tell what further tools will result from the KPF push.
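The automation angle is the interesting part: OSV exposes a JSON query API. A minimal sketch of building a request for its query endpoint (the package name and version here are made up, and this only constructs the payload rather than sending it over the network):

```python
import json

# Sketch of an OSV (osv.dev) vulnerability query. The service accepts
# a POST to its /v1/query endpoint containing a package coordinate and
# a version; this just builds that JSON body. The package name and
# version below are hypothetical.
def build_osv_query(name: str, ecosystem: str, version: str) -> str:
    payload = {
        "package": {"name": name, "ecosystem": ecosystem},
        "version": version,
    }
    return json.dumps(payload)

query = build_osv_query("example-lib", "PyPI", "1.0.3")
print(query)
```

A CI pipeline could run a query like this per dependency and fail the build on any hit, which is presumably the kind of tooling the “automatable data source” framing is aimed at.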
Thanks for the heads up about the Barcode Scanner app.
This is the first I’ve heard about that issue and I’ve purged it from my phone since reading this article.
It’s kind of superfluous since a lot of apps already have the capability built in.
A lot of apps call out to an external scanner through some “intent”. I got hit by this and was an early reporter on Play. I’d had this app installed since, I think, my first Android phone some 10 years ago. It was simple, free, no ads, and always worked. There is a new release of the original as a different package.
If you are in the habit of scanning random barcodes and clicking on url shorteners then you have bigger problems and your devices are not your friends.
nospam, where I live there is mandatory contact tracing that relies on QR codes. I have no idea why they couldn’t have just used a URL, but I guess the masses think that pointing their phone at an obfuscated URL is “better” than just knowing what the URL is, and typing that in. I don’t like it either. I have no problem with contact tracing but, as you said, obfuscated URLs are a bad idea.
Internet rule 174: If you think ‘that hack can not affect my OS’ then the hack does affect your OS.