Sufficiently Advanced Technology And Justice

Imagine that you’re serving on a jury, and you’re given an image taken from a surveillance camera. It looks pretty much like the suspect, but the image has been “enhanced” by an AI from the original. Do you convict? How does this weigh out on the scales of reasonable doubt? Should you demand to see the original?

AI-enhanced, upscaled, or otherwise modified images are tremendously realistic. But what they’re showing you isn’t reality. When we wrote about this last week, [Denis Shiryaev], one of the authors of one of the methods we highlighted, weighed in via the comments to point out that these modifications aren’t “restorations” of the original. While they might add incredibly fine detail, they don’t recreate or restore reality. The neural net creates its own reality, out of the millions and millions of faces that it’s learned.

And for the purposes of identification, that’s exactly the problem: the facial features of millions of other people have been used to increase the resolution. Can you identify the person in the pixelized image? Can you identify that same person in the resulting up-sampling? If the question put before the jury was “is the defendant a former president of the USA?” you’d answer the question differently depending on which image you were presented. And you’d have a misleading level of confidence in your ability to judge the AI-retouched photo. Clearly, informed skepticism on the part of the jury is required.
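The many-to-one nature of pixelation is easy to demonstrate. Here’s a toy Python sketch (the `pixelate` helper and the example “images” are our own illustration, not code from any of the upscalers discussed): two completely different originals reduce to exactly the same low-resolution image, so any detail an upscaler adds back is invented, not recovered.

```python
def pixelate(img, factor):
    """Average-pool a grayscale image (a list of rows) by an integer factor."""
    h, w = len(img), len(img[0])
    out = []
    for by in range(0, h, factor):
        row = []
        for bx in range(0, w, factor):
            block = [img[y][x] for y in range(by, by + factor)
                               for x in range(bx, bx + factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# Two clearly different 4x4 "faces"...
checkerboard = [[0, 255, 0, 255],
                [255, 0, 255, 0],
                [0, 255, 0, 255],
                [255, 0, 255, 0]]
flat_gray = [[127.5] * 4 for _ in range(4)]

# ...collapse to the identical 2x2 pixelated image. Looking only at the
# low-res version, no algorithm can tell which original it came from.
assert pixelate(checkerboard, 2) == pixelate(flat_gray, 2)
```

Run in reverse, that’s the upscaler’s dilemma: it must pick one of the countless plausible originals, and it picks by drawing on the faces it was trained on rather than on the person who was actually in front of the camera.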

Unfortunately, we’ve all seen countless examples of “zoom, enhance” in movies and TV shows being successfully used to nab the perps and nail their convictions. We haven’t seen nearly as much detailed analysis of how generative adversarial networks create faces out of a scant handful of pixels. This, combined with the almost magical resolution of the end product, would certainly sway a jury of normal folks. On the other hand, the popularity of intentionally misleading “deep fakes” might help educate the public to the dangers of believing what they see when AI is involved.

This is just one example, but it shows why keeping the public interested in and educated on the deep workings and limitations of the technology that’s running our world is more important than ever, and some of the material is truly hard. How do we separate the science from the magic?

Amazon Sidewalk: Should You Be Co-Opted Into A Private Neighbourhood LoRa Network?

WiFi just isn’t very good at going through buildings. It’s fine for the main living areas of an average home, but once we venture towards the periphery of our domains it starts to become less reliable.  For connected devices outside the core of a home, this presents a problem, and it’s one Amazon hope to solve with their Sidewalk product.

It’s a low-bandwidth networking system that uses capability already built into some Echo and Ring devices, plus a portion of the owner’s broadband connection to the Internet.  The idea is to provide basic connectivity over longer distances to compatible devices even when the WiFi network is not available, but of most interest and concern is that it will also expose itself to devices owned by other people. If your Internet connection goes down, then your Ring devices will still provide a basic version of their functionality via a local low-bandwidth wide-area wireless network provided by the Amazon devices owned by your neighbours. Continue reading “Amazon Sidewalk: Should You Be Co-Opted Into A Private Neighbourhood LoRa Network?”

What Is Worth Saving?

When it rains, it pours. One of the primary support cables holding up the Arecibo Observatory dish in Puerto Rico has just snapped, leaving its already uncertain fate looking even bleaker. The dish had been badly damaged by Hurricane Maria in 2017, and after a few years of fundraising, repairs on that damage were just about to begin when the cable broke. Because the remaining cables are now holding increased weight, humans aren’t allowed to work on the dome until the risk of catastrophic failure has been ruled out — they’re doing inspection by drone.

Arecibo Observatory has had quite a run. It started out life as part of a Cold War-era ICBM-tracking radar program, which explains why it can transmit as well as receive, and it was the largest transmitting dish in the world. It was used in SETI, provided the first clues of gravitational waves, and found the first repeating fast radio bursts. Its radar capabilities mean it can also be used in asteroid detection. There are a number of reasons, not the least of which is its historic import, to keep it running.

So when we ran this story, many commenters, fearing the worst, wrote in with their condolences. But some wrote in with outrage at the possibility that it might not be repaired. The usual suspects popped up: failure to spend enough on science, or on infrastructure. From the sidelines, however, and probably until further structural studies are done, we have no idea how much a repair of Arecibo will cost. After that, we have to decide if it’s worth it.

Per a 2018 grant, the NSF was splitting the $20 million repair and maintenance bill with a consortium led by the University of Central Florida that will administer the site. With further damage, that might be an underestimate, but we don’t know how much of one yet.

When do you decide to pull the plug on something like this? Although the biggest, Arecibo isn’t the only transmitter out there. The next-largest transmitters are part of the Deep Space Network, though, and are busy keeping in touch with spacecraft all around our solar system. For pure receiving, China’s FAST is bigger and better. And certainly, we’ve learned a lot about radio telescopes since Arecibo was designed.

I’m not saying that we won’t shed a tear if Arecibo doesn’t get repaired, but it’s not the case that the NSF’s budget has been hit dramatically, or that they’re unaware of the comparative value of various big-ticket astronomy projects. Without being in their shoes, and having read through the thousands of competing grant proposals, it’s hard to say that the money spent to prop up a nearly 60-year-old telescope wouldn’t be better spent on something else.

Tired Of The Cat-and-Mouse

Facebook just announced their plans for the Oculus Quest 2 VR headset. You probably won’t be surprised, but they want more of your user data, and more control over how you use the hardware. To use the device at all, you’ll need a verified Facebook account. Worse, they’re restricting access to the wide world of community-developed applications by requiring a developer account to be able to “sideload” non-Facebook software onto the device. Guess who decides who gets to be a developer. Hint: it’s not the people developing software.

Our article suggests that this will be the beginning of a race: the community trying to jailbreak the headset, and Facebook trying to stay ahead of the hackers. Just as every new release of iOS gets a jailbreak within a week or two, only for Apple to patch it up as fast as they can, are we going to see a continual game of hacker cat-and-mouse with Facebook?

I don’t care. And that’s not because I don’t care about open hardware or indie VR developers. Quite the opposite! But like that romance you used to have with the girl who was absolutely no good for you, the toxic relationship with a company that will not let you run other people’s games on their hardware is one that you’re better off without. Sure, you can try to fix it, or hack it. You can tell yourself that maybe Facebook will come around if you just give them one more chance. It’s going to hurt at first.

But in the end, there is going to be this eternal fight between the user and the company that wants to use them, and that’s just sad. I used to look forward to the odd game of cat and mouse, but nowadays the cats are just too well bankrolled to make it a fair fight. If you’re buying a Quest 2 today with the intent of hacking it, I’d suggest you spend your time with someone else. You’re signing up for a string of heartbreaks. Nip it in the bud. You deserve better. There are too many fish in the sea, right?

What are our options?

Walmart Gives Up On Stock-Checking Robots

We’ve seen The Jetsons, Star Wars, and Silent Running. In the future, all the menial jobs will be done by robots. But Walmart is reversing plans to have six-foot-tall robots scan store shelves to check stock levels. The robots, from a company called Bossa Nova Robotics, apparently worked well enough, and Walmart had promoted the idea at many investor-related events, promising that robot workers would reduce labor costs while better stock levels would increase sales.

So why did the retail giant say no to these ‘droids? Apparently, they found better ways to check stock and, according to a quote in the Wall Street Journal’s article about the decision, shoppers reacted negatively to sharing the aisle with the roving machines.

The robots didn’t just check stock. They could also check prices and find misplaced items. You can see a promotional video about the device below. Continue reading “Walmart Gives Up On Stock-Checking Robots”

OpenOffice Or LibreOffice? A Star Is Torn

When it comes to open source office suites, most people choose OpenOffice or LibreOffice, and they both look suspiciously similar. That isn’t surprising since they both started with exactly the same code base. However, the LibreOffice team recently penned an open letter to the Apache project — the current keepers of OpenOffice — asking them to redirect new users to the LibreOffice project. Their logic is that OpenOffice has huge name recognition, but hasn’t had a new major release in several years. LibreOffice, on the other hand, is a very active project. We could argue that case either way, but we won’t. But it did get us thinking about how things got here.

It all started when German developer Marco Börries wrote StarWriter in 1985 for the Zilog Z80. By 1986, he had created a company, Star Division, porting the word processor to platforms like CP/M and MS-DOS. Eventually, the company added other office suite programs, and with support for DOS, OS/2, and Windows, the suite became known as StarOffice.

The program was far less expensive than most competitors, costing about $70, and in 1999 that price point prompted Sun Microsystems to buy StarOffice. We don’t mean they bought a copy or a license; they bought the entire thing for just under $74 million. The story was that this was still cheaper than buying an office suite license for each Sun employee, particularly since most had both a Windows machine and a Unix machine, each of which needed some office capability.

Sun in Charge

Sun provided StarOffice 5.2 in 2000 as a free download for personal use, which gave the software a lot of attention. It eventually released much of the code under an open source license producing OpenOffice. Sun contributed to the project and would periodically snapshot the code to market future versions of StarOffice.

This was the state of affairs for a while. StarOffice 6.0 corresponded to OpenOffice 1.0. In 2003, release 1.1 turned into StarOffice 7. A couple of years later, StarOffice 8/OpenOffice 2.0 appeared and by 2008, we had StarOffice 9 with OpenOffice 3.0 just before Oracle entered the picture.

Continue reading “OpenOffice Or LibreOffice? A Star Is Torn”

DSL Is Barely Hanging On The Line As Telcos Stop Selling New Service

Are you reading this over AT&T DSL right now? If so, you might have to upgrade or go shopping for a new ISP soon. AT&T quietly stopped selling new traditional DSLs on October 1st, though they will continue to sell their upgraded fiber-to-the-node version. This leaves a gigantic digital divide, as only 28% of AT&T’s 21-state territory has been built out with full fiber to the home, and the company says they have done almost all of the fiber expansion that they intend to do. AT&T’s upgraded DSL offering is a fiber and copper hybrid, where fiber ends at the network node closest to the subscriber’s home, and the local loop is still over copper or coax.

At about the same time, a report came out written jointly by members of the Communications Workers of America union and a digital inclusion advocacy group. The report alleges that AT&T targets wealthy and non-rural areas for full fiber upgrades, leaving the rest of the country in the dark.

As the internet has been the glue holding these unprecedented times together, this news comes as a slap in the face to many rural customers who are trying to work, attend school, and see doctors over various videoconferencing services.

If you live in a big enough city, chances are you haven’t thought about DSL for about twenty years, if ever. It may surprise you to learn of the continued popularity of ADSL in the United Kingdom. ADSL was the main source of broadband in the UK until 2017, when it was overtaken by the rise of fibre-to-the-cabinet (FTTC) connections. Even so, this Ofcom report shows that in 2018 ADSL still made up more than a third of all UK broadband connections.

Why do people still have it, and what are they supposed to do in the States when it dries up?

Continue reading “DSL Is Barely Hanging On The Line As Telcos Stop Selling New Service”