Facial Recognition For Covid-19 Tracking In Seoul

The city of Bucheon, population 830,000, a satellite city southwest of Seoul in the greater metropolitan area, is the site of a pilot program applying AI facial recognition and tracking technologies to aid Covid-19 epidemiological investigators. South Korea has been widely praised for its rapid response to coronavirus patient tracking since the beginning of the outbreak. People entering public facilities sign in on a roster or scan a QR code. Epidemiologists tracking outbreaks use a variety of data available to them, including these logs, electronic transaction data, mobile phone location logs, CCTV footage, and interviews. But the workload can be overwhelming, and there are only so many workers with the required training available, despite efforts to hire more.

As contact tracing has been done to date, it takes one investigator up to an hour to trace the movements of one patient. When the system goes online in January, it should be able to trace one patient in under a minute, handling up to ten traces simultaneously. Project officials say there are no plans to expand the system to the rest of Seoul or nationwide. But with growing virus caseloads and continued difficulties hiring and training investigators, it would not be surprising if officials turn to these technologies more and more to keep up with the increasing workload.

As with the controversy surrounding the recent facial recognition project at Incheon International Airport, people are concerned about the privacy implications and the specter of a Big Brother government that tracks its citizens' every move, a valid fear given the state of technology today. The project planners note that the data is being legally collected and its usage subject to strict rules. Korean privacy law requires consent for the collection and storage of biometric data, but there are exceptions for situations such as disease control and prevention.

Even if all the privacy concerns are solved, we wonder just how effective these AI systems will be at tracking people wearing masks. This is not an issue unique to South Korea or even Asia. Many countries around the world are turning to such technologies (see this article from the Columbia School of Law) and are having similar struggles striking the balance between privacy and public health requirements.

[Banner image: “facial-recognition-1” by Electronic_Frontier_Foundation. Thanks for all you do!]

Korean Facial Recognition Project Faces Opposition

It was discovered last month that a South Korean government project has been providing millions of facial images taken at Incheon International Airport to private industry without the consent of those photographed. Several civic groups called this a “shocking human rights disaster” in a 9 Nov press conference, and formally requested that the project be cancelled. In response, the government has only promised that “the project would be conducted at a minimum level to ensure personal information is not abused”. These groups are now planning a lawsuit to challenge the project.

Facial information and other biometric data aren’t easily altered and are unique to the individuals concerned. If this data were to be leaked, it would constitute a devastating infringement upon their privacy. It’s unheard of for state organizations — whose duty it is to manage and control facial recognition technology — to hand over biometric information collected for public purposes to a private-sector company for the development of technology.

The program itself wasn’t secret, and had been publicly announced back in 2019. But the project’s scope and implementation weren’t made clear until a lawmaker recently requested documents on the project from the responsible government agencies. The system, called the Artificial Intelligence and Tracking System Construction Project, was a pilot program set to run until 2022. Its goals were to simplify the security and immigration screening of passengers, improve airport security, and to promote the local AI industry in South Korea. If the project proves successful, the plan is to expand it to other airports and ports in the country.

Current systems at the airport do one-to-one facial recognition. For example, they try to determine whether the face of the person presenting a passport matches the photo in the passport. The goal of this new project was to develop one-to-many matching algorithms, which can match one face against the plethora of faces in an airport, track the movement of a face within the airport, and flag “suspicious” activities which could be a security concern.
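The computational gap between the two modes is worth spelling out. Here is a minimal sketch, assuming face embeddings have already been extracted by some upstream model; the function names, threshold, and vectors are our own illustration, not the airport system's actual pipeline:

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two face embedding vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def one_to_one(probe, reference, threshold=0.6):
    """Verification: does this face match that one passport photo?"""
    return cosine(probe, reference) >= threshold

def one_to_many(probe, gallery, threshold=0.6):
    """Identification: which of the many faces in the gallery match?"""
    return [name for name, ref in gallery.items()
            if cosine(probe, ref) >= threshold]
```

One-to-one is a single comparison per traveler; one-to-many scales with the size of the gallery, which is part of why tracking a face through a busy terminal is a much harder problem than checking it against a passport.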

The groups protesting the project note that the collection and sharing of these images without the travelers’ consent is prohibited by the Personal Information Protection Act, the South Korean law which governs such things. Under this act, a project like this would ordinarily require consent of the participants. But the government’s interpretation relies on an exception in the act, specifically, Article 15 Section 3, which states:

A personal information controller may use personal information without the consent of a data subject within the scope reasonably related to the initial purpose of the collection

Basically, they are saying that since the images were collected at the security and immigration checkpoints, and the project will use them to improve those same checkpoints, no consent is required. The documents released on the project break down the data provided as follows:

  • Foreigners: 120 million individuals, face image, nationality, gender, age
  • Korean citizens: 57.6 million individuals, face image, nationality, gender, age
  • Other: unknown number of individuals, images and videos of atypical behavior and travelers in motion

The breakdown of the numbers above reveals that 57 million Korean citizens are in the data set, a bit surprising to many since the collection of biometric data on Korean citizens at immigration is prohibited by law. The project circumvented this by only collecting data from citizens who participate in the automated Smart Entry service, a voluntary program which uses fingerprints and facial recognition. It’s interesting to note that the number of passengers using Incheon airport since May 2019 (the program was announced 30 Apr 2019) is only 62 million, so the average passenger appears approximately three times in the data set.
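The back-of-the-envelope arithmetic behind that "approximately three times" figure is straightforward:

```python
# Figures from the released project documents, in individual records.
foreigners = 120_000_000   # face images of foreign nationals
citizens   = 57_600_000    # face images of Korean citizens
passengers = 62_000_000    # Incheon passengers since May 2019

total_records = foreigners + citizens       # 177.6 million records
per_passenger = total_records / passengers  # just under 3 per passenger
```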

Are there any similar programs in your region? How do they handle the issue of consent, if at all? Let us know in the comments below.

[Banner image: “Customer uses facial recognition as identification at TSA security checkpoint” by DeltaNewsHub, CC BY 2.0  — Yes, it’s from another country with similar problems, but much less public outcry. Discuss in the comments!]

Iceland Is Doing Its COVID-19 Proximity Tracing The Open Source Way

As governments around the world grapple with the problem of tracing those who have had contact with a person known to have been infected with the COVID-19 virus, attention has turned to the idea of mobile apps that can divulge who a person has been near so that they can be alerted of potential infections. This has a huge potential for abuse by regimes with little care for personal privacy, and has been a significant concern for those working in that field. An interesting compromise has been struck by Iceland, who have produced an app for their populace that stores the information on the device and only uploads it with the user’s consent once they have received a diagnosis. We can all take a look, because to ensure transparency they have released it as open source.

On signing up for the scheme, a central server stores the details of each user as well as their phone number. When the epidemiologists need to trace a person's contacts, they send a notification, and the person can consent to the upload of their data. This is a fine effort to retain user privacy, with the flaw (or the advantage, depending on your viewpoint) being that a user's data cannot be slurped without their knowledge. Iceland is a country with a relatively small population, so we can imagine that with enough consent the tracing could be effective.
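In code terms, the consent gate described above might look something like this toy model; it is our own illustration of the structure, not taken from the actual Icelandic app's source:

```python
class TracingClient:
    """Toy model of the consent-gated design: records stay on the
    device until the user explicitly approves an upload request."""

    def __init__(self):
        self.log = []           # location/contact records, on-device only
        self.requested = False  # epidemiologists have asked for an upload

    def record(self, entry):
        self.log.append(entry)

    def request_upload(self):
        # The server can notify and ask -- it cannot take.
        self.requested = True

    def upload(self, user_consents):
        if not (self.requested and user_consents):
            return None         # nothing leaves the phone
        data, self.log, self.requested = self.log, [], False
        return data
```

The flaw-or-advantage noted above falls out of the structure: unless the user themselves sets the consent flag, the upload path is simply unreachable.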

We installed the Android version on the Hackaday phone to have a look, but unfortunately it seems you need to be in Iceland for the app to be of much use, so we couldn't explore it fully. We would be interested to hear our Icelandic readers' views. Meanwhile, readers can juxtapose the Icelandic app with another proposal for a more anonymised version.

Decentralized Privacy-Preserving Proximity Tracing

As we continue through the pandemic, whether on lockdown or still at work, there is a chance for any of us to pick up the virus from a stray contact. Mapping these infections and tracing those who have been in proximity to patients presents a major problem for infection control authorities, and there have been a variety of proposals for smartphone apps designed to track users' contacts via the Bluetooth identities their phones encounter. This is a particular concern to privacy advocates, because some governments could use it as an excuse to bring in intrusive personal surveillance. A group of academics from institutions across Europe have come together with a proposal for a decentralised proximity tracing system that allows identification of infection risk without compromising the privacy of those using it.

Where a privacy-intrusive system might use a back-end database tracking all users and recording their locations and interactions, this one uses anonymised tokens stored at the local level rather than at the central server. When a user is infected this is entered at app level rather than at server level, and the centralised part of the system merely distributes the anonymised tokens to the clients. The computation of whether contact has been made with an infected person is thus made on the client, meaning that the operator has no opportunity to collect surveillance data. After the pandemic has passed the system will evaporate as people stop using it, rather than remaining in place harvesting details from installed apps. They are certainly not the first academics to wrestle with this thorny issue, but they seem to have ventured further into the mechanics of it all.
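A loose sketch of how such a decentralised scheme can work is shown below. This is simplified well beyond the group's actual specification; the hash-chain derivation, token sizes, and function names are illustrative only:

```python
import hashlib

def daily_tokens(seed, n=96):
    """Derive a day's worth of ephemeral broadcast tokens from a secret
    seed by hash-chaining; observers cannot link tokens without the seed."""
    tokens, t = [], seed
    for _ in range(n):
        t = hashlib.sha256(t).digest()
        tokens.append(t[:16])   # broadcast only a short token over Bluetooth
    return tokens

def exposure_count(heard_tokens, published_seeds):
    """Runs on the client. Diagnosed users publish their seeds; each phone
    re-derives the tokens and checks them against what it overheard locally.
    The server never learns who matched, or whether anyone did."""
    published = {tok for seed in published_seeds
                 for tok in daily_tokens(seed)}
    return len(published & set(heard_tokens))
```

Because the matching happens entirely on the handset against a public list of seeds, the central server's only job is distribution, which is exactly the property that keeps the operator out of the surveillance business.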

As with all new systems, it’s probably good to subject it to significant scrutiny before deploying it live. Have a read. What do you think?

We are all watching our authorities as they race to respond to the pandemic in an effective manner, and we hope that should they opt for an app that it does an effective job and they resist the temptation to make it too intrusive. Our best course of action meanwhile as the general public is to fully observe all advised public health measures such as self-isolation or the wearing of appropriate personal protective equipment.

Chatterbox Voice Assistant Knows To Keep Quiet For Privacy

Cruising through the children’s hands-on activity zone at Maker Faire Bay Area, we see kids building a cardboard enclosure for the Chatterbox smart speaker kit. It would be tempting to dismiss the little smiling box as “just for kids”, but doing so would overlook something more interesting: an alternative to the data-mining corporations who dominate the smart speaker market. People are rightly concerned about Amazon Echo and Google Home, always-listening devices that send data back to their makers’ corporate data centers. In order to be appropriate for children, Chatterbox is none of those things. It only listens when a button is pressed, and its online model is designed to support the mission of the CCFC (Campaign for a Commercial-Free Childhood).

Getting started with a Chatterbox is much like other products designed to encourage young makers. The hardware — Raspberry Pi, custom HAT, speaker, and button inside a cardboard enclosure — is conceptually similar to a Google AIY Voice kit, but paired with an entirely different software experience. Instead of signing in to a Google developer account, children create their own voice interaction behavior with a block-based programming environment resembling MIT Scratch. Moving online, Chatterbox interactions draw upon the resources of similarly privacy-minded entities like DuckDuckGo web search. The voice interaction foundation is built upon a fork of Mycroft, with changes focused on education and child-friendliness. If a Chatterbox is unsure whether a query was for “Moana” or “Marijuana”, it will decide in favor of the Disney movie.
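That last behaviour is easy to picture as a re-ranking step over the speech recognizer's candidate transcripts. The allowlist and function below are purely our own hypothetical sketch, not Chatterbox's code:

```python
# Hypothetical child-friendly allowlist, not Chatterbox's actual data.
SAFE_TITLES = {"moana", "frozen", "coco"}

def pick_transcript(candidates):
    """Given candidate transcripts ordered by recognizer confidence,
    prefer any child-friendly reading; otherwise keep the top result."""
    safe = [c for c in candidates if c.lower() in SAFE_TITLES]
    return safe[0] if safe else candidates[0]
```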

Many of these privacy-conscious pieces are open source or freely available, but Chatterbox pulls them all together into a single package that’s an appealing alternative to the big brand options. Based on conversations during Hackaday’s Maker Faire meetup, there’s a market beyond parents of young children, from technically aware adults who lack web API coding skills to senior citizens unaware of the dark corners of the web. The Chatterbox Kickstarter campaign has a few more weeks to run but has already reached its funding goal. We look forward to having a privacy-minded option in voice assistants.

Wave Bubble Portable RF Jammer
Hack-A-Day friend [Limor] AKA [ladyada] has been promising a portable RF jammer for a while. Guess what she sent me for Christmas? The Wave Bubble is a self-tuning RF jammer, good for around 20 feet of RF-enforced peace. (It outputs 0.1 to 0.3 watts.) With a pair of less efficient antennas, it even fits inside a pack of cigarettes. She’ll never sell these because the FCC would come a-knockin’, but if you’ve got some major skills, you might be able to build one. (I’m going to believe her take on this; I’ve seen her work in person and it’s some damn fine stuff.)

Merry Christmas! Get your Design Challenge entries in today!