A&S International: Privacy and Cybersecurity Woes Stifling Facial Recognition Adoption
December 1, 2020
By Prasanth Aby Thomas, Consultant Editor
Article originally posted via A&S International
Privacy and cybersecurity concerns have been among the biggest obstacles to the growth of biometrics. Several governments worldwide have restricted the use of facial recognition, citing privacy violations. The US city of San Francisco was among the first to pass laws banning the use of facial recognition. Other US jurisdictions followed suit, and according to the latest reports, the EU and countries in other regions are debating similar measures.
For facial recognition companies, this raises several questions, especially as demand for the technology increases amid COVID-19. First, how do such privacy concerns affect demand? Second, are these concerns valid, and what can be done to prevent misuse of the technology?
Limiting facial recognition to access control
Young Moon, CEO of Suprema, said that facial recognition used for access control and time & attendance (the area Suprema is involved in) is less controversial than facial recognition used for surveillance monitoring (i.e., to track down criminal suspects). This is because the organizations that deploy the biometric technology for access control usually obtain users' agreement on the use of their personal data.
"Biometric technology is no less safe, if not safer, against cybersecurity threats than traditional technologies that use personal information like social security numbers or dates of birth," Moon continued. "This is because biometric information is stored in templates (binary data that is not reversible to original information such as profile or fingerprint images) and not as original personal information. In addition, the templates can be (and usually are) encrypted to better combat cybersecurity threats."
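Moon's two safeguards, a lossy one-way template plus encryption at rest, can be sketched in a few lines of Python. This is a toy illustration only, not Suprema's or any vendor's actual scheme: the quantization step stands in for a proprietary template encoding, and the SHA-256 XOR keystream stands in for a real cipher such as AES-GCM.

```python
import hashlib
import secrets

def make_template(feature_vector):
    """Quantize a face embedding into a compact binary template.
    The rounding is lossy, so the original face image cannot be
    reconstructed from the stored bytes. (Illustrative only.)"""
    return bytes(int(max(0.0, min(1.0, f)) * 255) for f in feature_vector)

def xor_keystream(data, key):
    """Encrypt/decrypt bytes with a keystream derived from the key.
    Toy construction: a real deployment would use an authenticated
    cipher such as AES-GCM instead."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# Hypothetical 8-dimensional embedding from some face-recognition model
embedding = [0.12, 0.87, 0.45, 0.99, 0.03, 0.60, 0.33, 0.71]
key = secrets.token_bytes(32)

template = make_template(embedding)        # one-way, lossy representation
ciphertext = xor_keystream(template, key)  # encrypted at rest
recovered = xor_keystream(ciphertext, key) # XOR is its own inverse

print(recovered == template)  # the system can still decrypt for matching
```

The point of the sketch is that a database breach exposes only encrypted, non-reversible templates rather than face images, which is the basis of Moon's comparison with social security numbers.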
Similar concerns were voiced by Steven Humphreys, CEO of Identiv, who pointed out that the main driver of privacy concerns is that biometrics are being used in a shotgun approach, for example on large groups of people and for identifying people across cities.
"If you're coming into your office, or you have an appointment at the IRS, and they're checking that it's you at the door and not someone else with an access control system, most people don't have a huge problem with that," Humphreys continued, "because earlier access cards used to hold a lot of information as well. Now, they check the face that accesses that door."
In other words, there are credible concerns with facial recognition (false positives, false negatives, and misuse of data) when it is used in extremely broad applications, but these are mostly limited to group surveillance.
"I think when it's making me safer, or making it more convenient to walk up to my door, because I've got my phone in my pocket which can recognize a face and the door just opens, I'm not going to say, 'Oh, my gosh, you misused my face,'" Humphreys added. "So, I think it's very much down to the use case. And we do have to be careful because right now, there's a positive view towards biometrics in many areas because we all use touch access on our phones and many gadgets. We're getting comfortable with biometrics. If organizations misuse them, they can create discomfort and concerns about it. We, in the industry and especially in the government sector, have to be careful that we're using it only for safety and convenience and not for mass surveillance or something inappropriate."
The privacy and cybersecurity concerns that have hurt the facial recognition industry may not extend to the access control market because of its limited scope. Facial recognition for access control is used only with people's consent, so there is less cause for fear. However, care needs to be taken to ensure that data does not fall into the wrong hands, to keep such concerns from entering this segment and stifling its growth.