Issues in Machine Learning’s Facial Recognition Capabilities

Dru Macasieb
2 min read · May 31, 2021

Software is not perfect; it can always be improved. Take, for example, the advances in artificial intelligence, specifically machine learning and facial recognition software. Research by the National Institute of Standards and Technology (NIST) found that facial recognition software, which uses machine learning algorithms, disproportionately misidentifies women and people of color (Crumpler, 2020).

Amazon Rekognition is Amazon’s fully managed SaaS computer vision service that enables developers to analyze images and videos for a variety of use cases, including identity verification.
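For a concrete sense of the service, here is a minimal sketch of calling Rekognition’s CompareFaces API through boto3 for identity verification. The bucket and object names are placeholders, and the 90% similarity threshold is an arbitrary example value, not a recommendation.

```python
import boto3

# Create a Rekognition client (assumes AWS credentials are configured).
client = boto3.client("rekognition", region_name="us-west-2")

# Compare a reference ID photo against a probe photo, both stored in S3.
# Bucket and object names below are placeholders for illustration.
response = client.compare_faces(
    SourceImage={"S3Object": {"Bucket": "my-bucket", "Name": "id-photo.jpg"}},
    TargetImage={"S3Object": {"Bucket": "my-bucket", "Name": "selfie.jpg"}},
    SimilarityThreshold=90,  # only report matches at or above 90% similarity
)

# Each match carries a similarity score and the face's bounding box.
for match in response["FaceMatches"]:
    print(f"Match found with similarity {match['Similarity']:.1f}%")
```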
The problem with facial recognition technology lies in the data that the machine learning algorithms process. For example, even if an algorithm shows no difference in accuracy between demographics, disparate impact occurs when certain groups are over-represented in databases. African American males, for example, are disproportionately represented in the mugshot databases many law enforcement facial recognition systems use for matching. If facial recognition becomes an accepted law enforcement tool, African American men could be more frequently identified and tracked because they are disproportionately enrolled in criminal databases.
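To make the disparate-impact point concrete, here is a small simulation of my own (illustrative only, not drawn from the cited research). Both groups get an identical per-record false-match rate, yet the over-represented group absorbs roughly four times as many false matches simply because it has four times the enrollment.

```python
import random

random.seed(42)

FALSE_MATCH_RATE = 0.001  # identical per-record error rate for both groups
NUM_SEARCHES = 100

# Hypothetical enrollment counts: group_a is over-represented 4:1.
enrolled = {"group_a": 8_000, "group_b": 2_000}

false_matches = {group: 0 for group in enrolled}
for _ in range(NUM_SEARCHES):
    for group, count in enrolled.items():
        # Each enrolled record has the same small chance of being
        # incorrectly returned as a match for the probe image.
        false_matches[group] += sum(
            random.random() < FALSE_MATCH_RATE for _ in range(count)
        )

for group, hits in false_matches.items():
    print(f"{group}: {hits} false matches over {NUM_SEARCHES} searches")
# group_a accrues roughly four times as many false matches as group_b,
# even though the algorithm treats every record identically.
```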

How do we fix this issue? Software testers need to be familiar with the problem the software seeks to solve and must ensure that the data it processes produces results that are not misleading. This requires software testers to focus on acceptance activities, verifying that the software works as intended while keeping in mind the ethical implications of data that has not been cleansed.
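One concrete form that acceptance activity could take is a per-demographic error-rate check run against labeled evaluation data before release. This is a sketch of my own with made-up field names, not an established testing standard:

```python
def false_match_rate_by_group(results):
    """Compute the false-match rate for each demographic group.

    `results` is a list of dicts with hypothetical fields:
      {"group": str, "predicted_match": bool, "actual_match": bool}
    """
    stats = {}
    for r in results:
        s = stats.setdefault(r["group"], {"false": 0, "non_matches": 0})
        if not r["actual_match"]:  # only non-matching pairs can produce
            s["non_matches"] += 1  # a false match
            if r["predicted_match"]:
                s["false"] += 1
    return {
        g: (s["false"] / s["non_matches"]) if s["non_matches"] else 0.0
        for g, s in stats.items()
    }


def assert_rate_parity(rates, max_ratio=1.25):
    """Fail acceptance if the worst group's false-match rate exceeds the
    best group's by more than max_ratio (an illustrative threshold)."""
    worst, best = max(rates.values()), min(rates.values())
    if best == 0:
        assert worst == 0, f"Disparate false-match rates: {rates}"
    else:
        assert worst / best <= max_ratio, f"Disparate false-match rates: {rates}"


# Example with made-up evaluation records: group_a's false-match rate is
# 0.5 while group_b's is 0.0, so the assertion fires before release.
evaluation = [
    {"group": "group_a", "predicted_match": True, "actual_match": False},
    {"group": "group_a", "predicted_match": False, "actual_match": False},
    {"group": "group_b", "predicted_match": False, "actual_match": False},
    {"group": "group_b", "predicted_match": False, "actual_match": False},
]
assert_rate_parity(false_match_rate_by_group(evaluation))  # raises AssertionError
```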

What do you think about facial recognition technology?

References:

Desharnais, J., Abran, A., & Suryn, W. (2011). Identification and analysis of attributes and base measures within ISO 9126. Software Quality Journal, 19(2), 447–460. https://doi.org/10.1007/s11219-010-9124-5

Crumpler, W. (2020, May 1). The problem of bias in facial recognition. Center for Strategic and International Studies. https://www.csis.org/blogs/technology-policy-blog/problem-bias-facial-recognition
