Categories
Research Post

Research Post 3 – Ysabelle Galvez

The title of the video "Facial Weaponization" is a very appropriate and accurate way of framing facial recognition bias. As Kyle McDonald's article pointed out, faces are central to how we associate with identity. When inequalities are programmed into face recognition, we invalidate those who suffer from these prejudices. The flaws of algorithms do not just happen miraculously; they are programmed in, or go unnoticed in the first place due to a lack of representation. Programmed inequalities do not just affect technology, but real-life attitudes as well.

In the video "Facial Recognition: Last Week Tonight with John Oliver," Oliver talks about how facial recognition is mainly used for law enforcement. At first I thought that was a good thing, but then he asks, "but at what cost?" and explains that facial recognition has been deployed in situations that don't necessarily call for law enforcement (for example, identifying people at protests, which interferes with the American right to protest). Furthermore, governments have used facial recognition to track people, which, once explained, amounts to a very advanced invasion of privacy. This video has taught me that facial recognition has gone too far and doesn't necessarily help prevent crime.