One thing that struck me was “FindFace”, a program where you can take a picture of someone and, in return, get their “profile”: a collection of pictures of them from the internet. This could be used by anyone, on anyone. It shows how much privacy is being taken away as social media and technology keep growing and developing.
Joy Buolamwini discusses algorithmic bias in her TED Talk. Facial recognition software generally works by being given a “training set” that teaches the machine what is a face and what is not, which is what gives it the ability to detect faces. However, these training sets are not always inclusive of all people, so the software can fail on the faces it was never shown. I especially liked how she framed this issue: more inclusive training sets must, and can, be made.
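To make the training-set idea concrete, here is a toy sketch of my own (an illustration, not how any real face-recognition system or the talk's examples actually work). It uses made-up 2-D “features” and a simple nearest-neighbour classifier: when one group's faces are missing from the training set, the detector misses them, and adding that group back fixes it.

```python
import random
import math

random.seed(42)

def sample(kind):
    """Toy 2-D 'image features'; the centers and spread are invented."""
    centers = {
        "face_A": (1.0, 1.0),    # well-represented group
        "face_B": (-0.6, -0.6),  # underrepresented group, near non-faces
        "nonface": (-1.0, -1.0),
    }
    cx, cy = centers[kind]
    return (cx + random.gauss(0, 0.2), cy + random.gauss(0, 0.2))

def nearest_label(point, training):
    """1-nearest-neighbour: take the label of the closest training example."""
    return min(training, key=lambda ex: math.dist(point, ex[0]))[1]

def accuracy(kind, training, n=50):
    """Fraction of test samples of `kind` that get detected as a face."""
    tests = [sample(kind) for _ in range(n)]
    return sum(nearest_label(p, training) == "face" for p in tests) / n

# Skewed training set: only group-A faces were ever collected.
skewed = ([(sample("face_A"), "face") for _ in range(50)]
          + [(sample("nonface"), "nonface") for _ in range(50)])

# Inclusive training set: group-B faces are represented too.
inclusive = skewed + [(sample("face_B"), "face") for _ in range(50)]

print("group A faces, skewed set:   ", accuracy("face_A", skewed))
print("group B faces, skewed set:   ", accuracy("face_B", skewed))
print("group B faces, inclusive set:", accuracy("face_B", inclusive))
```

With the skewed set, group-B faces sit closer to the “non-face” examples than to any face the machine has seen, so they go undetected; once group-B faces are in the training set, detection recovers. That is the sense in which a more inclusive training set is both necessary and achievable.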
Links: