The initial idea for my midterm was to make an app that would give users information about their personality, based on a hypothetical model that predicts personality from facial expression. In the proposed interface, users would take a picture of their face, have it analysed by a personality prediction algorithm, and then answer questions about how accurate the model's output feels from their own perspective.
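To make the proposed flow concrete, here is a minimal sketch of what the pipeline could look like. It assumes OpenCV's Haar-cascade face detector; the `predict_traits()` function and the trait names are placeholders I am inventing for illustration, not the actual model the project would use.

```python
# Hypothetical sketch of the app flow: photo -> face detection -> trait
# "prediction" -> user feedback. The prediction step is a stand-in.
import cv2


def predict_traits(face_crop):
    """Placeholder for the (hypothetical) personality prediction model.
    A real model would map facial features to trait scores; here we just
    return fixed dummy values."""
    return {"openness": 0.5, "conscientiousness": 0.5, "extraversion": 0.5,
            "agreeableness": 0.5, "neuroticism": 0.5}


def analyse_photo(path):
    """Detect the first face in the photo and run the placeholder model on it."""
    image = cv2.imread(path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]  # use the first detected face
    return predict_traits(image[y:y + h, x:x + w])


def collect_feedback(traits):
    """Ask the user how accurate each predicted trait feels (1-5)."""
    ratings = {}
    for trait, score in traits.items():
        answer = input(f"The model rates your {trait} at {score:.2f}. "
                       "How accurate does that feel (1-5)? ")
        ratings[trait] = int(answer)
    return ratings
```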
I was interested in the topic because of recent revelations about how facial structure is increasingly being used by security forces to identify potential “suspects” or “troublemakers” in public spaces. Such systems echo 19th-century attempts to map “good” and “bad” personality traits onto features of the bone structure of the human head. Despite their scientific veneer, modern personality prediction algorithms are largely built on flawed and biased data.
In my project, I wanted to build an algorithm on as accurate and scientifically validated a dataset of perceived personality traits as I could find, and then compare its output to how people actually react to their own personality “prophecies”. The project is about the gap between inner truth and outer perception at a time when complexity is increasingly automated and outsourced to models.
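The comparison step could be as simple as aggregating the feedback ratings per trait. This is a rough sketch under the assumption that each session is stored as a dict of trait-to-rating values, as in the placeholder flow above.

```python
# Hypothetical aggregation: how strongly, on average, do people agree
# with each trait "prophecy" the model gave them?
from statistics import mean


def agreement_per_trait(sessions):
    """sessions: list of dicts mapping trait name -> user rating (1-5)."""
    traits = sessions[0].keys()
    return {trait: mean(s[trait] for s in sessions) for trait in traits}
```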