Week 2: MLNI—Case Study-Interface with ML or AI (Wei Wang)

Partner: Cherry Cai (rc3470)

Project Topic: Refraction Emotion

Presentation Slides

Introduction

Ouchhh created an AI-driven t-SNE visualization of hundreds of books and articles (approximately 20 million lines of text) written by scientists who changed the destiny of the world. The texts were fed to a recurrent neural network during training, and the trained models were then used to generate new texts. AI converted those texts into visuals, while 136 projectors created a real-time interaction for the audience during the exhibition.
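The core idea of a t-SNE text visualization can be sketched in a few lines. This is not Ouchhh's actual pipeline, just a minimal illustration under assumed inputs: each line of a tiny placeholder corpus is vectorized, then t-SNE reduces the high-dimensional vectors to 2-D points that a projection system could render.

```python
# Minimal sketch: embed lines of text, then reduce to 2-D with t-SNE.
# The tiny "corpus" below is a placeholder, not the real 20M-line dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.manifold import TSNE

corpus = [
    "energy cannot be created or destroyed",
    "the speed of light is constant in a vacuum",
    "species evolve through natural selection",
    "matter consists of vibrating strings",
]

# Turn each line of text into a TF-IDF vector.
vectors = TfidfVectorizer().fit_transform(corpus).toarray()

# Project the high-dimensional vectors down to one (x, y) point per line.
points = TSNE(n_components=2, perplexity=2, random_state=0).fit_transform(vectors)
print(points.shape)
```

At exhibition scale the same idea applies, only with learned neural embeddings instead of TF-IDF and millions of lines instead of four.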

This poetic refraction of scientific consciousness offers the audience an innovative way to perceive the beauty of literature from a new perspective.

This is a cognitive performance created by Ouchhh that captures a pianist's brain waves during his concert and visualizes them. The visualization incorporates data about emotion and neural mechanisms recorded by electroencephalogram (EEG). The project draws on superstring theory, which argues that the world consists only of vibrating thin strings. Ouchhh define the melodies as matter and the symphonies of melodies as the universe, and treat the eleven dimensions as abstract directions that the AI algorithm turns into reality.
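One common way EEG data is turned into a number a visual system can react to is band power. This is a minimal sketch, not Ouchhh's method: the sampling rate is assumed, and a synthetic 10 Hz "alpha rhythm" stands in for real electrode data.

```python
# Minimal sketch: extract alpha-band (8-12 Hz) power from one EEG channel.
# The signal here is synthetic; a real performance would stream electrode data.
import numpy as np

np.random.seed(0)
fs = 256                      # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)   # two seconds of samples
# Synthetic EEG: a 10 Hz alpha rhythm plus noise.
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)

# Power spectrum via FFT.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, 1 / fs)

# Average power in the alpha band: a single scalar the visuals
# could map to color, scale, or motion each frame.
alpha_power = spectrum[(freqs >= 8) & (freqs <= 12)].mean()
print(alpha_power)
```

In a live setting this computation would run on a sliding window so the visuals track the performer's state in real time.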

This musical refraction of the artist strengthens the emotion the performer conveys and helps the audience become immersed in the performance.

Implementation

Help the audience better understand the artist's emotions in the context of his or her creation.

  • Concert/Show
  • Educational purpose
  • A larger group of audience (e.g. disabled people)

MLNI – Presentation Homework (Wei Wang)

Dare to Dream_VODAFONE

This is a facial-mapping project by Ouchhh Creative New Media Agency. The project tracks the facial movement of a model and projects images onto her face. The images stay mapped to her face as she shakes her head slowly and continuously, creating a particular atmosphere and conveying the intended feeling.

Why is it interesting

I find this project interesting because the projection was perfectly mapped onto every corner of the model's face, and her face only. Even when only the side of her face was visible, the projection retained its three-dimensional effect. The projection also expands the space from a single face into a whole new world. But since the model shows no facial expression, it makes me wonder how this kind of real-time facial tracking could be extended if the feeling were strengthened by expression.

Technology Behind

The two main technologies behind this project are facial tracking and projection mapping. I would assume that a camera captured images of the model and that AI algorithms detected the face and analyzed the position of each part of it. Once the positions were determined, images were projected onto the corresponding parts.
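The step that links the two technologies is mapping a detected face from camera coordinates into projector coordinates. A minimal sketch of that idea, assuming the face tracker yields a bounding box and that a simple affine calibration between camera and projector was measured during setup (the matrix values and function names here are illustrative, not from the actual project):

```python
import numpy as np

# Assumed calibration: a 2x3 affine matrix mapping camera pixels to
# projector pixels, as might be estimated once during setup by
# matching known marker positions. Values are made up for illustration.
cam_to_proj = np.array([[1.2, 0.0, 40.0],
                        [0.0, 1.2, 25.0]])

def map_point(x, y):
    """Map one camera-space point into projector space."""
    return cam_to_proj @ np.array([x, y, 1.0])

def project_onto_face(face_box):
    """Given a detected face bounding box (x, y, w, h) in camera
    coordinates, return the projector-space rectangle to draw into."""
    x, y, w, h = face_box
    top_left = map_point(x, y)
    bottom_right = map_point(x + w, y + h)
    return (*top_left, *bottom_right)

# Hypothetical detection from a face tracker on one camera frame.
print(project_onto_face((100, 80, 60, 60)))
```

A production system like Ouchhh's would track many facial landmarks per frame and warp the imagery with a full 3-D model of the face rather than a flat rectangle, but the camera-to-projector mapping is the same basic idea repeated per landmark.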