Week #3 Assignment: ml5.js Project — Lishan Qin

My Code: https://drive.google.com/open?id=1Vci8NRnUh9j7PCPs_xzOobpghGQjaXq-

Project Name: “How to Train Your Dragon with Machine Learning”

Intro

For this week’s assignment, I use the ml5.js PoseNet model (starting from the PosenetBasics example) to develop an entertaining interactive project called “How to Train Your Dragon with Machine Learning”. My idea is to create an interactive AR app that lets users interact with a virtual creature on screen using their whole body in the physical world. Building on the PosenetBasics example, I make users appear to wear a virtual Viking helmet on screen and let them hang out with the virtual creature “Toothless” by making different gestures and movements in the physical world.

Overview & Demo

The PoseNet model can recognize a person’s nose, eyes, and wrists, and reports the position of each of these keypoints. Using that data, I program “Toothless” to react differently to the user’s actions by changing its image and position. The project lets users become a dragon trainer and interact with “Toothless” through different gestures. First, the program recognizes the user’s face and puts a Viking helmet on their head. Then the user can make different movements to interact with “Toothless”, such as petting its head, poking its face, or raising their right wrist to ask “Toothless” to fly. A minimal sketch of this setup is shown below.
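For reference, here is a minimal sketch of how the PoseNet keypoints can be read in p5.js, based on the structure of the PosenetBasics example. The confidence threshold and the toothlessState variable are placeholders I made up for illustration, not the exact code from my project.

```javascript
// Minimal p5.js + ml5.js sketch: read PoseNet keypoints from the webcam
// and flip a state flag when the right wrist is raised above the nose.
// (Assumes p5.js and ml5.js are loaded via <script> tags, as in the examples.)
let video;
let poseNet;
let pose = null;
let toothlessState = 'idle'; // placeholder state variable

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  // Load the PoseNet model on the webcam feed
  poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  // The 'pose' event fires with an array of detected poses
  poseNet.on('pose', (results) => {
    if (results.length > 0) {
      pose = results[0].pose; // track the first person found
    }
  });
}

function draw() {
  image(video, 0, 0, width, height);
  if (pose) {
    // Each named keypoint has x, y, and a confidence score
    ellipse(pose.nose.x, pose.nose.y, 10); // where the helmet would go
    // Gesture: right wrist raised above the nose -> ask Toothless to fly
    if (pose.rightWrist.confidence > 0.5 && pose.rightWrist.y < pose.nose.y) {
      toothlessState = 'flying';
    } else {
      toothlessState = 'idle';
    }
  }
}
```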

Inspiration

The work that inspired me to build this project is Pokémon Go, developed by Niantic. Pokémon Go uses GPS to let players locate, capture, battle, and train virtual creatures called Pokémon, which appear in the game as if they were in the player’s real world. A recent update to Pokémon Go added a feature that lets players take photos with virtual Pokémon in the real world. Nonetheless, even though I love this game and this update very much, I still find the interaction between trainers and Pokémon that the game provides to be limited. Players of Pokémon Go can only interact with these virtual creatures by tapping on their phone screens to switch a Pokémon’s posture, rather than by using their physical movement in the real world as input. Therefore, I wish to create a deeper and more direct way for game players to interact with virtual creatures in the real world.

Technical Issues & Future Improvement

My original plan was to make Toothless react differently to the user’s right hand and left hand. However, I found the model’s data for the right and left wrists to be highly unstable. It often mistook the right wrist for the left, and when only one wrist was on screen, it could not tell whether it was a left or a right wrist. Therefore, in my final work “Toothless” only reacts to the movement of the user’s right hand. Also, the size of the Viking helmet that appears on the user’s head does not automatically match the size of the user’s head. I believe there is an algorithm that could make this work, but I couldn’t figure it out; a possible approach is sketched below. In addition, due to the limited time I had to finish this project, there are many other outputs I wanted to try that I haven’t finished. For example, given more time, I’d like to add more sound effects for Toothless to create more diverse output.
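One possible fix I haven’t implemented, sketched here only as an idea: use the pixel distance between the two eye keypoints as a proxy for head size and scale the helmet from it. The 3.0 multiplier and the helmetImg variable below are assumptions that would need tuning; skipping keypoints with low confidence scores, as done here, might also help with the unstable wrist data.

```javascript
// Rough sketch (not from my project) of auto-scaling the helmet using
// the distance between the eye keypoints reported by PoseNet.
function drawHelmet(pose, helmetImg) {
  // Only draw when both eyes are detected with reasonable confidence
  if (pose.leftEye.confidence > 0.5 && pose.rightEye.confidence > 0.5) {
    // Distance between the eyes grows as the face moves closer to the camera
    const eyeDist = dist(pose.leftEye.x, pose.leftEye.y,
                         pose.rightEye.x, pose.rightEye.y);
    const helmetW = eyeDist * 3.0; // assumed head-width multiplier
    const helmetH = helmetW * (helmetImg.height / helmetImg.width);
    const cx = (pose.leftEye.x + pose.rightEye.x) / 2; // midpoint between eyes
    image(helmetImg, cx - helmetW / 2, pose.nose.y - helmetH, helmetW, helmetH);
  }
}
```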

Source: https://giphy.com/stickers/howtotrainyourdragon-httyd-toothless-httyd3-1yTgtHaRuuaFWUWk7Q
