Calli-Glove

Project Title: Calli-Glove

Partner: Ying Chen

Date of publishing: 12.13.2018

Brief Introduction:

Calli-Glove is a project that lets you write calligraphy on the screen with your hand! It aims to bring greater recognition to Chinese culture and Chinese calligraphy, but its purpose doesn't end there: you can also use it to learn Chinese, or simply draw anything you like on the screen.

Inspiration:

This project is inspired by Xingqi – Flow of Qi, an interactive project from 2007. In Flow of Qi, people control the movement speed and shape of an ink trail with their breath and heart rate. It is a project that helps people focus and meditate, and it perfectly reflects my definition of interaction: communication between people and machines, between people and people, and between people and the environment.

CONCEPTION AND DESIGN:

We wanted to make a gesture-controlled ink canvas, so the first thing that came to mind was to make it wearable. A glove is the simplest wearable device and the easiest for users to understand, so we decided to build on a glove. Then we thought about how to recognize the user's hand gestures: I decided to place two flex sensors at the knuckle positions, and Ying made an instruction page in Processing, a big yellow page that users can easily read and notice. As for the tracking, we first tried color recognition on the glove itself, since the glove is red; however, the precision dropped greatly and users could not tell directly which part the webcam was tracking. So we decided to add an LED in the center of the palm.

 

FABRICATION AND PRODUCTION:

Functions:

Calli-Glove is an interactive calligraphy platform aimed both at people who want to learn and practice calligraphy and at people with some calligraphy experience who want to create their own pieces. We used Processing for the software and Arduino for the hardware. On the software side, we created a canvas with two instruction pages. The canvas has boxes for calligraphy; it senses the position of the glove (the hardware) by tracking pixel colors in the Processing video feed, and converts that position into brush lines with ink dots around them.

Processing:

For the brush, I use the line() function and create a "speed" variable by calculating the distance between the position in the current frame (x1, y1) and the position in the previous frame (x0, y0). The stroke weight and stroke transparency are controlled by this speed variable.

For the ink spots, I use the random() function to generate ellipses of different sizes and scatter them around the brush position.
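Putting these two ideas together, here is a minimal, self-contained Processing sketch of the brush logic, with the mouse standing in for the glove (the speed floor of 10 and cap of 20 mirror our full code below):

// A minimal sketch of the brush logic, with the mouse standing in for the glove.
// Stroke weight and ink spatter both scale with movement speed.
float px = -1, py = -1; // previous brush position (-1 means the pen is lifted)

void setup() {
 size(640, 480);
 background(240, 235, 220);
}

void draw() {
 if (mousePressed) {
 if (px == -1) { // first frame of a stroke: no previous point yet
 px = mouseX;
 py = mouseY;
 }
 float spd = dist(px, py, mouseX, mouseY); // "speed" = distance moved per frame
 stroke(0, 200);
 strokeWeight(constrain(spd, 10, 20)); // faster strokes are thicker, within limits
 line(px, py, mouseX, mouseY);
 // ink spatter: a few random dots scattered near the brush tip
 noStroke();
 fill(0, 100);
 float distance = spd * random(1);
 for (int i = 0; i < 4; i++) {
 float s = random(0, distance / 2);
 ellipse(mouseX + random(-distance, distance), mouseY + random(-distance, distance), s, s);
 }
 px = mouseX;
 py = mouseY;
 } else {
 px = -1; // lift the pen
 py = -1;
 }
}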

Apart from receiving the (x, y) position of the glove, Processing also receives the values of two FSR 400 sensors located on the index finger and the pinky finger. Three functions are controlled by them (the numeric thresholds are noted after this list):

Press the index finger only: draw a line.
Press both the index finger and the pinky finger: erase the canvas.
Press the pinky finger only: save the frame to the project folder.
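(In the code below, sensorValues[0] holds the pinky reading and sensorValues[1] the index reading; roughly, a pinky value of 400 or more and an index value of 150 or more count as "pressed".)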

So here's our code for Processing:
//
// states
import processing.video.*;
import gab.opencv.*;
import processing.serial.*;
Capture video;
OpenCV opencv;
// Variable for capture device
float ox, oy, nx, ny, spd = 0, spd0 = 0, spd00 = 0, angle = 0, oldAngle = 0, angleDiff;
String myString = null;
Serial myPort;
int NUM_OF_VALUES = 2; /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues; /** this array stores values from Arduino **/
PImage pic1, pic2, pic3;
// A variable for the color we are searching for
// (a packed ARGB int; -2561 is a near-white sampled from the glove's LED).
color trackColor = -2561;

// screen states (the football names are leftovers from the menu example this was adapted from:
// stateSeeChartFootball is the drawing canvas, stateSeeFootballStatistics is the instructions page)
final int stateMenu = 0;
final int stateSeeChartFootball = 1;
final int stateSeeFootballStatistics = 2;
int state = stateMenu;
//
// font
PFont font;
//

// ----------------------------------------------------------------------
// main functions

void setup()
{
 // runs only once
 size(displayWidth, displayHeight); // size() must come before any drawing happens
 //fullScreen();
 pic1 = loadImage("1.jpg");
 pic2 = loadImage("2.jpg");
 pic3 = loadImage("3.jpeg");
 frameRate(15);
 //background(240,235,220);
 setupSerial();
 initSketch(); // draws the canvas background (pic1), so it must run after size()
 String[] cameras = Capture.list();
 printArray(cameras);
 video = new Capture(this, displayWidth, displayHeight);
 video.start();
 smooth();
 font = loadFont("STBaoliSC-Regular-48.vlw");
 textFont(font);
} // func
//
void captureEvent(Capture video) {
 // Read image from the camera
 video.read();
}
void draw()
{
 // the main routine; it handles the states.
 // runs again and again
 switch (state) {
 case stateMenu:
 showMenu();
 break;
 case stateSeeChartFootball:
 handleStateSeeChartFootball();
 break;
 case stateSeeFootballStatistics:
 handleStateSeeFootballStatistics();
 break;
 default:
 println ("Unknown state (in draw) "
 + state
 + " ++++++++++++++++++++++");
 exit();
 break;
 } // switch
 //image(video, 0, 0);
 video.loadPixels();
 //image(video, 0, 0);
 updateSerial();
 printArray(sensorValues);
 float worldRecord = 500; // current best (smallest) color distance; start high
 int closestX = 0;
 int closestY = 0;

// Begin loop to walk through every pixel
 for (int x = 0; x < video.width; x++ ) {
 for (int y = 0; y < video.height; y++ ) {
 int loc = x + y * video.width;
 // What is current color
 color currentColor = video.pixels[loc];
 float r1 = red(currentColor);
 float g1 = green(currentColor);
 float b1 = blue(currentColor);
 float r2 = red(trackColor);
 float g2 = green(trackColor);
 float b2 = blue(trackColor);

// Using euclidean distance to compare colors
 float d = dist(r1, g1, b1, r2, g2, b2); // We are using the dist( ) function to compare the current color with the color we are tracking.

// If current color is more similar to tracked color than
 // closest color, save current location and current difference
 if (d < worldRecord) {
 worldRecord = d;
 closestX = x;
 closestY = y;
 }
 }
 }
 // both sensors pressed (a fist): erase by redrawing the canvas
 if(sensorValues[0] >= 400 && sensorValues[1] >= 300){
 initSketch();
 }
 // pinky only: save a screenshot (left disabled in this build)
 if(sensorValues[0] >= 400 && sensorValues[1] <= 200){
 //saveFrame();
 }
 
 // We only consider the color found if its color distance is less than 10. 
 // This threshold of 10 is arbitrary and you can adjust this number depending on how accurate you require the tracking to be.
 if (worldRecord < 10) { 
 // Draw a circle at the tracked pixel



//fill(trackColor);
 //strokeWeight(4.0);
 //stroke(0);
 //ellipse(640 - closestX, closestY, 16, 16);
 float adj = 0;
 float num, size, distance;
 if (ox == -1) {
 ox = video.width - closestX;
 oy = closestY;
 }
 nx = video.width - closestX;
 ny = closestY;
 spd = dist(ox, oy, nx, ny) ;
 if (spd < spd0) { 
 adj += 1;
 }
 if (spd < spd00 && spd < spd0) { 
 adj += 2;
 }
 if (spd > spd0) { 
 adj -= 6;
 }
 if (spd > spd00 && spd > spd0) { 
 adj -= 13;
 }
 spd = ((spd + spd0 + spd00)/3) + adj;
 spd = max(spd, 10);
 //println(spd);
 stroke(0, 200);
 strokeWeight(min(spd, 20));

// index pressed, pinky relaxed: draw the stroke
if(sensorValues[1]>=150 && sensorValues[0] <= 400){
 line(ox, oy, nx, ny);
 }



// spatter:
 angle = atan2(ny - oy, nx - ox);
 angleDiff = abs(angle - oldAngle); 
 //push();
 noStroke();
 fill(0, 100);
 num = int(angleDiff > 1); // (computed but not used below)
 distance = spd * random(1);
 for (int i = 0; i < 4; i++) {
 size = random(0, distance / 2);
 ellipse(wobble(nx, distance), wobble(ny, distance), size, size);
 }
 oldAngle = angle;
 ox = nx;
 oy = ny;
 spd00 = spd0;
 spd0 = spd;
 }
 //
} // func
// ----------------------------------------------------------------
void mousePressed() {
 // Save color where the mouse is clicked in trackColor variable
 int loc = mouseX + mouseY*video.width;
 trackColor = video.pixels[loc];
 println(trackColor);
}
void initSketch() {
 //background(240,235,220);
 ox = -1;
 oy = -1;
 nx = -1;
 ny = -1;
 image(pic1, 0, -20, displayWidth+20, displayHeight-30);
}
void setupSerial() {
 printArray(Serial.list());
 myPort = new Serial(this, Serial.list()[ 6 ], 9600);
 // WARNING!
 // The port index (6 here) depends on your machine.
 // Check the printed list of ports, find
 // "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----",
 // and replace the index above with that port's index number.

myPort.clear();
 // Throw out the first reading,
 // in case we started reading in the middle of a string from the sender.
 myString = myPort.readStringUntil( 10 ); // 10 = '\n' Linefeed in ASCII
 myString = null;

sensorValues = new int[NUM_OF_VALUES];
}
// keyboard functions

void keyPressed() {
 // keyboard. Also different depending on the state.
 switch (state) {
 case stateMenu:
 keyPressedForStateMenu();
 break;
 case stateSeeChartFootball:
 keyPressedForStateSeeChartFootball();
 break;
 case stateSeeFootballStatistics:
 keyPressedForStateSeeFootballStatistics();
 break;
 default:
 println ("Unknown state (in keypressed) "
 + state
 + " ++++++++++++++++++++++");
 exit();
 break;
 } // switch
 if (keyCode == 69) { // keyCode 69 = the 'e' key: clear the canvas
 initSketch();
 }
}

float wobble(float num, float mag) {
 return random(0 - mag, mag) + num;
 //
} // func
void keyPressedForStateMenu() {
 //
 switch(key) {
 case '1':
 state = stateSeeChartFootball;
 break;
 case '2':
 state = stateSeeFootballStatistics;
 break;
 case 'x':
 case 'X':
 // quit
 exit();
 break;
 default:
 // do nothing
 break;
 }// switch
 //
} // func
void keyPressedForStateSeeChartFootball() {
 // any key is possible
 switch(key) {
 default:
 state = stateMenu;
 break;
 } // switch
 //
} // func
void keyPressedForStateSeeFootballStatistics() {
 // any key is possible
 switch(key) {
 default:
 state = stateMenu;
 break;
 } // switch
 //
} // func

// ----------------------------------------------------------------
// functions to show the menu and functions that are called from the menu.
// They depend on the states and are called by draw().

void showMenu() {
 image(pic2, 0, 0, displayWidth, displayHeight);
 fill(0);
 textFont(font);
 textSize(100);
 text(" CaliGlove ", 400, 100, 3);
 textSize(60);
 text("Press 1 Start Drawing ", 100, 200);
 text("Press 2 See Instructions ", 100, 260);
 //
 text("Press x to quit ", 100, 320);
 //
} // func

void handleStateSeeChartFootball() {
 // drawing state: nothing to draw here; the tracking and brush code
 // after the switch in draw() paints directly onto the persistent canvas
} // func
//

void handleStateSeeFootballStatistics() {
 image(pic2, 0, 0, displayWidth, displayHeight);
 fill(0);
 textSize(64);
 text(" INSTRUCTIONS ", 230, 100, 3);
 textSize(30);
 text("Place your hand near the center of the computer camera, at least 5 inches away. ", 100, 200);
 text("In order to draw, press against the end of your thumb and index finger and unhold to discountinue. ", 100, 230);
 text("Move your hand but within the visibility of the camera. ", 100, 260);
 text("To erase, just hold your hand into fist. ", 100, 290);
 text("Bend your pinky finger to screenshot your calligraphy ", 100, 320);
 image(pic3,500,350,522,231);
 //
} // func
// ----------------------------------------------------------------
//
void updateSerial() {
 while (myPort.available() > 0) {
 myString = myPort.readStringUntil( 10 ); // 10 = '\n' Linefeed in ASCII
 if (myString != null) {
 String[] serialInArray = split(trim(myString), ",");
 if (serialInArray.length == NUM_OF_VALUES) {
 for (int i=0; i<serialInArray.length; i++) {
 sensorValues[i] = int(serialInArray[i]);
 }
 }
 }
 }
}
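Note that updateSerial() expects every serial line to be two comma-separated integers ending in a linefeed, e.g. "512,130", with the pinky sensor (A0) first and the index sensor (A1) second. This is exactly the format the Arduino sketch below prints.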

The Glove:

As for the glove, I designed it according to the following blueprint. As you can see in the sketch, two flex sensors are attached inside the glove under the joints of the user's index finger and pinky finger, and an LED sits in the center of the palm. The Arduino and breadboard are contained in a little box that can be worn on the user's wrist.

The user activates the flex sensors by bending their fingers, and controls the ink lines while facing the screen.

Here’s our code for Arduino:
#define LED 12 // tracking LED in the center of the palm
void setup() {
 Serial.begin(9600);
 pinMode(LED,OUTPUT);
}

void loop() {
 int sensor1 = analogRead(A0);
 int sensor2 = analogRead(A1);
 //int sensor3 = analogRead(A2);

// keep this format
 Serial.print(sensor1);
 Serial.print(","); // put comma between sensor values
 Serial.print(sensor2);
 //Serial.print(",");
 //Serial.print(sensor3);
 Serial.println(); // add linefeed after sending the last sensor value

// too fast communication might cause some latency in Processing
 // this delay resolves the issue.
 delay(100);
 // keep the tracking LED lit whenever the glove is powered,
 // so the webcam always has a bright point to follow
 digitalWrite(LED, HIGH);
}

Development & User Testing:

At user testing we only had an interactive demo built around the glove; it was a prototype without a purpose. While users were impressed by our interaction idea, they also gave us suggestions for improvement and, most importantly, a purpose. We would like to thank them for all the useful advice. Here are some of the adjustments we made based on their responses:

Calligraphy:

We decided to make a calligraphy practice platform after the user test, because a user suggested the adaptation and we both thought a calligraphy practice platform would be more meaningful than a simple drawing application.

Tracking Method:

We had to find a way to track the hand, and these are the approaches we considered: frame-by-frame movement, color, brightness, and Kinect. At first we experimented with the Kinect, but we soon dropped it because it was not appropriate for this project. Then we tested movement and brightness detection using OpenCV, but neither was nearly as precise as we expected, so we settled on color tracking with the help of the Processing video library.
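For reference, this is roughly what the brightness test looked like as a standalone Processing sketch (a reconstruction for illustration, not our original test code): it scans every pixel of the webcam frame and keeps the brightest one. Any lamp or window in the frame can beat the LED, which helps explain why it was not precise enough for us.

import processing.video.*;

Capture video;

void setup() {
 size(640, 480);
 video = new Capture(this, width, height);
 video.start();
}

void captureEvent(Capture v) {
 v.read();
}

void draw() {
 image(video, 0, 0);
 video.loadPixels();
 int brightX = 0, brightY = 0;
 float record = 0;
 // walk every pixel and remember the brightest one
 for (int x = 0; x < video.width; x++) {
 for (int y = 0; y < video.height; y++) {
 float b = brightness(video.pixels[x + y * video.width]);
 if (b > record) {
 record = b;
 brightX = x;
 brightY = y;
 }
 }
 }
 noFill();
 stroke(255, 0, 0);
 strokeWeight(3);
 ellipse(brightX, brightY, 16, 16); // mark the brightest spot
}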

Precision:

We have had a major problem with precision, even now. To address it we tested different color properties and different LEDs. For now we have settled on blue LEDs, covered with fabric so the light doesn't spread outward. We understand that perfection is very hard to achieve, but the current result is acceptable.

Physical Prototype:

To put our project to use, we had to think about what it would be physically. We considered different prototypes, such as a pen and a single-finger glove, but we finally settled on a full five-finger glove so that the pressure sensors could support more functions.

CONCLUSION:

Many of our friends also show great interest in calligraphy, but it is very hard to practice in a dorm: ink, paper, brushes, all of them are hard to get, even for Chinese students. So we decided to remove that obstacle by building a digital canvas where users can do calligraphy anywhere with their movement. It should be portable, flexible, visually attractive, and precise.

Our final product aligns with our definition of interaction because it shows how people communicate with "the canvas", a digital environment: the user provides hand positions and gestures, and the canvas reacts to each of them differently.

If we had more time, we would make the color recognition more precise. We would also like to make the glove "one size fits all", since it currently fits only a certain range of hand sizes. With an Arduino Nano we could simplify the design and make it lighter and easier to wear. If possible, I would move the LED to the tip of the user's index finger to better simulate the writing process.

In this project, we learned how to use a new library that was not taught in class (OpenCV) and explored more functions of libraries we already knew (Video and Serial).

We learned about and tested different tracking methods, such as color and movement.

We tried and learned how to use different input devices, such as the Kinect and pressure sensors.

We learned how to connect physical movement with on-screen imagery through serial communication with the Arduino.

We learned how to design a user-friendly project that follows the KISS principle (Keep It Simple, Stupid), and how to anticipate the user's perspective to make a successful project.

Most importantly, we learned how to cooperate effectively as a team. We finished a really big project through a good division of labor.
