Final Project Report
The Artistic Scanner – Justin Lau – Eric Parren
CONCEPTION AND DESIGN:
This project has been a challenging but fun one to make. The original concept was a more portable version of a motion-detecting generative artwork: rather than being limited to a single area, it would let you “scan” your choice of the surrounding area and create a generative artwork from it, with a button controlling when the scanner is on or off. Building on my midterm project, my primary goals were to create a project whose function people could easily figure out, and to give it depth of interaction. From my previous research and intuition, I concluded that something resembling a store scanner would work best, as people would easily recognize it and figure out how to use it. I also moved toward multiple sensors, each controlling a certain aspect of the generative art being made, to give the project more depth of interaction; relying on the motion sensor alone would have been quite shallow and not given the project much depth. I did keep the portability aspect in mind, hence the store scanner design, and I scrapped the button altogether, as it would have limited the user’s interaction.
All of this culminated in the prototype you see below, which fulfilled pretty much every goal I hoped to achieve when tested by people. The primary feedback I received on the prototype, however, was to add a way to clear the screen in case a user wanted to make something brand new, as well as an option to save the art once a user was satisfied with what they had made. I therefore added sensors to my final design to address this feedback, and they worked as intended in the final test. You can see the final result below.
FABRICATION AND PRODUCTION:
I will split this section into two parts: one for the prototype phase and one for the final design phase.

For the prototype phase, my main concern was to figure out the wiring and the Arduino and Processing code for all the sensors I would be using. I first explored which sensors would work best for the purposes of my project and selected five of them: a color sensor, a digital light sensor, a temperature sensor, a UV sensor, and an accelerometer. I then looked up a reference website for each sensor, followed its wiring process, and tested the code it provided; each sensor worked individually. The main challenge was combining the wiring and the code for all of the sensors into one program, which took the majority of the prototype phase. After some discussion with TAs and my own research, I learned about I2C connections and followed the I2C connection video you will see below. After some tinkering, I managed to get the wiring for all the sensors working on one Arduino board. The Arduino code also took quite a bit of time to combine into one working program. This took more discussion with TAs and experimenting, but I managed to get all the code working with its respective sensors in a single program, combining the individual sensor code with the Arduino-to-Processing code from class. The final challenge of the prototyping phase was the Processing code, namely deciding which aspect of the generative art each sensor would control. This took still more discussion with TAs and experimenting, but I worked out which aspect each sensor controlled, how the values from the sensors would map to those aspects, and got all of those aspects working when the program runs (you can see which sensor corresponds to which aspect below).
With all of that done, I made a cardboard prototype of the scanner (see below) to hold all of the wiring and sensors together, serving as the basis for the final design, and presented it during User Testing.
For the final design, I mainly focused on constructing a proper, durable build to hold all the sensors and wires together. Before that, though, I addressed the feedback from User Testing: a way to restart the screen from scratch to make a new art piece, and a way to save the current art piece on the screen. My solution was two touch sensors acting as an erase button and a save button respectively. I repeated the process I had used for the other sensors (searching for a reference, tinkering with the wiring, and adding their code into my Arduino and Processing programs), and the touch sensors worked as intended. After that, I had to build the final scanner. I had originally planned to 3D print it, but unfortunately nearly all of the 3D printers were either constantly in use or broken, so I resorted to the laser cutter and Cuttle for the design. This took several tries, as I had to get the measurements just right for everything to fit naturally, but eventually I created a boxed scanner durable enough to hold all the wiring and sensors inside. I also wanted the design to be ergonomic, hence the placement of the touch sensors and the grip design and placement on the scanner. All of this culminated in a solid, durable, yet lightweight design that was comfortable to hold and easy to wield (which you can see below), with the sensors working as intended inside the scanner.
CONCLUSION:
Once more, my goals for this project were to create a scanner-type device that would pick up various aspects of the environment to create generative art, that would be easy for people to figure out and use, and that would have enough depth of interaction to get people to come back to it repeatedly. To that end, I’d say I mostly succeeded. People were able to easily figure out the functions of my project via the design of the scanner and the screen from Processing, and were very impressed with the result. I’d say the result matches my current definition of interaction: not only were people able to identify, process, and put into action their ideas in the context of my project, but there was a level of depth to it as well, given the sheer number of sensors within the scanner that can output many different results depending on what it is pointed at, plus the option to erase and save the work on the fly. That said, I do feel I could have improved my project had time allowed. First, I would refine the build of the scanner, as I had to use tape to hold it together. Second, there were inconsistencies with some of the sensors: the position of the spawning shapes didn’t quite match the position and movement of the accelerometer, and the color sensor in particular needed to be recalibrated after almost every use. Lastly, I could have expanded the options of what could be generated beyond simple shapes, or added a more complex generated background. Still, given the time I was provided, I am satisfied with the results. I have learned that projects like this take a lot of time and energy to complete, but that even if the result is not exactly what you were looking for, it can still satisfy you. I am proud of all the time and energy I put into this project, and overall had a good time with Interaction Lab as a whole.
APPENDIX:
Arduino Code:
#include <Adafruit_MLX90614.h>
Processing Code:
import processing.serial.*;

Serial serialPort;
PGraphics pg;

int NUM_OF_VALUES_FROM_ARDUINO = 11; /* CHANGE THIS ACCORDING TO YOUR PROJECT */
/* This array stores values from Arduino */
int arduino_values[] = new int[NUM_OF_VALUES_FROM_ARDUINO];

void setup() {
  printArray(Serial.list());
  // put the name of the serial port your Arduino is connected
  // to in the line below - this should be the same as you're
  // using in the "Port" menu in the Arduino IDE
  serialPort = new Serial(this, "COM3", 9600);
  size(1920, 1080);
  pg = createGraphics(1920, 1080);
}

void draw() {
  // receive the values from Arduino
  getSerialData();

  // use the values like this:
  float R = constrain(map(arduino_values[0], -800, -50, 0, 255), 0, 255); //-100
  float G = constrain(map(arduino_values[1], -100, -50, 0, 255), 0, 255); //-50
  float B = constrain(map(arduino_values[2], 250, 75, 0, 255), 0, 255); //300
  float T = map(arduino_values[3], 0, 100, 0, 100);
  float L = map(arduino_values[4], 50, 400, 0, 255);
  float UV = map(arduino_values[5], 500, 10000, 1, 20);
  UV = constrain(UV, 1, 20);
  float X = map(arduino_values[6], 210, 370, 0, width);
  float Y = map(arduino_values[7], 210, 370, height, 0);
  float Z = map(arduino_values[8], 200, 300, 0, 100);
  float E = map(arduino_values[9], 0, 1, 0, 1);
  float S = map(arduino_values[10], 0, 1, 0, 1);

  pg.beginDraw();
  if (T >= 60 && T < 70) {
    pg.ellipse(X, Y-100, X-100, Y+100);
  }
  if (T >= 70 && T < 80) {
    pg.square(X, Y, Z);
  }
  if (T >= 80 && T < 90) {
    pg.circle(X, Y, Z);
  }
  if (T > 90) {
    pg.triangle(X, Y-50, X-50, Y+50, X+50, Y+50);
  }
  pg.strokeWeight(UV);
  pg.fill(R, G, B, 50);
  pg.endDraw();

  //background(random(255));
  background(L);
  image(pg, 0, 0);

  // Saves each frame as line-000001.png, line-000002.png, etc.
  if (S == 1) {
    saveFrame("your_picture-##########.png");
  }
  if (E == 1) {
    background(0);
    pg.clear();
  }
}

void getSerialData() {
  while (serialPort.available() > 0) {
    String in = serialPort.readStringUntil( 10 );  // 10 = '\n'  Linefeed in ASCII
    if (in != null) {
      print("From Arduino: " + in);
      String[] serialInArray = split(trim(in), ",");
      if (serialInArray.length == NUM_OF_VALUES_FROM_ARDUINO) {
        for (int i=0; i<serialInArray.length; i++) {
          arduino_values[i] = int(serialInArray[i]);
        }
      }
    }
  }
}
References Used To Build Circuit and Codes:
– Color Sensor: Arduino Color Sensor TCS230 TCS3200 | Random Nerd Tutorials
– Temperature Sensor: Grove – Temperature and Humidity Sensor – Wiki (archive.org)
– Digital Light Sensor: Grove – Digital Light Sensor | Seeed Studio Wiki
– UV Sensor: Grove – UV Sensor | Seeed Studio Wiki
– Accelerometer: interaction-lab/Three-axis-analog-accelerometer-documentation at main · ima-nyush/interaction-lab · GitHub
– Touch Sensor: Touch sensor with Arduino | arduino touch switch | Techatronic
– Arduino to Processing Code: https://docs.google.com/presentation/d/1LxzVE4ZRV9B4nx5j8HGs9PZyNYV1WHOxntHupVBjEG0/edit?usp=share_link
– Some assistance from TAs
Original Sketch:
Build Process and Prototype:
Laser cut wood pieces for final build are entirely original, hand measured and made in Cuttle by me:
Final Build and Demonstration:
Temperature Sensor & UV Sensor (Controls Shapes being spawned and how thick their outlines are): https://drive.google.com/file/d/1fJ_XNdUMxZiO3WVyuZbKqcOPbjSyR9l1/view?usp=share_link
Digital Light Sensor (Control Background Color from Black To White): https://drive.google.com/file/d/1dp06mfiSi2f65CkLRSGTk4scrCfRFlqT/view?usp=share_link
Accelerometer (Controls where on the screen the shapes are being spawned): https://drive.google.com/file/d/1pMIwbp9aviHwUqkCze0J_cLzfknd0EHk/view?usp=share_link
Touch Sensors (One erases the screen to blank and another saves the current screen upon pressing): https://drive.google.com/file/d/1iO9MhVPIsJVmXqANe-M0tGtNpawZl_j_/view?usp=share_link (Erase)
&
https://drive.google.com/file/d/1O80TbYnKj-O4fyg7AlkAdlZa7wNKa-Uu/view?usp=share_link (Save)