Hey, at least I got somewhere?

Okay, so after a very sad email to Aaron and a few anxiety attacks, I got probably the funniest response back from him and got to work. He told me, specifically, to:

  1. Write a Script 
  2. Watch Stranger Things Season Three
  3. Break out into a crazy musical number 

So I did. 

Step one: write a script. Here are the highlights: 

The scene opens up with an actor, no gender preference, sitting at the base of a tree. They are reading a book.

They reach into their bag and pull out a tie. They put the tie on. They are now The Bird Detective. Their only mission is to investigate the birds in the palm trees. 

The actor poses, maybe a fun logo plays with music (like a superhero theme). They reach back into their backpack, ready to get started. We realize that this is not an academic backpack, but a bird backpack. 

They move the stethoscope around the tree; wherever it moves, different parts of the bird noises are amplified. A tweet, a chatter, a twiddle. Idk. Bird things. As they move the stethoscope up the trunk, towards the top, the sounds get louder.

They take a step back, take a deep and audible breath, and start to twirl the lasso.

They find a bird, reading a book. And that book is entitled, “How to make Students Less Stressed” 

Yes, there are silly elements, but writing this script really helped. I think it got my creativity flowing really nicely. I was able to work through what I wanted it to look like, for the most part. I feel like maybe I should have done that, like, a week ago. 

Step two was to watch Stranger Things, and I actually did that while I was writing the script. Hell yeah, multitasking. I didn’t draw any direct inspiration from it, but hey, I’ll keep watching and maybe it will help. 

Step three… the musical number. I do not think this has a place in my performance, but I did aggressively play guitar when I got frustrated. 

I think the ideas are shaping up pretty well. I’m concerned about the look, and honestly about how to do a lot of the things that I want to do with it. I think maybe using a camera to see the person and then having, like… idk, maybe different areas of the screen act as different triggers. Like, okay, when the person hits this circle, then the bird noises start, or… when the person lassos the tree (how?? idk??? projection???), then the image warps along a set curve so it looks like the tree is bending? Is that allowed? Is that possible? 

I still feel a bit unsure about this, but I’m getting there. I did a very simple thing in openFrameworks, where I just split the screen into four quarters and had each quarter trigger something different, whether it was a background change, a sound trigger, etc. Later on, I added a little circle that would make the triggers happen, and all of this was driven by the mouse. So maybe I can translate that to follow the person around? But do I need just a regular camera for that, or do I need a depth camera? 
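For reference, here’s roughly what that quarters-and-circle prototype looks like in openFrameworks, written out as a minimal sketch rather than my actual code; the sound file name and background colors are just placeholders:

```cpp
#pragma once
#include "ofMain.h"

// Minimal sketch: a circle follows the mouse, the window is split into four
// quadrants, and entering a new quadrant fires that quadrant's trigger once.
class ofApp : public ofBaseApp {
public:
    void setup() {
        birdSound.load("tweet.mp3");   // placeholder sound file
        bg = ofColor::black;
    }

    void update() {
        // for now the "performer" is just the mouse position
        x = ofGetMouseX();
        y = ofGetMouseY();
        int quadrant = (y < ofGetHeight() / 2 ? 0 : 2) + (x < ofGetWidth() / 2 ? 0 : 1);

        if (quadrant != lastQuadrant) {            // only trigger when entering a quadrant
            switch (quadrant) {
                case 0: birdSound.play();           break;  // top-left: sound trigger
                case 1: bg = ofColor::darkGreen;    break;  // others: background changes
                case 2: bg = ofColor::midnightBlue; break;
                case 3: bg = ofColor::black;        break;
            }
            lastQuadrant = quadrant;
        }
    }

    void draw() {
        ofBackground(bg);
        ofDrawCircle(x, y, 20);   // the little circle that makes the triggers happen
    }

private:
    float x = 0, y = 0;
    int lastQuadrant = -1;
    ofColor bg;
    ofSoundPlayer birdSound;
};
```

Replacing the mouse position with a tracked position from a camera (a blob centroid, a bright point, or a depth-camera joint) should be the only change needed for it to follow a person instead.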

roughPrototype.jpg

After yet another busy week of working, I have returned again with a violently mediocre project. Ok maybe it is not that bad. I have figured out most of the logic and functionality I want from my code. There are just some things that I haven’t quite figured out how to do.

I have spent the last couple of days playing with the video player example and incorporating its code into mine, then making different buttons trigger different things, such as the honk sound and the play or pause button images, as well as causing the video to play or pause. 
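The play/pause and honk logic boils down to something like this; this is a minimal sketch rather than my exact code, with placeholder file names and button positions (ofApp.h would declare ofVideoPlayer video; ofSoundPlayer honk; ofImage playImg, pauseImg;):

```cpp
#include "ofApp.h"

void ofApp::setup() {
    video.load("birds.mp4");     // placeholder video
    video.play();
    honk.load("honk.wav");       // placeholder honk sound
    playImg.load("play.png");
    pauseImg.load("pause.png");
}

void ofApp::update() {
    video.update();
}

void ofApp::draw() {
    video.draw(0, 0, ofGetWidth(), ofGetHeight());
    // show the button image that matches the current state
    ofImage& btn = video.isPaused() ? playImg : pauseImg;
    btn.draw(20, 20, 64, 64);
}

void ofApp::mousePressed(int x, int y, int button) {
    if (ofRectangle(20, 20, 64, 64).inside(x, y)) {
        video.setPaused(!video.isPaused());   // play/pause button
    } else if (ofRectangle(100, 20, 64, 64).inside(x, y)) {
        honk.play();                          // honk button
    }
}
```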

As far as my logic goes, these are the things I want my buttons to be able to do. 

Ok… but I was planning on doing buttons with serial communication! I tried, but kept running into problems with my Teensy and was unable to get some simple functions to work. After a few hours of messing about, I moved on to making my openFrameworks code work. 

Prototype

Mari and I have been playing around with using infrared lights and the PS3 Eye camera to track the brightest point. We started off by using just a breadboard with infrared LED bulbs plugged into it. This seemed to work pretty well from a close distance, and when we later tried it further away (around 15 feet), it still seemed to track the bulbs consistently. The PS3 Eye camera is programmed to find the brightest point in the video and draw an ellipse at that point.
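The brightest-point part is fairly simple in openFrameworks; here’s a rough sketch of the idea, treating the PS3 Eye as an ordinary webcam through ofVideoGrabber (ofApp.h would declare ofVideoGrabber grabber; ofVec2f brightest;), not our exact code:

```cpp
void ofApp::setup() {
    grabber.setup(640, 480);            // the PS3 Eye shows up as a normal camera here
}

void ofApp::update() {
    grabber.update();
    if (!grabber.isFrameNew()) return;

    // scan the frame for its brightest pixel
    ofPixels& pix = grabber.getPixels();
    float maxBrightness = 0;
    for (size_t y = 0; y < pix.getHeight(); y++) {
        for (size_t x = 0; x < pix.getWidth(); x++) {
            float b = pix.getColor(x, y).getBrightness();
            if (b > maxBrightness) {
                maxBrightness = b;
                brightest.set(x, y);
            }
        }
    }
}

void ofApp::draw() {
    grabber.draw(0, 0);
    ofDrawEllipse(brightest.x, brightest.y, 30, 30);   // mark the IR LEDs
}
```

This only stays reliable as long as the IR LEDs really are the brightest thing in frame, which is part of why the distance tests mattered.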

Next, we found a styrofoam ball in the fabrication area of the IM lab and soldered infrared LEDs, connected by wire, along its circumference. We hung this up on an extension cord in the IM lab, similar to how we intend to hang the ball during the performance. The tracking seemed to be consistent, and we only had LEDs on about a fifth of the ball, so placing more around it will definitely improve the camera’s ability to track it. We hope that using a clear acrylic sphere and placing the LEDs inside of it will give us even more consistent tracking. We are also hoping to use colored LEDs or NeoPixels alongside the IR LEDs to create a ball that is aesthetically pleasing as well as consistently trackable in infrared. 

Finally, we used the capacitive touch example to send information from the Teensy to openFrameworks. We ran a wire from the Teensy to the styrofoam ball and added some copper tape to make a portion of the sphere conductive for the capacitive touch. Using the serial inputs example, we were able to detect whether someone was touching that part of the ball and create animations around the position of the sphere. In the video above, we simply made balls come out of the tracked position of the sphere when it was touched. We plan to make the ball touch-sensitive so that whenever our performer, Erica, touches it, the sketch changes in response. We are still not sure how we will make the entire acrylic sphere touch-sensitive while keeping the wiring discreet. 
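Here’s a rough sketch of how the two halves could talk to each other, assuming the Teensy just writes a byte over USB serial whenever touchRead() crosses a threshold. The touch pin, threshold value, and serial port name are placeholders, and the openFrameworks side would declare ofSerial serial; vector<ofVec2f> balls; alongside the tracked position from the code above:

```cpp
// --- Teensy sketch (capacitive touch) ---
void setup() {
  Serial.begin(9600);
}

void loop() {
  // copper tape wired to touch pin 0; 2000 is a placeholder threshold
  if (touchRead(0) > 2000) Serial.write('1');
  delay(20);
}

// --- openFrameworks side (ofApp.cpp) ---
void ofApp::setup() {
    serial.setup("/dev/tty.usbmodem14101", 9600);   // port name is machine-specific
}

void ofApp::update() {
    while (serial.available() > 0) {
        if (serial.readByte() == '1') {
            balls.push_back(brightest);   // spawn a ball at the tracked sphere position
        }
    }
}

void ofApp::draw() {
    for (auto& b : balls) ofDrawCircle(b.x, b.y, 10);
}
```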

We are happy with the progress that we made over the weekend, and for this next week we plan to create most of the visuals for the performance, as well as hopefully build more reliable hardware. 

Ratatouille but if Remy was actually Gordon Ramsay

For our project this week, Lateefa and I decided to play a little bit with the idea of making our Teensy come alive. We figured that a good way to make the tilt, heading, and roll of the Prop Shield feel alive, in a way that is understandably connected to the body, was to hide it in a piece of clothing. And what is a more iconic piece of clothing than the chef’s hat from the Disney Pixar animated film, Ratatouille? 

In the movie, Remy the rat controls a human chef by sitting under his hat and moving his hands by pulling his hair. In a similar fashion, our project makes it seem like Gordon may be pulling the user in the direction he chooses. The only difference is that Gordon shames the person instead of cooking delicious food for them. 
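For context, reading heading, pitch, and roll off the Prop Shield looks roughly like the Orientation example that ships with Teensyduino’s NXPMotionSense library; this is a trimmed-down sketch of that idea, not our exact hat code:

```cpp
#include <NXPMotionSense.h>
#include <Wire.h>
#include <EEPROM.h>

NXPMotionSense imu;
NXPSensorFusion filter;

void setup() {
  Serial.begin(9600);
  imu.begin();
  filter.begin(100);   // run the sensor fusion at 100 Hz
}

void loop() {
  float ax, ay, az, gx, gy, gz, mx, my, mz;
  if (imu.available()) {
    imu.readMotionSensor(ax, ay, az, gx, gy, gz, mx, my, mz);
    filter.update(gx, gy, gz, ax, ay, az, mx, my, mz);

    // these three values are what the hat maps onto "Gordon pulling you around"
    float heading = filter.getYaw();
    float pitch   = filter.getPitch();
    float roll    = filter.getRoll();
    Serial.printf("%.1f %.1f %.1f\n", heading, pitch, roll);
  }
}
```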

Here is the code

Chidori! (Steven & Mari)

For this week’s homework, Steven and I decided to make use of the Prop Shield’s motion sensors to bring to life a childhood nostalgia/dream we both had: properly doing the hand gestures from the Naruto anime, complete with the iconic sound effects that accompany them. Matching different sounds to different hand gestures would also be an interesting challenge, since we would have to properly calibrate and distinguish between the different hand movements. 

For those who are not familiar with the hand gestures used to summon certain jutsus (ninja techniques, according to the show), here’s a video showing the one we were trying to recreate: 

Steven and I decided to simplify the whole action and instead chose two hand gestures leading up to the electricity sound. We used this image as a guide as well:

Chidori hand gestures (reference image)

The aspect that took the longest was making sure that the sounds wouldn’t continuously play on top of each other. We solved this by creating booleans that would not allow the same sound to trigger again and again once it had already been triggered. Another step we had to take was to determine the heading, roll, and pitch ranges that each specific hand gesture usually stayed within, so we could trigger the proper sound each time. 
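The latch idea looks roughly like this; it assumes heading/roll/pitch are already coming out of the sensor fusion code (as in the hat sketch above) and gets called from loop(). The angle ranges and the playSound() stub are placeholders, not our real calibrated values or audio code:

```cpp
#include <Arduino.h>

bool firstSignPlayed = false;
bool chidoriPlayed   = false;

// placeholder for however the sound actually gets played (Audio library, etc.)
void playSound(const char* name) {
  Serial.print("play: ");
  Serial.println(name);
}

void checkGestures(float heading, float roll, float pitch) {
  // first hand sign: example range only, not our calibrated values
  if (!firstSignPlayed && abs(roll) < 15.0 && pitch > 40.0) {
    playSound("first sign");
    firstSignPlayed = true;     // the boolean stops it re-triggering every loop
  }
  // second sign (and the electricity sound) only counts after the first one
  if (firstSignPlayed && !chidoriPlayed && roll > 60.0) {
    playSound("chidori");
    chidoriPlayed = true;
  }
}
```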

In the end, this was the final outcome, which we’re quite happy about 🙂 

The code can also be found here

 

Backstreet (Sad) Bois

You are my firrreee my one desiiiiirreee (I’m singing about a good grade in this class). 

Sara and I used the pitch and roll to control the speed and volume of the song “I Want It That Way” by the Backstreet Boys. We originally used heading too, but decided the movements felt more natural, and more in line with what they affected, if we used just pitch and roll. 
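The mapping itself is tiny; here’s a minimal openFrameworks-style sketch of the idea, assuming the pitch and roll values are already arriving from the Teensy over serial (ofApp.h would declare ofSoundPlayer song; float pitch, roll;), with a placeholder file name:

```cpp
void ofApp::setup() {
    song.load("iWantItThatWay.mp3");   // placeholder file name
    song.setLoop(true);
    song.play();
}

void ofApp::update() {
    // pitch and roll arrive in degrees; clamp-map them onto audio ranges
    float speed  = ofMap(pitch, -90, 90, 0.5, 2.0, true);   // half to double speed
    float volume = ofMap(roll,  -90, 90, 0.0, 1.0, true);
    song.setSpeed(speed);
    song.setVolume(volume);
}
```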

Here’s the code!! 

And here’s a video of it working.