
V&S Week 7: Technomancy

Class Critique:

Video Comments:
Above is the video we presented during our final class.  As noted in the comments from the class, there is an issue with the resolution when the virus (Bloody Mary) takes over.  These shots were recorded by me using OBS, so there may have been an issue during rendering for YouTube that was not apparent when viewed in Premiere.  Eden and I will figure this out and post the fixed video below.  One classmate commented that this resolution change actually added an element of spookiness to the video.  I could see adding some resolution variations intentionally in a future version.

It was also noted that the video of searching showed how intimate this action is in reality.  Why does someone choose one link over another?  Why are certain words used over others?

It was noted that our use of closed captioning in one of the YouTube clips was an effective preface to Bloody Mary's use of text after she takes over, and that it was a good tool to tie those moments together.

Story Comments:
One comment concerned the transition from when the user is searching for divination information to when Bloody Mary takes over.  It was suggested that there be a moment, possibly a pause, before the takeover happens, as it is currently a quick transition.  I happen to agree with this comment.  There could be a moment where the user considers clicking on a link that looks suspicious; clicking it would then trigger the takeover.  There are a few other ways this could be done, but I do agree with the need for a pause.

Sound Comments:
Our use of ambient noise (room noise) while the user typed was effective in establishing this piece in a realistic place.  It also added to the intimacy of the user in their private space.

Our use of the robotic voice received mixed reviews.  One classmate thought that the voice broke him out of the sense of fear/horror that had been developing, and he suggested possibly using a more human-sounding voice.  Others thought the robot voices were effective.  I agree with both and would actually try having the voice transition from robotic into a human voice, as if Bloody Mary were transforming herself from a virus into something more.

Overall Notes:

Process:
I focused on learning and creating with Open Broadcaster Software (OBS) and Eden focused on Adobe Premiere. In addition to OBS and Premiere, we utilized Mac screen recordings, OBS recordings, voice renderings through the Mac terminal, and audio editing within Audition.
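For the curious: macOS ships with a built-in say command for exactly this kind of terminal voice rendering. A sample invocation (the voice name and output file here are placeholders, not our exact settings):

say -v Samantha -o bloody_mary.aiff "Bloody Mary, Bloody Mary, Bloody Mary"

The -v flag chooses the system voice and -o writes the speech to an audio file instead of playing it aloud, which makes it easy to pull renders into Audition.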

Final Product:
I am happy with how everything turned out and look forward to using the tools I learned to create new media that I can use within my work.

Reflection:
Working with Eden was a joy, which made this project run very smoothly.  We were able to meet at my apartment to work on editing as well as the creation of a few of the media components.  I feel fortunate that I was able to meet with my partner face to face during this isolating time of COVID.  I wouldn't change how we worked together at all.

In regards to our final creation, there are aspects I would change and/or develop more if we choose to work on this piece further. I would add elements that make the viewer think the computer will crash at any moment due to Bloody Mary.  This could be accomplished in Premiere by creating flickers.  I would also incorporate the updates from the comments above.

In conclusion, I feel that I am now able to look at video and sound media with a more critical eye.  I have a better understanding of how to use sound and video in telling a story and am glad I took this comm class.

ICM Week 6: Objects and Arrays… Jelly Fish Life

This week we were tasked with creating an object-oriented sketch consisting of objects and arrays.  I was inspired by the following clip to create a jellyfish sketch where the jellies would interact with their environment and, eventually, an intruder.

In order to track my progress, I made multiple drafts of the sketch so that I could present them here.

I envisioned a scene in the deep ocean where a school of jellyfish floated around, reacting to the currents.  As the jellyfish floated downwards, their movement would be easy and gradual.  Once they hit the point where they wanted to swim up, their speed would increase.  Eventually, an intruder, possibly a shark, would enter their swarm, causing the jellies to move away to avoid it.

 Draft 1:

Code

I began this journey by creating a jellyfish function that had all the capabilities I desired.  Once that was working, I created a class titled JellyFish, which housed the methods constructor(), show(), swim(), and move().
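For reference, here is a minimal sketch of the shape that class took (the proportions, colors, and speeds below are illustrative placeholders, not the exact values from my sketch):

class JellyFish {
  constructor(x, y, size) {
    this.x = x;
    this.y = y;
    this.size = size;
    this.ySpeed = random(0.5, 1.5); // each jelly drifts at its own pace
  }

  show() {
    noStroke();
    fill(200, 100, 255, 150); // translucent body
    arc(this.x, this.y, this.size, this.size, PI, TWO_PI); // the bell
    ellipse(this.x, this.y, this.size, this.size * 0.3); // the underside
  }

  move() {
    this.y += this.ySpeed; // gentle downward drift
    this.x += random(-1, 1); // wobble with the current
  }

  swim() {
    this.y -= this.ySpeed * 3; // pulse upward faster than the drift
  }
}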

I had originally wanted a more fluid shape for the jellyfish but, after experimenting with various combinations of functions, settled on using an ellipse and an arc.  I would like to learn how to create shapes beyond the ones available to us in the p5.js reference for 2D shapes.  Even had I been able to create the shape I wanted, I was worried about how to make those shapes interact with each other.  As noted in one of Dan Shiffman's videos, it is possible to work with shapes other than rectangles and ellipses, but it is just more complicated.

Draft 2:

 Code

Next, I worked to create varying sizes of jellyfish.  I do like this effect, but I am not totally satisfied that the jellyfish are all oriented so that they get smaller as they approach (0, 0).  I was not sure why that happens when I use push(), scale(), and pop() to create the different sizes.  It is a nice effect and adds the dimension I would like, but I do not understand why it behaves this way.
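My working theory, to be verified: scale() multiplies the whole coordinate system from the origin, so a jelly's position gets scaled along with its body, pulling everything toward (0, 0).  Translating to the jelly first and drawing relative to the new origin should keep position and size independent.  A minimal sketch of that pattern (the body shape is a placeholder):

function drawJelly(x, y, sizeFactor) {
  push();
  translate(x, y); // move the origin to the jelly's position
  scale(sizeFactor); // now scaling only affects the body, not the position
  ellipse(0, 0, 40, 12); // draw relative to the new origin
  pop();
}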

Draft 3:

Code

For the third draft, I worked to make it appear as if the jellyfish were in a confined space.  I added the height and the width of the sketch as boundaries for where they could swim.  When a jelly gets to an edge, it reverses course and swims away.  As you can see, I also removed the scaling of their sizes.
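The edge check boils down to something like this method inside the class (xSpeed and ySpeed are illustrative names for each jelly's velocity; the actual sketch is linked above):

  checkEdges() {
    if (this.x < 0 || this.x > width) {
      this.xSpeed *= -1; // reverse course at the side walls
    }
    if (this.y < 0 || this.y > height) {
      this.ySpeed *= -1; // reverse course at the top and bottom
    }
  }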

 
Shark:

 Code

Next, I looked at adding an intruder into the jellyfish's environment.  Above is the shark I created, much in the same way I created the JellyFish class.

Draft 4:

 Code

Since I wanted the scale of the jellyfish and the shark to be proportional, I shrank the jellies.  Within this sketch, JellyFish and SharkAttack are two different classes.  I played, unsuccessfully, with getting the jellyfish to move out of the way of the shark.  In the next draft, I moved SharkAttack into the JellyFish class, renaming it JellyFishLife.

Draft 5 Final:

 Code

I merged the classes into one but created two different objects from it, one for the jellies and one for the shark.

Conclusions and questions:

In order to bring this project to where I would like it, I need to figure out the following:

  • How can I get the two classes in draft 4 to interact with each other?  I would also like to have the sharks moving along both the x and y axes, maybe even chasing after the jellyfish.
  • How can I make the stroke color change in a linear fashion?  Instead of the stroke being a strobe, I would like the color to move around the body like on an actual jellyfish.
  • Is there a millis() function, or something comparable, within JavaScript?  I would like to know how to work with objects whose effects run on different timing.  Currently, I am working with frameRate() to adjust the rate at which the effects happen; unfortunately, this makes all the effects happen at the same rate.  (See the sketch after this list.)
  • I need to look into scale() further in order to get the varying sizes I would like.  This will help to create depth within my sketch. 
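On the timing question: p5.js does provide millis(), which returns the milliseconds since the sketch started.  A minimal sketch of per-object timing built on it (the Pulser class and its names are hypothetical, just to show the pattern):

class Pulser {
  constructor(interval) {
    this.interval = interval; // milliseconds between effects
    this.lastPulse = 0;
  }

  update() {
    if (millis() - this.lastPulse > this.interval) {
      this.lastPulse = millis(); // reset this object's clock
      this.pulse(); // run the effect on this object's own schedule
    }
  }

  pulse() {
    // the effect itself goes here
  }
}

Each object can be given a different interval, so effects no longer have to share the global frameRate().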

V&S Week 6: Bloody Mary Project Process Update

Conceptualization of Bloody Mary Tale:
Eden and I decided to take the tale of Bloody Mary out of the bathroom mirror and into a laptop.  In our adaptation, Bloody Mary is essentially a virus within your computer. The story is told from the perspective of the computer's user.

Bloody Mary is awakened through your investigation into her story. She wants you to say her name three times but needs you to do so while she is listening and watching.  After a few popups regarding her tale, she starts opening all the apps on your computer that would allow her to see you.  Unfortunately, your camera is covered. Next, Bloody Mary begins to connect to you through various text apps in order to tell you what she wants.  In order to hear you, she opens up the voice recorder and lets you know she is listening.

Throughout the creation of this piece, we considered how to incorporate our research regarding divination and catoptromancy, divination by use of mirrors.  The act of divination through a mirror became less central as the piece evolved.  Choosing to tell the story from the point of view of the user really informed what we were able to portray.  This perspective makes the experience personal for the viewer, and our hope is that it elicits a feeling of fear and curiosity: curiosity about the security of their personal devices.  Is anyone watching when we think the camera is off?  Who is watching and listening to us when we use our technology?  Are we actually alone?  It is interesting that Apple's latest OS update for the iPhone includes a feature that notifies you when an app is using the microphone and/or the camera. Clearly this is a concern for people.  We wanted to play with that feeling in order to create a little creepiness during Halloween.

Medium: Recorded video using screen capture and OBS
If we were technically capable, I would propose this project as an interactive installation where you, the user, would search for information about Bloody Mary on a laptop.  This would trigger a takeover of the computer by the virus, where you would be asked questions, recorded both in video and in audio, and eventually coerced into saying Bloody Mary's name three times.  I imagine this being in a dark room with only the light from the screen.  With the guidelines for this project being to create a video piece, we needed to find a way to simulate that experience for the viewer.

Before working with the screen recordings and OBS, Eden and I synced our laptop setups so that we could both work on various parts of the project.  Our initial recordings captured the mouse moving, which showed that the user was still in control of their actions.  Once Bloody Mary took over, however, we needed the mouse to disappear.  To create this effect, we used OBS.

Within OBS, we created a simulation of the desktop.  Each window was layered so that it mirrored how the apps would open on our desktop.  Below is a clip of running through the scenes within OBS. For each new popup, I created a new scene with the new window layered on top of the previous one.

 

Challenges:
The main challenge we ran into was saving work within OBS.  As avid tech users, we are accustomed (trained, even) to saving our work many times, just in case something happens.  For streaming software, this is less of a concern.  You are able to record, which we utilized.  OBS does allow you to save the profile and the scene setups in the event, for example, that you need to use a different computer for broadcasting.  Related to this is what happens when you close apps once you are finished: OBS saves the layout and the location of each linked item, and when restarting OBS you have to open the linked items in order for everything to appear as designed.  If you are opening a saved scene or profile, reestablishing links can take some time, depending on the number of scenes and items.

Outside of challenges inherent in learning new software, I didn’t run into any other difficulties with this project. 

ICM Week 5: Functions

This week we focused on creating our own functions within our sketches.

The first challenge was to modify the function within Exclamator so that it returns the text plus n exclamation marks.  I was able to do that by creating a new function called repeatStringNumTimes(string, n).  Check out the code here.  On line 10, you will want to edit the second argument within the function exclamate() in order to add as many exclamation points to the string as you desire.
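The linked sketch has the full code; the helper works along these lines (the exclamate() body below is my paraphrase of the idea, not a copy):

function repeatStringNumTimes(string, n) {
  let result = "";
  for (let i = 0; i < n; i++) {
    result += string; // append one copy per loop pass
  }
  return result;
}

function exclamate(text, n) {
  return text + repeatStringNumTimes("!", n); // text plus n exclamation marks
}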

For our sketch this week, we were asked to refactor the code within a previous sketch by creating functions and objects, if possible.  I decided to refactor a sketch from Week 2.  The original code is here.  Within the new code, you are able to call the functions moveTriangle() and fadeBackground() within the draw() function.  The parameters of moveTriangle(a, b, c) correspond to the corners of the triangle.  By changing the arguments, you are able to speed up or slow down how quickly each particular corner moves through the space.  Within the function fadeBackground(), you are able to adjust the speed at which the background fades out.
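The fade effect boils down to a common p5.js trick; a minimal sketch of the idea (the exact values live in the linked code):

function fadeBackground(fadeSpeed) {
  background(0, fadeSpeed); // a low-alpha background each frame lets old drawing fade out gradually
}

A lower fadeSpeed leaves longer trails; a higher one wipes the canvas faster.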

 

PComp: Project 1 – Social Distancing Crown

Yona Ngo, Yonatan Rozin and I were paired together for this project and decided to create what came to be called the Social Distancing Crown.  For more info on our initial planning, please check out my last PComp post here.  You can also see Yona and Yony's blog posts here: Yona, Yony.

Below is a breakdown of the project production by day. 

Day 1:

Yona, Yony and I met on the ITP floor in Brooklyn to divide up the components.  Once we each figured out our individual components, we would meet again to combine them into one piece.
Yona – speaker and ultrasonic ranger
Yony – servo motor
Daniel – LEDs

Yona found several LED NeoPixel strands on the free shelf, so we decided to use those.  I started by researching the specific strands and was able to find info for one of them, as it had the manufacturer's name, Alitove, printed on it.  I was also able to find a guide that helped get me started.  I learned that each NeoPixel draws 50mA, and the strand needs a 5V, 2A power supply.
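By that math, a 5V 2A supply can drive about 2000mA ÷ 50mA = 40 pixels at full brightness, which is worth keeping in mind when deciding how much of a strand to light at once.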

As a starting point, I used this drawing for the circuit as well as Adafruit's NeoPixel guide:

I originally tried utilizing the code from the randomnerdtutorials.com guide, but that was not working for me, so I switched to Adafruit's learning page from that point on.

(Images: Nano Schematic; Initial Nano Circuit)

Day 2:

I started the day out trying to figure out the following questions:
Questions:

  • Would I need a 3.3V regulator if I wanted to power the Nano 33 IoT from the 9V battery as well?
    • Answer: Yes, I would need a 3.3V regulator to power the Nano but I would also need to convert the 9V to 5V in order to power the NeoPixels. 
  • Do I need a logic level shifter per Adafruit (https://www.adafruit.com/product/2488) in order to use the Nano 33 IoT?
    • Answer: Yes, but did I want to have to get one?  No.
  • Library: Would we want to use the Adafruit TiCoServo library, since we would be using a servo in the same sketch?
    • Answer: Yes and No.  (more to come on this later).

I attempted to configure the circuit using the Nano 33 IoT with little success.  I would need a logic level shifter for the NeoPixels and the Nano to be able to talk to each other, as the NeoPixels need 5V logic as well as 5V power.  I decided to switch to an Adafruit Metro.  The Arduino Nano 33 IoT runs on 3.3V, and while there are ways to run components off of 5V, it was least complicated to simply switch microcontrollers.  Adafruit's guide does mention that you can run a short length of NeoPixels at 3.3V, but it would only have gotten more complicated once we added all the components.

Day 3:

I started the day by meeting with our professor, Dave Rios, to discuss my progress with the NeoPixels.  Dave suggested the following:

  • Continue with the Metro or Uno since we were able to get them working during the meeting.
  • Our biggest challenge would be powering the entire circuit once all the components were added.
  • Add the components one at a time instead of all together.

Later that day, I tested setting the NeoPixel strips to a single color using Adafruit's guide.  I ran into one challenge with the coding: how do I use Color()?

While testing out the code from Adafruit's NeoPixel library, I was only able to get Color() to show various brightnesses of blue and was unable to get a different color.

(Images: Adafruit NeoPixel Library Example Code; Blue NeoPixels)

It took me a bit, but I was able to find the following code, which showed me how to define and then update the colors of the NeoPixels:

(Images: Change Color; Code Magenta; Magenta NeoPixel; Single NeoPixel Schematic)
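For context, the pattern that finally worked looks roughly like this (the pin number and pixel count are placeholders, not our actual wiring):

#include <Adafruit_NeoPixel.h>

#define LED_PIN 6 // data pin (placeholder)
#define LED_COUNT 30 // pixels on the strand (placeholder)

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin(); // initialize the strip
  strip.show(); // start with all pixels off
}

void loop() {
  uint32_t magenta = strip.Color(255, 0, 255); // pack R, G, B into one 32-bit value
  for (int i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, magenta); // stage the color for each pixel
  }
  strip.show(); // push the staged colors out to the strand
}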

Now that I was able to control the color of one NeoPixel strand, I wanted to test having two NeoPixel strands on one board. I used the following code but was unsuccessful in getting the black NeoPixel strip to show green and the white NeoPixel strip to show red.

(Images: Double NeoPixel Code; NeoPixel Strands; Double NeoPixel Schematic; Breadboard View)
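For reference, the general shape of driving two strands independently with the Adafruit library (pins and counts are placeholders; this shows the pattern rather than our exact code):

#include <Adafruit_NeoPixel.h>

Adafruit_NeoPixel blackStrip(30, 6, NEO_GRB + NEO_KHZ800); // data on pin 6
Adafruit_NeoPixel whiteStrip(30, 7, NEO_GRB + NEO_KHZ800); // data on pin 7

void setup() {
  blackStrip.begin();
  whiteStrip.begin();
  blackStrip.fill(blackStrip.Color(0, 255, 0)); // whole black strip green
  whiteStrip.fill(whiteStrip.Color(255, 0, 0)); // whole white strip red
  blackStrip.show();
  whiteStrip.show();
}

void loop() {
  // nothing to update; the colors are set once in setup()
}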

Day 4:

I met with ITP resident Arnab Chakravarty to go over my questions about the Adafruit library.

Adafruit's code uses #define followed by LED_PIN 6 to establish that pin 6 will host the output.  Per Arnab, this serves the same role as assigning the pin and then using pinMode.  Arnab was able to walk me through utilizing the library and working with the two NeoPixel strips.
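My own note for later: as I understand it, #define is a preprocessor substitution that just gives the pin number a name, and the NeoPixel library calls pinMode on that pin for you inside strip.begin(). In other words:

#define LED_PIN 6 // preprocessor constant: the name is swapped for 6 at compile time
const int ledPin = 6; // a typed constant that does the same job in modern Arduino style

Either works for naming the data pin; the library handles the actual output configuration.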

Day 5:

Met with Yona on the floor.

Added the ultrasonic ranger (w/o the speaker) to the NeoPixel circuit.
⁃ Added a 9V battery with a 5V regulator for the ultrasonic ranger
⁃ Updated the code from earlier, as I had forgotten to declare the NeoPixel strip object.

Test 1:
Taking into account the changed items listed above, I was able to get the NeoPixel strip to start green and then turn red as I moved my hand closer to the ultrasonic ranger.  However, I was not able to get the strip to turn back to green once my hand was out of range.

Test 2:
Changed (line 49)
else {
  setNeoPixelToGreen(); // Set NeoPixel to green
}

to

else if (distance > 10 && previousValue > 10) {
  setNeoPixelToGreen(); // Set NeoPixel to green
}

This was an attempt to get the NeoPixel to change back to green only when both the current distance and the previousValue are greater than 10.

(Images: Ranger; UltraSonic Ranger and NeoPixel Circuit)
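Put together, the loop logic amounts to something like this (readDistance() and setNeoPixelToRed() stand in for the real functions in our sketch):

long previousValue = 0;

void loop() {
  long distance = readDistance(); // ultrasonic reading, assumed in cm
  if (distance <= 10) {
    setNeoPixelToRed(); // someone is too close
  } else if (distance > 10 && previousValue > 10) {
    setNeoPixelToGreen(); // two out-of-range readings in a row: back to green
  }
  previousValue = distance; // remember this pass for the next one
}

Requiring two consecutive out-of-range readings keeps a single noisy sensor value from flashing the strip back to green.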

Test 3: Running the sonic ranger off the battery.  The ranger was not reading properly, while the lights worked momentarily.

Test 4: Running the NeoPixels off the 9V battery.  This was unsuccessful, as the NeoPixels did not light up.  (I realized on Day 6 that I was using a 3.3V regulator instead of the 5V regulator I needed for this to work.)

Yona and I were finally successful in getting the sensor, speaker and NeoPixels to work together!

(Images: Circuit minus Servo; Circuit Schematic minus Servo)

The next step was to build the housing for the components.  Once it was built, I tested the wiring to make sure nothing had come undone during this step.  It was all working and ready for the servo motor.

(Images: Housing Step 1 Pattern; Inside of Crown Housing; Outside view without the servo; Inside view 2)

Day 6:
Yona, Yony and I met on the ITP floor to complete the circuit by adding the servo code to the overall code.

We considered adding an op amp to the circuit so that the voice would be louder but, after reviewing the component's datasheet, decided that this would best be left to another time.  We were not even sure where to begin with working it into the circuit, as the datasheet was not clear enough.  Dave Rios suggested using fig. 22 on page 25 of the TL072P's datasheet.  Unfortunately, we were not able to fully decipher the schematic in time to implement it in the finished project.  If we pursue refining this project, we will invest in finding an amp to bring out the voice.

Test 1: The servo moved at the start but did not move again, while the other components acted as desired.  Tested adding a 9V battery for the servo, unsuccessfully.

Test 2:
Realized that I had been using a 3.3V regulator in the circuit with the battery, so I switched to a 5V regulator.  Unfortunately, still nothing.

Test 3:
Checked the power for everything by unplugging the speaker and NeoPixels to test whether the servo would move more than once. It did not.

(Images: Complete Circuit Ready for Final Housing; Final Schematic)

(Images: Final Front View; Final Side View; Final Back View)

 

Conclusion:
We were unable to figure out why the servo only worked once in combination with the speaker and NeoPixels.  We were, however, able to create a piece that works once!

 

We recorded two videos: (1) just the servo running Yony's original program, and (2) the final piece, with the servo only running that initial time.

Questions for the next build:

  • How do we implement the amp into the circuit?
  • How do we get the lights to change, the voice to speak, and the finger to wag without a delay? (See the sketch after this list for one direction.)
  • How do we properly power the entire circuit (this might have been the issue with the lag and the inability of the servo to function)?
  • Besides drawing in Illustrator or Photoshop, how can I improve my schematic diagrams?
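On the delay question, one direction we might try: replace blocking delay() calls with millis()-based timers so each component updates on its own schedule (updateLights() and wagFinger() are stand-ins for our actual functions, and the intervals are placeholders):

unsigned long lastLightUpdate = 0;
unsigned long lastServoUpdate = 0;

void loop() {
  unsigned long now = millis();

  if (now - lastLightUpdate >= 50) { // refresh the NeoPixels every 50 ms
    lastLightUpdate = now;
    updateLights();
  }

  if (now - lastServoUpdate >= 20) { // nudge the servo every 20 ms
    lastServoUpdate = now;
    wagFinger();
  }

  // the speaker can be handled here too (Arduino's tone(), for one, is non-blocking)
}

This keeps loop() free-running so no single component starves the others, and it might also help narrow down whether the servo stall is a timing issue or a power issue.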

Additional Component Datasheets and Guides:
Sensor datasheet: https://cdn.sparkfun.com/datasheets/Sensors/Proximity/HCSR04.pdf
Servo datasheet: https://cdn-learn.adafruit.com/downloads/pdf/analog-feedback-servos.pdf
Nano 33 IoT:
https://itp.nyu.edu/physcomp/introduction-to-the-nano-33-iot/
NeoPixel Strands: https://www.alitove.net/product/ws2812b-300bk-np/