Zeyao Li | Stanbot: Intro to Stan Culture Through Voice Tech

Stanbot is an unconventional use of voice technology. It turns the AI assistant into a “stan” that talks to users in the toxic way stans talk on Twitter and other social media. Through the interaction, the user gains a basic understanding of stan culture and actively participates in it within a controlled environment.
 

[Image: Educate the user about stan culture]
[Image: Stan culture experience]

 
What is a “stan”? If I asked this question on the street, nine out of ten people would not give me the correct answer. In fact, the term “stan” was recently added to the Merriam-Webster dictionary as both a noun and a verb. Stanning became a cultural phenomenon with the breakthrough of social media platforms such as Twitter. The project starts with two questions: what is “stan culture,” and why should people care about it? In short, stan culture refers to the heavily obsessive fandom communities that are active on Twitter. Their shady, aggressive attitude towards other fandom communities hurts not only the celebrities themselves but also their fans. Learning about stan culture prepares people for the conversations they may end up having with stans, and helps them avoid unnecessary trolling and fights online.
Stanbot tackles one specific media issue of the digital age with voice technology. The project turns the AI assistant into a “stan” that talks to users in the toxic way stans talk on Twitter and other social media. The user walks up to the phone and starts the conversation with “Talk to Stanbot.” The bot picks up the user's words and replies with programmed responses based on what the user says. Through the interaction, the user gains a basic understanding of stan culture and actively participates in it within a controlled environment.
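The exchange can be thought of as a scripted match-and-reply loop: the assistant transcribes what the user says, and the bot fires back a pre-written “stan” line. The sketch below is purely hypothetical; the keywords and replies are invented for illustration and are not the project's actual Google Assistant dialogue.

```typescript
// Hypothetical sketch of a scripted "stan" reply loop; not the project's real code.
const replies: { keywords: string[]; response: string }[] = [
  { keywords: ["who", "favorite"], response: "Obviously my fave outsold yours. Don't even start." },
  { keywords: ["album", "song"],   response: "Stream the new album or stay pressed, it's that simple." },
  { keywords: ["stan"],            response: "If you don't stan, you're just a local. Educate yourself." },
];
const fallback = "The shade of it all... you clearly don't know what you're talking about.";

// Match the transcribed utterance against the keyword list and pick a reply.
function stanbotReply(utterance: string): string {
  const text = utterance.toLowerCase();
  const match = replies.find((r) => r.keywords.some((k) => text.includes(k)));
  return match ? match.response : fallback;
}

// Example turn: stanbotReply("Who is your favorite artist?")
```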
Stanbot is an unconventional use of voice technology. Usually, people assume a voice assistant will help them play music or tell them the weather. Stanbot, however, goes beyond the common use of this technology and explores its other side. Voice technology can replicate the intimacy of talking to a real voice. Stan culture exists mostly on digital platforms, and hardly anyone uses stan terminology in real life. Since people might never have a chance to talk to a stan in person, voice technology can mimic the situation of being dropped into a stan Twitter conversation. With Google Assistant, the conversation unfolds in a continuous and natural way, so the Stanbot's rudeness and shade come through even more strongly during the conversation.
The project is intended to make digital citizens (iGen) aware of this relatively new phenomenon, stan culture, and to start questioning the toxicity of the stan community. Is the tension between different idols' stans necessary? Do stans have to troll other stans for supporting their favorites? Through this project, I bring the online space into reality and allow audiences to experience the cyberbullying and cyber-arguing inside stan culture. Voice assistants like Alexa or Google Home are designed to be intimate and warm towards users, yet Stanbot rebels against those conventions. The machine-like voice reinforces an uncanny-valley feeling when the bot starts speaking explicit words. All in all, the project fills a gap: using voice technology to examine cultural issues rooted in social media.

 


Tags: #ConversationDesign #UserExperience #MediaStudies

 

Ye Chen | JING: an interactive finger destressing studio

JING is an interactive finger destressing studio where users can enjoy a ten-minute interactive mudra session, with music and visual effects incorporated, to relieve themselves from stress.

 

[Image: #gesture_detection]
[Image: #geode_effect]

 
In a fast-paced, constantly changing world, anxiety seems to have become an inescapable part of many people's everyday lives. We may think of anxiety and stress as normal feelings; however, they can interfere with our daily activities and relationships in destructive ways if they are not given enough attention early on. It is therefore important to release that tension promptly once the stress starts to feel overwhelming.
JING is an interactive finger destressing studio where users can enjoy a five-minute interactive mudra session to relieve themselves from stress. A mudra is a symbolic or ritual gesture, commonly used in yoga practice, that is mostly performed with only the hands and fingers. The purpose of this project is to provide users with a relaxing experience that brings hearing, sight, and finger gestures together in one interaction. Users start by watching a short introduction to JING and then place their hands above the Leap Motion, as instructed, to perform the mudra gestures shown on screen. They are asked to fit their fingers into the circles on screen, and once the Leap Motion detects that they are performing a gesture correctly, it automatically starts to play a short piece of music. In mudra practice, each finger represents a different natural element, so each gesture is paired with a short soundtrack of natural sounds such as fire, rain, or birds chirping. Users need to hold each gesture for one minute before moving on to the next one. By moving their hands back and forth, left and right around the Leap Motion, they can play with the soundtrack by changing its rate and amplitude. JING also has a futuristic aesthetic, with a space-travel background video playing throughout the session and a geode-like visual effect that displays the hand gestures on screen in real time, which creates a sense of detachment from reality and helps users immerse themselves in the experience.
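The hand-position-to-sound mapping described above can be sketched roughly as follows. This is a minimal illustration assuming the leapjs browser client (global `Leap`) and the Web Audio API; the file name, the 400 mm scaling and the mapping ranges are assumptions, not JING's actual implementation.

```typescript
// Minimal sketch: map palm position from the Leap Motion to playback rate and volume.
declare const Leap: any; // provided globally by leap.min.js

const audioCtx = new AudioContext();
const gainNode = audioCtx.createGain();
gainNode.connect(audioCtx.destination);

let source: AudioBufferSourceNode | null = null;

// Load and loop one of the nature-sound tracks; called once a mudra is matched,
// e.g. startTrack("rain.mp3").
async function startTrack(url: string): Promise<void> {
  const data = await (await fetch(url)).arrayBuffer();
  const buffer = await audioCtx.decodeAudioData(data);
  source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;
  source.connect(gainNode);
  source.start();
}

// Each Leap Motion frame reports palm positions in millimetres above the sensor.
Leap.loop((frame: any) => {
  if (!source || frame.hands.length === 0) return;
  const [x, y] = frame.hands[0].palmPosition;
  source.playbackRate.value = 1 + x / 400;                 // left/right nudges the rate (~0.5x–1.5x)
  gainNode.gain.value = Math.min(Math.max(y / 400, 0), 1); // height sets the volume
});
```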
Unlike existing destressing tools such as meditation mobile apps and interactive yoga studios, JING focuses on how the interaction between the user and the piece can help users destress with minimal space and time requirements, while still offering an immersive experience through music and visual effects. Compared to meditation mobile apps, JING adds more interactive elements to the destressing process. Research has shown that when people are under extreme stress and pressure, it is difficult for them to pull their attention away from what is currently bothering them and fully focus on a meditation or yoga session. With the small “distractions” of the visual and music effects, users are able to switch their focus to the five-minute mudra journey. Compared to interactive yoga studios, JING stands out for its convenience and relatively low space requirements, since the session only involves the hands and fingers.
In all, JING combines elements of both yoga and meditation into an interactive destressing process. By performing the mudra gestures and playing with the music and visual effects, users can fully relax for five minutes.

 


Tags: #mudra_gestures #destressing #immersive

 

Xincheng (Peter) Huang | Immersive Strategies: A First-Person Perspective Chess in VR

This project builds a human-sized chess set in VR that allows a player to play a chess game as any of the chess pieces from a first-person perspective, and explores the possibility of providing immersive interaction and strategic challenge in a single game.
 

[Image: The Chess World]
[Image: Battling]
[Image: Up into dust]

 
A common and easily neglected fact about games is how many of them are actually virtual representations of warfare. These games can be roughly divided into two categories: third-person strategy games such as StarCraft or Command & Conquer, and first-person combat games such as Counter-Strike. The former are usually more intellectually engaging, but players do not feel as physically involved as they do in first-person combat games. Both categories have their own merits, yet they are usually treated as separate, and attempts to combine their advantages are rare. An interesting question, then, is what a game would be like if its strategic components and first-person combat components were combined.
 
The project “First-Person Perspective VR Chess” attempts to answer this question by using virtual reality to bring first-person combat components into chess, a traditionally purely strategic game. The reason this project combines chess and virtual reality is that they represent two extremes of gaming experience. On the one hand, chess is commonly viewed as a strategy-oriented game that requires calm and intellect. On the other hand, virtual reality was built to deliver immersive, first-person experiences. By combining them, the advantages of strategy games and first-person combat games can both be put to use.
 
Admittedly, incorporating VR into chess is nothing new. However, most existing VR chess titles only recreate regular chess play in a virtual environment, rather than letting players play from a first-person perspective with human-sized pieces. In this game, players stand on a human-sized chess set and play from the perspective of any of the chess pieces. The target users are both chess lovers and video game lovers in general. For the former, the project creates an immersive chess-playing experience; for the latter, it showcases how a game can be both strategically intriguing and physically engaging.
 
As development progressed, it turned out that virtual reality does introduce an immersive gaming experience for chess, but further questions arose. The newly introduced components change the strategic picture of the game. First, players no longer see the entire chessboard at once. Second, the human-sized pieces not only block the view of part of the board, they also prevent a player from controlling whatever pieces stand in that hidden area. The project is therefore not only about how a first-person perspective brings immersive engagement to a strategic game, but also about how it reshapes the strategies of chess. By changing game dynamics such as piece size and control mechanics, the game requires players to think about chess strategy differently. For example, a player can now block the opponent's sight of an important piece to protect it, or sacrifice a piece in order to get a better view of the board. Perhaps the answer to the question raised at the beginning can be formulated this way: when an immersive gaming experience is combined with a strategy-oriented game, the strategic component is either degraded or altered, potentially leading to a completely different gaming experience. Is that experience better or worse? Maybe we have to leave it for the players to decide.
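The occlusion rule described above ("you can only control what you can see") can be made concrete with a simple board-level check. The following TypeScript sketch uses assumed data structures (an 8×8 array of piece ids and a sampled line-of-sight test); it is illustrative only and is not the project's actual VR code.

```typescript
// Illustrative sketch of the visibility rule: from the square the player occupies,
// a piece is selectable only if the straight line to it is not blocked by another piece.
type Square = { col: number; row: number };   // 0..7 each
type Board = (string | null)[][];             // piece id or null for an empty square

// Sample squares along the segment from `from` to `to`; any occupied square blocks sight.
function lineIsClear(board: Board, from: Square, to: Square): boolean {
  const steps = Math.max(Math.abs(to.col - from.col), Math.abs(to.row - from.row));
  for (let i = 1; i < steps; i++) {
    const col = Math.round(from.col + ((to.col - from.col) * i) / steps);
    const row = Math.round(from.row + ((to.row - from.row) * i) / steps);
    if ((col !== to.col || row !== to.row) && board[row][col] !== null) return false;
  }
  return true;
}

// A player may only control their own pieces that are visible from the current viewpoint.
function controllablePieces(board: Board, viewpoint: Square, own: Set<string>): string[] {
  const visible: string[] = [];
  for (let row = 0; row < 8; row++) {
    for (let col = 0; col < 8; col++) {
      const piece = board[row][col];
      if (piece && own.has(piece) && lineIsClear(board, viewpoint, { col, row })) {
        visible.push(piece);
      }
    }
  }
  return visible;
}
```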
 
Tags: #VirtualReality #First-PersonGaming #MagicChess

 

Konrad Krawczyk | iLiveInPublic: Gen Z, Dataveillance & Digital Nativity

iLiveInPublic is a Web-embedded performance installation which captures the implicit forces of timed data surveillance and puts them on display, tangibly and publicly.
 

[Image: The full setup (front)]
[Image: The setup (close-up)]
[Image: The setup (during the show)]
[Image: The setup (close-up)]

 
As part of the early Generation Z, I am a data commodity. This is not merely because I grew up alongside the Internet. The very brief history of the Web includes a time when nobody would know if a dog had been sitting in front of the keyboard. Now, the Internet's various corporate forms compete with one another over who knows the person behind the keyboard in more specific detail and with longer hindsight. The Internet itself has changed, and it happened right as the first digital natives became adolescents. This was around the early-to-mid-2000s, precisely when the digital form of surveillance capitalism was invented.
As Shoshana Zuboff puts it, the core question surveillance capitalism seeks to answer is how to modify human behaviour towards self-serving, profit-making ends. The general answer is a piece of privacy-education wisdom that we tend to happily ignore: any interaction with almost any website is a potential data entry that goes far beyond the responses we immediately see on screen. Through sustained and massive accumulation of individuals' data on the Web, “data-driven” companies gather raw data as commodities for powering sophisticated black-box models. Over time, these profiling and targeting models overfit to our personal traits, increasingly reinforcing our standing in society and largely depriving us of the right to the future tense.
Knowing both the benefits and the risks of this new, seemingly “free” capitalist logic, we still (quite literally) accepted the terms and conditions. For the youngest Internet users, the immense network effect and the resulting expectation of presence have made participation a de facto non-decision. What does it mean to live in relation to a megastructure that knows you before you know yourself? What does it mean for digital natives, for whom highly individualised experiences are manufactured before they even get to make up their minds and grow?
iLiveInPublic is a Web-embedded performance installation that captures this new model by putting it into a public and fully visible form. By sitting in a glass enclosure for an entire working day, printing out browser tracking data and publicly displaying the aggregate personality profile, the living human inside the installation turns themselves into an observable object of data surveillance. By making the implicit interactions and data flows explicit, the performance aims to prompt viewers to think about their own relationship to data tracking and profiling, and about the extent to which this experience is common to all users of the siloed, corporate-centered Internet.
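The installation's actual capture pipeline is not documented here, but as a rough illustration of the kind of implicit data flow being made visible, a browser-extension background script can log every third-party host a page contacts. The log structure and filtering rule below are assumptions for illustration, not the project's code; the script assumes a Chrome extension with the "webRequest" permission.

```typescript
// background.ts — illustrative only: record which third-party hosts each page contacts,
// the kind of raw tracking data the performance prints out during the day.
const requestLog: { time: string; page?: string; tracker: string }[] = [];

chrome.webRequest.onBeforeRequest.addListener(
  (details) => {
    const trackerHost = new URL(details.url).hostname;
    const pageHost = details.initiator ? new URL(details.initiator).hostname : undefined;
    // Only keep requests that leave the page's own domain (potential trackers).
    if (pageHost && trackerHost !== pageHost) {
      requestLog.push({
        time: new Date().toISOString(),
        page: pageHost,
        tracker: trackerHost,
      });
    }
  },
  { urls: ["<all_urls>"] }
);
```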
The performance also proved to be a digital social experiment in several ways. First, it documented patterns of browsing, looking and typing that would otherwise be difficult to capture. As users, we tend to find most pages through link referrals (the same link structure that Google's PageRank algorithm uses to calculate relevance), and these referrals often reveal a concerning attention logic. Because of this, the printed data showed an unstructured mixture of knowledge and distraction, where news blends with infotainment and family updates with celebrity gossip.
Another question that came up during the show itself was the issue of consent in performance. Installations, especially those with many moving or interactive parts, often invite participants who watch, take pictures, or even try to talk to the performer. How should boundaries with the audience be established in order to make the performance most impactful? The context of the Final Show seemed extremely important in this case, with highly participatory student exhibitions creating an expectation of interactivity, participation and even dialogue. In order to build a better understanding of the project among various groups, it proved valuable to respect that context, even at the expense of altering the results of the experiment.
As a form of documentation, and perhaps a meta-commentary, the performance was live-streamed and saved on twitch.tv, a video-game streaming platform. This became a major form of engagement with the installation, including for viewers who could directly observe the performance on site in Shanghai.
Tags: #dataveillance #youAreYourData #generationSeen