An Interview with NYU Tandon Mobile AR Lab’s Mark Skwarek
By Jayson Miller
Augmented Reality (AR) is set to become one of the largest new industries of the 21st century. The Mobile Augmented Reality Lab at the NYU Tandon School of Engineering is an addition to the school’s Integrated Digital Media Program and introduces students to working with augmented reality in a myriad of ways. The lab’s founder, Mark Skwarek, boasts an extensive background that straddles the worlds of art and technology. His work has appeared in numerous places, including the Museum of Modern Art (MoMA), WIRED, the New York Times, and the Boston Globe. Mark currently has a show at the Queens Museum of Art. We recently chatted with Mark about the world of AR, the exciting things the Lab is doing, and some of the ways students and faculty can get involved.
JM: What is the difference between augmented reality (AR) and virtual reality (VR)?
MS: VR lets people leave reality for a world of computer simulations. These simulations can create highly immersive experiences. When done correctly, these experiences are amazing, with the potential to be addictive. Developers are making efforts to mix the VR experience with real-world objects for a more embodied experience. For example, a user in a VR world sees a virtual couch; they walk over and sit down on it, and a real-world couch at the same location allows them to sit comfortably.
Augmented reality overlays the real world with digital content. The user’s digital experience is in their field of view instead of a laptop screen or smartphone display. The user sees the real world as they normally would, but there is digital content embedded in their surroundings. The locations of train stations and city buses could be seen through buildings. Real time access to this information about one’s surroundings can give the user a much stronger connection to reality.
Compared to AR, VR is currently the more refined technology and has a stronger user experience because the virtual world does not have to stay in alignment with the real world. Augmented reality has to update the virtual content’s position to stay in alignment with the real world as the user moves. Bandwidth and processing power are bottlenecks. This is a challenging problem, but hardware like Microsoft’s HoloLens shows a working proof of concept.
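To make the alignment problem concrete: every frame, an AR renderer must re-project each virtual anchor from world coordinates into the camera’s current view, so that as the user moves, the content appears pinned to the real world. The following is a minimal illustrative sketch, not code from any real AR SDK; it assumes an idealized pinhole camera looking straight down the +z axis, and all function and parameter names are hypothetical.

```python
def project_anchor(anchor, cam_pos, focal_px, img_w, img_h):
    """Project a 3D world-space anchor into 2D pixel coordinates.

    Assumes a pinhole camera at cam_pos looking down the +z axis; a real
    AR engine would also apply the device's rotation, read from its IMU
    and visual tracking, every frame.
    """
    x = anchor[0] - cam_pos[0]
    y = anchor[1] - cam_pos[1]
    z = anchor[2] - cam_pos[2]
    if z <= 0:
        return None  # anchor is behind the camera, nothing to draw
    u = img_w / 2 + focal_px * x / z
    v = img_h / 2 + focal_px * y / z
    return (u, v)

# An anchor straight ahead lands at the image center; if the user steps
# sideways (cam_pos changes), the same anchor must be redrawn at a new
# pixel position -- this is the per-frame update AR cannot skip.
center = project_anchor((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 800.0, 640, 480)
```

Running this projection, plus tracking, for every anchor on every frame is why processing power is a bottleneck in a way it is not for VR.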
JM: What are some practical uses of AR?
MS: When used correctly, AR can potentially make humans smarter and strengthen our connection to reality: from general information about people’s surroundings, such as historical events and the daily news, to safety features that could draw a line on the ground leading you out of the city during a disaster scenario. Someone with no training would be able to fix complex car engine problems simply by looking at the engine; step-by-step instructions would overlay the engine, showing how to unscrew each bolt.
I predict that this will help democratize knowledge and could bring about something similar to the industrial revolution. Highly-skilled jobs that take years of training could be learned in a fraction of the time. People in remote locations with little or no access to higher education will be able to perform complex tasks that normally would have taken years of specialized schooling to learn.
JM: What type of work happens at the Mobile AR Lab?
MS: The Mobile AR Lab at Tandon is working on next-generation mobile AR experiences that have not been done before and could make the world a better place. We focus on projects that can reach the public now instead of making speculative work that can only live in a multi-million dollar lab. We look for ways to make people’s lives better, easier, safer, and more exciting.
There is great potential for AR to be misused. I tell my students and lab members that it’s up to them to make it a useful technology. Together we are shaping the way people in the future will understand it.
JM: What are some projects that students are working on?
MS: We have done a lot of work at the lab. We created an app that shrinks users so they can explore the human body from the inside; “The Augmented Reality Human Body – Mobile App,” done in conjunction with the Discovery Channel, allows users to view cellular interactions taking place in different parts of the body.
We have also been doing a lot of work on navigation. Imagine being in a train station, hospital, or airport and being able to see where you need to go, through the walls, with a line on the floor showing you how to get there. AR can let users see a room inside a building from the street below and get real-time information about what’s happening in that room while looking at it. The project makes users smarter by overlaying the real world with additional information that lets people navigate unfamiliar spaces with ease. After creating the first version of this project, the lab partnered with the world’s leading AR navigation company, and students currently have internships there.
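Behind the “line on the floor” is an ordinary routing step: the system needs a shortest path from the user to the destination through the building. As a toy sketch of that step (a real indoor-navigation system works on surveyed maps and fused positioning data, and none of these names come from the lab’s actual software), a breadth-first search over a grid floor plan finds the tiles the line would be drawn along:

```python
from collections import deque

def route(floor, start, goal):
    """Shortest path across a grid floor plan via breadth-first search.

    floor is a list of strings: '.' is walkable, '#' is a wall.
    Returns the list of (row, col) tiles from start to goal, or None.
    """
    rows, cols = len(floor), len(floor[0])
    prev = {start: None}          # visited set + back-pointers in one dict
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:        # walk the back-pointers to rebuild the path
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and floor[nr][nc] == '.' and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # no route exists
```

BFS guarantees the fewest-tile route on an unweighted grid; the AR layer then renders that tile sequence as the guiding line in the user’s view.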
Another exciting project is Open Telepresence, open-source software that can create 3D video of real-world places and people. Users can put these videos online, allowing others to view them from other locations. It’s like a 3D Skype: it lets you be in two places at once and experience a location far from where you are. These experiences happen in real time, so what you are viewing is actually happening. People use a mobile device such as a smartphone, tablet, or Google Cardboard to view the 3D content. Reporters and bloggers can share world events in 3D from remote locations. Users can place an ocean beach in front of their office workspace. First responders to an accident can get expert advice from a skilled professional half a world away. Families can see a distant relative’s living room across from their own and eat together. Open Telepresence creates the final generation of networked communication by allowing users to be present in two locations at once. It is the first telepresence software available to the general public, and we are letting the public help develop it from the beginning; we’ve created a toolkit.
JM: How will we see AR grow and integrate into our daily lives?
MS: Places where people will start to see AR will be: in the workplace to help workers be safer and more efficient; in advertising and printed media; in data visualization, navigation, cars, entertainment, mobile gaming, social media; and communication in general.
Currently, the most common user experience is seen through a mobile device: smartphone or tablet. Users look at the world through their mobile devices and see digital content overlaying real objects and locations. An example would be looking through your phone in camera mode and seeing a virtual pin marking the location of the restaurant you wish to travel to. However, users don’t like having to hold their phones up to view digital content. It’s fun the first few times, but it gets old fast. For AR to achieve mass adoption by the general public, the experience will have to move from the phone to lightweight glasses.
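The restaurant-pin example reduces to a small piece of geometry: compute the compass bearing from the user’s GPS position to the restaurant, compare it with the phone’s compass heading, and draw the pin only when that bearing falls inside the camera’s horizontal field of view. Here is a hedged sketch using the standard great-circle bearing formula; the function names and the 60° field-of-view default are illustrative assumptions, not taken from any particular AR app.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Compass bearing in degrees (0 = north, 90 = east) from point 1 to point 2."""
    dlon = math.radians(lon2 - lon1)
    lat1, lat2 = math.radians(lat1), math.radians(lat2)
    y = math.sin(dlon) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def pin_visible(user_lat, user_lon, pin_lat, pin_lon, heading_deg, fov_deg=60.0):
    """True if the pin's bearing lies within the camera's horizontal field of view."""
    b = bearing_deg(user_lat, user_lon, pin_lat, pin_lon)
    # smallest signed angle between bearing and heading, folded into [-180, 180]
    diff = abs((b - heading_deg + 180.0) % 360.0 - 180.0)
    return diff <= fov_deg / 2.0
```

When the check passes, the app maps the bearing offset to a horizontal screen position and draws the pin there, which is why the pin appears to “stick” to the restaurant as the user pans the phone.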
JM: What do people who work in AR/VR program in?
MS: C, C++, C#, JSON, Unreal, Java, Python, and the list goes on…
We also use all types of creative software so there are many points of entry for people interested in working with AR.
JM: What are the types of jobs people working in AR/VR are doing?
MS: Entertainment, literature, advertising, task-based assistance, UX, gaming, social media, and safety to name a few. Everyone is becoming excited about the technology so new areas are coming up all the time.
JM: How can students/faculty get involved?
MS: We are always looking to collaborate with the NYU community on interesting new projects. If any faculty or students have a great idea, I would love to hear about it. We are currently working on a number of projects, such as helping to design rides for a VR amusement center and making Google Cardboard apps for admissions. We are also working with General Motors’ racing division, Northrop Grumman, Insider Navigation Systems, and North West UAY [aerospace and drones] to create AR experiences much like what you see in Iron Man and Marvel’s Agents of S.H.I.E.L.D. Please check out our new work, which lets you travel to Mars and drive around Tandon’s Luna Bot to collect soil samples. We need help on all of these projects, so anyone who is interested should contact me.
To contact Mark, please email mls386@nyu.edu.