By Bridgette Austin
Imagine watching a televised football game and seeing the first-down line marked on the field. That virtual line drawn over the playing surface (sports scores displayed during a TV match are another example) is, in fact, augmented reality in action.
But the use of augmented reality (also known as AR) isn’t limited to the world of sports. Augmented reality—the modification of a user’s view using computer graphics, sounds, video, and other data—is used every day to insert digital and virtual information into your real-world environment. Heads-up displays in automobiles, animated advertisements in catalogs and on product labels, and smart eyewear like Google Glass are all examples of how people are using this fast-growing technology.
According to Ken Perlin, professor of computer science in the Media Research Lab at the Courant Institute of Mathematical Sciences and director of the Games for Learning Institute (see the Connect article “Video Games & the Future of Learning,” April 2012), augmented reality will play a key role in creating a more blended, interactive experience in and outside the classroom.
“I think the greatest impact of augmented reality will be further out as it gradually becomes an integral part of everyday life,” states Perlin. “Just as we’ve moved in the last six years to iPhone and Android devices, we’ll similarly move to wearables over the next decade. After that, augmented reality will be integral to all fields of activity, including learning and higher education.”
The NYU community is already getting a glimpse of the possibilities around AR with the help of research groups such as the Mobile Augmented Reality Lab. Working out of the Media and Games Network space in Brooklyn, the Lab is exploring emerging AR technologies in areas ranging from voice recognition to 3D visualizations and gaming. The group’s varied projects reflect AR technology’s far-reaching potential in enhancing users’ perception of the world they see, feel, and hear around them.
“The meaning of augmented reality is constantly changing. For most people, augmented reality has largely meant seeing computer graphics superimposed onto the screen of a user’s camera phone,” says Perlin. “But that will all dramatically change as we shift to wearables. When this transition is complete, we’ll be entering the dawn of ‘the age of computer graphics’ when the augmentation of reality can be seen anywhere, at any time, without needing to hold up a device.”
Augmented Reality in Action
While many augmented reality apps require special headgear, new mobile apps are designed to work with smartphones equipped with components such as an accelerometer, compass, and GPS. An Internet connection and video camera, coupled with software that recognizes visual markers (e.g., printed images), allow the device to overlay graphical data on physical objects.
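The sensor combination described above can be sketched in miniature. The following is a hedged, simplified Python illustration (the function names, 60° field of view, and screen width are assumptions for illustration, not details from any app mentioned in this article): it uses a GPS-derived bearing and the compass heading to decide where on screen a point of interest should be drawn.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Great-circle bearing from the device to a point of interest, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(device_heading, poi_bearing, fov_deg=60, screen_width=1080):
    """Horizontal pixel position for a point-of-interest marker, or None if it
    falls outside the camera's assumed field of view."""
    # Signed angle between where the camera points and where the POI lies, in -180..180.
    offset = (poi_bearing - device_heading + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None  # POI is off-screen
    return (offset / fov_deg + 0.5) * screen_width
```

A point due east of the device (`bearing_to(0, 0, 0, 1)` ≈ 90°) would be centered on screen only when the compass reports the phone facing east; real AR apps refine this with the accelerometer and gyroscope to handle tilt.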
Augmented reality is at the center of Tunnel Vision NYC, a project conceived by recent Interactive Telecommunications Program (ITP) graduate Bill Lindmeier. Like many AR apps, Tunnel Vision uses GPS to locate objects, people, and stores near a smartphone user. The iOS app brings New York City’s subway map to life by letting users overlay animated visualizations of the city’s transit and census data onto their smartphone display.
“The idea was an evolving process with the help of my thesis class, but it sits at the junction of a few interests of mine: computer vision, data visualization, and the subway system,” says Lindmeier. When a user holds a smartphone up to a wall, paper, or onscreen map, Tunnel Vision displays real-time information on turnstile activity and the estimated positions of trains en route to stations.
On what inspired him to create Tunnel Vision, Lindmeier reflects, “My initial motivation was to create a portrait of New York City through data. I wanted to create an interactive experience that could reveal stories about the people and neighborhoods that make up the city.”
Another ITP student, Adarsh Kosuru, debuted his Slit app at the 2013 ITP Winter Show to demonstrate how a phone camera can use augmented reality to identify objects and trigger specific actions. Based on ofxQCAR (Qualcomm’s AR addon for openFrameworks), Slit tags objects with images, sound, or text. Slit users can then point their phone camera at an appliance, television, or other connected device, and the app presents controls that let them interact with the real object.
“By using simple gestures, rules can be set to connect multiple objects together to trigger them simultaneously. For example, this can be useful in situations where a user wants to switch off all home appliances while watching TV to reduce home energy use,” says Kosuru.
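Kosuru’s idea of gestures that connect multiple objects to a single trigger can be illustrated with a minimal sketch. Everything below (the class names, the “watch-tv” gesture, the devices) is a hypothetical illustration of the pattern, not code from Slit:

```python
# Hypothetical sketch: tagged devices plus rules that trigger several at once.
class Device:
    def __init__(self, name):
        self.name = name
        self.on = True

    def switch_off(self):
        self.on = False

class RuleEngine:
    def __init__(self):
        self.rules = {}  # gesture name -> list of actions to run

    def connect(self, gesture, *actions):
        """Bind one or more device actions to a gesture."""
        self.rules.setdefault(gesture, []).extend(actions)

    def trigger(self, gesture):
        """Run every action bound to the gesture, simultaneously from the user's view."""
        for action in self.rules.get(gesture, []):
            action()

lamp, heater, tv = Device("lamp"), Device("heater"), Device("tv")
engine = RuleEngine()
# One gesture switches off everything the user connected, leaving the TV on.
engine.connect("watch-tv", lamp.switch_off, heater.switch_off)
engine.trigger("watch-tv")
```

After `trigger("watch-tv")`, the lamp and heater are off while the TV stays on — the single-gesture, multiple-object behavior Kosuru describes.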
Ushering in the “Age of Computer Graphics”
With AR introducing a new wave of apps that allow users to animate, explore, and geotag images on mobile devices, it’s no surprise educators are also incorporating these technologies inside the classroom. “I’ll be teaching a computer graphics course this fall using an augmented reality ‘magic whiteboard,’ in which everything I draw on the board will turn into animated diagrams, simulations, and 3D visualizations. This new way of teaching is based on work coming out of the current research in our lab,” reveals Perlin.
Taking its augmented reality research one step further, the Games for Learning Institute is working to improve videoconferencing by using eye contact to signal subtle meaning and emphasis. Other current projects include a collaboration with the NYU Langone Medical Center to develop interactive augmented visualizations of the human heart for diagnostic assessment and pre-surgical planning.
To stay updated on the latest AR projects, news, and events from NYU students and faculty, visit the Mobile Augmented Reality Lab website.