Augmented reality
Augmented reality (AR) is a field of computer research which deals with the combination of real-world and computer-generated data. At present, most AR research is concerned with the use of live video imagery which is digitally processed and "augmented" by the addition of computer-generated graphics. Advanced research includes the use of motion-tracking data, fiducial marker recognition using machine vision, and the construction of controlled environments containing any number of sensors and actuators.
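Fiducial marker recognition, mentioned above, is a common building block of marker-based AR: each video frame is searched for a known printed pattern, and graphics are then drawn registered to the detected corners. The following is a minimal illustrative sketch, assuming OpenCV with its aruco contrib module (exact function names vary across OpenCV versions); it only outlines detected markers rather than rendering full 3D content.

import cv2

# Load a predefined marker dictionary and default detector parameters.
# (In OpenCV >= 4.7 the equivalents are cv2.aruco.getPredefinedDictionary and
# cv2.aruco.ArucoDetector; this sketch uses the older module-level functions.)
dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
parameters = cv2.aruco.DetectorParameters_create()

capture = cv2.VideoCapture(0)  # live video from the default camera
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Find marker corners and IDs in the current frame.
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=parameters)
    if ids is not None:
        # Outline the detected markers; a full AR system would instead render
        # computer-generated graphics registered to these corners.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("augmented view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()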
Definition
Ronald Azuma's definition of AR is one of the more focused descriptions. Although it covers only a subset of AR's original goal, it has come to be understood as representing the whole domain of AR: augmented reality is an environment that includes both virtual and real-world elements. For instance, an AR user might wear translucent goggles through which they could see the real world as well as computer-generated images projected on top of it. Azuma defines an augmented reality system as one that
- combines real and virtual,
- is interactive in real time, and
- is registered in three dimensions.
This definition is now commonly used in the research literature (Azuma, 1997).
History
To describe the history of Augmented Reality is also to describe humanity's long journey of adding to the natural world into which it was born.
- c. 15,000 BC: The Lascaux cave paintings present "virtual" images in a darkened cave, an early expression of the idea of enhancing the real world.
- 1849: Richard Wagner introduces the idea of immersive experiences, darkening the theatre and surrounding the audience with imagery and sound.
- 1938: Konrad Zuse completes the Z1, widely regarded as the first programmable computer.
- 1948: Norbert Wiener founds the science of cybernetics, the study of transmitting messages between man and machine.
- 1962: Morton Heilig, a cinematographer, creates a motorcycle simulator called Sensorama with visuals, sound, vibration, and smell.
- 1966: Ivan Sutherland invents the head-mounted display, describing it as a window into a virtual world.
- 1975: Myron Krueger creates Videoplace that allows users to interact with virtual objects for the first time.
- 1989: Jaron Lanier coins the phrase Virtual Reality and creates the first commercial business around virtual worlds.
- 1990: Tom Caudell coins the phrase Augmented Reality while at Boeing, helping workers assemble cables in aircraft.
AR as a transformative technology
For many of those interested in AR, one of its most important characteristics is the way in which it transforms the focus of interaction. The interactive system is no longer a precise location but the whole environment; interaction is no longer simply a face-to-screen exchange, but dissolves into the surrounding space and objects. Using an information system is no longer exclusively a conscious and intentional act.
Outdoor AR
A new and major area of current research is the use of AR outdoors: GPS and orientation sensors enable backpack computing systems to take AR outside the laboratory.
Early systems have been developed by Steven Feiner at Columbia University (MARS system) and Bruce H. Thomas and Wayne Piekarski in the Wearable Computer Lab[1] at the University of South Australia (Tinmith[2] and ARQuake systems).
Trimble Navigation, a provider of positioning solutions, has been researching Outdoor AR in collaboration with the Human Interface Technology Laboratory at its New Zealand R&D site in Christchurch. Local network news has reviewed its progress[3][4].
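As a rough illustration of the GPS-and-orientation approach described above, the sketch below computes where a distant point of interest should appear horizontally in the camera image, given the user's position, the target's position, and the device's compass heading. The function names, the 60° field of view, and the simple linear projection are illustrative assumptions, not taken from the systems mentioned here.

import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user (lat1, lon1) to the target."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def screen_x(user_lat, user_lon, target_lat, target_lon,
             heading_deg, fov_deg=60.0, width_px=640):
    """Horizontal pixel position of the target, or None if outside the field of view."""
    offset = (bearing_deg(user_lat, user_lon, target_lat, target_lon)
              - heading_deg + 180.0) % 360.0 - 180.0  # signed angle in [-180, 180)
    if abs(offset) > fov_deg / 2.0:
        return None
    return width_px / 2.0 + (offset / (fov_deg / 2.0)) * (width_px / 2.0)

# Example: a landmark slightly north-east of the user, camera facing due north;
# prints an x position in the right half of a 640-pixel-wide frame.
print(screen_x(43.0, -79.0, 43.01, -78.995, heading_deg=0.0))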
Mobile Augmented Reality
Mobile Augmented Reality, or "mobile AR", combines AR with mobile computing technology on mobile phones capable of online connections. When the phone's camera is pointed at an object bearing a recognized AR logo or shape, the logo or shape is replaced with 3D graphics while the rest of the real-world image remains the same. Research groups active in this area include the University of Canterbury and the Georgia Institute of Technology. As of 2008, the largest provider of mobile AR is Media Power Inc.
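The "logo or shape replaced with 3D graphics" step amounts to estimating the marker's 3D pose from its image corners and rendering content at that pose. Continuing the marker-detection sketch from the introduction, the fragment below estimates a single marker's pose and draws coordinate axes in its place; the camera matrix and marker size are placeholder assumptions (a real system would use calibrated intrinsics), and the aruco pose function shown belongs to the older OpenCV API.

import cv2
import numpy as np

# Placeholder camera intrinsics; a real system would use calibrated values.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
marker_side_m = 0.05  # assumed physical marker size: 5 cm

def augment(frame, corners, ids):
    """Replace each detected marker with simple 3D axes drawn at its pose."""
    if ids is None:
        return frame
    # Estimate each marker's rotation and translation relative to the camera
    # (newer OpenCV versions use cv2.solvePnP on the marker corners instead).
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_side_m, camera_matrix, dist_coeffs)
    for rvec, tvec in zip(rvecs, tvecs):
        # Stand-in for rendering a full 3D model at the marker's location.
        cv2.drawFrameAxes(frame, camera_matrix, dist_coeffs, rvec, tvec, marker_side_m)
    return frame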
Ubiquitous computing
AR has clear connections with the ubiquitous computing (UC) and wearable computing domains. Mark Weiser stated that "embodied virtuality", the original term he used before coining "ubiquitous computing", was intended to express the exact opposite of the concept of virtual reality (Mark Weiser's personal communication, Boston, March 1993). The most salient distinction between AR and UC is that UC does not focus on the disappearance of conscious and intentional interaction with an information system as much as AR does: UC systems such as pervasive computing devices usually maintain the notion of explicit and intentional interaction, which often blurs in typical AR work such as Ronald Azuma's. The theory of Humanistic Intelligence (HI), however, also challenges this semiotic notion of signifier and signified.[5] HI is intelligence that arises from the human being in the feedback loop of a computational process in which the human is inextricably intertwined, and it does not typically require conscious thought or effort. In this way HI, which arises from wearable computer-mediated reality, has a great deal in common with AR.
Notable researchers
- Steven Feiner is a leading pioneer of augmented reality and the author of one of the first papers on the subject.
- Bruce H. Thomas is the current Director of the Wearable Computer Laboratory at the University of South Australia. He is also a NICTA fellow, CTO of A-Rage Pty Ltd, a member of the HxI team, and a visiting scholar with the Human Interface Technology Laboratory at the University of Washington. He is the inventor of ARQuake, the first outdoor augmented reality game. His current research interests include wearable computers, user interfaces, augmented reality, virtual reality, computer-supported cooperative work (CSCW), and tabletop display interfaces.
- Wayne Piekarski is the inventor of the Tinmith System.
Examples
Commonly known examples of AR are the yellow "first down" line seen in television broadcasts of American football games, and the colored trail showing location and direction of the puck in TV broadcasts of hockey games. The real-world elements are the football field and players, and the virtual element is the yellow line, which is drawn over the image by computers in real time. Similarly, rugby fields and cricket pitches are branded by their sponsors using Augmented Reality; giant logos are inserted onto the fields when viewed on television.
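The first-down line works because the virtual line is composited only over pixels that match the field's color, so players and officials standing on the line still occlude it. Below is a minimal sketch of that color-keying idea, assuming OpenCV and NumPy; the green thresholds and the simple vertical line are illustrative simplifications (broadcast systems also track the camera's pan, tilt, and zoom to keep the line registered to the field).

import cv2
import numpy as np

def draw_first_down_line(frame_bgr, x_position, color=(0, 255, 255), width=6):
    """Composite a vertical yellow line at x_position, keyed to grass-colored pixels."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough mask of green (grass) pixels; thresholds are illustrative assumptions.
    grass = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
    out = frame_bgr.copy()
    band = slice(x_position, x_position + width)
    # Paint the line only where the mask says "grass", leaving players untouched.
    column_mask = grass[:, band] > 0
    out[:, band][column_mask] = color
    return out

# Usage (hypothetical file names):
# frame = cv2.imread("field.png")
# cv2.imwrite("out.png", draw_first_down_line(frame, 300))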
Another type of AR application uses projectors and screens to insert objects into the real environment, for example to enhance museum exhibitions. The difference from a simple TV screen is that these objects are related to the environment of the screen or display, and that they are often interactive as well.
Many first-person shooter video games simulate the viewpoint of someone using AR systems. In these games the AR can be used to give visual directions to a location, mark the direction and distance of another person who is not in line of sight, give information about equipment such as remaining bullets in a gun, and display a myriad of other images based on whatever the game designers intend.
Most of the possible applications of AR will, however, need personal display glasses[citation needed]. In some current applications, such as in cars or airplanes, the display is instead a head-up display integrated into the windshield.
Current applications
- Support with complex tasks, in assembly, maintenance, surgery etc.:
- by inserting additional information into the field of view (for example, a mechanic seeing labels displayed on the parts of a system, along with operating instructions)
- by visualization of hidden objects (during medical diagnostics or surgery as a virtual X-ray view, based on prior tomography or on real time images from ultrasound or open NMR devices, e.g., a doctor could "see" the fetus inside the mother's womb). See also Mixed Reality
- Navigation devices
- in buildings, e.g. maintenance of industrial plants
- outdoors, e.g. military operations or disaster management
- in cars (head-up displays or personal display glasses showing navigation hints and traffic information)
- in airplanes (head-up displays in fighter jets were among the first AR applications; they are by now fully interactive as well, with eye pointing)
- Military and emergency services (wearable systems, showing instructions, maps, enemy locations, fire cells etc.)
- Prospecting in hydrology, ecology, geology (display and interactive analysis of terrain characteristics, interactive three-dimensional maps that could be collaboratively modified and analyzed)
- Visualization of architecture (virtual resurrection of destroyed historic buildings as well as simulation of planned construction projects)
- Enhanced sightseeing: labels or other text related to the objects or places seen, rebuilt ruins, buildings, or even landscapes as they appeared in the past. Combined with a wireless network, the amount of data that can be displayed is virtually limitless (encyclopedic articles, news, etc.).
- Simulation, e.g. flight and driving simulators
- Collaboration of distributed teams
- conferences with real and virtual participants. See also Mixed Reality
- joint work at simulated 3D models
- Entertainment and education
- virtual objects in museums and exhibitions. See also Mixed Reality
- theme park attractions (such as Cadbury World)
- games (e.g. ARQuake or The Eye of Judgment). See also Mixed Reality
Future applications
- Expanding a PC screen into the real environment: program windows and icons appear as virtual devices in real space and are operated by eye or gesture, i.e., by gazing or pointing. A single personal display (glasses) could concurrently simulate a hundred conventional PC screens or application windows all around a user.
- Virtual devices of all kinds, e.g. replacement of traditional screens, control panels, and entirely new applications impossible in "real" hardware, like 3D objects interactively changing their shape and appearance based on the current task or need.
- Enhanced media applications, like pseudo holographic virtual screens, virtual surround cinema, virtual 'holodecks' (allowing computer-generated imagery to interact with live entertainers and audience)
- Virtual conferences in "holodeck" style
- Replacement of cellphone and car navigator screens: eye-dialing, insertion of information directly into the environment, e.g. guiding lines directly on the road, as well as enhancements like "X-ray"-views
- Virtual plants, wallpapers, panoramic views, artwork, decorations, illumination etc., enhancing everyday life. For example, a virtual window could be displayed on a regular wall, showing a live feed from a camera placed on the exterior of the building, thus allowing the user to effectively toggle a wall's transparency.
- As AR systems reach the mass market, we may see virtual window dressings, posters, traffic signs, Christmas decorations, advertisement towers and more. These may be fully interactive even at a distance, by eye pointing for example.
- Virtual gadgetry becomes possible. Any physical device currently produced to assist in data-oriented tasks (such as the clock, radio, PC, arrival/departure board at an airport, stock ticker, PDA, PMP, informational posters/fliers/billboards, in-car navigation systems, etc.) could be replaced by virtual devices that cost nothing to produce aside from the cost of writing the software. Examples might be a virtual wall clock, or a to-do list for the day docked by your bed for you to look at first thing in the morning.
- Subscribable group-specific AR feeds. For example, a manager on a construction site could create and dock instructions including diagrams in specific locations on the site. The workers could refer to this feed of AR items as they work. Another example could be patrons at a public event subscribing to a feed of direction and information oriented AR items.
Specific applications
- LifeClipper, a wearable AR system
- Characteroke, a portable AR display costume, whereby the head and neck are concealed behind an active flat panel display.
- MARISIL, a media phone user interface based on AR
- CyberCode, a visual tagging system where real-world objects are recognizable by a computer.
Popular culture
Pop group Duran Duran incorporated interactive AR projections into their stage show during their 2000 Pop Trash concert tour.[6]
Anime
The television series Dennō Coil depicts a near-future where children use AR goggles to enhance their environment with games and virtual pets. Ghost in the Shell 2: Innocence gives several examples of augmented reality in use, while Gundam, Gunbuster, Neon Genesis Evangelion, Voices of a Distant Star and Martian Successor Nadesico amongst several others depict 360° augmented reality cockpits that are used to display information. In Serial Experiments Lain, The Wired is overlaid onto the real world via electromagnetic radiation relaying information directly to people's brains, causing people to experience both The Wired and the real world.
Science fiction
In the Star Trek universe, the Jem'Hadar use a form of augmented display that integrates with a starship's main sensors, allowing them to view the real world together with what lies outside the ship.
The television series Firefly depicts numerous AR applications, including a real-time medical scanner which allows a doctor to use his hands to manipulate a detailed and labeled projection of a patient's brain.
The tabletop role-playing game Shadowrun introduced AR into its game world; most of the characters in the game use viewing devices to interact with the AR world much of the time.
The books Halting State by Charles Stross and Rainbows End by Vernor Vinge include augmented reality primarily in the form of virtual overlays over the real world. Halting State mentions Copspace, which is used by cops, and the use by gamers to overlay their characters onto themselves during a gaming convention. Rainbows End mentions outdoor overlays based on popular fictional universes from H. P. Lovecraft and Terry Pratchett among others.
The term "Geohacking" has been coined by William Gibson in his book Spook Country, where artists use a combination of GPS and 3D graphics technology to embed rendered meshes in real world landscapes.
Conferences
- 1st International Workshop on Augmented Reality (IWAR'98), San Francisco, Nov. 1998.
- 2nd International Workshop on Augmented Reality (IWAR'99), San Francisco, Oct. 1999.
- 1st International Symposium on Mixed Reality (ISMR'99), Yokohama, Japan, March 1999.
- 2nd International Symposium on Mixed Reality (ISMR'01), Yokohama, Japan, March 2001.
- 1st International Symposium on Augmented Reality (ISAR 2000), Munich, Oct. 2000.
- 2nd International Symposium on Augmented Reality (ISAR 2001), New York, Oct. 2001.
- 1st International Symposium on Mixed and Augmented Reality (ISMAR 2002), Darmstadt, Oct. 2002.
- 2nd International Symposium on Mixed and Augmented Reality (ISMAR 2003), Tokyo, Oct. 2003.
- 3rd International Symposium on Mixed and Augmented Reality (ISMAR 2004), Arlington, VA, Nov. 2004.
- 4th International Symposium on Mixed and Augmented Reality (ISMAR 2005), Vienna, Oct. 2005.
- 5th International Symposium on Mixed and Augmented Reality (ISMAR 2006) Santa Barbara, Oct. 2006.
- 6th International Symposium on Mixed and Augmented Reality (ISMAR 2007) Nara, Japan, Nov. 2007.
See also
- Alternate reality game
- Augmented virtuality
- Camera resectioning
- Cyborg
- Mixed reality
- Multimedia Esperanto
- Simulated Reality
- Virtual retinal display
- Virtuality Continuum
- Virtual reality
References
- ^ Wearable Computer Lab, University of South Australia
- ^ Tinmith
- ^ Trimble AR demonstration on YouTube
- ^ Human Interface Technology Laboratory
- ^ Mann, Steve. "Intelligence: WearComp as a new framework for Intelligent Signal Processing", Proceedings of the IEEE, Vol. 86, No. 11, November, 1998.
- ^ Pair, J., Wilson, J., Chastine, J., Gandy, M. "The Duran Duran Project: The Augmented Reality Toolkit in Live Performance". The First IEEE International Augmented Reality Toolkit Workshop, 2002. (photos and video)
- Azuma, Ronald T. "A Survey of Augmented Reality". Presence: Teleoperators and Virtual Environments 6, 4 (August 1997), 355–385.
- Barfield, W., and T. Caudell, eds. Fundamentals of Wearable Computers and Augmented Reality. Mahwah, NJ: Lawrence Erlbaum, 2001. ISBN 0805829016.
- Bimber, Oliver, and Ramesh Raskar. Spatial Augmented Reality: Merging Real and Virtual Worlds. A K Peters, 2005. ISBN 1568812302.
- Feiner, S. K. "Augmented Reality: A New Way of Seeing: Computer scientists are developing systems that can enhance and enrich a user's view of the world". Scientific American, April 2002.
- Hainich, Rolf R. "The end of Hardware : A Novel Approach to Augmented Reality" (2nd ed.). Booksurge, 2006. ISBN 1419652184.
- Haller, Michael, Mark Billinghurst and Bruce Thomas. Emerging Technologies of Augmented Reality: Interfaces and Design. Idea Group Publishing, 2006. ISBN 1599040662.
- Raskar, Ramesh. "Spatially Augmented Reality", First International Workshop on Augmented Reality, Sept 1998.
- Starner, T., Mann, S., Rhodes, B., Levine, J., Healey, J., Kirsch, D., Picard, R., and Pentland, A. "Augmented Reality Through Wearable Computing". Presence: Teleoperators and Virtual Environments 6, 4 (August 1997), 386–398.
- Wellner, P., Mackay, W. & Gold, R. Eds. "Special issue on computer augmented environments: back to the real world". Communications of the ACM, Volume 36, Issue 7 (July 1993).
External links
- Blog and Projects on Augmented Reality, Germany
- Wearable Computer Lab, South Australia
- HITLab, Seattle
- HITLab NZ, Christchurch New Zealand
- TU Munich
- Studierstube, Graz University of Technology, Vienna
- Columbia University Computer Graphics and User Interfaces Lab
- Projet Lagadic IRISA-INRIA Rennes
- HowStuffWorks: How Augmented Reality Will Work
- Jim Vallino's AR Resources Page
- Mixed Reality: Augmented Reality, Augmented Virtuality, Virtual Reality. Kolsouzoglou Anthony's Research Site
- Augmented Reality (Professor Zorzal's Website), Brazil
- Game Daily - Mobile Augmented Reality