{{short description|View of the real world with computer-generated supplementary features}}
{{distinguish|Virtual reality|Alternate reality (disambiguation){{!}}Alternate reality}}
{{Use dmy dates|date=October 2019}}
[[File:Virtual-Fixtures-USAF-AR.jpg|thumb|alt= Photograph of the first AR system |[[Virtual Fixture]]s – first AR system, U.S. Air Force, [[Wright-Patterson Air Force Base]] (1992)]]


'''Augmented reality''' ('''AR''') is an interactive experience that combines the real world and computer-generated 3D content. The content can span multiple sensory [[Modality (human–computer interaction)|modalities]], including [[visual]], [[Hearing|auditory]], [[haptic perception|haptic]], [[Somatosensory system|somatosensory]] and [[olfactory]].<ref>{{cite journal | last1=Cipresso | first1=Pietro | last2=Giglioli | first2=Irene Alice Chicchi | last3=Alcañiz Raya | first3=Mariano | last4=Riva | first4=Giuseppe | title=The Past, Present, and Future of Virtual and Augmented Reality Research: A Network and Cluster Analysis of the Literature | journal=Frontiers in Psychology | volume=9 | date=2018 | pmid=30459681 | doi=10.3389/fpsyg.2018.02086 | page=2086| pmc=6232426 | doi-access=free }}</ref> AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects.<ref>{{cite journal |last1=Wu |first1=Hsin-Kai |last2=Lee |first2=Silvia Wen-Yu |last3=Chang |first3=Hsin-Yi |last4=Liang |first4=Jyh-Chong |title=Current status, opportunities and challenges of augmented reality in education |journal=Computers & Education |date=March 2013 |volume=62 |pages=41–49 |doi=10.1016/j.compedu.2012.10.024 |s2cid=15218665 }}</ref> The overlaid sensory information can be constructive (i.e. additive to the natural environment) or destructive (i.e. masking of the natural environment).<ref name="B. Rosenberg 1992">{{cite web |last1=Rosenberg |first1=Louis B. |title=The Use of Virtual Fixtures as Perceptual Overlays to Enhance Operator Performance in Remote Environments. |date=1992 |url=https://apps.dtic.mil/docs/citations/ADA292450 |archive-url=https://web.archive.org/web/20190710211431/https://apps.dtic.mil/docs/citations/ADA292450 |url-status=live |archive-date=10 July 2019 }}</ref> As such, it is one of the key technologies in the [[Reality–virtuality continuum|reality-virtuality continuum]].<ref>{{Cite journal |last1=Milgram |first1=Paul |last2=Takemura |first2=Haruo |last3=Utsumi |first3=Akira |last4=Kishino |first4=Fumio |date=1995-12-21 |title=Augmented reality: a class of displays on the reality-virtuality continuum |url=https://www.spiedigitallibrary.org/conference-proceedings-of-spie/2351/0000/Augmented-reality--a-class-of-displays-on-the-reality/10.1117/12.197321.full |journal=Telemanipulator and Telepresence Technologies |publisher=SPIE |volume=2351 |pages=282–292 |doi=10.1117/12.197321|bibcode=1995SPIE.2351..282M }}</ref>


This experience is seamlessly interwoven with the physical world such that it is perceived as an [[immersion (virtual reality)|immersive]] aspect of the real environment.<ref name="B. Rosenberg 1992" /> In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas [[virtual reality]] completely replaces the user's real-world environment with a simulated one.<ref>Steuer,{{Cite web |url=https://filtermaker.fr/en/augmented-reality/ |title=Defining virtual reality: Dimensions Determining Telepresence |access-date=27 November 2018 |archive-url=https://web.archive.org/web/20220717120913/https://filtermaker.fr/en/augmented-reality/ |archive-date=17 July 2022 |url-status=dead |df=dmy-all }}, Department of Communication, Stanford University. 15 October 1993.</ref><ref>[http://archive.ncsa.illinois.edu/Cyberia/VETopLevels/VR.Overview.html Introducing Virtual Environments] {{Webarchive|url=https://web.archive.org/web/20160421000159/http://archive.ncsa.illinois.edu/Cyberia/VETopLevels/VR.Overview.html |date=21 April 2016 }} National Center for Supercomputing Applications, University of Illinois.</ref>


Augmented reality is largely synonymous with [[mixed reality]]. There is also overlap in terminology with [[extended reality]] and [[computer-mediated reality]].


The primary value of augmented reality is the manner in which components of the digital world blend into a person's perception of the real world, not as a simple display of data, but through the integration of immersive sensations, which are perceived as natural parts of an environment. The earliest functional AR systems that provided immersive mixed reality experiences for users were invented in the early 1990s, starting with the [[Virtual Fixtures]] system developed at the U.S. Air Force's [[Armstrong Laboratory]] in 1992.<ref name="B. Rosenberg 1992"/><ref>{{cite book |doi=10.1109/VRAIS.1993.380795 |chapter=Virtual fixtures: Perceptual tools for telerobotic manipulation |title=Proceedings of IEEE virtual reality Annual International Symposium |pages=76–82 |year=1993 |last1=Rosenberg |first1=L.B. |s2cid=9856738 |isbn=0-7803-1363-1 }}</ref><ref name="Dupzyk 2016">{{Cite news|url=http://www.popularmechanics.com/technology/a22384/hololens-ar-breakthrough-awards/|title=I Saw the Future Through Microsoft's Hololens|last=Dupzyk|first=Kevin|work=Popular Mechanics|date = 6 September 2016}}</ref> [[Commercial augmented reality]] experiences were first introduced in entertainment and gaming businesses.<ref>{{Citation|last=|first=|title=Augmented Reality: Reflections at Thirty Years|url=https://link.springer.com/10.1007/978-3-030-89906-6_1|work=Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1|series=Lecture Notes in Networks and Systems|year=2022|volume=358|pages=1–11|editor-last=Arai|editor-first=Kohei|place=Cham|publisher=Springer International Publishing|language=en|doi=10.1007/978-3-030-89906-6_1|isbn=978-3-030-89905-9|s2cid=239881216|access-date=}}</ref> Subsequently, augmented reality applications have spanned commercial industries such as education, communications, medicine, and entertainment. In education, content may be accessed by scanning or viewing an image with a mobile device or by using markerless AR techniques.<ref>{{Cite journal|last1=Moro|first1=Christian|last2=Birt|first2=James|last3=Stromberga|first3=Zane|last4=Phelps|first4=Charlotte|last5=Clark|first5=Justin|last6=Glasziou|first6=Paul|last7=Scott|first7=Anna Mae|date=2021|title=Virtual and Augmented Reality Enhancements to Medical and Science Student Physiology and Anatomy Test Performance: A Systematic Review and Meta-Analysis|url=https://onlinelibrary.wiley.com/doi/10.1002/ase.2049|journal=Anatomical Sciences Education|language=en|volume=14|issue=3|pages=368–376|doi=10.1002/ase.2049|pmid=33378557|s2cid=229929326|issn=1935-9772}}</ref><ref>{{Cite web | url=https://www.edsurge.com/news/2015-11-02-how-to-transform-your-classroom-with-augmented-reality | title=How to Transform Your Classroom with Augmented Reality - EdSurge News| date=2 November 2015}}</ref><ref>{{Cite web|url=https://medium.com/ancient-eu/why-we-need-more-tech-in-history-education-805fa10a7251|title=Why We Need More Tech in History Education|last=Crabben|first=Jan van der|date=16 October 2018|website=ancient.eu|access-date=2018-10-23|archive-date=23 October 2018|archive-url=https://web.archive.org/web/20181023195947/https://medium.com/ancient-eu/why-we-need-more-tech-in-history-education-805fa10a7251|url-status=dead}}</ref>


Augmented reality can be used to enhance natural environments or situations and offers perceptually enriched experiences. With the help of advanced AR technologies (e.g. adding [[computer vision]], incorporating AR cameras into smartphone applications, and [[object recognition]]) the information about the surrounding real world of the user becomes [[interactive]] and digitally manipulable.<ref>{{Cite journal |url=https://doi.org/10.1007/s11831-022-09831-7/ |title=Augmented Reality: A Comprehensive Review|last1=Dargan|first1=Shaveta|last2=Bansal|first2=Shally|last3=Mittal|first3=Ajay|last4=Kumar|first4=Krishan|date=2023 | journal=Archives of Computational Methods in Engineering |volume=30 |issue=2 |pages=1057–1080 |doi=10.1007/s11831-022-09831-7 |access-date=27 February 2024}}</ref> Information about the environment and its objects is overlaid on the real world. This overlaid information can be virtual, i.e. artificial content that adds to the already existing reality,<ref>{{Cite journal |url=https://codegres.com/augmented-reality/ |title=What is Augmented Reality |last=Hegde |first=Naveen |date=19 March 2023 | journal=Codegres |access-date=19 March 2023}}</ref><ref>{{Cite magazine |url=https://www.wired.com/2009/08/augmented-reality/ |title=If You're Not Seeing Data, You're Not Seeing |last=Chen |first=Brian |date=25 August 2009 |magazine=Wired |access-date=18 June 2019}}</ref><ref>{{Cite web |url=http://www.macmillandictionary.com/buzzword/entries/augmented-reality.html |title=Augmented Reality |last=Maxwell |first=Kerry |website=macmillandictionary.com |access-date=18 June 2019}}</ref><ref>{{Cite web |url=http://www.augmentedrealityon.com/ |title=Augmented Reality (AR) |website=augmentedrealityon.com |archive-url=https://web.archive.org/web/20120405071414/http://www.augmentedrealityon.com/ |archive-date=5 April 2012 |url-status=dead |access-date=18 June 2019}}</ref><ref name="Azuma_survey">{{cite journal |last=Azuma |first=Ronald |author-link=Ronald Azuma |date=August 1997 |title=A Survey of Augmented Reality |url=http://www.cs.unc.edu/~azuma/ARpresence.pdf |access-date=2 June 2021 |journal=Presence: Teleoperators and Virtual Environments |publisher=MIT Press |volume=6 |issue=4 |pages=355–385 |doi=10.1162/pres.1997.6.4.355|s2cid=469744 }}</ref> or real, e.g. seeing other real sensed or measured information such as electromagnetic radio waves overlaid in exact alignment with where they actually are in space.<ref>{{Cite web|url=http://wearcam.org/PhenomenalAugmentedReality.pdf|title=Phenomenal Augmented Reality, IEEE Consumer Electronics, Volume 4, No. 4, October 2015, cover+pp92-97}}</ref><ref>Time-frequency perspectives, with applications, in Advances in Machine Vision, Strategies and Applications, World Scientific Series in Computer Science: Volume 32, C Archibald and Emil Petriu, Cover + pp&nbsp;99–128, 1992.</ref><ref>{{Cite book|last1=Mann|first1=Steve|last2=Feiner|first2=Steve|last3=Harner|first3=Soren|last4=Ali|first4=Mir Adnan|last5=Janzen|first5=Ryan|last6=Hansen|first6=Jayse|last7=Baldassi|first7=Stefano|s2cid=12247969|date=15 January 2015|publisher=ACM|pages=497–500|doi=10.1145/2677199.2683590|isbn=9781450333054|chapter=Wearable Computing, 3D Aug* Reality, Photographic/Videographic Gesture Sensing, and Veillance|title=Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction - TEI '14}}</ref> Augmented reality also has considerable potential in the gathering and sharing of tacit knowledge.
Augmentation techniques are typically performed in real time and in semantic [[context awareness|contexts]] with environmental elements. Immersive perceptual information is sometimes combined with supplemental information like scores over a live video feed of a sporting event. This combines the benefits of both augmented reality technology and [[heads up display]] technology (HUD).
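
The alignment between overlaid content and the physical scene rests on camera registration: a virtual object anchored at a known position in the world is projected into the live camera image using the camera's pose and intrinsic parameters. The following simplified sketch (in Python, using a basic pinhole camera model with hypothetical parameter values, and not tied to any particular AR toolkit) illustrates this projection step.

<syntaxhighlight lang="python">
# Illustrative sketch (not from any specific AR library): projecting a
# world-anchored virtual point into a camera image, the core of the
# "3D registration" step that keeps overlaid content aligned with reality.
import numpy as np

def project_point(p_world, R_wc, t_wc, fx, fy, cx, cy):
    """Project a 3D point (metres, world frame) to pixel coordinates.

    R_wc and t_wc describe the camera pose: x_cam = R_wc @ (p_world - t_wc),
    i.e. t_wc is the camera position in the world and R_wc rotates world
    axes into camera axes. fx, fy, cx, cy are pinhole intrinsics (pixels).
    """
    p_cam = R_wc @ (np.asarray(p_world, dtype=float) - np.asarray(t_wc, dtype=float))
    if p_cam[2] <= 0:            # behind the camera: nothing to draw
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return float(u), float(v)

# Hypothetical numbers: a virtual label anchored 2 m in front of the user.
R = np.eye(3)                    # camera looking straight down the world z-axis
t = np.zeros(3)                  # camera at the world origin
print(project_point([0.3, -0.1, 2.0], R, t, fx=800, fy=800, cx=320, cy=240))
</syntaxhighlight>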


{{toclimit|3}}


==Comparison with virtual reality==
In [[virtual reality]] (VR), the users' perception is completely computer-generated, whereas with augmented reality (AR), it is partially generated and partially from the real world.<ref>{{Cite journal|last1=Carmigniani|first1=Julie|last2=Furht|first2=Borko|last3=Anisetti|first3=Marco|last4=Ceravolo|first4=Paolo|last5=Damiani|first5=Ernesto|last6=Ivkovic|first6=Misa|s2cid=4325516|date=1 January 2011|title=Augmented reality technologies, systems and applications|journal=Multimedia Tools and Applications|language=en|volume=51|issue=1|pages=341–377|doi=10.1007/s11042-010-0660-6|issn=1573-7721}}</ref><ref>{{Cite book|title=Virtual, Augmented Reality and Serious Games for Healthcare 1|last1=Ma|first1=Minhua|last2=C. Jain|first2=Lakhmi|last3=Anderson|first3=Paul|publisher=[[Springer Publishing]]|year=2014|isbn=978-3-642-54816-1|pages=120}}</ref> For example, in architecture, VR can be used to create a walk-through simulation of the inside of a new building, while AR can be used to show a building's structures and systems superimposed on a real-life view. Another example involves utility applications. Some AR applications, such as [[Augment (app)|Augment]], enable users to apply digital objects into real environments, allowing businesses to use augmented reality devices as a way to preview their products in the real world.<ref>{{Cite web|url=https://www.pcmag.com/news/augment-is-bringing-the-ar-revolution-to-business|title=Augment Is Bringing the AR Revolution to Business|last1=Marvin|first1=Rob|date=16 August 2016|website=PC Mag|language=en|access-date=2021-02-23}}</ref> Similarly, it can also be used to demo what products may look like in an environment for customers, as demonstrated by companies such as [[Mountain Equipment Co-op]] or [[Lowe's]] who use augmented reality to allow customers to preview what their products might look like at home through the use of 3D models.<ref>{{Cite web|url=https://archpaper.com/2019/08/retail-is-getting-reimagined-with-augmented-reality/|title=Retail is getting reimagined with augmented reality|last=Stamp|first=Jimmy|date=30 August 2019|website=The Architect's Newspaper|url-status=live|archive-url=https://web.archive.org/web/20191115233539/https://archpaper.com/2019/08/retail-is-getting-reimagined-with-augmented-reality/|archive-date=15 November 2019}}</ref>


Augmented reality (AR) differs from virtual reality (VR) in the sense that in AR part of the surrounding environment is 'real' and AR is just adding layers of virtual objects to the real environment. On the other hand, in VR the surrounding environment is completely virtual and computer generated. A demonstration of how AR layers objects onto the real world can be seen with augmented reality games. [[WallaMe]] is an augmented reality game application that allows users to hide messages in real environments, utilizing geolocation technology in order to enable users to hide messages wherever they may wish in the world.<ref>{{Cite web|url=https://www.techradar.com/news/the-future-is-virtual-why-ar-and-vr-will-live-in-the-cloud|title=The future is virtual - why AR and VR will live in the cloud|last=Mahmood|first=Ajmal|website=TechRadar|date=12 April 2019|language=en|access-date=2019-12-12}}</ref> Such applications have many uses in the world, including in activism and artistic expression.<ref>{{Cite web|url=https://www.vrfocus.com/2018/02/mural-artists-use-augmented-reality-to-highlight-effects-of-climate-change/|title=Mural Artists Use Augmented Reality To Highlight Effects Of Climate Change|last=Aubrey|first=Dave|website=VRFocus|language=en-US|access-date=2019-12-12}}</ref>
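
Location-based AR applications of this kind typically anchor content to geographic coordinates and reveal it only when the viewer is nearby. The following sketch (in Python; the radius, coordinates and function names are hypothetical, and WallaMe's actual implementation is not public) shows the kind of proximity test such an app might perform.

<syntaxhighlight lang="python">
# Illustrative sketch: deciding whether a geolocation-anchored AR message is
# close enough to the user to be revealed. Names and thresholds are
# hypothetical; real apps combine this with compass heading and map data.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def message_visible(user, message, radius_m=30.0):
    """Reveal the hidden message only when the user is within radius_m."""
    return haversine_m(user[0], user[1], message[0], message[1]) <= radius_m

# Roughly 43 m apart, so outside the 30 m radius -> False
print(message_visible((48.8584, 2.2945), (48.8586, 2.2950)))
</syntaxhighlight>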


== History ==
* 1901: [[L. Frank Baum]], an author, first mentions the idea of an electronic display/spectacles that overlays data onto real life (in this case 'people'). It is named a 'character marker'.<ref>Johnson, Joel. [https://web.archive.org/web/20130522153011/http://moteandbeam.net/the-master-key-l-frank-baum-envisions-ar-glasses-in-1901 "The Master Key": L. Frank Baum envisions augmented reality glasses in 1901] ''Mote & Beam'' 10 September 2012.</ref>
* 1957–62: [[Morton Heilig]], a cinematographer, creates and patents a simulator called [[Sensorama]] with visuals, sound, vibration, and smell.
* 1968: [[Ivan Sutherland]] creates the first [[head-mounted display]] that has graphics rendered by a computer.<ref>{{cite book |doi=10.1145/1476589.1476686 |chapter=A head-mounted three dimensional display |title=Proceedings of the December 9-11, 1968, fall joint computer conference, part I on - AFIPS '68 (Fall, part I) |pages=757 |year=1968 |last1=Sutherland |first1=Ivan E. |s2cid=4561103 }}</ref>
* 1975: [[Myron Krueger]] creates [[Videoplace]] to allow users to interact with virtual objects.
* 1980: The research by Gavan Lintern of the University of Illinois is the first published work to show the value of a [[Head-up display|heads up display]] for teaching real-world flight skills.<ref name="Lintern-1980"/>
* 1980: [[Steve Mann (inventor)|Steve Mann]] creates the first wearable computer, a computer vision system with text and graphical overlays on a photographically mediated scene.<ref>{{cite news|last=Mann |first=Steve |url=https://techland.time.com/2012/11/02/eye-am-a-camera-surveillance-and-sousveillance-in-the-glassage/ |title=Eye Am a Camera: Surveillance and Sousveillance in the Glassage |publisher=[[Time (magazine)|Time]] |date=2 November 2012 |access-date=14 October 2013}}</ref>
* 1986: Within IBM, Ron Feigenblatt describes the most widely experienced form of AR today (viz. "magic window," e.g. [[smartphone]]-based [[Pokémon Go]]), use of a small, "smart" flat panel display positioned and oriented by hand.<ref>{{cite web|url=https://priorart.ip.com/IPCOM/000040923 |title=Absolute Display Window Mouse/Mice |access-date=19 October 2020 |url-status=live |archive-url=https://web.archive.org/web/20191106031325/https://priorart.ip.com/IPCOM/000040923 |archive-date=6 November 2019 |df=dmy }} (context & abstract only) ''[[IBM Technical Disclosure Bulletin]]'' 1 March 1987</ref><ref>
{{cite web|url=https://priorart.ip.com/IPCOM/000040923 |title=Absolute Display Window Mouse/Mice |access-date=19 October 2020 |url-status=live |archive-url=https://web.archive.org/web/20201019143932/https://priorart.ip.com/first-page/IPCOM000040923D |archive-date=19 October 2020 |df=dmy }} (image of anonymous printed article) ''[[IBM Technical Disclosure Bulletin]]'' 1 March 1987</ref>
* 1987: Douglas George and Robert Morris create a working prototype of an astronomical telescope-based "[[Head-up display|heads-up display]]" system (a precursor concept to augmented reality) which superimposed in the telescope eyepiece, over the actual sky images, multi-intensity star, and celestial body images, and other relevant information.<ref>{{cite journal |title=A computer-driven astronomical telescope guidance and control system with superimposed star field and celestial coordinate graphics display |journal=Journal of the Royal Astronomical Society of Canada |volume=83 |pages=32 |bibcode=1989JRASC..83...32G |last1=George |first1=Douglas B. |last2=Morris |first2=L. Robert |year=1989 }}</ref>
* 1990: The term ''augmented reality'' is attributed to Thomas P. Caudell, a former [[Boeing]] researcher.<ref>{{cite journal |last1=Lee |first1=Kangdon |s2cid=40826055 |title=Augmented Reality in Education and Training |journal=TechTrends |date=7 February 2012 |volume=56 |issue=2 |pages=13–21 |doi=10.1007/s11528-012-0559-3 }}</ref>
* 1992: [[Louis B. Rosenberg|Louis Rosenberg]] developed one of the first functioning AR systems, called [[Virtual fixture|Virtual Fixtures]], at the United States Air Force Research Laboratory—Armstrong, that demonstrated benefit to human perception.<ref>Louis B. Rosenberg. "The Use of [[Virtual fixture|Virtual Fixtures]] As Perceptual Overlays to Enhance Operator Performance in Remote Environments." Technical Report AL-TR-0089, USAF Armstrong Laboratory (AFRL), Wright-Patterson AFB OH, 1992.</ref>
* 1992: [[Steven K. Feiner|Steven Feiner]], [[Blair MacIntyre]] and Doree Seligmann present an early paper on an AR system prototype, KARMA, at the Graphics Interface conference.
* 1993: The [[CMOS]] [[active-pixel sensor]], a type of [[metal–oxide–semiconductor]] (MOS) [[image sensor]], was developed at [[NASA]]'s [[Jet Propulsion Laboratory]].<ref>Eric R. Fossum (1993), "Active Pixel Sensors: Are CCD's Dinosaurs?" Proc. SPIE Vol. 1900, p. 2–14, ''Charge-Coupled Devices and Solid State Optical Sensors III'', Morley M. Blouke; Ed.</ref> CMOS sensors are later widely used for optical tracking in AR technology.<ref>{{cite book |last1=Schmalstieg |first1=Dieter |last2=Hollerer |first2=Tobias |title=Augmented Reality: Principles and Practice |date=2016 |publisher=[[Addison-Wesley Professional]] |isbn=978-0-13-315320-0 |pages=209–10 |url=https://books.google.com/books?id=qPU2DAAAQBAJ&pg=PT209}}</ref>
* 1993: Mike Abernathy, et al., report the first use of augmented reality in identifying space debris using [[Rockwell Collins|Rockwell]] WorldView by overlaying satellite geographic trajectories on live telescope video.<ref name = "ABER93"/>
* 1993: A widely cited version of the paper above is published in [[Communications of the ACM]] – Special issue on computer augmented environments, edited by Pierre Wellner, Wendy Mackay, and Rich Gold.<ref>{{cite journal |last1=Wellner |first1=Pierre |last2=Mackay |first2=Wendy |last3=Gold |first3=Rich |s2cid=21169183 |title=Back to the real world |journal=Communications of the ACM |date=1 July 1993 |volume=36 |issue=7 |pages=24–27 |doi=10.1145/159544.159555 |doi-access=free }}</ref>
* 1993: [[Loral Corporation|Loral WDL]], with sponsorship from [[United States Army Simulation and Training Technology Center|STRICOM]], performed the first demonstration combining live AR-equipped vehicles and manned simulators. Unpublished paper, J. Barrilleaux, "Experiences and Observations in Applying Augmented Reality to Live Training", 1999.<ref>Barrilleaux, Jon. [[:File:Experiences and Observations in Applying Augmented Reality to Live Training.pdf|Experiences and Observations in Applying Augmented Reality to Live Training]].</ref>
* 1994: Julie Martin creates the first augmented reality theater production, ''Dancing in Cyberspace''. Funded by the [[Australia Council for the Arts]], it features dancers and [[acrobatics|acrobats]] manipulating body-sized virtual objects in real time, projected into the same physical space and performance plane. The acrobats appeared immersed within the virtual objects and environments. The installation used [[Silicon Graphics]] computers and a Polhemus sensing system.
* 1996: General Electric develops a system for projecting information from 3D CAD models onto real-world instances of those models.<ref>{{Cite web|title=US Patent for Projection of images of computer models in three dimensional space Patent (Patent # 5,687,305 issued November 11, 1997) - Justia Patents Search|url=https://patents.justia.com/patent/5687305|access-date=2021-10-17|website=patents.justia.com}}</ref>
* 1998: Spatial augmented reality introduced at [[University of North Carolina]] at Chapel Hill by [[Ramesh Raskar]], Greg Welch, [[Henry Fuchs]].<ref name="raskarSAR" />
* 1999: Frank Delgado, Mike Abernathy et al. report a successful flight test of LandForm software video map overlay from a helicopter at the Army Yuma Proving Ground, overlaying video with runways, taxiways, roads and road names.<ref name="DELG99" /><ref name = "DELG00" />
* 1999: The [[United States Naval Research Laboratory|US Naval Research Laboratory]] begins a decade-long research program called the Battlefield Augmented Reality System (BARS) to prototype some of the early wearable systems for dismounted soldiers operating in urban environments, for situation awareness and training.<ref>{{Cite web|url=https://www.nrl.navy.mil/itd/imda/research/5581/augmented-reality/|title=Information Technology|website=www.nrl.navy.mil}}</ref>
* 1999: NASA X-38 flown using LandForm software video map overlays at [[Dryden Flight Research Center]].<ref>AviationNow.com Staff, "X-38 Test Features Use of Hybrid Synthetic Vision" AviationNow.com, 11 December 2001</ref>
* 2000: [[Rockwell International]] Science Center demonstrates tetherless wearable augmented reality systems receiving analog video and 3-D audio over radio-frequency wireless channels. The systems incorporate outdoor navigation capabilities, with digital horizon silhouettes from a terrain database overlain in real time on the live outdoor scene, allowing visualization of terrain made invisible by clouds and fog.<ref>{{cite book |doi=10.1109/ISAR.2000.880918 |chapter=A wearable augmented reality testbed for navigation and control, built solely with commercial-off-the-shelf (COTS) hardware |title=Proceedings IEEE and ACM International Symposium on Augmented Reality (ISAR 2000) |pages=12–19 |year=2000 |last1=Behringer |first1=R. |last2=Tam |first2=C. |last3=McGee |first3=J. |last4=Sundareswaran |first4=S. |last5=Vassiliou |first5=M. |s2cid=18892611 |isbn=0-7695-0846-4 }}</ref><ref>{{cite book |doi=10.1109/ISWC.2000.888495 |chapter=Two wearable testbeds for augmented reality: ItWARNS and WIMMIS |title=Digest of Papers. Fourth International Symposium on Wearable Computers |pages=189–190 |year=2000 |last1=Behringer |first1=R. |last2=Tam |first2=C. |last3=McGee |first3=J. |last4=Sundareswaran |first4=S. |last5=Vassiliou |first5=M. |s2cid=13459308 |isbn=0-7695-0795-6 }}</ref>
* 2004: An outdoor helmet-mounted AR system was demonstrated by [[Trimble Navigation]] and the Human Interface Technology Laboratory (HIT Lab).<ref name="Outdoor AR" />
* 2006: Outland Research develops an AR media player that overlays virtual content onto a user's view of the real world in synchrony with playing music, thereby providing an immersive AR entertainment experience.<ref>{{Cite patent|country=|number=7732694|title=United States Patent: 7732694 - Portable music player with synchronized transmissive visual overlays|status=|pubdate=9 Aug 2006|gdate=8 June 2010|invent1=|inventor1-first=|url=http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-adv.htm&r=1&f=G&l=50&d=PALL&S1=07732694&OS=PN/07732694&RS=PN/07732694}}</ref><ref>{{Cite web|last=Slawski|first=Bill|date=2011-09-04|title=Google Picks Up Hardware and Media Patents from Outland Research|url=https://www.seobythesea.com/2011/09/google-picks-up-hardware-and-media-patents-from-outland-research/|website=SEO by the Sea ⚓|language=en-US}}</ref>
* 2008: Wikitude AR Travel Guide launches on 20 Oct 2008 with the [[HTC Dream|G1 Android phone]].<ref>[https://www.youtube.com/watch?v=8EA8xlicmT8 Wikitude AR Travel Guide]. YouTube.com. Retrieved 9 June 2012.</ref>
* 2009: ARToolkit was ported to [[Adobe Flash]] (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.<ref>Cameron, Chris. [http://www.readwriteweb.com/archives/flash-based_ar_gets_high-quality_markerless_upgrade.php Flash-based AR Gets High-Quality Markerless Upgrade], ''ReadWriteWeb'' 9 July 2010.</ref>
* 2012: Launch of [[LyteShot|Lyteshot]], an interactive AR gaming platform that utilizes smart glasses for game data.
* 2015: [[Microsoft]] announced the [[HoloLens]] augmented reality headset, which uses various sensors and a processing unit to display virtual imagery over the real world.<ref>Microsoft Channel, YouTube [https://www.youtube.com/watch?v=aThCr0PsyuA], 23 January 2015.</ref>
* 2015: [[Snap, Inc.]] releases "Lenses", augmented reality [[Filter_(social_media)|filters]] in the Snapchat application.<ref>{{cite web |last1=Bell |first1=Karissa |title=How to get the most out of the new Snapchat update |url=https://mashable.com/archive/snapchat-update-how-to |website=Mashable |date=15 September 2015}}</ref>
* 2016: [[Niantic, Inc.|Niantic]] released ''[[Pokémon Go]]'' for [[iOS]] and [[Android (operating system)|Android]] in July 2016. The game quickly became one of the most popular smartphone applications and in turn spiked the popularity of augmented reality games.<ref>{{cite news|last1=Bond|first1=Sarah|title=After the Success of Pokémon Go, How Will Augmented Reality Impact Archaeological Sites?|url=https://www.forbes.com/sites/drsarahbond/2016/07/17/after-the-success-of-pokemon-go-how-will-augmented-reality-impact-archaeological-sites/|access-date=17 July 2016|date=17 July 2016}}</ref>
* 2018: [[Magic Leap]] launched the [[Magic Leap One]] augmented reality headset.<ref>{{cite web | last=Haselton | first=Todd | title=After almost a decade and billions in outside investment, Magic Leap's first product is finally on sale for $2,295. Here's what it's like. | website=CNBC | date=2018-08-08 | url=https://www.cnbc.com/2018/08/08/magic-leap-one-creators-edition-first-look.html | access-date=2024-06-02}}</ref> Leap Motion announced the Project North Star augmented reality headset, and later released it under an open source license.<ref>{{cite web |title=Leap Motion's 'Project North Star' could help make cheap AR headsets a reality |website=[[Mashable]] |date=9 April 2018 |url=https://mashable.com/article/leap-motion-project-north-star-ar-headset |access-date=26 March 2024}}</ref><ref>{{cite web |title=Leap Motion designed a $100 augmented reality headset with super-powerful hand tracking |url=https://www.theverge.com/2018/4/9/17208192/leap-motion-project-north-star-augmented-reality-headset-open-source-concept |website=The Verge |date=9 April 2018 |access-date=26 March 2024}}</ref><ref>{{cite web |title=Project North Star is Now Open Source |url=https://blog.leapmotion.com/north-star-open-source/ |website=Leap Motion |date=6 June 2018 |access-date=26 March 2024}}</ref><ref>{{cite web |title=Leap Motion Open-sources Project North Star, An AR Headset Prototype With Impressive Specs |url=https://www.roadtovr.com/leap-motion-reveals-project-north-star-an-open-source-wide-fov-ar-headset-dev-kit/ |website=Road to VR |date=6 June 2018 |access-date=26 March 2024}}</ref>
* 2019: [[Microsoft]] announced [[HoloLens 2]] with significant improvements in terms of field of view and ergonomics.<ref>Official Blog, Microsoft [https://blogs.microsoft.com/blog/2019/02/24/microsoft-at-mwc-barcelona-introducing-microsoft-hololens-2/], 24 February 2019.</ref>
* 2022: Magic Leap launched the Magic Leap 2 headset.<ref>{{cite web |title=Magic Leap 2 is the best AR headset yet, but will an enterprise focus save the company? |url=https://www.engadget.com/magic-leap-2-ar-headset-tech-dive-143046676.html |website=Engadget |date=11 November 2022 |access-date=26 March 2024}}</ref>
* 2024: [[Meta Platforms]] revealed the Orion AR glasses prototype.<ref>{{Cite web |last=Vanian |first=Jonathan |date=2024-09-27 |title=Hands-on with Meta's Orion AR glasses prototype and the possible future of computing |url=https://www.cnbc.com/2024/09/27/hands-on-with-metas-orion-augmented-reality-smart-glasses-prototype.html |access-date=2024-09-28 |website=CNBC |language=en}}</ref>


== Hardware ==
[[File:MicrosoftHoloLensBloomGesture.JPG|thumb|alt= Photograph of a man wearing an augmented reality headset| A man wearing an augmented reality headset]]


Augmented reality requires hardware components including a processor, display, sensors, and input devices. Modern [[mobile computing]] devices like [[smartphone]]s and [[tablet computer]]s contain these elements, which often include a camera and microelectromechanical systems ([[MEMS]]) sensors such as an [[accelerometer]], [[GPS]], and [[Digital magnetic compass|solid state compass]], making them suitable AR platforms.<ref>{{Cite web |url=http://www.technologyreview.com/news/428654/augmented-reality-is-finally-getting-real/ |title=Augmented Reality Is Finally Getting Real |last=Metz |first=Rachael |date=2 August 2012 |website=technologyreview.com |access-date=18 June 2019}}</ref><ref>{{cite journal|title=Benchmarking Built-In Tracking Systems for Indoor AR Applications on Popular Mobile Devices|journal= Sensors|date=2022|doi= 10.3390/s22145382|doi-access= free|last1= Marino|first1= Emanuele|last2= Bruno|first2= Fabio|last3= Barbieri|first3= Loris|last4= Lagudi|first4= Antonio|volume= 22|issue= 14|page= 5382|pmid= 35891058|pmc= 9320911|bibcode= 2022Senso..22.5382M}}</ref>
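
As an illustration of how such sensor data can serve an AR application, the following sketch (in Python, with hypothetical readings, assuming a calibrated and roughly stationary device, and ignoring magnetic declination and sensor fusion with a gyroscope) combines accelerometer and magnetometer measurements into a tilt-compensated compass heading that could be used to orient overlays.

<syntaxhighlight lang="python">
# Illustrative sketch: combining accelerometer and magnetometer readings
# (typical smartphone MEMS sensors) into a tilt-compensated compass heading
# for orienting AR overlays. Assumes a roughly stationary, calibrated device;
# magnetic declination and sensor noise are ignored.
import numpy as np

def heading_deg(accel, mag, forward=(0.0, 1.0, 0.0)):
    """Heading of the device's 'forward' axis relative to magnetic north.

    accel: gravity reaction vector in the device frame (any units)
    mag:   magnetic field vector in the device frame (any units)
    """
    down = -np.asarray(accel, dtype=float)   # gravity points opposite the reading
    down /= np.linalg.norm(down)

    def horizontal(v):
        # Project a vector onto the horizontal plane (perpendicular to gravity).
        v = np.asarray(v, dtype=float)
        return v - np.dot(v, down) * down

    north = horizontal(mag)
    fwd = horizontal(forward)
    # Signed angle from magnetic north to the forward axis, about the 'down' axis
    # (positive = clockwise when viewed from above, the compass convention).
    angle = np.arctan2(np.dot(np.cross(north, fwd), down), np.dot(north, fwd))
    return np.degrees(angle) % 360.0

# Hypothetical readings: device held flat, field pointing north-east and down.
print(heading_deg(accel=(0.0, 0.0, 9.8), mag=(20.0, 20.0, -40.0)))  # -> 315.0
</syntaxhighlight>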


===Displays===
Various technologies can be used to display augmented reality, including [[optical head-mounted display|optical projection systems]], [[computer monitor|monitors]], and [[mobile device|handheld devices]]. Two of the display technologies used in augmented reality are diffractive [[Waveguide (optics)|waveguides]] and reflective waveguides. Display analyst Karl Guttag has compared the optics of diffractive waveguides against the competing technology, reflective waveguides.<ref>[https://www.kguttag.com/2018/10/22/magic-leap-hololens-and-lumus-resolution-shootout-ml1-review-part-3/ Karl Guttag on Technology]</ref>


A [[head-mounted display]] (HMD) is a display device worn on the forehead, such as a harness- or [[Helmet-mounted display|helmet-mounted]] device. HMDs place images of both the physical world and virtual objects over the user's field of view. Modern HMDs often employ sensors for six [[Degrees of freedom (mechanics)|degrees of freedom]] monitoring that allow the system to align virtual information to the physical world and adjust accordingly with the user's head movements.<ref>{{Cite web |url=http://www.eweek.com/c/a/Security/Fleet-Week-Office-of-Naval-Research-Technology/4/ |title=Fleet Week: Office of Naval Research Technology |date=28 May 2012 |website=eweek.com |access-date=18 June 2019}}</ref><ref>Rolland, Jannick; Baillott, Yohan; Goon, Alexei.[https://web.archive.org/web/20200227120212/https://pdfs.semanticscholar.org/ce53/48128f94f3383bdc4eb15fb4eaf3721d521f.pdf A Survey of Tracking Technology for Virtual Environments], Center for Research and Education in Optics and Lasers, University of Central Florida.</ref><ref>{{Cite web |url=http://campar.in.tum.de/twiki/pub/Chair/TeachingSs07ArProseminar/1_Display-Systems_Klepper_Report.pdf |title=Augmented Reality - Display Systems |last=Klepper |first=Sebastian |website=campar.in.tum.de |archive-url=https://web.archive.org/web/20130128175343/http://campar.in.tum.de/twiki/pub/Chair/TeachingSs07ArProseminar/1_Display-Systems_Klepper_Report.pdf |archive-date=28 January 2013 |url-status=dead |access-date=18 June 2019}}</ref> When using AR technology, the HMDs only require relatively small displays. In this situation, liquid crystals on silicon (LCOS) and micro-OLED (organic light-emitting diodes) are commonly used.<ref>{{Cite journal |last=Komura |first=Shinichi |date=2024-07-19 |title=Optics of AR/VR using liquid crystals |journal=Molecular Crystals and Liquid Crystals |language=en |pages=1–26 |doi=10.1080/15421406.2024.2379694 |issn=1542-1406|doi-access=free }}</ref> HMDs can provide VR users with mobile and collaborative experiences.<ref>{{cite journal |last1=Rolland |first1=Jannick P. |last2=Biocca |first2=Frank |last3=Hamza-Lup |first3=Felix |last4=Ha |first4=Yanggang |last5=Martins |first5=Ricardo |title=Development of Head-Mounted Projection Displays for Distributed, Collaborative, Augmented Reality Applications |journal=Presence: Teleoperators and Virtual Environments |date=October 2005 |volume=14 |issue=5 |pages=528–549 |doi=10.1162/105474605774918741 |s2cid=5328957 |url=https://stars.library.ucf.edu/facultybib2000/5607 |arxiv=1902.07769 }}</ref> Specific providers, such as [[uSens]] and [[Gestigon]], include [[Gesture recognition|gesture controls]] for full virtual [[Immersion (virtual reality)|immersion]].<ref>{{cite web|title=Gestigon Gesture Tracking – TechCrunch Disrupt|url=https://techcrunch.com/video/gestigon-gesture-tracking/517762030/|website=TechCrunch|access-date=11 October 2016}}</ref><ref>{{cite web|last1=Matney|first1=Lucas|title=uSens shows off new tracking sensors that aim to deliver richer experiences for mobile VR|url=https://techcrunch.com/2016/08/29/usens-unveils-vr-sensor-modules-with-hand-tracking-and-mobile-positional-tracking-tech-baked-in/|website=TechCrunch|date=29 August 2016 |access-date=29 August 2016}}</ref>
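
The role of six degrees of freedom tracking can be illustrated with a short sketch (in Python; the pose convention, axis layout and field-of-view value are assumptions rather than any specific headset's API): given the tracked head pose, a point fixed in the room is transformed into head coordinates and tested against the display's field of view, so the overlay shifts opposite to the head motion and appears to stay attached to the world.

<syntaxhighlight lang="python">
# Illustrative sketch: using a 6-DoF head pose from an HMD tracker to keep a
# virtual label anchored to a fixed point in the room. The pose format
# (rotation matrix R plus position t, head-to-world) is an assumption; real
# SDKs expose their own pose types.
import numpy as np

def anchor_in_view(p_world, R_head_to_world, t_head, half_fov_deg=45.0):
    """Return (visible, azimuth_deg, elevation_deg) of a world point.

    Angles are measured in the head frame, where +z is the gaze direction,
    +x is to the wearer's right and +y is up (an assumed convention).
    """
    p_head = R_head_to_world.T @ (np.asarray(p_world, float) - np.asarray(t_head, float))
    azimuth = np.degrees(np.arctan2(p_head[0], p_head[2]))
    elevation = np.degrees(np.arctan2(p_head[1], np.hypot(p_head[0], p_head[2])))
    visible = p_head[2] > 0 and abs(azimuth) < half_fov_deg and abs(elevation) < half_fov_deg
    return bool(visible), float(azimuth), float(elevation)

def yaw(deg):
    """Rotation about the vertical (y) axis, e.g. the wearer turning their head."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

anchor = [1.0, 0.0, 3.0]                              # virtual marker fixed in the room
print(anchor_in_view(anchor, yaw(0.0), [0, 0, 0]))    # head facing forward: ~+18 deg right
print(anchor_in_view(anchor, yaw(30.0), [0, 0, 0]))   # head turned right: ~-12 deg (now left)
</syntaxhighlight>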
In January 2015, [[Meta (company)|Meta]] raised $23 million in Series A funding from investors including [[Horizons Ventures]], [[Tim Draper]], [[Alexis Ohanian]], BOE Optoelectronics and [[Garry Tan]].<ref>{{cite news|url=https://blogs.wsj.com/venturecapital/2015/01/28/augmented-reality-headset-maker-meta-secures-23-million/|title=Augmented-Reality Headset Maker Meta Secures $23 Million|last=Chapman|first=Lizette|work=[[Wall Street Journal]]|date=2015-01-28|accessdate=2016-02-29}}</ref><ref>{{cite news|url=https://techcrunch.com/2016/03/02/hands-on-with-the-949-mind-bending-meta-2-augmented-reality-headset/|title=Hands-on with the $949 mind-bending Meta 2 augmented reality headset|last=Matney|first=Lucas|work=[[TechCrunch]]|date=2016-03-02|accessdate=2016-03-02}}</ref><ref>{{cite news|url=https://gigaom.com/2015/01/28/meta-raises-23m-series-a-to-refine-its-augmented-reality-glasses/|title=Meta raises $23M Series A to refine its augmented reality glasses|last=Brewster|first=Signe|work=[[Gigaom]]|date=2015-01-28|accessdate=2016-02-29}}</ref> On February 17, 2016, [[Meta (company)|Meta]] announced its second-generation product, the Meta 2, at [[TED (conference)|TED]]. The Meta 2 [[head-mounted display]] [[Virtual reality headset|headset]] uses a sensory array for hand interactions and positional tracking, a visual field of view of 90 degrees (diagonal), and a display resolution of 2560 x 1440 (20 pixels per degree), which was considered the largest [[field of view]] (FOV) available at the time.<ref>{{cite news|url=http://uploadvr.com/meta-2-ar-glasses-ted/|title=Meta Unveils Incredible Augmented Reality Headset at TED|work=[[UploadVR]]|date=2016-02-17|accessdate=2016-02-29}}</ref><ref>{{cite news|url=https://www.bbc.com/news/technology-35583356/|title=TED 2016: Meta augmented reality headset demoed at TED|last=Wakefield|first=Jane|work=[[BBC]]|date=2016-02-17|accessdate=2016-02-29}}</ref><ref>{{cite news|url=https://www.forbes.com/sites/miguelhelft/2016/02/17/new-augmented-reality-startup-meta-dazzles-ted-crowd/#7fcc96713f13/|title=New Augmented Reality Startup Meta Dazzles TED Crowd|last=Helft|first=Miguel|work=[[Forbes]]|date=2016-02-17|accessdate=2016-02-29}}</ref><ref>{{cite news|url=https://www.forbes.com/sites/stevenrosenbaum/2016/02/17/meron-gribetz-wants-to-build-the-ios-of-the-mind/#1bdde5b134bc/|title=Meron Gribetz Wants To Build The IOS Of The Mind|last=Rosenbaum|first=Steven|work=[[Forbes]]|date=2016-02-17|accessdate=2016-02-29}}</ref>


[[Vuzix]] is a company that has produced a number of head-worn optical see-through displays marketed for augmented reality.<ref>{{cite web |title=Images Of The Vuzix STAR 1200 Augmented Reality Glasses |url=https://techcrunch.com/2011/06/04/images-of-the-vuzix-star-1200-augmented-reality-glasses/ |website=TechCrunch |date=5 June 2011 |access-date=26 March 2024}}</ref><ref>{{cite web |title=Vuzix Blade AR glasses are the next-gen Google Glass we've all been waiting for |date=9 January 2018 |url=https://www.theverge.com/2018/1/9/16869174/vuzix-blade-ar-glasses-augmented-reality-amazon-alexa-ai-ces-2018 |access-date=26 March 2024}}</ref><ref>{{cite web |title=Hands On: Vuzix's No-Nonsense AR Smart Glasses |url=https://www.pcmag.com/news/hands-on-vuzixs-no-nonsense-ar-smart-glasses |access-date=26 March 2024}}</ref>
[[File:Vuzix AR3000 AugmentedReality SmartGlasses.png|thumb|Vuzix AR3000 augmented reality smartglasses]]


====Eyeglasses====
{{Update section|date=August 2024}}
AR displays can be rendered on devices resembling eyeglasses. Versions include eyewear that employs cameras to intercept the real world view and re-display its augmented view through the eyepieces<ref>Grifatini, Kristina. [http://www.technologyreview.com/news/421606/augmented-reality-goggles/ Augmented Reality Goggles], ''Technology Review'' 10 November 2010.</ref> and devices in which the AR [[Imagery intelligence|imagery]] is projected through or reflected off the surfaces of the eyewear lens pieces.<ref>Arthur, Charles. [https://www.theguardian.com/technology/2012/sep/10/augmented-reality-glasses-google-project UK company's 'augmented reality' glasses could be better than Google's], ''The Guardian'', 10 September 2012.</ref><ref>Gannes, Liz. {{cite web |url=http://allthingsd.com/20120404/google-unveils-project-glass-wearable-augmented-reality-glasses/ |title=Google Unveils Project Glass: Wearable Augmented-Reality Glasses |work=allthingsd.com |access-date=4 April 2012}}, All Things D.</ref><ref>Benedetti, Winda. [https://web.archive.org/web/20120823000655/https://www.nbcnews.com/technology/ingame/xbox-leak-reveals-kinect-2-augmented-reality-glasses-833583 Xbox leak reveals Kinect 2, augmented reality glasses] ''NBC News''. Retrieved 23 August 2012.</ref>


The [[EyeTap]] (also known as Generation-2 Glass<ref name="GlassEyes">[https://web.archive.org/web/20131004212812/http://wearcam.org/glass.pdf "GlassEyes": The Theory of EyeTap Digital Eye Glass, supplemental material for IEEE Technology and Society, Volume Vol. 31, Number 3, 2012, pp. 10–14].</ref>) captures rays of light that would otherwise pass through the center of the lens of the wearer's eye, and substitutes synthetic computer-controlled light for each ray of real light. The Generation-4 Glass<ref name="GlassEyes" /> (Laser EyeTap) is similar to the VRD (i.e. it uses a computer-controlled laser light source) except that it also has infinite depth of focus and causes the eye itself to, in effect, function as both a camera and a display by way of exact alignment with the eye and resynthesis (in laser light) of rays of light entering the eye.<ref>"Intelligent Image Processing", [[John Wiley and Sons]], 2001, {{ISBN|0-471-40637-6}}, 384 p.</ref>


=====HUD=====
[[File:Headset computer.png|thumb|alt= Photograph of a Headset computer |Headset computer]]
{{Main|Head-up display}}


A head-up display (HUD) is a transparent display that presents data without requiring users to look away from their usual viewpoints. A precursor technology to augmented reality, heads-up displays were first developed for pilots in the 1950s, projecting simple flight data into their line of sight, thereby enabling them to keep their "heads up" and not look down at the instruments. Near-eye augmented reality devices can be used as portable head-up displays as they can show data, information, and images while the user views the real world. Many definitions of augmented reality only define it as overlaying the information.<ref>{{Cite web |url=http://www.merriam-webster.com/dictionary/augmented%2520reality |title=Augmented Reality |website=merriam-webster.com |access-date=8 October 2015 |quote=an enhanced version of reality created by the use of technology to overlay digital information on an image of something being viewed through a device (such as a smartphone camera) also : the technology used to create augmented reality |archive-url=https://web.archive.org/web/20150913022106/http://www.merriam-webster.com/dictionary/augmented%20reality |archive-date=13 September 2015 |url-status=dead }}</ref><ref>{{Cite web |url=http://www.oxforddictionaries.com/us/definition/american_english/augmented-reality |archive-url=https://web.archive.org/web/20131125044327/http://www.oxforddictionaries.com/us/definition/american_english/augmented-reality |url-status=dead |archive-date=25 November 2013 |title=Augmented Reality |website=oxforddictionaries.com |access-date=8 October 2015 |quote=A technology that superimposes a computer-generated image on a user's view of the real world, thus providing a composite view.}}</ref> This is basically what a head-up display does; however, practically speaking, augmented reality is expected to include registration and tracking between the superimposed perceptions, sensations, information, data, and images and some portion of the real world.<ref>{{Cite web|title = What is Augmented Reality (AR): Augmented Reality Defined, iPhone Augmented Reality Apps and Games and More|url = http://www.digitaltrends.com/features/what-is-augmented-reality-iphone-apps-games-flash-yelp-android-ar-software-and-more/|website = [[Digital Trends]]|access-date = 8 October 2015|date = 3 November 2009}}</ref>
[[CrowdOptic, Inc.|CrowdOptic]], a smartphone app, applies algorithms and triangulation techniques to photo metadata including GPS position, compass heading, and a time stamp to arrive at a relative significance value for photo objects.<ref name="VB2">{{cite web|url = https://venturebeat.com/2012/12/08/how-crowdoptics-big-data-technology-reveals-the-worlds-most-popular-photo-objects/|title = How Crowdoptic’s big data technology reveals the world’s most popular photo objects|publisher = VentureBeat|accessdate = 6 June 2013}}</ref> CrowdOptic technology can be used by [[Google Glass]] users to learn where to look at a given point in time.<ref name="FOR1">{{cite news|url = https://www.forbes.com/sites/tarunwadhwa/2013/06/03/crowdoptic-and-loreal-are-about-to-make-history-by-demonstrating-how-augmented-reality-can-be-a-shared-experience/|title = CrowdOptic and L'Oreal To Make History By Demonstrating How Augmented Reality Can Be A Shared Experience| publisher = Forbes|accessdate = 6 June 2013|first=Tarun|last=Wadhwa}}</ref>


====Smartglasses====
A number of [[smartglasses]] have been launched for augmented reality. Due to encumbered control, smartglasses are primarily designed for micro-interaction such as reading a text message, and are still far from the more well-rounded applications of augmented reality.<ref>{{cite arxiv|eprint=1707.09728|title=Interaction Methods for Smart Glasses|author=Lik-Hang Lee and Pan Hui|class=cs.HC|year=2017}}</ref> In January 2015, [[Microsoft]] introduced [[HoloLens]], an independent smartglasses unit. Brian Blau, Research Director of Consumer Technology and Markets at [[Gartner]], said that "Out of all the head-mounted displays that I've tried in the past couple of decades, the HoloLens was the best in its class."<ref name="IW1">{{cite web|last1=Sheridan|first1=Kelly|title=Microsoft HoloLens Vs. Google Glass: No Comparison|url=http://www.informationweek.com/mobile/microsoft-hololens-vs-google-glass-no-comparison/d/d-id/1318851|website=InformationWeek|accessdate=15 February 2015}}</ref> HoloLens was designed to be a general-purpose immersive device. First impressions were generally that such a device might be more useful than a small off-to-the-side display like Google Glass offered, with packaged productivity-oriented applications.<ref name="IW1" /><ref>{{cite web|last1=Berinato|first1=Scott|title=What HoloLens Has That Google Glass Didn’t|url=https://hbr.org/2015/01/what-hololens-has-that-google-glass-didnt|website=Harvard Business Review|accessdate=15 February 2015|date=January 29, 2015}}</ref>

====Contact lenses====


Contact lenses that display AR imaging are in development. These [[bionic contact lens]]es might contain the elements for display embedded into the lens including integrated circuitry, LEDs and an antenna for wireless communication.
The first contact lens display was patented in 1999 by Steve Mann and was intended to work in combination with AR spectacles, but the project was abandoned,<ref>{{Cite web|title=Full Page Reload|url=https://spectrum.ieee.org/profile-innovega|website=IEEE Spectrum: Technology, Engineering, and Science News|date=10 April 2013|language=en|access-date=2020-05-06}}</ref><ref>{{Cite web|url=https://patents.google.com/patent/CA2280022/en|title=Contact lens for the display of information such as text, graphics, or pictures}}</ref> then 11 years later in 2010–2011.<ref>Greenemeier, Larry. [http://blogs.scientificamerican.com/observations/2011/11/23/computerized-contact-lenses-could-enable-in-eye-augmented-reality/ Computerized Contact Lenses Could Enable In-Eye Augmented Reality]. ''[[Scientific American]]'', 23 November 2011.</ref><ref>Yoneda, Yuka. [http://inhabitat.com/solar-powered-augmented-contact-lenses-cover-your-eye-with-100s-of-leds/ Solar Powered Augmented Contact Lenses Cover Your Eye with 100s of LEDs]. ''inhabitat'', 17 March 2010.</ref><ref>{{cite web |last=Rosen |first=Kenneth |title=Contact Lenses Can Display Your Text Messages |url=http://mashable.com/2012/12/08/contact-lenses-text-messages/|work=Mashable.com |date=8 December 2012 |access-date=13 December 2012}}</ref><ref>{{cite news|last=O'Neil |first=Lauren |title=LCD contact lenses could display text messages in your eye |url=http://www.cbc.ca/news/yourcommunity/2012/12/lcd-contact-lenses-could-display-text-messages-in-your-eye.html |publisher=[[CBC News]] |access-date=12 December 2012 |url-status=dead |archive-url=https://web.archive.org/web/20121211075000/http://www.cbc.ca/news/yourcommunity/2012/12/lcd-contact-lenses-could-display-text-messages-in-your-eye.html |archive-date=11 December 2012 }}</ref> Another version of contact lenses, in development for the U.S. military, is designed to function with AR spectacles, allowing soldiers to focus on close-to-the-eye AR images on the spectacles and distant real world objects at the same time.<ref>Anthony, Sebastian. [http://www.extremetech.com/computing/126043-us-military-developing-multi-focus-augmented-reality-contact-lenses US military developing multi-focus augmented reality contact lenses]. ''[[ExtremeTech]]'', 13 April 2012.</ref><ref>Bernstein, Joseph. [http://www.popsci.com/diy/article/2012-05/2012-invention-awards-augmented-reality-contact-lenses 2012 Invention Awards: Augmented-Reality Contact Lenses] ''Popular Science'', 5 June 2012.</ref>


At CES 2013, a company called Innovega also unveiled similar contact lenses that required being combined with AR glasses to work.<ref>{{Cite web|title=Innovega combines glasses and contact lenses for an unusual take on augmented reality|url=https://www.theverge.com/2013/1/10/3863550/innovega-augmented-reality-glasses-contacts-hands-on|last=Robertson|first=Adi|date=2013-01-10|website=The Verge|language=en|access-date=2020-05-06}}</ref>
The [[science fiction|futuristic]] short film ''Sight''<ref>[https://vimeo.com/46304267 ''Sight'']</ref> features contact lens-like augmented reality devices.<ref>{{cite news|last1=Kosner|first1=Anthony Wing|title=Sight: An 8-Minute Augmented Reality Journey That Makes Google Glass Look Tame|url=https://www.forbes.com/sites/anthonykosner/2012/07/29/sight-an-8-minute-augmented-reality-journey-that-makes-google-glass-look-tame/|publisher=Forbes|accessdate=3 August 2015|date=29 July 2012}}</ref><ref>{{cite web|last1=O'Dell|first1=J.|title=Beautiful short film shows a frightening future filled with Google Glass-like devices|url=https://venturebeat.com/2012/07/27/sight-systems/|accessdate=3 August 2015|date=27 July 2012}}</ref>


Many scientists have been working on contact lenses capable of different technological feats. A patent filed by [[Samsung]] describes an AR contact lens that, when finished, would include a built-in camera on the lens itself.<ref>{{Cite web|url=https://www.sciencealert.com/samsung-just-patented-smart-contact-lenses-with-a-built-in-camera|title=Samsung Just Patented Smart Contact Lenses With a Built-in Camera|website=sciencealert.com|date=7 April 2016 |access-date=18 June 2019}}</ref> The design is intended to be controlled by blinking an eye, and to be linked with the user's smartphone so that footage can be reviewed and the lens controlled separately. When successful, the lens would feature a camera or sensor inside of it, which could be anything from a light sensor to a temperature sensor.


The first publicly unveiled working prototype of an AR contact lens not requiring the use of glasses in conjunction was developed by Mojo Vision and announced and shown off at CES 2020.<ref>{{Cite web|title=Full Page Reload|url=https://spectrum.ieee.org/ar-in-a-contact-lens-its-the-real-deal|website=IEEE Spectrum: Technology, Engineering, and Science News|date=16 January 2020|language=en|access-date=2020-05-06}}</ref><ref>{{Cite web|title=Mojo Vision's AR contact lenses are very cool, but many questions remain|url=https://techcrunch.com/2020/01/16/mojo-visions-ar-contact-lenses-are-very-cool-but-many-questions-remain/|website=TechCrunch|date=16 January 2020 |language=en-US|access-date=2020-05-06}}</ref><ref>{{Cite web|title=Mojo Vision is developing AR contact lenses|url=https://techcrunch.com/video/mojo-vision-is-developing-ar-contact-lenses/|website=TechCrunch|date=16 January 2020 |language=en-US|access-date=2020-05-06}}</ref>


====Virtual retinal display====


A [[virtual retinal display]] (VRD) is a personal display device under development at the [[University of Washington]]'s Human Interface Technology Laboratory under Dr. Thomas A. Furness III.<ref name="Viirre-1998">{{Cite journal|last1=Viirre|first1=E.|last2=Pryor|first2=H.|last3=Nagata|first3=S.|last4=Furness|first4=T. A.|date=1998|title=The virtual retinal display: a new technology for virtual reality and augmented vision in medicine|journal=Studies in Health Technology and Informatics|volume=50|issue=Medicine Meets virtual reality|pages=252–257|issn=0926-9630|pmid=10180549|doi=10.3233/978-1-60750-894-6-252}}</ref> With this technology, a display is scanned directly onto the [[retina]] of a viewer's eye. This results in bright images with high resolution and high contrast. The viewer sees what appears to be a conventional display floating in space.<ref>Tidwell, Michael; Johnson, Richard S.; Melville, David; Furness, Thomas A.[http://www.hitl.washington.edu/publications/p-95-1/ The Virtual Retinal Display – A Retinal Scanning Imaging System] {{webarchive|url=https://web.archive.org/web/20101213134809/http://www.hitl.washington.edu/publications/p-95-1/ |date=13 December 2010 }}, Human Interface Technology Laboratory, University of Washington.</ref>


Several tests were conducted to analyze the safety of the VRD.<ref name="Viirre-1998" /> In one test, patients with partial loss of vision—having either [[macular degeneration]] (a disease that degenerates the retina) or [[keratoconus]]—were selected to view images using the technology. In the macular degeneration group, five out of eight subjects preferred the VRD images to the [[cathode-ray tube]] (CRT) or paper images, thought they were better and brighter, and were able to see equal or better resolution levels. The keratoconus patients could all resolve smaller lines in several line tests using the VRD than with their own correction. They also found the VRD images to be easier to view and sharper. As a result of these tests, the virtual retinal display is considered a safe technology.


Virtual retinal display creates images that can be seen in ambient daylight and ambient room light. The VRD is considered a preferred candidate to use in a surgical display due to its combination of high resolution and high contrast and brightness. Additional tests show high potential for VRD to be used as a display technology for patients that have low vision.


====Handheld====


A handheld display employs a small display that fits in a user's hand. All handheld AR solutions to date opt for video see-through. Initially handheld AR employed [[fiducial marker]]s,<ref>[http://researchguides.dartmouth.edu/content.php?pid=227212&sid=1891183 Marker vs Markerless AR] {{webarchive|url=https://web.archive.org/web/20130128175349/http://researchguides.dartmouth.edu/content.php?pid=227212&sid=1891183 |date=28 January 2013 }}, Dartmouth College Library.</ref> and later GPS units and MEMS sensors such as digital compasses and [[six degrees of freedom]] accelerometer–[[gyroscope]]. Today [[simultaneous localization and mapping]] (SLAM) markerless trackers such as PTAM (parallel tracking and mapping) are starting to come into use. Handheld display AR promises to be the first commercial success for AR technologies. The two main advantages of handheld AR are the portable nature of handheld devices and the ubiquitous nature of camera phones. The disadvantages are the physical constraints of the user having to hold the handheld device out in front of them at all times, as well as the distorting effect of classically wide-angled mobile phone cameras when compared to the real world as viewed through the eye.<ref>{{cite web |last=Feiner |first=Steve |title=Augmented reality: a long way off? |url=http://www.pocket-lint.com/news/38869/augmented-reality-interview-steve-feiner |work=AR Week |publisher=Pocket-lint |access-date=3 March 2011|date=3 March 2011}}</ref> The issues arising from the user having to hold the handheld device (manipulability) and perceiving the visualisation correctly (comprehensibility) have been summarised into the HARUS usability questionnaire.<ref>{{Cite journal|last1=Santos|first1=M. E. C.|last2=Polvi|first2=J.|last3=Taketomi|first3=T.|last4=Yamamoto|first4=G.|last5=Sandor|first5=C.|last6=Kato|first6=H.|date=September 2015|title=Toward Standard Usability Questionnaires for Handheld Augmented Reality|url=http://ieeexplore.ieee.org/document/7274434/|journal=IEEE Computer Graphics and Applications|volume=35|issue=5|pages=66–75|doi=10.1109/mcg.2015.94|pmid=26416363|issn=0272-1716}}</ref>

Games such as ''[[Pokémon Go]]'' and ''[[Ingress (video game)|Ingress]]'' utilize an [[Image Linked Map]] (ILM) interface, where approved [[geotagged]] locations appear on a stylized map for the user to interact with.<ref>{{cite web |last=Borge |first=Ariel |title=The story behind 'Pokémon Go's' impressive mapping |url=http://mashable.com/2016/07/10/john-hanke-pokemon-go/ |date=11 July 2016 |work=Mashable |access-date=13 July 2016}}</ref>

====EyeTap====

The [[EyeTap]] (also known as Generation-2 Glass<ref name="GlassEyes">[https://www.webcitation.org/6DKyiVEP3?url=http://wearcam.org/glass.pdf "GlassEyes": The Theory of EyeTap Digital Eye Glass, supplemental material for IEEE Technology and Society, Volume 31, Number 3, 2012, pp. 10–14].</ref>) captures rays of light that would otherwise pass through the center of the lens of the wearer's eye, and substitutes synthetic computer-controlled light for each ray of real light.

The Generation-4 Glass<ref name="GlassEyes" /> (Laser EyeTap) is similar to the VRD (i.e. it uses a computer-controlled laser light source) except that it also has infinite depth of focus and causes the eye itself to, in effect, function as both a camera and a display by way of exact alignment with the eye and resynthesis (in laser light) of rays of light entering the eye.<ref>"Intelligent Image Processing", John Wiley and Sons, 2001, {{ISBN|0-471-40637-6}}, 384 p.</ref>


====Projection mapping====


[[Projection mapping]] augments real-world objects and scenes without the use of special displays such as monitors, head-mounted displays or hand-held devices. Projection mapping makes use of digital projectors to display graphical information onto physical objects. The key difference in projection mapping is that the display is separated from the users of the system. Since the displays are not associated with each user, projection mapping scales naturally up to groups of users, allowing for collocated collaboration between users.


Examples include [[shader lamps]], mobile projectors, virtual tables, and smart projectors. Shader lamps mimic and augment reality by projecting imagery onto neutral objects. This provides the opportunity to enhance the object's appearance with materials of a simple unit—a projector, camera, and sensor.


Other applications include table and wall projections. One innovation, the Extended Virtual Table, separates the virtual from the real by including [[beam splitter|beam-splitter]] mirrors attached to the ceiling at an adjustable angle.<ref>Bimber, Oliver; Encarnação, Miguel; Branco, Pedro. [http://www.mitpressjournals.org/doi/abs/10.1162/105474601753272862?journalCode=pres The Extended Virtual Table: An Optical Extension for Table-Like Projection Systems], ''MIT Press Journal'' Vol. 10, No. 6, Pages 613–631, March 13, 2006.</ref> Virtual showcases, which employ beam-splitter mirrors together with multiple graphics displays, provide an interactive means of simultaneously engaging with the virtual and the real.


A projection mapping system can display on any number of surfaces in an indoor setting at once. Projection mapping supports both a graphical visualization and passive [[Haptic perception|haptic]] sensation for the end users. Users are able to touch physical objects in a process that provides passive haptic sensation.<ref name="Azuma_survey" /><ref name="raskarSAR">Ramesh Raskar, Greg Welch, Henry Fuchs [https://web.archive.org/web/19981205111134/http://www.cs.unc.edu/~raskar/Office/ Spatially Augmented Reality], First International Workshop on Augmented Reality, Sept 1998.</ref><ref>Knight, Will. [https://www.newscientist.com/article/dn7695 Augmented reality brings maps to life] 19 July 2005.</ref><ref>Sung, Dan. [http://www.pocket-lint.com/news/38802/augmented-reality-maintenance-and-repair Augmented reality in action – maintenance and repair]. ''Pocket-lint'', 1 March 2011.</ref>


===Tracking===
{{main|VR positional tracking}}
Modern mobile augmented-reality systems use one or more of the following [[motion capture|motion tracking]] technologies: [[digital camera]]s and/or other [[image sensor|optical sensors]], accelerometers, GPS, gyroscopes, solid state compasses and [[radio-frequency identification]] (RFID). These technologies offer varying levels of accuracy and precision. The most important measurement is the position and orientation of the user's head; tracking the user's hand(s) or a handheld input device can additionally provide a [[six degrees of freedom]] (6DOF) interaction technique.<ref>Stationary systems can employ 6DOF track systems such as Polhemus, ViCON, A.R.T, or Ascension.</ref> These technologies are implemented in the ARKit [[API]] by [[Apple Inc.|Apple]] and the [[ARCore]] API by [[Google]] to allow tracking for their respective mobile device platforms.
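
As a minimal illustration (independent of ARKit or ARCore, whose tracking pipelines are far more sophisticated), the following Python sketch shows one common way such sensor data can be fused: a complementary filter blends fast but drifting gyroscope rates with noisy but drift-free accelerometer readings to estimate device orientation. The sensor samples shown are placeholder values, and the axis conventions are an assumption of the example.

<syntaxhighlight lang="python">
import math

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into pitch/roll estimates.

    gyro  -- (gx, gy) angular rates in rad/s about the x and y axes
    accel -- (ax, ay, az) accelerations in m/s^2
    dt    -- time step in seconds
    alpha -- weight given to the smooth but drifting gyroscope integral
    """
    gx, gy = gyro
    ax, ay, az = accel

    # Integrate the angular rates to propagate the previous orientation estimate.
    pitch_gyro = pitch + gx * dt
    roll_gyro = roll + gy * dt

    # The gravity direction gives an absolute, but noisy, orientation reference.
    pitch_accel = math.atan2(ay, math.sqrt(ax * ax + az * az))
    roll_accel = math.atan2(-ax, az)

    # Blend both sources: gyroscope for short-term stability, accelerometer to cancel drift.
    return (alpha * pitch_gyro + (1 - alpha) * pitch_accel,
            alpha * roll_gyro + (1 - alpha) * roll_accel)

# Example with made-up sensor samples (device nearly level, rotating slightly).
pitch, roll = 0.0, 0.0
pitch, roll = complementary_filter(pitch, roll,
                                   gyro=(0.01, -0.02),
                                   accel=(0.1, 0.2, 9.8),
                                   dt=0.02)
print(pitch, roll)
</syntaxhighlight>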


===Input devices===


Techniques include [[speech recognition]] systems that translate a user's spoken words into computer instructions, and gesture recognition systems that interpret a user's body movements by visual detection or from sensors embedded in a peripheral device such as a wand, stylus, pointer, glove or other body wear.<ref>Marshall, Gary.[http://www.techradar.com/news/computing/beyond-the-mouse-how-input-is-evolving-626794?artc_pg=1 Beyond the mouse: how input is evolving, Touch, voice and gesture recognition and augmented reality]''TechRadar.computing''\''PC Plus'' 23 August 2009.</ref><ref>Simonite, Tom. [http://www.technologyreview.com/news/425431/augmented-reality-meets-gesture-recognition/ Augmented Reality Meets Gesture Recognition], ''Technology Review'', 15 September 2011.</ref><ref>Chaves, Thiago; Figueiredo, Lucas; Da Gama, Alana; de Araujo, Christiano; Teichrieb, Veronica. [http://dl.acm.org/citation.cfm?id=2377147 Human Body Motion and Gestures Recognition Based on Checkpoints]. SVR '12 Proceedings of the 2012 14th Symposium on Virtual and Augmented Reality pp. 271–278.</ref><ref>Barrie, Peter; Komninos, Andreas; Mandrychenko, Oleksii.[http://www.buccleuchpark.net/MUCOM/publi/acmMobility09.pdf A Pervasive Gesture-Driven Augmented Reality Prototype using Wireless Sensor Body Area Networks].</ref> Products which are trying to serve as a controller of AR headsets include Wave by Seebright Inc. and Nimble by Intugine Technologies.
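
As an illustrative sketch of camera-based gesture input (assuming the open-source MediaPipe and OpenCV Python packages, with a webcam standing in for a headset camera), the following code treats a small thumb-to-index-fingertip distance as a "pinch" selection gesture; the distance threshold is an arbitrary placeholder.

<syntaxhighlight lang="python">
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1)
capture = cv2.VideoCapture(0)          # webcam as a stand-in for a headset camera

for _ in range(300):                   # process roughly ten seconds of video
    ok, frame = capture.read()
    if not ok:
        break
    # MediaPipe expects RGB input, while OpenCV captures BGR frames.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        landmarks = results.multi_hand_landmarks[0].landmark
        thumb_tip, index_tip = landmarks[4], landmarks[8]
        # A small thumb-to-index distance (in normalized image coordinates)
        # is interpreted here as a "pinch" selection gesture.
        distance = ((thumb_tip.x - index_tip.x) ** 2 +
                    (thumb_tip.y - index_tip.y) ** 2) ** 0.5
        if distance < 0.05:
            print("pinch detected")

capture.release()
hands.close()
</syntaxhighlight>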


===Computer===


Computers are responsible for graphics in augmented reality. For camera-based 3D tracking methods, a computer analyzes the sensed visual and other data to synthesize and position virtual objects. With the improvement of technology and computers, augmented reality is expected to lead to a drastic change in one's perspective of the real world.<ref>{{Cite web|url=https://computer.howstuffworks.com/augmented-reality.htm|title=How Augmented Reality Works|last=Bosnor|first=Kevin|website=howstuffworks|date=19 February 2001}}</ref> According to ''Time'', in about 15–20 years augmented reality and virtual reality are predicted to become the primary use for computer interactions.<ref>{{Cite web|url=http://time.com/4654944/this-technology-could-replace-the-keyboard-and-mouse/|title=This Technology Could Replace the Keyboard and Mouse|last=Bajarin|first=Tim|website=Time Magazine}}</ref>


Computers are improving at a very fast rate, leading to new ways to improve other technology. Computers are the core of augmented reality.<ref>{{Cite web|date=6 April 1999|first1=Jeffrey |last1=Meisner |first2=Walter P. |last2=Donnelly |first3=Richard |last3=Roosen |title=Augmented reality technology|url=https://patents.google.com/patent/US6625299B1/en}}</ref> The computer receives data from the sensors which determine the relative position of an object's surface. This translates to an input to the computer, which then outputs to the users by adding something that would otherwise not be there. The computer comprises memory and a processor.<ref>{{Cite book|title=A Survey of Augmented Reality Technologies, Applications and Limitations|last=Krevelen, Poelman|first=D.W.F, Ronald|publisher=International Journal of virtual reality|year=2010|pages=3, 6}}</ref> The computer takes the scanned environment, then generates images or a video and puts it on the receiver for the observer to see. The fixed marks on an object's surface are stored in the memory of the computer, which draws on that memory to present images realistically to the onlooker. An example of this is the Pepsi Max AR bus shelter.<ref>{{Citation|last=Pepsi Max|title=Unbelievable Bus Shelter {{!}} Pepsi Max. Unbelievable #LiveForNow|date=20 March 2014|url=https://www.youtube.com/watch?time_continue=80&v=Go9rf9GmYpM|access-date=6 March 2018}}</ref>


===Projector===
Projectors can also be used to display AR contents. The projector can throw a virtual object on a projection screen and the viewer can interact with this virtual object. Projection surfaces can be many objects such as walls or glass panes.<ref>{{Cite book|title=Augmented reality and virtual reality : empowering human, place and business|others=Jung, Timothy,, Dieck, M. Claudia tom|isbn=9783319640273|location=Cham, Switzerland|oclc=1008871983|last1 = Jung|first1 = Timothy|last2 = Claudia Tom Dieck|first2 = M.|date = 4 September 2017}}</ref>


===Networking===
Mobile augmented reality applications are gaining popularity because of the wide adoption of mobile and especially wearable devices. However, they often rely on computationally intensive computer vision algorithms with extreme latency requirements. To compensate for the lack of computing power, offloading data processing to a distant machine is often desired. Computation offloading introduces new constraints in applications, especially in terms of latency and bandwidth. Although there are a plethora of real-time multimedia transport protocols, there is a need for support from network infrastructure as well.<ref>{{Cite web |url=http://www.cse.ust.hk/~panhui/papers/future-networking-challenges_CameraReady.pdf |title=Future Networking Challenges: The Case of Mobile Augmented Reality |last=Braud |first=T. |website=cse.ust.hk |access-date=20 June 2019 |archive-date=16 May 2018 |archive-url=https://web.archive.org/web/20180516203453/http://www.cse.ust.hk/~panhui/papers/future-networking-challenges_CameraReady.pdf |url-status=dead }}</ref>
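
A minimal sketch of such computation offloading is shown below; the edge-server URL and the JSON response format are hypothetical, and a production system would typically use a lower-latency transport than plain HTTP.

<syntaxhighlight lang="python">
import cv2
import requests

OFFLOAD_URL = "http://192.168.1.50:8000/detect"   # hypothetical edge-server endpoint

def offload_frame(frame, max_width=640):
    """Send a downscaled, JPEG-compressed camera frame to a remote recognizer."""
    scale = max_width / frame.shape[1]
    small = cv2.resize(frame, None, fx=scale, fy=scale)
    ok, jpeg = cv2.imencode(".jpg", small, [cv2.IMWRITE_JPEG_QUALITY, 80])
    if not ok:
        return None
    # End-to-end latency is the network round trip plus server-side inference time,
    # which is why AR offloading puts hard demands on the network infrastructure.
    response = requests.post(OFFLOAD_URL,
                             data=jpeg.tobytes(),
                             headers={"Content-Type": "image/jpeg"},
                             timeout=0.2)
    return response.json()   # e.g. a list of detected objects and their poses

capture = cv2.VideoCapture(0)
ok, frame = capture.read()
if ok:
    print(offload_frame(frame))
capture.release()
</syntaxhighlight>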


==Software and algorithms==
{{comparison_of_augmented_reality_fiducial_markers.svg}}
A key measure of AR systems is how realistically they integrate virtual imagery with the real world. The software must derive real-world coordinates, independent of the camera, from the camera images. That process is called [[image registration]], and uses different methods of [[computer vision]], mostly related to [[video tracking]].<ref name="recentadvances" /><ref>Maida, James; Bowen, Charles; Montpool, Andrew; Pace, John. [http://research.jsc.nasa.gov/PDF/SLiSci-14.pdf Dynamic registration correction in augmented-reality systems] {{webarchive|url=https://web.archive.org/web/20130518032710/http://research.jsc.nasa.gov/PDF/SLiSci-14.pdf |date=18 May 2013 }}, ''Space Life Sciences'', NASA.</ref> Many computer vision methods of augmented reality are inherited from [[visual odometry]].


Usually those methods consist of two parts. The first stage is to detect [[interest point detection|interest points]], fiducial markers or [[optical flow]] in the camera images. This step can use [[Feature detection (computer vision)|feature detection]] methods like [[corner detection]], [[blob detection]], [[edge detection]] or [[Thresholding (image processing)|thresholding]], and other [[image processing]] methods.<ref>State, Andrei; Hirota, Gentaro; Chen, David T; Garrett, William; Livingston, Mark. [http://www.cs.princeton.edu/courses/archive/fall01/cs597d/papers/state96.pdf Superior Augmented Reality Registration by Integrating Landmark Tracking and Magnetic Tracking], Department of Computer Science, University of North Carolina at Chapel Hill.</ref><ref>Bajura, Michael; Neumann, Ulrich. [http://graphics.usc.edu/cgit/publications/papers/DynamicRegistrationVRAIS95.pdf Dynamic Registration Correction in Augmented-Reality Systems] [https://web.archive.org/web/20120713224616/https://graphics.usc.edu/cgit/publications/papers/DynamicRegistrationVRAIS95.pdf Archived] 13 July 2012, University of North Carolina, University of Southern California.</ref> The second stage restores a real-world coordinate system from the data obtained in the first stage. Some methods assume objects with known geometry (or fiducial markers) are present in the scene. In some of those cases the scene's 3D structure should be calculated beforehand. If part of the scene is unknown, simultaneous localization and mapping (SLAM) can map relative positions. If no information about scene geometry is available, [[structure from motion]] methods like [[bundle adjustment]] are used. Mathematical methods used in the second stage include [[projective geometry|projective]] ([[Epipolar geometry|epipolar]]) geometry, [[geometric algebra]], [[Rotation formalisms in three dimensions|rotation representation]] with the [[Rotation matrix#Exponential map|exponential map]], [[Kalman filter|Kalman]] and [[Particle filter|particle]] filters, [[nonlinear optimization]], and [[robust statistics]].{{citation needed|date=February 2017}}
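
The following Python sketch (using the OpenCV library) illustrates the two stages under simplifying assumptions: interest points are detected with ORB, and the camera pose is then recovered with solvePnP from four known 3D-to-2D correspondences. The input image name, the correspondences and the camera calibration values are placeholders, and the feature-matching step between the two stages is omitted.

<syntaxhighlight lang="python">
import numpy as np
import cv2

# Stage 1: detect interest points in the camera image (here, ORB corner features).
frame = cv2.imread("frame.jpg")          # placeholder input image
if frame is None:
    raise SystemExit("no input frame found")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
orb = cv2.ORB_create(nfeatures=500)
keypoints, descriptors = orb.detectAndCompute(gray, None)

# Stage 2: restore a real-world coordinate system, assuming an object of known
# geometry. Four 3D points of a planar target and their 2D detections are given
# here as placeholders.
object_points = np.array([[0, 0, 0], [0.1, 0, 0], [0.1, 0.1, 0], [0, 0.1, 0]],
                         dtype=np.float32)                # metres, target frame
image_points = np.array([[320, 240], [420, 238], [424, 338], [318, 342]],
                        dtype=np.float32)                 # pixels, from stage 1
camera_matrix = np.array([[800, 0, 320],
                          [0, 800, 240],
                          [0, 0, 1]], dtype=np.float32)   # assumed calibration
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    # rvec/tvec give the target's pose relative to the camera, which is what a
    # renderer needs to draw virtual content registered to the real object.
    rotation_matrix, _ = cv2.Rodrigues(rvec)
    print(rotation_matrix, tvec)
</syntaxhighlight>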


In augmented reality, the distinction is made between two distinct modes of tracking, known as ''marker'' and ''[[Markerless motion capture|markerless]]''. Markers are visual cues which trigger the display of the virtual information.<ref>{{Cite news|url=https://anymotion.com/en/wissensgrundlagen/augmented-reality-marker|title=What are augmented reality markers ?|website=anymotion.com|access-date=18 June 2019}}</ref> A piece of paper with some distinct geometries can be used. The camera recognizes the geometries by identifying specific points in the drawing. Markerless tracking, also called instant tracking, does not use markers. Instead, the user positions the object in the camera view, preferably in a horizontal plane. Markerless tracking uses sensors in mobile devices to accurately detect the real-world environment, such as the locations of walls and points of intersection.<ref>{{Cite news|url=https://www.marxentlabs.com/what-is-markerless-augmented-reality-dead-reckoning/|title=Markerless Augmented Reality is here.|date=9 May 2014|work=Marxent {{!}} Top Augmented Reality Apps Developer|access-date=23 January 2018|language=en-US}}</ref>
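
A minimal marker-based tracking sketch is shown below. It assumes the ArUco module shipped with the opencv-contrib-python package (pre-4.7 function names; newer releases wrap the same functionality in cv2.aruco.ArucoDetector) and uses placeholder camera-calibration values.

<syntaxhighlight lang="python">
import numpy as np
import cv2

aruco = cv2.aruco                      # requires the opencv-contrib-python package

# A printed ArUco square is the visual cue ("marker") that the camera recognizes.
aruco_dict = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)

camera_matrix = np.array([[800, 0, 320],
                          [0, 800, 240],
                          [0, 0, 1]], dtype=np.float32)   # assumed calibration
dist_coeffs = np.zeros(5)
marker_side = 0.05                     # marker edge length in metres

capture = cv2.VideoCapture(0)
ok, frame = capture.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = aruco.detectMarkers(gray, aruco_dict)
    if ids is not None:
        rvecs, tvecs, _ = aruco.estimatePoseSingleMarkers(
            corners, marker_side, camera_matrix, dist_coeffs)
        # Each pose tells the renderer where to anchor virtual content
        # relative to the detected marker.
        for marker_id, tvec in zip(ids.flatten(), tvecs):
            print("marker", int(marker_id), "at", tvec.ravel(), "metres from the camera")
capture.release()
</syntaxhighlight>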


[[Augmented Reality Markup Language]] (ARML) is a data standard developed within the [[Open Geospatial Consortium]] (OGC),<ref>{{cite web | title = ARML 2.0 SWG | work = Open Geospatial Consortium website | publisher = Open Geospatial Consortium | url = http://www.opengeospatial.org/projects/groups/arml2.0swg | access-date = 12 November 2013 | archive-date = 12 November 2013 | archive-url = https://web.archive.org/web/20131112013312/http://www.opengeospatial.org/projects/groups/arml2.0swg | url-status = dead }}</ref> which consists of Extensible Markup Language ([[XML]]) grammar to describe the location and appearance of virtual objects in the scene, as well as [[ECMAScript for XML|ECMAScript]] bindings to allow dynamic access to properties of virtual objects.


{{anchor|Spark_AR}}
To enable rapid development of augmented reality applications, software development tools have emerged, including Lens Studio from [[Snapchat]] and Spark AR from [[Facebook]]. Augmented reality software development kits (SDKs) have been launched by Apple and Google,<ref>{{cite web|url=http://augmentedrealitynews.org/ar-sdk/top-5-augmented-reality-sdks/|title=Top 5 AR SDKs|publisher=Augmented Reality News|url-status=dead|archive-url=https://web.archive.org/web/20131213111219/http://augmentedrealitynews.org/ar-sdk/top-5-augmented-reality-sdks/|archive-date=13 December 2013|access-date=15 November 2013}}</ref><ref>{{cite web|url = http://augmentedworldexpo.com/news/tutorial-top-10-mobile-augmented-reality-sdks-for-developers//|title = Top 10 AR SDKs|publisher = Augmented World Expo|access-date = 15 November 2013|archive-url = https://web.archive.org/web/20131123011106/http://augmentedworldexpo.com/news/tutorial-top-10-mobile-augmented-reality-sdks-for-developers/|archive-date = 23 November 2013|url-status = dead|df = dmy-all}}</ref> and commercial AR SDKs are also offered by vendors such as Vuforia,<ref>{{cite web|url = https://www.vuforia.com |title = Vuforia AR SDK |publisher = Vuforia |access-date = 15 November 2013}}</ref> [[ARToolKit]], Wikitude,<ref>{{cite web|url = http://www.wikitude.com |title = Wikitude AR SDK |publisher = Wikitude |access-date = 15 November 2013}}</ref> Blippar<ref>{{cite web|url = https://blippar.com |title = Blippar AR |publisher = Blippar |access-date = 3 January 2015}}</ref> and [[Layar]].<ref>{{cite web|url = https://www.layar.com |title = Layar AR SDK |publisher = Layar |access-date = 15 November 2013}}</ref>


==Development==
The implementation of augmented reality in consumer products requires considering the design of the applications and the related constraints of the technology platform. Since AR systems rely heavily on the immersion of the user and the interaction between the user and the system, design can facilitate the adoption of virtuality. For most augmented reality systems, a similar design guideline can be followed. The following lists some considerations for designing augmented reality applications:


===Environmental/context design===


Context design focuses on the end-user's physical surroundings, spatial space, and accessibility that may play a role when using the AR system. Designers should be aware of the possible physical scenarios the end-user may be in, such as:
* Public, in which the users use their whole body to interact with the software
* Personal, in which the user uses a smartphone in a public space
* Intimate, in which the user is sitting with a desktop and is not really moving
* Private, in which the user has on a wearable.<ref name="Wilson-2018">{{Cite web|url=https://uxdesign.cc/the-principles-of-good-user-experience-design-for-augmented-reality-d8e22777aabd|title="The Principles of Good UX for Augmented Reality – UX Collective." UX Collective|last=Wilson|first=Tyler|date=30 January 2018|access-date=19 June 2019}}</ref>


By evaluating each physical scenario, potential safety hazards can be avoided and changes can be made to further improve the end-user's immersion. [[User experience|UX designers]] will have to define user journeys for the relevant physical scenarios and define how the interface reacts to each.

Especially in AR systems, it is vital to also consider the spatial space and the surrounding elements that change the effectiveness of the AR technology. Environmental elements such as lighting and sound can prevent the sensors of AR devices from detecting necessary data and ruin the immersion of the end-user.<ref name=":4">{{Cite web|url=https://www.igi-global.com/book/emerging-technologies-augmented-reality/338|title=Emerging Technologies of Augmented Reality: Interfaces and Design|last1=Haller|first1=Michael|last2=Billinghurst|first2=Mark|last3=Thomas|first3=Bruce}}</ref>


Another aspect of context design involves the design of the system's functionality and its ability to accommodate user preferences.<ref name="blog.google-2017">{{Cite web|url=https://blog.google/products/google-vr/best-practices-mobile-ar-design/|title=Best Practices for Mobile AR Design- Google|date=13 December 2017|website=blog.google}}</ref><ref>{{Cite web|url=http://www.eislab.fim.uni-passau.de/files/publications/2014/TR2014-HCIwithAR_1.pdf|title=Human Computer Interaction with Augmented Reality|website=eislab.fim.uni-passau.de|archive-url=https://web.archive.org/web/20180525000513/http://www.eislab.fim.uni-passau.de/files/publications/2014/TR2014-HCIwithAR_1.pdf|archive-date=25 May 2018|url-status=dead|df=dmy-all}}</ref> While accessibility tools are common in basic application design, some consideration should be made when designing time-limited prompts (to prevent unintentional operations), audio cues and overall engagement time. In some situations, the application's functionality may hinder the user's ability. For example, applications that are used for driving should reduce the amount of user interaction and use audio cues instead.


===Interaction design===


[[Interaction design]] in augmented reality technology centers on the user's engagement with the end product to improve the overall user experience and enjoyment. The purpose of interaction design is to avoid alienating or confusing the user by organizing the information presented. Since user interaction relies on the user's input, designers must make system controls easier to understand and accessible. A common technique to improve usability for augmented reality applications is by discovering the frequently accessed areas in the device's touch display and designing the application to match those areas of control.<ref>{{Cite web|url=https://theblog.adobe.com/basic-patterns-of-mobile-navigation/|title=Basic Patterns of Mobile Navigation|date=9 May 2017|website=theblog.adobe.com|access-date=12 April 2018|archive-date=13 April 2018|archive-url=https://web.archive.org/web/20180413044751/https://theblog.adobe.com/basic-patterns-of-mobile-navigation/|url-status=dead}}</ref> It is also important to structure the user journey maps and the flow of information presented, which reduces the system's overall cognitive load and greatly improves the learning curve of the application.<ref>{{Cite web|url=https://www.thinkwithgoogle.com/marketing-resources/experience-design/principles-of-mobile-app-design-engage-users-and-drive-conversions/|title=Principles of Mobile App Design: Engage Users and Drive Conversions|website=thinkwithgoogle.com|archive-url=https://web.archive.org/web/20180413185621/https://www.thinkwithgoogle.com/marketing-resources/experience-design/principles-of-mobile-app-design-engage-users-and-drive-conversions/|archive-date=13 April 2018|url-status=dead}}</ref>


In interaction design, it is important for developers to utilize augmented reality technology that complement the system's function or purpose.<ref>{{Cite web|url=https://www.uxmatters.com/mt/archives/2009/08/inside-out-interaction-design-for-augmented-reality.php|title=Inside Out: Interaction Design for Augmented Reality-UXmatters|website=uxmatters.com}}</ref> For instance, the utilization of exciting AR filters and the design of the unique sharing platform in [[Snapchat]] enables users to augment their in-app social interactions. In other applications that require users to understand the focus and intent, designers can employ a [[reticle]] or [[Ray casting|raycast]] from the device.<ref name="blog.google-2017" />


===Visual design===
To improve the graphic interface elements and user interaction, developers may use visual cues to inform the user what elements of UI are designed to interact with and how to interact with them. Visual cue design can make interactions seem more natural.<ref name="Wilson-2018" />


In some augmented reality applications that use a 2D device as an interactive surface, the 2D control environment does not translate well in 3D space, which can make users hesitant to explore their surroundings. To solve this issue, designers should apply visual cues to assist and encourage users to explore their surroundings.
By evaluating each physical scenario, potential safety hazard can be avoided and changes can be made to greater improve the end-user's immersion. UX designers will have to define user journeys for the relevant physical scenarios and define how the interface will react to each.


It is important to note the two main objects in AR when developing VR applications: 3D [[volumetric]] objects that are manipulated and realistically interact with light and shadow; and animated media imagery such as images and videos which are mostly traditional 2D media rendered in a new context for augmented reality.<ref name="Wilson-2018" /> When virtual objects are projected onto a real environment, it is challenging for augmented reality application designers to ensure a perfectly seamless integration relative to the real-world environment, especially with 2D objects. As such, designers can add weight to objects, use depths maps, and choose different material properties that highlight the object's presence in the real world. Another visual design that can be applied is using different [[computer graphics lighting|lighting]] techniques or casting shadows to improve overall depth judgment. For instance, a common lighting technique is simply placing a light source overhead at the 12&nbsp;o’clock position, to create shadows on virtual objects.<ref name="Wilson-2018" />
Especially in AR systems, it is vital to also consider the spatial space and the surrounding elements that change the effectiveness of the AR technology. Environmental elements such as lighting, and sound can prevent the sensor of AR devices from detecting necessary data and ruin the immersion of the end-user.<ref name=":4">{{Cite web|url=https://www.igi-global.com/book/emerging-technologies-augmented-reality/338.|title=Emerging Technologies of Augmented Reality: Interfaces and Design|last=Haller, Michael, Billinghurst, Mark, Thomas, and Bruce|first=|date=|website=|access-date=}}</ref>
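
As a minimal illustration of the overhead-light convention described above, the sketch below (Python; the object, light direction and distances are made-up example values, and no particular rendering engine is assumed) projects a floating virtual object's vertices onto the ground plane along the light direction to obtain a simple drop shadow, a cue that helps viewers judge height and depth.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative sketch: project a virtual object's vertices onto the floor
# (y = 0) along the direction of a light placed overhead ("12 o'clock"),
# producing a simple drop shadow. All values are hypothetical example data.

light_dir = np.array([0.15, -1.0, 0.1])           # mostly straight down
light_dir = light_dir / np.linalg.norm(light_dir)

# A small virtual cube floating 0.5 m above the floor, about 1 m in front
# of the user (negative z is "forward" in this example frame).
cube_vertices = np.array([[x, y, z]
                          for x in (-0.1, 0.1)
                          for y in (0.5, 0.7)
                          for z in (-1.1, -0.9)])

def shadow_on_floor(vertex, direction, plane_y=0.0):
    """Project a point onto the plane y = plane_y along the light direction."""
    t = (plane_y - vertex[1]) / direction[1]
    return vertex + t * direction

shadow = np.array([shadow_on_floor(v, light_dir) for v in cube_vertices])
print("shadow footprint (x, z) on the floor:")
print(np.round(shadow[:, [0, 2]], 3))
</syntaxhighlight>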


==Uses==
Augmented reality has been explored for many uses, including gaming, medicine, and entertainment. It has also been explored for education and business.<ref>{{Cite journal|last1=Moro|first1=Christian|last2=Štromberga|first2=Zane|last3=Raikos|first3=Athanasios|last4=Stirling|first4=Allan|date=2017|title=The effectiveness of virtual and augmented reality in health sciences and medical anatomy|url=https://pubmed.ncbi.nlm.nih.gov/28419750|journal=Anatomical Sciences Education|volume=10|issue=6|pages=549–559|doi=10.1002/ase.1696|issn=1935-9780|pmid=28419750|s2cid=25961448}}</ref> Example application areas described below include archaeology, architecture, commerce and education. Some of the earliest cited examples range from augmented reality used to support surgery by providing virtual overlays to guide medical practitioners, to AR content for astronomy and welding.<ref name="Dupzyk 2016"/><ref>{{Cite news|url=https://www.slashgear.com/dont-be-blind-on-wearable-cameras-insists-ar-genius-20239514/|title=Don't be blind on wearable cameras insists AR genius|date=20 July 2012|work=SlashGear|access-date=21 October 2018|language=en-US}}</ref>


===Archaeology===
AR has been used to aid [[Archaeology|archaeological]] research. By augmenting archaeological features onto the modern landscape, AR allows archaeologists to formulate possible site configurations from extant structures.<ref>{{cite journal |title=Augmenting Phenomenology: Using Augmented Reality to Aid Archaeological Phenomenology in the Landscape |author=Stuart Eve |doi=10.1007/s10816-012-9142-7 | volume=19 |issue=4 |journal=Journal of Archaeological Method and Theory |pages=582–600|url=http://discovery.ucl.ac.uk/1352447/1/Eve_2012_Augmented_Phenomenology.pdf |year=2012 |s2cid=4988300 }}</ref> Computer-generated models of ruins, buildings, landscapes or even ancient people have been recycled into early archaeological AR applications.<ref>{{cite book |url=http://portal.acm.org/citation.cfm?id=854948 |title=Archeoguide: System Architecture of a Mobile Outdoor Augmented Reality System |author1=Dähne, Patrick |author2=Karigiannis, John N. |access-date=6 January 2010|isbn=9780769517810 |year=2002 }}</ref><ref>{{cite web |url=http://archpro.lbg.ac.at/press-release/school-gladiators-discovered-roman-carnuntum-austria |title=School of Gladiators discovered at Roman Carnuntum, Austria |author=LBI-ArchPro |access-date=29 December 2014|date=5 September 2011}}</ref><ref>{{Cite journal|title = Mixing virtual and real scenes in the site of ancient Pompeii|journal = Computer Animation and Virtual Worlds|date = 1 February 2005|issn = 1546-427X|pages = 11–24|volume = 16|issue = 1|doi = 10.1002/cav.53|first1 = George|last1 = Papagiannakis|first2 = Sébastien|last2 = Schertenleib|first3 = Brian|last3 = O'Kennedy|first4 = Marlene|last4 = Arevalo-Poizat|first5 = Nadia|last5 = Magnenat-Thalmann|first6 = Andrew|last6 = Stoddart|first7 = Daniel|last7 = Thalmann|citeseerx = 10.1.1.64.8781|s2cid = 5341917}}</ref> For example, implementing a system like VITA (Visual Interaction Tool for Archaeology) will allow users to imagine and investigate instant excavation results without leaving their home. Each user can collaborate by mutually "navigating, searching, and viewing data". Hrvoje Benko, a researcher in the computer science department at [[Columbia University]], points out that these particular systems and others like them can provide "3D panoramic images and 3D models of the site itself at different excavation stages" all the while organizing much of the data in a collaborative way that is easy to use. Collaborative AR systems supply [[multimodal interaction]]s that combine the real world with virtual images of both environments.<ref>{{Cite book |doi = 10.1109/ISMAR.2004.23|chapter = Collaborative Mixed Reality Visualization of an Archaeological Excavation|title = Third IEEE and ACM International Symposium on Mixed and Augmented Reality|pages = 132–140|year = 2004|last1 = Benko|first1 = H.|last2 = Ishak|first2 = E.W.|last3 = Feiner|first3 = S.|s2cid = 10122485|isbn = 0-7695-2191-6}}</ref>

AR has also recently been adopted in underwater archaeology to support and facilitate the manipulation of archaeological artefacts.<ref>{{cite journal|last1= Bruno|first1=Fabio|last2=Lagudi|first2=Antonio|last3=Barbieri|first3=Loris|last4=Rizzo|first4=Domenico|last5=Muzzupappa|first5=Maurizio|last6=De Napoli|first6=Luigi|title=Augmented reality visualization of scene depth for aiding ROV pilots in underwater manipulation|journal=Ocean Engineering|date=2018|volume=168|pages=140–154|doi=10.1016/j.oceaneng.2018.09.007}}</ref>


===Architecture===
AR can aid in visualizing building projects. Computer-generated images of a structure can be superimposed onto a real-life local view of a property before the physical building is constructed there; this was demonstrated publicly by [[Trimble Navigation]] in 2004. AR can also be employed within an architect's workspace, rendering animated 3D visualizations of their 2D drawings. Architecture sight-seeing can be enhanced with AR applications, allowing users viewing a building's exterior to virtually see through its walls, viewing its interior objects and layout.<ref>Divecha, Devina.[http://www.designmena.com/inspiration/augmented-reality-ar-part-architecture-design Augmented Reality (AR) used in architecture and design] {{Webarchive|url=https://web.archive.org/web/20130214173708/http://www.designmena.com/inspiration/augmented-reality-ar-part-architecture-design |date=14 February 2013 }}. ''designMENA'' 8 September 2011.</ref><ref>[http://www.news.uwa.edu.au/201203054410/events/architectural-dreams-agumented-reality Architectural dreams in augmented reality]. ''University News'', University of Western Australia. 5 March 2012.</ref><ref name="Outdoor AR">[https://www.youtube.com/watch?v=jL3C-OVQKWU Outdoor AR]. ''TV One News'', 8 March 2004.</ref>


With continual improvements to [[Global Positioning System|GPS]] accuracy, businesses are able to use augmented reality to visualize [[georeference]]d models of construction sites, underground structures, cables and pipes using mobile devices.<ref>{{cite web|last=Churcher|first=Jason|title=Internal accuracy vs external accuracy|url=http://www.augview.net/blog/archive-7May2013.html|access-date=7 May 2013}}</ref> Augmented reality is applied to present new projects, to solve on-site construction challenges, and to enhance promotional materials.<ref>{{cite web|title=Augment for Architecture & Construction|url=http://www.augmentedev.com/augmented-reality-architecture/|access-date=12 October 2015|archive-url=https://web.archive.org/web/20151108054418/http://www.augmentedev.com/augmented-reality-architecture/|archive-date=8 November 2015|url-status=dead|df=dmy-all}}</ref> Examples include the [[Daqri]] Smart Helmet, an Android-powered hard hat used to create augmented reality for the industrial worker, including visual instructions, real-time alerts, and 3D mapping.
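
A minimal sketch of the underlying calculation is shown below (Python; it uses a flat-Earth approximation that is only reasonable over short distances, and the coordinates are invented example values rather than data from any real survey). The GPS position of a georeferenced model is converted into an east/north offset in metres from the device, so the model can be placed at the right spot in the local AR scene.

<syntaxhighlight lang="python">
import math

# Illustrative sketch: convert the GPS coordinates of a georeferenced model
# (e.g. a buried pipe joint) into an east/north offset in metres from the
# device's current GPS fix. Flat-Earth approximation, adequate only for
# nearby objects; all coordinates are invented example values.

EARTH_RADIUS = 6_371_000.0  # metres

def local_offset(device_lat, device_lon, target_lat, target_lon):
    """Approximate (east, north) offset in metres from device to target."""
    lat0 = math.radians(device_lat)
    d_lat = math.radians(target_lat - device_lat)
    d_lon = math.radians(target_lon - device_lon)
    north = d_lat * EARTH_RADIUS
    east = d_lon * EARTH_RADIUS * math.cos(lat0)
    return east, north

# Device position and the surveyed position of an underground pipe joint.
east, north = local_offset(51.50140, -0.14200, 51.50155, -0.14172)
print(f"place the model {east:.1f} m east and {north:.1f} m north of the device")
</syntaxhighlight>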


Following the [[Christchurch earthquake]], the University of Canterbury released CityViewAR,<ref>{{Cite web|url=https://www.stuff.co.nz/technology/digital-living/6121248/App-gives-a-view-of-city-as-it-used-to-be|title=App gives a view of city as it used to be|website=Stuff|date=10 December 2011|language=en|access-date=20 May 2018}}</ref> which enabled city planners and engineers to visualize buildings that had been destroyed.<ref>{{cite book|last=Lee|first=Gun|s2cid=34199215|chapter=CityViewAR outdoor AR visualization|year=2012|publisher=ACM|isbn=978-1-4503-1474-9|chapter-url=http://dl.acm.org/citation.cfm?id=2379281|doi=10.1145/2379256.2379281|title=Proceedings of the 13th International Conference of the NZ Chapter of the ACM's Special Interest Group on Human-Computer Interaction - CHINZ '12|pages=97|hdl=10092/8693}}</ref> This not only provided planners with tools to reference the previous [[cityscape]], but it also served as a reminder of the magnitude of the resulting devastation, as entire buildings had been demolished.


===Education and Training===
In educational settings, AR has been used to complement a standard curriculum. Text, graphics, video, and audio may be superimposed onto a student's real-time environment. Textbooks, flashcards and other educational reading material may contain embedded "markers" or triggers that, when scanned by an AR device, produce supplementary information for the student, rendered in a multimedia format.<ref>[https://web.archive.org/web/20111024105916/http://www.prweb.com/releases/2011/10/prweb8899908.htm Groundbreaking Augmented Reality-Based Reading Curriculum Launches], ''PRweb'', 23 October 2011.</ref><ref>Stewart-Smith, Hanna. [https://www.zdnet.com/article/education-with-augmented-reality-ar-textbooks-released-in-japan-video/ Education with Augmented Reality: AR textbooks released in Japan], ''ZDnet'', 4 April 2012.</ref><ref>[http://smarterlearning.wordpress.com/2011/11/10/augmented-reality-in-education/ Augmented reality in education] ''smarter learning''.</ref> The 2015 Virtual, Augmented and Mixed Reality: 7th International Conference mentioned [[Google Glass]] as an example of augmented reality that can replace the physical classroom.<ref>{{Cite book|url=https://books.google.com/books?id=O7g0CgAAQBAJ&q=virternity|title=Virtual, Augmented and Mixed Reality: 7th International Conference, VAMR 2015, Held as Part of HCI International 2015, Los Angeles, CA, USA, 2–7 August 2015, Proceedings|last1=Shumaker|first1=Randall|last2=Lackey|first2=Stephanie|date=20 July 2015|publisher=Springer|isbn=9783319210674|language=en}}</ref> AR technologies help learners engage in authentic exploration of the real world, with virtual objects such as texts, videos, and pictures serving as supplementary elements that support investigation of the real-world surroundings.<ref>{{cite journal |last1=Wu |first1=Hsin-Kai |last2=Lee |first2=Silvia Wen-Yu |last3=Chang |first3=Hsin-Yi |last4=Liang |first4=Jyh-Chong |title=Current status, opportunities and challenges of augmented reality in education |journal=Computers & Education |date=March 2013 |volume=62 |pages=41–49 |doi=10.1016/j.compedu.2012.10.024 |s2cid=15218665 }}</ref>


As AR evolves, students can participate interactively and engage with knowledge more authentically. Instead of remaining passive recipients, students can become active learners, able to interact with their learning environment. Computer-generated simulations of historical events allow students to explore and learn the details of each significant area of the event site.<ref>Lubrecht, Anna. [http://digitalunion.osu.edu/2012/04/24/augmented-reality-for-education/ Augmented Reality for Education] {{Webarchive|url=https://web.archive.org/web/20120905075530/http://digitalunion.osu.edu/2012/04/24/augmented-reality-for-education/ |date=5 September 2012 }} ''The Digital Union'', The Ohio State University 24 April 2012.</ref>


In higher education, Construct3D, a Studierstube system, allows students to learn mechanical engineering concepts, math or geometry.<ref>{{Cite web |url=http://acdc.sav.us.es/pixelbit/images/stories/p41/15.pdf |title=Augmented reality, an evolution of the application of mobile devices |access-date=19 June 2014 |archive-url=https://web.archive.org/web/20150417053823/http://acdc.sav.us.es/pixelbit/images/stories/p41/15.pdf |archive-date=17 April 2015 |url-status=dead |df=dmy-all }}</ref> Chemistry AR apps allow students to visualize and interact with the spatial structure of a molecule using a marker object held in the hand.<ref>Maier, Patrick; Tönnis, Marcus; Klinker, Gudron. [http://ar.in.tum.de/pub/maierp2009ijas/maierp2009ijas.pdf Augmented Reality for teaching spatial relations] {{Webarchive|url=https://web.archive.org/web/20130128175343/http://ar.in.tum.de/pub/maierp2009ijas/maierp2009ijas.pdf |date=28 January 2013 }}, ''Conference of the International Journal of Arts & Sciences (Toronto 2009'').</ref> Others have used HP Reveal, a free app, to create AR notecards for studying organic chemistry mechanisms or to create virtual demonstrations of how to use laboratory instrumentation.<ref>{{cite journal |last1=Plunkett |first1=Kyle N. |title=A Simple and Practical Method for Incorporating Augmented Reality into the Classroom and Laboratory |journal=Journal of Chemical Education |date=12 November 2019 |volume=96 |issue=11 |pages=2628–2631 |doi=10.1021/acs.jchemed.9b00607 |bibcode=2019JChEd..96.2628P |doi-access=free}}</ref> Anatomy students can visualize different systems of the human body in three dimensions.<ref>{{cite web|url=https://www.vuforia.com/case-studies/anatomy-4d |title=Anatomy 4D |work=Qualcomm |access-date=2 July 2015 |url-status=dead |archive-url=https://web.archive.org/web/20160311085744/http://vuforia.com/case-studies/anatomy-4d |archive-date=11 March 2016 |df=dmy }}</ref> Using AR as a tool to learn anatomical structures has been shown to increase the learner knowledge and provide intrinsic benefits, such as increased engagement and learner immersion.<ref>{{Cite journal|last1=Moro|first1=Christian|last2=Štromberga|first2=Zane|last3=Raikos|first3=Athanasios|last4=Stirling|first4=Allan|date=November 2017|title=The effectiveness of virtual and augmented reality in health sciences and medical anatomy: VR and AR in Health Sciences and Medical Anatomy|journal=Anatomical Sciences Education|language=en|volume=10|issue=6|pages=549–559|doi=10.1002/ase.1696|pmid=28419750|s2cid=25961448|url=https://research.bond.edu.au/en/publications/d761ced8-4406-4a5e-ae3f-01862a09a36e}}</ref><ref>{{Cite journal|last1=Birt|first1=James|last2=Stromberga|first2=Zane|last3=Cowling|first3=Michael|last4=Moro|first4=Christian|date=2018-01-31|title=Mobile Mixed Reality for Experiential Learning and Simulation in Medical and Health Sciences Education|journal=Information|language=en|volume=9|issue=2|pages=31|doi=10.3390/info9020031|issn=2078-2489|doi-access=free}}</ref>


AR has been used to develop safety training applications for several types of disasters, such as earthquakes and building fires, as well as for health and safety tasks.<ref>{{Cite journal |last1=Catal |first1=Cagatay |last2=Akbulut |first2=Akhan |last3=Tunali |first3=Berkay |last4=Ulug |first4=Erol |last5=Ozturk |first5=Eren |date=2020-09-01 |title=Evaluation of augmented reality technology for the design of an evacuation training game |journal=Virtual Reality |language=en |volume=24 |issue=3 |pages=359–368 |doi=10.1007/s10055-019-00410-z |issn=1434-9957|doi-access=free }}</ref><ref>{{Cite journal |last1=Gong |first1=Peizhen |last2=Lu |first2=Ying |last3=Lovreglio |first3=Ruggiero |last4=Lv |first4=Xiaofeng |last5=Chi |first5=Zexun |date=2024-10-01 |title=Applications and effectiveness of augmented reality in safety training: A systematic literature review and meta-analysis |url=https://www.sciencedirect.com/science/article/pii/S0925753524002145 |journal=Safety Science |volume=178 |pages=106624 |doi=10.1016/j.ssci.2024.106624 |issn=0925-7535}}</ref><ref>{{Cite journal |last1=Paes |first1=Daniel |last2=Feng |first2=Zhenan |last3=King |first3=Maddy |last4=Khorrami Shad |first4=Hesam |last5=Sasikumar |first5=Prasanth |last6=Pujoni |first6=Diego |last7=Lovreglio |first7=Ruggiero |date=June 2024 |title=Optical see-through augmented reality fire safety training for building occupants |journal=Automation in Construction |language=en |volume=162 |pages=105371 |doi=10.1016/j.autcon.2024.105371|doi-access=free }}</ref> Further, several AR solutions have been proposed and tested to navigate building evacuees towards safe places in both large-scale and small-scale disasters.<ref>{{Cite journal |last1=Lovreglio |first1=Ruggiero |last2=Kinateder |first2=Max |date=October 2020 |title=Augmented reality for pedestrian evacuation research: Promises and limitations |url=https://linkinghub.elsevier.com/retrieve/pii/S0925753520301478 |journal=Safety Science |language=en |volume=128 |pages=104750 |doi=10.1016/j.ssci.2020.104750}}</ref><ref>{{Cite book |last1=Mantoro |first1=Teddy |last2=Alamsyah |first2=Zaenal |last3=Ayu |first3=Media Anugerah |chapter=Pathfinding for Disaster Emergency Route Using Sparse A* and Dijkstra Algorithm with Augmented Reality |date=October 2021 |title=2021 IEEE 7th International Conference on Computing, Engineering and Design (ICCED) |chapter-url=https://ieeexplore.ieee.org/document/9664869/;jsessionid=ji9AewRr7XUqMhvK4eTjYawVSsfm_uYd8B6qi3p56mlvzZQMEkTV!1091101768 |pages=1–6 |doi=10.1109/ICCED53389.2021.9664869|isbn=978-1-6654-3996-1 }}</ref> AR applications can also overlap with many other digital technologies, such as [[Building information modeling|BIM]], the [[internet of things]] and [[artificial intelligence]], to generate smarter safety training and navigation solutions.<ref>{{Citation |last1=Lovreglio |first1=R. |title=Digital Technologies for Fire Evacuations |date=2024 |work=Intelligent Building Fire Safety and Smart Firefighting |pages=439–454 |editor-last=Huang |editor-first=Xinyan |url=https://link.springer.com/10.1007/978-3-031-48161-1_18 |access-date=2024-03-15 |place=Cham |publisher=Springer Nature Switzerland |language=en |doi=10.1007/978-3-031-48161-1_18 |isbn=978-3-031-48160-4 |last2=Paes |first2=D. |last3=Feng |first3=Z. |last4=Zhao |first4=X. |editor2-last=Tam |editor2-first=Wai Cheong}}</ref>


=== Industrial manufacturing ===
AR is used to substitute paper manuals with digital instructions which are overlaid on the manufacturing operator's field of view, reducing mental effort required to operate.<ref name="Mourtzis-2019">{{Cite journal|last1=Mourtzis|first1=Dimitris|last2=Zogopoulos|first2=Vasilios|last3=Xanthi|first3=Fotini|s2cid=189904235|date=2019-06-11|title=Augmented reality application to support the assembly of highly customized products and to adapt to production re-scheduling|journal=The International Journal of Advanced Manufacturing Technology|volume=105|issue=9|pages=3899–3910|language=en|doi=10.1007/s00170-019-03941-6|issn=0268-3768}}</ref> AR makes machine maintenance efficient because it gives operators direct access to a machine's maintenance history.<ref>{{Citation|last1=Boccaccio|first1=A.|title=Exploiting Augmented Reality to Display Technical Information on Industry 4.0 P&ID|date=2019|work=Advances on Mechanics, Design Engineering and Manufacturing II|pages=282–291|editor-last=Cavas-Martínez|editor-first=Francisco|publisher=Springer International Publishing|language=en|doi=10.1007/978-3-030-12346-8_28|isbn=978-3-030-12345-1|last2=Cascella|first2=G. L.|last3=Fiorentino|first3=M.|last4=Gattullo|first4=M.|last5=Manghisi|first5=V. M.|last6=Monno|first6=G.|last7=Uva|first7=A. E.|series=Lecture Notes in Mechanical Engineering |s2cid=150159603|editor2-last=Eynard|editor2-first=Benoit|editor3-last=Fernández Cañavate|editor3-first=Francisco J.|editor4-last=Fernández-Pacheco|editor4-first=Daniel G.}}</ref> Virtual manuals help manufacturers adapt to rapidly-changing product designs, as digital instructions are more easily edited and distributed compared to physical manuals.<ref name="Mourtzis-2019" />


Digital instructions increase operator safety by removing the need for operators to look at a screen or manual away from the working area, which can be hazardous. Instead, the instructions are overlaid on the working area.<ref name="Mourtzis-2018">{{Cite journal|last1=Mourtzis|first1=Dimitris|last2=Zogopoulos|first2=Vasilios|last3=Katagis|first3=Ioannis|last4=Lagios|first4=Panagiotis|date=2018|title=Augmented Reality based Visualization of CAM Instructions towards Industry 4.0 paradigm: a CNC Bending Machine case study|journal=Procedia CIRP|language=en|volume=70|pages=368–373|doi=10.1016/j.procir.2018.02.045|doi-access=free}}</ref><ref>{{cite journal|title=An Augmented Reality inspection tool to support workers in Industry 4.0 environments|journal=Computers in Industry|date=2021|doi=10.1016/j.compind.2021.103412 |url=https://doi.org/10.1016/j.compind.2021.103412 |last1=Marino |first1=Emanuele |last2=Barbieri |first2=Loris |last3=Colacino |first3=Biagio |last4=Fleri |first4=Anna Kum |last5=Bruno |first5=Fabio |volume=127 |s2cid=232272256 }}</ref> The use of AR can increase operators' feeling of safety when working near high-load industrial machinery by giving operators additional information on a machine's status and safety functions, as well as hazardous areas of the workspace.<ref name="Mourtzis-2018" /><ref>{{Cite journal|last1=Michalos|first1=George|last2=Kousi|first2=Niki|last3=Karagiannis|first3=Panagiotis|last4=Gkournelos|first4=Christos|last5=Dimoulas|first5=Konstantinos|last6=Koukas|first6=Spyridon|last7=Mparis|first7=Konstantinos|last8=Papavasileiou|first8=Apostolis|last9=Makris|first9=Sotiris|date=November 2018|title=Seamless human robot collaborative assembly – An automotive case study|journal=Mechatronics|volume=55|pages=194–211|doi=10.1016/j.mechatronics.2018.08.006|s2cid=115979090|issn=0957-4158}}</ref>


===Commerce===
{{main|Commercial augmented reality}}
[[File:Ar code.png|thumb|alt= Illustration of a QR code | An example of an AR code containing a [[QR code]]]]


[[File:AR-Icon.svg|thumb|alt= Illustration of an AR-Icon image | The AR-Icon can be used as a marker on print as well as on online media. It signals the viewer that digital content is behind it. The content can be viewed with a smartphone or tablet.]]


AR is used to integrate print and video marketing. Printed marketing material can be designed with certain "trigger" images that, when scanned by an AR-enabled device using image recognition, activate a video version of the promotional material. A major difference between augmented reality and straightforward image recognition is that multiple media can be overlaid in the view screen at the same time, such as social media share buttons, in-page video, audio and 3D objects. Traditional print-only publications are using augmented reality to connect different types of media.<ref>Katts, Rima. [http://www.mobilemarketer.com/cms/news/software-technology/13810.html Elizabeth Arden brings new fragrance to life with augmented reality] ''Mobile Marketer'', 19 September 2012.</ref><ref>Meyer, David. [http://gigaom.com/europe/telefonica-bets-on-augmented-reality-with-aurasma-tie-in/ Telefónica bets on augmented reality with Aurasma tie-in] ''gigaom'', 17 September 2012.</ref><ref>Mardle, Pamela.[http://www.printweek.com/news/1153133/Video-becomes-reality-Stuprintcom/ Video becomes reality for Stuprint.com] {{webarchive |url=https://web.archive.org/web/20130312171811/http://www.printweek.com/news/1153133/Video-becomes-reality-Stuprintcom/ |date=12 March 2013 }}. ''[[PrintWeek]]'', 3 October 2012.</ref><ref>Giraldo, Karina.[http://www.solinix.co/blog/marketing-movil-su-importancia-para-las-marcas/ Why mobile marketing is important for brands?] {{webarchive|url=https://web.archive.org/web/20150402135323/http://solinix.co/blog/marketing-movil-su-importancia-para-las-marcas/ |date=2 April 2015 }}. ''SolinixAR'', January 2015.</ref><ref>{{cite news|title=Augmented reality could be advertising world's best bet|url=http://www.financialexpress.com/article/industry/companies/augmented-reality-could-be-advertising-worlds-best-bet/64855/|agency=The Financial Express|date=18 April 2015|url-status=dead|archive-url=https://web.archive.org/web/20150521061314/http://www.financialexpress.com/article/industry/companies/augmented-reality-could-be-advertising-worlds-best-bet/64855/|archive-date=21 May 2015|df=dmy-all}}</ref>
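
As a minimal sketch of this mechanism (Python; the trigger identifiers, asset paths and layer scheme are invented placeholders, not part of any real marketing platform), once image recognition matches a printed trigger, the application simply looks up and layers every piece of media registered for that trigger:

<syntaxhighlight lang="python">
# Illustrative sketch: map recognised print "trigger" images to the stack of
# overlay media (video, audio, share buttons, 3D objects) displayed on top of
# them. Trigger IDs and asset paths are invented placeholders.

OVERLAYS = {
    "poster_fragrance_2024": [
        {"type": "video",  "src": "assets/fragrance_spot.mp4", "layer": 0},
        {"type": "audio",  "src": "assets/voiceover.mp3",      "layer": 1},
        {"type": "button", "action": "share_social",           "layer": 2},
    ],
    "catalogue_page_12": [
        {"type": "model3d", "src": "assets/sofa.glb",          "layer": 0},
        {"type": "button",  "action": "open_product_page",     "layer": 1},
    ],
}

def overlays_for(trigger_id):
    """Return the overlay stack for a recognised trigger, lowest layer first."""
    return sorted(OVERLAYS.get(trigger_id, []), key=lambda item: item["layer"])

# Pretend the image-recognition step has just matched the fragrance poster.
for item in overlays_for("poster_fragrance_2024"):
    print(item)
</syntaxhighlight>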


AR can enhance product previews such as allowing a customer to view what's inside a product's packaging without opening it.<ref>Humphries, Mathew.[http://www.geek.com/articles/gadgets/lego-demos-augmented-reality-boxes-with-gesture-recognition-20110919/] {{Webarchive|url=https://web.archive.org/web/20120626192637/http://www.geek.com/articles/gadgets/lego-demos-augmented-reality-boxes-with-gesture-recognition-20110919/|date=26 June 2012}}.''Geek.com'' 19 September 2011.</ref> AR can also be used as an aid in selecting products from a catalog or through a kiosk. Scanned images of products can activate views of additional content such as customization options and additional images of the product in its use.<ref>Netburn, Deborah.[https://www.latimes.com/business/technology/la-ikeas-augmented-reality-app-20120723,0,1261315.story Ikea introduces augmented reality app for 2013 catalog] {{Webarchive|url=https://web.archive.org/web/20121202070158/http://www.latimes.com/business/technology/la-ikeas-augmented-reality-app-20120723,0,1261315.story |date=2 December 2012 }}. ''[[Los Angeles Times]]'', 23 July 2012.</ref>


[[File:Augmented Reality for eCommerce.jpg|alt=Augment SDK|thumb|[[Augment (app)|Augment]] SDK offers brands and retailers the capability to personalize their customers' shopping experience by embedding AR product visualization into their eCommerce platforms.]]
By 2010, virtual dressing rooms had been developed for e-commerce.<ref>{{cite journal |last1=van Krevelen |first1=D.W.F. |last2=Poelman |first2=R. |title=A Survey of Augmented Reality Technologies, Applications and Limitations |journal=International Journal of Virtual Reality |date=November 2015 |volume=9 |issue=2 |pages=1–20 |url=https://hal.archives-ouvertes.fr/hal-01530500/ |doi=10.20870/IJVR.2010.9.2.2767 |doi-access=free }}</ref>


In 2012, a mint used AR techniques to market a commemorative coin for Aruba. The coin itself was used as an AR trigger, and when held in front of an AR-enabled device it revealed additional objects and layers of information that were not visible without the device.<ref>Alexander, Michael.[http://news.coinupdate.com/arbua-shoco-owl-silver-coin-with-augmented-reality-1490/ Arbua Shoco Owl Silver Coin with Augmented Reality], ''Coin Update'' 20 July 2012.</ref><ref>[http://www.todaysxm.com/2012/08/07/royal-mint-produces-revolutionary-commemorative-coin-for-aruba/ Royal Mint produces revolutionary commemorative coin for Aruba] {{webarchive |url=https://web.archive.org/web/20150904090653/http://www.todaysxm.com/2012/08/07/royal-mint-produces-revolutionary-commemorative-coin-for-aruba/ |date=4 September 2015 }}, ''Today'' 7 August 2012.</ref>


In 2018, [[Apple Inc.|Apple]] announced [[Universal Scene Description]] (USDZ) AR file support for iPhones and iPads with iOS 12. Apple has created an AR QuickLook Gallery that allows the general public to experience augmented reality on their own Apple devices.<ref>{{cite web | url = https://www.computerworld.com/article/3307437/mobile-wireless/this-small-ios-12-feature-is-the-birth-of-a-whole-industry.html | title = This small iOS 12 feature is the birth of a whole industry | publisher = Jonny Evans | access-date = 19 September 2018| date = 19 September 2018 }}</ref>


In 2018, [[Shopify]], the Canadian e-commerce company, announced AR Quick Look integration. Their merchants will be able to upload 3D models of their products and their users will be able to tap on the models inside the Safari browser on their iOS devices to view them in their real-world environments.<ref>{{cite web | url = https://techcrunch.com/2018/09/17/shopify-is-bringing-apples-latest-ar-tech-to-their-platform/ | title = Shopify is bringing Apple's latest AR tech to their platform | date = 17 September 2018 | publisher = Lucas Matney | access-date = 3 December 2018}}</ref>


In 2018, [[Twinkl]] released a free AR classroom application. Pupils can see how [[York]] looked over 1,900 years ago.<ref>{{cite journal | url = https://www.qaeducation.co.uk/article/ar-classroom-york | title = History re-made: New AR classroom application lets pupils see how York looked over 1,900 years ago | journal = QA Education| access-date = 4 September 2018| date = 4 September 2018}}</ref> Twinkl launched the first ever multi-player AR game, ''Little Red''<ref>{{cite journal | url = https://www.prolificnorth.co.uk/news/digital/2018/09/sheffields-twinkl-claims-ar-first-new-game| title = Sheffield's Twinkl claims AR first with new game | journal = Prolific North| access-date = 19 September 2018| date = 19 September 2018}}</ref> and has over 100 free AR educational models.<ref>{{cite journal | url = http://www.the-educator.org/technology-from-twinkl-brings-never-seen-before-objects-to-the-classroom/ | title = Technology from Twinkl brings never seen before objects to the classroom | journal = The Educator UK| access-date = 21 December 2018| date = 21 September 2018}}</ref>


Augmented reality is becoming more frequently used for online advertising. Retailers offer the ability to upload a picture to their website and "try on" various clothes which are overlaid on the picture. Even further, companies such as Bodymetrics install dressing booths in department stores that offer [[full-body scanning]]. These booths render a 3-D model of the user, allowing consumers to view different outfits on themselves without the need to physically change clothes.<ref>Pavlik, John V., and Shawn McIntosh. "Augmented Reality." ''Converging Media: a New Introduction to Mass Communication'', 5th ed., [[Oxford University Press]], 2017, pp. 184–185.</ref> For example, [[J. C. Penney|JC Penney]] and [[Bloomingdale's]] use "[[virtual dressing room]]s" that allow customers to see themselves in clothes without trying them on.<ref name="Dacko-2017">{{cite journal |last1=Dacko |first1=Scott G. |title=Enabling smart retail settings via mobile augmented reality shopping apps |journal=Technological Forecasting and Social Change |date=November 2017 |volume=124 |pages=243–256 |doi=10.1016/j.techfore.2016.09.032 |url=http://wrap.warwick.ac.uk/81922/5/WRAP-enabling-smart-retail-Dacko-2017.pdf }}</ref> Another store that uses AR to market clothing to its customers is [[Neiman Marcus]].<ref name="Retail Dive">{{Cite news|url=https://www.retaildive.com/news/how-neiman-marcus-is-turning-technology-innovation-into-a-core-value/436590/|title=How Neiman Marcus is turning technology innovation into a 'core value'|work=Retail Dive|access-date=23 September 2018|language=en-US}}</ref> Neiman Marcus offers consumers the ability to see their outfits in a 360-degree view with their "memory mirror".<ref name="Retail Dive" /> Makeup stores like [[L'Oréal|L'Oreal]], [[Sephora]], [[Charlotte Tilbury]], and [[Rimmel]] also have apps that utilize AR.<ref name="Arthur" /> These apps allow consumers to see how the makeup will look on them.<ref name="Arthur" /> In 2013, [[L'Oreal Paris]] used CrowdOptic technology to create an augmented reality experience at the seventh annual Luminato Festival in Toronto, Canada.<ref name="FOR1" /> According to Greg Jones, director of AR and VR at Google, augmented reality is going to "reconnect physical and digital retail".<ref name="Arthur" />


AR technology is also used by furniture retailers such as [[IKEA]], [[Houzz]], and [[Wayfair]].<ref name="Arthur">{{Cite news|url=https://www.forbes.com/sites/rachelarthur/2017/10/31/augmented-reality-is-set-to-transform-fashion-and-retail/#364c701b3151|title=Augmented Reality Is Set To Transform Fashion And Retail|last=Arthur|first=Rachel|work=Forbes|access-date=23 September 2018|language=en}}</ref><ref name="Dacko-2017" /> These retailers offer apps that allow consumers to view their products in their home prior to purchasing anything.<ref name="Arthur" /> <ref>{{cite web |url=https://archvisualizations.com/augmented-reality-apps-for-interior-visualization/ |title=Augmented Reality Apps for Interior Visualization |access-date=2024-04-09 |website=archvisualizations.com|date=30 January 2024 }}</ref>
In 2017, [[Ikea]] announced the Ikea Place app. It contains a catalogue of over 2,000 products—nearly the company's full collection of sofas, armchairs, coffee tables, and storage units which users can place anywhere in a room with their phone.<ref>{{cite magazine | url = https://www.wired.com/story/ikea-place-ar-kit-augmented-reality/ | title = IKEA's new app flaunts what you'll love most about AR| magazine = [[Wired (magazine)|Wired]] | access-date = 20 September 2017| date = 20 September 2017| last1 = Pardes| first1 = Arielle}}</ref> The app made it possible to have 3D and true-to-scale models of furniture in the customer's living space. IKEA realized that their customers are not shopping in stores as often or making direct purchases anymore.<ref>{{Cite web|url=https://www.ikea.com/ms/en_CH/this-is-ikea/ikea-highlights/2017/ikea-place-app/index.html|title=IKEA Highlights 2017|access-date=8 October 2018|archive-date=8 October 2018|archive-url=https://web.archive.org/web/20181008214446/https://www.ikea.com/ms/en_CH/this-is-ikea/ikea-highlights/2017/ikea-place-app/index.html|url-status=dead}}</ref><ref>{{Cite web|url=https://www.inter.ikea.com/en/performance|archiveurl=https://web.archive.org/web/20180626015939/https://highlights.ikea.com/2017/facts-and-figures/|url-status=dead|title=Performance|archivedate=26 June 2018|website=www.inter.ikea.com}}</ref> Shopify's acquisition of Primer, an AR [[Application software|app]], aims to push small and medium-sized sellers towards interactive AR shopping with easy-to-use AR integration and user experience for both merchants and consumers.<ref>{{Cite web|title=How Shopify is setting the future of AR shopping and what it means for sellers|date=29 June 2021 |url=https://www.suntecindia.com/blog/how-shopify-is-setting-the-future-of-ar-shopping-and-what-it-means-for-sellers/|access-date=2021-06-29|language=en-US}}</ref> AR helps the retail industry reduce operating costs. Merchants upload product information to the AR system, and consumers can use mobile terminals to search and generate 3D maps.<ref>{{Cite journal |last1=Indriani |first1=Masitoh |last2=Liah Basuki Anggraeni |date=2022-06-30 |title=What Augmented Reality Would Face Today? The Legal Challenges to the Protection of Intellectual Property in Virtual Space |journal=Media Iuris |volume=5 |issue=2 |pages=305–330 |doi=10.20473/mi.v5i2.29339 |s2cid=250464007 |issn=2621-5225|doi-access=free }}</ref>


=== Literature ===


The first description of AR as it is known today was in ''[[Virtual Light]]'', the 1994 novel by William Gibson. In 2011, AR was blended with poetry by [[ni ka]] from Sekai Camera in Tokyo, Japan. The prose of these AR poems come from [[Paul Celan]], ''[[Die Niemandsrose]]'', expressing the aftermath of the [[2011 Tōhoku earthquake and tsunami]].<ref>{{Cite web|url=http://yaplog.jp/tipotipo/category_33/|title=AR詩 | にかにかブログ! (おぶんがく&包丁&ちぽちぽ革命)|website=にかにかブログ! (おぶんがく&包丁&ちぽちぽ革命)|language=ja-JP|access-date=20 May 2018}}</ref>


===Visual art===
[[File:10.000 Moving Cities, Augmented Reality Multiplayer Game.png|thumb|alt= Illustration from AR Game ''10.000 Moving Cities'' Art Installation. |''10.000 Moving Cities'', [[Marc Lee]], Augmented Reality Multiplayer Game, Art Installation<ref>{{cite web|title=10.000 Moving Cities – Same but Different, AR (Augmented Reality) Art Installation, 2018|publisher = Marc Lee|url=http://marclee.io/en/10-000-moving-cities-same-but-different-ar/|access-date=24 December 2018 }}</ref>]]


AR applied in the visual arts allows objects or places to trigger artistic multidimensional experiences and interpretations of reality.

AR technology aided the development of [[eye tracking]] technology<ref>{{Cite news|url=https://yeppar.com/blog/microsoft-hololens/|title=Microsoft Hololens - Transform business with new dimensions from Yeppar|date=2017-11-28|work=Yeppar|access-date=2018-05-20|language=en-US}}</ref> to translate a disabled person's eye movements into drawings on a screen.<ref>Webley, Kayla. [http://www.time.com/time/specials/packages/article/0,28804,2029497_2030618_2029822,00.html The 50 Best Inventions of 2010 – EyeWriter] ''Time'', 11 November 2010.</ref>


The Australian new media artist [[Jeffrey Shaw]] pioneered Augmented Reality in three artworks: ''Viewpoint'' in 1975, ''Virtual Sculptures'' in 1987 and ''The Golden Calf'' in 1993.<ref>{{Cite book |last=Duguet |first=Anne-Marie |title=Jeffrey Shaw, Future Cinema. The Cinematic Imaginary after Film |publisher=ZKM Karlsruhe and MIT Press, Cambridge, Massachusetts |year=2003 |isbn=9780262692861 |pages=376–381}}</ref><ref>{{Cite book |last1=Duguet |first1=Anne-Marie |title=Jeffrey Shaw: A User's Manual. From Expanded Cinema to Virtual Reality |last2=Klotz |first2=Heinrich |last3=Weibel |first3=Peter |publisher=ZKM Cantz |year=1997 |isbn= |pages=9–20}}</ref> He continues to explore new permutations of AR in numerous recent works.
In 2012, a mint used AR techniques to market a commemorative coin for Aruba. The coin itself was used as an AR trigger, and when held in front of an AR-enabled device it revealed additional objects and layers of information that were not visible without the device.<ref>Alexander, Michael.[http://news.coinupdate.com/arbua-shoco-owl-silver-coin-with-augmented-reality-1490/ Arbua Shoco Owl Silver Coin with Augmented Reality], ''Coin Update'' July 20, 2012.</ref><ref>[http://www.todaysxm.com/2012/08/07/royal-mint-produces-revolutionary-commemorative-coin-for-aruba/ Royal Mint produces revolutionary commemorative coin for Aruba] {{webarchive |url=https://web.archive.org/web/20150904090653/http://www.todaysxm.com/2012/08/07/royal-mint-produces-revolutionary-commemorative-coin-for-aruba/ |date=4 September 2015 }}, ''Today'' August 7, 2012.</ref>


Manifest.AR was an international artists' collective founded in 2010 that specialized in augmented reality (AR) art and interventions. The collective typically created site-specific AR installations that could be viewed through mobile devices using custom-developed applications. Their work often challenged traditional notions of art exhibition and ownership by placing virtual artworks in spaces without institutional permission. The collective gained prominence in 2010 when they staged an unauthorized virtual exhibition at the Museum of Modern Art (MoMA) in New York City, overlaying their digital artworks throughout the museum's spaces using AR technology. The collective's unauthorized AR intervention at MoMA involved placing virtual artworks throughout the museum's spaces, viewable through mobile devices. In 2011, members of Manifest.AR created AR artworks that were virtually placed throughout the Venice Biennial, creating an unofficial parallel exhibition accessible through mobile devices. During the Occupy Wall Street movement in 2011, the collective created AR installations in and around Zuccotti Park, adding a digital dimension to the physical protests. Key members of the collective have included: Mark Skwarek; John Craig Freeman; Will Pappenheimer; Tamiko Thiel; and Sander Veenhof. The group published their "AR Art Manifesto" in 2011, which outlined their artistic philosophy and approach to augmented reality as a medium. The manifesto emphasized the democratic potential of AR technology and its ability to challenge traditional institutional control over public space and art display.<ref>Freeman, John Craig. "ManifestAR: An Augmented Reality Manifesto." Leonardo Electronic Almanac, Vol. 19, No. 1, 2013.</ref> Manifest.AR has been influential in: Pioneering artistic applications of AR technology; Developing new forms of institutional critique; Expanding concepts of public art and digital space; and Influencing subsequent generations of new media artists. Their work has been documented and discussed in various publications about digital art and new media, and has influenced contemporary discussions about virtual and augmented reality in artistic practice.<ref>Paul, Christiane. "Digital Art" (Third edition). Thames & Hudson, 2015.</ref>
In 2013, [[L'Oreal Paris]] used CrowdOptic technology to create an augmented reality experience at the seventh annual Luminato Festival in Toronto, Canada.<ref name="FOR1" />


Augmented reality can aid in the progression of visual art in museums by allowing museum visitors to view artwork in galleries in a multidimensional way through their phone screens.<ref>{{Cite journal|last1=tom Dieck|first1=M. Claudia|last2=Jung|first2=Timothy|last3=Han|first3=Dai-In|date=July 2016|title=Mapping requirements for the wearable smart glasses augmented reality museum application|url=https://www.emerald.com/insight/content/doi/10.1108/JHTT-09-2015-0036/full/html|journal=Journal of Hospitality and Tourism Technology|language=en|volume=7|issue=3|pages=230–253|doi=10.1108/JHTT-09-2015-0036|issn=1757-9880}}</ref> [[Museum of Modern Art|The Museum of Modern Art]] in New York has created an exhibit in their art museum showcasing AR features that viewers can see using an app on their smartphone.<ref>{{Cite book|url=https://books.google.com/books?id=OyGiW2OYI8AC&q=augmented+reality:+an+emerging+technologies+guide+to+AR&pg=PR1|title=Augmented Reality: An Emerging Technologies Guide to AR|last1=Kipper|first1=Greg|last2=Rampolla|first2=Joseph|date=31 December 2012|publisher=[[Elsevier]]|isbn=9781597497343|language=en}}</ref> The museum has developed their personal app, called MoMAR Gallery, that museum guests can download and use in the augmented reality specialized gallery in order to view the museum's paintings in a different way.<ref>{{Cite magazine|url=https://www.wired.com/story/augmented-reality-art-museums/|title=Augmented Reality Is Transforming Museums|magazine=[[Wired (magazine)|WIRED]]|access-date=30 September 2018 |language=en-US}}</ref> This allows individuals to see hidden aspects and information about the paintings, and to be able to have an interactive technological experience with artwork as well.
In 2014, L'Oreal brought the AR experience to a personal level with their "Makeup Genius" app. It allowed users to try out make-up and beauty styles via a mobile device.<ref name="TechAcute.com">{{cite web | url = http://techacute.com/augmented-beauty-loreal-makeup-genius-app/ | title = Augmented Beauty: L’Oreal Makeup Genius App | publisher = Alexandru Tanase | accessdate = 1 January 2015}}</ref>


AR technology was used in [[Nancy Baker Cahill|Nancy Baker Cahill's]] "Margin of Error" and "Revolutions,"<ref>{{Cite news|last=Vankin|first=Deborah|date=28 February 2019|title=With a free phone app, Nancy Baker Cahill cracks the glass ceiling in male-dominated land art|work=Los Angeles Times|url=https://www.latimes.com/entertainment/arts/la-et-cm-nancy-baker-cahill-desert-x-20190228-story.html|access-date=26 August 2020}}</ref> the two public art pieces she created for the 2019 [[Desert X]] exhibition.<ref>{{Cite web|url=https://news.artnet.com/exhibitions/desert-x-2019-2-1462891|title=In the Vast Beauty of the Coachella Valley, Desert X Artists Emphasize the Perils of Climate Change|date=12 February 2019|website=[[artnet News]]|language=en-US|access-date=2019-04-10}}</ref>
In 2015, the Bulgarian startup iGreet developed its own AR technology and used it to make the first premade "live" greeting card. A traditional paper card was augmented with digital content which was revealed by using the iGreet app.<ref>{{Cite web|url=http://time.com/3665770/5-apps-evernote/|title=5 Apps You Just Can’t Miss This Week|date=|website=time.com|publisher=|access-date=}}</ref><ref>{{Cite web|url=http://bnr.bg/en/post/100678361/greeting-cards-brought-back-to-life-via-bulgarian-mobile-application|title=Greeting cards brought back to life via Bulgarian mobile application|date=|website=bnr.bg|publisher=|access-date=}}</ref>


AR technology aided the development of [[eye tracking]] technology to translate a disabled person's eye movements into drawings on a screen.<ref>{{cite magazine |title=The 50 Best Inventions of 2010 - EyeWriter|url=http://www.time.com/time/specials/packages/article/0,28804,2029497_2030618_2029822,00.html |magazine=Time |access-date=26 March 2024|archive-url=https://web.archive.org/web/20101114075903/http://www.time.com/time/specials/packages/article/0,28804,2029497_2030618_2029822,00.html|archive-date=2010-11-14|date=11 November 2010|last=Webley|first=Kayla}}</ref>
In 2017, [[Ikea]] announced Ikea Place app. The app contains a catalogue of over 2,000 products—nearly the company’s full collection of umlauted sofas, armchairs, coffee tables, and storage units which one can place anywhere in a room with their phone.<ref name="Wired.com">{{cite web | url = https://www.wired.com/story/ikea-place-ar-kit-augmented-reality/ | title = IKEA's new app flaunts what you'll love most about AR| publisher = Arielle Pardes | accessdate = 20 September 2017}}</ref>


A Danish artist, [[Olafur Eliasson]], has placed objects like burning suns, extraterrestrial rocks, and rare animals, into the user's environment.<ref>{{Cite web|url=https://www.dezeen.com/2020/05/14/olafur-eliasson-augmented-reality-wunderkammer/|title=Olafur Eliasson creates augmented-reality cabinet of curiosities|date=14 May 2020|language=en-US|access-date=2020-05-17}}</ref> [[Martin & Muñoz]] started using Augmented Reality (AR) technology in 2020 to create and place virtual works, based on their snow globes, in their exhibitions and in user's environments. Their first AR work was presented at the Cervantes Institute in New York in early 2022.<ref>{{Cite web|url=https://www.spainculture.us/city/new-york/walter-martin-paloma-munoz-the-houses-are-blind-but-the-trees-can-see/|title=The Houses are Blind but the Trees Can See|date= March 2022|language=en-US|access-date=2023-02-07}}</ref>
In 2018, [[Apple]] announced USDZ AR file support for iPhones and iPads with iOS12. Apple has created an AR QuickLook Gallery that allows masses experience Augmented reality on their own Apple device.<ref name="ComputerWorld.com">{{cite web | url = https://www.computerworld.com/article/3307437/mobile-wireless/this-small-ios-12-feature-is-the-birth-of-a-whole-industry.html | title = This small iOS 12 feature is the birth of a whole industry | publisher = Jonny Evans | accessdate = 19 September 2018}}</ref>


{{Further|topic=the 2004 augmented reality outdoor art project|LifeClipper}}
In 2018, [[Shopify]], the Canadian commerce company, announced ARkit2 integrations and their merchants are able to use the tools to upload 3D models of their products, which users will be able to tap on the goods inside Safari to view in their real-world environments.<ref name="Techcrunch.com">{{cite web | url = https://techcrunch.com/2018/09/17/shopify-is-bringing-apples-latest-ar-tech-to-their-platform/ | title = Shopify is bringing Apple’s latest AR tech to their platform | publisher = Lucas Matney | accessdate = 3 December 2018}}</ref>


===Education===
In educational settings, AR has been used to complement a standard curriculum. Text, graphics, video, and audio may be superimposed into a student's real-time environment. Textbooks, flashcards and other educational reading material may contain embedded "[[fiducial marker|markers]]" or triggers that, when scanned by an AR device, produce supplementary information for the student rendered in a multimedia format.<ref>[http://www.prweb.com/releases/2011/10/prweb8899908.htm Groundbreaking Augmented Reality-Based Reading Curriculum Launches], ''PRweb'', 23 October 2011.</ref><ref>Stewart-Smith, Hanna. [http://www.zdnet.com/blog/asia/education-with-augmented-reality-ar-textbooks-released-in-japan-video/1541 Education with Augmented Reality: AR textbooks released in Japan], ''ZDnet'', 4 April 2012.</ref><ref>[http://smarterlearning.wordpress.com/2011/11/10/augmented-reality-in-education/ Augmented reality in education] ''smarter learning''.</ref> This makes AR a good alternative method for presenting information, to which multimedia learning theory can be applied.<ref>{{Cite journal|last=Santos|first=M. E. C.|last2=Chen|first2=A.|last3=Taketomi|first3=T.|last4=Yamamoto|first4=G.|last5=Miyazaki|first5=J.|last6=Kato|first6=H.|date=January 2014|title=Augmented Reality Learning Experiences: Survey of Prototype Design and Evaluation|url=http://ieeexplore.ieee.org/document/6681863/|journal=IEEE Transactions on Learning Technologies|volume=7|issue=1|pages=38–56|doi=10.1109/tlt.2013.37|issn=1939-1382}}</ref>

As AR has evolved, students can participate interactively and engage with knowledge more authentically. Instead of remaining passive recipients, students can become active learners, able to interact with their learning environment. Computer-generated simulations of historical events allow students to explore and learn the details of each significant area of the event site.<ref>Lubrecht, Anna. [http://digitalunion.osu.edu/2012/04/24/augmented-reality-for-education/ Augmented Reality for Education] ''The Digital Union'', The Ohio State University 24 April 2012.</ref>

In higher education, Construct3D, a Studierstube system, allows students to learn mechanical engineering concepts, math or geometry.<ref>{{Cite web |url=http://acdc.sav.us.es/pixelbit/images/stories/p41/15.pdf |title=Archived copy |access-date=19 June 2014 |archive-url=https://web.archive.org/web/20150417053823/http://acdc.sav.us.es/pixelbit/images/stories/p41/15.pdf |archive-date=17 April 2015 |dead-url=yes |df=dmy-all }}</ref> Chemistry AR apps allow students to visualize and interact with the spatial structure of a molecule using a marker object held in the hand.<ref>Maier, Patrick; Tönnis, Marcus; Klinker, Gudron. [http://ar.in.tum.de/pub/maierp2009ijas/maierp2009ijas.pdf Augmented Reality for teaching spatial relations], ''Conference of the International Journal of Arts & Sciences (Toronto 2009'').</ref> Others have used HP Reveal, a free app, to create AR notecards for studying organic chemistry mechanisms or to create virtual demonstrations of how to use laboratory instrumentation.<ref>{{Cite journal|title=A Simple and Practical Method for Incorporating Augmented Reality into the Classroom and Laboratory|last=Plunkett|first=Kyle|date=2018-09-27|doi=10.26434/chemrxiv.7137827.v1}}</ref> Anatomy students can visualize different systems of the human body in three dimensions.<ref>{{cite web|url=https://www.vuforia.com/case-studies/anatomy-4d |title=Anatomy 4D – Qualcomm |work=Qualcomm |accessdate=2 July 2015 |deadurl=yes |archiveurl=https://web.archive.org/web/20160311085744/http://vuforia.com/case-studies/anatomy-4d |archivedate=11 March 2016 |df=dmy }}</ref>

Primary school children learn easily from interactive experiences. As an example, astronomical constellations and the movements of objects in the solar system were oriented in 3D and overlaid in the direction the device was held, and expanded with supplemental video information. Paper-based science book illustrations could seem to come alive as video without requiring the child to navigate to web-based materials.

In 2013, a project was launched on [[Kickstarter]] to teach about electronics with an educational toy that allowed children to scan their circuit with an iPad and see the electric current flowing around.<ref>{{Cite web|url=https://circuits.lightup.io/|title=LightUp - An award-winning toy that teaches kids about circuits and coding|website=LightUp|language=en-US|access-date=29 August 2018|archive-url=https://web.archive.org/web/20180829110100/https://circuits.lightup.io/|archive-date=29 August 2018|url-status=dead}}</ref> While some educational apps were available for AR by 2016, it was not broadly used. Apps that leverage augmented reality to aid learning included SkyView for studying astronomy,<ref>{{Cite web|title = Terminal Eleven: SkyView – Explore the Universe|url = http://www.terminaleleven.com/skyview/iphone/|website = www.terminaleleven.com|access-date = 15 February 2016}}</ref> AR Circuits for building simple electric circuits,<ref>{{Cite web|title = AR Circuits – Augmented Reality Electronics Kit|url = http://arcircuits.com|website = arcircuits.com|access-date = 15 February 2016}}</ref> and SketchAR for drawing.<ref>{{Cite web|url=http://sketchar.tech|title=SketchAR - start drawing easily using augmented reality|website=sketchar.tech|access-date=20 May 2018}}</ref>

AR would also be a way for parents and teachers to achieve their goals for modern education, which might include providing more individualized and flexible learning, making closer connections between what is taught at school and the real world, and helping students to become more engaged in their own learning.

One study compared the functionalities of augmented reality frameworks with potential for the development of educational applications.<ref>Herpich, F.; Guarese, R. L. M.; Tarouco, L. R. [https://www.researchgate.net/publication/318707271_A_Comparative_Analysis_of_Augmented_Reality_Frameworks_Aimed_at_the_Development_of_Educational_Applications A Comparative Analysis of Augmented Reality Frameworks Aimed at the Development of Educational Applications], ''Creative Education'', 08(09):1433-1451</ref>

===Fitness===
AR hardware and software for use in fitness includes [[smart glasses]] made for biking and running, with performance analytics and map navigation projected onto the user's field of vision,<ref>{{Cite web|title=Augmented Reality (AR) vs. virtual reality (VR): What's the Difference?|url=https://www.pcmag.com/news/augmented-reality-ar-vs-virtual-reality-vr-whats-the-difference|access-date=2020-11-06|website=PCMAG|language=en}}</ref> and boxing, martial arts, and tennis, where users remain aware of their physical environment for safety.<ref>{{Cite web|author=Sandee LaMotte|title=The very real health dangers of virtual reality|url=https://www.cnn.com/2017/12/13/health/virtual-reality-vr-dangers-safety/index.html|access-date=2020-11-06|website=CNN|date=13 December 2017}}</ref> Fitness-related games and software include [[Pokémon Go]] and [[Jurassic World Alive]].<ref>{{Cite web|last=Thier|first=Dave|title='Jurassic World Alive' Makes Two Big Improvements Over 'Pokémon GO'|url=https://www.forbes.com/sites/davidthier/2018/06/04/jurassic-world-alive-makes-two-big-improvements-over-pokemon-go/|access-date=2020-11-06|website=Forbes|language=en}}</ref>

===Human–computer interaction===
[[Human–computer interaction]] (HCI) is an interdisciplinary area of computing that deals with the design and implementation of systems that interact with people. Researchers in HCI come from a number of disciplines, including computer science, engineering, design, human factors, and social science, with a shared goal to solve problems in the design and the use of technology so that it can be used more easily, effectively, efficiently, safely, and with satisfaction.<ref>{{cite web |title=Research Human Computer Interaction (HCI), Virtual and Augmented Reality, Wearable Technologies |url=https://www.cs.nycu.edu.tw/research/human-computer-interaction-virtual-and-augmented-reality-wearable-technology?locale=en |website=cs.nycu.edu.tw |access-date=28 March 2021}}</ref>

A 2017 article in ''[[Time (magazine)|Time]]'' predicted that within about 15 to 20 years augmented reality and virtual reality would become the primary way people interact with computers.<ref>{{Cite magazine|url=https://time.com/4654944/this-technology-could-replace-the-keyboard-and-mouse/|title=This Technology Could Replace the Keyboard and Mouse|last=Bajarin|first=Tim|magazine=[[Time (magazine)|Time]]|date=31 January 2017 |access-date=19 June 2019}}</ref>

===Remote collaboration===

===Emergency management/search and rescue===
Augmented reality systems are used in [[Public security|public safety]] situations, from [[Superstorm|super storms]] to suspects at large.

As early as 2009, two articles from ''Emergency Management'' magazine discussed AR technology for emergency management. The first was "Augmented Reality—Emerging Technology for Emergency Management", by Gerald Baron.<ref>"Augmented Reality—Emerging Technology for Emergency Management", ''Emergency Management'' 24 September 2009.</ref> According to Adam Crowe: "Technologies like augmented reality (ex: Google Glass) and the growing expectation of the public will continue to force professional emergency managers to radically shift when, where, and how technology is deployed before, during, and after disasters."<ref>"What Does the Future Hold for Emergency Management?", Emergency Management Magazine, 8 November 2013</ref> A 2018 article in the ''Journal of Agromedicine'' described lessons learned from the development of an augmented reality prototype app designed to assist emergency responders in rural and agricultural environments.<ref>{{Cite journal|last=Weichelt|first=Bryan|last2=Yoder|first2=Aaron|last3=Bendixsen|first3=Casper|last4=Pilz|first4=Matthew|last5=Minor|first5=Gerald|last6=Keifer|first6=Matthew|date=2018-07-03|title=Augmented Reality Farm MAPPER Development: Lessons Learned from an App Designed to Improve Rural Emergency Response|url=https://www.tandfonline.com/doi/full/10.1080/1059924X.2018.1470051|journal=Journal of Agromedicine|language=en|volume=23|issue=3|pages=284–296|doi=10.1080/1059924x.2018.1470051|pmid=30047852|issn=1059-924X}}</ref> The article also describes the application's use as a training tool with rural firefighters, part of a [[National Farm Medicine Center]] research project on [[Agricultural safety and health|agricultural safety]] in [[Wisconsin]] and [[Minnesota]].


Another early example was a search aircraft looking for a lost hiker in rugged mountain terrain. Augmented reality systems provided aerial camera operators with a geographic awareness of forest road names and locations blended with the camera video. The camera operator was better able to search for the hiker knowing the geographic context of the camera image. Once located, the operator could more efficiently direct rescuers to the hiker's location because the geographic position and reference landmarks were clearly labeled.<ref>{{cite thesis |type=Master's thesis |last1=Cooper |first1=Joseph |title=Supporting Flight Control for UAV-Assisted Wilderness Search and Rescue Through Human Centered Interface Design |publisher=Brigham Young University |date=15 November 2007 |url=https://scholarsarchive.byu.edu/etd/1217/ }}</ref>
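The labeling step in such a system amounts to projecting geo-referenced points into the camera image. The sketch below is illustrative only; it uses a simplified flat-earth local frame and an ideal pinhole camera with made-up coordinates, and is not taken from the system described above, which would also model lens distortion, terrain elevation and full aircraft attitude:

<syntaxhighlight lang="python">
import math

# Illustrative sketch: label a geo-referenced landmark (e.g. a forest road
# junction) in an aerial camera image. All coordinates and camera values are
# hypothetical.

EARTH_R = 6_371_000.0  # metres

def geo_to_local(lat, lon, alt, cam_lat, cam_lon, cam_alt):
    """Landmark position in metres relative to the camera (east, north, up)."""
    east = math.radians(lon - cam_lon) * EARTH_R * math.cos(math.radians(cam_lat))
    north = math.radians(lat - cam_lat) * EARTH_R
    up = alt - cam_alt
    return east, north, up

def project(east, north, up, yaw_deg, pitch_deg, f_px, cx, cy):
    """Project the local point into pixel coordinates for a camera whose
    optical axis points along the given yaw (from north) and pitch (down)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Camera axes in the east/north/up frame: forward, right, and image-down.
    fwd_e, fwd_n, fwd_u = math.sin(yaw) * math.cos(pitch), math.cos(yaw) * math.cos(pitch), -math.sin(pitch)
    right_e, right_n, right_u = math.cos(yaw), -math.sin(yaw), 0.0
    # Image-down completes the right-handed frame: down = forward x right.
    down_e = fwd_n * right_u - fwd_u * right_n
    down_n = fwd_u * right_e - fwd_e * right_u
    down_u = fwd_e * right_n - fwd_n * right_e
    x = east * right_e + north * right_n + up * right_u
    y = east * down_e + north * down_n + up * down_u
    z = east * fwd_e + north * fwd_n + up * fwd_u
    if z <= 0:
        return None  # landmark is behind the camera
    return cx + f_px * x / z, cy + f_px * y / z

e, n, u = geo_to_local(47.6100, -122.20, 300.0, 47.6000, -122.21, 1500.0)
print(project(e, n, u, yaw_deg=35.0, pitch_deg=60.0, f_px=1200.0, cx=960.0, cy=540.0))
</syntaxhighlight>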


===Social interaction===
AR can be used to facilitate social interaction. An augmented reality social network framework called Talk2Me enables people to disseminate information and view others' advertised information in an augmented reality way. The timely and dynamic information sharing and viewing functionalities of Talk2Me help initiate conversations and make friends for users with people in physical proximity.<ref>{{cite book |doi=10.1109/PERCOM.2018.8444578 |chapter=Talk2Me: A Framework for Device-to-Device Augmented Reality Social Network |title=2018 IEEE International Conference on Pervasive Computing and Communications (Per ''Com'') |pages=1–10 |year=2018 |last1=Shu |first1=Jiayu |last2=Kosta |first2=Sokol |last3=Zheng |first3=Rui |last4=Hui |first4=Pan |s2cid=44017349 |isbn=978-1-5386-3224-6 }}</ref> However, use of an AR headset can inhibit the quality of an interaction between two people if one of them is not wearing one, as the headset can become a distraction.<ref>{{cite web |title=Effects of Augmented Reality on Social Interactions |url=https://www.electronicsdiary.com/2019/05/effects-of-augmented-reality-on-social.html |website=Electronics Diary|date=27 May 2019 }}</ref>
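The proximity aspect of such sharing can be reduced to filtering advertised messages by physical distance. The following sketch is illustrative only; Talk2Me itself works device-to-device rather than from a central list, and the names, coordinates and radius here are hypothetical:

<syntaxhighlight lang="python">
import math

# Illustrative sketch: only show information advertised by people who are
# physically nearby. All data below is made up.

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in metres (haversine)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_messages(me, peers, radius_m=50.0):
    """Return advertised messages from peers within radius_m of the viewer."""
    return [p["message"] for p in peers
            if distance_m(me["lat"], me["lon"], p["lat"], p["lon"]) <= radius_m]

peers = [{"lat": 48.8584, "lon": 2.2945, "message": "Hi, I'm into AR art"},
         {"lat": 48.8606, "lon": 2.3376, "message": "Too far away to be shown"}]
print(nearby_messages({"lat": 48.8583, "lon": 2.2944}, peers))
</syntaxhighlight>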


Augmented reality also gives users the ability to practice different forms of social interactions with other people in a safe, risk-free environment. Hannes Kaufmann, Associate Professor for virtual reality at TU [[Vienna]], says: "In collaborative augmented reality multiple users may access a shared space populated by virtual objects, while remaining grounded in the real world. This technique is particularly powerful for educational purposes when users are collocated and can use natural means of communication (speech, gestures, etc.), but can also be mixed successfully with immersive VR or remote collaboration."{{quote without source|date=October 2019}} Kaufmann cites [[education]] as a potential use of this technology.


===Video games===
{{redirect-distinguish|Augmented reality game|alternate reality game}}
{{Redirect|AR games|the Nintendo 3DS game|AR Games{{!}}''AR Games''}}
{{See also|List of augmented reality video games}}
[[File:Desjardins AR Augmented Reality Game, March 2013.png|thumb|left|upright|alt= An image from an AR mobile game | An AR mobile game using a trigger image as [[fiducial marker]]]]


The gaming industry embraced AR technology. A number of games were developed for prepared indoor environments, such as AR air hockey, ''Titans of Space'', collaborative combat against virtual enemies, and AR-enhanced pool table games.<ref>Hawkins, Mathew. [http://www.gamesetwatch.com/2011/10/augmented_reality_used_to_enhance_both_pool_and_air_hockey.php Augmented Reality Used To Enhance Both Pool And Air Hockey] ''Game Set Watch''15 October 2011.</ref><ref>[http://combathelo.blogspot.com/2012/07/one-week-only-augmented-reality-project.html One Week Only – Augmented Reality Project] {{webarchive |url=https://web.archive.org/web/20131106180740/http://combathelo.blogspot.com/2012/07/one-week-only-augmented-reality-project.html |date=6 November 2013 }} ''Combat-HELO Dev Blog'' 31 July 2012.</ref><ref>{{Cite web |url=http://getandroidstuff.com/best-augmented-reality-apps-vr-games-android/ |title=Best VR, Augmented Reality apps & games on Android |access-date=14 February 2017 |archive-url=https://web.archive.org/web/20170215114103/http://getandroidstuff.com/best-augmented-reality-apps-vr-games-android/ |archive-date=15 February 2017 |url-status=dead }}</ref>


In 2010, Ogmento became the first AR gaming startup to receive VC funding. The company went on to produce early location-based AR games, including ''Paranormal Activity: Sanctuary'', ''NBA: King of the Court'', and ''Halo: King of the Hill''. The company's computer vision technology was eventually repackaged and sold to Apple, becoming a major contribution to ARKit.<ref>{{cite web | url=https://techcrunch.com/2010/05/26/ogmento-first-ar-gaming-startup-to-win-vc-funding/ | title=Ogmento First AR Gaming Startup to Win VC Funding | date=26 May 2010 }}</ref>


Augmented reality allows video game players to experience digital game play in a real-world environment. [[Niantic, Inc.|Niantic]] released the augmented reality mobile game ''Pokémon Go''.<ref>{{cite web|last1=Swatman|first1=Rachel|title=Pokémon Go catches five new world records|url=http://www.guinnessworldrecords.com/news/2016/8/pokemon-go-catches-five-world-records-439327|publisher=[[Guinness World Records]]|access-date=28 August 2016|date=10 August 2016}}</ref> [[Disney]] has partnered with [[Lenovo]] to create the augmented reality game ''[[Star Wars]]: Jedi Challenges'' that works with a Lenovo Mirage AR headset, a tracking sensor and a [[Lightsaber]] controller, scheduled to launch in December 2017.<ref>{{Cite web | url=https://www.cnbc.com/2017/08/31/star-wars-jedi-challenges-augmented-reality-game-launches-with-lenovo-mirage-headset.html | title='Star Wars' augmented reality game that lets you be a Jedi launched| website=[[CNBC]]| date=31 August 2017}}</ref>


{{clear left}}
===Industrial design===


{{Main|Industrial augmented reality}}


AR allows industrial designers to experience a product's design and operation before completion. [[Volkswagen]] has used AR for comparing calculated and actual crash test imagery.<ref>{{cite book |doi=10.1109/ISMAR.2002.1115108 |chapter=Stereo augmentation of simulation results on a projection wall by combining two basic ARVIKA systems |title=Proceedings. International Symposium on Mixed and Augmented Reality |pages=271–322 |year=2002 |last1=Noelle |first1=S. |isbn=0-7695-1781-1 |citeseerx=10.1.1.121.1268 |s2cid=24876142 }}</ref> AR has been used to visualize and modify car body structure and engine layout. It has also been used to compare digital mock-ups with physical mock-ups to find discrepancies between them.<ref>{{cite web|last1=Verlinden |first1=Jouke |last2=Horvath |first2=Imre |title=Augmented Prototyping as Design Means in Industrial Design Engineering |publisher=[[Delft University of Technology]] |url=http://www.io.tudelft.nl/index.php?id=24954&L=1 |access-date=7 October 2012 |url-status=dead |archive-url=https://web.archive.org/web/20130616010611/http://www.io.tudelft.nl/index.php?id=24954&L=1 |archive-date=16 June 2013 |df=dmy }}</ref><ref>{{cite news |last1=Pang |first1=Y. |last2=Nee |first2=Andrew Y. C. |last3=Youcef-Toumi |first3=Kamal |last4=Ong |first4=S. K. |last5=Yuan |first5=M. L. |title=Assembly Design and Evaluation in an Augmented Reality Environment |date=January 2005 |hdl=1721.1/7441 }}</ref>
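The digital-versus-physical comparison can be pictured as measuring how far points sampled on the physical mock-up lie from the digital model and flagging deviations above a tolerance. The sketch below is illustrative only, with hypothetical point data and tolerance, and is not the method used by any of the systems cited above:

<syntaxhighlight lang="python">
# Illustrative sketch: flag where a scanned physical prototype departs from
# the digital (CAD) model by more than a tolerance. Data and tolerance are
# hypothetical.

def nearest_distance(p, model_points):
    """Distance from point p to the closest point of the digital model."""
    return min(((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 + (p[2] - q[2]) ** 2) ** 0.5
               for q in model_points)

def discrepancies(scanned, model_points, tolerance=0.002):
    """Return (point, deviation) pairs where the scan departs from the model
    by more than the tolerance (units: metres)."""
    report = []
    for p in scanned:
        d = nearest_distance(p, model_points)
        if d > tolerance:
            report.append((p, d))
    return report

cad = [(x / 100.0, 0.0, 0.0) for x in range(100)]                       # a straight edge
scan = [(x / 100.0, 0.003 if x == 50 else 0.0005, 0.0) for x in range(100)]
print(discrepancies(scan, cad))
</syntaxhighlight>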


===Healthcare planning, practice and education===


One of the first applications of augmented reality was in healthcare, particularly to support the planning, practice, and training of surgical procedures. As far back as 1992, enhancing human performance during surgery was a formally stated objective when building the first augmented reality systems at U.S. Air Force laboratories.<ref name="B. Rosenberg 1992"/> Since 2005, a device called a [[near-infrared vein finder]] that films subcutaneous veins, processes and projects the image of the veins onto the skin has been used to locate veins.<ref>{{cite journal|title=Vein imaging: a new method of near infrared imaging, where a processed image is projected onto the skin for the enhancement of vein treatment |vauthors=Miyake RK, etal |s2cid=8872471 |pmid=16918565 | doi=10.1111/j.1524-4725.2006.32226.x |volume=32 |issue=8 |journal=[[Dermatol Surg]] |pages=1031–8|year=2006 }}</ref><ref>{{cite news| url=http://www.economist.com/node/10202623 | newspaper=[[The Economist]] | title=Reality_Only_Better | date=8 December 2007}}</ref> AR provides surgeons with patient monitoring data in the style of a fighter pilot's heads-up display, and allows patient imaging records, including functional videos, to be accessed and overlaid. Examples include a virtual [[X-ray]] view based on prior [[tomography]] or on real-time images from [[ultrasound]] and [[confocal microscopy]] probes,<ref>{{cite book |doi=10.1007/978-3-642-04268-3_60 |pmid=20426023 |chapter=Optical Biopsy Mapping for Minimally Invasive Cancer Screening |title=Medical Image Computing and Computer-Assisted Intervention – MICCAI 2009 |volume=5761 |issue=Pt 1 |pages=483–490 |series=Lecture Notes in Computer Science |year=2009 |last1=Mountney |first1=Peter |last2=Giannarou |first2=Stamatia |last3=Elson |first3=Daniel |last4=Yang |first4=Guang-Zhong |isbn=978-3-642-04267-6 }}</ref> visualizing the position of a tumor in the video of an [[endoscope]],<ref>{{youTube|4emmCcBb4s|Scopis Augmented Reality: Path guidance to craniopharyngioma}}</ref> or radiation exposure risks from X-ray imaging devices.<ref>{{cite book |doi=10.1007/978-3-319-10404-1_52 |pmid=25333145 |chapter=3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose |title=Medical Image Computing and Computer-Assisted Intervention MICCAI 2014 |volume=8673 |issue=Pt 1 |pages=415–422 |series=Lecture Notes in Computer Science |year=2014 |last1=Loy Rodas |first1=Nicolas |last2=Padoy |first2=Nicolas |isbn=978-3-319-10403-4 |s2cid=819543 }}</ref><ref>{{youTube|pINE2gaOVOY|3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose}}</ref> AR can enhance viewing a [[fetus]] inside a mother's [[womb]].<ref>{{cite web |url=http://www.cs.unc.edu/Research/us/ |title=UNC Ultrasound/Medical Augmented Reality Research |access-date=6 January 2010 |archive-url=https://web.archive.org/web/20100212231230/http://www.cs.unc.edu/Research/us/ |archive-date=12 February 2010 |url-status=live}}</ref> Siemens, Karl Storz and IRCAD have developed a system for [[Laparoscopy|laparoscopic]] liver surgery that uses AR to view sub-surface tumors and vessels.<ref>{{cite book |doi=10.1007/978-3-319-10404-1_53 |pmid=25333146 |chapter=An Augmented Reality Framework for Soft Tissue Surgery |title=Medical Image Computing and Computer-Assisted Intervention – MICCAI 2014 |volume=8673 |issue=Pt 1 |pages=423–431 |series=Lecture Notes in Computer Science |year=2014 |last1=Mountney |first1=Peter |last2=Fallert |first2=Johannes |last3=Nicolau |first3=Stephane |last4=Soler 
|first4=Luc |last5=Mewes |first5=Philip W. |isbn=978-3-319-10403-4 }}</ref>
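Once a pre-operative image has been registered to the live view, overlaying it is essentially an alpha blend of the two images. The following sketch is illustrative only, uses synthetic stand-in images, and omits the registration step that the systems described above actually depend on:

<syntaxhighlight lang="python">
import numpy as np

# Illustrative sketch of the "virtual X-ray" overlay idea: blend an already
# registered pre-operative image over a live video frame so sub-surface
# structures appear in place. Arrays here are synthetic stand-ins.

def blend(frame, registered_overlay, alpha=0.35):
    """Blend an already-registered overlay image onto a video frame.
    Both arrays are H x W x 3, uint8; alpha controls overlay opacity."""
    out = (1.0 - alpha) * frame.astype(np.float32) + alpha * registered_overlay.astype(np.float32)
    return out.clip(0, 255).astype(np.uint8)

frame = np.full((480, 640, 3), 120, dtype=np.uint8)        # stand-in camera frame
overlay = np.zeros((480, 640, 3), dtype=np.uint8)
overlay[200:280, 300:340] = (255, 255, 255)                 # stand-in structure outline
print(blend(frame, overlay).shape)
</syntaxhighlight>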
AR has been used for cockroach phobia treatment<ref>{{cite journal |last1=Botella |first1=Cristina |last2=Bretón-López |first2=Juani |last3=Quero |first3=Soledad |last4=Baños |first4=Rosa |last5=García-Palacios |first5=Azucena |title=Treating Cockroach Phobia With Augmented Reality |journal=Behavior Therapy |date=September 2010 |volume=41 |issue=3 |pages=401–413 |doi=10.1016/j.beth.2009.07.002 |pmid=20569788 |s2cid=29889630 }}</ref> and to reduce the fear of spiders.<ref>{{Cite journal|last1=Zimmer|first1=Anja|last2=Wang|first2=Nan|last3=Ibach|first3=Merle K.|last4=Fehlmann|first4=Bernhard|last5=Schicktanz|first5=Nathalie S.|last6=Bentz|first6=Dorothée|last7=Michael|first7=Tanja|last8=Papassotiropoulos|first8=Andreas|last9=de Quervain|first9=Dominique J. F.|date=2021-08-01|title=Effectiveness of a smartphone-based, augmented reality exposure app to reduce fear of spiders in real-life: A randomized controlled trial|journal=Journal of Anxiety Disorders|language=en|volume=82|pages=102442|doi=10.1016/j.janxdis.2021.102442|pmid=34246153|s2cid=235791626|issn=0887-6185|doi-access=free}}</ref> Patients wearing augmented reality glasses can be reminded to take medications.<ref>{{cite web | url = http://www.healthtechevent.com/technology/augmented-reality-revolutionizing-medicine-healthcare/ | title = Augmented Reality Revolutionizing Medicine | publisher = Health Tech Event | access-date = 9 October 2014 | date = 6 June 2014 | archive-date = 12 October 2014 | archive-url = https://web.archive.org/web/20141012184851/http://www.healthtechevent.com/technology/augmented-reality-revolutionizing-medicine-healthcare/ | url-status = dead }}</ref> Augmented reality can be very helpful in the medical field.<ref>{{Cite journal|last=Thomas|first=Daniel J.|date=December 2016|title=Augmented reality in surgery: The Computer-Aided Medicine revolution|journal=International Journal of Surgery |volume=36|issue=Pt A|pages=25|doi=10.1016/j.ijsu.2016.10.003|issn=1743-9159|pmid=27741424|doi-access=free}}</ref> It could be used to provide crucial information to a doctor or surgeon without having them take their eyes off the patient. On 30 April 2015 Microsoft announced the [[Microsoft HoloLens]], their first attempt at augmented reality. The HoloLens has advanced through the years and is capable of projecting holograms for near infrared fluorescence based image guided surgery.<ref>{{Cite book|last1=Cui|first1=Nan|last2=Kharel|first2=Pradosh|last3=Gruev|first3=Viktor|s2cid=125528534|date=8 February 2017|title=Augmented reality with Microsoft HoloLens holograms for near-infrared fluorescence based image guided surgery|publisher=International Society for Optics and Photonics|volume=10049|pages=100490I|doi=10.1117/12.2251625|series=Molecular-Guided Surgery: Molecules, Devices, and Applications III|chapter=Augmented reality with Microsoft Holo ''Lens'' holograms for near-infrared fluorescence based image guided surgery|editor1-last=Pogue|editor1-first=Brian W|editor2-last=Gioux|editor2-first=Sylvain}}</ref> As augmented reality advances, it finds increasing applications in healthcare. 
Augmented reality and similar computer based-utilities are being used to train medical professionals.<ref>{{cite journal |last1=Moro |first1=C |last2=Birt |first2=J |last3=Stromberga |first3=Z |last4=Phelps |first4=C |last5=Clark |first5=J |last6=Glasziou |first6=P |last7=Scott |first7=AM |title=Virtual and Augmented Reality Enhancements to Medical and Science Student Physiology and Anatomy Test Performance: A Systematic Review and Meta-Analysis. |journal=Anatomical Sciences Education |date=May 2021 |volume=14 |issue=3 |pages=368–376 |doi=10.1002/ase.2049 |pmid=33378557|s2cid=229929326 |url=https://research.bond.edu.au/en/publications/63e5a776-f3fd-48f2-b0ba-f47ca4ca96e2 }}</ref><ref>{{Cite journal|last1=Barsom|first1=E. Z.|last2=Graafland|first2=M.|last3=Schijven|first3=M. P.|date=1 October 2016|title=Systematic review on the effectiveness of augmented reality applications in medical training|journal=Surgical Endoscopy|language=en|volume=30|issue=10|pages=4174–4183|doi=10.1007/s00464-016-4800-6|pmid=26905573|issn=0930-2794|pmc=5009168}}</ref> In healthcare, AR can be used to provide guidance during diagnostic and therapeutic interventions e.g. during surgery. Magee et al.,<ref>{{Cite journal|last1=Magee|first1=D.|last2=Zhu|first2=Y.|last3=Ratnalingam|first3=R.|last4=Gardner|first4=P.|last5=Kessel|first5=D.|date=1 October 2007|title=An augmented reality simulator for ultrasound guided needle placement training|journal=Medical & Biological Engineering & Computing|language=en|volume=45|issue=10|pages=957–967|doi=10.1007/s11517-007-0231-9|pmid=17653784|s2cid=14943048|issn=1741-0444|url=http://eprints.whiterose.ac.uk/75786/8/Combine.pdf}}</ref> for instance, describe the use of augmented reality for medical training in simulating ultrasound-guided needle placement. Similarly, Javaid, Mohd, Haleem, and Abid found that virtual reality provided medical students' brains with an experience that simulates motion and the surgery experience. <ref>{{Cite journal |last1=Javaid |first1=Mohd |last2=Haleem |first2=Abid |date=2020-06-01 |title=Virtual reality applications toward medical field |journal=Clinical Epidemiology and Global Health |volume=8 |issue=2 |pages=600–605 |doi=10.1016/j.cegh.2019.12.010 |issn=2213-3984|doi-access=free }}</ref> A very recent study by Akçayır, Akçayır, Pektaş, and Ocak (2016) revealed that AR technology both improves university students' laboratory skills and helps them to build positive attitudes relating to physics laboratory work.<ref>{{cite journal |last1=Akçayır |first1=Murat |last2=Akçayır |first2=Gökçe |title=Advantages and challenges associated with augmented reality for education: A systematic review of the literature |journal=Educational Research Review |date=February 2017 |volume=20 |pages=1–11 |doi=10.1016/j.edurev.2016.11.002 |s2cid=151764812 }}</ref> Recently, augmented reality began seeing adoption in [[neurosurgery]], a field that requires heavy amounts of imaging before procedures.<ref>{{Cite journal|last1=Tagaytayan|first1=Raniel|last2=Kelemen|first2=Arpad|last3=Sik-Lanyi|first3=Cecilia|title=Augmented reality in neurosurgery|journal=Archives of Medical Science |volume=14|issue=3|pages=572–578|doi=10.5114/aoms.2016.58690|issn=1734-1922|pmc=5949895|pmid=29765445|year=2018}}</ref>


===Spatial immersion and interaction===


Augmented reality applications, running on handheld devices utilized as virtual reality headsets, can also digitize human presence in space and provide a computer generated model of them, in a virtual space where they can interact and perform various actions. Such capabilities are demonstrated by Project Anywhere, developed by a postgraduate student at ETH Zurich, which was dubbed as an "out-of-body experience".<ref>{{cite news | url=https://www.theguardian.com/technology/2015/jan/07/project-anywhere-digital-route-to-an-out-of-body-experience | title=Project Anywhere: digital route to an out-of-body experience | newspaper=[[The Guardian]] | date=7 January 2015 | access-date=21 September 2016 | author=Davis, Nicola}}</ref><ref>{{cite web | url=http://www.euronews.com/2015/02/25/project-anywhere-an-out-of-body-experience-of-a-new-kind | title=Project Anywhere: an out-of-body experience of a new kind | work=Euronews | date=25 February 2015 | access-date=21 September 2016}}</ref><ref>[http://www.studioany.com/#!projectanywhere/c1g1s Project Anywhere] at studioany.com</ref>


===Flight training===
Building on decades of perceptual-motor research in experimental psychology, researchers at the Aviation Research Laboratory of the [[University of Illinois at Urbana–Champaign]] used augmented reality in the form of a flight path in the sky to teach flight students how to land an airplane using a flight simulator. An adaptive augmented schedule in which students were shown the augmentation only when they departed from the flight path proved to be a more effective training intervention than a constant schedule.<ref name="Lintern-1980" /><ref>{{cite journal |last1=Lintern |first1=Gavan |last2=Roscoe |first2=Stanley N. |last3=Sivier |first3=Jonathan E. |title=Display Principles, Control Dynamics, and Environmental Factors in Pilot Training and Transfer |journal=[[Human Factors (journal)|Human Factors]] |date=June 1990 |volume=32 |issue=3 |pages=299–317 |doi=10.1177/001872089003200304 |s2cid=110528421}}</ref> Flight students taught to land in the simulator with the adaptive augmentation learned to land a light aircraft more quickly than students with the same amount of landing training in the simulator but with constant augmentation or without any augmentation.<ref name="Lintern-1980">{{cite journal|last1=Lintern|first1=Gavan|title=Transfer of landing skill after training with supplementary visual cues|journal=Human Factors|date=1980|volume=22|issue=1|pages=81–88|doi=10.1177/001872088002200109|pmid=7364448|s2cid=113087380}}</ref>
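The adaptive schedule described above can be expressed as a simple rule: draw the guidance overlay only while the deviation from the desired path exceeds a tolerance. The sketch below is illustrative only; the tolerance and the sample deviations are hypothetical, not values from the cited studies:

<syntaxhighlight lang="python">
# Illustrative sketch of an adaptive augmentation schedule: the guidance
# overlay (the "flight path in the sky") is drawn only while the trainee
# departs from the desired path by more than a tolerance.

def show_guidance(lateral_error_m, tolerance_m=15.0):
    """Return True when the pathway overlay should be displayed."""
    return abs(lateral_error_m) > tolerance_m

# Simulated lateral deviations (metres) sampled along an approach:
for t, err in enumerate([2.0, 8.0, 21.0, 34.0, 12.0, 4.0]):
    print(t, "overlay on" if show_guidance(err) else "overlay off")
</syntaxhighlight>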


===Military===
[[File:ARC4 AR System.jpg|thumb|alt= Photograph of an Augmented Reality System for Soldier ARC4. |Augmented reality system for soldier ARC4 (U.S. Army 2017)]]
An early application of AR occurred when [[Rockwell International]] created video map overlays of satellite and orbital debris tracks to aid in space observations at Air Force Maui Optical System. In their 1993 paper "Debris Correlation Using the Rockwell WorldView System", the authors describe the use of map overlays applied to video from space surveillance telescopes. The map overlays indicated the trajectories of various objects in geographic coordinates. This allowed telescope operators to identify satellites, and also to identify and catalog potentially dangerous space debris.<ref name="ABER93">Abernathy, M., Houchard, J., Puccetti, M., and Lambert, J., "Debris Correlation Using the Rockwell WorldView System", Proceedings of 1993 Space Surveillance Workshop, 30 March to 1 April 1993, pages 189–195</ref>


Starting in 2003, the US Army integrated the SmartCam3D augmented reality system into the Shadow Unmanned Aerial System to aid sensor operators using telescopic cameras to locate people or points of interest. The system combined fixed geographic information, including street names, points of interest, airports, and railroads, with live video from the camera system. The system offered a "picture in picture" mode that allowed it to show a synthetic view of the area surrounding the camera's field of view. This helps solve a problem in which the field of view is so narrow that it excludes important context, as if "looking through a soda straw". The system displays real-time friend/foe/neutral location markers blended with live video, providing the operator with improved situational awareness.


Researchers at the USAF Research Lab (Calhoun, Draper et al.) found an approximately two-fold increase in the speed at which UAV sensor operators found points of interest using this technology.<ref>Calhoun, G. L., Draper, M. H., Abernathy, M. F., Delgado, F., and Patzek, M. "Synthetic Vision System for Improving Unmanned Aerial Vehicle Operator Situation Awareness," 2005 Proceedings of SPIE Enhanced and Synthetic Vision, Vol. 5802, pp. 219–230.</ref> This ability to maintain geographic awareness measurably improves mission efficiency. The system is in use on the US Army RQ-7 Shadow and the MQ-1C Gray Eagle Unmanned Aerial Systems.
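The georegistered overlays used by these systems can be sketched as a pinhole projection from a local east/north/up frame into pixel coordinates; the camera model and all numbers below are illustrative assumptions rather than the WorldView or SmartCam3D implementation.

<syntaxhighlight lang="python">
import math

def project_point(east, north, up, cam_yaw_deg, cam_pitch_deg,
                  focal_px=1000.0, cx=640.0, cy=360.0):
    """Return (u, v) pixel coordinates of a geo-referenced point given in metres
    relative to the camera, or None if the point lies behind the camera."""
    yaw, pitch = math.radians(cam_yaw_deg), math.radians(cam_pitch_deg)
    fwd_e, fwd_n = math.sin(yaw), math.cos(yaw)        # level forward direction
    x = east * fwd_n - north * fwd_e                   # rightward component (camera x)
    level_fwd = east * fwd_e + north * fwd_n           # forward component before pitch
    z = level_fwd * math.cos(pitch) + up * math.sin(pitch)   # depth along the optical axis
    y = level_fwd * math.sin(pitch) - up * math.cos(pitch)   # downward component (camera y)
    if z <= 0:
        return None
    return (cx + focal_px * x / z, cy + focal_px * y / z)

# A runway threshold 800 m due north and 30 m below a camera looking north, pitched down 2 degrees.
print(project_point(east=0.0, north=800.0, up=-30.0, cam_yaw_deg=0.0, cam_pitch_deg=-2.0))
</syntaxhighlight>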
[[File:Limpid Armor LCG DSS.jpg|thumb|Circular review system of the company LimpidArmor]]
In combat, AR can serve as a networked communication system that renders useful battlefield data onto a soldier's goggles in real time. From the soldier's viewpoint, people and various objects can be marked with special indicators to warn of potential dangers. Virtual maps and 360° view camera imaging can also be rendered to aid a soldier's navigation and battlefield perspective, and this can be transmitted to military leaders at a remote command center.<ref>Cameron, Chris. [http://www.readwriteweb.com/archives/military_grade_augmented_reality_could_redefine_modern_warfare.php Military-Grade Augmented Reality Could Redefine Modern Warfare] ''ReadWriteWeb'' 11 June 2010.</ref> The combination of 360° camera visualization and AR can be used on board combat vehicles and tanks as a [[circular review system]].


AR can be an effective tool for virtually mapping the 3D topology of munition storage sites in the terrain, including the choice of munition combinations in stacks, the distances between them, and a visualization of risk areas.<ref name=AI>{{cite news |last1=Slyusar |first1=Vadym |title=Augmented reality in the interests of ESMRM and munitions safety |date=19 July 2019 }}</ref>{{unreliable source?|date=October 2019}} The scope of AR applications also includes visualization of data from embedded munitions monitoring sensors.<ref name=AI />
As of 2010, Korean researchers were looking to deploy mine-detecting robots for military use. The proposed design includes a tracked mobile platform able to traverse uneven terrain, including stairs. The robot's mine detection sensor would combine metal detectors and ground-penetrating radar to locate mines or IEDs.<ref name="ieeexplore-ieee-org.mutex.gmu.edu">Kang, Seong Pal; Choi, Junho; Suh, Seung-Beum; Kang, Sungchul. [https://ieeexplore.ieee.org/document/5679622] Retrieved 30 November 2018.</ref>


===Navigation===




{{See also|Automotive navigation system}}
[[File:LandForm displays landmarks and other indicators during helicopter flight at Yuma Proving Ground..JPG|thumb|alt= Illustration of a LandForm video map overlay marking runways, road, and buildings|LandForm video map overlay marking runways, road, and buildings during 1999 helicopter flight test]]


The [[NASA X-38]] was flown using a hybrid synthetic vision system that overlaid map data on video to provide enhanced navigation for the spacecraft during flight tests from 1998 to 2002. It used the LandForm software, which was useful in times of limited visibility, including an instance when the video camera window frosted over, leaving astronauts to rely on the map overlays.<ref name="DELG99">Delgado, F., Abernathy, M., White J., and Lowrey, B. ''[http://adsabs.harvard.edu/abs/1999SPIE.3691..149D Real-Time 3-D Flight Guidance with Terrain for the X-38]'', SPIE Enhanced and Synthetic Vision 1999, Orlando Florida, April 1999, Proceedings of the SPIE Vol. 3691, pages 149–156</ref> The LandForm software was also test flown at the Army [[Yuma Proving Ground]] in 1999. In the photo at right one can see the map markers indicating runways, the air traffic control tower, taxiways, and hangars overlaid on the video.<ref name="DELG00">Delgado, F., Altman, S., Abernathy, M., White, J. ''[http://adsabs.harvard.edu/abs/2000SPIE.4023...63D Virtual Cockpit Window for the X-38]'', SPIE Enhanced and Synthetic Vision 2000, Orlando Florida, Proceedings of the SPIE Vol. 4023, pages 63–70</ref>


AR can augment the effectiveness of navigation devices. Information can be displayed on an automobile's windshield indicating destination directions, weather, terrain, road conditions and traffic information, as well as alerts to potential hazards in the vehicle's path.<ref>[https://techcrunch.com/2010/03/17/gms-enhanced-vision-system-brings-augmented-reality-to-vehicle-huds/ GM's Enhanced Vision System]. Techcrunch.com (17 March 2010). Retrieved 9 June 2012.</ref><ref>Couts, Andrew. [http://www.digitaltrends.com/cars/new-augmented-reality-system-shows-3d-gps-navigation-through-your-windshield/ New augmented reality system shows 3D GPS navigation through your windshield] ''[[Digital Trends]]'', 27 October 2011.</ref><ref>Griggs, Brandon. [http://www.cnn.com/2012/01/13/tech/innovation/ces-future-driving/index.html 'Augmented-reality' windshields and the future of driving] ''CNN Tech'', 13 January 2012.</ref> Since 2012, the Swiss-based company [[WayRay]] has been developing holographic AR navigation systems that use holographic optical elements to project all route-related information, including directions, important notifications, and points of interest, into the driver's line of sight and far ahead of the vehicle.<ref>{{Cite news|url=https://techcrunch.com/2018/01/09/wayrays-ar-in-car-hud-convinced-me-huds-can-be-better/|title=WayRay's AR in-car HUD convinced me HUDs can be better|work=TechCrunch|access-date=3 October 2018|language=en-US}}</ref><ref>{{Cite web|url=http://www.futurecar.com/1013/WayRay-Creates-Holographic-Navigation-Alibaba-Invests-$18-Million|title=WayRay Creates Holographic Navigation: Alibaba Invests $18 Million|last=Walz|first=Eric|date=22 May 2017|website=FutureCar|access-date=2018-10-17}}</ref> Aboard maritime vessels, AR can allow bridge watch-standers to continuously monitor important information such as a ship's heading and speed while moving throughout the bridge or performing other tasks.<ref>{{cite web |url=http://cimsec.org/bridgegoggles/ |title=CIMSEC: Google's AR Goggles|author=Cheney-Peters, Scott|date=12 April 2012 |access-date=20 April 2012}}</ref>


===Workplace===


Augmented reality may have a positive impact on work collaboration as people may be inclined to interact more actively with their learning environment. It may also encourage tacit knowledge renewal which makes firms more competitive. AR was used to facilitate collaboration among distributed team members via conferences with local and virtual participants. AR tasks included brainstorming and discussion meetings utilizing common visualization via touch screen tables, interactive digital whiteboards, shared design spaces and distributed control rooms.<ref>{{cite web |url=http://www.hog3d.net/ |title=Hand of God |author1=Stafford, Aaron |author2=Piekarski, Wayne |author3=Thomas, Bruce H. |access-date=18 December 2009 |archive-url=https://web.archive.org/web/20091207022651/http://www.hog3d.net/ |archive-date=7 December 2009 |url-status=dead |df=dmy-all }}</ref><ref>{{cite journal |last1=Benford |first1=Steve |last2=Greenhalgh |first2=Chris |last3=Reynard |first3=Gail |last4=Brown |first4=Chris |last5=Koleva |first5=Boriana |s2cid=672378 |title=Understanding and constructing shared spaces with mixed-reality boundaries |journal=ACM Transactions on Computer-Human Interaction |date=1 September 1998 |volume=5 |issue=3 |pages=185–223 |doi=10.1145/292834.292836 }}</ref><ref>[http://mi-lab.org/projects/office-of-tomorrow/ Office of Tomorrow] ''Media Interaction Lab''.</ref>


In industrial environments, augmented reality is proving to have a substantial impact, with more and more use cases emerging across all aspects of the product lifecycle, from product design and new product introduction (NPI) to manufacturing, service and maintenance, and material handling and distribution. For example, labels were displayed on parts of a system to clarify operating instructions for a mechanic performing maintenance on a system.<ref>[https://web.archive.org/web/20110511082745/http://ngm.nationalgeographic.com/big-idea/14/augmented-reality-pg1 The big idea: Augmented Reality]. Ngm.nationalgeographic.com (15 May 2012). Retrieved 9 June 2012.</ref><ref>{{cite web |url=http://graphics.cs.columbia.edu/projects/armar/ |title=Augmented Reality for Maintenance and Repair (ARMAR) |author1=Henderson, Steve |author2=Feiner, Steven |access-date=6 January 2010 |archive-date=6 March 2010 |archive-url=https://web.archive.org/web/20100306202422/http://graphics.cs.columbia.edu/projects/armar/ |url-status=dead }}</ref> Assembly lines have benefited from the use of AR. In addition to Boeing, BMW and Volkswagen were known for incorporating this technology into assembly lines for monitoring process improvements.<ref>Sandgren, Jeffrey. [http://brandtechnews.net/tag/augmented-reality/ The Augmented Eye of the Beholder] {{Webarchive|url=https://web.archive.org/web/20130621054848/http://brandtechnews.net/tag/augmented-reality/ |date=21 June 2013 }}, ''BrandTech News'' 8 January 2011.</ref><ref>Cameron, Chris. [http://www.slideshare.net/readwriteweb/augmented-reality-for-marketers-and-developers-analysis-of-the-leaders-the-challenges-and-the-future Augmented Reality for Marketers and Developers], ''ReadWriteWeb''.</ref><ref>Dillow, Clay [http://www.popsci.com/scitech/article/2009-09/bmw-developing-augmented-reality-help-mechanics BMW Augmented Reality Glasses Help Average Joes Make Repairs], ''Popular Science'' September 2009.</ref> Big machines are difficult to maintain because of their multiple layers or structures. AR permits people to look through the machine as if with an x-ray, pointing them to the problem right away.<ref>King, Rachael. [https://web.archive.org/web/20120704074014/http://www.businessweek.com/stories/2009-11-03/augmented-reality-goes-mobilebusinessweek-business-news-stock-market-and-financial-advice Augmented Reality Goes Mobile], ''Bloomberg Business Week Technology'' 3 November 2009.</ref>


As AR technology has evolved and second and third generation AR devices come to market, the impact of AR in enterprise continues to flourish. In the ''Harvard Business Review'', Magid Abraham and Marco Annunziata discuss how AR devices are now being used to "boost workers' productivity on an array of tasks the first time they're used, even without prior training".<ref name="Abraham-2017">{{Cite journal|url=https://hbr.org/2017/03/augmented-reality-is-already-improving-worker-performance|title=Augmented Reality Is Already Improving Worker Performance|last1=Abraham|first1=Magid|last2=Annunziata|first2=Marco|date=13 March 2017|journal=[[Harvard Business Review]]|access-date=13 January 2019}}</ref> They contend that "these technologies increase productivity by making workers more skilled and efficient, and thus have the potential to yield both more economic growth and better jobs".<ref name="Abraham-2017" />


===Broadcast and live events===


Weather visualizations were the first application of augmented reality in television. It has now become common in weathercasting to display full-motion video of images captured in real time from multiple cameras and other imaging devices. Coupled with 3D graphics symbols and mapped to a common virtual geospatial model, these animated visualizations constitute the first true application of AR to TV.


AR has become common in sports telecasting. Sports and entertainment venues are provided with see-through and overlay augmentation through tracked camera feeds for enhanced viewing by the audience. Examples include the yellow "[[first down]]" line seen in television broadcasts of [[American football]] games showing the line the offensive team must cross to receive a first down. AR is also used in association with football and other sporting events to show commercial advertisements overlaid onto the view of the playing area. Sections of [[rugby football|rugby]] fields and [[cricket]] pitches also display sponsored images. Swimming telecasts often add a line across the lanes to indicate the position of the current record holder as a race proceeds to allow viewers to compare the current race to the best performance. Other examples include hockey puck tracking and annotations of racing car performance<ref>Archived at [https://ghostarchive.org/varchive/youtube/20211211/1jQUkqqnZIc Ghostarchive]{{cbignore}} and the [https://web.archive.org/web/20210714184600/https://www.youtube.com/watch?v=1jQUkqqnZIc Wayback Machine]{{cbignore}}: {{Citation|title=Arti AR highlights at SRX -- the first sports augmented reality live from a moving car!| date=14 July 2021 |url=https://www.youtube.com/watch?v=1jQUkqqnZIc|language=en|access-date=2021-07-14}}{{cbignore}}</ref> and snooker ball trajectories.<ref name="recentadvances">[[Azuma, Ronald]]; Balliot, Yohan; Behringer, Reinhold; Feiner, Steven; Julier, Simon; MacIntyre, Blair. [http://www.cc.gatech.edu/~blair/papers/ARsurveyCGA.pdf Recent Advances in Augmented Reality] ''Computers & Graphics'', November 2001.</ref><ref>Marlow, Chris. [http://www.dmwmedia.com/news/2012/04/27/hey-hockey-puck-nhl-preplay-adds-a-second-screen-experience-to-live-games Hey, hockey puck! NHL PrePlay adds a second-screen experience to live games], ''digitalmediawire'' 27 April 2012.</ref>
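A broadcast overlay such as the swimming record line reduces to interpolating the record holder's position along the pool and mapping it into the calibrated camera image; the even-pace assumption, the pixel calibration and the 20.91-second record time below are illustrative.

<syntaxhighlight lang="python">
def record_line_pixel(elapsed_s, record_time_s, pool_length_m=50.0,
                      start_px=120.0, far_wall_px=1160.0):
    """Horizontal pixel position of the record-pace line at a given race time,
    assuming an even pace and a linear mapping along one camera's lane view."""
    record_speed = pool_length_m / record_time_s            # metres per second
    distance_m = min(record_speed * elapsed_s, pool_length_m)
    return start_px + (far_wall_px - start_px) * distance_m / pool_length_m

# Where a 20.91 s record pace would place the line 10 s into a 50 m race.
print(round(record_line_pixel(elapsed_s=10.0, record_time_s=20.91), 1))
</syntaxhighlight>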


AR has been used to enhance concert and theater performances. For example, artists allow listeners to augment their listening experience by adding their performance to that of other bands/groups of users.<ref>{{cite book |doi=10.1109/ART.2002.1107010 |chapter=The Duran Duran project: The augmented reality toolkit in live performance |title=The First IEEE International Workshop Agumented Reality Toolkit |pages=2 |year=2002 |last1=Pair |first1=J. |last2=Wilson |first2=J. |last3=Chastine |first3=J. |last4=Gandy |first4=M. |s2cid=55820154 |isbn=0-7803-7680-3 }}</ref><ref>Broughall, Nick. [http://www.gizmodo.com.au/2009/10/sydney-band-uses-augmented-reality-for-video-clip/ Sydney Band Uses Augmented Reality For Video Clip.] ''Gizmodo'', 19 October 2009.</ref><ref>Pendlebury, Ty. [http://www.cnet.com.au/augmented-reality-in-aussie-film-clip-339299097.htm Augmented reality in Aussie film clip]. ''c|net'' 19 October 2009.</ref>


===Tourism and sightseeing===


Travelers may use AR to access real-time informational displays regarding a location, its features, and comments or content provided by previous visitors. Advanced AR applications include simulations of historical events, places, and objects rendered into the landscape.<ref>Saenz, Aaron [http://singularityhub.com/2009/11/19/augmented-reality-does-time-travel-tourism/ Augmented Reality Does Time Travel Tourism] ''SingularityHUB'' 19 November 2009.</ref><ref>Sung, Dan [http://www.pocket-lint.com/news/38806/augmented-reality-travel-tourism-apps Augmented reality in action travel and tourism] ''Pocket-lint'' 2 March 2011.</ref><ref>Dawson, Jim [http://www.livescience.com/5644-augmented-reality-reveals-history-tourists.html Augmented Reality Reveals History to Tourists] ''Life Science'' 16 August 2009.</ref>


AR applications linked to geographic locations present location information by audio, announcing features of interest at a particular site as they become visible to the user.<ref>{{Cite journal | doi=10.1111/j.1467-9671.2006.00244.x| title=Development of a Speech-Based Augmented Reality System to Support Exploration of Cityscape| journal=Transactions in GIS| volume=10| pages=63–86| year=2006| last1=Bartie| first1=Phil J.| last2=MacKaness| first2=William A.| issue=1| bibcode=2006TrGIS..10...63B| s2cid=13325561}}</ref><ref>Benderson, Benjamin B. [http://www.cs.umd.edu/~bederson/papers/chi-95-aar/ Audio Augmented Reality: A Prototype Automated Tour Guide] {{webarchive |url=https://web.archive.org/web/20020701071038/http://www.cs.umd.edu/~bederson/papers/chi-95-aar/ |date=1 July 2002 }} Bell Communications Research, ACM Human Computer in Computing Systems Conference, pp. 210–211.</ref><ref>Jain, Puneet and Manweiler, Justin and Roy Choudhury, Romit. [http://synrg.csl.illinois.edu/papers/overlay.pdf OverLay: Practical Mobile Augmented Reality]
ACM MobiSys, May 2015.</ref>
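The visibility test behind such audio annotations can be sketched as a range-and-bearing check against the user's heading; the flat-earth approximations, place names, coordinates and thresholds below are illustrative assumptions.

<syntaxhighlight lang="python">
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Approximate initial bearing from point 1 to point 2, in degrees from north."""
    d_lon = math.radians(lon2 - lon1)
    lat1, lat2 = math.radians(lat1), math.radians(lat2)
    x = math.sin(d_lon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return math.degrees(math.atan2(x, y)) % 360.0

def visible_pois(user_lat, user_lon, heading_deg, pois, fov_deg=60.0, max_range_m=300.0):
    """Yield names of points of interest inside the user's view cone."""
    for name, lat, lon in pois:
        # Equirectangular distance approximation, adequate at city scale.
        dx = math.radians(lon - user_lon) * 6371000.0 * math.cos(math.radians(user_lat))
        dy = math.radians(lat - user_lat) * 6371000.0
        if math.hypot(dx, dy) > max_range_m:
            continue
        offset = (bearing_deg(user_lat, user_lon, lat, lon) - heading_deg + 180.0) % 360.0 - 180.0
        if abs(offset) <= fov_deg / 2.0:
            yield name  # in a real application this would trigger the audio announcement

pois = [("Old Town Hall", 47.3769, 8.5417), ("Harbour Crane", 47.3800, 8.5500)]
print(list(visible_pois(47.3765, 8.5412, heading_deg=40.0, pois=pois)))
</syntaxhighlight>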


===Translation===


AR systems such as [[Word Lens]] can interpret the foreign text on signs and menus and, in a user's augmented view, re-display the text in the user's language. Spoken words of a foreign language can be translated and displayed in a user's view as printed subtitles.<ref>Tsotsis, Alexia. [https://techcrunch.com/2010/12/16/world-lens-translates-words-inside-of-images-yes-really Word Lens Translates Words Inside of Images. Yes Really.] ''TechCrunch'' (16 December 2010).</ref><ref>N.B. [https://www.economist.com/blogs/gulliver/2010/12/instant_translation Word Lens: This changes everything] ''The Economist: Gulliver blog'' 18 December 2010.</ref><ref>Borghino, Dario [http://www.gizmag.com/language-translating-glasses/23494/ Augmented reality glasses perform real-time language translation]. ''gizmag'', 29 July 2012.</ref>
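The re-display step can be sketched as a small pipeline in which recognized text regions keep their bounding boxes but have their text replaced by a translation; the detected regions and the tiny phrasebook below are illustrative stand-ins for a real OCR and translation back end, not Word Lens code.

<syntaxhighlight lang="python">
PHRASEBOOK = {"salida": "exit", "abierto": "open", "cerrado": "closed"}

def translate_regions(regions):
    """regions: list of (text, (x, y, w, h)) tuples as an OCR stage might produce.
    Returns the same boxes with translated text, ready to be drawn over the video frame."""
    overlaid = []
    for text, box in regions:
        translated = PHRASEBOOK.get(text.lower(), text)  # fall back to the source text
        overlaid.append((translated, box))
    return overlaid

detected = [("SALIDA", (120, 40, 200, 60)), ("Abierto", (90, 300, 150, 40))]
print(translate_regions(detected))
</syntaxhighlight>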


===Music===


It has been suggested that augmented reality may be used in new methods of [[music production]], [[music mixing|mixing]], [[Media controls|control]] and [[music visualization|visualization]].<ref>{{cite web|title=Music Production in the Era of Augmented Reality|url=https://medium.com/@Soundspringstudio/music-production-in-the-era-of-augmented-reality-2e79f4926275|website=Medium|access-date=5 January 2017|date=14 October 2016}}</ref><ref>{{cite web|title=Augmented Reality music making with Oak on Kickstarter – gearnews.com|url=https://www.gearnews.com/augmented-reality-music-making-oak-kickstarter/|website=gearnews.com|access-date=5 January 2017|date=3 November 2016}}</ref><ref>{{cite web|last1=Clouth|first1=Robert|title=Mobile Augmented Reality as a Control Mode for Real-time Music Systems|url=http://mtg.upf.edu/node/2846|access-date=5 January 2017|date=1 January 2013}}</ref><ref>{{cite book |doi=10.1109/ICICS.2007.4449564 |chapter=A multimodal augmented reality DJ music system |title=2007 6th International Conference on Information, Communications & Signal Processing |pages=1–5 |year=2007 |last1=Farbiz |first1=Farzam |last2=Tang |first2=Ka Yin |last3=Wang |first3=Kejian |last4=Ahmad |first4=Waqas |last5=Manders |first5=Corey |last6=Jyh Herng |first6=Chong |last7=Kee Tan |first7=Yeow |s2cid=17807179 |isbn=978-1-4244-0982-2 }}</ref>


In a proof-of-concept project Ian Sterling, an interaction design student at [[California College of the Arts]], and software engineer Swaroop Pal demonstrated a HoloLens app whose primary purpose is to provide a 3D spatial UI for cross-platform devices—the Android Music Player app and Arduino-controlled Fan and Light—and also allow interaction using gaze and gesture control.<ref>{{cite web|title=HoloLens concept lets you control your smart home via augmented reality|url=http://www.digitaltrends.com/cool-tech/hololens-hackathon-smart-home/|publisher=[[Digital Trends]]|access-date=5 January 2017|date=26 July 2016}}</ref><ref>{{cite web|title=Hololens: Entwickler zeigt räumliches Interface für Elektrogeräte|url=https://mixed.de/hololens-entwickler-zeigt-raeumliches-interface-fuer-elektrogeraete/|publisher=MIXED|access-date=5 January 2017|language=de-DE|date=22 July 2016}}</ref><ref>{{cite web|title=Control Your IoT Smart Devices Using Microsoft HoloLen (video) – Geeky Gadgets|url=http://www.geeky-gadgets.com/control-your-iot-smart-devices-using-microsoft-hololen-27-07-2016/|publisher=Geeky Gadgets|access-date=5 January 2017|date=27 July 2016}}</ref><ref>{{cite web|title=Experimental app brings smart home controls into augmented reality with HoloLens|url=http://www.windowscentral.com/experimental-app-brings-smart-home-controls-augmented-reality-hololens|publisher=Windows Central|access-date=5 January 2017|date=22 July 2016}}</ref>


Research by members of the CRIStAL at the [[University of Lille]] makes use of augmented reality to enrich musical performance. The ControllAR project allows musicians to augment their [[MIDI]] control surfaces with the remixed [[graphical user interface]]s of [[music software]].<ref>{{cite book|pages=271–277|chapter-url=https://hal.archives-ouvertes.fr/hal-01356239|doi=10.1145/2992154.2992170|year=2016|last1=Berthaut|first1=Florent|last2=Jones|first2=Alex|title=Proceedings of the 2016 ACM International Conference on Interactive Surfaces and Spaces |chapter=ControllAR: Appropriation of Visual Feedback on Control Surfaces |isbn=9781450342483|s2cid=7180627|url=https://hal.archives-ouvertes.fr/hal-01356239/file/ControllAR.pdf}}</ref> The Rouages project proposes to augment [[Electronic musical instrument|digital musical instruments]] to reveal their mechanisms to the audience and thus improve the perceived liveness.<ref>{{cite web|title=Rouages: Revealing the Mechanisms of Digital Musical Instruments to the Audience|pages=6 pages|url=https://hal.archives-ouvertes.fr/hal-00807049|date=May 2013}}</ref> Reflets is a novel augmented reality display dedicated to musical performances where the audience acts as a 3D display by revealing virtual content on stage, which can also be used for 3D musical interaction and collaboration.<ref>{{cite web|title=Reflets: Combining and Revealing Spaces for Musical Performances|url=https://hal.archives-ouvertes.fr/hal-01136857|date=May 2015}}</ref>


===Snapchat===


[[Snapchat]] users have access to augmented reality in the app through use of camera filters. In September 2017, Snapchat updated its app to include a camera filter that allowed users to render an animated, cartoon version of themselves called "[[Bitmoji]]". These animated avatars are projected into the real world through the camera and can be photographed or video recorded.<ref>Wagner, Kurt. "Snapchat's New Augmented Reality Feature Brings Your Cartoon Bitmoji into the Real World." Recode, Recode, 14 Sept. 2017, www.recode.net/2017/9/14/16305890/snapchat-bitmoji-ar-Facebook.</ref> In the same month, Snapchat also announced a new feature called "Sky Filters" available in its app. This feature makes use of augmented reality to alter the look of a picture taken of the sky, much like how users can apply the app's filters to other pictures. Users can choose from sky filters such as starry night, stormy clouds, beautiful sunsets, and rainbow.<ref>Miller, Chance. "Snapchat's Latest Augmented Reality Feature Lets You Paint the Sky with New Filters." 9to5Mac, 9to5Mac, 25 Sept. 2017, 9to5mac.com/2017/09/25/how-to-use-snapchat-sky-filters/.</ref>

===Retail===

Augmented reality is becoming more frequently used for online advertising. Retailers offer the ability to upload a picture on their website and "try on" various clothes, which are overlaid on the picture. Further, companies such as Bodymetrics install dressing booths in department stores that offer [[full-body scanning]]. These booths render a 3-D model of the user, allowing consumers to view different outfits on themselves without the need to physically change clothes.<ref>Pavlik, John V., and Shawn McIntosh. “Augmented Reality.” Converging Media: a New Introduction to Mass Communication, 5th ed., Oxford University Press, 2017, pp. 184–185.</ref> For example, [[J. C. Penney|JC Penney]] and [[Bloomingdale's]] use "[[virtual dressing room]]s" that allow customers to see themselves in clothes without trying them on.<ref name=":02">{{Cite journal|date=2017-11-01|title=Enabling smart retail settings via mobile augmented reality shopping apps|url=https://www.sciencedirect.com/science/article/pii/S0040162516304243|journal=Technological Forecasting and Social Change|language=en|volume=124|pages=243–256|doi=10.1016/j.techfore.2016.09.032|issn=0040-1625}}</ref> Another store that uses AR to market clothing to its customers is [[Neiman Marcus]].<ref name=":12">{{Cite news|url=https://www.retaildive.com/news/how-neiman-marcus-is-turning-technology-innovation-into-a-core-value/436590/|title=How Neiman Marcus is turning technology innovation into a 'core value'|work=Retail Dive|access-date=2018-09-23|language=en-US}}</ref> Neiman Marcus offers consumers the ability to see their outfits in a 360-degree view with their "memory mirror".<ref name=":12" /> Makeup stores like [[L'Oréal|L'Oreal]], [[Sephora]], [[Charlotte Tilbury]], and [[Rimmel]] also have apps that utilize AR.<ref name=":22" /> These apps allow consumers to see how the makeup will look on them.<ref name=":22" /> According to Greg Jones, director of AR and VR at Google, augmented reality is going to "reconnect physical and digital retail."<ref name=":22" />


==Concerns==


===Reality modifications===
In a paper titled [[Pokémon Go|"Death by Pokémon GO"]], researchers at [[Purdue University]]'s [[Krannert School of Management]] claim the game caused "a disproportionate increase in vehicular crashes and associated vehicular damage, personal injuries, and fatalities in the vicinity of locations, called PokéStops, where users can play the game while driving."<ref>{{cite news |last1=Faccio |first1=Mara |last2=McConnell |first2=John J. |title=Death by Pokémon GO |date=2017 |doi=10.2139/ssrn.3073723 |ssrn=3073723 }}</ref> Using data from one municipality, the paper extrapolated what that might mean nationwide and concluded that "the increase in crashes attributable to the introduction of Pokémon GO is 145,632 with an associated increase in the number of injuries of 29,370 and an associated increase in the number of fatalities of 256 over the period of 6 July 2016, through 30 November 2016." The authors estimated the cost of those crashes and fatalities at between $2 billion and $7.3 billion for the same period.

Furthermore, more than one in three surveyed advanced Internet users would like to edit out disturbing elements around them, such as garbage or graffiti.<ref>Peddie, J., 2017, Augmented Reality, Springer{{page needed|date=October 2019}}</ref> They would also like to modify their surroundings by erasing street signs, billboard ads, and uninteresting shopping windows, which makes AR as much a threat to companies as it is an opportunity. While this could be a nightmare for brands that fail to capture consumer imaginations, it also creates the risk that wearers of augmented reality glasses may become unaware of surrounding dangers. Consumers want to use augmented reality glasses to change their surroundings into something that reflects their own personal opinions; around two in five want to change the way their surroundings look and even how people appear to them.{{Citation needed|date=September 2020}}


Alongside the possible privacy issues described below, overload and over-reliance issues are the biggest danger of AR. For the development of new AR-related products, this implies that the user interface should follow certain guidelines so as not to overload the user with information, while also preventing the user from over-relying on the AR system such that important cues from the environment are missed.<ref name="Azuma_survey"/> This is called the virtually-augmented key.<ref name="Azuma_survey"/> Once the key is ignored, people might no longer desire the real world.


===Privacy concerns===
The concept of modern augmented reality depends on the ability of the device to record and analyze the environment in real time. Because of this, there are potential legal concerns over privacy. While the [[First Amendment to the United States Constitution]] allows for such recording in the name of public interest, the constant recording of an AR device makes it difficult to do so without also recording in private settings. Legal complications would be found in areas where a right to a certain amount of privacy is expected or where copyrighted media are displayed.


In terms of individual privacy, AR eases access to information that one should not readily possess about a given person, for instance through facial recognition technology. If an AR device automatically passes on information about the persons a user sees, it could reveal anything from their social media activity to their criminal record and marital status.<ref>{{cite book |doi=10.1145/2638728.2641709 |chapter=Augmented reality |title=Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct Publication - UbiComp '14 Adjunct |pages=1283–1288 |year=2014 |last1=Roesner |first1=Franziska |last2=Kohno |first2=Tadayoshi |last3=Denning |first3=Tamara |last4=Calo |first4=Ryan |last5=Newell |first5=Bryce Clayton |s2cid=15190154 |isbn=978-1-4503-3047-3 }}</ref>


The Code of Ethics on Human Augmentation, which was originally introduced by [[Steve Mann (inventor)|Steve Mann]] in 2004 and further refined with [[Ray Kurzweil]] and [[Marvin Minsky]] in 2013, was ultimately ratified at the virtual reality Toronto conference on 25 June 2017.<ref>{{Cite web|url=https://m.ebrary.net/123089/sociology/code_ethics_human_augmentation|title=The Code of Ethics on Human Augmentation - Augmented Reality : Where We Will All Live -|website=m.ebrary.net|access-date=2019-11-18}}</ref><ref>{{Cite web|url=https://www.huffpost.com/entry/the-future-of-tech-just-c_b_10929608|title=The Future of Tech Just Changed at VRTO--Here's Why That Matters to You|last=Damiani|first=Jesse|date=2016-07-18|website=HuffPost|language=en|access-date=2019-11-18}}</ref><ref>{{Cite web|url=https://www.vrfocus.com/2016/07/vrto-spearheads-code-of-ethics-on-human-augmentation/|title=VRTO Spearheads Code of Ethics on Human Augmentation|website=VRFocus|language=en-US|access-date=2019-11-18|archive-date=11 August 2020|archive-url=https://web.archive.org/web/20200811140128/https://www.vrfocus.com/2016/07/vrto-spearheads-code-of-ethics-on-human-augmentation/|url-status=dead}}</ref><ref>{{Cite web|url=http://www.eyetap.org/hacode/|title=The Code of Ethics on Human Augmentation|website=www.eyetap.org|access-date=2019-11-18|archive-date=28 February 2021|archive-url=https://web.archive.org/web/20210228095934/http://www.eyetap.org/hacode/|url-status=dead}}</ref>


=== Retail ===

Augmented reality is becoming more frequently used for online advertising. Retailers offer customers the ability to upload a picture on the retailer's website and "try on" various garments, which are overlaid on the picture. Going further, companies such as Bodymetrics install dressing booths in department stores that offer [[full-body scanning]]. These booths render a 3-D model of the user, allowing consumers to view different outfits on themselves without physically changing clothes.<ref>Pavlik, John V., and Shawn McIntosh. “Augmented Reality.” Converging Media: a New Introduction to Mass Communication, 5th ed., Oxford University Press, 2017, pp. 184–185.</ref> For example, [[J. C. Penney|JC Penney]] and [[Bloomingdale's]] use "[[virtual dressing room]]s" that allow customers to see themselves in clothes without trying them on.<ref name=":02">{{Cite journal|date=2017-11-01|title=Enabling smart retail settings via mobile augmented reality shopping apps|url=https://www.sciencedirect.com/science/article/pii/S0040162516304243|journal=Technological Forecasting and Social Change|language=en|volume=124|pages=243–256|doi=10.1016/j.techfore.2016.09.032|issn=0040-1625}}</ref> Another store that uses AR to market clothing to its customers is [[Neiman Marcus]].<ref name=":12">{{Cite news|url=https://www.retaildive.com/news/how-neiman-marcus-is-turning-technology-innovation-into-a-core-value/436590/|title=How Neiman Marcus is turning technology innovation into a 'core value'|work=Retail Dive|access-date=2018-09-23|language=en-US}}</ref> Neiman Marcus offers consumers the ability to see their outfits in a 360-degree view with its "memory mirror".<ref name=":12" /> Makeup stores like [[L'Oréal|L'Oreal]], [[Sephora]], [[Charlotte Tilbury]], and [[Rimmel]] also have apps that utilize AR.<ref name=":22" /> These apps allow consumers to see how the makeup will look on them.<ref name=":22" /> According to Greg Jones, director of AR and VR at Google, augmented reality is going to "reconnect physical and digital retail."<ref name=":22" />

AR technology is also used by furniture retailers such as [[IKEA]], [[Houzz]], and [[Wayfair]].<ref name=":22">{{Cite news|url=https://www.forbes.com/sites/rachelarthur/2017/10/31/augmented-reality-is-set-to-transform-fashion-and-retail/#364c701b3151|title=Augmented Reality Is Set To Transform Fashion And Retail|last=Arthur|first=Rachel|work=Forbes|access-date=2018-09-23|language=en}}</ref><ref name=":02" /> These retailers offer apps that allow consumers to view their products in their home prior to purchasing.<ref name=":22" /> In late 2017, IKEA launched the IKEA Place app, which lets customers place true-to-scale 3D models of furniture in their living space using the app and their device's camera, so that they can try out furniture with augmented reality before deciding whether to buy it. IKEA created the app in response to customers visiting its stores less often and buying directly less frequently.<ref>IKEA Highlights 2017 https://www.ikea.com/ms/en_CH/this-is-ikea/ikea-highlights/2017/ikea-place-app/index.html</ref><ref>Fact & Figures, IKEA, 2017, https://highlights.ikea.com/2017/facts-and-figures/</ref>

=== Snapchat ===

Snapchat users have access to augmented reality in the company's instant messaging app through the use of camera filters. In September 2017, Snapchat updated its app to include a camera filter that allowed users to render an animated, cartoon version of themselves called a "Bitmoji". These animated avatars are projected into the real world through the camera and can be photographed or video recorded.<ref>Wagner, Kurt. “Snapchat's New Augmented Reality Feature Brings Your Cartoon Bitmoji into the Real World.” Recode, Recode, 14 Sept. 2017, www.recode.net/2017/9/14/16305890/snapchat-bitmoji-ar-facebook.</ref> In the same month, Snapchat also announced a new feature called "Sky Filters", available in its app. This feature uses augmented reality to alter the look of a picture taken of the sky, much like how users can apply the app's filters to other pictures. Users can choose from sky filters such as starry night, stormy clouds, beautiful sunsets, and rainbow.<ref>Miller, Chance. “Snapchat’s Latest Augmented Reality Feature Lets You Paint the Sky with New Filters.” 9to5Mac, 9to5Mac, 25 Sept. 2017, 9to5mac.com/2017/09/25/how-to-use-snapchat-sky-filters/.</ref>

== Property law ==

The interaction of location-bound augmented reality with [[property law]] is largely undefined.{{sfn|McClure|2017|p=364-366}}<ref>{{ cite web | url = https://slate.com/technology/2018/06/can-you-prevent-augmented-reality-ads-from-appearing-on-your-house.html | title = What Are Your Augmented Reality Property Rights? | first = Fiona J | last = McEvoy | publisher = [[Slate (magazine)|Slate]] | date = 4 June 2018 | accessdate = 31 May 2022 }}</ref> Several models have been analysed for how this interaction may be resolved in a [[common law]] context: an extension of [[real property]] rights to also cover augmentations on or near the property with a strong notion of [[trespassing]], forbidding augmentations unless allowed by the owner; an '[[open range]]' system, where augmentations are allowed unless forbidden by the owner; and a '[[freedom to roam]]' system, where real property owners have no control over non-disruptive augmentations.{{sfn|Mallick|2020|p=1068-1072}}

One issue experienced during the [[Pokémon Go]] craze was the game's players disturbing owners of private property while visiting nearby location-bound augmentations, which may have been located on the properties themselves or reachable only by crossing them. The terms of service of Pokémon Go explicitly disclaim responsibility for players' actions, which may limit (but may not totally extinguish) the liability of its producer, [[Niantic (company)|Niantic]], in the event of a player [[trespassing]] while playing the game: by Niantic's argument, the player is the one committing the trespass, while Niantic has merely engaged in permissible [[free speech]]. A theory advanced in lawsuits brought against Niantic is that their placement of game elements in places that will lead to trespass or an exceptionally large flux of visitors can constitute [[nuisance]], despite each individual trespass or visit only being tenuously caused by Niantic.{{sfn|McClure|2017|p=341-343}}{{sfn|McClure|2017|p=347-351}}{{sfn|Conroy|2017|p=20}}

Another claim raised against Niantic is that the placement of profitable game elements on land without permission of the land's owners is [[unjust enrichment]].{{sfn|McClure|2017|p=351-353}} More hypothetically, a property may be augmented with [[advertising]] or disagreeable content against its owner's wishes.{{sfn|Conroy|2017|p=21-22}} Under American law, these situations are unlikely to be seen as a violation of [[real property]] rights by courts without an expansion of those rights to include augmented reality (similarly to how [[English common law]] came to recognise [[air rights]]).{{sfn|McClure|2017|p=351-353}}

An article in the ''[[Michigan Telecommunications and Technology Law Review]]'' argues that there are three bases for this extension, starting with various understandings of property. The personality theory of property, outlined by [[Margaret Radin]], is claimed to support extending property rights due to the intimate connection between personhood and ownership of property; however, her viewpoint is not universally shared by legal theorists.{{sfn|Conroy|2017|p=24-26}} Under the [[utilitarian theory of property]], the benefits from avoiding the harms to real property owners caused by augmentations and the [[tragedy of the commons]], and the reduction in [[transaction costs]] by making discovery of ownership easy, were assessed as justifying recognising real property rights as covering location-bound augmentations, though there does remain the possibility of a [[tragedy of the anticommons]] from having to negotiate with property owners slowing innovation.{{sfn|Conroy|2017|p=27-29}} Finally, following the 'property as the law of things' identification as supported by [[Thomas Merrill]] and [[Henry E Smith]], location-based augmentation is naturally identified as a 'thing', and, while the [[non-rivalrous]] and ephemeral nature of digital objects presents difficulties to the excludability prong of the definition, the article argues that this is not insurmountable.{{sfn|Conroy|2017|p=29-34}}

Some attempts at legislative regulation have been made in the [[United States]]. [[Milwaukee County, Wisconsin]] attempted to regulate augmented reality games played in its parks, requiring prior issuance of a permit,{{sfn|McClure|2017|p=354-355}} but this was criticised on [[free speech]] grounds by a federal judge;<ref>{{ cite news | url = https://apnews.com/article/58ca55eb0a00440d872444bde69b8092 | title = Judge halts Wisconsin county rule for apps like Pokemon Go | work = Associated Press | date = 21 July 2017 }}</ref> and [[Illinois]] considered mandating a [[notice and take down]] procedure for location-bound augmentations.{{sfn|McClure|2017|p=356-357}}

An article for the ''[[Iowa Law Review]]'' observed that dealing with many local permitting processes would be arduous for a large-scale service,{{sfn|McClure|2017|p=355}} and, while the proposed Illinois mechanism could be made workable,{{sfn|McClure|2017|p=357}} it was reactive and required property owners to potentially continually deal with new augmented reality services; instead, a national-level [[geofencing]] registry, analogous to a [[do-not-call list]], was proposed as the most desirable form of regulation to efficiently balance the interests of both providers of augmented reality services and real property owners.{{sfn|McClure|2017|p=357-359}} An article in the ''[[Vanderbilt Journal of Entertainment and Technology Law]]'', however, analyses a monolithic do-not-locate registry as an insufficiently flexible tool, either permitting unwanted augmentations or foreclosing useful applications of augmented reality.{{sfn|Mallick|2020|p=1079-1080}} Instead, it argues that an 'open range' model, where augmentations are permitted by default but property owners may restrict them on a case-by-case basis (and with noncompliance treated as a form of trespass), will produce the socially best outcome.{{sfn|Mallick|2020|p=1080-1084}}

== Dangers ==
=== Reality modifications ===
There is a danger that augmented reality may make users overconfident and lead them to put their lives at risk; ''Pokémon GO'', which has been linked to several deaths and many injuries, is an example. "[https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3073723 Death by Pokémon GO]", by a pair of researchers from Purdue University's Krannert School of Management, says the game caused "a disproportionate increase in vehicular crashes and associated vehicular damage, personal injuries, and fatalities in the vicinity of locations, called PokéStops, where users can play the game while driving."<ref>{{Cite journal |ssrn=3073723|doi=10.2139/ssrn.3073723|title = Death by Pokémon GO: The Economic and Human Cost of Using Apps While Driving|date = 2018-02-02|last1 = Faccio|first1 = Mara|last2=McConnell|first2=John J.}}</ref> The paper extrapolated what that might mean nationwide and concluded that "the increase in crashes attributable to the introduction of Pokémon GO is 145,632 with an associated increase in the number of injuries of 29,370 and an associated increase in the number of fatalities of 256 over the period of July 6, 2016, through November 30, 2016." The authors valued those crashes and fatalities at between $2 billion and $7.3 billion for the same period.

Furthermore, more than one in three surveyed advanced internet users would like to edit out disturbing elements around them, such as garbage or graffiti.<ref>Peddie, J., 2017, Augmented Reality, Springer</ref> They would even like to modify their surroundings by erasing street signs, billboard ads, and uninteresting shop windows. AR is therefore as much a threat to companies as it is an opportunity: while it could be a nightmare for brands that fail to capture consumers' imaginations, it also creates the risk that wearers of augmented reality glasses become unaware of surrounding dangers. Consumers want to use augmented reality glasses to change their surroundings into something that reflects their own personal opinions; around two in five want to change the way their surroundings look and even how people appear to them.

Besides the possible privacy issues described below, overload and over-reliance are among the biggest dangers of AR. For the development of new AR-related products, this implies that the user interface should follow certain guidelines so as not to overload the user with information, while also preventing the user from relying on the AR system so heavily that important cues from the environment are missed.<ref name="Azuma, R. T. 1997">Azuma, R. T. (1997). A survey of augmented reality. Presence-Teleoperators and Virtual Environments, 6(4), 355–385.</ref> This is called the virtually-augmented key.<ref name="Azuma, R. T. 1997"/> If this key is not taken into account, users may come to disregard the real world.

=== Privacy concerns ===
The concept of modern augmented reality depends on the ability of the device to record and analyze the environment in real time. Because of this, there are potential legal concerns over privacy. While the [[First Amendment to the United States Constitution]] allows for such recording in the name of public interest, the constant recording of an AR device makes it difficult to do so without also capturing footage outside the public sphere. Legal complications would arise in areas where a right to a certain amount of privacy is expected or where copyrighted media are displayed.


Privacy-compliant image capture solutions can be deployed to temper the impact of constant filming on individual privacy.<ref>Jiayu Shu, Rui Zheng, Pan Hui [http://www.cse.ust.hk/~panhui/papers/mar.privacy.2017.pdf Your Privacy Is in Your Hand: Interactive Visual Privacy Control with Tags and Gestures].</ref>
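The cited approach is interactive and tag-based; as a simpler illustration of the same idea, the sketch below (a hypothetical example, not the system cited above) blurs any faces detected in a captured frame before the frame is handed to further AR processing, using OpenCV's bundled Haar cascade face detector. The camera index and detector parameters are illustrative.

<syntaxhighlight lang="python">
import cv2

# Load the frontal-face Haar cascade shipped with opencv-python.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def privacy_filter(frame):
    """Return a copy of the frame with every detected face blurred."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    out = frame.copy()
    for (x, y, w, h) in faces:
        out[y:y + h, x:x + w] = cv2.GaussianBlur(out[y:y + h, x:x + w], (51, 51), 0)
    return out

# Example: filter one webcam frame before any further AR processing or storage.
capture = cv2.VideoCapture(0)
ok, frame = capture.read()
if ok:
    safe_frame = privacy_filter(frame)
capture.release()
</syntaxhighlight>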


== Notable researchers ==
<!--♦♦♦ Please keep the list in alphabetical order ♦♦♦-->

* [[Ronald Azuma]] is a scientist and author of works on AR.
* [[Jeri Ellsworth]] headed a research effort for [[Valve Corporation|Valve]] on augmented reality (AR), later taking that research to her own start-up [[CastAR]]. The company, founded in 2013, eventually shuttered. She later founded another AR start-up based on the same technology, Tilt Five, with the purpose of creating a device for digital [[board game]]s.<ref>{{Cite news|url=https://www.nytimes.com/2019/10/24/technology/jeri-ellsworth-augmented-reality.html|title=Always Building, From the Garage to Her Company|last=Markoff|first=John|date=2019-10-24|work=The New York Times|access-date=2019-12-12|language=en-US|issn=0362-4331}}</ref>
* [[Steve Mann (inventor)|Steve Mann]] formulated an earlier concept of [[mediated reality]] in the 1970s and 1980s, using cameras, processors, and display systems to modify visual reality to help people see better (dynamic range management), building computerized welding helmets, as well as "augmediated reality" vision systems for use in everyday life. He is also an adviser to [[Meta (augmented reality company)|Meta]].<ref>{{cite journal |last1=Mann |first1=S. |title=Wearable computing: a first step toward personal imaging |journal=Computer |date=1997 |volume=30 |issue=2 |pages=25–32 |doi=10.1109/2.566147 |s2cid=28001657 }}</ref>
* [[Dieter Schmalstieg]] and Daniel Wagner developed marker tracking systems for mobile phones and PDAs in 2009.<ref>{{cite book |url=http://portal.acm.org/citation.cfm?id=946910 |title=First Steps Towards Handheld Augmented Reality |author=Wagner, Daniel |date=29 September 2009 |publisher=ACM |access-date=29 September 2009|isbn=9780769520346 }}</ref>
* [[Ivan Sutherland]] invented the [[The Sword of Damocles (virtual reality)|first VR head-mounted display]] at [[Harvard University]].
* [[Louis Rosenberg (writer)|Louis Rosenberg]] developed one of the first known AR systems, called [[Virtual Fixtures]], while working at the U.S. Air Force Armstrong Labs in 1991, and published the first study of how an AR system can enhance human performance.<ref name="B. Rosenberg 1992">L. B. Rosenberg. The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments. Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.</ref> Rosenberg's subsequent work at Stanford University in the early 1990s was the first proof that virtual overlays, when registered and presented over a user's direct view of the real physical world, could significantly enhance human performance.<ref>Rosenberg, L., "[https://www.spiedigitallibrary.org/conference-proceedings-of-spie/2057/0000/Virtual-fixtures-as-tools-to-enhance-operator-performance-in-telepresence/10.1117/12.164901.short Virtual fixtures as tools to enhance operator performance in telepresence environments]," SPIE Manipulator Technology, 1993.</ref><ref>Rosenberg, "[https://www.spiedigitallibrary.org/conference-proceedings-of-spie/2351/0000/Virtual-haptic-overlays-enhance-performance-in-telepresence-tasks/10.1117/12.197302.short Virtual Haptic Overlays Enhance Performance in Telepresence Tasks]," Dept. of Mech. Eng., Stanford Univ., 1994.</ref><ref name="autogenerated1">Rosenberg, "[https://dl.acm.org/citation.cfm?id=221788 Virtual Fixtures: Perceptual Overlays Enhance Operator Performance in Telepresence Tasks]," Ph.D. Dissertation, Stanford University.</ref>
* [[Mike Abernathy]] pioneered one of the first successful augmented video overlays (also called hybrid synthetic vision) using map data for space debris in 1993,<ref name="ABER93"/> while at Rockwell International. He co-founded Rapid Imaging Software, Inc., was the primary author of the LandForm system in 1995, and developed the SmartCam3D system.<ref name="DELG99" /><ref name = "DELG00" /> LandForm augmented reality was successfully flight-tested in 1999 aboard a helicopter, and SmartCam3D was used to fly the NASA X-38 from 1999 to 2002. He and NASA colleague Francisco Delgado received the National Defense Industries Association Top5 awards in 2004.<ref name="ReferenceA">C. Segura E. George F. Doherty J. H. Lindley M. W. Evans "[http://www.crosstalkonline.org/storage/issue-archives/2005/200509/200509-Delgado.pdf SmartCam3D Provides New Levels of Situation Awareness]", CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 9, pages 10–11.</ref>
* [[Steven Feiner]], Professor at [[Columbia University]], is the author of a 1993 paper on an AR system prototype, KARMA (the Knowledge-based Augmented Reality Maintenance Assistant), along with [[Blair MacIntyre]] and [[Doree Seligmann]]. He is also an advisor to [[Meta (company)|Meta]].<ref>{{cite journal |url=http://portal.acm.org/citation.cfm?id=159587 |title=Knowledge-based augmented reality |journal=Communications of the ACM |volume=36 |issue=7 |pages=53–62 |date=July 1993 |doi=10.1145/159544.159587 |last1=Feiner |first1=Steven |last2=MacIntyre |first2=Blair |last3=Seligmann |first3=Dorée }}</ref>
*[[Tracy McSheery]], of Phasespace, developer in 2009 of wide field of view AR lenses as used in Meta 2 and others.<ref>{{cite web|title=SBIR STTR Development of Low-Cost Augmented Reality Head Mounted Display|url=https://www.sbir.gov/sbirsearch/detail/269401}}</ref>
*S. Ravela, B. Draper, J. Lim and A. Hanson developed a marker/fixture-less augmented reality system with computer vision in 1994. They augmented an engine block observed from a single video camera with annotations for repair, using model-based pose estimation, aspect graphs and visual feature tracking to dynamically register the model with the observed video.<ref>{{Cite book | doi=10.1109/IROS.1995.525793|chapter = Adaptive tracking and model registration across distinct aspects|title = Proceedings 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human Robot Interaction and Cooperative Robots| volume=1| pages=174–180|year = 1995|last1 = Ravela|first1 = S.| last2=Draper| first2=B.| last3=Lim| first3=J.| last4=Weiss| first4=R.| isbn=978-0-8186-7108-1| chapter-url=http://scholarworks.umass.edu/cs_faculty_pubs/219}}</ref>
* [[Francisco Delgado (researcher)|Francisco Delgado]] is a [[NASA]] engineer and project manager specializing in human interface research and development. Starting in 1998 he conducted research into displays that combined video with synthetic vision systems (called hybrid synthetic vision at the time) that we recognize today as augmented reality systems for the control of aircraft and spacecraft. In 1999 he and colleague Mike Abernathy flight-tested the LandForm system aboard a US Army helicopter. Delgado oversaw integration of the LandForm and SmartCam3D systems into the X-38 Crew Return Vehicle.<ref name="DELG99" /><ref name = "DELG00" /><ref name="huffingtonpost.com"/> In 2001, Aviation Week reported a NASA astronaut's successful use of hybrid synthetic vision (augmented reality) to fly the X-38 during a flight test at Dryden Flight Research Center. The technology was used in all subsequent flights of the X-38. Delgado was co-recipient of the National Defense Industries Association 2004 Top 5 software of the year award for SmartCam3D.<ref name="ReferenceA"/>
* [[Bruce H. Thomas]] and [[Wayne Piekarski]] developed the Tinmith system in 1998.<ref>Piekarski, William; Thomas, Bruce. [http://www.computer.org/portal/web/csdl/doi/10.1109/ISWC.2001.962093 Tinmith-Metro: New Outdoor Techniques for Creating City Models with an Augmented Reality Wearable Computer] Fifth International Symposium on Wearable Computers (ISWC'01), 2001, pp. 31.</ref> They, along with [[Steve Feiner]] and his MARS system, pioneered outdoor augmented reality.
*Mark Billinghurst is Director of the HIT Lab New Zealand (HIT Lab NZ) at the [[University of Canterbury]] in New Zealand and a notable AR researcher. He has produced over 250 technical publications and presented demonstrations and courses at a wide variety of conferences.
*[[Reinhold Behringer]] performed important early work (1998) in image registration for augmented reality, and prototype wearable testbeds for augmented reality. He also co-organized the First IEEE International Symposium on Augmented Reality in 1998 (IWAR'98), and co-edited one of the first books on augmented reality.<ref>Behringer, R.;[http://reference.kfupm.edu.sa/content/i/m/improving_the_registration_precision_by__1670204.pdf Improving the Registration Precision by Visual Horizon Silhouette Matching.]{{dead link|date=August 2017|bot=medic}}{{cbignore|bot=medic}} Rockwell Science Center.</ref><ref>Behringer, R.;Tam, C; McGee, J.; Sundareswaran, V.; Vassiliou, Marius. [http://www.computer.org/portal/web/csdl/doi/10.1109/ISWC.2000.888495 Two Wearable Testbeds for Augmented Reality: itWARNS and WIMMIS.] ISWC 2000, Atlanta, 16–17 October 2000.</ref><ref>R. Behringer, G. Klinker,. D. Mizell. [http://www.crcpress.com/product/isbn/9781568810980 Augmented Reality – Placing Artificial Objects in Real Scenes]. Proceedings of IWAR '98. A.K.Peters, Natick, 1999. {{ISBN|1-56881-098-9}}.</ref>
*[[Felix G. Hamza-Lup]], [[Larry Davis (researcher)|Larry Davis]] and [[Jannick Rolland]] developed the 3D ARC display with optical see-through head-worn display for AR visualization in 2002.<ref>{{cite journal|title=The ARC Display: An Augmented Reality Visualization Center |author=Felix, Hamza-Lup |date=30 September 2002 |publisher=CiteSeer|citeseerx = 10.1.1.89.5595}}</ref>
* [[Peter Purdy]] is one of the earliest pioneers working with augmented reality, beginning in the 1980s. He worked with international ski teams to develop the first wearable sensor and computing system for biometrics, helping them see information such as distance traveled, speed, and time elapsed. In the 1990s he worked on the first production of full-color augmented reality glasses and designed advanced stereo transparent virtual displays with head and body tracking systems for the aerospace industry.<ref>https://www.kickstarter.com/projects/1337201627/klip-the-revolutionary-wearable-computing-display/description</ref>


== In media ==
The [[science fiction|futuristic]] short film ''Sight''<ref>{{Cite web |url=https://vimeo.com/46304267 |title=Sight |last=Robot Genius |website=vimeo.com |access-date=18 June 2019|date=24 July 2012 }}</ref> features contact lens-like augmented reality devices.<ref>{{cite magazine|last1=Kosner|first1=Anthony Wing|title=Sight: An 8-Minute Augmented Reality Journey That Makes Google Glass Look Tame|url=https://www.forbes.com/sites/anthonykosner/2012/07/29/sight-an-8-minute-augmented-reality-journey-that-makes-google-glass-look-tame/|magazine=Forbes|access-date=3 August 2015|date=29 July 2012}}</ref><ref>{{cite web|last1=O'Dell|first1=J.|title=Beautiful short film shows a frightening future filled with Google Glass-like devices|url=https://venturebeat.com/2012/07/27/sight-systems/|access-date=3 August 2015|date=27 July 2012}}</ref>

== History ==
* 1901: [[L. Frank Baum]], an author, first mentions the idea of an electronic display/spectacles that overlays data onto real life (in this case 'people'). It is named a 'character marker'.<ref>Johnson, Joel. [https://web.archive.org/web/20130522153011/http://moteandbeam.net/the-master-key-l-frank-baum-envisions-ar-glasses-in-1901 "The Master Key": L. Frank Baum envisions augmented reality glasses in 1901] ''Mote & Beam'' 10 September 2012.</ref>
* 1957–62: [[Morton Heilig]], a cinematographer, creates and patents a simulator called [[Sensorama]] with visuals, sound, vibration, and smell.<ref>{{cite web|url=http://www.google.com/patents?q=3050870|title=3050870 – Google Search|work=google.com|accessdate=2 July 2015}}</ref>
* 1968: [[Ivan Sutherland]] invents the [[head-mounted display]] and positions it as a window into a virtual world.<ref>{{cite web|url=http://90.146.8.18/en/archiv_files/19902/E1990b_123.pdf |title=Archived copy |accessdate=2014-02-19 |deadurl=yes |archiveurl=https://web.archive.org/web/20140123204209/http://90.146.8.18/en/archiv_files/19902/E1990b_123.pdf |archivedate=23 January 2014 |df=dmy }}</ref>
* 1975: [[Myron Krueger]] creates [[Videoplace]] to allow users to interact with virtual objects.
* 1980: The research by Gavan Lintern of the University of Illinois is the first published work to show the value of a [[Head-up display|heads up display]] for teaching real-world flight skills.<ref name=":0"/>
* 1980: [[Steve Mann]] creates the first wearable computer, a computer vision system with text and graphical overlays on a photographically mediated scene.<ref>{{cite news|last=Mann |first=Steve |url=http://techland.time.com/2012/11/02/eye-am-a-camera-surveillance-and-sousveillance-in-the-glassage |title=Eye Am a Camera: Surveillance and Sousveillance in the Glassage |publisher=Techland.time.com |date=2012-11-02 |accessdate=2013-10-14}}</ref> See [[EyeTap]]. See [[Head-up display|Heads Up Display]].
* 1981: Dan Reitan geospatially maps multiple weather radar images and space-based and studio cameras to earth maps and abstract symbols for television weather broadcasts, bringing a precursor concept to augmented reality (mixed real/graphical images) to TV.<ref>{{cite web|url=http://www.etceter.com/c-news/p-google-glasses-project/ |title=Archived copy |accessdate=2014-02-21 |deadurl=yes |archiveurl=https://web.archive.org/web/20131003224001/http://www.etceter.com/c-news/p-google-glasses-project/ |archivedate=3 October 2013 |df=dmy }}</ref>
* 1984: In the film ''[[The Terminator]]'', the Terminator uses a [[Head-up display|heads-up display]] in several parts of the film. In one part, he accesses a diagram of the gear system of the truck he gets into towards the end of the film.
* 1987: Douglas George and Robert Morris create a working prototype of an astronomical telescope-based "[[Head-up display|heads-up display]]" system (a precursor concept to augmented reality) which superimposed in the telescope eyepiece, over the actual sky images, multi-intensity star, and celestial body images, and other relevant information.<ref>George, D.B., "A COMPUTER-DRIVEN ASTRONOMICAL TELESCOPE GUIDANCE AND CONTROL SYSTEM WITH SUPERIMPOSED STARFIELD AND CELESTIAL COORDINATE GRAPHICS DISPLAY", M.Eng. Thesis, Carleton University, Ottawa, Oct. 1987.</ref><ref>George, Douglas; Morris, Robert. "A Computer-driven Astronomical Telescope Guidance and Control System with Superimposed Star Field and Celestial Coordinate Graphics Display" pp. 32–41, J. Roy. Astron. Soc. Can., Vol. 83, No. 1, 1989.</ref>
* 1987: Insight Inc. was founded by Pete Purdy and created, through K2, the first Doppler biometric embedded-system computer for downhill skiers, the Accelorator, designed by Purdy<ref>https://www.kickstarter.com/projects/1337201627/klip-the-revolutionary-wearable-computing-display/description</ref>
* 1990: Pete Purdy and Tom Furness co-found HIT Lab (Human Interface Technology Laboratories) at the University of Washington
* 1990: The term 'Augmented Reality' is attributed to Thomas P. Caudell, a former [[Boeing]] researcher.<ref>{{cite journal |last=Lee |first=Kangdon |title=Augmented Reality in Education and Training |journal=Techtrends: Linking Research & Practice to Improve Learning |date=March 2012 |volume=56 |issue=2 |accessdate=2014-05-15|url = http://www2.potsdam.edu/betrusak/566/Augmented%20Reality%20in%20Education.pdf}}</ref>
* 1992: [[Louis Rosenberg (entrepreneur)|Louis Rosenberg]] developed one of the first functioning AR systems, called [[Virtual fixture|Virtual Fixtures]], at the United States Air Force Research Laboratory—Armstrong, that demonstrated benefit to human perception.<ref name="B. Rosenberg 1992" />
* 1992: US Patent 5162828 for a display system for a head-mounted viewing transparency granted to Pete Purdy, Tom Furness, Robert Fisher, and Kurt Beach<ref>https://patents.justia.com/inventor/peter-k-purdy</ref>
* 1992: Pete Purdy designed and produced the first scuba diving heads-up display mask with a wireless link to the algorithmic dive computer, the Scubapro dive mask HUD
* 1993: [[Steven Feiner]], [[Blair MacIntyre]] and [[Doree Seligmann]] present an early paper on an AR system prototype, KARMA, at the Graphics Interface conference.
* 1993: Mike Abernathy, et al., report the first use of augmented reality in identifying space debris using [[Rockwell Collins|Rockwell]] WorldView by overlaying satellite geographic trajectories on live telescope video.<ref name = "ABER93"/>
* 1993: A widely cited version of the paper above is published in [[Communications of the ACM]] – Special issue on computer augmented environments, edited by Pierre Wellner, Wendy Mackay, and Rich Gold.<ref>{{cite web|last=Wellner|first=Pierre|title=Computer Augmented Environments: back to the real world|url=http://dl.acm.org/citation.cfm?id=159544|publisher=ACM|accessdate=2012-07-28}}</ref>
* 1993: [[Loral Corporation|Loral WDL]], with sponsorship from [[United States Army Simulation and Training Technology Center|STRICOM]], performed the first demonstration combining live AR-equipped vehicles and manned simulators. Unpublished paper, J. Barrilleaux, "Experiences and Observations in Applying Augmented Reality to Live Training", 1999.<ref>Barrilleaux, Jon. [[:File:Experiences and Observations in Applying Augmented Reality to Live Training.pdf|Experiences and Observations in Applying Augmented Reality to Live Training]].</ref>
* 1993: Pete Purdy designed a fully transparent stereographic virtual augmented reality display for the aerospace industry, for Boeing Computer Services, to assist with aircraft wiring
* 1994: Julie Martin creates the first 'Augmented Reality Theater production', ''Dancing in Cyberspace'', funded by the [[Australia Council for the Arts]], featuring dancers and [[acrobatics|acrobats]] manipulating body-sized virtual objects in real time, projected into the same physical space and performance plane. The acrobats appeared immersed within the virtual objects and environments. The installation used [[Silicon Graphics]] computers and a Polhemus sensing system.
* 1995: S. Ravela et al. at [[University of Massachusetts]] introduce a vision-based system using monocular cameras to track objects (engine blocks) across views for augmented reality.
* 1996: Motion Research develops and produces a helmet-mounted HUD for motorcyclists
* 1998: Spatial augmented reality introduced at the [[University of North Carolina at Chapel Hill]] by [[Ramesh Raskar]], Greg Welch, and [[Henry Fuchs]].<ref name="raskarSAR" />
* 1999: Frank Delgado, Mike Abernathy et al. report successful flight test of LandForm software video map overlay from a helicopter at Army Yuma Proving Ground overlaying video with runways, taxiways, roads and road names.<ref name="DELG99" /><ref name = "DELG00" />
* 1999: The [[United States Naval Research Laboratory|US Naval Research Laboratory]] engages on a decade-long research program called the Battlefield Augmented Reality System (BARS) to prototype some of the early wearable systems for dismounted soldier operating in urban environment for situation awareness and training.<ref>[http://www.nrl.navy.mil/itd/imda/research/5581/augmented-reality NRL BARS Web page]</ref>
* 1999: Hirokazu Kato (加藤 博一) created [[ARToolKit]] at [[HITLab]], where AR later was further developed by other [[HITLab]] scientists, demonstrating it at [[SIGGRAPH]].
* 2000: [[Bruce H. Thomas]] develops [[ARQuake]], the first outdoor mobile AR game, demonstrating it in the [[International Symposium on Wearable Computers]].
* 2001: NASA X-38 flown using LandForm software video map overlays at [[Dryden Flight Research Center]].<ref name = " AN2001">AviationNow.com Staff, "X-38 Test Features Use Of Hybrid Synthetic Vision" AviationNow.com, December 11, 2001</ref>
* 2004: Outdoor helmet-mounted AR system demonstrated by [[Trimble Navigation]] and the Human Interface Technology Laboratory (HIT lab).<ref name="Outdoor AR"/>
* 2008: Wikitude AR Travel Guide launches on 20 Oct 2008 with the [[HTC Dream|G1 Android phone]].<ref>[https://www.youtube.com/watch?v=8EA8xlicmT8 Wikitude AR Travel Guide]. Youtube.com. Retrieved 2012-06-09.</ref>
* 2009: ARToolkit was ported to [[Adobe Flash]] (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.<ref>Cameron, Chris. [http://www.readwriteweb.com/archives/flash-based_ar_gets_high-quality_markerless_upgrade.php Flash-based AR Gets High-Quality Markerless Upgrade], ''ReadWriteWeb'' 9 July 2010.</ref>
* 2009: Cell phone and glasses display patent (US Patent 7,631,968) granted to Peter Purdy and Dominic Dobson. It describes the physical unit built to allow the connection from a source to the user interface.<ref>http://www.seobythesea.com/2012/04/google-acquires-glasses-patents/</ref>
* 2010: Design of mine detection robot for Korean mine field.<ref name="ieeexplore-ieee-org.mutex.gmu.edu"/>
* 2010: US Patent 7,675,683 granted to Pete Purdy and partners for a light and display system that allows a unit to display data on the viewing window for users to view<ref>http://www.seobythesea.com/2012/04/google-acquires-glasses-patents/</ref>
* 2012: Launch of [[LyteShot|Lyteshot]], an interactive AR gaming platform that utilizes smart glasses for game data
* 2013: Meta announces the Meta 1 developer kit.<ref>{{Cite news|url=https://www.slashgear.com/meta-plans-true-augmented-reality-with-epson-powered-wearable-28266900/|title=Meta plans true augmented reality with Epson-powered wearable|date=2013-01-28|work=SlashGear|access-date=2018-08-31|language=en-US}}</ref><ref>{{Cite news|url=https://www.roadtovr.com/meta-01-augmented-reality-glasses-pre-order-price/|title=Meta 01 Augmented Reality Glasses Available for Pre-order for $667|last=Lang|first=Ben|date=2013-08-13|work=Road to VR|access-date=2018-08-31|language=en-US}}</ref>
* 2013: [[Google]] announces an open beta test of its [[Google Glass]] augmented reality glasses. The glasses reach the Internet through Bluetooth, which connects to the wireless service on a user's cellphone. The glasses respond when a user speaks, touches the frame or moves the head.<ref>Miller, Claire. [https://www.nytimes.com/2013/02/21/technology/google-looks-to-make-its-computer-glasses-stylish.html?pagewanted=all&_r=0], ''New York Times'' 20 February 2013.</ref><ref>{{Cite web|url=http://images.huffingtonpost.com/2016-05-13-1463155843-8474094-AR_history_timeline.jpg|title=Timeline of Augmented Reality (Huffington Post)|last=|first=|date=|website=huffingtonpost.com|access-date=}}</ref>
* 2014: Mahei creates the first generation of augmented reality enhanced educational toys.<ref>{{cite web|url=http://mahei.es/index.php?lang=en|title=Mahei :: Augmented reality for mobile devices|work=mahei.es|accessdate=2 July 2015}}</ref>
* 2015: [[Microsoft]] announces [[Windows Holographic]] and the [[HoloLens]] augmented reality headset. The headset utilizes various sensors and a processing unit to blend high definition "holograms" with the real world.<ref>Microsoft Channel, YouTube [https://www.youtube.com/watch?v=aThCr0PsyuA], 23 January 2015.</ref>
* 2016: [[Niantic, Inc.|Niantic]] released ''[[Pokémon Go]]'' for [[iOS]] and [[Android (operating system)|Android]] in July 2016. The game quickly became one of the most popular smartphone applications and in turn spikes the popularity of augmented reality games.<ref>{{cite news|last1=Bond|first1=Sarah|title=After the Success Of Pokémon Go, How Will Augmented Reality Impact Archaeological Sites?|url=https://www.forbes.com/sites/drsarahbond/2016/07/17/after-the-success-of-pokemon-go-how-will-augmented-reality-impact-archaeological-sites/|accessdate=July 17, 2016|date=July 17, 2016}}</ref>
* 2017: [[Magic Leap]] announces the use of [[Digital Lightfield]] technology embedded into the [[Magic Leap One]] headset. The Creator Edition headset includes the glasses and a computing pack worn on the belt.<ref>C|NET [https://www.cnet.com/products/magic-leap-one/preview/], 20 December 2017.</ref>


==See also==
{{div col|colwidth=22em}}
* [[Alternate reality game]]
* {{annotated link|ARTag}}
* {{annotated link|Browser extension|aka=Augmented browsing}}
* {{annotated link|Augmented reality-based testing}}
* {{annotated link|Augmented web}}
* {{annotated link|Automotive head-up display}}
* {{annotated link|Bionic contact lens}}
* [[Brain in a vat]]
* {{annotated link|Computer-mediated reality}}
* {{annotated link|Cyborg}}
* [[EyeTap]]
* {{annotated link|Head-mounted display}}
* {{annotated link|Holography}}
* [[Lifelike experience]]
* {{annotated link|List of augmented reality software}}
* {{annotated link|Location-based service}}
* [[Magic Leap]]
* {{annotated link|Mixed reality}}
* {{annotated link|Optical head-mounted display}}
* {{annotated link|Simulated reality}}
* {{annotated link|Smartglasses}}
* [[Transreality gaming]]
* [[Video mapping]]
* [[Viractualism]]
* {{annotated link|Virtual reality}}
* {{annotated link|Visuo-haptic mixed reality}}
* {{annotated link|Wearable computer}}
{{div col end}}


== References ==
{{Reflist|30em}}


== Sources ==
* {{ cite journal | title = Property Rights in Augmented Reality | first = Declan T | last = Conroy | journal = [[Michigan Telecommunications and Technology Law Review]] | volume = 24 | issue = 1 | date = 2017 | url = https://repository.law.umich.edu/mttlr/vol24/iss1/2 }}
* {{ cite journal | title = When the Virtual and Real Worlds Collide: Beginning to Address the Clash Between Real Property Rights and Augmented Reality Location-Based Technologies Through a Federal Do-Not-Locate Registry | first = William | last = McClure | journal = [[Iowa Law Review]] | volume = 103 | number = 1 | date = 1 November 2017 | ssrn = 3906369 }}
* {{ cite journal | title = Augmenting Property Law: Applying the Right to Exclude in the Augmented Reality Universe | first = Samuel | last = Mallick | journal = [[Vanderbilt Journal of Entertainment and Technology Law]] | volume = 19 | issue = 4 | date = 2020 | url = https://scholarship.law.vanderbilt.edu/jetlaw/vol19/iss4/6/ }}

== External links ==
{{Commons category-inline}}
{{Extended reality|state=collapsed}}
{{Authority control}}


{{DEFAULTSORT:Augmented reality}}
[[Category:Augmented reality| ]]
[[Category:Extended reality]]
[[Category:Applications of computer vision]]
[[Category:Advertising techniques]]
[[Category:Marketing techniques]]
[[Category:Promotion and marketing communications]]
[[Category:User interface techniques]]
[[Category:21st-century inventions]]
[[Category:3D GUIs]]

Latest revision as of 07:13, 26 December 2024

Virtual Fixtures – first AR system, U.S. Air Force, Wright-Patterson Air Force Base (1992)

Augmented reality (AR) is an interactive experience that combines the real world and computer-generated 3D content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory.[1] AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects.[2] The overlaid sensory information can be constructive (i.e. additive to the natural environment), or destructive (i.e. masking of the natural environment).[3] As such, it is one of the key technologies in the reality-virtuality continuum.[4]

This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment.[3] In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one.[5][6]

Augmented reality is largely synonymous with mixed reality. There is also overlap in terminology with extended reality and computer-mediated reality.

The primary value of augmented reality is the manner in which components of the digital world blend into a person's perception of the real world, not as a simple display of data, but through the integration of immersive sensations, which are perceived as natural parts of an environment. The earliest functional AR systems that provided immersive mixed reality experiences for users were invented in the early 1990s, starting with the Virtual Fixtures system developed at the U.S. Air Force's Armstrong Laboratory in 1992.[3][7][8] Commercial augmented reality experiences were first introduced in entertainment and gaming businesses.[9] Subsequently, augmented reality applications have spanned commercial industries such as education, communications, medicine, and entertainment. In education, content may be accessed by scanning or viewing an image with a mobile device or by using markerless AR techniques.[10][11][12]

Augmented reality can be used to enhance natural environments or situations and offers perceptually enriched experiences. With the help of advanced AR technologies (e.g. adding computer vision, incorporating AR cameras into smartphone applications, and object recognition) the information about the surrounding real world of the user becomes interactive and digitally manipulated.[13] Information about the environment and its objects is overlaid on the real world. This information can be virtual, in the sense that augmented reality is any experience which is artificial and which adds to the already existing reality,[14][15][16][17][18] or real, e.g. seeing other real sensed or measured information such as electromagnetic radio waves overlaid in exact alignment with where they actually are in space.[19][20][21] Augmented reality also has a lot of potential in the gathering and sharing of tacit knowledge. Augmentation techniques are typically performed in real time and in semantic contexts with environmental elements. Immersive perceptual information is sometimes combined with supplemental information like scores over a live video feed of a sporting event. This combines the benefits of both augmented reality technology and head-up display (HUD) technology.

Comparison with virtual reality


In virtual reality (VR), the users' perception is completely computer-generated, whereas with augmented reality (AR), it is partially generated and partially from the real world.[22][23] For example, in architecture, VR can be used to create a walk-through simulation of the inside of a new building; and AR can be used to show a building's structures and systems superimposed on a real-life view. Another example is the use of utility applications. Some AR applications, such as Augment, enable users to apply digital objects into real environments, allowing businesses to use augmented reality devices as a way to preview their products in the real world.[24] Similarly, it can also be used to demo what products may look like in an environment for customers, as demonstrated by companies such as Mountain Equipment Co-op or Lowe's, who use augmented reality to allow customers to preview what their products might look like at home through the use of 3D models.[25]

Augmented reality (AR) differs from virtual reality (VR) in the sense that in AR part of the surrounding environment is 'real' and AR only adds layers of virtual objects to the real environment. In VR, by contrast, the surrounding environment is completely virtual and computer-generated. A demonstration of how AR layers objects onto the real world can be seen with augmented reality games. WallaMe is an augmented reality game application that allows users to hide messages in real environments anywhere in the world, utilizing geolocation technology.[26] Such applications have many uses, including in activism and artistic expression.[27]
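As an illustration of the kind of geolocation check such an application might perform (a minimal sketch under assumed behaviour; WallaMe's actual implementation is not described here), a hidden message can be revealed only when the device reports a position within a chosen radius of the point where the message was anchored:

  import math

  def haversine_m(lat1, lon1, lat2, lon2):
      # Great-circle distance between two latitude/longitude points, in metres.
      r = 6371000.0  # mean Earth radius
      p1, p2 = math.radians(lat1), math.radians(lat2)
      dp = math.radians(lat2 - lat1)
      dl = math.radians(lon2 - lon1)
      a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
      return 2 * r * math.asin(math.sqrt(a))

  # Hypothetical anchored message, visible only within 25 m of where it was hidden.
  anchor = {"lat": 48.8584, "lon": 2.2945, "text": "Hidden note", "radius_m": 25.0}

  def visible_messages(device_lat, device_lon, messages):
      # Return the messages whose anchor lies within their visibility radius.
      return [m for m in messages
              if haversine_m(device_lat, device_lon, m["lat"], m["lon"]) <= m["radius_m"]]

  print(visible_messages(48.8585, 2.2946, [anchor]))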

History

  • 1901: L. Frank Baum, an author, first mentions the idea of an electronic display/spectacles that overlays data onto real life (in this case 'people'). It is named a 'character marker'.[28]
  • 1957–62: Morton Heilig, a cinematographer, creates and patents a simulator called Sensorama with visuals, sound, vibration, and smell.
  • 1968: Ivan Sutherland creates the first head-mounted display that has graphics rendered by a computer.[29]
  • 1975: Myron Krueger creates Videoplace to allow users to interact with virtual objects.
  • 1980: The research by Gavan Lintern of the University of Illinois is the first published work to show the value of a heads up display for teaching real-world flight skills.[30]
  • 1980: Steve Mann creates the first wearable computer, a computer vision system with text and graphical overlays on a photographically mediated scene.[31]
  • 1986: Within IBM, Ron Feigenblatt describes the most widely experienced form of AR today (viz. "magic window," e.g. smartphone-based Pokémon Go), use of a small, "smart" flat panel display positioned and oriented by hand.[32][33]
  • 1987: Douglas George and Robert Morris create a working prototype of an astronomical telescope-based "heads-up display" system (a precursor concept to augmented reality) which superimposed in the telescope eyepiece, over the actual sky images, multi-intensity star, and celestial body images, and other relevant information.[34]
  • 1990: The term augmented reality is attributed to Thomas P. Caudell, a former Boeing researcher.[35]
  • 1992: Louis Rosenberg developed one of the first functioning AR systems, called Virtual Fixtures, at the United States Air Force Research Laboratory—Armstrong, that demonstrated benefit to human perception.[36]
  • 1992: Steven Feiner, Blair MacIntyre and Doree Seligmann present an early paper on an AR system prototype, KARMA, at the Graphics Interface conference.
  • 1993: The CMOS active-pixel sensor, a type of metal–oxide–semiconductor (MOS) image sensor, was developed at NASA's Jet Propulsion Laboratory.[37] CMOS sensors are later widely used for optical tracking in AR technology.[38]
  • 1993: Mike Abernathy, et al., report the first use of augmented reality in identifying space debris using Rockwell WorldView by overlaying satellite geographic trajectories on live telescope video.[39]
  • 1993: A widely cited version of the paper above is published in Communications of the ACM – Special issue on computer augmented environments, edited by Pierre Wellner, Wendy Mackay, and Rich Gold.[40]
  • 1993: Loral WDL, with sponsorship from STRICOM, performed the first demonstration combining live AR-equipped vehicles and manned simulators. Unpublished paper, J. Barrilleaux, "Experiences and Observations in Applying Augmented Reality to Live Training", 1999.[41]
  • 1994: Julie Martin creates first 'Augmented Reality Theater production', Dancing in Cyberspace, funded by the Australia Council for the Arts, features dancers and acrobats manipulating body-sized virtual objects in real time, projected into the same physical space and performance plane. The acrobats appeared immersed within the virtual objects and environments. The installation used Silicon Graphics computers and Polhemus sensing system.
  • 1996: General Electric develops system for projecting information from 3D CAD models onto real-world instances of those models.[42]
  • 1998: Spatial augmented reality introduced at University of North Carolina at Chapel Hill by Ramesh Raskar, Greg Welch, Henry Fuchs.[43]
  • 1999: Frank Delgado, Mike Abernathy et al. report successful flight test of LandForm software video map overlay from a helicopter at Army Yuma Proving Ground overlaying video with runways, taxiways, roads and road names.[44][45]
  • 1999: The US Naval Research Laboratory engages on a decade-long research program called the Battlefield Augmented Reality System (BARS) to prototype some of the early wearable systems for dismounted soldier operating in urban environment for situation awareness and training.[46]
  • 1999: NASA X-38 flown using LandForm software video map overlays at Dryden Flight Research Center.[47]
  • 2000: Rockwell International Science Center demonstrates tetherless wearable augmented reality systems receiving analog video and 3-D audio over radio-frequency wireless channels. The systems incorporate outdoor navigation capabilities, with digital horizon silhouettes from a terrain database overlain in real time on the live outdoor scene, allowing visualization of terrain made invisible by clouds and fog.[48][49]
  • 2004: An outdoor helmet-mounted AR system was demonstrated by Trimble Navigation and the Human Interface Technology Laboratory (HIT lab).[50]
  • 2006: Outland Research develops AR media player that overlays virtual content onto a user's view of the real world synchronously with playing music, thereby providing an immersive AR entertainment experience.[51][52]
  • 2008: Wikitude AR Travel Guide launches on 20 Oct 2008 with the G1 Android phone.[53]
  • 2009: ARToolkit was ported to Adobe Flash (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.[54]
  • 2012: Launch of Lyteshot, an interactive AR gaming platform that utilizes smart glasses for game data
  • 2015: Microsoft announced the HoloLens augmented reality headset, which uses various sensors and a processing unit to display virtual imagery over the real world.[55]
  • 2015: Snap, Inc. releases "Lenses", augmented reality filters in the Snapchat application. [56]
  • 2016: Niantic released Pokémon Go for iOS and Android in July 2016. The game quickly became one of the most popular smartphone applications and in turn spikes the popularity of augmented reality games.[57]
  • 2018: Magic Leap launched the Magic Leap One augmented reality headset.[58] Leap Motion announced the Project North Star augmented reality headset, and later released it under an open source license.[59][60][61][62]
  • 2019: Microsoft announced HoloLens 2 with significant improvements in terms of field of view and ergonomics.[63]
  • 2022: Magic Leap launched the Magic Leap 2 headset.[64]
  • 2024: Meta Platforms revealed the Orion AR glasses prototype. [65]

Hardware

A man wearing an augmented reality headset

Augmented reality requires hardware components including a processor, display, sensors, and input devices. Modern mobile computing devices like smartphones and tablet computers contain these elements, which often include a camera and microelectromechanical systems (MEMS) sensors such as an accelerometer, GPS, and solid state compass, making them suitable AR platforms.[66][67]
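As an illustration of why those MEMS sensors matter for AR, the sketch below (a generic algorithm with made-up sensor values, not any particular phone API) fuses gyroscope and accelerometer readings with a simple complementary filter to estimate the device's roll and pitch, which an AR renderer needs in order to keep overlays level with the real world:

  import math

  def complementary_filter(prev_roll, prev_pitch, gyro, accel, dt, alpha=0.98):
      # gyro  = (gx, gy, gz) angular rates in rad/s
      # accel = (ax, ay, az) accelerations in m/s^2, gravity included
      gx, gy, _ = gyro
      ax, ay, az = accel

      # Integrated angular rate: responsive but drifts over time.
      roll_gyro = prev_roll + gx * dt
      pitch_gyro = prev_pitch + gy * dt

      # Tilt from the gravity direction: drift-free but noisy during motion.
      roll_acc = math.atan2(ay, az)
      pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))

      # Blend the two estimates.
      roll = alpha * roll_gyro + (1 - alpha) * roll_acc
      pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
      return roll, pitch

  # One 20 ms update with illustrative readings: slight rotation, device nearly flat.
  roll, pitch = complementary_filter(0.0, 0.0, gyro=(0.01, -0.02, 0.0),
                                     accel=(0.3, -0.2, 9.8), dt=0.02)
  print(math.degrees(roll), math.degrees(pitch))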

Displays


Various technologies can be used to display augmented reality, including optical projection systems, monitors, and handheld devices. Two of the display technologies used in augmented reality are diffractive waveguides and reflective waveguides.

A head-mounted display (HMD) is a display device worn on the forehead, such as a harness or helmet-mounted. HMDs place images of both the physical world and virtual objects over the user's field of view. Modern HMDs often employ sensors for six degrees of freedom monitoring that allow the system to align virtual information to the physical world and adjust accordingly with the user's head movements.[68][69][70] When using AR technology, the HMDs only require relatively small displays. In this situation, liquid crystals on silicon (LCOS) and micro-OLED (organic light-emitting diodes) are commonly used.[71] HMDs can provide VR users with mobile and collaborative experiences.[72] Specific providers, such as uSens and Gestigon, include gesture controls for full virtual immersion.[73][74]
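A minimal sketch of that registration step (a toy example, not any particular headset's SDK): given the tracked head pose as a rotation matrix and a position, a world-anchored virtual point is transformed into the viewer's coordinate frame every frame, so it appears to stay fixed in the room while the head moves:

  import numpy as np

  def view_matrix(head_rotation, head_position):
      # Build a 4x4 world-to-view matrix from the tracked head pose.
      view = np.eye(4)
      view[:3, :3] = head_rotation.T                 # inverse of a rotation is its transpose
      view[:3, 3] = -head_rotation.T @ head_position
      return view

  def to_view_space(point_world, head_rotation, head_position):
      # Transform a world-anchored point into the viewer's coordinate frame.
      p = np.append(point_world, 1.0)                # homogeneous coordinates
      return (view_matrix(head_rotation, head_position) @ p)[:3]

  # A virtual label anchored 2 m in front of the room origin.
  anchor = np.array([0.0, 0.0, -2.0])

  # Head at the origin: first looking straight ahead, then turned 30 degrees to the left.
  theta = np.radians(30)
  turned = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                     [0.0, 1.0, 0.0],
                     [-np.sin(theta), 0.0, np.cos(theta)]])

  print(to_view_space(anchor, np.eye(3), np.zeros(3)))  # directly ahead of the viewer
  print(to_view_space(anchor, turned, np.zeros(3)))     # shifts in view space as the head turns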

Vuzix is a company that has produced a number of head-worn optical see through displays marketed for augmented reality.[75][76][77]

Eyeglasses


AR displays can be rendered on devices resembling eyeglasses. Versions include eyewear that employs cameras to intercept the real world view and re-display its augmented view through the eyepieces[78] and devices in which the AR imagery is projected through or reflected off the surfaces of the eyewear lens pieces.[79][80][81]

The EyeTap (also known as Generation-2 Glass[82]) captures rays of light that would otherwise pass through the center of the lens of the wearer's eye, and substitutes synthetic computer-controlled light for each ray of real light. The Generation-4 Glass[82] (Laser EyeTap) is similar to the VRD (i.e. it uses a computer-controlled laser light source) except that it also has infinite depth of focus and causes the eye itself to, in effect, function as both a camera and a display by way of exact alignment with the eye and resynthesis (in laser light) of rays of light entering the eye.[83]

HUD
Headset computer

A head-up display (HUD) is a transparent display that presents data without requiring users to look away from their usual viewpoints. A precursor technology to augmented reality, head-up displays were first developed for pilots in the 1950s, projecting simple flight data into their line of sight, thereby enabling them to keep their "heads up" and not look down at the instruments. Near-eye augmented reality devices can be used as portable head-up displays as they can show data, information, and images while the user views the real world. Many definitions of augmented reality define it only as the overlay of information.[84][85] This is essentially what a head-up display does; in practice, however, augmented reality is expected to include registration and tracking between the superimposed perceptions, sensations, information, data, and images and some portion of the real world.[86]

Contact lenses


Contact lenses that display AR imaging are in development. These bionic contact lenses might contain the elements for display embedded into the lens including integrated circuitry, LEDs and an antenna for wireless communication.

The first contact lens display was patented in 1999 by Steve Mann and was intended to work in combination with AR spectacles, but the project was abandoned;[87][88] work on contact lens displays resumed 11 years later, in 2010–2011.[89][90][91][92] Another version of contact lenses, in development for the U.S. military, is designed to function with AR spectacles, allowing soldiers to focus on close-to-the-eye AR images on the spectacles and distant real-world objects at the same time.[93][94]

At CES 2013, a company called Innovega also unveiled similar contact lenses that required being combined with AR glasses to work.[95]

Many scientists have been working on contact lenses capable of different technological feats. A patent filed by Samsung describes an AR contact lens that, when finished, would include a built-in camera on the lens itself.[96] The design is intended to be controlled by blinking an eye. It is also intended to be linked with the user's smartphone to review footage and control it separately. When successful, the lens would feature a camera or sensor inside it, which reportedly could be anything from a light sensor to a temperature sensor.

The first publicly unveiled working prototype of an AR contact lens that did not require the use of glasses was developed by Mojo Vision and announced and demonstrated at CES 2020.[97][98][99]

Virtual retinal display


A virtual retinal display (VRD) is a personal display device under development at the University of Washington's Human Interface Technology Laboratory under Dr. Thomas A. Furness III.[100] With this technology, a display is scanned directly onto the retina of a viewer's eye. This results in bright images with high resolution and high contrast. The viewer sees what appears to be a conventional display floating in space.[101]

Several tests were conducted to analyze the safety of the VRD.[100] In one test, patients with partial loss of vision—having either macular degeneration (a disease that degenerates the retina) or keratoconus—were selected to view images using the technology. In the macular degeneration group, five out of eight subjects preferred the VRD images to the cathode-ray tube (CRT) or paper images, finding them better and brighter and able to show equal or better resolution. All of the keratoconus patients could resolve smaller lines in several line tests using the VRD than with their own correction. They also found the VRD images easier to view and sharper. As a result of these tests, the virtual retinal display is considered a safe technology.

Virtual retinal display creates images that can be seen in ambient daylight and ambient room light. The VRD is considered a preferred candidate to use in a surgical display due to its combination of high resolution and high contrast and brightness. Additional tests show high potential for VRD to be used as a display technology for patients that have low vision.

Handheld


A handheld display employs a small display that fits in a user's hand. All handheld AR solutions to date have opted for video see-through. Initially handheld AR employed fiducial markers,[102] and later GPS units and MEMS sensors such as digital compasses and six-degrees-of-freedom accelerometer–gyroscopes. Today, simultaneous localization and mapping (SLAM) markerless trackers such as PTAM (parallel tracking and mapping) are starting to come into use. Handheld display AR promises to be the first commercial success for AR technologies. The two main advantages of handheld AR are the portable nature of handheld devices and the ubiquitous nature of camera phones. The disadvantages are the physical constraint of the user having to hold the handheld device out in front of them at all times, as well as the distorting effect of classically wide-angled mobile phone cameras when compared to the real world as viewed through the eye.[103]
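
As a rough illustration of the markerless tracking that handheld AR builds on, the following Python/OpenCV sketch tracks corner features from one camera frame to the next with optical flow. The image file names are placeholders, and a full SLAM system such as PTAM additionally builds a map and estimates a complete camera pose.

    # Minimal sketch of frame-to-frame feature tracking with OpenCV, the kind of
    # low-level building block markerless (SLAM-style) handheld AR relies on.
    # "frame0.png" and "frame1.png" are placeholder file names.
    import cv2
    import numpy as np

    prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

    # Detect corners in the previous frame, then track them into the current one.
    corners = cv2.goodFeaturesToTrack(prev, maxCorners=200,
                                      qualityLevel=0.01, minDistance=7)
    tracked, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, corners, None)

    good_prev = corners[status.flatten() == 1]
    good_curr = tracked[status.flatten() == 1]

    # A real tracker would feed these correspondences into pose estimation and
    # mapping; here we only report how many features survived and their motion.
    motion = np.mean(np.linalg.norm(good_curr - good_prev, axis=2)) if len(good_prev) else 0.0
    print(f"tracked {len(good_prev)} features, mean displacement {motion:.2f} px")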

Projection mapping


Projection mapping augments real-world objects and scenes without the use of special displays such as monitors, head-mounted displays or hand-held devices. Projection mapping makes use of digital projectors to display graphical information onto physical objects. The key difference in projection mapping is that the display is separated from the users of the system. Since the displays are not associated with each user, projection mapping scales naturally up to groups of users, allowing for collocated collaboration between users.
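
A minimal sketch of the underlying mechanics: if the pose of a flat target surface relative to the projector is known (for example from a prior projector–camera calibration), the content can be pre-warped with a homography so that it lands undistorted on the physical surface. The corner coordinates and file names in this Python/OpenCV sketch are illustrative assumptions.

    # Minimal sketch: pre-warping an image with a homography so that a projector
    # maps it onto a flat physical surface. Corner coordinates are assumed to
    # come from a prior projector-camera calibration step.
    import cv2
    import numpy as np

    content = cv2.imread("content.png")          # placeholder graphic to project
    h, w = content.shape[:2]

    # Corners of the content image and the corresponding corners of the target
    # surface in projector coordinates (illustrative values).
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32([[120, 80], [980, 130], [940, 690], [150, 640]])

    H, _ = cv2.findHomography(src, dst)
    projector_frame = cv2.warpPerspective(content, H, (1280, 720))
    cv2.imwrite("projector_frame.png", projector_frame)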

Examples include shader lamps, mobile projectors, virtual tables, and smart projectors. Shader lamps mimic and augment reality by projecting imagery onto neutral objects, providing the opportunity to enhance the object's appearance using only a simple unit—a projector, camera, and sensor.

Other applications include table and wall projections. Virtual showcases, which employ beam splitter mirrors together with multiple graphics displays, provide an interactive means of simultaneously engaging with the virtual and the real.

A projection mapping system can display on any number of surfaces in an indoor setting at once. Projection mapping supports both a graphical visualization and passive haptic sensation for the end users. Users are able to touch physical objects in a process that provides passive haptic sensation.[18][43][104][105]

Tracking


Modern mobile augmented-reality systems use one or more of the following motion tracking technologies: digital cameras and/or other optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, radio-frequency identification (RFID). These technologies offer varying levels of accuracy and precision. These technologies are implemented in the ARKit API by Apple and ARCore API by Google to allow tracking for their respective mobile device platforms.
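
Inertial and optical measurements are typically fused. As a much-simplified illustration (not the actual ARKit or ARCore implementation, which also fuse camera-based tracking), the following toy complementary filter combines gyroscope and accelerometer readings to estimate a single tilt angle; the sensor samples are made up.

    # Toy complementary filter: fusing gyroscope and accelerometer readings to
    # estimate a single tilt angle, a much-simplified stand-in for the inertial
    # part of mobile AR tracking. The sensor samples below are made up.
    ALPHA = 0.98      # trust in the integrated gyro vs. the accelerometer tilt
    DT = 0.01         # sample period in seconds (100 Hz)

    def fuse(angle, gyro_rate, accel_angle):
        """One filter step: integrate the gyro, correct drift with the accelerometer."""
        return ALPHA * (angle + gyro_rate * DT) + (1 - ALPHA) * accel_angle

    angle = 0.0
    samples = [(5.0, 0.1), (4.8, 0.2), (5.1, 0.3)]   # (gyro deg/s, accel-derived deg)
    for gyro_rate, accel_angle in samples:
        angle = fuse(angle, gyro_rate, accel_angle)
    print(f"estimated tilt: {angle:.3f} degrees")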

Input devices


Techniques include speech recognition systems that translate a user's spoken words into computer instructions, and gesture recognition systems that interpret a user's body movements by visual detection or from sensors embedded in a peripheral device such as a wand, stylus, pointer, glove or other body wear.[106][107][108][109] Products which are trying to serve as a controller of AR headsets include Wave by Seebright Inc. and Nimble by Intugine Technologies.

Computer


Computers are responsible for the graphics in augmented reality. For camera-based 3D tracking methods, a computer analyzes the sensed visual and other data to synthesize and position virtual objects. As technology and computers improve, augmented reality is expected to lead to a drastic change in one's perspective of the real world.[110]

Computers are improving at a very fast rate, leading to new ways to improve other technology. Computers are the core of augmented reality.[111] The computer receives data from the sensors to determine the relative position of an object's surface; this becomes input to the computer, which then outputs to the user by adding something that would otherwise not be there. The computer comprises memory and a processor.[112] The computer takes the scanned environment, generates images or video, and puts it on the receiver for the observer to see. The fixed marks on an object's surface are stored in the memory of the computer, and the computer also draws on its memory to present images realistically to the onlooker.

Projector


Projectors can also be used to display AR contents. The projector can throw a virtual object on a projection screen and the viewer can interact with this virtual object. Projection surfaces can be many objects such as walls or glass panes.[113]

Networking


Mobile augmented reality applications are gaining popularity because of the wide adoption of mobile and especially wearable devices. However, they often rely on computationally intensive computer vision algorithms with extreme latency requirements. To compensate for the lack of computing power, offloading data processing to a distant machine is often desired. Computation offloading introduces new constraints in applications, especially in terms of latency and bandwidth. Although there are a plethora of real-time multimedia transport protocols, there is a need for support from network infrastructure as well.[114]
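
A minimal sketch of computation offloading under these constraints: a camera frame is compressed and sent to a remote vision service, and the round-trip time is measured against the frame budget. The endpoint URL is hypothetical, and a production system would typically use a streaming transport rather than per-frame HTTP.

    # Minimal sketch of computation offloading: send a JPEG-encoded camera frame
    # to a remote vision service and time the round trip. The endpoint URL is
    # hypothetical; real systems favour streaming protocols over per-frame HTTP.
    import time
    import cv2
    import requests

    frame = cv2.imread("frame.png")                        # placeholder camera frame
    ok, jpeg = cv2.imencode(".jpg", frame)                 # compress to reduce bandwidth

    start = time.time()
    response = requests.post("http://ar-offload.example.com/detect",
                             data=jpeg.tobytes(),
                             headers={"Content-Type": "image/jpeg"},
                             timeout=0.2)                  # tight latency budget
    latency_ms = (time.time() - start) * 1000.0
    print(f"round trip: {latency_ms:.1f} ms, result: {response.json()}")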

Software and algorithms

Comparison of augmented reality fiducial markers for computer vision

A key measure of AR systems is how realistically they integrate virtual imagery with the real world. The software must derive real-world coordinates, independent of the camera, from the camera images. That process is called image registration, and it uses different methods of computer vision, mostly related to video tracking.[115][116] Many computer vision methods of augmented reality are inherited from visual odometry.

Usually those methods consist of two parts. The first stage is to detect interest points, fiducial markers or optical flow in the camera images. This step can use feature detection methods like corner detection, blob detection, edge detection or thresholding, and other image processing methods.[117][118] The second stage restores a real-world coordinate system from the data obtained in the first stage. Some methods assume objects with known geometry (or fiducial markers) are present in the scene; in some of those cases the scene's 3D structure should be calculated beforehand. If part of the scene is unknown, simultaneous localization and mapping (SLAM) can map relative positions. If no information about scene geometry is available, structure-from-motion methods like bundle adjustment are used. Mathematical methods used in the second stage include projective (epipolar) geometry, geometric algebra, rotation representation with the exponential map, Kalman and particle filters, nonlinear optimization, and robust statistics.[citation needed]
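
A condensed Python/OpenCV sketch of these two stages, assuming the 3D positions of four detected points are already known (for instance, the corners of a fiducial of known size); the file name, camera intrinsics, and point coordinates are illustrative.

    # Condensed sketch of the two-stage pipeline: (1) detect interest points in
    # the camera image, (2) recover the camera pose from points whose 3D
    # positions are known (e.g. the corners of a fiducial of known size).
    import cv2
    import numpy as np

    image = cv2.imread("camera.png", cv2.IMREAD_GRAYSCALE)   # placeholder frame

    # Stage 1: interest point detection (ORB features here).
    orb = cv2.ORB_create(nfeatures=500)
    keypoints, descriptors = orb.detectAndCompute(image, None)

    # Stage 2: pose recovery. Assume four detected points have been matched to
    # known 3D scene points (a 10 cm square marker lying on a table).
    object_points = np.float32([[0, 0, 0], [0.1, 0, 0], [0.1, 0.1, 0], [0, 0.1, 0]])
    image_points = np.float32([[320, 240], [400, 242], [398, 320], [318, 318]])
    camera_matrix = np.float32([[800, 0, 320], [0, 800, 240], [0, 0, 1]])

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, np.zeros(5))
    print("camera rotation (Rodrigues):", rvec.ravel(), "translation (m):", tvec.ravel())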

In augmented reality, a distinction is made between two distinct modes of tracking, known as marker and markerless. Markers are visual cues which trigger the display of the virtual information.[119] A piece of paper with some distinct geometries can be used: the camera recognizes the geometries by identifying specific points in the drawing. Markerless tracking, also called instant tracking, does not use markers; instead, the user positions the object in the camera view, preferably in a horizontal plane. It uses sensors in mobile devices to accurately detect the real-world environment, such as the locations of walls and points of intersection.[120]
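
Marker-based tracking can be sketched with OpenCV's ArUco module, which detects printed fiducial markers whose corners then anchor the virtual content. The snippet below uses the classic ArUco interface; newer OpenCV releases (4.7+) expose the same functionality through a detector object, and the image file name is a placeholder.

    # Minimal marker-based tracking sketch using OpenCV's ArUco module: detect a
    # printed marker in the camera image; its corners then act as the visual cue
    # that triggers and anchors the virtual content.
    import cv2

    image = cv2.imread("camera.png")                      # placeholder frame
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(image, dictionary)

    if ids is not None:
        print(f"found markers {ids.ravel()}; corner sets: {len(corners)}")
        # A renderer would now estimate each marker's pose and draw content on it.
    else:
        print("no markers found; markerless (instant) tracking would be needed")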

Augmented Reality Markup Language (ARML) is a data standard developed within the Open Geospatial Consortium (OGC),[121] which consists of Extensible Markup Language (XML) grammar to describe the location and appearance of virtual objects in the scene, as well as ECMAScript bindings to allow dynamic access to properties of virtual objects.
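
To illustrate the idea of declaring a virtual object's appearance and geographic anchor in markup, the following Python snippet parses a simplified, ARML-like fragment with the standard library; the element names and values are schematic and are not guaranteed to match the exact OGC ARML 2.0 schema.

    # Illustration only: a simplified, ARML-like XML fragment (element names are
    # schematic, not the exact OGC ARML 2.0 schema) parsed with the standard
    # library, showing the idea of declaring a virtual object's appearance and
    # geographic anchor in markup.
    import xml.etree.ElementTree as ET

    doc = """
    <arml>
      <Feature id="restaurant">
        <name>Example Cafe</name>
        <Anchor>
          <Geometry lat="48.2082" lon="16.3738"/>
        </Anchor>
        <Model href="https://example.com/models/cafe-sign.gltf"/>
      </Feature>
    </arml>
    """

    root = ET.fromstring(doc)
    for feature in root.findall("Feature"):
        geom = feature.find("./Anchor/Geometry")
        print(feature.get("id"), feature.findtext("name"),
              geom.get("lat"), geom.get("lon"),
              feature.find("Model").get("href"))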

To enable rapid development of augmented reality applications, software development applications have emerged, including Lens Studio from Snapchat and Spark AR from Facebook. Augmented reality Software Development Kits (SDKs) have been launched by Apple and Google.[122][123]

Development


AR systems rely heavily on the immersion of the user. The following lists some considerations for designing augmented reality applications:

Environmental/context design


Context design focuses on the end-user's physical surroundings, spatial context, and accessibility, which may play a role when using the AR system. Designers should be aware of the possible physical scenarios the end-user may be in, such as:

  • Public, in which the users use their whole body to interact with the software
  • Personal, in which the user uses a smartphone in a public space
  • Intimate, in which the user is sitting with a desktop and is not really moving
  • Private, in which the user has on a wearable.[124]

By evaluating each physical scenario, potential safety hazards can be avoided and changes can be made to further improve the end-user's immersion. UX designers will have to define user journeys for the relevant physical scenarios and define how the interface reacts to each.

Another aspect of context design involves the design of the system's functionality and its ability to accommodate user preferences.[125][126] While accessibility tools are common in basic application design, some consideration should be given to designing time-limited prompts (to prevent unintentional operations), audio cues, and overall engagement time. In some situations, the application's functionality may hinder the user's ability. For example, an application that is used for driving should reduce the amount of user interaction and use audio cues instead.

Interaction design


Interaction design in augmented reality technology centers on the user's engagement with the end product to improve the overall user experience and enjoyment. The purpose of interaction design is to avoid alienating or confusing the user by organizing the information presented. Since user interaction relies on the user's input, designers must make system controls easy to understand and accessible. A common technique to improve the usability of augmented reality applications is to identify the frequently accessed areas of the device's touch display and design the application to match those areas of control.[127] It is also important to structure the user journey maps and the flow of information presented, which reduces the system's overall cognitive load and greatly improves the learning curve of the application.[128]

In interaction design, it is important for developers to utilize augmented reality technology that complements the system's function or purpose.[129] For instance, the utilization of exciting AR filters and the design of the unique sharing platform in Snapchat enable users to augment their in-app social interactions. In other applications that require users to understand the focus and intent, designers can employ a reticle or raycast from the device.[125]
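
The raycast idea can be sketched numerically: a tapped screen point is back-projected into a ray in the camera frame and intersected with a horizontal plane to find the 3D point the user is indicating. The camera intrinsics and plane height below are illustrative assumptions, not values from any particular device.

    # Small sketch of the raycast idea: turn a tapped screen point into a ray in
    # camera space and intersect it with a horizontal ground plane to find the
    # 3D point the user is indicating. Intrinsics and plane height are made up.
    import numpy as np

    fx = fy = 800.0
    cx, cy = 320.0, 240.0          # principal point of a 640x480 view
    tap = (320.0, 400.0)           # tapped pixel, below the image centre

    # Back-project the pixel into a direction in the camera frame (z forward, y down).
    direction = np.array([(tap[0] - cx) / fx, (tap[1] - cy) / fy, 1.0])
    direction /= np.linalg.norm(direction)

    # Camera 1.4 m above a ground plane at y = +1.4 in this camera-centred frame.
    origin = np.zeros(3)
    plane_y = 1.4
    t = (plane_y - origin[1]) / direction[1]   # ray parameter where the ray hits the plane
    hit = origin + t * direction
    print("user is pointing at ground point (m):", hit.round(2))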

Visual design


To improve the graphic interface elements and user interaction, developers may use visual cues to inform the user what elements of UI are designed to interact with and how to interact with them. Visual cue design can make interactions seem more natural.[124]

In some augmented reality applications that use a 2D device as an interactive surface, the 2D control environment does not translate well in 3D space, which can make users hesitant to explore their surroundings. To solve this issue, designers should apply visual cues to assist and encourage users to explore their surroundings.

It is important to note the two main objects in AR when developing AR applications: 3D volumetric objects that are manipulated and realistically interact with light and shadow; and animated media imagery such as images and videos, which are mostly traditional 2D media rendered in a new context for augmented reality.[124] When virtual objects are projected onto a real environment, it is challenging for augmented reality application designers to ensure a perfectly seamless integration with the real-world environment, especially with 2D objects. As such, designers can add weight to objects, use depth maps, and choose different material properties that highlight the object's presence in the real world. Another visual design technique is using different lighting or casting shadows to improve overall depth judgment. For instance, a common lighting technique is simply placing a light source overhead at the 12 o'clock position to create shadows on virtual objects.[124]

Uses


Augmented reality has been explored for many uses, including gaming, medicine, and entertainment. It has also been explored for education and business.[130] Example application areas described below include archaeology, architecture, commerce and education. Some of the earliest cited examples range from augmented reality used to support surgery by providing virtual overlays to guide medical practitioners, to AR content for astronomy and welding.[8][131]

Archaeology


AR has been used to aid archaeological research. By augmenting archaeological features onto the modern landscape, AR allows archaeologists to formulate possible site configurations from extant structures.[132] Computer generated models of ruins, buildings, landscapes or even ancient people have been recycled into early archaeological AR applications.[133][134][135] For example, implementing a system like VITA (Visual Interaction Tool for Archaeology) will allow users to imagine and investigate instant excavation results without leaving their home. Each user can collaborate by mutually "navigating, searching, and viewing data". Hrvoje Benko, a researcher in the computer science department at Columbia University, points out that these particular systems and others like them can provide "3D panoramic images and 3D models of the site itself at different excavation stages" all the while organizing much of the data in a collaborative way that is easy to use. Collaborative AR systems supply multimodal interactions that combine the real world with virtual images of both environments.[136]

Architecture


AR can aid in visualizing building projects. Computer-generated images of a structure can be superimposed onto a real-life local view of a property before the physical building is constructed there; this was demonstrated publicly by Trimble Navigation in 2004. AR can also be employed within an architect's workspace, rendering animated 3D visualizations of their 2D drawings. Architecture sight-seeing can be enhanced with AR applications, allowing users viewing a building's exterior to virtually see through its walls, viewing its interior objects and layout.[137][138][50]

With continual improvements to GPS accuracy, businesses are able to use augmented reality to visualize georeferenced models of construction sites, underground structures, cables and pipes using mobile devices.[139] Augmented reality is applied to present new projects, to solve on-site construction challenges, and to enhance promotional materials.[140] Examples include the Daqri Smart Helmet, an Android-powered hard hat used to create augmented reality for the industrial worker, including visual instructions, real-time alerts, and 3D mapping.

Following the Christchurch earthquake, the University of Canterbury released CityViewAR,[141] which enabled city planners and engineers to visualize buildings that had been destroyed.[142] This not only provided planners with tools to reference the previous cityscape, but it also served as a reminder of the magnitude of the resulting devastation, as entire buildings had been demolished.

Education and training


In educational settings, AR has been used to complement a standard curriculum. Text, graphics, video, and audio may be superimposed into a student's real-time environment. Textbooks, flashcards and other educational reading material may contain embedded "markers" or triggers that, when scanned by an AR device, produce supplementary information for the student rendered in a multimedia format.[143][144][145] The 2015 Virtual, Augmented and Mixed Reality: 7th International Conference mentioned Google Glass as an example of augmented reality that can replace the physical classroom.[146] AR technologies help learners engage in authentic exploration of the real world, with virtual objects such as texts, videos, and pictures serving as supplementary elements for learners' investigations of their real-world surroundings.[147]

As AR evolves, students can participate interactively and engage with knowledge more authentically. Instead of remaining passive recipients, students can become active learners, able to interact with their learning environment. Computer-generated simulations of historical events allow students to explore and learn the details of each significant area of the event site.[148]

In higher education, Construct3D, a Studierstube system, allows students to learn mechanical engineering concepts, math or geometry.[149] Chemistry AR apps allow students to visualize and interact with the spatial structure of a molecule using a marker object held in the hand.[150] Others have used HP Reveal, a free app, to create AR notecards for studying organic chemistry mechanisms or to create virtual demonstrations of how to use laboratory instrumentation.[151] Anatomy students can visualize different systems of the human body in three dimensions.[152] Using AR as a tool to learn anatomical structures has been shown to increase learners' knowledge and provide intrinsic benefits, such as increased engagement and learner immersion.[153][154]

AR has been used to develop safety training applications for several types of disasters, such as earthquakes and building fires, as well as for health and safety tasks.[155][156][157] Further, several AR solutions have been proposed and tested to navigate building evacuees towards safe places in both large-scale and small-scale disasters.[158][159] AR applications can also overlap with many other digital technologies, such as BIM, the internet of things and artificial intelligence, to generate smarter safety training and navigation solutions.[160]

Industrial manufacturing


AR is used to substitute paper manuals with digital instructions which are overlaid on the manufacturing operator's field of view, reducing mental effort required to operate.[161] AR makes machine maintenance efficient because it gives operators direct access to a machine's maintenance history.[162] Virtual manuals help manufacturers adapt to rapidly-changing product designs, as digital instructions are more easily edited and distributed compared to physical manuals.[161]

Digital instructions increase operator safety by removing the need for operators to look at a screen or manual away from the working area, which can be hazardous. Instead, the instructions are overlaid on the working area.[163][164] The use of AR can increase operators' feeling of safety when working near high-load industrial machinery by giving operators additional information on a machine's status and safety functions, as well as hazardous areas of the workspace.[163][165]

Commerce

The AR-Icon can be used as a marker on print as well as on online media. It signals the viewer that digital content is behind it. The content can be viewed with a smartphone or tablet.

AR is used to integrate print and video marketing. Printed marketing material can be designed with certain "trigger" images that, when scanned by an AR-enabled device using image recognition, activate a video version of the promotional material. A major difference between augmented reality and straightforward image recognition is that multiple media can be overlaid at the same time in the view screen, such as social media share buttons, in-page video, audio, and 3D objects. Traditional print-only publications are using augmented reality to connect different types of media.[166][167][168][169][170]

AR can enhance product previews such as allowing a customer to view what's inside a product's packaging without opening it.[171] AR can also be used as an aid in selecting products from a catalog or through a kiosk. Scanned images of products can activate views of additional content such as customization options and additional images of the product in its use.[172]

By 2010, virtual dressing rooms had been developed for e-commerce.[173]

In 2012, a mint used AR techniques to market a commemorative coin for Aruba. The coin itself was used as an AR trigger, and when held in front of an AR-enabled device it revealed additional objects and layers of information that were not visible without the device.[174][175]

In 2018, Apple announced Universal Scene Description (USDZ) AR file support for iPhones and iPads with iOS 12. Apple has created an AR QuickLook Gallery that allows anyone to experience augmented reality on their own Apple device.[176]

In 2018, Shopify, the Canadian e-commerce company, announced AR Quick Look integration. Their merchants will be able to upload 3D models of their products and their users will be able to tap on the models inside the Safari browser on their iOS devices to view them in their real-world environments.[177]

In 2018, Twinkl released a free AR classroom application. Pupils can see how York looked over 1,900 years ago.[178] Twinkl launched the first ever multi-player AR game, Little Red[179] and has over 100 free AR educational models.[180]

Augmented reality is becoming more frequently used for online advertising. Retailers offer the ability to upload a picture on their website and "try on" various clothes which are overlaid on the picture. Even further, companies such as Bodymetrics install dressing booths in department stores that offer full-body scanning. These booths render a 3-D model of the user, allowing the consumers to view different outfits on themselves without the need of physically changing clothes.[181] For example, JC Penney and Bloomingdale's use "virtual dressing rooms" that allow customers to see themselves in clothes without trying them on.[182] Another store that uses AR to market clothing to its customers is Neiman Marcus.[183] Neiman Marcus offers consumers the ability to see their outfits in a 360-degree view with their "memory mirror".[183] Makeup stores like L'Oreal, Sephora, Charlotte Tilbury, and Rimmel also have apps that utilize AR.[184] These apps allow consumers to see how the makeup will look on them.[184] According to Greg Jones, director of AR and VR at Google, augmented reality is going to "reconnect physical and digital retail".[184]

AR technology is also used by furniture retailers such as IKEA, Houzz, and Wayfair.[184][182] These retailers offer apps that allow consumers to view their products in their home prior to purchasing anything.[184][185] In 2017, IKEA announced the IKEA Place app. It contains a catalogue of over 2,000 products—nearly the company's full collection of sofas, armchairs, coffee tables, and storage units—which one can place anywhere in a room with their phone.[186] The app made it possible to have 3D and true-to-scale models of furniture in the customer's living space. IKEA realized that their customers are not shopping in stores as often or making direct purchases anymore.[187][188] Shopify's acquisition of Primer, an AR app, aims to push small and medium-sized sellers towards interactive AR shopping with easy-to-use AR integration and user experience for both merchants and consumers.[189] AR helps the retail industry reduce operating costs: merchants upload product information to the AR system, and consumers can use mobile terminals to search and generate 3D maps.[190]

Literature

An example of an AR code containing a QR code

The first description of AR as it is known today was in Virtual Light, the 1993 novel by William Gibson. In 2011, AR was blended with poetry by ni ka from Sekai Camera in Tokyo, Japan. The prose of these AR poems comes from Paul Celan's Die Niemandsrose, expressing the aftermath of the 2011 Tōhoku earthquake and tsunami.[191]

Visual art

10.000 Moving Cities, Marc Lee, Augmented Reality Multiplayer Game, Art Installation[192]

AR applied in the visual arts allows objects or places to trigger artistic multidimensional experiences and interpretations of reality.

The Australian new media artist Jeffrey Shaw pioneered Augmented Reality in three artworks: Viewpoint in 1975, Virtual Sculptures in 1987 and The Golden Calf in 1993.[193][194] He continues to explore new permutations of AR in numerous recent works.

Manifest.AR was an international artists' collective founded in 2010 that specialized in augmented reality (AR) art and interventions. The collective typically created site-specific AR installations that could be viewed through mobile devices using custom-developed applications. Their work often challenged traditional notions of art exhibition and ownership by placing virtual artworks in spaces without institutional permission. The collective gained prominence in 2010 when they staged an unauthorized virtual exhibition at the Museum of Modern Art (MoMA) in New York City, overlaying their digital artworks throughout the museum's spaces, viewable through mobile devices. In 2011, members of Manifest.AR created AR artworks that were virtually placed throughout the Venice Biennial, creating an unofficial parallel exhibition accessible through mobile devices. During the Occupy Wall Street movement in 2011, the collective created AR installations in and around Zuccotti Park, adding a digital dimension to the physical protests. Key members of the collective have included Mark Skwarek, John Craig Freeman, Will Pappenheimer, Tamiko Thiel, and Sander Veenhof. The group published their "AR Art Manifesto" in 2011, which outlined their artistic philosophy and approach to augmented reality as a medium. The manifesto emphasized the democratic potential of AR technology and its ability to challenge traditional institutional control over public space and art display.[195] Manifest.AR has been influential in pioneering artistic applications of AR technology, developing new forms of institutional critique, expanding concepts of public art and digital space, and influencing subsequent generations of new media artists. Their work has been documented and discussed in various publications about digital art and new media, and has influenced contemporary discussions about virtual and augmented reality in artistic practice.[196]

Augmented reality can aid in the progression of visual art in museums by allowing museum visitors to view artwork in galleries in a multidimensional way through their phone screens.[197] The Museum of Modern Art in New York has created an exhibit in its art museum showcasing AR features that viewers can see using an app on their smartphone.[198] The museum has developed its own app, called MoMAR Gallery, that museum guests can download and use in the augmented reality specialized gallery to view the museum's paintings in a different way.[199] This allows individuals to see hidden aspects and information about the paintings, and to have an interactive technological experience with artwork as well.

AR technology was used in Nancy Baker Cahill's "Margin of Error" and "Revolutions,"[200] the two public art pieces she created for the 2019 Desert X exhibition.[201]

AR technology aided the development of eye tracking technology to translate a disabled person's eye movements into drawings on a screen.[202]

A Danish artist, Olafur Eliasson, has placed objects such as burning suns, extraterrestrial rocks, and rare animals into the user's environment.[203] Martin & Muñoz started using augmented reality (AR) technology in 2020 to create and place virtual works, based on their snow globes, in their exhibitions and in users' environments. Their first AR work was presented at the Cervantes Institute in New York in early 2022.[204]

Fitness


AR hardware and software for use in fitness includes smart glasses made for biking and running, with performance analytics and map navigation projected onto the user's field of vision,[205] and boxing, martial arts, and tennis, where users remain aware of their physical environment for safety.[206] Fitness-related games and software include Pokémon Go and Jurassic World Alive.[207]

Human–computer interaction


Human–computer interaction (HCI) is an interdisciplinary area of computing that deals with the design and implementation of systems that interact with people. Researchers in HCI come from a number of disciplines, including computer science, engineering, design, human factors, and social science, with a shared goal of solving problems in the design and use of technology so that it can be used more easily, effectively, efficiently, safely, and with satisfaction.[208]

According to a 2017 Time article, in about 15 to 20 years it is predicted that augmented reality and virtual reality are going to become the primary use for computer interactions.[209]

Remote collaboration


Primary school children learn easily from interactive experiences. As an example, astronomical constellations and the movements of objects in the solar system were oriented in 3D and overlaid in the direction the device was held, and expanded with supplemental video information. Paper-based science book illustrations could seem to come alive as video without requiring the child to navigate to web-based materials.

In 2013, a project was launched on Kickstarter to teach about electronics with an educational toy that allowed children to scan their circuit with an iPad and see the electric current flowing around.[210] While some educational apps were available for AR by 2016, it was not broadly used. Apps that leverage augmented reality to aid learning included SkyView for studying astronomy,[211] AR Circuits for building simple electric circuits,[212] and SketchAR for drawing.[213]

AR would also be a way for parents and teachers to achieve their goals for modern education, which might include providing more individualized and flexible learning, making closer connections between what is taught at school and the real world, and helping students to become more engaged in their own learning.

Emergency management/search and rescue


Augmented reality systems are used in public safety situations, from super storms to suspects at large.

As early as 2009, two articles from Emergency Management discussed AR technology for emergency management. The first was "Augmented Reality—Emerging Technology for Emergency Management" by Gerald Baron.[214] According to Adam Crow: "Technologies like augmented reality (ex: Google Glass) and the growing expectation of the public will continue to force professional emergency managers to radically shift when, where, and how technology is deployed before, during, and after disasters."[215]

Another early example was a search aircraft looking for a lost hiker in rugged mountain terrain. Augmented reality systems provided aerial camera operators with a geographic awareness of forest road names and locations blended with the camera video. The camera operator was better able to search for the hiker knowing the geographic context of the camera image. Once located, the operator could more efficiently direct rescuers to the hiker's location because the geographic position and reference landmarks were clearly labeled.[216]

Social interaction


AR can be used to facilitate social interaction. An augmented reality social network framework called Talk2Me enables people to disseminate information and view others' advertised information in an augmented reality way. The timely and dynamic information sharing and viewing functionalities of Talk2Me help initiate conversations and make friends for users with people in physical proximity.[217] However, use of an AR headset can inhibit the quality of an interaction between two people if one of them is not wearing one, or if the headset becomes a distraction.[218]

Augmented reality also gives users the ability to practice different forms of social interactions with other people in a safe, risk-free environment. Hannes Kauffman, Associate Professor for virtual reality at TU Vienna, says: "In collaborative augmented reality multiple users may access a shared space populated by virtual objects, while remaining grounded in the real world. This technique is particularly powerful for educational purposes when users are collocated and can use natural means of communication (speech, gestures, etc.), but can also be mixed successfully with immersive VR or remote collaboration."[This quote needs a citation] Hannes cites education as a potential use of this technology.

Video games

An AR mobile game using a trigger image as fiducial marker

The gaming industry embraced AR technology. A number of games were developed for prepared indoor environments, such as AR air hockey, Titans of Space, collaborative combat against virtual enemies, and AR-enhanced pool table games.[219][220][221]

In 2010, Ogmento became the first AR gaming startup to receive venture capital funding. The company went on to produce early location-based AR games for titles like Paranormal Activity: Sanctuary, NBA: King of the Court, and Halo: King of the Hill. The company's computer vision technology was eventually repackaged and sold to Apple, becoming a major contribution to ARKit.[222]

Augmented reality allows video game players to experience digital game play in a real-world environment. Niantic released the augmented reality mobile game Pokémon Go.[223] Disney has partnered with Lenovo to create the augmented reality game Star Wars: Jedi Challenges that works with a Lenovo Mirage AR headset, a tracking sensor and a Lightsaber controller, scheduled to launch in December 2017.[224]

Industrial design


AR allows industrial designers to experience a product's design and operation before completion. Volkswagen has used AR for comparing calculated and actual crash test imagery.[225] AR has been used to visualize and modify car body structure and engine layout. It has also been used to compare digital mock-ups with physical mock-ups to find discrepancies between them.[226][227]

Healthcare planning, practice and education


One of the first applications of augmented reality was in healthcare, particularly to support the planning, practice, and training of surgical procedures. As far back as 1992, enhancing human performance during surgery was a formally stated objective when building the first augmented reality systems at U.S. Air Force laboratories.[3] Since 2005, a device called a near-infrared vein finder, which films subcutaneous veins and processes and projects the image of the veins onto the skin, has been used to locate veins.[228][229] AR provides surgeons with patient monitoring data in the style of a fighter pilot's heads-up display, and allows patient imaging records, including functional videos, to be accessed and overlaid. Examples include a virtual X-ray view based on prior tomography or on real-time images from ultrasound and confocal microscopy probes,[230] visualizing the position of a tumor in the video of an endoscope,[231] or radiation exposure risks from X-ray imaging devices.[232][233] AR can enhance viewing a fetus inside a mother's womb.[234] Siemens, Karl Storz and IRCAD have developed a system for laparoscopic liver surgery that uses AR to view sub-surface tumors and vessels.[235] AR has been used for cockroach phobia treatment[236] and to reduce the fear of spiders.[237] Patients wearing augmented reality glasses can be reminded to take medications.[238]

Augmented reality can be very helpful in the medical field.[239] It could be used to provide crucial information to a doctor or surgeon without having them take their eyes off the patient. On 30 April 2015, Microsoft announced the Microsoft HoloLens, its first attempt at augmented reality. The HoloLens has advanced through the years and is capable of projecting holograms for near-infrared fluorescence-based image-guided surgery.[240] As augmented reality advances, it finds increasing applications in healthcare. Augmented reality and similar computer-based utilities are being used to train medical professionals.[241][242] In healthcare, AR can be used to provide guidance during diagnostic and therapeutic interventions, e.g. during surgery. Magee et al.,[243] for instance, describe the use of augmented reality for medical training in simulating ultrasound-guided needle placement. Similarly, Javaid and Haleem found that virtual reality provided medical students' brains with an experience that simulates motion and the surgery experience.[244] A 2016 study by Akçayır, Akçayır, Pektaş, and Ocak revealed that AR technology both improves university students' laboratory skills and helps them to build positive attitudes relating to physics laboratory work.[245] Recently, augmented reality began seeing adoption in neurosurgery, a field that requires heavy amounts of imaging before procedures.[246]

Spatial immersion and interaction


Augmented reality applications, running on handheld devices utilized as virtual reality headsets, can also digitize human presence in space and provide a computer generated model of them, in a virtual space where they can interact and perform various actions. Such capabilities are demonstrated by Project Anywhere, developed by a postgraduate student at ETH Zurich, which was dubbed as an "out-of-body experience".[247][248][249]

Flight training


Building on decades of perceptual-motor research in experimental psychology, researchers at the Aviation Research Laboratory of the University of Illinois at Urbana–Champaign used augmented reality in the form of a flight path in the sky to teach flight students how to land an airplane using a flight simulator. An adaptive augmented schedule in which students were shown the augmentation only when they departed from the flight path proved to be a more effective training intervention than a constant schedule.[30][250] Flight students taught to land in the simulator with the adaptive augmentation learned to land a light aircraft more quickly than students with the same amount of landing training in the simulator but with constant augmentation or without any augmentation.[30]

Military

Augmented reality system for soldier ARC4 (U.S. Army 2017)

An early application of AR occurred when Rockwell International created video map overlays of satellite and orbital debris tracks to aid in space observations at the Air Force Maui Optical System. In their 1993 paper "Debris Correlation Using the Rockwell WorldView System", the authors describe the use of map overlays applied to video from space surveillance telescopes. The map overlays indicated the trajectories of various objects in geographic coordinates. This allowed telescope operators to identify satellites, and also to identify and catalog potentially dangerous space debris.[39]

Starting in 2003 the US Army integrated the SmartCam3D augmented reality system into the Shadow Unmanned Aerial System to aid sensor operators using telescopic cameras to locate people or points of interest. The system combined fixed geographic information including street names, points of interest, airports, and railroads with live video from the camera system. The system offered a "picture in picture" mode that allows it to show a synthetic view of the area surrounding the camera's field of view. This helps solve a problem in which the field of view is so narrow that it excludes important context, as if "looking through a soda straw". The system displays real-time friend/foe/neutral location markers blended with live video, providing the operator with improved situational awareness.

Researchers at USAF Research Lab (Calhoun, Draper et al.) found an approximately two-fold increase in the speed at which UAV sensor operators found points of interest using this technology.[251] This ability to maintain geographic awareness quantitatively enhances mission efficiency. The system is in use on the US Army RQ-7 Shadow and the MQ-1C Gray Eagle Unmanned Aerial Systems.

Circular review system of the company LimpidArmor

In combat, AR can serve as a networked communication system that renders useful battlefield data onto a soldier's goggles in real time. From the soldier's viewpoint, people and various objects can be marked with special indicators to warn of potential dangers. Virtual maps and 360° view camera imaging can also be rendered to aid a soldier's navigation and battlefield perspective, and this can be transmitted to military leaders at a remote command center.[252] The combination of 360° view cameras visualization and AR can be used on board combat vehicles and tanks as circular review system.

AR can be an effective tool for virtually mapping out the 3D topologies of munition storages in the terrain, with the choice of the munitions combination in stacks and distances between them with a visualization of risk areas.[253][unreliable source?] The scope of AR applications also includes visualization of data from embedded munitions monitoring sensors.[253]

Navigation
LandForm video map overlay marking runways, road, and buildings during 1999 helicopter flight test

The NASA X-38 was flown using a hybrid synthetic vision system that overlaid map data on video to provide enhanced navigation for the spacecraft during flight tests from 1998 to 2002. It used the LandForm software, which was useful in times of limited visibility, including an instance when the video camera window frosted over, leaving astronauts to rely on the map overlays.[44] The LandForm software was also test flown at the Army Yuma Proving Ground in 1999. In the photo at right one can see the map markers indicating runways, the air traffic control tower, taxiways, and hangars overlaid on the video.[45]

AR can augment the effectiveness of navigation devices. Information can be displayed on an automobile's windshield indicating destination directions and meter, weather, terrain, road conditions and traffic information as well as alerts to potential hazards in their path.[254][255][256] Since 2012, a Swiss-based company WayRay has been developing holographic AR navigation systems that use holographic optical elements for projecting all route-related information including directions, important notifications, and points of interest right into the drivers' line of sight and far ahead of the vehicle.[257][258] Aboard maritime vessels, AR can allow bridge watch-standers to continuously monitor important information such as a ship's heading and speed while moving throughout the bridge or performing other tasks.[259]

Workplace


Augmented reality may have a positive impact on work collaboration as people may be inclined to interact more actively with their learning environment. It may also encourage tacit knowledge renewal which makes firms more competitive. AR was used to facilitate collaboration among distributed team members via conferences with local and virtual participants. AR tasks included brainstorming and discussion meetings utilizing common visualization via touch screen tables, interactive digital whiteboards, shared design spaces and distributed control rooms.[260][261][262]

In industrial environments, augmented reality is proving to have a substantial impact, with more and more use cases emerging across all aspects of the product lifecycle, from product design and new product introduction (NPI) to manufacturing, service and maintenance, and material handling and distribution. For example, labels were displayed on parts of a system to clarify operating instructions for a mechanic performing maintenance on a system.[263][264] Assembly lines have benefited from the usage of AR. In addition to Boeing, BMW and Volkswagen were known for incorporating this technology into assembly lines for monitoring process improvements.[265][266][267] Big machines are difficult to maintain because of their multiple layers or structures. AR permits people to look through the machine as if with an X-ray, pointing them to the problem right away.[268]

As AR technology has evolved and second and third generation AR devices come to market, the impact of AR in enterprise continues to flourish. In the Harvard Business Review, Magid Abraham and Marco Annunziata discuss how AR devices are now being used to "boost workers' productivity on an array of tasks the first time they're used, even without prior training".[269] They contend that "these technologies increase productivity by making workers more skilled and efficient, and thus have the potential to yield both more economic growth and better jobs".[269]

Broadcast and live events


Weather visualizations were the first application of augmented reality in television. It has now become common in weathercasting to display full-motion video of images captured in real time from multiple cameras and other imaging devices. Coupled with 3D graphics symbols and mapped to a common virtual geospatial model, these animated visualizations constitute the first true application of AR to TV.

AR has become common in sports telecasting. Sports and entertainment venues are provided with see-through and overlay augmentation through tracked camera feeds for enhanced viewing by the audience. Examples include the yellow "first down" line seen in television broadcasts of American football games showing the line the offensive team must cross to receive a first down. AR is also used in association with football and other sporting events to show commercial advertisements overlaid onto the view of the playing area. Sections of rugby fields and cricket pitches also display sponsored images. Swimming telecasts often add a line across the lanes to indicate the position of the current record holder as a race proceeds to allow viewers to compare the current race to the best performance. Other examples include hockey puck tracking and annotations of racing car performance[270] and snooker ball trajectories.[115][271]
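
The underlying technique can be sketched as follows: given a calibrated, tracked broadcast camera, a line defined on the field in world coordinates is projected into image coordinates and drawn over the video frame. All camera parameters and coordinates in this Python/OpenCV sketch are illustrative.

    # Compact sketch of a broadcast-style overlay: project the endpoints of a
    # line defined on the field (world coordinates, metres) into the image of a
    # tracked, calibrated camera and draw it over the frame. Numbers are made up.
    import cv2
    import numpy as np

    frame = cv2.imread("broadcast_frame.png")                  # placeholder video frame

    camera_matrix = np.float32([[1500, 0, 960], [0, 1500, 540], [0, 0, 1]])
    rvec = np.float32([0.2, 0.0, 0.0])                         # camera tilted down slightly
    tvec = np.float32([0.0, 8.0, 30.0])                        # camera above and behind the field

    # A "first down" style line across the field at a fixed distance.
    line_world = np.float32([[-10, 0, 20], [10, 0, 20]])
    points_2d, _ = cv2.projectPoints(line_world, rvec, tvec, camera_matrix, np.zeros(5))

    p1, p2 = points_2d.reshape(-1, 2)
    cv2.line(frame, tuple(int(v) for v in p1), tuple(int(v) for v in p2),
             (0, 255, 255), 4)                                 # yellow overlay line
    cv2.imwrite("frame_with_overlay.png", frame)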

AR has been used to enhance concert and theater performances. For example, artists allow listeners to augment their listening experience by adding their performance to that of other bands/groups of users.[272][273][274]

Tourism and sightseeing


Travelers may use AR to access real-time informational displays regarding a location, its features, and comments or content provided by previous visitors. Advanced AR applications include simulations of historical events, places, and objects rendered into the landscape.[275][276][277]

AR applications linked to geographic locations present location information by audio, announcing features of interest at a particular site as they become visible to the user.[278][279][280]

Translation


AR systems such as Word Lens can interpret the foreign text on signs and menus and, in a user's augmented view, re-display the text in the user's language. Spoken words of a foreign language can be translated and displayed in a user's view as printed subtitles.[281][282][283]
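
A rough sketch of this kind of translation overlay using off-the-shelf components: text regions are recognized with OCR, run through a translation step (here a hypothetical translate() placeholder), and painted back over the original regions. This is not how Word Lens itself is implemented; commercial systems do this in real time with tightly integrated recognition and rendering.

    # Rough sketch of a translation overlay with off-the-shelf parts: OCR the
    # text in a camera frame, translate it (translate() is a placeholder), and
    # paint the result back over the original region.
    import cv2
    import pytesseract

    def translate(text, target="en"):
        """Placeholder for a real translation backend."""
        return f"[{target}] {text}"

    frame = cv2.imread("sign.png")                   # placeholder photo of a sign
    data = pytesseract.image_to_data(frame, output_type=pytesseract.Output.DICT)

    for i, word in enumerate(data["text"]):
        if word.strip():
            x, y, w, h = (data[k][i] for k in ("left", "top", "width", "height"))
            cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 255, 255), -1)  # cover original
            cv2.putText(frame, translate(word), (x, y + h), cv2.FONT_HERSHEY_SIMPLEX,
                        h / 30.0, (0, 0, 0), 1)
    cv2.imwrite("sign_translated.png", frame)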

Music


It has been suggested that augmented reality may be used in new methods of music production, mixing, control and visualization.[284][285][286][287]

In a proof-of-concept project Ian Sterling, an interaction design student at California College of the Arts, and software engineer Swaroop Pal demonstrated a HoloLens app whose primary purpose is to provide a 3D spatial UI for cross-platform devices—the Android Music Player app and Arduino-controlled Fan and Light—and also allow interaction using gaze and gesture control.[288][289][290][291]

Research by members of the CRIStAL at the University of Lille makes use of augmented reality to enrich musical performance. The ControllAR project allows musicians to augment their MIDI control surfaces with the remixed graphical user interfaces of music software.[292] The Rouages project proposes to augment digital musical instruments to reveal their mechanisms to the audience and thus improve the perceived liveness.[293] Reflets is a novel augmented reality display dedicated to musical performances where the audience acts as a 3D display by revealing virtual content on stage, which can also be used for 3D musical interaction and collaboration.[294]

Snapchat


Snapchat users have access to augmented reality in the app through use of camera filters. In September 2017, Snapchat updated its app to include a camera filter that allowed users to render an animated, cartoon version of themselves called "Bitmoji". These animated avatars would be projected in the real world through the camera, and can be photographed or video recorded.[295] In the same month, Snapchat also announced a new feature called "Sky Filters" that will be available on its app. This new feature makes use of augmented reality to alter the look of a picture taken of the sky, much like how users can apply the app's filters to other pictures. Users can choose from sky filters such as starry night, stormy clouds, beautiful sunsets, and rainbow.[296]

Concerns


Reality modifications


In a paper titled "Death by Pokémon GO", researchers at Purdue University's Krannert School of Management claim the game caused "a disproportionate increase in vehicular crashes and associated vehicular damage, personal injuries, and fatalities in the vicinity of locations, called PokéStops, where users can play the game while driving."[297] Using data from one municipality, the paper extrapolated what that might mean nationwide and concluded that "the increase in crashes attributable to the introduction of Pokémon GO is 145,632 with an associated increase in the number of injuries of 29,370 and an associated increase in the number of fatalities of 256 over the period of 6 July 2016, through 30 November 2016." The authors estimated the cost of those crashes and fatalities at between $2 billion and $7.3 billion for the same period. Furthermore, more than one in three surveyed advanced Internet users would like to edit out disturbing elements around them, such as garbage or graffiti.[298] They would even like to modify their surroundings by erasing street signs, billboard ads, and uninteresting shopping windows. AR thus seems to be as much a threat to companies as it is an opportunity. Although this could be a nightmare for the numerous brands that do not manage to capture consumer imaginations, it also creates the risk that the wearers of augmented reality glasses may become unaware of surrounding dangers. Consumers want to use augmented reality glasses to change their surroundings into something that reflects their own personal opinions. Around two in five want to change the way their surroundings look and even how people appear to them.[citation needed]

In addition to the possible privacy issues described below, overload and over-reliance issues are the biggest danger of AR. For the development of new AR-related products, this implies that the user interface should follow certain guidelines so as not to overload the user with information, while also preventing the user from over-relying on the AR system such that important cues from the environment are missed.[18] This is called the virtually-augmented key.[18] Once the key is ignored, people might not desire the real world anymore.

Privacy concerns


The concept of modern augmented reality depends on the ability of the device to record and analyze the environment in real time. Because of this, there are potential legal concerns over privacy. While the First Amendment to the United States Constitution allows for such recording in the name of public interest, the constant recording of an AR device makes it difficult to do so without also recording outside of the public domain. Legal complications would be found in areas where a right to a certain amount of privacy is expected or where copyrighted media are displayed.

In terms of individual privacy, AR eases access to information that one should not readily possess about a given person, for instance through facial recognition technology. If an AR device automatically retrieved and displayed information about the persons a user sees, it could surface anything from their social media profiles to their criminal record and marital status.[299]

The Code of Ethics on Human Augmentation, which was originally introduced by Steve Mann in 2004 and further refined with Ray Kurzweil and Marvin Minsky in 2013, was ultimately ratified at the Virtual Reality Toronto conference on 25 June 2017.[300][301][302][303]

Property law


The interaction of location-bound augmented reality with property law is largely undefined.[304][305] Several models have been analysed for how this interaction may be resolved in a common law context: an extension of real property rights to also cover augmentations on or near the property with a strong notion of trespassing, forbidding augmentations unless allowed by the owner; an 'open range' system, where augmentations are allowed unless forbidden by the owner; and a 'freedom to roam' system, where real property owners have no control over non-disruptive augmentations.[306]

One issue experienced during the Pokémon Go craze was that players disturbed owners of private property while visiting nearby location-bound augmentations, which were sometimes placed on the properties themselves and sometimes merely drew players past or across them. The terms of service of Pokémon Go explicitly disclaim responsibility for players' actions, which may limit (but may not totally extinguish) the liability of its producer, Niantic, in the event of a player trespassing while playing the game: by Niantic's argument, the player is the one committing the trespass, while Niantic has merely engaged in permissible free speech. A theory advanced in lawsuits brought against Niantic is that its placement of game elements in places that will lead to trespass or an exceptionally large flux of visitors can constitute nuisance, despite each individual trespass or visit only being tenuously caused by Niantic.[307][308][309]

Another claim raised against Niantic is that the placement of profitable game elements on land without permission of the land's owners is unjust enrichment.[310] More hypothetically, a property may be augmented with advertising or disagreeable content against its owner's wishes.[311] Under American law, these situations are unlikely to be seen as a violation of real property rights by courts without an expansion of those rights to include augmented reality (similarly to how English common law came to recognise air rights).[310]

An article in the Michigan Telecommunications and Technology Law Review argues that there are three bases for this extension, starting with various understandings of property. The personality theory of property, outlined by Margaret Radin, is claimed to support extending property rights due to the intimate connection between personhood and ownership of property; however, her viewpoint is not universally shared by legal theorists.[312] Under the utilitarian theory of property, the benefits of avoiding the harms to real property owners caused by augmentations and by the tragedy of the commons, together with the reduction in transaction costs from making discovery of ownership easy, were assessed as justifying recognising real property rights as covering location-bound augmentations, though there remains the possibility that a tragedy of the anticommons, arising from having to negotiate with property owners, could slow innovation.[313] Finally, following the 'property as the law of things' identification supported by Thomas Merrill and Henry E. Smith, a location-bound augmentation is naturally identified as a 'thing', and, while the non-rivalrous and ephemeral nature of digital objects presents difficulties for the excludability prong of the definition, the article argues that this is not insurmountable.[314]

Some attempts at legislative regulation have been made in the United States. Milwaukee County, Wisconsin attempted to regulate augmented reality games played in its parks, requiring prior issuance of a permit,[315] but this was criticised on free speech grounds by a federal judge;[316] and Illinois considered mandating a notice and take down procedure for location-bound augmentations.[317]

An article for the Iowa Law Review observed that dealing with many local permitting processes would be arduous for a large-scale service,[318] and, while the proposed Illinois mechanism could be made workable,[319] it was reactive and required property owners to potentially continually deal with new augmented reality services; instead, a national-level geofencing registry, analogous to a do-not-call list, was proposed as the most desirable form of regulation to efficiently balance the interests of both providers of augmented reality services and real property owners.[320] An article in the Vanderbilt Journal of Entertainment and Technology Law, however, analyses a monolithic do-not-locate registry as an insufficiently flexible tool, either permitting unwanted augmentations or foreclosing useful applications of augmented reality.[321] Instead, it argues that an 'open range' model, where augmentations are permitted by default but property owners may restrict them on a case-by-case basis (and with noncompliance treated as a form of trespass), will produce the socially best outcome.[322]
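
In its simplest form, a do-not-locate registry of the kind proposed could be a lookup that an AR service consults before placing a location-bound augmentation. The sketch below is hypothetical: opted-out parcels are modelled as circles around a point, whereas a real registry would use parcel polygons and an authoritative data source.

    # Minimal, hypothetical sketch of a "do-not-locate" geofence check: before
    # placing a location-bound augmentation, an AR service queries a registry of
    # opted-out properties. Parcels are modelled here as circles around a point;
    # a real registry would use parcel polygons.
    import math

    # Hypothetical registry entries: (latitude, longitude, radius in metres)
    opted_out = [
        (41.8781, -87.6298, 50.0),
        (40.7128, -74.0060, 30.0),
    ]

    def metres_between(lat1, lon1, lat2, lon2):
        # Equirectangular approximation; adequate for distances of tens of metres.
        r = 6_371_000.0
        x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return math.hypot(x, y) * r

    def placement_allowed(lat, lon):
        return all(metres_between(lat, lon, plat, plon) > radius
                   for plat, plon, radius in opted_out)

    print(placement_allowed(41.8781, -87.6298))  # False: inside an opted-out parcel
    print(placement_allowed(41.9000, -87.6000))  # True: outside all registry entries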

Notable researchers

  • Ronald Azuma is a scientist and author of works on AR.
  • Jeri Ellsworth headed a research effort for Valve on augmented reality (AR), later taking that research to her own start-up, CastAR. The company, founded in 2013, eventually shuttered. She later founded Tilt Five, another AR start-up based on the same technology, with the purpose of creating a device for digital board games.[323]
  • Steve Mann formulated an earlier concept of mediated reality in the 1970s and 1980s, using cameras, processors, and display systems to modify visual reality to help people see better (dynamic range management), building computerized welding helmets, as well as "augmediated reality" vision systems for use in everyday life. He is also an adviser to Meta.[324]
  • Dieter Schmalstieg and Daniel Wagner developed marker tracking systems for mobile phones and PDAs in 2009.[325]
  • Ivan Sutherland invented the first VR head-mounted display at Harvard University.

In media


The futuristic short film Sight[326] features contact lens-like augmented reality devices.[327][328]

See also


References

  1. ^ Cipresso, Pietro; Giglioli, Irene Alice Chicchi; Raya, iz; Riva, Giuseppe (7 December 2011). "The Past, Present, and Future of Virtual and Augmented Reality Research: A Network and Cluster Analysis of the Literature". Frontiers in Psychology. 9: 2086. doi:10.3389/fpsyg.2018.02086. PMC 6232426. PMID 30459681.
  2. ^ Wu, Hsin-Kai; Lee, Silvia Wen-Yu; Chang, Hsin-Yi; Liang, Jyh-Chong (March 2013). "Current status, opportunities and challenges of augmented reality in education...". Computers & Education. 62: 41–49. doi:10.1016/j.compedu.2012.10.024. S2CID 15218665.
  3. ^ a b c d Rosenberg, Louis B. (1992). "The Use of Virtual Fixtures as Perceptual Overlays to Enhance Operator Performance in Remote Environments". Archived from the original on 10 July 2019.
  4. ^ Milgram, Paul; Takemura, Haruo; Utsumi, Akira; Kishino, Fumio (21 December 1995). "Augmented reality: a class of displays on the reality-virtuality continuum". Telemanipulator and Telepresence Technologies. 2351. SPIE: 282–292. Bibcode:1995SPIE.2351..282M. doi:10.1117/12.197321.
  5. ^ Steuer,"Defining virtual reality: Dimensions Determining Telepresence". Archived from the original on 17 July 2022. Retrieved 27 November 2018., Department of Communication, Stanford University. 15 October 1993.
  6. ^ Introducing Virtual Environments Archived 21 April 2016 at the Wayback Machine National Center for Supercomputing Applications, University of Illinois.
  7. ^ Rosenberg, L.B. (1993). "Virtual fixtures: Perceptual tools for telerobotic manipulation". Proceedings of IEEE virtual reality Annual International Symposium. pp. 76–82. doi:10.1109/VRAIS.1993.380795. ISBN 0-7803-1363-1. S2CID 9856738.
  8. ^ a b Dupzyk, Kevin (6 September 2016). "I Saw the Future Through Microsoft's Hololens". Popular Mechanics.
  9. ^ Arai, Kohei, ed. (2022), "Augmented Reality: Reflections at Thirty Years", Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1, Lecture Notes in Networks and Systems, vol. 358, Cham: Springer International Publishing, pp. 1–11, doi:10.1007/978-3-030-89906-6_1, ISBN 978-3-030-89905-9, S2CID 239881216
  10. ^ Moro, Christian; Birt, James; Stromberga, Zane; Phelps, Charlotte; Clark, Justin; Glasziou, Paul; Scott, Anna Mae (2021). "Virtual and Augmented Reality Enhancements to Medical and Science Student Physiology and Anatomy Test Performance: A Systematic Review and Meta-Analysis". Anatomical Sciences Education. 14 (3): 368–376. doi:10.1002/ase.2049. ISSN 1935-9772. PMID 33378557. S2CID 229929326.
  11. ^ "How to Transform Your Classroom with Augmented Reality - EdSurge News". 2 November 2015.
  12. ^ Crabben, Jan van der (16 October 2018). "Why We Need More Tech in History Education". ancient.eu. Archived from the original on 23 October 2018. Retrieved 23 October 2018.
  13. ^ Dargan, Shaveta; Bansal, Shally; Mittal, Ajay; Kumar, Krishan (2023). "Augmented Reality: A Comprehensive Review". Archives of Computational Methods in Engineering. 30 (2): 1057–1080. doi:10.1007/s11831-022-09831-7. Retrieved 27 February 2024.
  14. ^ Hegde, Naveen (19 March 2023). "What is Augmented Reality". Codegres. Retrieved 19 March 2023.
  15. ^ Chen, Brian (25 August 2009). "If You're Not Seeing Data, You're Not Seeing". Wired. Retrieved 18 June 2019.
  16. ^ Maxwell, Kerry. "Augmented Reality". macmillandictionary.com. Retrieved 18 June 2019.
  17. ^ "Augmented Reality (AR)". augmentedrealityon.com. Archived from the original on 5 April 2012. Retrieved 18 June 2019.
  18. ^ a b c d Azuma, Ronald (August 1997). "A Survey of Augmented Reality" (PDF). Presence: Teleoperators and Virtual Environments. 6 (4). MIT Press: 355–385. doi:10.1162/pres.1997.6.4.355. S2CID 469744. Retrieved 2 June 2021.
  19. ^ "Phenomenal Augmented Reality, IEEE Consumer Electronics, Volume 4, No. 4, October 2015, cover+pp92-97" (PDF).
  20. ^ Time-frequency perspectives, with applications, in Advances in Machine Vision, Strategies and Applications, World Scientific Series in Computer Science: Volume 32, C Archibald and Emil Petriu, Cover + pp 99–128, 1992.
  21. ^ Mann, Steve; Feiner, Steve; Harner, Soren; Ali, Mir Adnan; Janzen, Ryan; Hansen, Jayse; Baldassi, Stefano (15 January 2015). "Wearable Computing, 3D Aug* Reality, Photographic/Videographic Gesture Sensing, and Veillance". Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction - TEI '14. ACM. pp. 497–500. doi:10.1145/2677199.2683590. ISBN 9781450333054. S2CID 12247969.
  22. ^ Carmigniani, Julie; Furht, Borko; Anisetti, Marco; Ceravolo, Paolo; Damiani, Ernesto; Ivkovic, Misa (1 January 2011). "Augmented reality technologies, systems and applications". Multimedia Tools and Applications. 51 (1): 341–377. doi:10.1007/s11042-010-0660-6. ISSN 1573-7721. S2CID 4325516.
  23. ^ Ma, Minhua; C. Jain, Lakhmi; Anderson, Paul (2014). Virtual, Augmented Reality and Serious Games for Healthcare 1. Springer Publishing. p. 120. ISBN 978-3-642-54816-1.
  24. ^ Marvin, Rob (16 August 2016). "Augment Is Bringing the AR Revolution to Business". PC Mag. Retrieved 23 February 2021.
  25. ^ Stamp, Jimmy (30 August 2019). "Retail is getting reimagined with augmented reality". The Architect's Newspaper. Archived from the original on 15 November 2019.
  26. ^ Mahmood, Ajmal (12 April 2019). "The future is virtual - why AR and VR will live in the cloud". TechRadar. Retrieved 12 December 2019.
  27. ^ Aubrey, Dave. "Mural Artists Use Augmented Reality To Highlight Effects Of Climate Change". VRFocus. Retrieved 12 December 2019.
  28. ^ Johnson, Joel. "The Master Key": L. Frank Baum envisions augmented reality glasses in 1901 Mote & Beam 10 September 2012.
  29. ^ Sutherland, Ivan E. (1968). "A head-mounted three dimensional display". Proceedings of the December 9-11, 1968, fall joint computer conference, part I on - AFIPS '68 (Fall, part I). p. 757. doi:10.1145/1476589.1476686. S2CID 4561103.
  30. ^ a b c Lintern, Gavan (1980). "Transfer of landing skill after training with supplementary visual cues". Human Factors. 22 (1): 81–88. doi:10.1177/001872088002200109. PMID 7364448. S2CID 113087380.
  31. ^ Mann, Steve (2 November 2012). "Eye Am a Camera: Surveillance and Sousveillance in the Glassage". Time. Retrieved 14 October 2013.
  32. ^ "Absolute Display Window Mouse/Mice". Archived from the original on 6 November 2019. Retrieved 19 October 2020. (context & abstract only) IBM Technical Disclosure Bulletin 1 March 1987
  33. ^ "Absolute Display Window Mouse/Mice". Archived from the original on 19 October 2020. Retrieved 19 October 2020. (image of anonymous printed article) IBM Technical Disclosure Bulletin 1 March 1987
  34. ^ George, Douglas B.; Morris, L. Robert (1989). "A computer-driven astronomical telescope guidance and control system with superimposed star field and celestial coordinate graphics display". Journal of the Royal Astronomical Society of Canada. 83: 32. Bibcode:1989JRASC..83...32G.
  35. ^ Lee, Kangdon (7 February 2012). "Augmented Reality in Education and Training". TechTrends. 56 (2): 13–21. doi:10.1007/s11528-012-0559-3. S2CID 40826055.
  36. ^ Louis B. Rosenberg. "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments." Technical Report AL-TR-0089, USAF Armstrong Laboratory (AFRL), Wright-Patterson AFB OH, 1992.
  37. ^ Eric R. Fossum (1993), "Active Pixel Sensors: Are CCD's Dinosaurs?" Proc. SPIE Vol. 1900, p. 2–14, Charge-Coupled Devices and Solid State Optical Sensors III, Morley M. Blouke; Ed.
  38. ^ Schmalstieg, Dieter; Hollerer, Tobias (2016). Augmented Reality: Principles and Practice. Addison-Wesley Professional. pp. 209–10. ISBN 978-0-13-315320-0.
  39. ^ a b Abernathy, M., Houchard, J., Puccetti, M., and Lambert, J,"Debris Correlation Using the Rockwell WorldView System", Proceedings of 1993 Space Surveillance Workshop 30 March to 1 April 1993, pages 189-195
  40. ^ Wellner, Pierre; Mackay, Wendy; Gold, Rich (1 July 1993). "Back to the real world". Communications of the ACM. 36 (7): 24–27. doi:10.1145/159544.159555. S2CID 21169183.
  41. ^ Barrilleaux, Jon. Experiences and Observations in Applying Augmented Reality to Live Training.
  42. ^ "US Patent for Projection of images of computer models in three dimensional space Patent (Patent # 5,687,305 issued November 11, 1997) - Justia Patents Search". patents.justia.com. Retrieved 17 October 2021.
  43. ^ a b Ramesh Raskar, Greg Welch, Henry Fuchs Spatially Augmented Reality, First International Workshop on Augmented Reality, Sept 1998.
  44. ^ a b Delgado, F., Abernathy, M., White J., and Lowrey, B. Real-Time 3-D Flight Guidance with Terrain for the X-38, SPIE Enhanced and Synthetic Vision 1999, Orlando Florida, April 1999, Proceedings of the SPIE Vol. 3691, pages 149–156
  45. ^ a b Delgado, F., Altman, S., Abernathy, M., White, J. Virtual Cockpit Window for the X-38, SPIE Enhanced and Synthetic Vision 2000, Orlando Florida, Proceedings of the SPIE Vol. 4023, pages 63–70
  46. ^ "Information Technology". www.nrl.navy.mil.
  47. ^ AviationNow.com Staff, "X-38 Test Features Use of Hybrid Synthetic Vision" AviationNow.com, 11 December 2001
  48. ^ Behringer, R.; Tam, C.; McGee, J.; Sundareswaran, S.; Vassiliou, M. (2000). "A wearable augmented reality testbed for navigation and control, built solely with commercial-off-the-shelf (COTS) hardware". Proceedings IEEE and ACM International Symposium on Augmented Reality (ISAR 2000). pp. 12–19. doi:10.1109/ISAR.2000.880918. ISBN 0-7695-0846-4. S2CID 18892611.
  49. ^ Behringer, R.; Tam, C.; McGee, J.; Sundareswaran, S.; Vassiliou, M. (2000). "Two wearable testbeds for augmented reality: ItWARNS and WIMMIS". Digest of Papers. Fourth International Symposium on Wearable Computers. pp. 189–190. doi:10.1109/ISWC.2000.888495. ISBN 0-7695-0795-6. S2CID 13459308.
  50. ^ a b Outdoor AR. TV One News, 8 March 2004.
  51. ^ 7732694, "United States Patent: 7732694 - Portable music player with synchronized transmissive visual overlays", published 9 Aug 2006, issued 8 June 2010 
  52. ^ Slawski, Bill (4 September 2011). "Google Picks Up Hardware and Media Patents from Outland Research". SEO by the Sea ⚓.
  53. ^ Wikitude AR Travel Guide. YouTube.com. Retrieved 9 June 2012.
  54. ^ Cameron, Chris. Flash-based AR Gets High-Quality Markerless Upgrade, ReadWriteWeb 9 July 2010.
  55. ^ Microsoft Channel, YouTube [1], 23 January 2015.
  56. ^ Bell, Karissa (15 September 2015). "How to get the most out of the new Snapchat update". Mashable.
  57. ^ Bond, Sarah (17 July 2016). "After the Success of Pokémon Go, How Will Augmented Reality Impact Archaeological Sites?". Retrieved 17 July 2016.
  58. ^ Haselton, Todd (8 August 2018). "After almost a decade and billions in outside investment, Magic Leap's first product is finally on sale for $2,295. Here's what it's like". CNBC. Retrieved 2 June 2024.
  59. ^ "Leap Motion's 'Project North Star' could help make cheap AR headsets a reality". Mashable. 9 April 2018. Retrieved 26 March 2024.
  60. ^ "Leap Motion designed a $100 augmented reality headset with super-powerful hand tracking". The Verge. 9 April 2018. Retrieved 26 March 2024.
  61. ^ "Project North Star is Now Open Source". Leap Motion. 6 June 2018. Retrieved 26 March 2024.
  62. ^ "Leap Motion Open-sources Project North Star, An AR Headset Prototype With Impressive Specs". Road to VR. 6 June 2018. Retrieved 26 March 2024.
  63. ^ Official Blog, Microsoft [2], 24 February 2019.
  64. ^ "Magic Leap 2 is the best AR headset yet, but will an enterprise focus save the company?". Engadget. 11 November 2022. Retrieved 26 March 2024.
  65. ^ Vanian, Jonathan (27 September 2024). "Hands-on with Meta's Orion AR glasses prototype and the possible future of computing". CNBC. Retrieved 28 September 2024.
  66. ^ Metz, Rachael (2 August 2012). "Augmented Reality Is Finally Getting Real". technologyreview.com. Retrieved 18 June 2019.
  67. ^ Marino, Emanuele; Bruno, Fabio; Barbieri, Loris; Lagudi, Antonio (2022). "Benchmarking Built-In Tracking Systems for Indoor AR Applications on Popular Mobile Devices". Sensors. 22 (14): 5382. Bibcode:2022Senso..22.5382M. doi:10.3390/s22145382. PMC 9320911. PMID 35891058.
  68. ^ "Fleet Week: Office of Naval Research Technology". eweek.com. 28 May 2012. Retrieved 18 June 2019.
  69. ^ Rolland, Jannick; Baillott, Yohan; Goon, Alexei.A Survey of Tracking Technology for Virtual Environments, Center for Research and Education in Optics and Lasers, University of Central Florida.
  70. ^ Klepper, Sebastian. "Augmented Reality - Display Systems" (PDF). campar.in.tum.de. Archived from the original (PDF) on 28 January 2013. Retrieved 18 June 2019.
  71. ^ Komura, Shinichi (19 July 2024). "Optics of AR/VR using liquid crystals". Molecular Crystals and Liquid Crystals: 1–26. doi:10.1080/15421406.2024.2379694. ISSN 1542-1406.
  72. ^ Rolland, Jannick P.; Biocca, Frank; Hamza-Lup, Felix; Ha, Yanggang; Martins, Ricardo (October 2005). "Development of Head-Mounted Projection Displays for Distributed, Collaborative, Augmented Reality Applications". Presence: Teleoperators and Virtual Environments. 14 (5): 528–549. arXiv:1902.07769. doi:10.1162/105474605774918741. S2CID 5328957.
  73. ^ "Gestigon Gesture Tracking – TechCrunch Disrupt". TechCrunch. Retrieved 11 October 2016.
  74. ^ Matney, Lucas (29 August 2016). "uSens shows off new tracking sensors that aim to deliver richer experiences for mobile VR". TechCrunch. Retrieved 29 August 2016.
  75. ^ "Images Of The Vuzix STAR 1200 Augmented Reality Glasses". TechCrunch. 5 June 2011. Retrieved 26 March 2024.
  76. ^ "Vuzix Blade AR glasses are the next-gen Google Glass we've all been waiting for". 9 January 2018. Retrieved 26 March 2024.
  77. ^ "Hands On: Vuzix's No-Nonsense AR Smart Glasses". Retrieved 26 March 2024.
  78. ^ Grifatini, Kristina. Augmented Reality Goggles, Technology Review 10 November 2010.
  79. ^ Arthur, Charles. UK company's 'augmented reality' glasses could be better than Google's, The Guardian, 10 September 2012.
  80. ^ Gannes, Liz. "Google Unveils Project Glass: Wearable Augmented-Reality Glasses". allthingsd.com. Retrieved 4 April 2012., All Things D.
  81. ^ Benedetti, Winda. Xbox leak reveals Kinect 2, augmented reality glasses NBC News. Retrieved 23 August 2012.
  82. ^ a b "GlassEyes": The Theory of EyeTap Digital Eye Glass, supplemental material for IEEE Technology and Society, Vol. 31, Number 3, 2012, pp. 10–14.
  83. ^ "Intelligent Image Processing", John Wiley and Sons, 2001, ISBN 0-471-40637-6, 384 p.
  84. ^ "Augmented Reality". merriam-webster.com. Archived from the original on 13 September 2015. Retrieved 8 October 2015. an enhanced version of reality created by the use of technology to overlay digital information on an image of something being viewed through a device (such as a smartphone camera) also : the technology used to create augmented reality
  85. ^ "Augmented Reality". oxforddictionaries.com. Archived from the original on 25 November 2013. Retrieved 8 October 2015. A technology that superimposes a computer-generated image on a user's view of the real world, thus providing a composite view.
  86. ^ "What is Augmented Reality (AR): Augmented Reality Defined, iPhone Augmented Reality Apps and Games and More". Digital Trends. 3 November 2009. Retrieved 8 October 2015.
  87. ^ "Full Page Reload". IEEE Spectrum: Technology, Engineering, and Science News. 10 April 2013. Retrieved 6 May 2020.
  88. ^ "Contact lens for the display of information such as text, graphics, or pictures".
  89. ^ Greenemeier, Larry. Computerized Contact Lenses Could Enable In-Eye Augmented Reality. Scientific American, 23 November 2011.
  90. ^ Yoneda, Yuka. Solar Powered Augmented Contact Lenses Cover Your Eye with 100s of LEDs. inhabitat, 17 March 2010.
  91. ^ Rosen, Kenneth (8 December 2012). "Contact Lenses Can Display Your Text Messages". Mashable.com. Retrieved 13 December 2012.
  92. ^ O'Neil, Lauren. "LCD contact lenses could display text messages in your eye". CBC News. Archived from the original on 11 December 2012. Retrieved 12 December 2012.
  93. ^ Anthony, Sebastian. US military developing multi-focus augmented reality contact lenses. ExtremeTech, 13 April 2012.
  94. ^ Bernstein, Joseph. 2012 Invention Awards: Augmented-Reality Contact Lenses Popular Science, 5 June 2012.
  95. ^ Robertson, Adi (10 January 2013). "Innovega combines glasses and contact lenses for an unusual take on augmented reality". The Verge. Retrieved 6 May 2020.
  96. ^ "Samsung Just Patented Smart Contact Lenses With a Built-in Camera". sciencealert.com. 7 April 2016. Retrieved 18 June 2019.
  97. ^ "Full Page Reload". IEEE Spectrum: Technology, Engineering, and Science News. 16 January 2020. Retrieved 6 May 2020.
  98. ^ "Mojo Vision's AR contact lenses are very cool, but many questions remain". TechCrunch. 16 January 2020. Retrieved 6 May 2020.
  99. ^ "Mojo Vision is developing AR contact lenses". TechCrunch. 16 January 2020. Retrieved 6 May 2020.
  100. ^ a b Viirre, E.; Pryor, H.; Nagata, S.; Furness, T. A. (1998). "The virtual retinal display: a new technology for virtual reality and augmented vision in medicine". Studies in Health Technology and Informatics. 50 (Medicine Meets virtual reality): 252–257. doi:10.3233/978-1-60750-894-6-252. ISSN 0926-9630. PMID 10180549.
  101. ^ Tidwell, Michael; Johnson, Richard S.; Melville, David; Furness, Thomas A.The Virtual Retinal Display – A Retinal Scanning Imaging System Archived 13 December 2010 at the Wayback Machine, Human Interface Technology Laboratory, University of Washington.
  102. ^ Marker vs Markerless AR Archived 28 January 2013 at the Wayback Machine, Dartmouth College Library.
  103. ^ Feiner, Steve (3 March 2011). "Augmented reality: a long way off?". AR Week. Pocket-lint. Retrieved 3 March 2011.
  104. ^ Knight, Will. Augmented reality brings maps to life 19 July 2005.
  105. ^ Sung, Dan. Augmented reality in action – maintenance and repair. Pocket-lint, 1 March 2011.
  106. ^ Marshall, Gary. Beyond the mouse: how input is evolving. Touch, voice and gesture recognition and augmented reality. TechRadar / PC Plus, 23 August 2009.
  107. ^ Simonite, Tom. Augmented Reality Meets Gesture Recognition, Technology Review, 15 September 2011.
  108. ^ Chaves, Thiago; Figueiredo, Lucas; Da Gama, Alana; de Araujo, Christiano; Teichrieb, Veronica. Human Body Motion and Gestures Recognition Based on Checkpoints. SVR '12 Proceedings of the 2012 14th Symposium on Virtual and Augmented Reality pp. 271–278.
  109. ^ Barrie, Peter; Komninos, Andreas; Mandrychenko, Oleksii.A Pervasive Gesture-Driven Augmented Reality Prototype using Wireless Sensor Body Area Networks.
  110. ^ Bosnor, Kevin (19 February 2001). "How Augmented Reality Works". howstuffworks.
  111. ^ Meisner, Jeffrey; Donnelly, Walter P.; Roosen, Richard (6 April 1999). "Augmented reality technology".
  112. ^ van Krevelen, D.W.F.; Poelman, R. (2010). A Survey of Augmented Reality Technologies, Applications and Limitations. International Journal of Virtual Reality. pp. 3, 6.
  113. ^ Jung, Timothy; tom Dieck, M. Claudia (4 September 2017). Augmented reality and virtual reality: empowering human, place and business. Cham, Switzerland. ISBN 9783319640273. OCLC 1008871983.
  114. ^ Braud, T. "Future Networking Challenges: The Case of Mobile Augmented Reality" (PDF). cse.ust.hk. Archived from the original (PDF) on 16 May 2018. Retrieved 20 June 2019.
  115. ^ a b Azuma, Ronald; Balliot, Yohan; Behringer, Reinhold; Feiner, Steven; Julier, Simon; MacIntyre, Blair. Recent Advances in Augmented Reality Computers & Graphics, November 2001.
  116. ^ Maida, James; Bowen, Charles; Montpool, Andrew; Pace, John. Dynamic registration correction in augmented-reality systems Archived 18 May 2013 at the Wayback Machine, Space Life Sciences, NASA.
  117. ^ State, Andrei; Hirota, Gentaro; Chen, David T; Garrett, William; Livingston, Mark. Superior Augmented Reality Registration by Integrating Landmark Tracking and Magnetic Tracking, Department of Computer Science, University of North Carolina at Chapel Hill.
  118. ^ Bajura, Michael; Neumann, Ulrich. Dynamic Registration Correction in Augmented-Reality Systems Archived 13 July 2012, University of North Carolina, University of Southern California.
  119. ^ "What are augmented reality markers ?". anymotion.com. Retrieved 18 June 2019.
  120. ^ "Markerless Augmented Reality is here". Marxent | Top Augmented Reality Apps Developer. 9 May 2014. Retrieved 23 January 2018.
  121. ^ "ARML 2.0 SWG". Open Geospatial Consortium website. Open Geospatial Consortium. Archived from the original on 12 November 2013. Retrieved 12 November 2013.
  122. ^ "Top 5 AR SDKs". Augmented Reality News. Archived from the original on 13 December 2013. Retrieved 15 November 2013.
  123. ^ "Top 10 AR SDKs". Augmented World Expo. Archived from the original on 23 November 2013. Retrieved 15 November 2013.
  124. ^ a b c d Wilson, Tyler (30 January 2018). ""The Principles of Good UX for Augmented Reality – UX Collective." UX Collective". Retrieved 19 June 2019.
  125. ^ a b "Best Practices for Mobile AR Design- Google". blog.google. 13 December 2017.
  126. ^ "Human Computer Interaction with Augmented Reality" (PDF). eislab.fim.uni-passau.de. Archived from the original (PDF) on 25 May 2018.
  127. ^ "Basic Patterns of Mobile Navigation". theblog.adobe.com. 9 May 2017. Archived from the original on 13 April 2018. Retrieved 12 April 2018.
  128. ^ "Principles of Mobile App Design: Engage Users and Drive Conversions". thinkwithgoogle.com. Archived from the original on 13 April 2018.
  129. ^ "Inside Out: Interaction Design for Augmented Reality-UXmatters". uxmatters.com.
  130. ^ Moro, Christian; Štromberga, Zane; Raikos, Athanasios; Stirling, Allan (2017). "The effectiveness of virtual and augmented reality in health sciences and medical anatomy". Anatomical Sciences Education. 10 (6): 549–559. doi:10.1002/ase.1696. ISSN 1935-9780. PMID 28419750. S2CID 25961448.
  131. ^ "Don't be blind on wearable cameras insists AR genius". SlashGear. 20 July 2012. Retrieved 21 October 2018.
  132. ^ Stuart Eve (2012). "Augmenting Phenomenology: Using Augmented Reality to Aid Archaeological Phenomenology in the Landscape" (PDF). Journal of Archaeological Method and Theory. 19 (4): 582–600. doi:10.1007/s10816-012-9142-7. S2CID 4988300.
  133. ^ Dähne, Patrick; Karigiannis, John N. (2002). Archeoguide: System Architecture of a Mobile Outdoor Augmented Reality System. ISBN 9780769517810. Retrieved 6 January 2010.
  134. ^ LBI-ArchPro (5 September 2011). "School of Gladiators discovered at Roman Carnuntum, Austria". Retrieved 29 December 2014.
  135. ^ Papagiannakis, George; Schertenleib, Sébastien; O'Kennedy, Brian; Arevalo-Poizat, Marlene; Magnenat-Thalmann, Nadia; Stoddart, Andrew; Thalmann, Daniel (1 February 2005). "Mixing virtual and real scenes in the site of ancient Pompeii". Computer Animation and Virtual Worlds. 16 (1): 11–24. CiteSeerX 10.1.1.64.8781. doi:10.1002/cav.53. ISSN 1546-427X. S2CID 5341917.
  136. ^ Benko, H.; Ishak, E.W.; Feiner, S. (2004). "Collaborative Mixed Reality Visualization of an Archaeological Excavation". Third IEEE and ACM International Symposium on Mixed and Augmented Reality. pp. 132–140. doi:10.1109/ISMAR.2004.23. ISBN 0-7695-2191-6. S2CID 10122485.
  137. ^ Divecha, Devina.Augmented Reality (AR) used in architecture and design Archived 14 February 2013 at the Wayback Machine. designMENA 8 September 2011.
  138. ^ Architectural dreams in augmented reality. University News, University of Western Australia. 5 March 2012.
  139. ^ Churcher, Jason. "Internal accuracy vs external accuracy". Retrieved 7 May 2013.
  140. ^ "Augment for Architecture & Construction". Archived from the original on 8 November 2015. Retrieved 12 October 2015.
  141. ^ "App gives a view of city as it used to be". Stuff. 10 December 2011. Retrieved 20 May 2018.
  142. ^ Lee, Gun (2012). "CityViewAR outdoor AR visualization". Proceedings of the 13th International Conference of the NZ Chapter of the ACM's Special Interest Group on Human-Computer Interaction - CHINZ '12. ACM. p. 97. doi:10.1145/2379256.2379281. hdl:10092/8693. ISBN 978-1-4503-1474-9. S2CID 34199215.
  143. ^ Groundbreaking Augmented Reality-Based Reading Curriculum Launches, PRweb, 23 October 2011.
  144. ^ Stewart-Smith, Hanna. Education with Augmented Reality: AR textbooks released in Japan, ZDnet, 4 April 2012.
  145. ^ Augmented reality in education smarter learning.
  146. ^ Shumaker, Randall; Lackey, Stephanie (20 July 2015). Virtual, Augmented and Mixed Reality: 7th International Conference, VAMR 2015, Held as Part of HCI International 2015, Los Angeles, CA, USA, 2–7 August 2015, Proceedings. Springer. ISBN 9783319210674.
  147. ^ Wu, Hsin-Kai; Lee, Silvia Wen-Yu; Chang, Hsin-Yi; Liang, Jyh-Chong (March 2013). "Current status, opportunities and challenges of augmented reality in education". Computers & Education. 62: 41–49. doi:10.1016/j.compedu.2012.10.024. S2CID 15218665.
  148. ^ Lubrecht, Anna. Augmented Reality for Education Archived 5 September 2012 at the Wayback Machine The Digital Union, The Ohio State University 24 April 2012.
  149. ^ "Augmented reality, an evolution of the application of mobile devices" (PDF). Archived from the original (PDF) on 17 April 2015. Retrieved 19 June 2014.
  150. ^ Maier, Patrick; Tönnis, Marcus; Klinker, Gudron. Augmented Reality for teaching spatial relations Archived 28 January 2013 at the Wayback Machine, Conference of the International Journal of Arts & Sciences (Toronto 2009).
  151. ^ Plunkett, Kyle N. (12 November 2019). "A Simple and Practical Method for Incorporating Augmented Reality into the Classroom and Laboratory". Journal of Chemical Education. 96 (11): 2628–2631. Bibcode:2019JChEd..96.2628P. doi:10.1021/acs.jchemed.9b00607.
  152. ^ "Anatomy 4D". Qualcomm. Archived from the original on 11 March 2016. Retrieved 2 July 2015.
  153. ^ Moro, Christian; Štromberga, Zane; Raikos, Athanasios; Stirling, Allan (November 2017). "The effectiveness of virtual and augmented reality in health sciences and medical anatomy: VR and AR in Health Sciences and Medical Anatomy". Anatomical Sciences Education. 10 (6): 549–559. doi:10.1002/ase.1696. PMID 28419750. S2CID 25961448.
  154. ^ Birt, James; Stromberga, Zane; Cowling, Michael; Moro, Christian (31 January 2018). "Mobile Mixed Reality for Experiential Learning and Simulation in Medical and Health Sciences Education". Information. 9 (2): 31. doi:10.3390/info9020031. ISSN 2078-2489.
  155. ^ Catal, Cagatay; Akbulut, Akhan; Tunali, Berkay; Ulug, Erol; Ozturk, Eren (1 September 2020). "Evaluation of augmented reality technology for the design of an evacuation training game". Virtual Reality. 24 (3): 359–368. doi:10.1007/s10055-019-00410-z. ISSN 1434-9957.
  156. ^ Gong, Peizhen; Lu, Ying; Lovreglio, Ruggiero; Lv, Xiaofeng; Chi, Zexun (1 October 2024). "Applications and effectiveness of augmented reality in safety training: A systematic literature review and meta-analysis". Safety Science. 178: 106624. doi:10.1016/j.ssci.2024.106624. ISSN 0925-7535.
  157. ^ Paes, Daniel; Feng, Zhenan; King, Maddy; Khorrami Shad, Hesam; Sasikumar, Prasanth; Pujoni, Diego; Lovreglio, Ruggiero (June 2024). "Optical see-through augmented reality fire safety training for building occupants". Automation in Construction. 162: 105371. doi:10.1016/j.autcon.2024.105371.
  158. ^ Lovreglio, Ruggiero; Kinateder, Max (October 2020). "Augmented reality for pedestrian evacuation research: Promises and limitations". Safety Science. 128: 104750. doi:10.1016/j.ssci.2020.104750.
  159. ^ Mantoro, Teddy; Alamsyah, Zaenal; Ayu, Media Anugerah (October 2021). "Pathfinding for Disaster Emergency Route Using Sparse A* and Dijkstra Algorithm with Augmented Reality". 2021 IEEE 7th International Conference on Computing, Engineering and Design (ICCED). pp. 1–6. doi:10.1109/ICCED53389.2021.9664869. ISBN 978-1-6654-3996-1.
  160. ^ Lovreglio, R.; Paes, D.; Feng, Z.; Zhao, X. (2024), Huang, Xinyan; Tam, Wai Cheong (eds.), "Digital Technologies for Fire Evacuations", Intelligent Building Fire Safety and Smart Firefighting, Cham: Springer Nature Switzerland, pp. 439–454, doi:10.1007/978-3-031-48161-1_18, ISBN 978-3-031-48160-4, retrieved 15 March 2024
  161. ^ a b Mourtzis, Dimitris; Zogopoulos, Vasilios; Xanthi, Fotini (11 June 2019). "Augmented reality application to support the assembly of highly customized products and to adapt to production re-scheduling". The International Journal of Advanced Manufacturing Technology. 105 (9): 3899–3910. doi:10.1007/s00170-019-03941-6. ISSN 0268-3768. S2CID 189904235.
  162. ^ Boccaccio, A.; Cascella, G. L.; Fiorentino, M.; Gattullo, M.; Manghisi, V. M.; Monno, G.; Uva, A. E. (2019), Cavas-Martínez, Francisco; Eynard, Benoit; Fernández Cañavate, Francisco J.; Fernández-Pacheco, Daniel G. (eds.), "Exploiting Augmented Reality to Display Technical Information on Industry 4.0 P&ID", Advances on Mechanics, Design Engineering and Manufacturing II, Lecture Notes in Mechanical Engineering, Springer International Publishing, pp. 282–291, doi:10.1007/978-3-030-12346-8_28, ISBN 978-3-030-12345-1, S2CID 150159603
  163. ^ a b Mourtzis, Dimitris; Zogopoulos, Vasilios; Katagis, Ioannis; Lagios, Panagiotis (2018). "Augmented Reality based Visualization of CAM Instructions towards Industry 4.0 paradigm: a CNC Bending Machine case study". Procedia CIRP. 70: 368–373. doi:10.1016/j.procir.2018.02.045.
  164. ^ Marino, Emanuele; Barbieri, Loris; Colacino, Biagio; Fleri, Anna Kum; Bruno, Fabio (2021). "An Augmented Reality inspection tool to support workers in Industry 4.0 environments". Computers in Industry. 127. doi:10.1016/j.compind.2021.103412. S2CID 232272256.
  165. ^ Michalos, George; Kousi, Niki; Karagiannis, Panagiotis; Gkournelos, Christos; Dimoulas, Konstantinos; Koukas, Spyridon; Mparis, Konstantinos; Papavasileiou, Apostolis; Makris, Sotiris (November 2018). "Seamless human robot collaborative assembly – An automotive case study". Mechatronics. 55: 194–211. doi:10.1016/j.mechatronics.2018.08.006. ISSN 0957-4158. S2CID 115979090.
  166. ^ Katts, Rima. Elizabeth Arden brings new fragrance to life with augmented reality Mobile Marketer, 19 September 2012.
  167. ^ Meyer, David. Telefónica bets on augmented reality with Aurasma tie-in gigaom, 17 September 2012.
  168. ^ Mardle, Pamela.Video becomes reality for Stuprint.com Archived 12 March 2013 at the Wayback Machine. PrintWeek, 3 October 2012.
  169. ^ Giraldo, Karina.Why mobile marketing is important for brands? Archived 2 April 2015 at the Wayback Machine. SolinixAR, Enero 2015.
  170. ^ "Augmented reality could be advertising world's best bet". The Financial Express. 18 April 2015. Archived from the original on 21 May 2015.
  171. ^ Humphries, Mathew.[3] Archived 26 June 2012 at the Wayback Machine. Geek.com, 19 September 2011.
  172. ^ Netburn, Deborah.Ikea introduces augmented reality app for 2013 catalog Archived 2 December 2012 at the Wayback Machine. Los Angeles Times, 23 July 2012.
  173. ^ van Krevelen, D.W.F.; Poelman, R. (November 2015). "A Survey of Augmented Reality Technologies, Applications and Limitations". International Journal of Virtual Reality. 9 (2): 1–20. doi:10.20870/IJVR.2010.9.2.2767.
  174. ^ Alexander, Michael.Arbua Shoco Owl Silver Coin with Augmented Reality, Coin Update 20 July 2012.
  175. ^ Royal Mint produces revolutionary commemorative coin for Aruba Archived 4 September 2015 at the Wayback Machine, Today 7 August 2012.
  176. ^ "This small iOS 12 feature is the birth of a whole industry". Jonny Evans. 19 September 2018. Retrieved 19 September 2018.
  177. ^ "Shopify is bringing Apple's latest AR tech to their platform". Lucas Matney. 17 September 2018. Retrieved 3 December 2018.
  178. ^ "History re-made: New AR classroom application lets pupils see how York looked over 1,900 years ago". QA Education. 4 September 2018. Retrieved 4 September 2018.
  179. ^ "Sheffield's Twinkl claims AR first with new game". Prolific North. 19 September 2018. Retrieved 19 September 2018.
  180. ^ "Technology from Twinkl brings never seen before objects to the classroom". The Educator UK. 21 September 2018. Retrieved 21 December 2018.
  181. ^ Pavlik, John V., and Shawn McIntosh. "Augmented Reality." Converging Media: a New Introduction to Mass Communication, 5th ed., Oxford University Press, 2017, pp. 184–185.
  182. ^ a b Dacko, Scott G. (November 2017). "Enabling smart retail settings via mobile augmented reality shopping apps" (PDF). Technological Forecasting and Social Change. 124: 243–256. doi:10.1016/j.techfore.2016.09.032.
  183. ^ a b "How Neiman Marcus is turning technology innovation into a 'core value'". Retail Dive. Retrieved 23 September 2018.
  184. ^ a b c d e Arthur, Rachel. "Augmented Reality Is Set To Transform Fashion And Retail". Forbes. Retrieved 23 September 2018.
  185. ^ "Augmented Reality Apps for Interior Visualization". archvisualizations.com. 30 January 2024. Retrieved 9 April 2024.
  186. ^ Pardes, Arielle (20 September 2017). "IKEA's new app flaunts what you'll love most about AR". Wired. Retrieved 20 September 2017.
  187. ^ "IKEA Highlights 2017". Archived from the original on 8 October 2018. Retrieved 8 October 2018.
  188. ^ "Performance". www.inter.ikea.com. Archived from the original on 26 June 2018.
  189. ^ "How Shopify is setting the future of AR shopping and what it means for sellers". 29 June 2021. Retrieved 29 June 2021.
  190. ^ Indriani, Masitoh; Liah Basuki Anggraeni (30 June 2022). "What Augmented Reality Would Face Today? The Legal Challenges to the Protection of Intellectual Property in Virtual Space". Media Iuris. 5 (2): 305–330. doi:10.20473/mi.v5i2.29339. ISSN 2621-5225. S2CID 250464007.
  191. ^ "AR詩 | にかにかブログ! (おぶんがく&包丁&ちぽちぽ革命)". にかにかブログ! (おぶんがく&包丁&ちぽちぽ革命) (in Japanese). Retrieved 20 May 2018.
  192. ^ "10.000 Moving Cities – Same but Different, AR (Augmented Reality) Art Installation, 2018". Marc Lee. Retrieved 24 December 2018.
  193. ^ Duguet, Anne-Marie (2003). Jeffrey Shaw, Future Cinema. The Cinematic Imaginary after Film. ZKM Karlsruhe and MIT Press, Cambridge, Massachusetts. pp. 376–381. ISBN 9780262692861.
  194. ^ Duguet, Anne-Marie; Klotz, Heinrich; Weibel, Peter (1997). Jeffrey Shaw: A User's Manual. From Expanded Cinema to Virtual Reality. ZKM Cantz. pp. 9–20.
  195. ^ Freeman, John Craig. "ManifestAR: An Augmented Reality Manifesto." Leonardo Electronic Almanac, Vol. 19, No. 1, 2013.
  196. ^ Paul, Christiane. "Digital Art" (Third edition). Thames & Hudson, 2015.
  197. ^ tom Dieck, M. Claudia; Jung, Timothy; Han, Dai-In (July 2016). "Mapping requirements for the wearable smart glasses augmented reality museum application". Journal of Hospitality and Tourism Technology. 7 (3): 230–253. doi:10.1108/JHTT-09-2015-0036. ISSN 1757-9880.
  198. ^ Kipper, Greg; Rampolla, Joseph (31 December 2012). Augmented Reality: An Emerging Technologies Guide to AR. Elsevier. ISBN 9781597497343.
  199. ^ "Augmented Reality Is Transforming Museums". WIRED. Retrieved 30 September 2018.
  200. ^ Vankin, Deborah (28 February 2019). "With a free phone app, Nancy Baker Cahill cracks the glass ceiling in male-dominated land art". Los Angeles Times. Retrieved 26 August 2020.
  201. ^ "In the Vast Beauty of the Coachella Valley, Desert X Artists Emphasize the Perils of Climate Change". artnet News. 12 February 2019. Retrieved 10 April 2019.
  202. ^ Webley, Kayla (11 November 2010). "The 50 Best Inventions of 2010 - EyeWriter". Time. Archived from the original on 14 November 2010. Retrieved 26 March 2024.
  203. ^ "Olafur Eliasson creates augmented-reality cabinet of curiosities". 14 May 2020. Retrieved 17 May 2020.
  204. ^ "The Houses are Blind but the Trees Can See". March 2022. Retrieved 7 February 2023.
  205. ^ "Augmented Reality (AR) vs. virtual reality (VR): What's the Difference?". PCMAG. Retrieved 6 November 2020.
  206. ^ Sandee LaMotte (13 December 2017). "The very real health dangers of virtual reality". CNN. Retrieved 6 November 2020.
  207. ^ Thier, Dave. "'Jurassic World Alive' Makes Two Big Improvements Over 'Pokémon GO'". Forbes. Retrieved 6 November 2020.
  208. ^ "Research Human Computer Interaction (HCI), Virtual and Augmented Reality, Wearable Technologies". cs.nycu.edu.tw. Retrieved 28 March 2021.
  209. ^ Bajarin, Tim (31 January 2017). "This Technology Could Replace the Keyboard and Mouse". Time. Retrieved 19 June 2019.
  210. ^ "LightUp - An award-winning toy that teaches kids about circuits and coding". LightUp. Archived from the original on 29 August 2018. Retrieved 29 August 2018.
  211. ^ "Terminal Eleven: SkyView – Explore the Universe". www.terminaleleven.com. Retrieved 15 February 2016.
  212. ^ "AR Circuits – Augmented Reality Electronics Kit". arcircuits.com. Retrieved 15 February 2016.
  213. ^ "SketchAR - start drawing easily using augmented reality". sketchar.tech. Retrieved 20 May 2018.
  214. ^ "Augmented Reality—Emerging Technology for Emergency Management", Emergency Management 24 September 2009.
  215. ^ "What Does the Future Hold for Emergency Management?", Emergency Management Magazine, 8 November 2013
  216. ^ Cooper, Joseph (15 November 2007). Supporting Flight Control for UAV-Assisted Wilderness Search and Rescue Through Human Centered Interface Design (Master's thesis). Brigham Young University.
  217. ^ Shu, Jiayu; Kosta, Sokol; Zheng, Rui; Hui, Pan (2018). "Talk2Me: A Framework for Device-to-Device Augmented Reality Social Network". 2018 IEEE International Conference on Pervasive Computing and Communications (Per Com). pp. 1–10. doi:10.1109/PERCOM.2018.8444578. ISBN 978-1-5386-3224-6. S2CID 44017349.
  218. ^ "Effects of Augmented Reality on Social Interactions". Electronics Diary. 27 May 2019.
  219. ^ Hawkins, Mathew. Augmented Reality Used To Enhance Both Pool And Air Hockey. Game Set Watch, 15 October 2011.
  220. ^ One Week Only – Augmented Reality Project Archived 6 November 2013 at the Wayback Machine Combat-HELO Dev Blog 31 July 2012.
  221. ^ "Best VR, Augmented Reality apps & games on Android". Archived from the original on 15 February 2017. Retrieved 14 February 2017.
  222. ^ "Ogmento First AR Gaming Startup to Win VC Funding". 26 May 2010.
  223. ^ Swatman, Rachel (10 August 2016). "Pokémon Go catches five new world records". Guinness World Records. Retrieved 28 August 2016.
  224. ^ "'Star Wars' augmented reality game that lets you be a Jedi launched". CNBC. 31 August 2017.
  225. ^ Noelle, S. (2002). "Stereo augmentation of simulation results on a projection wall by combining two basic ARVIKA systems". Proceedings. International Symposium on Mixed and Augmented Reality. pp. 271–322. CiteSeerX 10.1.1.121.1268. doi:10.1109/ISMAR.2002.1115108. ISBN 0-7695-1781-1. S2CID 24876142.
  226. ^ Verlinden, Jouke; Horvath, Imre. "Augmented Prototyping as Design Means in Industrial Design Engineering". Delft University of Technology. Archived from the original on 16 June 2013. Retrieved 7 October 2012.
  227. ^ Pang, Y.; Nee, Andrew Y. C.; Youcef-Toumi, Kamal; Ong, S. K.; Yuan, M. L. (January 2005). "Assembly Design and Evaluation in an Augmented Reality Environment". hdl:1721.1/7441.
  228. ^ Miyake RK, et al. (2006). "Vein imaging: a new method of near infrared imaging, where a processed image is projected onto the skin for the enhancement of vein treatment". Dermatol Surg. 32 (8): 1031–8. doi:10.1111/j.1524-4725.2006.32226.x. PMID 16918565. S2CID 8872471.
  229. ^ "Reality_Only_Better". The Economist. 8 December 2007.
  230. ^ Mountney, Peter; Giannarou, Stamatia; Elson, Daniel; Yang, Guang-Zhong (2009). "Optical Biopsy Mapping for Minimally Invasive Cancer Screening". Medical Image Computing and Computer-Assisted Intervention – MICCAI 2009. Lecture Notes in Computer Science. Vol. 5761. pp. 483–490. doi:10.1007/978-3-642-04268-3_60. ISBN 978-3-642-04267-6. PMID 20426023.
  231. ^ Scopis Augmented Reality: Path guidance to craniopharyngioma on YouTube
  232. ^ Loy Rodas, Nicolas; Padoy, Nicolas (2014). "3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose". Medical Image Computing and Computer-Assisted Intervention – MICCAI 2014. Lecture Notes in Computer Science. Vol. 8673. pp. 415–422. doi:10.1007/978-3-319-10404-1_52. ISBN 978-3-319-10403-4. PMID 25333145. S2CID 819543.
  233. ^ 3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose on YouTube
  234. ^ "UNC Ultrasound/Medical Augmented Reality Research". Archived from the original on 12 February 2010. Retrieved 6 January 2010.
  235. ^ Mountney, Peter; Fallert, Johannes; Nicolau, Stephane; Soler, Luc; Mewes, Philip W. (2014). "An Augmented Reality Framework for Soft Tissue Surgery". Medical Image Computing and Computer-Assisted Intervention – MICCAI 2014. Lecture Notes in Computer Science. Vol. 8673. pp. 423–431. doi:10.1007/978-3-319-10404-1_53. ISBN 978-3-319-10403-4. PMID 25333146.
  236. ^ Botella, Cristina; Bretón-López, Juani; Quero, Soledad; Baños, Rosa; García-Palacios, Azucena (September 2010). "Treating Cockroach Phobia With Augmented Reality". Behavior Therapy. 41 (3): 401–413. doi:10.1016/j.beth.2009.07.002. PMID 20569788. S2CID 29889630.
  237. ^ Zimmer, Anja; Wang, Nan; Ibach, Merle K.; Fehlmann, Bernhard; Schicktanz, Nathalie S.; Bentz, Dorothée; Michael, Tanja; Papassotiropoulos, Andreas; de Quervain, Dominique J. F. (1 August 2021). "Effectiveness of a smartphone-based, augmented reality exposure app to reduce fear of spiders in real-life: A randomized controlled trial". Journal of Anxiety Disorders. 82: 102442. doi:10.1016/j.janxdis.2021.102442. ISSN 0887-6185. PMID 34246153. S2CID 235791626.
  238. ^ "Augmented Reality Revolutionizing Medicine". Health Tech Event. 6 June 2014. Archived from the original on 12 October 2014. Retrieved 9 October 2014.
  239. ^ Thomas, Daniel J. (December 2016). "Augmented reality in surgery: The Computer-Aided Medicine revolution". International Journal of Surgery. 36 (Pt A): 25. doi:10.1016/j.ijsu.2016.10.003. ISSN 1743-9159. PMID 27741424.
  240. ^ Cui, Nan; Kharel, Pradosh; Gruev, Viktor (8 February 2017). "Augmented reality with Microsoft Holo Lens holograms for near-infrared fluorescence based image guided surgery". In Pogue, Brian W; Gioux, Sylvain (eds.). Augmented reality with Microsoft HoloLens holograms for near-infrared fluorescence based image guided surgery. Molecular-Guided Surgery: Molecules, Devices, and Applications III. Vol. 10049. International Society for Optics and Photonics. pp. 100490I. doi:10.1117/12.2251625. S2CID 125528534.
  241. ^ Moro, C; Birt, J; Stromberga, Z; Phelps, C; Clark, J; Glasziou, P; Scott, AM (May 2021). "Virtual and Augmented Reality Enhancements to Medical and Science Student Physiology and Anatomy Test Performance: A Systematic Review and Meta-Analysis". Anatomical Sciences Education. 14 (3): 368–376. doi:10.1002/ase.2049. PMID 33378557. S2CID 229929326.
  242. ^ Barsom, E. Z.; Graafland, M.; Schijven, M. P. (1 October 2016). "Systematic review on the effectiveness of augmented reality applications in medical training". Surgical Endoscopy. 30 (10): 4174–4183. doi:10.1007/s00464-016-4800-6. ISSN 0930-2794. PMC 5009168. PMID 26905573.
  243. ^ Magee, D.; Zhu, Y.; Ratnalingam, R.; Gardner, P.; Kessel, D. (1 October 2007). "An augmented reality simulator for ultrasound guided needle placement training" (PDF). Medical & Biological Engineering & Computing. 45 (10): 957–967. doi:10.1007/s11517-007-0231-9. ISSN 1741-0444. PMID 17653784. S2CID 14943048.
  244. ^ Javaid, Mohd; Haleem, Abid (1 June 2020). "Virtual reality applications toward medical field". Clinical Epidemiology and Global Health. 8 (2): 600–605. doi:10.1016/j.cegh.2019.12.010. ISSN 2213-3984.
  245. ^ Akçayır, Murat; Akçayır, Gökçe (February 2017). "Advantages and challenges associated with augmented reality for education: A systematic review of the literature". Educational Research Review. 20: 1–11. doi:10.1016/j.edurev.2016.11.002. S2CID 151764812.
  246. ^ Tagaytayan, Raniel; Kelemen, Arpad; Sik-Lanyi, Cecilia (2018). "Augmented reality in neurosurgery". Archives of Medical Science. 14 (3): 572–578. doi:10.5114/aoms.2016.58690. ISSN 1734-1922. PMC 5949895. PMID 29765445.
  247. ^ Davis, Nicola (7 January 2015). "Project Anywhere: digital route to an out-of-body experience". The Guardian. Retrieved 21 September 2016.
  248. ^ "Project Anywhere: an out-of-body experience of a new kind". Euronews. 25 February 2015. Retrieved 21 September 2016.
  249. ^ Project Anywhere at studioany.com
  250. ^ Lintern, Gavan; Roscoe, Stanley N.; Sivier, Jonathan E. (June 1990). "Display Principles, Control Dynamics, and Environmental Factors in Pilot Training and Transfer". Human Factors. 32 (3): 299–317. doi:10.1177/001872089003200304. S2CID 110528421.
  251. ^ Calhoun, G. L., Draper, M. H., Abernathy, M. F., Delgado, F., and Patzek, M. "Synthetic Vision System for Improving Unmanned Aerial Vehicle Operator Situation Awareness," 2005 Proceedings of SPIE Enhanced and Synthetic Vision, Vol. 5802, pp. 219–230.
  252. ^ Cameron, Chris. Military-Grade Augmented Reality Could Redefine Modern Warfare ReadWriteWeb 11 June 2010.
  253. ^ a b Slyusar, Vadym (19 July 2019). "Augmented reality in the interests of ESMRM and munitions safety".
  254. ^ GM's Enhanced Vision System. Techcrunch.com (17 March 2010). Retrieved 9 June 2012.
  255. ^ Couts, Andrew. New augmented reality system shows 3D GPS navigation through your windshield. Digital Trends, 27 October 2011.
  256. ^ Griggs, Brandon. 'Augmented-reality' windshields and the future of driving. CNN Tech, 13 January 2012.
  257. ^ "WayRay's AR in-car HUD convinced me HUDs can be better". TechCrunch. Retrieved 3 October 2018.
  258. ^ Walz, Eric (22 May 2017). "WayRay Creates Holographic Navigation: Alibaba Invests $18 Million". FutureCar. Retrieved 17 October 2018.
  259. ^ Cheney-Peters, Scott (12 April 2012). "CIMSEC: Google's AR Goggles". Retrieved 20 April 2012.
  260. ^ Stafford, Aaron; Piekarski, Wayne; Thomas, Bruce H. "Hand of God". Archived from the original on 7 December 2009. Retrieved 18 December 2009.
  261. ^ Benford, Steve; Greenhalgh, Chris; Reynard, Gail; Brown, Chris; Koleva, Boriana (1 September 1998). "Understanding and constructing shared spaces with mixed-reality boundaries". ACM Transactions on Computer-Human Interaction. 5 (3): 185–223. doi:10.1145/292834.292836. S2CID 672378.
  262. ^ Office of Tomorrow Media Interaction Lab.
  263. ^ The big idea: Augmented Reality. Ngm.nationalgeographic.com (15 May 2012). Retrieved 9 June 2012.
  264. ^ Henderson, Steve; Feiner, Steven. "Augmented Reality for Maintenance and Repair (ARMAR)". Archived from the original on 6 March 2010. Retrieved 6 January 2010.
  265. ^ Sandgren, Jeffrey. The Augmented Eye of the Beholder Archived 21 June 2013 at the Wayback Machine, BrandTech News 8 January 2011.
  266. ^ Cameron, Chris. Augmented Reality for Marketers and Developers, ReadWriteWeb.
  267. ^ Dillow, Clay BMW Augmented Reality Glasses Help Average Joes Make Repairs, Popular Science September 2009.
  268. ^ King, Rachael. Augmented Reality Goes Mobile, Bloomberg Business Week Technology 3 November 2009.
  269. ^ a b Abraham, Magid; Annunziata, Marco (13 March 2017). "Augmented Reality Is Already Improving Worker Performance". Harvard Business Review. Retrieved 13 January 2019.
  270. ^ Archived at Ghostarchive and the Wayback Machine: Arti AR highlights at SRX -- the first sports augmented reality live from a moving car!, 14 July 2021, retrieved 14 July 2021
  271. ^ Marlow, Chris. Hey, hockey puck! NHL PrePlay adds a second-screen experience to live games, digitalmediawire 27 April 2012.
  272. ^ Pair, J.; Wilson, J.; Chastine, J.; Gandy, M. (2002). "The Duran Duran project: The augmented reality toolkit in live performance". The First IEEE International Workshop Agumented Reality Toolkit. p. 2. doi:10.1109/ART.2002.1107010. ISBN 0-7803-7680-3. S2CID 55820154.
  273. ^ Broughall, Nick. Sydney Band Uses Augmented Reality For Video Clip. Gizmodo, 19 October 2009.
  274. ^ Pendlebury, Ty. Augmented reality in Aussie film clip. c|net 19 October 2009.
  275. ^ Saenz, Aaron Augmented Reality Does Time Travel Tourism SingularityHUB 19 November 2009.
  276. ^ Sung, Dan Augmented reality in action – travel and tourism Pocket-lint 2 March 2011.
  277. ^ Dawson, Jim Augmented Reality Reveals History to Tourists Life Science 16 August 2009.
  278. ^ Bartie, Phil J.; MacKaness, William A. (2006). "Development of a Speech-Based Augmented Reality System to Support Exploration of Cityscape". Transactions in GIS. 10 (1): 63–86. Bibcode:2006TrGIS..10...63B. doi:10.1111/j.1467-9671.2006.00244.x. S2CID 13325561.
  279. ^ Bederson, Benjamin B. Audio Augmented Reality: A Prototype Automated Tour Guide Archived 1 July 2002 at the Wayback Machine Bell Communications Research, ACM Human Computer in Computing Systems Conference, pp. 210–211.
  280. ^ Jain, Puneet and Manweiler, Justin and Roy Choudhury, Romit. OverLay: Practical Mobile Augmented Reality ACM MobiSys, May 2015.
  281. ^ Tsotsis, Alexia. Word Lens Translates Words Inside of Images. Yes Really. TechCrunch (16 December 2010).
  282. ^ N.B. Word Lens: This changes everything. The Economist: Gulliver blog, 18 December 2010.
  283. ^ Borghino, Dario. Augmented reality glasses perform real-time language translation. Gizmag, 29 July 2012.
  284. ^ "Music Production in the Era of Augmented Reality". Medium. 14 October 2016. Retrieved 5 January 2017.
  285. ^ "Augmented Reality music making with Oak on Kickstarter – gearnews.com". gearnews.com. 3 November 2016. Retrieved 5 January 2017.
  286. ^ Clouth, Robert (1 January 2013). "Mobile Augmented Reality as a Control Mode for Real-time Music Systems". Retrieved 5 January 2017.
  287. ^ Farbiz, Farzam; Tang, Ka Yin; Wang, Kejian; Ahmad, Waqas; Manders, Corey; Jyh Herng, Chong; Kee Tan, Yeow (2007). "A multimodal augmented reality DJ music system". 2007 6th International Conference on Information, Communications & Signal Processing. pp. 1–5. doi:10.1109/ICICS.2007.4449564. ISBN 978-1-4244-0982-2. S2CID 17807179.
  288. ^ "HoloLens concept lets you control your smart home via augmented reality". Digital Trends. 26 July 2016. Retrieved 5 January 2017.
  289. ^ "Hololens: Entwickler zeigt räumliches Interface für Elektrogeräte" (in German). MIXED. 22 July 2016. Retrieved 5 January 2017.
  290. ^ "Control Your IoT Smart Devices Using Microsoft HoloLen (video) – Geeky Gadgets". Geeky Gadgets. 27 July 2016. Retrieved 5 January 2017.
  291. ^ "Experimental app brings smart home controls into augmented reality with HoloLens". Windows Central. 22 July 2016. Retrieved 5 January 2017.
  292. ^ Berthaut, Florent; Jones, Alex (2016). "ControllAR: Appropriation of Visual Feedback on Control Surfaces". Proceedings of the 2016 ACM International Conference on Interactive Surfaces and Spaces (PDF). pp. 271–277. doi:10.1145/2992154.2992170. ISBN 9781450342483. S2CID 7180627.
  293. ^ "Rouages: Revealing the Mechanisms of Digital Musical Instruments to the Audience". May 2013. pp. 6 pages.
  294. ^ "Reflets: Combining and Revealing Spaces for Musical Performances". May 2015.
  295. ^ Wagner, Kurt. "Snapchat's New Augmented Reality Feature Brings Your Cartoon Bitmoji into the Real World". Recode, 14 September 2017, www.recode.net/2017/9/14/16305890/snapchat-bitmoji-ar-Facebook.
  296. ^ Miller, Chance. "Snapchat's Latest Augmented Reality Feature Lets You Paint the Sky with New Filters". 9to5Mac, 25 September 2017, 9to5mac.com/2017/09/25/how-to-use-snapchat-sky-filters/.
  297. ^ Faccio, Mara; McConnell, John J. (2017). "Death by Pokémon GO". doi:10.2139/ssrn.3073723. SSRN 3073723.
  298. ^ Peddie, J. (2017). Augmented Reality. Springer.[page needed]
  299. ^ Roesner, Franziska; Kohno, Tadayoshi; Denning, Tamara; Calo, Ryan; Newell, Bryce Clayton (2014). "Augmented reality". Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct Publication - UbiComp '14 Adjunct. pp. 1283–1288. doi:10.1145/2638728.2641709. ISBN 978-1-4503-3047-3. S2CID 15190154.
  300. ^ "The Code of Ethics on Human Augmentation - Augmented Reality : Where We Will All Live -". m.ebrary.net. Retrieved 18 November 2019.
  301. ^ Damiani, Jesse (18 July 2016). "The Future of Tech Just Changed at VRTO--Here's Why That Matters to You". HuffPost. Retrieved 18 November 2019.
  302. ^ "VRTO Spearheads Code of Ethics on Human Augmentation". VRFocus. Archived from the original on 11 August 2020. Retrieved 18 November 2019.
  303. ^ "The Code of Ethics on Human Augmentation". www.eyetap.org. Archived from the original on 28 February 2021. Retrieved 18 November 2019.
  304. ^ McClure 2017, pp. 364–366.
  305. ^ McEvoy, Fiona J. (4 June 2018). "What Are Your Augmented Reality Property Rights?". Slate. Retrieved 31 May 2022.
  306. ^ Mallick 2020, pp. 1068–1072.
  307. ^ McClure 2017, pp. 341–343.
  308. ^ McClure 2017, pp. 347–351.
  309. ^ Conroy 2017, p. 20.
  310. ^ a b McClure 2017, pp. 351–353.
  311. ^ Conroy 2017, pp. 21–22.
  312. ^ Conroy 2017, pp. 24–26.
  313. ^ Conroy 2017, pp. 27–29.
  314. ^ Conroy 2017, pp. 29–34.
  315. ^ McClure 2017, pp. 354–355.
  316. ^ "Judge halts Wisconsin county rule for apps like Pokemon Go". Associated Press. 21 July 2017.
  317. ^ McClure 2017, pp. 356–357.
  318. ^ McClure 2017, p. 355.
  319. ^ McClure 2017, p. 357.
  320. ^ McClure 2017, pp. 357–359.
  321. ^ Mallick 2020, pp. 1079–1080.
  322. ^ Mallick 2020, pp. 1080–1084.
  323. ^ Markoff, John (24 October 2019). "Always Building, From the Garage to Her Company". The New York Times. ISSN 0362-4331. Retrieved 12 December 2019.
  324. ^ Mann, S. (1997). "Wearable computing: a first step toward personal imaging". Computer. 30 (2): 25–32. doi:10.1109/2.566147. S2CID 28001657.
  325. ^ Wagner, Daniel (29 September 2009). First Steps Towards Handheld Augmented Reality. ACM. ISBN 9780769520346. Retrieved 29 September 2009.
  326. ^ Robot Genius (24 July 2012). "Sight". vimeo.com. Retrieved 18 June 2019.
  327. ^ Kosner, Anthony Wing (29 July 2012). "Sight: An 8-Minute Augmented Reality Journey That Makes Google Glass Look Tame". Forbes. Retrieved 3 August 2015.
  328. ^ O'Dell, J. (27 July 2012). "Beautiful short film shows a frightening future filled with Google Glass-like devices". Retrieved 3 August 2015.

Sources

External links

Media related to Augmented reality at Wikimedia Commons