[[File:Suteki Hololens applikation .jpg|thumb|A man using an augmented reality headset to view a life-size virtual model of a building]]
[[File:Navit Reality View next to reality.jpg|thumb|An augmented reality mapping application]]
[[File:Headset computer.webp|thumb|upright|Headset computer concept]]
'''Augmented reality''' ('''AR'''), also known as '''mixed reality''' ('''MR'''), is a technology that overlays real-time [[3D computer graphics|3D-rendered computer graphics]] onto a portion of the real world through a display, such as a handheld device or [[head-mounted display]]. This experience is seamlessly interwoven with the physical world such that it is perceived as an [[immersion (virtual reality)|immersive]] aspect of the real environment.<ref name="B. Rosenberg 1992" /> In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas [[virtual reality]] completely replaces the user's real-world environment with a simulated one.<ref>Steuer, Jonathan. {{Cite web |url=https://filtermaker.fr/en/augmented-reality/ |title=Defining virtual reality: Dimensions Determining Telepresence |access-date=27 November 2018 |archive-url=https://web.archive.org/web/20220717120913/https://filtermaker.fr/en/augmented-reality/ |archive-date=17 July 2022 }} Department of Communication, Stanford University, 15 October 1993.</ref><ref>[http://archive.ncsa.illinois.edu/Cyberia/VETopLevels/VR.Overview.html Introducing Virtual Environments] {{Webarchive|url=https://web.archive.org/web/20160421000159/http://archive.ncsa.illinois.edu/Cyberia/VETopLevels/VR.Overview.html |date=21 April 2016 }} National Center for Supercomputing Applications, University of Illinois.</ref> Augmented reality is typically [[visual]], but can span multiple sensory [[Modality (human–computer interaction)|modalities]], including [[Hearing|auditory]], [[haptic perception|haptic]], and [[Somatosensory system|somatosensory]].<ref>{{cite journal | last1=Cipresso | first1=Pietro | last2=Giglioli | first2=Irene Alice Chicchi | last3=Raya | first3=Mariano Alcañiz | last4=Riva | first4=Giuseppe | title=The Past, Present, and Future of Virtual and Augmented Reality Research: A Network and Cluster Analysis of the Literature | journal=Frontiers in Psychology | volume=9 | date=2018 | pmid=30459681 | doi=10.3389/fpsyg.2018.02086 | article-number=2086| pmc=6232426 | doi-access=free }}</ref>
The primary value of augmented reality is the manner in which components of a digital world blend into a person's perception of the real world, through the integration of immersive sensations, which are perceived as real in the user's environment. The earliest functional AR systems that provided immersive mixed reality experiences for users were invented in the early 1990s, starting with the [[Virtual Fixtures]] system developed at the U.S. Air Force's [[Armstrong Laboratory]] in 1992.<ref name="B. Rosenberg 1992"/><ref>{{cite book |doi=10.1109/VRAIS.1993.380795 |chapter=Virtual fixtures: Perceptual tools for telerobotic manipulation |title=Proceedings of IEEE virtual reality Annual International Symposium |pages=76–82 |year=1993 |last1=Rosenberg |first1=L.B. |s2cid=9856738 |isbn=0-7803-1363-1 }}</ref><ref name="Dupzyk 2016">{{Cite news|url=http://www.popularmechanics.com/technology/a22384/hololens-ar-breakthrough-awards/|title=I Saw the Future Through Microsoft's Hololens|last=Dupzyk|first=Kevin|work=Popular Mechanics|date = 6 September 2016}}</ref> Commercial augmented reality experiences were first introduced in entertainment and gaming businesses.<ref>{{Citation|title=Augmented Reality: Reflections at Thirty Years|url=https://link.springer.com/10.1007/978-3-030-89906-6_1|work=Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1|series=Lecture Notes in Networks and Systems|year=2022|volume=358|pages=1–11|editor-last=Arai|editor-first=Kohei|place=Cham|publisher=Springer International Publishing|language=en|doi=10.1007/978-3-030-89906-6_1|isbn=978-3-030-89905-9|s2cid=239881216|url-access=subscription}}</ref> Subsequently, augmented reality applications have spanned industries such as education, communications, medicine, and entertainment.
Augmented reality can be used to enhance natural environments or situations and offers perceptually enriched experiences. With the help of advanced AR technologies (e.g. adding [[computer vision]], incorporating AR cameras into smartphone applications, and [[object recognition]]), information about the user's surrounding real world becomes [[interactive]] and digitally manipulable.<ref>{{Cite journal |url=https://doi.org/10.1007/s11831-022-09831-7/ |title=Augmented Reality: A Comprehensive Review|last1=Dargan|first1=Shaveta|last2=Bansal|first2=Shally|last3=Mittal|first3=Ajay|last4=Kumar|first4=Krishan|date=2023 |journal=Archives of Computational Methods in Engineering |volume=30 |issue=2 |pages=1057–1080 |doi=10.1007/s11831-022-09831-7 |access-date=27 February 2024|url-access=subscription}}</ref> Information about the environment and its objects is overlaid on the real world. This information can be virtual or real, e.g. seeing other real sensed or measured information such as electromagnetic radio waves overlaid in exact alignment with where they actually are in space.<ref>{{Cite web|url=http://wearcam.org/PhenomenalAugmentedReality.pdf|title=Phenomenal Augmented Reality, IEEE Consumer Electronics, Volume 4, No. 4, October 2015, cover+pp92-97}}</ref><ref>Time-frequency perspectives, with applications, in Advances in Machine Vision, Strategies and Applications, World Scientific Series in Computer Science: Volume 32, C Archibald and Emil Petriu, Cover + pp 99–128, 1992.</ref><ref>{{Cite book|last1=Mann|first1=Steve|last2=Feiner|first2=Steve|last3=Harner|first3=Soren|last4=Ali|first4=Mir Adnan|last5=Janzen|first5=Ryan|last6=Hansen|first6=Jayse|last7=Baldassi|first7=Stefano|s2cid=12247969|date=15 January 2015|publisher=ACM|pages=497–500|doi=10.1145/2677199.2683590|isbn=978-1-4503-3305-4|chapter=Wearable Computing, 3D Aug* Reality, Photographic/Videographic Gesture Sensing, and Veillance|title=Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction - TEI '14}}</ref> Augmented reality also has considerable potential in the gathering and sharing of tacit knowledge. Immersive perceptual information is sometimes combined with supplemental information like scores over a live video feed of a sporting event. This combines the benefits of both augmented reality technology and [[heads up display]] technology (HUD).
Augmented reality [[Application framework|frameworks]] include [[ARKit]] and [[ARCore]]. Commercial augmented reality headsets include the [[Magic Leap]] 1 and [[HoloLens]]. A number of companies have promoted the concept of [[smartglasses]] that have augmented reality capability.
Augmented reality can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects.<ref>{{cite journal |last1=Wu |first1=Hsin-Kai |last2=Lee |first2=Silvia Wen-Yu |last3=Chang |first3=Hsin-Yi |last4=Liang |first4=Jyh-Chong |title=Current status, opportunities and challenges of augmented reality in education |journal=Computers & Education |date=March 2013 |volume=62 |pages=41–49 |doi=10.1016/j.compedu.2012.10.024 |s2cid=15218665 }}</ref> The overlaid sensory information can be constructive (i.e. additive to the natural environment), or destructive (i.e. masking of the natural environment).<ref name="B. Rosenberg 1992">{{cite web |last1=Rosenberg |first1=Louis B. |title=The Use of Virtual Fixtures as Perceptual Overlays to Enhance Operator Performance in Remote Environments |work=DTIC |date=1992 |url=https://apps.dtic.mil/docs/citations/ADA292450 |archive-url=https://web.archive.org/web/20190710211431/https://apps.dtic.mil/docs/citations/ADA292450 |url-status=live |archive-date=10 July 2019 }}</ref> As such, it is one of the key technologies in the [[Reality–virtuality continuum|reality-virtuality continuum]].<ref>{{Cite journal |last1=Milgram |first1=Paul |last2=Takemura |first2=Haruo |last3=Utsumi |first3=Akira |last4=Kishino |first4=Fumio |date=1995-12-21 |title=Augmented reality: a class of displays on the reality-virtuality continuum |url=https://www.spiedigitallibrary.org/conference-proceedings-of-spie/2351/0000/Augmented-reality--a-class-of-displays-on-the-reality/10.1117/12.197321.full |journal=Telemanipulator and Telepresence Technologies |publisher=SPIE |volume=2351 |pages=282–292 |doi=10.1117/12.197321|bibcode=1995SPIE.2351..282M |url-access=subscription }}</ref> Augmented reality refers to experiences that are artificial and that add to the already existing reality.<ref>{{Cite magazine |url=https://www.wired.com/2009/08/augmented-reality/ |title=If You're Not Seeing Data, You're Not Seeing |last=Chen |first=Brian |date=25 August 2009 |magazine=Wired |access-date=18 June 2019}}</ref><ref>{{Cite web |url=http://www.augmentedrealityon.com/ |title=Augmented Reality (AR) |website=augmentedrealityon.com |archive-url=https://web.archive.org/web/20120405071414/http://www.augmentedrealityon.com/ |archive-date=5 April 2012 |access-date=18 June 2019}}</ref><ref name="Azuma_survey">{{cite journal |last=Azuma |first=Ronald |author-link=Ronald Azuma |date=August 1997 |title=A Survey of Augmented Reality |url=http://www.cs.unc.edu/~azuma/ARpresence.pdf |access-date=2 June 2021 |journal=Presence: Teleoperators and Virtual Environments |publisher=MIT Press |volume=6 |issue=4 |pages=355–385 |doi=10.1162/pres.1997.6.4.355|s2cid=469744 }}</ref>
{{toclimit|3}}
Augmented reality (AR) is largely synonymous with mixed reality (MR). There is also overlap in terminology with [[extended reality]] and [[computer-mediated reality]]. However, in the 2020s, the differences between AR and MR began to be emphasized.<ref>Rokhsaritalemi, S., Sadeghi-Niaraki, A., & Choi, S. M. (2020). A review on mixed reality: Current trends, challenges and prospects. ''Applied Sciences'', ''10''(2), 636.</ref><ref>Buhalis, D., & Karatay, N. (2022). Mixed reality (MR) for generation Z in cultural heritage tourism towards metaverse. In ''Information and communication technologies in tourism 2022: Proceedings of the ENTER 2022 eTourism conference, January 11–14, 2022'' (pp. 16-27). Springer International Publishing.</ref>
[[File:Extended reality types.svg|thumb|Types of extended reality]]
Mixed reality (MR) is an advanced technology that extends beyond augmented reality (AR) by seamlessly integrating the physical and virtual worlds.<ref>{{Cite web |date=2024-03-05 |title=Augmented Reality vs Mixed Reality: Decoding the Key Differences |url=https://www.onirix.com/ar-vs-mr/ |access-date=2025-06-28 |language=en-GB}}</ref> In MR, users are not only able to view digital content within their real environment but can also interact with it as if it were a tangible part of the physical world.<ref>{{Cite web |title=Augmented reality vs. virtual reality vs. mixed reality {{!}} TechTarget |url=https://www.techtarget.com/searcherp/feature/AR-vs-VR-vs-MR-Differences-similarities-and-manufacturing-uses |access-date=2025-06-28 |website=Search ERP |language=en}}</ref> This is made possible through devices such as [[Meta Quest 3S]] and [[Apple Vision Pro]], which utilize multiple cameras and sensors to enable real-time interaction between virtual and physical elements.<ref>{{Cite web |title=Meta Quest 3S: New Mixed Reality Headset - Shop Now |url=https://www.meta.com/quest/quest-3s/ |archive-url=https://web.archive.org/web/20250624165740/https://www.meta.com/quest/quest-3s/ |archive-date=24 June 2025 |access-date=2025-06-28 |website=www.meta.com |language=en |url-status=live }}</ref> Mixed reality that incorporates [[Haptic technology|haptics]] has sometimes been referred to as visuo-haptic mixed reality.<ref>{{cite journal |last1=Cosco |first1=F. |last2=Garre |first2=C. |last3=Bruno |first3=F. |last4=Muzzupappa |first4=M. |last5=Otaduy |first5=M. A. |date=January 2013 |title=Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration |journal=IEEE Transactions on Visualization and Computer Graphics |volume=19 |issue=1 |pages=159–172 |doi=10.1109/TVCG.2012.107 |pmid=22508901 |bibcode=2013ITVCG..19..159C }}</ref><ref>{{Cite journal |last1=Aygün |first1=Mehmet Murat |last2=Öğüt |first2=Yusuf Çağrı |last3=Baysal |first3=Hulusi |last4=Taşcıoğlu |first4=Yiğit |date=January 2020 |title=Visuo-Haptic Mixed Reality Simulation Using Unbound Handheld Tools |journal=Applied Sciences |language=en |volume=10 |issue=15 |page=5344 |doi=10.3390/app10155344 |issn=2076-3417 |doi-access=free}}</ref>
In [[virtual reality]] (VR), the user's perception is completely computer-generated, whereas with augmented reality (AR), it is partially generated and partially from the real world.<ref>{{Cite journal|last1=Carmigniani|first1=Julie|last2=Furht|first2=Borko|last3=Anisetti|first3=Marco|last4=Ceravolo|first4=Paolo|last5=Damiani|first5=Ernesto|last6=Ivkovic|first6=Misa|s2cid=4325516|date=1 January 2011|title=Augmented reality technologies, systems and applications|journal=Multimedia Tools and Applications|language=en|volume=51|issue=1|pages=341–377|doi=10.1007/s11042-010-0660-6|issn=1573-7721}}</ref><ref>{{Cite book|title=Virtual, Augmented Reality and Serious Games for Healthcare 1|last1=Ma|first1=Minhua|last2=C. Jain|first2=Lakhmi|last3=Anderson|first3=Paul|publisher=[[Springer Publishing]]|year=2014|isbn=978-3-642-54816-1|page=120}}</ref> For example, in architecture, VR can be used to create a walk-through simulation of the inside of a new building, while AR can be used to show a building's structures and systems superimposed on a real-life view. Utility applications provide another example: AR applications such as [[Augment (app)|Augment]] enable users to place digital objects in real environments, allowing businesses to use augmented reality devices to preview their products in the real world.<ref>{{Cite web|url=https://www.pcmag.com/news/augment-is-bringing-the-ar-revolution-to-business|title=Augment Is Bringing the AR Revolution to Business|last1=Marvin|first1=Rob|date=16 August 2016|website=PC Mag|language=en|access-date=2021-02-23}}</ref> Similarly, augmented reality can show customers what products might look like in their own surroundings; companies such as [[Mountain Equipment Co-op]] and [[Lowe's]] use it to let customers preview products at home.<ref>{{Cite web|url=https://archpaper.com/2019/08/retail-is-getting-reimagined-with-augmented-reality/|title=Retail is getting reimagined with augmented reality|last=Stamp|first=Jimmy|date=30 August 2019|website=The Architect's Newspaper|url-status=live|archive-url=https://web.archive.org/web/20191115233539/https://archpaper.com/2019/08/retail-is-getting-reimagined-with-augmented-reality/|archive-date=15 November 2019}}</ref>
Augmented reality (AR) differs from [[virtual reality]] (VR) in that in AR the surrounding environment is real, with virtual objects added to it, whereas in VR the surrounding environment is entirely computer-generated. A demonstration of how AR layers objects onto the real world can be seen in augmented reality games: [[WallaMe]], for example, is an augmented reality game that uses geolocation technology to let users hide messages in real environments anywhere in the world.<ref>{{cite web|url=https://www.techradar.com/news/the-future-is-virtual-why-ar-and-vr-will-live-in-the-cloud|title=The future is virtual - why AR and VR will live in the cloud|last=Mahmood |first=Ajmal|website=TechRadar|date=12 April 2019 |access-date=2019-12-12}}</ref>
In a physics context, the term "interreality system" refers to a virtual reality system coupled with its real-world counterpart.<ref>J. van Kokswijk, [http://www.kokswijk.nl/hum@n.pdf ''Hum@n, Telecoms & Internet as Interface to Interreality''] {{Webarchive|url=https://web.archive.org/web/20070926235344/http://www.kokswijk.nl/hum@n.pdf|date=26 September 2007}} (Bergboek, The Netherlands, 2003).</ref> A 2007 paper describes an interreality system comprising a real physical pendulum coupled to a pendulum that only exists in virtual reality.<ref>{{cite journal |last1=Gintautas |first1=Vadas |last2=Hübler |first2=Alfred W. |date=25 May 2007 |title=Experimental evidence for mixed reality states in an interreality system |journal=Physical Review E |volume=75 |issue=5 |article-number=057201 |arxiv=physics/0611293 |bibcode=2007PhRvE..75e7201G |doi=10.1103/PhysRevE.75.057201 |pmid=17677199}}</ref> This system has two stable states of motion: a "dual reality" state in which the motion of the two pendula is uncorrelated, and a "mixed reality" state in which the pendula exhibit stable, highly correlated phase-locked motion. The terms "mixed reality" and "interreality" are clearly defined in this physics context and may be used somewhat differently in other fields, but they are generally understood as "bridging the physical and virtual world".<ref>Repetto, C. and Riva, G., 2020. From Virtual Reality To Interreality In The Treatment Of Anxiety Disorders. [online] Jneuropsychiatry.org. Available at: https://www.jneuropsychiatry.org/peer-review/from-virtual-reality-to-interreality-in-the-treatment-of-anxiety-disorders-neuropsychiatry.pdf [Accessed 30 October 2020].</ref>
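The correlation measure behind the "dual reality" and "mixed reality" labels can be made concrete with a toy numerical model. The sketch below is only an illustration, not a reproduction of the driven, nonlinear apparatus of the 2007 experiment: it integrates two small-angle pendula joined by a simple bidirectional spring-like coupling, with all parameter values chosen arbitrarily for the example, and reports how correlated their angle histories end up.
<syntaxhighlight lang="python">
import numpy as np

# Toy stand-in for an "interreality"-style pair: two small-angle pendula (one
# playing the role of the real pendulum, one its virtual counterpart) joined
# by a weak bidirectional coupling. All values are illustrative only.
g, L1, L2, k = 9.81, 1.00, 1.05, 0.3      # gravity, pendulum lengths, coupling
dt, steps = 0.001, 200_000

theta = np.array([0.2, -0.1])             # initial angles (rad)
omega = np.array([0.0, 0.0])              # initial angular velocities (rad/s)
history = np.empty((steps, 2))

for i in range(steps):
    # Small-angle restoring force plus a spring-like term pulling the angles together.
    accel = np.array([
        -(g / L1) * theta[0] + k * (theta[1] - theta[0]),
        -(g / L2) * theta[1] + k * (theta[0] - theta[1]),
    ])
    omega = omega + accel * dt            # semi-implicit Euler step
    theta = theta + omega * dt
    history[i] = theta

# A correlation near 1 would correspond to phase-locked, "mixed reality"-like
# motion; a correlation near 0 to the uncorrelated "dual reality"-like state.
corr = np.corrcoef(history[:, 0], history[:, 1])[0, 1]
print(f"angle correlation over the run: {corr:.3f}")
</syntaxhighlight>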
Recent improvements in AR and VR headsets have increased display quality, widened fields of view, and made motion tracking more accurate, making augmented experiences more immersive. Improvements in sensor calibration, lightweight optics, and wireless connectivity have also made headsets more comfortable to wear and move around in.<ref>{{Cite web |last=Kolhe |first=Hemant |date=2025-11-07 |title=Ar And Vr Headsets Market is Estimated to Reach a Valuation of USD 215.2 Billion By 2035 |url=https://medium.com/@hemantkolhe.mrfr/ar-and-vr-headsets-market-is-estimated-to-reach-a-valuation-of-usd-215-2-billion-by-2035-ea4913c3bbc0 |access-date=2025-11-07 |website=Medium |language=en}}</ref>
According to one market analysis, the global market for AR and VR headsets was valued at $10.3 billion in 2024 and is projected to exceed [https://medium.com/@hemantkolhe.mrfr/ar-and-vr-headsets-market-is-estimated-to-reach-a-valuation-of-usd-215-2-billion-by-2035-ea4913c3bbc0 $105 billion] by 2035, a compound annual growth rate of more than 25%. Adoption of these devices in gaming, healthcare, education, and industrial training is growing as hardware costs fall and content ecosystems expand.<ref>{{Cite web |last=Kolhe |first=Hemant |date=2025-11-07 |title=Ar And Vr Headsets Market is Estimated to Reach a Valuation of USD 215.2 Billion By 2035 |url=https://medium.com/@hemantkolhe.mrfr/ar-and-vr-headsets-market-is-estimated-to-reach-a-valuation-of-usd-215-2-billion-by-2035-ea4913c3bbc0 |access-date=2025-11-07 |website=Medium |language=en}}</ref>
==History==
* 1901: Author [[L. Frank Baum]], in his science-fiction novel ''[[The Master Key (Baum novel)|The Master Key]]'', first mentions the idea of an electronic display/spectacles that overlays data onto real life (in this case 'people'). It is named a 'character marker'.<ref>Johnson, Joel. [https://web.archive.org/web/20130522153011/http://moteandbeam.net/the-master-key-l-frank-baum-envisions-ar-glasses-in-1901 "The Master Key": L. Frank Baum envisions augmented reality glasses in 1901] ''Mote & Beam'' 10 September 2012.</ref>
* [[Head-up display|Heads-up displays]] (HUDs), a precursor technology to augmented reality, were first developed for pilots in the 1950s, projecting simple flight data onto a transparent display in their line of sight, thereby enabling them to keep their "heads up" and not look down at the instruments.
* 1968: [[Ivan Sutherland]] creates the first [[head-mounted display]] that has graphics rendered by a computer.<ref>{{cite book |doi=10.1145/1476589.1476686 |chapter=A head-mounted three dimensional display |title=Proceedings of the December 9-11, 1968, fall joint computer conference, part I on - AFIPS '68 (Fall, part I) |page=757 |year=1968 |last1=Sutherland |first1=Ivan E. |s2cid=4561103 }}</ref>
* 1975: [[Myron Krueger]] creates [[Videoplace]] to allow users to interact with virtual objects.
* 1980: The research by Gavan Lintern of the University of Illinois is the first published work to show the value of a [[Head-up display|heads up display]] for teaching real-world flight skills.<ref name="Lintern-1980"/>
* 1980: [[Steve Mann (inventor)|Steve Mann]] creates the first wearable computer, a computer vision system with text and graphical overlays on a photographically mediated scene.<ref>{{cite news|last=Mann |first=Steve |url=https://techland.time.com/2012/11/02/eye-am-a-camera-surveillance-and-sousveillance-in-the-glassage/ |title=Eye Am a Camera: Surveillance and Sousveillance in the Glassage |publisher=[[Time (magazine)|Time]] |date=2 November 2012 |access-date=14 October 2013}}</ref>
* 1986: Within IBM, Ron Feigenblatt describes the most widely experienced form of AR today (viz. "magic window," e.g. [[smartphone]]-based [[Pokémon Go]]), use of a small, "smart" flat panel display positioned and oriented by hand.<ref>{{cite web|url=https://priorart.ip.com/IPCOM/000040923 |title=Absolute Display Window Mouse/Mice |access-date=19 October 2020 |url-status=live |archive-url=https://web.archive.org/web/20191106031325/https://priorart.ip.com/IPCOM/000040923 |archive-date=6 November 2019 }} (context & abstract only) ''[[IBM Technical Disclosure Bulletin]]'' 1 March 1987</ref><ref>{{cite web|url=https://priorart.ip.com/IPCOM/000040923 |title=Absolute Display Window Mouse/Mice |access-date=19 October 2020 |url-status=live |archive-url=https://web.archive.org/web/20201019143932/https://priorart.ip.com/first-page/IPCOM000040923D |archive-date=19 October 2020 }} (image of anonymous printed article) ''[[IBM Technical Disclosure Bulletin]]'' 1 March 1987</ref>
* 1987: Douglas George and Robert Morris create a working prototype of an astronomical telescope-based "[[Head-up display|heads-up display]]" system (a precursor concept to augmented reality) which superimposed in the telescope eyepiece, over the actual sky images, multi-intensity star and celestial body images, and other relevant information.<ref>{{cite journal |title=A computer-driven astronomical telescope guidance and control system with superimposed star field and celestial coordinate graphics display |journal=Journal of the Royal Astronomical Society of Canada |volume=83 |page=32 |bibcode=1989JRASC..83...32G |last1=George |first1=Douglas B. |last2=Morris |first2=L. Robert |year=1989 }}</ref>
* 1990: The term ''augmented reality'' is attributed to Thomas P. Caudell, a former [[Boeing]] researcher.<ref>{{cite journal |last1=Lee |first1=Kangdon |s2cid=40826055 |title=Augmented Reality in Education and Training |journal=TechTrends |date=7 February 2012 |volume=56 |issue=2 |pages=13–21 |doi=10.1007/s11528-012-0559-3 }}</ref>
* 1992: [[Louis B. Rosenberg|Louis Rosenberg]] developed one of the first functioning AR systems, called [[Virtual fixture|Virtual Fixtures]], at the United States Air Force Research Laboratory—Armstrong, which demonstrated benefits to human perception.<ref>Louis B. Rosenberg. "The Use of [[Virtual fixture|Virtual Fixtures]] As Perceptual Overlays to Enhance Operator Performance in Remote Environments." Technical Report AL-TR-0089, USAF Armstrong Laboratory (AFRL), Wright-Patterson AFB OH, 1992.</ref>
* 1995: S. Ravela et al. at the University of Massachusetts introduce a vision-based system using monocular cameras to track objects (engine blocks) across views for augmented reality.<ref>{{Cite journal|url=https://scholarworks.umass.edu/entities/publication/84c55891-d457-47f6-878b-abe58212ab57|title=Tracking Object Motion Across Aspect Changes for Augmented Reality|first=S.|last=Ravela|date=16 March 1996|via=scholarworks.umass.edu}}</ref><ref>{{Cite book|chapter-url=https://ieeexplore.ieee.org/document/525793|chapter=Adaptive tracking and model registration across distinct aspects|first1=S.|last1=Ravela|first2=B.|last2=Draper|first3=J.|last3=Lim|first4=R.|last4=Weiss|title=Proceedings 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human Robot Interaction and Cooperative Robots |date=16 August 1995|volume=1|pages=174–180 vol.1|via=IEEE Xplore|doi=10.1109/IROS.1995.525793|isbn=0-8186-7108-4 |url=https://scholarworks.umass.edu/cs_faculty_pubs/219 }}</ref>
* 1996: General Electric develops a system for projecting information from 3D CAD models onto real-world instances of those models.<ref>{{Cite web|title=US Patent for Projection of images of computer models in three dimensional space Patent (Patent # 5,687,305 issued November 11, 1997) - Justia Patents Search|url=https://patents.justia.com/patent/5687305|access-date=2021-10-17|website=patents.justia.com}}</ref>
* 1998: Spatial augmented reality is introduced at the [[University of North Carolina]] at Chapel Hill by [[Ramesh Raskar]], Greg Welch, and [[Henry Fuchs]].<ref name="raskarSAR">Ramesh Raskar, Greg Welch, Henry Fuchs [https://web.archive.org/web/19981205111134/http://www.cs.unc.edu/~raskar/Office/ Spatially Augmented Reality], First International Workshop on Augmented Reality, Sept 1998.</ref>
* 1999: Frank Delgado, Mike Abernathy et al. report a successful flight test of LandForm software video map overlay from a helicopter at Army Yuma Proving Ground, overlaying video with runways, taxiways, roads and road names.<ref name="DELG99" /><ref name = "DELG00" />
* 1999: The [[United States Naval Research Laboratory|US Naval Research Laboratory]] embarks on a decade-long research program called the Battlefield Augmented Reality System (BARS) to prototype some of the early wearable systems for dismounted soldiers operating in urban environments, for situation awareness and training.<ref>{{Cite web|url=https://www.nrl.navy.mil/itd/imda/research/5581/augmented-reality/|title=Information Technology|website=www.nrl.navy.mil}}</ref>
* 1999: NASA X-38 flown using LandForm software video map overlays at [[Dryden Flight Research Center]].<ref>AviationNow.com Staff, "X-38 Test Features Use of Hybrid Synthetic Vision" AviationNow.com, 11 December 2001</ref>
* 2000: [[Rockwell International]] Science Center demonstrates tetherless wearable augmented reality systems receiving analog video and 3D audio over radio-frequency wireless channels. The systems incorporate outdoor navigation capabilities, with digital horizon silhouettes from a terrain database overlain in real time on the live outdoor scene, allowing visualization of terrain made invisible by clouds and fog.<ref>{{cite book |doi=10.1109/ISAR.2000.880918 |chapter=A wearable augmented reality testbed for navigation and control, built solely with commercial-off-the-shelf (COTS) hardware |title=Proceedings IEEE and ACM International Symposium on Augmented Reality (ISAR 2000) |pages=12–19 |year=2000 |last1=Behringer |first1=R. |last2=Tam |first2=C. |last3=McGee |first3=J. |last4=Sundareswaran |first4=S. |last5=Vassiliou |first5=M. |s2cid=18892611 |isbn=0-7695-0846-4 }}</ref><ref>{{cite book |doi=10.1109/ISWC.2000.888495 |chapter=Two wearable testbeds for augmented reality: ItWARNS and WIMMIS |title=Digest of Papers. Fourth International Symposium on Wearable Computers |pages=189–190 |year=2000 |last1=Behringer |first1=R. |last2=Tam |first2=C. |last3=McGee |first3=J. |last4=Sundareswaran |first4=S. |last5=Vassiliou |first5=M. |s2cid=13459308 |isbn=0-7695-0795-6 }}</ref>
* 2004: An outdoor helmet-mounted AR system was demonstrated by [[Trimble Navigation]] and the Human Interface Technology Laboratory (HIT lab).<ref name="Outdoor AR">[https://www.youtube.com/watch?v=jL3C-OVQKWU Outdoor AR]. ''TV One News'', 8 March 2004.</ref>
* 2006: Outland Research develops an AR media player that overlays virtual content onto a user's view of the real world synchronously with playing music, thereby providing an immersive AR entertainment experience.<ref>{{Cite patent|country=|number=7732694|title=United States Patent: 7732694 - Portable music player with synchronized transmissive visual overlays|status=|pubdate=9 Aug 2006|gdate=8 June 2010|invent1=|inventor1-first=|url=http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-adv.htm&r=1&f=G&l=50&d=PALL&S1=07732694&OS=PN/07732694&RS=PN/07732694}} {{Webarchive|url=https://web.archive.org/web/20190427164734/http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=%2Fnetahtml%2FPTO%2Fsearch-adv.htm&r=1&f=G&l=50&d=PALL&S1=07732694&OS=PN/07732694&RS=PN/07732694 |date=27 April 2019 }}</ref><ref>{{Cite web|last=Slawski|first=Bill|date=2011-09-04|title=Google Picks Up Hardware and Media Patents from Outland Research|url=https://www.seobythesea.com/2011/09/google-picks-up-hardware-and-media-patents-from-outland-research/|website=SEO by the Sea ⚓|language=en-US}}</ref>
* 2008: Wikitude AR Travel Guide launches on 20 Oct 2008 with the [[HTC Dream|G1 Android phone]].<ref>[https://www.youtube.com/watch?v=8EA8xlicmT8 Wikitude AR Travel Guide]. YouTube.com. Retrieved 9 June 2012.</ref>
* 2009: ARToolkit was ported to [[Adobe Flash]] (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.<ref>Cameron, Chris. [http://www.readwriteweb.com/archives/flash-based_ar_gets_high-quality_markerless_upgrade.php Flash-based AR Gets High-Quality Markerless Upgrade], ''ReadWriteWeb'' 9 July 2010.</ref>
* 2019: [[Microsoft]] announced [[HoloLens 2]] with significant improvements in terms of field of view and ergonomics.<ref>Official Blog, Microsoft [https://blogs.microsoft.com/blog/2019/02/24/microsoft-at-mwc-barcelona-introducing-microsoft-hololens-2/], 24 February 2019.</ref>
* 2022: Magic Leap launched the Magic Leap 2 headset.<ref>{{cite web |title=Magic Leap 2 is the best AR headset yet, but will an enterprise focus save the company? |url=https://www.engadget.com/magic-leap-2-ar-headset-tech-dive-143046676.html |website=Engadget |date=11 November 2022 |access-date=26 March 2024}}</ref>
* 2023: [[Meta Quest 3]], a [[mixed reality]] [[Virtual reality headset|VR headset]],<ref>{{Cite web |title=Meta Quest 3: Mixed Reality VR Headset - Shop Now |url=https://www.meta.com/quest/quest-3/ |archive-url=https://web.archive.org/web/20250627103033/https://www.meta.com/quest/quest-3/ |archive-date=27 June 2025 |access-date=2025-06-28 |website=www.meta.com |language=en |url-status=live }}</ref> was developed by [[Reality Labs]], a division of [[Meta Platforms]]. In the same year, [[Apple Vision Pro]] was released.
* 2024: [[Meta Platforms]] revealed the Orion AR glasses prototype.<ref>{{Cite web |last=Vanian |first=Jonathan |date=2024-09-27 |title=Hands-on with Meta's Orion AR glasses prototype and the possible future of computing |url=https://www.cnbc.com/2024/09/27/hands-on-with-metas-orion-augmented-reality-smart-glasses-prototype.html |access-date=2024-09-28 |website=CNBC |language=en}}</ref> | * 2024: [[Meta Platforms]] revealed the Orion AR glasses prototype.<ref>{{Cite web |last=Vanian |first=Jonathan |date=2024-09-27 |title=Hands-on with Meta's Orion AR glasses prototype and the possible future of computing |url=https://www.cnbc.com/2024/09/27/hands-on-with-metas-orion-augmented-reality-smart-glasses-prototype.html |access-date=2024-09-28 |website=CNBC |language=en}}</ref> | ||
* 2025: [[Meta Platforms]] released its Meta Ray-Ban Display glasses, featuring a small AR HUD on the right eye.<ref>{{Cite web |last=Hayden |first=Scott |date=2025-09-17 |title=Meta Unveils Ray-Ban Smart Glasses with Display, Launching for $800 This Month |url=https://www.roadtovr.com/meta-ray-ban-smart-glasses-display-price-release-date-specs/ |website=Road to VR |language=en-US}}</ref>
==Hardware and displays==
AR visuals appear on handheld devices (video passthrough) and head-mounted displays (optical see-through or video passthrough). Systems pair a display with sensors (e.g., cameras and IMUs) to register virtual content to the environment; research also explores near-eye optics, projection-based AR, and experimental concepts such as contact-lens or retinal-scanned displays.<ref>{{cite journal |last1=Itoh |first1=Y. |last2=Langlotz |first2=T. |last3=Sutton |first3=J. |last4=Plopski |first4=A. |year=2021 |title=Towards Indistinguishable Augmented Reality: A Survey on Optical See-through Head-mounted Displays |url=https://dl.acm.org/doi/10.1145/3453157 |journal=ACM Computing Surveys |volume=54 |issue=6 |doi=10.1145/3453157}}</ref><ref name="azuma1">{{cite journal |last=Azuma |first=R. |year=1997 |title=A Survey of Augmented Reality |url=https://www.cs.unc.edu/~azuma/ARpresence.pdf |journal=Presence |volume=6 |issue=4 |pages=355–385 |doi=10.1162/pres.1997.6.4.355}}</ref>[[File:Hud_on_the_cat.jpg|thumb|Photograph of the head-up display of an F/A-18C]]
=== Head-mounted displays === | |||
{{main|Head-mounted display}} | |||
AR HMDs place virtual imagery in the user's view using optical see-through or video passthrough and track head motion for stable registration.<ref name="itoh1">{{cite journal |last=Itoh |first=Y. |year=2021 |title=Towards Indistinguishable Augmented Reality: A Survey on Optical See-through Head-mounted Displays |journal=ACM Computing Surveys |doi=10.1145/3453157}}</ref> | |||
===Handheld=== | |||
Phone and tablet AR uses the rear camera (video passthrough) plus on-device SLAM/VIO for tracking.<ref name="core1">{{cite web |date=2024-10-31 |title=ARCore—Overview |url=https://developers.google.com/ar/develop |website=Google Developers}}</ref><ref>{{cite web |title=ARKit overview |url=https://developer.apple.com/documentation/arkit |website=Apple Developer Documentation}}</ref> | |||
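The pose estimated by SLAM/VIO is what keeps a virtual object pinned to a real-world location as the phone moves. The sketch below is not ARKit or ARCore code; it is a generic illustration using a pinhole-camera model with made-up intrinsics and poses, showing how a world-space anchor is re-projected into pixel coordinates each frame so that it appears fixed in the scene.
<syntaxhighlight lang="python">
import numpy as np

# Pinhole-camera intrinsics (focal lengths and principal point, in pixels).
# Values are illustrative, not taken from any real device or SDK.
K = np.array([[1500.0,    0.0, 960.0],
              [   0.0, 1500.0, 540.0],
              [   0.0,    0.0,   1.0]])

def project_anchor(anchor_world, R_wc, t_wc):
    """Project a world-space anchor into the image of a camera whose pose is
    given by rotation R_wc and position t_wc in world coordinates
    (+z-forward, OpenCV-style camera convention)."""
    p_cam = R_wc.T @ (anchor_world - t_wc)   # world -> camera coordinates
    if p_cam[2] <= 0:
        return None                          # behind the camera: nothing to draw
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]                  # pixel coordinates (u, v)

# A virtual object anchored 2 m in front of the world origin, 1.4 m up.
anchor = np.array([0.0, 1.4, 2.0])

# Hypothetical camera poses reported by the tracker on two successive frames;
# the anchor's pixel position shifts slightly as the device translates.
R = np.eye(3)
for t in (np.array([0.0, 1.5, 0.0]), np.array([0.05, 1.5, -0.1])):
    print(project_anchor(anchor, R, t))
</syntaxhighlight>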
=== Head-up display ===
{{Main|Head-up display}} | |||
HUDs project information into the forward view; AR variants align graphics to the outside scene (e.g., lane guidance, hazards).<ref>{{cite journal |last=Zhou |first=C. |year=2024 |title=Automotive Augmented Reality Head-Up Displays |journal=Micromachines |volume=15 |issue=4 |page=442 |doi=10.3390/mi15040442 |pmid=38675254 |pmc=11052328 |doi-access=free }}</ref> | |||
[[File:CAVE_Crayoland.jpg|thumb|A user standing in the middle of a cave automatic virtual environment]] | |||
=== Cave automatic virtual environment === | |||
{{main|Cave automatic virtual environment}} | |||
Room-scale projection systems surround users with imagery for co-located, multi-user AR/VR.<ref>{{cite book |last=Cruz-Neira |first=C. |title=Proceedings of the 20th annual conference on Computer graphics and interactive techniques |chapter=Surround-screen projection-based virtual reality: The design and implementation of the CAVE |pages=135–142 |year=1993 |chapter-url=https://users.cs.utah.edu/~thompson/vissim-seminar/on-line/CruzNeiraSig93.pdf |doi=10.1145/166117.166134 |isbn=0-89791-601-8 }}</ref> | |||
===Contact lenses===
{{main|Bionic contact lens}} | |||
Prototypes explore embedding display/antenna elements into lenses for glanceable AR; most work remains experimental.<ref>{{cite magazine |last=Parviz |first=B. |date=2009-09-01 |title=Augmented Reality in a Contact Lens |url=https://spectrum.ieee.org/augmented-reality-in-a-contact-lens |magazine=IEEE Spectrum}}</ref><ref>{{cite journal |last=Kazanskiy |first=N. |year=2023 |title=Smart Contact Lenses—A Step towards Non-Invasive Monitoring and Treatment |journal=Pharmaceutics |volume=15 |issue=11 |page=2620 |doi=10.3390/pharmaceutics15112620 |pmid=37887126 |pmc=10605521 |doi-access=free }}</ref> | |||
===Virtual retinal display===
{{main|Virtual retinal display}} | |||
VRD concepts scan imagery directly onto the retina for high-contrast viewing.<ref>{{cite web |title=Virtual Retinal Display (VRD) |url=https://www.hitl.washington.edu/projects/vrd.html |website=HIT Lab, University of Washington}}</ref> | |||
===Projection mapping===
{{main|Projection mapping}} | |||
Projectors overlay graphics onto real objects/environments without head-worn displays (spatial AR).<ref>{{cite report |url=https://www.cs.unc.edu/~welch/media/pdf/IWAR_SAR.pdf |title=Spatially Augmented Reality |last1=Raskar |first1=R. |last2=Welch |first2=G. |year=1998 |institution=UNC Chapel Hill}}</ref> | |||
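In the planar special case, the mapping between the projector's pixels and points on a flat target surface can be modelled as a homography. The sketch below is a generic illustration rather than a production calibration pipeline: the four corner correspondences are made-up values standing in for a calibration step, and OpenCV is used to pre-warp an overlay so that it appears undistorted on the real surface when projected.
<syntaxhighlight lang="python">
import numpy as np
import cv2

# Overlay to be projected: a simple label rendered into an image buffer.
overlay = np.zeros((300, 600, 3), dtype=np.uint8)
cv2.putText(overlay, "SPATIAL AR", (30, 180),
            cv2.FONT_HERSHEY_SIMPLEX, 2.5, (255, 255, 255), 6)

# Corners of the overlay image, and the projector pixels where the corners of
# the real surface were observed during calibration (illustrative values).
src = np.float32([[0, 0], [600, 0], [600, 300], [0, 300]])
dst = np.float32([[412, 287], [998, 305], [980, 622], [398, 590]])

# Homography mapping overlay pixels onto the surface in the projector's frame.
H, _ = cv2.findHomography(src, dst)

# Pre-warp the overlay; projecting `frame` makes the graphics land on the
# flat real-world surface without apparent distortion.
frame = cv2.warpPerspective(overlay, H, (1920, 1080))
cv2.imwrite("projector_frame.png", frame)
</syntaxhighlight>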
[[File:MicrosoftHoloLensBloomGesture.JPG|thumb|alt= Photograph of a man wearing an augmented reality headset| A man wearing an augmented reality headset]]
===AR glasses===
{{main|Smartglasses}}
Glasses-style near-eye displays aim for lighter, hands-free AR; approaches vary in optics, tracking, and power.<ref name="itoh1"/>
== Tracking and registration ==
{{further|Simultaneous localization and mapping|Visual odometry|Fiducial marker|Image registration}}
AR systems estimate the device's pose and the scene's geometry so that virtual graphics stay aligned with the real world. Common approaches include visual–inertial odometry and SLAM for markerless tracking, and fiducial markers when known patterns are available; image registration and depth cues (e.g., occlusion and shadows) help maintain realism.<ref name="azuma1"/><ref>{{cite journal |last=Kazerouni |first=I.A. |year=2022 |title=A survey of state-of-the-art on visual SLAM |url=https://www.sciencedirect.com/science/article/pii/S0957417422010156 |journal=Expert Systems with Applications |volume=205 |article-number=117734 |doi=10.1016/j.eswa.2022.117734}}</ref><ref name="syed1">{{cite journal |last=Syed |first=T.A. |year=2022 |title=In-Depth Review of Augmented Reality: Tracking Technologies, Development Tools, AR Displays, Collaborative AR, and Security Concerns |journal=Applied Sciences |volume=12 |issue=24 |article-number=12722 |doi=10.3390/app122412722 |pmid=36616745 |pmc=9824627 |doi-access=free }}</ref>
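Marker-based tracking can be illustrated with OpenCV's ArUco module: a printed fiducial is detected in a camera frame and the camera-to-marker pose is recovered with a perspective-n-point solve. The intrinsics and image path below are placeholders that would normally come from camera calibration and a live video feed, and the API names follow recent OpenCV releases, so details may differ between versions.
<syntaxhighlight lang="python">
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 320.0],          # placeholder camera intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                          # assume negligible lens distortion
marker_len = 0.05                           # printed marker side length in metres

# 3D corners of the marker in its own frame (z = 0 plane), matching ArUco's corner order.
obj_pts = np.array([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]],
                   dtype=np.float32) * (marker_len / 2)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("camera_frame.png")      # placeholder for one frame of the video feed
corners, ids, _ = detector.detectMarkers(frame)

if ids is not None:
    # Pose of the first detected marker relative to the camera.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2), K, dist)
    if ok:
        print("marker", int(ids[0]), "is", tvec.ravel(), "metres from the camera")
</syntaxhighlight>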
=== Software and standards ===
{{further|ARKit|ARCore|Augmented Reality Markup Language}}
AR runtimes provide sensing, tracking, and rendering pipelines, and mobile platforms expose software development kits (SDKs) with camera access and spatial tracking. Interchange and geospatial formats such as the Augmented Reality Markup Language (ARML) standardize how anchors and content are described.<ref>{{cite web |date=2015-02-24 |title=ARML 2.0—OGC Standard |url=https://www.ogc.org/publications/standard/arml/ |website=Open Geospatial Consortium}}</ref><ref>{{cite web |title=ARKit overview |url=https://developer.apple.com/documentation/arkit |access-date=26 September 2025 |website=Apple Developer Documentation}}</ref><ref name="core1"/>
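The overall shape of these runtimes is similar: configure a session, then consume per-frame poses and anchors while the tracker refines them. The loop below is a generic sketch with hypothetical class names (ARSession, Anchor); it is not the ARKit or ARCore API, which differ in language and detail.
<syntaxhighlight lang="python">
from dataclasses import dataclass, field

@dataclass
class Anchor:
    """A pose in the world that content is attached to (hypothetical type)."""
    pose: list       # 4x4 transform, row-major
    payload: str     # e.g. a model file or an ARML feature reference

@dataclass
class ARSession:
    """Stand-in for an AR runtime session; real SDKs provide this natively."""
    anchors: list = field(default_factory=list)

    def add_anchor(self, pose, payload):
        self.anchors.append(Anchor(pose, payload))

    def update(self, camera_frame):
        # A real runtime would run VIO/SLAM here and refine anchor poses.
        return self.anchors

session = ARSession()
session.add_anchor(pose=[[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, -2], [0, 0, 0, 1]],
                   payload="models/chair.glb")
for anchor in session.update(camera_frame=None):
    print("render", anchor.payload, "at z =", anchor.pose[2][3], "m")
</syntaxhighlight>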
=== Interaction and input ===
{{further|Human–computer interaction|Gesture recognition}}
Input commonly combines head or eye gaze with touch, controllers, voice, or hand tracking; audio and haptics can reduce visual load. Human-factors studies report performance benefits but also workload and safety trade-offs that depend on the task and context.<ref>{{cite journal |last=Yang |first=Z. |year=2019 |title=Influences of Augmented Reality Assistance on Cognitive Load and Performance in Manual Assembly |journal=Frontiers in Psychology |volume=10 |page=1703 |doi=10.3389/fpsyg.2019.01703 |pmid=31396134 |pmc=6668604 |doi-access=free }}</ref><ref name="syed1"/>
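A typical multimodal pattern is gaze-plus-gesture selection: head or eye gaze supplies a pointing ray, and a pinch (or a voice command) confirms the target the ray is aimed at. The sketch below is purely geometric; the object positions and the pinch flag are made-up stand-ins for a hand-tracking subsystem.
<syntaxhighlight lang="python">
import numpy as np

def gaze_target(gaze_origin, gaze_dir, objects, max_angle_deg=5.0):
    """Return the object closest to the gaze ray, within an angular tolerance."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_angle = None, np.radians(max_angle_deg)
    for name, position in objects.items():
        to_obj = position - gaze_origin
        cos_a = np.dot(gaze_dir, to_obj / np.linalg.norm(to_obj))
        angle = np.arccos(np.clip(cos_a, -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

objects = {"valve": np.array([0.1, 0.0, -2.0]),
           "gauge": np.array([1.5, 0.3, -2.0])}
target = gaze_target(np.zeros(3), np.array([0.0, 0.0, -1.0]), objects)

pinch_detected = True            # would come from the hand-tracking subsystem
if pinch_detected and target:
    print("selected:", target)   # -> selected: valve
</syntaxhighlight>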
=== Design considerations ===
Key usability factors include stable registration, legible contrast under varied lighting, and low motion-to-photon latency. Visual design often uses depth cues such as occlusion and shadows to support spatial judgment, while safety-critical uses emphasize glanceable prompts and minimal interaction.<ref name="itoh1"/><ref>{{cite journal |last=Warburton |first=M. |year=2023 |title=Measuring motion-to-photon latency for sensorimotor experiments with virtual reality systems |url=https://eprints.whiterose.ac.uk/id/eprint/193470/3/s13428-022-01983-5.pdf |journal=Behavior Research Methods |volume=55 |issue=7 |pages=3658–3678 |doi=10.3758/s13428-022-01983-5 |pmid=36217006 |pmc=10616216 }}</ref><ref name="azuma1"/>
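Motion-to-photon latency is the delay between a head movement and the corresponding image update reaching the eye, and it can be budgeted across the pipeline stages. The stage durations below are illustrative assumptions, not measurements of any particular device.
<syntaxhighlight lang="python">
# Illustrative end-to-end latency budget (milliseconds); all values are assumptions.
stages_ms = {
    "IMU/camera sampling":  2.0,
    "pose estimation":      3.0,
    "application + render": 6.0,
    "display scan-out":     5.5,   # roughly half a frame at 90 Hz (11.1 ms per frame)
}

total = sum(stages_ms.values())
print(f"motion-to-photon latency is about {total:.1f} ms")               # 16.5 ms

# Angular registration error for a given head rotation rate.
head_speed_deg_per_s = 100.0                                             # a brisk head turn
error_deg = head_speed_deg_per_s * total / 1000.0
print(f"overlay lags the real world by about {error_deg:.2f} degrees")   # 1.65 degrees
</syntaxhighlight>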
== Applications ==
{{further|Industrial augmented reality}}
Augmented reality has been explored for many uses, including gaming, medicine, entertainment, education, and business.<ref>{{Cite journal |last1=Moro |first1=Christian |last2=Štromberga |first2=Zane |last3=Raikos |first3=Athanasios |last4=Stirling |first4=Allan |date=2017 |title=The effectiveness of virtual and augmented reality in health sciences and medical anatomy |journal=Anatomical Sciences Education |volume=10 |issue=6 |pages=549–559 |doi=10.1002/ase.1696 |issn=1935-9780 |pmid=28419750 |s2cid=25961448}}</ref> Some of the earliest cited examples range from augmented reality used to support surgery, by providing virtual overlays to guide medical practitioners, to AR content for astronomy and welding.<ref name="Dupzyk 2016" /><ref>{{Cite news |date=20 July 2012 |title=Don't be blind on wearable cameras insists AR genius |url=https://www.slashgear.com/dont-be-blind-on-wearable-cameras-insists-ar-genius-20239514/ |access-date=21 October 2018 |work=SlashGear |language=en-US}}</ref> Example application areas described below include archaeology, architecture, commerce, and education.
===Education and training===
AR can overlay models and step-by-step guidance in real settings (e.g., anatomy teaching or maintenance training); systematic reviews report learning benefits alongside design and implementation caveats that vary by context and task.<ref>{{cite journal |last=Li |first=G. |year=2025 |title=Augmented Reality in Higher Education: A Systematic Review and Meta-Analysis (2000–2023) |journal=Education Sciences |volume=15 |issue=6 |page=678 |doi=10.3390/educsci15060678 |doi-access=free }}</ref><ref>{{cite journal |last=Gabbard |first=J.L. |year=2024 |title=A Systematic Review of VR/AR in Higher Education: Benefits, Challenges, and Trends |url=https://www.tandfonline.com/doi/full/10.1080/14703297.2024.2382854 |journal=The International Journal for Academic Development |doi=10.1080/14703297.2024.2382854}}</ref><ref>{{cite journal |last=Park |first=S. |year=2024 |title=Effects of immersive technology-based education for undergraduate nursing students: a systematic review |journal=BMC Nursing |volume=23 |article-number=e57566 |doi=10.2196/57566 |doi-access=free |pmid=38978483 |pmc=11306947 }}</ref>
=== Medicine ===
Guidance overlays and image fusion support surgical planning and intraoperative visualization across several specialties; reviews note accuracy and registration constraints as well as workflow integration issues.<ref>{{cite journal |last=Doornbos |first=M.C.J. |year=2024 |title=Augmented Reality Implementation in Minimally Invasive Surgery of Deformable Organs: A Systematic Review |journal=Journal of Personalized Medicine |volume=14 |issue=7 |pages=646–658 |doi=10.1177/15533506241290412 |pmid=39370802 |pmc=11475712 |doi-access=free }}</ref><ref>{{cite journal |last=Malhotra |first=S. |year=2023 |title=Augmented Reality in Surgical Navigation: A Review of Current State and Future Directions |journal=Applied Sciences |volume=13 |issue=3 |page=1629 |doi=10.3390/app13031629 |doi-access=free }}</ref><ref>{{cite journal |last1=Nadeem-Tariq |first1=Ahmed |last2=Kazemeini |first2=Sarah |last3=Kaur |first3=Pratiksha |last4=Dang |first4=Grace |last5=Davis |first5=Trevor |last6=Sraa |first6=Kiratpreet |last7=Zitser |first7=Philip |last8=Fang |first8=Christopher |title=Augmented Reality in Spine Surgery: A Narrative Review of Clinical Accuracy, Workflow Efficiency, and Barriers to Adoption |journal=Cureus |date=2025 |volume=17 |issue=6 |article-number=e86803 |doi=10.7759/cureus.86803 |doi-access=free |pmid=40718258 |pmc=12296264 }}</ref>
=== Industry ===
Industrial uses include hands-free work instructions, inspection, and remote assistance tied to physical assets; the evidence highlights productivity gains alongside limits around tracking robustness, ergonomics, and change management.<ref>{{cite journal |last=Morales Méndez |first=G. |year=2024 |title=Industry 4.0 Assistance and Training with Augmented Reality: A Systematic Review and Bibliometric Analysis |journal=Electronics |volume=13 |issue=6 |page=1147 |doi=10.3390/electronics13061147 |doi-access=free }}</ref><ref>{{cite journal |last=Wu |first=L. |year=2025 |title=Augmented Remote Assistance for Quality Inspection |url=https://www.tandfonline.com/doi/full/10.1080/10447318.2025.2534059 |journal=International Journal of Human–Computer Interaction |doi=10.1080/10447318.2025.2534059}}</ref><ref>{{cite journal |last=Souza |first=B.J. |year=2025 |title=Towards an Integration of Augmented Reality in Industrial Assembly Tasks: A Literature Review |url=https://www.sciencedirect.com/science/article/pii/S1877050925001760 |journal=Procedia Computer Science |doi=10.1016/j.procs.2025.01.168 }}</ref>[[File:Desjardins AR Augmented Reality Game, March 2013.png|thumb|upright|alt= An image from an AR mobile game | An AR mobile game using a trigger image as [[fiducial marker]]]]
=== Entertainment and games ===
Location-based and camera-based games place virtual objects in real spaces; recent surveys cover design patterns, effectiveness, and safety and attention trade-offs.<ref>{{cite journal|title=Mobile Augmented Reality: A Systematic Review of Current Trends and Challenges|journal=Proceedings of CHI 2025|year=2025|publisher=ACM|doi=10.1145/3723498.3723807|url=https://dl.acm.org/doi/10.1145/3723498.3723807}}</ref><ref>{{cite journal |last=Wickramasinghe |first=Y.S. |year=2025 |title=Representing remote locations with location-based augmented reality games |url=https://www.sciencedirect.com/science/article/pii/S1875952125000126 |journal=Entertainment Computing |doi=10.1016/j.entcom.2025.100932 }}</ref><ref>{{cite journal |last=Guo |first=J. |year=2024 |title=How Do Location-Based AR Games Enhance Value Co-Creation in Tourism? |journal=Applied Sciences |volume=14 |issue=15 |page=6812 |doi=10.3390/app14156812 |doi-access=free }}</ref>
=== Navigation and maps ===
Augmented reality navigation overlays route guidance or hazard cues onto the real scene, typically via smartphone "live view" modes or in-vehicle head-up displays. Research finds that AR can improve wayfinding and driver situation awareness, but human-factors trade-offs (distraction, cognitive load, occlusion) matter for safety-critical use.<ref>{{cite journal |last=Gabbard |first=J.L. |year=2024 |title=Augmented Reality Navigation: A Survey |url=https://www.tandfonline.com/doi/full/10.1080/10447318.2024.2431757 |journal=International Journal of Human–Computer Interaction |volume=40 |issue=12 |pages=10190–10206 |doi=10.1080/10447318.2024.2431757}}</ref><ref name="zhou2024"/><ref>{{cite journal |last=Cheng |first=Y. |year=2023 |title=Does the AR-HUD system affect driving behaviour? An eye-movement study |url=https://www.sciencedirect.com/science/article/pii/S2590198223000143 |journal=Journal of Transport & Health |volume=30 |doi=10.1016/j.jth.2023.101611}}</ref><ref>{{cite journal |last=Valizadeh |first=M. |year=2024 |title=Indoor AR pedestrian navigation for emergency evacuation |url=https://www.sciencedirect.com/science/article/pii/S2405844024088832 |journal=Heliyon |volume=10 |doi=10.1016/j.heliyon.2024.eXXXXX |doi-broken-date=26 September 2025 |doi-access=free }}</ref>
''See also'': [[Head-up display]], [[Automotive navigation system]], [[Wayfinding]]
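Outdoor AR navigation cues are often positioned by comparing the bearing to a geo-referenced waypoint with the device's compass heading, then mapping the difference onto the camera's horizontal field of view. The following is a simplified sketch that ignores pitch and altitude; the coordinates, heading, and field of view are made-up example values.
<syntaxhighlight lang="python">
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def waypoint_screen_x(user, waypoint, heading_deg, fov_deg=60.0, width_px=1080):
    """Horizontal pixel position of the waypoint cue, or None if it is off-screen."""
    rel = (bearing_deg(*user, *waypoint) - heading_deg + 540.0) % 360.0 - 180.0
    if abs(rel) > fov_deg / 2:
        return None
    return int(width_px * (0.5 + rel / fov_deg))   # 0 = left edge of the camera view

user = (48.8584, 2.2945)        # example user position
waypoint = (48.8606, 2.3376)    # example destination
print(waypoint_screen_x(user, waypoint, heading_deg=80.0))
</syntaxhighlight>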
===Architecture, engineering, and construction===
In the architecture, engineering, and construction (AEC) sector, AR is used for design visualization, on-site verification against [[Building information modeling|BIM]] models, clash detection, and guided assembly and inspection. Systematic reviews report benefits for communication and error reduction, while noting limits around tracking robustness and workflow integration.<ref>{{cite journal |last1=Rankohi |first1=S. |last2=Waugh |first2=L. |year=2013 |title=Review and analysis of augmented reality literature for construction industry |journal=Visualization in Engineering |volume=1 |issue=9 |article-number=9 |doi=10.1186/2213-7459-1-9 |doi-access=free }}</ref><ref>{{cite journal |last1=Chi |first1=H.-L. |last2=Kang |first2=S.-C. |last3=Wang |first3=X. |year=2013 |title=Research trends and opportunities of AR applications in AEC |journal=Automation in Construction |volume=33 |pages=116–122 |doi=10.1016/j.autcon.2012.12.017}}</ref><ref>{{cite journal |last=Nassereddine |first=H. |year=2022 |title=Augmented Reality in the Construction Industry: Use Cases, Benefits, and Barriers |journal=Frontiers in Built Environment |volume=8 |article-number=730094 |doi=10.3389/fbuil.2022.730094 |doi-access=free }}</ref>
===Archaeology===
AR has been used to aid [[Archaeology|archaeological]] research. By augmenting archaeological features onto the modern landscape, AR allows archaeologists to formulate possible site configurations from extant structures.<ref>{{cite journal |title=Augmenting Phenomenology: Using Augmented Reality to Aid Archaeological Phenomenology in the Landscape |author=Stuart Eve |doi=10.1007/s10816-012-9142-7 | volume=19 |issue=4 |journal=Journal of Archaeological Method and Theory |pages=582–600|url=http://discovery.ucl.ac.uk/1352447/1/Eve_2012_Augmented_Phenomenology.pdf |year=2012 |s2cid=4988300 }}</ref> Computer-generated models of ruins, buildings, landscapes and even ancient people have been incorporated into early archaeological AR applications.<ref>{{cite book |url=http://portal.acm.org/citation.cfm?id=854948 |title=Archeoguide: System Architecture of a Mobile Outdoor Augmented Reality System |author1=Dähne, Patrick |author2=Karigiannis, John N. |access-date=6 January 2010|isbn=978-0-7695-1781-0 |year=2002 |publisher=IEEE Computer Society Press }}</ref><ref>{{cite web |url=http://archpro.lbg.ac.at/press-release/school-gladiators-discovered-roman-carnuntum-austria |title=School of Gladiators discovered at Roman Carnuntum, Austria |author=LBI-ArchPro |access-date=29 December 2014|date=5 September 2011}}</ref><ref>{{Cite journal|title = Mixing virtual and real scenes in the site of ancient Pompeii|journal = Computer Animation and Virtual Worlds|date = 1 February 2005|issn = 1546-427X|pages = 11–24|volume = 16|issue = 1|doi = 10.1002/cav.53|first1 = George|last1 = Papagiannakis|first2 = Sébastien|last2 = Schertenleib|first3 = Brian|last3 = O'Kennedy|first4 = Marlene|last4 = Arevalo-Poizat|first5 = Nadia|last5 = Magnenat-Thalmann|first6 = Andrew|last6 = Stoddart|first7 = Daniel|last7 = Thalmann|citeseerx = 10.1.1.64.8781|s2cid = 5341917}}</ref> For example, a system such as VITA (Visual Interaction Tool for Archaeology) allows users to investigate excavation results remotely, with each user collaborating by mutually "navigating, searching, and viewing data". Hrvoje Benko, a researcher in the computer science department at [[Columbia University]], points out that such systems can provide "3D panoramic images and 3D models of the site itself at different excavation stages" while organizing much of the data in a collaborative way that is easy to use.
Collaborative AR systems supply [[multimodal interaction]]s that combine the real world with virtual images of both environments.<ref>{{Cite book |doi = 10.1109/ISMAR.2004.23|chapter = Collaborative Mixed Reality Visualization of an Archaeological Excavation|title = Third IEEE and ACM International Symposium on Mixed and Augmented Reality|pages = 132–140|year = 2004|last1 = Benko|first1 = H.|last2 = Ishak|first2 = E.W.|last3 = Feiner|first3 = S.|s2cid = 10122485|isbn = 0-7695-2191-6}}</ref>
===Commerce===
{{main|Commercial augmented reality}}
AR is used to integrate print and video marketing. Printed marketing material can be designed with certain "trigger" images that, when scanned by an AR-enabled device using image recognition, activate a video version of the promotional material. A major difference between augmented reality and straightforward image recognition is that multiple media can be overlaid at the same time in the view screen, such as social media share buttons, in-page video, and even audio and 3D objects. Traditional print-only publications are using augmented reality to connect different types of media.<ref>Katts, Rima. [http://www.mobilemarketer.com/cms/news/software-technology/13810.html Elizabeth Arden brings new fragrance to life with augmented reality] ''Mobile Marketer'', 19 September 2012.</ref><ref>Meyer, David. [https://web.archive.org/web/20120918191136/http://gigaom.com/europe/telefonica-bets-on-augmented-reality-with-aurasma-tie-in/ Telefónica bets on augmented reality with Aurasma tie-in] ''gigaom'', 17 September 2012.</ref><ref>Mardle, Pamela.[http://www.printweek.com/news/1153133/Video-becomes-reality-Stuprintcom/ Video becomes reality for Stuprint.com] {{webarchive |url=https://web.archive.org/web/20130312171811/http://www.printweek.com/news/1153133/Video-becomes-reality-Stuprintcom/ |date=12 March 2013 }}. ''[[PrintWeek]]'', 3 October 2012.</ref><ref>Giraldo, Karina.[http://www.solinix.co/blog/marketing-movil-su-importancia-para-las-marcas/ Why mobile marketing is important for brands?] {{webarchive|url=https://web.archive.org/web/20150402135323/http://solinix.co/blog/marketing-movil-su-importancia-para-las-marcas/ |date=2 April 2015 }}. ''SolinixAR'', January 2015.</ref><ref>{{cite news|title=Augmented reality could be advertising world's best bet|url=http://www.financialexpress.com/article/industry/companies/augmented-reality-could-be-advertising-worlds-best-bet/64855/|agency=The Financial Express|date=18 April 2015|archive-url=https://web.archive.org/web/20150521061314/http://www.financialexpress.com/article/industry/companies/augmented-reality-could-be-advertising-worlds-best-bet/64855/|archive-date=21 May 2015}}</ref>
AR can enhance product previews, such as allowing a customer to view what is inside a product's packaging without opening it.<ref>Humphries, Mathew.[http://www.geek.com/articles/gadgets/lego-demos-augmented-reality-boxes-with-gesture-recognition-20110919/] {{Webarchive|url=https://web.archive.org/web/20120626192637/http://www.geek.com/articles/gadgets/lego-demos-augmented-reality-boxes-with-gesture-recognition-20110919/|date=26 June 2012}}.''Geek.com'' 19 September 2011.</ref> AR can also be used as an aid in selecting products from a catalog or through a kiosk. Scanned images of products can activate views of additional content such as customization options and additional images of the product in use.<ref>Netburn, Deborah.[https://www.latimes.com/business/technology/la-ikeas-augmented-reality-app-20120723,0,1261315.story Ikea introduces augmented reality app for 2013 catalog] {{Webarchive|url=https://web.archive.org/web/20121202070158/http://www.latimes.com/business/technology/la-ikeas-augmented-reality-app-20120723,0,1261315.story |date=2 December 2012 }}. ''[[Los Angeles Times]]'', 23 July 2012.</ref>
In 2018, [[Shopify]], the Canadian e-commerce company, announced AR Quick Look integration: merchants can upload 3D models of their products, which users can tap on inside the Safari browser on their iOS devices to view in their real-world environments.<ref>{{cite web | url = https://techcrunch.com/2018/09/17/shopify-is-bringing-apples-latest-ar-tech-to-their-platform/ | title = Shopify is bringing Apple's latest AR tech to their platform | date = 17 September 2018 | last = Matney | first = Lucas | website = [[TechCrunch]] | access-date = 3 December 2018}}</ref>
In 2018, [[Twinkl]] released a free AR classroom application that lets pupils see how [[York]] looked over 1,900 years ago.<ref>{{cite journal | url = https://www.qaeducation.co.uk/article/ar-classroom-york | title = History re-made: New AR classroom application lets pupils see how York looked over 1,900 years ago | journal = QA Education | access-date = 4 September 2018 | date = 4 September 2018 | last1 = Magazine | first1 = QA Education | archive-date = 25 December 2018 | archive-url = https://web.archive.org/web/20181225175230/https://www.qaeducation.co.uk/article/ar-classroom-york }}</ref> Twinkl also launched what it described as the first multi-player AR game, ''Little Red'',<ref>{{cite journal | url = https://www.prolificnorth.co.uk/news/digital/2018/09/sheffields-twinkl-claims-ar-first-new-game| title = Sheffield's Twinkl claims AR first with new game | journal = Prolific North| access-date = 19 September 2018| date = 19 September 2018}}</ref> and has over 100 free AR educational models.<ref>{{cite journal | url = http://www.the-educator.org/technology-from-twinkl-brings-never-seen-before-objects-to-the-classroom/ | title = Technology from Twinkl brings never seen before objects to the classroom | journal = The Educator UK| access-date = 21 December 2018| date = 21 September 2018}}</ref>
Augmented reality is becoming more frequently used for online advertising. Retailers offer the ability to upload a picture on their website and "try on" various clothes which are overlaid on the picture. Going further, companies such as Bodymetrics install dressing booths in department stores that offer [[full-body scanning]]. These booths render a 3D model of the user, allowing consumers to view different outfits on themselves without the need to physically change clothes.<ref>Pavlik, John V., and Shawn McIntosh. "Augmented Reality." ''Converging Media: a New Introduction to Mass Communication'', 5th ed., [[Oxford University Press]], 2017, pp. 184–185.</ref> For example, [[J. C. Penney|JC Penney]] and [[Bloomingdale's]] use "[[virtual dressing room]]s" that allow customers to see themselves in clothes without trying them on.<ref name="Dacko-2017">{{cite journal |last1=Dacko |first1=Scott G. |title=Enabling smart retail settings via mobile augmented reality shopping apps |journal=Technological Forecasting and Social Change |date=November 2017 |volume=124 |pages=243–256 |doi=10.1016/j.techfore.2016.09.032 |url=http://wrap.warwick.ac.uk/81922/5/WRAP-enabling-smart-retail-Dacko-2017.pdf }}</ref> Another store that uses AR to market clothing to its customers is [[Neiman Marcus]].<ref name="Retail Dive">{{Cite news|url=https://www.retaildive.com/news/how-neiman-marcus-is-turning-technology-innovation-into-a-core-value/436590/|title=How Neiman Marcus is turning technology innovation into a 'core value'|work=Retail Dive|access-date=23 September 2018|language=en-US}}</ref> Neiman Marcus offers consumers the ability to see their outfits in a 360-degree view with their "memory mirror".<ref name="Retail Dive" /> Makeup stores like [[L'Oréal]], [[Sephora]], [[Charlotte Tilbury]], and [[Rimmel]] also have apps that utilize AR.<ref name="Arthur" /> These apps allow consumers to see how the makeup will look on them.<ref name="Arthur" /> According to Greg Jones, director of AR and VR at Google, augmented reality is going to "reconnect physical and digital retail".<ref name="Arthur" />
AR technology is also used by furniture retailers such as [[IKEA]], [[Houzz]], and [[Wayfair]].<ref name="Arthur">{{Cite news|url=https://www.forbes.com/sites/rachelarthur/2017/10/31/augmented-reality-is-set-to-transform-fashion-and-retail/#364c701b3151|title=Augmented Reality Is Set To Transform Fashion And Retail|last=Arthur|first=Rachel|work=Forbes|access-date=23 September 2018|language=en}}</ref><ref name="Dacko-2017" /> These retailers offer apps that allow consumers to view their products in their home prior to purchasing anything.<ref name="Arthur" /><ref>{{cite web |url=https://archvisualizations.com/augmented-reality-apps-for-interior-visualization/ |title=Augmented Reality Apps for Interior Visualization |access-date=2024-04-09 |website=archvisualizations.com|date=30 January 2024 }}</ref>
In 2017, [[Ikea]] announced the Ikea Place app. It contains a catalogue of over 2,000 products, nearly the company's full collection of sofas, armchairs, coffee tables, and storage units, which one can place anywhere in a room with their phone.<ref>{{cite magazine | url = https://www.wired.com/story/ikea-place-ar-kit-augmented-reality/ | title = IKEA's new app flaunts what you'll love most about AR| magazine = [[Wired (magazine)|Wired]] | access-date = 20 September 2017| date = 20 September 2017| last1 = Pardes| first1 = Arielle}}</ref> The app made it possible to have 3D and true-to-scale models of furniture in the customer's living space. IKEA realized that their customers are not shopping in stores as often or making direct purchases anymore.<ref>{{Cite web|url=https://www.ikea.com/ms/en_CH/this-is-ikea/ikea-highlights/2017/ikea-place-app/index.html|title=IKEA Highlights 2017|access-date=8 October 2018|archive-date=8 October 2018|archive-url=https://web.archive.org/web/20181008214446/https://www.ikea.com/ms/en_CH/this-is-ikea/ikea-highlights/2017/ikea-place-app/index.html}}</ref><ref>{{Cite web|url=https://www.inter.ikea.com/en/performance|archive-url=https://web.archive.org/web/20180626015939/https://highlights.ikea.com/2017/facts-and-figures/|title=Performance|archive-date=26 June 2018|website=www.inter.ikea.com}}</ref> Shopify's acquisition of Primer, an AR [[Application software|app]], aims to push small and medium-sized sellers towards interactive AR shopping with easy-to-use AR integration and user experience for both merchants and consumers. AR helps the retail industry reduce operating costs: merchants upload product information to the AR system, and consumers can use mobile terminals to search and generate 3D maps.<ref>{{Cite journal |last1=Indriani |first1=Masitoh |last2=Liah Basuki Anggraeni |date=2022-06-30 |title=What Augmented Reality Would Face Today? The Legal Challenges to the Protection of Intellectual Property in Virtual Space |journal=Media Iuris |volume=5 |issue=2 |pages=305–330 |doi=10.20473/mi.v5i2.29339 |s2cid=250464007 |issn=2621-5225|doi-access=free }}</ref>
=== Literature ===
===Healthcare planning, practice and education===
One of the first applications of augmented reality was in healthcare, particularly to support the planning, practice, and training of surgical procedures. As far back as 1992, enhancing human performance during surgery was a formally stated objective when building the first augmented reality systems at U.S. Air Force laboratories.<ref name="B. Rosenberg 1992"/> AR provides surgeons with patient monitoring data in the style of a fighter pilot's heads-up display, and allows patient imaging records, including functional videos, to be accessed and overlaid.
Examples include a virtual [[X-ray]] view based on prior [[tomography]] or on real-time images from [[ultrasound]] and [[confocal microscopy]] probes,<ref>{{cite book |doi=10.1007/978-3-642-04268-3_60 |pmid=20426023 |chapter=Optical Biopsy Mapping for Minimally Invasive Cancer Screening |title=Medical Image Computing and Computer-Assisted Intervention – MICCAI 2009 |volume=5761 |issue=Pt 1 |pages=483–490 |series=Lecture Notes in Computer Science |year=2009 |last1=Mountney |first1=Peter |last2=Giannarou |first2=Stamatia |last3=Elson |first3=Daniel |last4=Yang |first4=Guang-Zhong |isbn=978-3-642-04267-6 }}</ref> visualizing the position of a tumor in the video of an [[endoscope]],<ref>{{youTube|4emmCcBb4s|Scopis Augmented Reality: Path guidance to craniopharyngioma}}</ref> or radiation exposure risks from X-ray imaging devices.<ref>{{cite book |doi=10.1007/978-3-319-10404-1_52 |pmid=25333145 |chapter=3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose |title=Medical Image Computing and Computer-Assisted Intervention – MICCAI 2014 |volume=8673 |issue=Pt 1 |pages=415–422 |series=Lecture Notes in Computer Science |year=2014 |last1=Loy Rodas |first1=Nicolas |last2=Padoy |first2=Nicolas |isbn=978-3-319-10403-4 |s2cid=819543 }}</ref><ref>{{youTube|pINE2gaOVOY|3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose}}</ref> AR can enhance viewing a [[fetus]] inside a mother's [[womb]].<ref>{{cite web |url=http://www.cs.unc.edu/Research/us/ |title=UNC Ultrasound/Medical Augmented Reality Research |access-date=6 January 2010 |archive-url=https://web.archive.org/web/20100212231230/http://www.cs.unc.edu/Research/us/ |archive-date=12 February 2010 |url-status=live}}</ref> Siemens, Karl Storz and IRCAD have developed a system for [[Laparoscopy|laparoscopic]] liver surgery that uses AR to view sub-surface tumors and vessels.<ref>{{cite book |doi=10.1007/978-3-319-10404-1_53 |pmid=25333146 |chapter=An Augmented Reality Framework for Soft Tissue Surgery |title=Medical Image Computing and Computer-Assisted Intervention – MICCAI 2014 |volume=8673 |issue=Pt 1 |pages=423–431 |series=Lecture Notes in Computer Science |year=2014 |last1=Mountney |first1=Peter |last2=Fallert |first2=Johannes |last3=Nicolau |first3=Stephane |last4=Soler |first4=Luc |last5=Mewes |first5=Philip W. |isbn=978-3-319-10403-4 }}</ref>
AR has been used for cockroach phobia treatment<ref>{{cite journal |last1=Botella |first1=Cristina |last2=Bretón-López |first2=Juani |last3=Quero |first3=Soledad |last4=Baños |first4=Rosa |last5=García-Palacios |first5=Azucena |title=Treating Cockroach Phobia With Augmented Reality |journal=Behavior Therapy |date=September 2010 |volume=41 |issue=3 |pages=401–413 |doi=10.1016/j.beth.2009.07.002 |pmid=20569788 |s2cid=29889630 }}</ref> and to reduce the fear of spiders.<ref>{{Cite journal|last1=Zimmer|first1=Anja|last2=Wang|first2=Nan|last3=Ibach|first3=Merle K.|last4=Fehlmann|first4=Bernhard|last5=Schicktanz|first5=Nathalie S.|last6=Bentz|first6=Dorothée|last7=Michael|first7=Tanja|last8=Papassotiropoulos|first8=Andreas|last9=de Quervain|first9=Dominique J. F.|date=2021-08-01|title=Effectiveness of a smartphone-based, augmented reality exposure app to reduce fear of spiders in real-life: A randomized controlled trial|journal=Journal of Anxiety Disorders|language=en|volume=82|article-number=102442|doi=10.1016/j.janxdis.2021.102442|pmid=34246153|s2cid=235791626|issn=0887-6185|doi-access=free}}</ref> Patients wearing augmented reality glasses can be reminded to take medications.<ref>{{cite web | url = http://www.healthtechevent.com/technology/augmented-reality-revolutionizing-medicine-healthcare/ | title = Augmented Reality Revolutionizing Medicine | publisher = Health Tech Event | access-date = 9 October 2014 | date = 6 June 2014 | archive-date = 12 October 2014 | archive-url = https://web.archive.org/web/20141012184851/http://www.healthtechevent.com/technology/augmented-reality-revolutionizing-medicine-healthcare/ }}</ref> Augmented reality can be helpful in the medical field because it can provide crucial information to a doctor or surgeon without requiring them to take their eyes off the patient.<ref>{{Cite journal|last=Thomas|first=Daniel J.|date=December 2016|title=Augmented reality in surgery: The Computer-Aided Medicine revolution|journal=International Journal of Surgery |volume=36|issue=Pt A|page=25|doi=10.1016/j.ijsu.2016.10.003|issn=1743-9159|pmid=27741424|doi-access=free}}</ref>
On 30 April 2015, Microsoft announced the [[Microsoft HoloLens]], their first attempt at augmented reality. The HoloLens is capable of displaying images for image-guided surgery.<ref>{{Cite book|last1=Cui|first1=Nan|last2=Kharel|first2=Pradosh|last3=Gruev|first3=Viktor|s2cid=125528534|date=8 February 2017|title=Augmented reality with Microsoft HoloLens holograms for near-infrared fluorescence based image guided surgery|publisher=International Society for Optics and Photonics|volume=10049|pages=100490I|doi=10.1117/12.2251625|series=Molecular-Guided Surgery: Molecules, Devices, and Applications III|chapter=Augmented reality with Microsoft HoloLens holograms for near-infrared fluorescence based image guided surgery|editor1-last=Pogue|editor1-first=Brian W|editor2-last=Gioux|editor2-first=Sylvain}}</ref> As augmented reality advances, it finds increasing applications in healthcare. Augmented reality and similar computer-based utilities are being used to train medical professionals.<ref>{{cite journal |last1=Moro |first1=C |last2=Birt |first2=J |last3=Stromberga |first3=Z |last4=Phelps |first4=C |last5=Clark |first5=J |last6=Glasziou |first6=P |last7=Scott |first7=AM |title=Virtual and Augmented Reality Enhancements to Medical and Science Student Physiology and Anatomy Test Performance: A Systematic Review and Meta-Analysis. |journal=Anatomical Sciences Education |date=May 2021 |volume=14 |issue=3 |pages=368–376 |doi=10.1002/ase.2049 |pmid=33378557|s2cid=229929326 |url=https://research.bond.edu.au/en/publications/63e5a776-f3fd-48f2-b0ba-f47ca4ca96e2 }}</ref><ref>{{Cite journal|last1=Barsom|first1=E. Z.|last2=Graafland|first2=M.|last3=Schijven|first3=M. P.|date=1 October 2016|title=Systematic review on the effectiveness of augmented reality applications in medical training|journal=Surgical Endoscopy|language=en|volume=30|issue=10|pages=4174–4183|doi=10.1007/s00464-016-4800-6|pmid=26905573|issn=0930-2794|pmc=5009168}}</ref> In healthcare, AR can be used to provide guidance during diagnostic and therapeutic interventions, e.g. during surgery. Magee et al.,<ref>{{Cite journal|last1=Magee|first1=D.|last2=Zhu|first2=Y.|last3=Ratnalingam|first3=R.|last4=Gardner|first4=P.|last5=Kessel|first5=D.|date=1 October 2007|title=An augmented reality simulator for ultrasound guided needle placement training|journal=Medical & Biological Engineering & Computing|language=en|volume=45|issue=10|pages=957–967|doi=10.1007/s11517-007-0231-9|pmid=17653784|s2cid=14943048|issn=1741-0444|url=http://eprints.whiterose.ac.uk/75786/8/Combine.pdf}}</ref> for instance, describe the use of augmented reality for medical training in simulating ultrasound-guided needle placement. Recently, augmented reality began seeing adoption in [[neurosurgery]], a field that requires heavy amounts of imaging before procedures.<ref>{{Cite journal|last1=Tagaytayan|first1=Raniel|last2=Kelemen|first2=Arpad|last3=Sik-Lanyi|first3=Cecilia|title=Augmented reality in neurosurgery|journal=Archives of Medical Science |volume=14|issue=3|pages=572–578|doi=10.5114/aoms.2016.58690|issn=1734-1922|pmc=5949895|pmid=29765445|year=2018}}</ref>
| Line 315: | Line 218: | ||
===Military===
[[File:ARC4 AR System.jpg|thumb|alt= Photograph of an Augmented Reality System for Soldier ARC4. |Augmented reality system for soldier ARC4 (U.S. Army 2017)]]
The first fully immersive system was the [[Virtual fixture|Virtual Fixtures]] platform, which was developed in 1992 by Louis Rosenberg at the [[Armstrong Labs|Armstrong Laboratories]] of the [[United States Air Force]].<ref name="ros92">Rosenberg, Louis B. (1992). "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments". Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.</ref> It enabled human users to control [[Robot|robots]] in real-world environments that included real physical objects and 3D virtual overlays ("fixtures") that were added to enhance human performance of manipulation tasks. Published studies showed that by introducing virtual objects into the real world, significant performance increases could be achieved by human operators.<ref name="ros92" /><ref>{{cite journal |last1=Rosenberg |first1=Louis B. |date=21 December 1993 |title=Virtual fixtures as tools to enhance operator performance in telepresence environments |journal=Telemanipulator Technology and Space Telerobotics |volume=2057 |pages=10–21 |bibcode=1993SPIE.2057...10R |doi=10.1117/12.164901}}</ref><ref>{{cite journal |last1=Hughes |first1=C.E. |last2=Stapleton |first2=C.B. |last3=Hughes |first3=D.E. |last4=Smith |first4=E.M. |date=November 2005 |title=Mixed reality in education, entertainment, and training |journal=IEEE Computer Graphics and Applications |volume=25 |issue=6 |pages=24–30 |doi=10.1109/MCG.2005.139 |pmid=16315474 |bibcode=2005ICGA...25f..24H }}</ref>
An early application of AR occurred when [[Rockwell International]] created video map overlays of satellite and orbital debris tracks to aid in space observations at Air Force Maui Optical System. In their 1993 paper "Debris Correlation Using the Rockwell WorldView System" the authors describe the use of map overlays applied to video from space surveillance telescopes. The map overlays indicated the trajectories of various objects in geographic coordinates. This allowed telescope operators to identify satellites, and also to identify and catalog potentially dangerous space debris.<ref name="ABER93">Abernathy, M., Houchard, J., Puccetti, M., and Lambert, J., "Debris Correlation Using the Rockwell WorldView System", Proceedings of 1993 Space Surveillance Workshop, 30 March to 1 April 1993, pages 189–195</ref>
| Line 321: | Line 224: | ||
Starting in 2003, the US Army integrated the SmartCam3D augmented reality system into the Shadow Unmanned Aerial System to aid sensor operators using telescopic cameras to locate people or points of interest. The system combined fixed geographic information including street names, points of interest, airports, and railroads with live video from the camera system. The system offered a "picture in picture" mode that allows it to show a synthetic view of the area surrounding the camera's field of view. This helps solve a problem in which the field of view is so narrow that it excludes important context, as if "looking through a soda straw". The system displays real-time friend/foe/neutral location markers blended with live video, providing the operator with improved situational awareness.
Combat reality can be simulated and represented using complex, layered data and visual aids, most of which are [[Head-mounted display|head-mounted displays]] (HMDs), which encompass any display technology that can be worn on the user's head.<ref>Pandher, Gurmeet Singh (2 March 2016). "Microsoft HoloLens Preorders: Price, Specs Of The Augmented Reality Headset". The Bitbag. Archived from the original on 4 March 2016. Retrieved 1 April 2016.</ref> Military training solutions are often built on [[commercial off-the-shelf]] (COTS) technologies, such as [[Improbable (company)|Improbable's]] synthetic environment platform, Virtual Battlespace 3 and VirTra, with the latter two platforms used by the [[United States Army]]. {{As of|2018}}, VirTra is being used by both civilian and military law enforcement to train personnel in a variety of scenarios, including active shooter, domestic violence, and military traffic stops.<ref>{{Cite news |author=VirTra Inc. |title=VirTra's Police Training Simulators Chosen by Three of the Largest U.S. Law Enforcement Departments |url=https://globenewswire.com/news-release/2018/06/25/1528863/0/en/VirTra-s-Police-Training-Simulators-Chosen-by-Three-of-the-Largest-U-S-Law-Enforcement-Departments.html |access-date=22 August 2018 |work=GlobeNewswire News Room |language=en-US}}</ref><ref>{{Cite web |date=14 August 2017 |title=How do police use VR? Very well {{!}} Police Foundation |url=https://www.policefoundation.org/virtual-reality-technology-changes-the-game-for-law-enforcement-training/ |archive-url=https://web.archive.org/web/20200222113548/https://www.policefoundation.org/virtual-reality-technology-changes-the-game-for-law-enforcement-training/ |archive-date=22 February 2020 |access-date=22 August 2018 |website=www.policefoundation.org |language=en-US}}</ref>
In 2017, the U.S. Army was developing the Synthetic Training Environment (STE), a collection of technologies for training purposes that was expected to include mixed reality. {{As of|2018}}, STE was still in development without a projected completion date. Some recorded goals of STE included enhancing realism, increasing simulation training capabilities, and making STE available to other systems.<ref>{{cite thesis |last1=Eagen |first1=Andrew S |title=Expanding Simulations as a Means of Tactical Training with Multinational Partners |date=2017 |id={{DTIC|AD1038670}}}}{{pn|date=April 2025}}</ref>
It was claimed that mixed-reality environments like STE could reduce training costs,<ref>{{cite journal |last1=Bukhari |first1=Hatim |last2=Andreatta |first2=Pamela |last3=Goldiez |first3=Brian |last4=Rabelo |first4=Luis |date=January 2017 |title=A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care |journal=Inquiry |volume=54 |article-number=0046958016687176 |doi=10.1177/0046958016687176 |pmc=5798742 |pmid=28133988}}</ref><ref>{{cite journal |last1=Smith |first1=Roger |date=February 2010 |title=The Long History of Gaming in Military Training |journal=Simulation & Gaming |volume=41 |issue=1 |pages=6–19 |doi=10.1177/1046878109334330}}</ref> such as reducing the amount of [[ammunition]] expended during training.<ref>Shufelt, Jr., J.W. (2006) A Vision for Future Virtual Training. In Virtual Media for Military Applications (pp. KN2-1 – KN2-12). Meeting Proceedings RTO-MP-HFM-136, Keynote 2. Neuilly-sur-Seine, France: RTO. Available from: [http://www.rto.nato.int/abstracts.asp Mixed Reality (MR)] {{Webarchive|url=https://web.archive.org/web/20070613170605/http://www.rto.nato.int/Abstracts.asp|date=13 June 2007}}</ref> In 2018, it was reported that STE would include representation of any part of the world's terrain for training purposes.<ref>{{Cite web |title=STAND-TO! |url=https://www.army.mil/standto/2018-03-26 |access-date=22 August 2018 |website=www.army.mil |language=en}}</ref> STE would offer a variety of training opportunities for squad, brigade, and combat teams, including [[Stryker]], armory, and infantry teams.<ref>{{Cite web |title=Augmented reality may revolutionize Army training |url=https://www.arl.army.mil/www/default.cfm?article=3042 |archive-url=https://web.archive.org/web/20170810182853/http://www.arl.army.mil/www/default.cfm?article=3042 |archive-date=10 August 2017 |access-date=22 August 2018 |website=www.arl.army.mil |language=en}}</ref>
Researchers at USAF Research Lab (Calhoun, Draper et al.) found an approximately two-fold increase in the speed at which UAV sensor operators found points of interest using this technology.<ref>Calhoun, G. L., Draper, M. H., Abernathy, M. F., Delgado, F., and Patzek, M. "Synthetic Vision System for Improving Unmanned Aerial Vehicle Operator Situation Awareness," 2005 Proceedings of SPIE Enhanced and Synthetic Vision, Vol. 5802, pp. 219–230.</ref> This ability to maintain geographic awareness quantitatively enhances mission efficiency. The system is in use on the US Army RQ-7 Shadow and the MQ-1C Gray Eagle Unmanned Aerial Systems.
===Workplace===
In a research project, AR was used to facilitate collaboration among distributed team members via conferences with local and virtual participants. AR tasks included brainstorming and discussion meetings utilizing common visualization via touch screen tables, interactive digital whiteboards, shared design spaces and distributed control rooms.<ref>{{cite web |url=http://www.hog3d.net/ |title=Hand of God |author1=Stafford, Aaron |author2=Piekarski, Wayne |author3=Thomas, Bruce H. |access-date=18 December 2009 |archive-url=https://web.archive.org/web/20091207022651/http://www.hog3d.net/ |archive-date=7 December 2009 }}</ref><ref>{{cite journal |last1=Benford |first1=Steve |last2=Greenhalgh |first2=Chris |last3=Reynard |first3=Gail |last4=Brown |first4=Chris |last5=Koleva |first5=Boriana |s2cid=672378 |title=Understanding and constructing shared spaces with mixed-reality boundaries |journal=ACM Transactions on Computer-Human Interaction |date=1 September 1998 |volume=5 |issue=3 |pages=185–223 |doi=10.1145/292834.292836 |url=https://nottingham-repository.worktribe.com/output/23509780 }}</ref><ref>[http://mi-lab.org/projects/office-of-tomorrow/ Office of Tomorrow] {{Webarchive|url=https://web.archive.org/web/20100916015955/http://mi-lab.org/projects/office-of-tomorrow/ |date=16 September 2010 }} ''Media Interaction Lab''.</ref>
In industrial environments, augmented reality is proving to have a substantial impact, with use cases emerging across all aspects of the product lifecycle, from product design and new product introduction (NPI) to manufacturing, service and maintenance, and material handling and distribution. For example, labels were displayed on parts of a system to clarify operating instructions for a mechanic performing maintenance on a system.<ref>[https://web.archive.org/web/20110511082745/http://ngm.nationalgeographic.com/big-idea/14/augmented-reality-pg1 The big idea: Augmented Reality]. Ngm.nationalgeographic.com (15 May 2012). Retrieved 9 June 2012.</ref><ref>{{cite web |url=http://graphics.cs.columbia.edu/projects/armar/ |title=Augmented Reality for Maintenance and Repair (ARMAR) |author1=Henderson, Steve |author2=Feiner, Steven |access-date=6 January 2010 |archive-date=6 March 2010 |archive-url=https://web.archive.org/web/20100306202422/http://graphics.cs.columbia.edu/projects/armar/ }}</ref> Assembly lines have benefited from the use of AR. In addition to Boeing, BMW and Volkswagen were known for incorporating this technology into assembly lines for monitoring process improvements.<ref>Sandgren, Jeffrey. [http://brandtechnews.net/tag/augmented-reality/ The Augmented Eye of the Beholder] {{Webarchive|url=https://web.archive.org/web/20130621054848/http://brandtechnews.net/tag/augmented-reality/ |date=21 June 2013 }}, ''BrandTech News'' 8 January 2011.</ref><ref>Cameron, Chris. [http://www.slideshare.net/readwriteweb/augmented-reality-for-marketers-and-developers-analysis-of-the-leaders-the-challenges-and-the-future Augmented Reality for Marketers and Developers], ''ReadWriteWeb''.</ref><ref>Dillow, Clay [http://www.popsci.com/scitech/article/2009-09/bmw-developing-augmented-reality-help-mechanics BMW Augmented Reality Glasses Help Average Joes Make Repairs], ''Popular Science'' September 2009.</ref> Large machines are difficult to maintain because of their multiple layers or structures. AR permits people to look through the machine as if with an X-ray, pointing them to the problem right away.<ref>King, Rachael. [https://web.archive.org/web/20120704074014/http://www.businessweek.com/stories/2009-11-03/augmented-reality-goes-mobilebusinessweek-business-news-stock-market-and-financial-advice Augmented Reality Goes Mobile], ''Bloomberg Business Week Technology'' 3 November 2009.</ref>
As AR technology has progressed, the impact of AR in enterprise has grown. In the ''Harvard Business Review'', Magid Abraham and Marco Annunziata discussed how AR devices are now being used to "boost workers' productivity on an array of tasks the first time they're used, even without prior training".<ref name="Abraham-2017">{{Cite journal|url=https://hbr.org/2017/03/augmented-reality-is-already-improving-worker-performance|title=Augmented Reality Is Already Improving Worker Performance|last1=Abraham|first1=Magid|last2=Annunziata|first2=Marco|date=13 March 2017|journal=[[Harvard Business Review]]|access-date=13 January 2019}}</ref> They contend that "these technologies increase productivity by making workers more skilled and efficient, and thus have the potential to yield both more economic growth and better jobs".<ref name="Abraham-2017" />
AR has become common in sports telecasting. Sports and entertainment venues are provided with see-through and overlay augmentation through tracked camera feeds for enhanced viewing by the audience. Examples include the yellow "[[first down]]" line seen in television broadcasts of [[American football]] games showing the line the offensive team must cross to receive a first down. AR is also used in association with football and other sporting events to show commercial advertisements overlaid onto the view of the playing area. Sections of [[rugby football|rugby]] fields and [[cricket]] pitches also display sponsored images. Swimming telecasts often add a line across the lanes to indicate the position of the current record holder as a race proceeds to allow viewers to compare the current race to the best performance. Other examples include hockey puck tracking and annotations of racing car performance<ref>Archived at [https://ghostarchive.org/varchive/youtube/20211211/1jQUkqqnZIc Ghostarchive]{{cbignore}} and the [https://web.archive.org/web/20210714184600/https://www.youtube.com/watch?v=1jQUkqqnZIc Wayback Machine]{{cbignore}}: {{Citation|title=Arti AR highlights at SRX -- the first sports augmented reality live from a moving car!| date=14 July 2021 |url=https://www.youtube.com/watch?v=1jQUkqqnZIc|language=en|access-date=2021-07-14}}{{cbignore}}</ref> and snooker ball trajectories.<ref name="recentadvances">[[Azuma, Ronald]]; Balliot, Yohan; Behringer, Reinhold; Feiner, Steven; Julier, Simon; MacIntyre, Blair. [http://www.cc.gatech.edu/~blair/papers/ARsurveyCGA.pdf Recent Advances in Augmented Reality] ''Computers & Graphics'', November 2001.</ref><ref>Marlow, Chris. [http://www.dmwmedia.com/news/2012/04/27/hey-hockey-puck-nhl-preplay-adds-a-second-screen-experience-to-live-games Hey, hockey puck! NHL PrePlay adds a second-screen experience to live games], ''digitalmediawire'' 27 April 2012.</ref>
AR has been used to enhance concert and theater performances. For example, artists allow listeners to augment their listening experience by adding their performance to that of other bands/groups of users.<ref>{{cite book |doi=10.1109/ART.2002.1107010 |chapter=The Duran Duran project: The augmented reality toolkit in live performance |title=The First IEEE International Workshop Agumented Reality Toolkit |page=2 |year=2002 |last1=Pair |first1=J. |last2=Wilson |first2=J. |last3=Chastine |first3=J. |last4=Gandy |first4=M. |s2cid=55820154 |isbn=0-7803-7680-3 }}</ref><ref>Broughall, Nick. [http://www.gizmodo.com.au/2009/10/sydney-band-uses-augmented-reality-for-video-clip/ Sydney Band Uses Augmented Reality For Video Clip.] ''Gizmodo'', 19 October 2009.</ref><ref>Pendlebury, Ty. [http://www.cnet.com.au/augmented-reality-in-aussie-film-clip-339299097.htm Augmented reality in Aussie film clip]. ''CNet'' 19 October 2009.</ref>
===Tourism and sightseeing===
=== Human-in-the-loop operation of robots ===
Recent advances in mixed-reality technologies have renewed interest in alternative modes of communication for human-robot interaction.<ref>{{cite book |last1=Chakraborti |first1=Tathagata |title=2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) |last2=Sreedharan |first2=Sarath |last3=Kulkarni |first3=Anagha |last4=Kambhampati |first4=Subbarao |date=2018 |isbn=978-1-5386-8094-0 |pages=4476–4482 |chapter=Projection-Aware Task Planning and Execution for Human-in-the-Loop Operation of Robots in a Mixed-Reality Workspace |doi=10.1109/IROS.2018.8593830}}</ref> Human operators wearing augmented reality headsets such as [[Microsoft HoloLens|HoloLens]] can interact with (control and monitor) robots and lifting machines<ref name="Tu 9480">{{cite journal |last1=Tu |first1=Xinyi |last2=Autiosalo |first2=Juuso |last3=Jadid |first3=Adnane |last4=Tammi |first4=Kari |last5=Klinker |first5=Gudrun |date=12 October 2021 |title=A Mixed Reality Interface for a Digital Twin Based Crane |journal=Applied Sciences |volume=11 |issue=20 |page=9480 |doi=10.3390/app11209480 |doi-access=free}}</ref> on site in a digital factory setup. This use case typically requires real-time data communication between the mixed reality interface and the machine, process, or system, which could be enabled by incorporating [[Digital twin|digital twin]] technology.<ref name="Tu 9480" />
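As a rough illustration of the real-time data link such a setup needs, the following is a minimal sketch of an AR-side telemetry subscriber. The WebSocket endpoint and JSON message format here are invented for illustration; they are not the interface used in the cited crane/digital-twin work.

```python
# Hypothetical telemetry subscriber for an AR overlay client (sketch only).
# Assumes a WebSocket endpoint that pushes JSON messages such as
# {"joint_angles": [...], "load_kg": 12.5}; the endpoint and schema are
# invented for illustration, not taken from the cited system.
import asyncio
import json
import websockets  # pip install websockets

async def stream_telemetry(uri="ws://machine.local:9001/telemetry"):
    async with websockets.connect(uri) as ws:
        async for message in ws:
            state = json.loads(message)
            # A real AR client would update its pose and annotation overlays here.
            print("update overlay with:", state)

if __name__ == "__main__":
    asyncio.run(stream_telemetry())
```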
==Apps==
* [[Jeri Ellsworth]] headed a research effort for [[Valve Corporation|Valve]] on augmented reality (AR), later taking that research to her own start-up [[CastAR]]. The company, founded in 2013, eventually shuttered. Later, she founded Tilt Five, another AR start-up based on the same technology, with the purpose of creating a device for digital [[board game]]s.<ref>{{Cite news|url=https://www.nytimes.com/2019/10/24/technology/jeri-ellsworth-augmented-reality.html|title=Always Building, From the Garage to Her Company|last=Markoff|first=John|date=2019-10-24|work=The New York Times|access-date=2019-12-12|language=en-US|issn=0362-4331}}</ref>
* [[Steve Mann (inventor)|Steve Mann]] formulated an earlier concept of [[mediated reality]] in the 1970s and 1980s, using cameras, processors, and display systems to modify visual reality to help people see better (dynamic range management), building computerized welding helmets, as well as "augmediated reality" vision systems for use in everyday life. He is also an adviser to [[Meta (augmented reality company)|Meta]].<ref>{{cite journal |last1=Mann |first1=S. |title=Wearable computing: a first step toward personal imaging |journal=Computer |date=1997 |volume=30 |issue=2 |pages=25–32 |doi=10.1109/2.566147 |s2cid=28001657 }}</ref>
* [[Dieter Schmalstieg]] and Daniel Wagner developed marker tracking systems for mobile phones and PDAs in 2009.<ref>{{cite book |url=http://portal.acm.org/citation.cfm?id=946910 |title=First Steps Towards Handheld Augmented Reality |author=Wagner, Daniel |date=29 September 2009 |publisher=ACM |access-date=29 September 2009|isbn=978-0-7695-2034-6 }}</ref>
* [[Ivan Sutherland]] invented the [[The Sword of Damocles (virtual reality)|first VR head-mounted display]] at [[Harvard University]].
Augmented reality (AR), also known as mixed reality (MR), is a technology that overlays real-time 3D-rendered computer graphics onto a portion of the real world through a display, such as a handheld device or head-mounted display. This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment.[1] In this way, augmented reality alters one's ongoing perception of a real-world environment, compared to virtual reality, which aims to completely replace the user's real-world environment with a simulated one.[2][3] Augmented reality is typically visual, but can span multiple sensory modalities, including auditory, haptic, and somatosensory.[4]
The primary value of augmented reality is the manner in which components of a digital world blend into a person's perception of the real world, through the integration of immersive sensations, which are perceived as real in the user's environment. The earliest functional AR systems that provided immersive mixed reality experiences for users were invented in the early 1990s, starting with the Virtual Fixtures system developed at the U.S. Air Force's Armstrong Laboratory in 1992.[1][5][6] Commercial augmented reality experiences were first introduced in entertainment and gaming businesses.[7] Subsequently, augmented reality applications have spanned industries such as education, communications, medicine, and entertainment.
Augmented reality can be used to enhance natural environments or situations and offers perceptually enriched experiences. With the help of advanced AR technologies (e.g. adding computer vision, incorporating AR cameras into smartphone applications, and object recognition) the information about the surrounding real world of the user becomes interactive and digitally manipulated.[8] Information about the environment and its objects is overlaid on the real world. This information can be virtual or real, e.g. seeing other real sensed or measured information such as electromagnetic radio waves overlaid in exact alignment with where they actually are in space.[9][10][11] Augmented reality also has a lot of potential in the gathering and sharing of tacit knowledge. Immersive perceptual information is sometimes combined with supplemental information like scores over a live video feed of a sporting event. This combines the benefits of both augmented reality technology and heads up display technology (HUD).
Augmented reality frameworks include ARKit and ARCore. Commercial augmented reality headsets include the Magic Leap 1 and HoloLens. A number of companies have promoted the concept of smartglasses that have augmented reality capability.
Augmented reality can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects.[12] The overlaid sensory information can be constructive (i.e. additive to the natural environment), or destructive (i.e. masking of the natural environment).[1] As such, it is one of the key technologies in the reality-virtuality continuum.[13] Augmented reality refers to experiences that are artificial and that add to the already existing reality.[14][15][16]
Comparison with mixed reality/virtual reality
Augmented reality (AR) is largely synonymous with mixed reality (MR). There is also overlap in terminology with extended reality and computer-mediated reality. However, in the 2020s, the differences between AR and MR began to be emphasized.[17][18]
Mixed reality (MR) is an advanced technology that extends beyond augmented reality (AR) by seamlessly integrating the physical and virtual worlds.[19] In MR, users are not only able to view digital content within their real environment but can also interact with it as if it were a tangible part of the physical world.[20] This is made possible through devices such as Meta Quest 3S and Apple Vision Pro, which utilize multiple cameras and sensors to enable real-time interaction between virtual and physical elements.[21] Mixed reality that incorporates haptics has sometimes been referred to as visuo-haptic mixed reality.[22][23]
In virtual reality (VR), the users' perception is completely computer-generated, whereas with augmented reality (AR), it is partially generated and partially from the real world.[24][25] For example, in architecture, VR can be used to create a walk-through simulation of the inside of a new building, and AR can be used to show a building's structures and systems superimposed on a real-life view. Another example is the use of utility applications. Some AR applications, such as Augment, enable users to place digital objects in real environments, allowing businesses to use augmented reality devices as a way to preview their products in the real world.[26] Similarly, it can also be used to demo what products may look like in an environment for customers, as demonstrated by companies such as Mountain Equipment Co-op or Lowe's, who use augmented reality to allow customers to preview what their products might look like at home.[27]
Augmented reality (AR) differs from virtual reality (VR) in that in AR the surrounding environment is 'real' and virtual objects are merely added to it, whereas in VR the surrounding environment is completely virtual and computer generated. A demonstration of how AR layers objects onto the real world can be seen with augmented reality games. WallaMe is an augmented reality game application that allows users to hide messages in real environments, using geolocation technology to let users hide messages wherever they wish in the world.[28]
In a physics context, the term "interreality system" refers to a virtual reality system coupled with its real-world counterpart.[29] A 2007 paper describes an interreality system comprising a real physical pendulum coupled to a pendulum that only exists in virtual reality.[30] This system has two stable states of motion: a "dual reality" state in which the motion of the two pendula are uncorrelated, and a "mixed reality" state in which the pendula exhibit stable phase-locked motion, which is highly correlated. The use of the terms "mixed reality" and "interreality" is clearly defined in the context of physics and may be slightly different in other fields, however, it is generally seen as, "bridging the physical and virtual world".[31]
Recent improvements in AR and VR headsets have increased display quality, widened the field of view, and made motion tracking more accurate, which makes augmented experiences more immersive. Improvements in sensor calibration, lightweight optics, and wireless connectivity have also made it easier for users to move around comfortably.[32]
According to a market analysis, the global market for AR and VR headsets was valued at $10.3 billion in 2024 and is projected to exceed $105 billion by 2035, a compound annual growth rate of more than 25%. Adoption is increasing in gaming, healthcare, education, and industrial training as hardware costs fall and content ecosystems expand.[33]
History
- 1901: Author L. Frank Baum, in his science-fiction novel The Master Key, first mentions the idea of an electronic display/spectacles that overlays data onto real life (in this case 'people'). It is named a 'character marker'.[34]
- Heads-up displays (HUDs), a precursor technology to augmented reality, were first developed for pilots in the 1950s, projecting simple flight data into their line of sight and thereby enabling them to keep their "heads up" rather than looking down at the instruments; the HUD itself is a transparent display.
- 1968: Ivan Sutherland creates the first head-mounted display that has graphics rendered by a computer.[35]
- 1975: Myron Krueger creates Videoplace to allow users to interact with virtual objects.
- 1980: The research by Gavan Lintern of the University of Illinois is the first published work to show the value of a heads up display for teaching real-world flight skills.[36]
- 1980: Steve Mann creates the first wearable computer, a computer vision system with text and graphical overlays on a photographically mediated scene.[37]
- 1986: Within IBM, Ron Feigenblatt describes the most widely experienced form of AR today (viz. "magic window," e.g. smartphone-based Pokémon Go), use of a small, "smart" flat panel display positioned and oriented by hand.[38][39]
- 1987: Douglas George and Robert Morris create a working prototype of an astronomical telescope-based "heads-up display" system (a precursor concept to augmented reality) which superimposed in the telescope eyepiece, over the actual sky images, multi-intensity star, and celestial body images, and other relevant information.[40]
- 1990: The term augmented reality is attributed to Thomas P. Caudell, a former Boeing researcher.[41]
- 1992: Louis Rosenberg developed one of the first functioning AR systems, called Virtual Fixtures, at the United States Air Force Research Laboratory—Armstrong, that demonstrated benefit to human perception.[42]
- 1992: Steven Feiner, Blair MacIntyre and Doree Seligmann present an early paper on an AR system prototype, KARMA, at the Graphics Interface conference.
- 1993: Mike Abernathy, et al., report the first use of augmented reality in identifying space debris using Rockwell WorldView by overlaying satellite geographic trajectories on live telescope video.[43]
- 1993: A widely cited version of the paper above is published in Communications of the ACM – Special issue on computer augmented environments, edited by Pierre Wellner, Wendy Mackay, and Rich Gold.[44]
- 1993: Loral WDL, with sponsorship from STRICOM, performed the first demonstration combining live AR-equipped vehicles and manned simulators. Unpublished paper, J. Barrilleaux, "Experiences and Observations in Applying Augmented Reality to Live Training", 1999.[45]
- 1995: S. Ravela et al. at University of Massachusetts introduce a vision-based system using monocular cameras to track objects (engine blocks) across views for augmented reality.[46][47]
- 1996: General Electric develops system for projecting information from 3D CAD models onto real-world instances of those models.[48]
- 1998: Spatial augmented reality introduced at University of North Carolina at Chapel Hill by Ramesh Raskar, Greg Welch, Henry Fuchs.[49]
- 1999: Frank Delgado, Mike Abernathy et al. report successful flight test of LandForm software video map overlay from a helicopter at Army Yuma Proving Ground overlaying video with runways, taxiways, roads and road names.[50][51]
- 1999: The US Naval Research Laboratory begins a decade-long research program called the Battlefield Augmented Reality System (BARS) to prototype some of the early wearable systems for dismounted soldiers operating in urban environments for situation awareness and training.[52]
- 1999: NASA X-38 flown using LandForm software video map overlays at Dryden Flight Research Center.[53]
- 2000: Rockwell International Science Center demonstrates tetherless wearable augmented reality systems receiving analog video and 3D audio over radio-frequency wireless channels. The systems incorporate outdoor navigation capabilities, with digital horizon silhouettes from a terrain database overlain in real time on the live outdoor scene, allowing visualization of terrain made invisible by clouds and fog.[54][55]
- 2004: An outdoor helmet-mounted AR system was demonstrated by Trimble Navigation and the Human Interface Technology Laboratory (HIT lab).[56]
- 2006: Outland Research develops an AR media player that overlays virtual content onto a user's view of the real world synchronously with playing music, thereby providing an immersive AR entertainment experience.[57][58]
- 2008: Wikitude AR Travel Guide launches on 20 Oct 2008 with the G1 Android phone.[59]
- 2009: ARToolkit was ported to Adobe Flash (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.[60]
- 2012: Launch of Lyteshot, an interactive AR gaming platform that utilizes smart glasses for game data
- 2013: Niantic releases "Ingress", an augmented reality mobile game for iOS and Android operating systems (and a predecessor of Pokémon Go).
- 2015: Microsoft announced the HoloLens augmented reality headset, which uses various sensors and a processing unit to display virtual imagery over the real world.[61]
- 2016: Niantic released Pokémon Go for iOS and Android in July 2016. The game quickly became one of the most popular smartphone applications and in turn spiked the popularity of augmented reality games.[62]
- 2018: Magic Leap launched the Magic Leap One augmented reality headset.[63] Leap Motion announced the Project North Star augmented reality headset, and later released it under an open source license.[64][65][66][67]
- 2019: Microsoft announced HoloLens 2 with significant improvements in terms of field of view and ergonomics.[68]
- 2022: Magic Leap launched the Magic Leap 2 headset.[69]
- 2023: Meta Quest 3, a mixed reality VR headset,[70] was developed by Reality Labs, a division of Meta Platforms. In the same year, the Apple Vision Pro was announced.
- 2024: Meta Platforms revealed the Orion AR glasses prototype.[71]
- 2025: Meta Platforms released their Meta Ray-Ban Display glasses, featuring a small AR HUD on the right eye.[72]
Hardware and displays
AR visuals appear on handheld devices (video passthrough) and head-mounted displays (optical see-through or video passthrough). Systems pair a display with sensors (e.g., cameras and IMUs) to register virtual content to the environment; research also explores near-eye optics, projection-based AR, and experimental concepts such as contact-lens or retinal-scanned displays.[73][74]
Head-mounted displays
AR HMDs place virtual imagery in the user's view using optical see-through or video passthrough and track head motion for stable registration.[75]
Handheld
Phone and tablet AR uses the rear camera (video passthrough) plus on-device SLAM/VIO for tracking.[76][77]
Head-up display
HUDs project information into the forward view; AR variants align graphics to the outside scene (e.g., lane guidance, hazards).[78]
Cave automatic virtual environment
Room-scale projection systems surround users with imagery for co-located, multi-user AR/VR.[79]
Contact lenses
Prototypes explore embedding display/antenna elements into lenses for glanceable AR; most work remains experimental.[80][81]
Virtual retinal display
VRD concepts scan imagery directly onto the retina for high-contrast viewing.[82]
Projection mapping
Projectors overlay graphics onto real objects/environments without head-worn displays (spatial AR).[83]
AR glasses
Glasses-style near-eye displays aim for lighter, hands-free AR; approaches vary in optics, tracking, and power.[75]
Tracking and registration
AR systems estimate device pose and scene geometry so virtual graphics stay aligned with the real world. Common approaches include visual–inertial odometry and SLAM for markerless tracking, and fiducial markers when known patterns are available; image registration and depth cues (e.g., occlusion, shadows) maintain realism.[74][84][85]
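As a concrete illustration of marker-based registration, the sketch below detects a fiducial marker with OpenCV's ArUco module, estimates the camera pose from the marker corners, and projects a virtual 3D point into the camera image. It is a minimal sketch, not any particular product's pipeline: the camera intrinsics and marker size are placeholder assumptions, and it assumes an opencv-contrib-python build that still exposes the legacy cv2.aruco.detectMarkers function (the ArUco API differs across OpenCV versions).

```python
# Minimal marker-based AR registration sketch (illustrative only).
# Assumes: opencv-contrib-python with the cv2.aruco module, a calibrated
# camera (placeholder intrinsics below), and a printed 5 cm ArUco marker.
import cv2
import numpy as np

MARKER_SIZE = 0.05  # marker side length in metres (assumed)
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])  # placeholder intrinsics
dist_coeffs = np.zeros(5)                    # assume no lens distortion

# 3D corners of the marker in its own coordinate frame (z = 0 plane).
half = MARKER_SIZE / 2.0
object_corners = np.array([[-half,  half, 0.0],
                           [ half,  half, 0.0],
                           [ half, -half, 0.0],
                           [-half, -half, 0.0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def overlay_virtual_point(frame):
    """Draw a virtual point 5 cm above the first detected marker."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return frame  # no marker found: nothing to register against
    # Camera pose relative to the marker from 2D-3D correspondences.
    ok, rvec, tvec = cv2.solvePnP(object_corners, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return frame
    # A virtual anchor 5 cm above the marker centre, projected into the image.
    virtual_point = np.array([[0.0, 0.0, 0.05]])
    img_pts, _ = cv2.projectPoints(virtual_point, rvec, tvec,
                                   camera_matrix, dist_coeffs)
    x, y = img_pts.ravel().astype(int)
    cv2.circle(frame, (x, y), 8, (0, 255, 0), -1)
    return frame
```

Markerless SLAM/VIO pipelines replace the marker-detection step with feature tracking and inertial fusion, but the registration idea (estimate pose, then project anchored content) is the same.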
Software and standards
AR runtimes provide sensing, tracking, and rendering pipelines; mobile platforms expose SDKs with camera access and spatial tracking. Interchange/geospatial formats such as ARML standardize anchors and content.[86][87][76]
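To illustrate what such formats standardize, here is a deliberately simplified, hypothetical anchor-plus-content model in Python. It is not the ARML schema or any SDK's API; it only shows the general idea that an anchor ties a piece of content to a real-world reference such as a geographic coordinate.

```python
# Hypothetical, simplified model of the "anchor + content" idea that
# AR interchange formats standardize. Not the ARML schema or a real SDK API.
from dataclasses import dataclass, asdict
import json

@dataclass
class GeoAnchor:
    latitude: float
    longitude: float
    altitude_m: float = 0.0

@dataclass
class ARContent:
    name: str
    model_uri: str       # e.g. a glTF/USDZ asset the runtime would load
    anchor: GeoAnchor

scene = [ARContent("info_panel", "https://example.com/panel.glb",
                   GeoAnchor(47.3769, 8.5417))]
print(json.dumps([asdict(item) for item in scene], indent=2))
```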
Interaction and input
Input commonly combines head/gaze with touch, controllers, voice, or hand tracking; audio and haptics can reduce visual load. Human-factors studies report performance benefits but also workload and safety trade-offs depending on task and context.[88][85]
Design considerations
Key usability factors include stable registration, legible contrast under varied lighting, and low motion-to-photon latency. Visual design often uses depth cues (occlusion, shadows) to support spatial judgment; safety-critical uses emphasize glanceable prompts and minimal interaction.[89][90][74]
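To make the latency point concrete, a rough back-of-the-envelope estimate (a sketch under stated assumptions, not a formal model) relates motion-to-photon latency to apparent registration error: during a head turn, virtual content lags by roughly the angular velocity multiplied by the latency.

```python
# Rough registration-error estimate for head rotation (illustrative only).
# Assumption: during a turn at head_rate deg/s, rendered content lags the
# real scene by roughly head_rate * latency degrees of visual angle.

def angular_error_deg(head_rate_deg_s: float, latency_ms: float) -> float:
    """Approximate angular misregistration caused by end-to-end latency."""
    return head_rate_deg_s * (latency_ms / 1000.0)

if __name__ == "__main__":
    for latency in (20, 50, 100):                 # motion-to-photon latency in ms
        err = angular_error_deg(100.0, latency)   # ~100 deg/s head turn (assumed)
        print(f"{latency} ms latency -> ~{err:.1f} deg apparent offset")
# Prints roughly: 20 ms -> 2.0 deg, 50 ms -> 5.0 deg, 100 ms -> 10.0 deg
```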
Applications
Augmented reality has been explored for many uses, including gaming, medicine, and entertainment. It has also been explored for education and business.[91] Some of the earliest cited examples range from augmented reality used to support surgery, by providing virtual overlays to guide medical practitioners, to AR content for astronomy and welding.[6][92] Example application areas described below include archaeology, architecture, commerce and education.
Education and training
Overlays models and step-by-step guidance in real settings (e.g., anatomy, maintenance); systematic reviews report learning benefits alongside design and implementation caveats that vary by context and task.[93][94][95]
Medicine
Guidance overlays and image fusion support planning and intraoperative visualization across several specialties; reviews note accuracy/registration constraints and workflow integration issues.[96][97][98]
Industry
Hands-free work instructions, inspection, and remote assistance tied to assets; evidence highlights productivity gains alongside limits around tracking robustness, ergonomics, and change management.[99][100][101]
Entertainment and games
Location-based and camera-based play place virtual objects in real spaces; recent surveys cover design patterns, effectiveness, and safety/attention trade-offs.[102][103][104]
Augmented reality navigation overlays route guidance or hazard cues onto the real scene, typically via smartphone "live view" or in-vehicle heads-up displays. Research finds AR can improve wayfinding and driver situation awareness, but human-factors trade-offs (distraction, cognitive load, occlusion) matter for safety-critical use.[105][106][107][108]
See also: Head-up display, Automotive navigation system, Wayfinding
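As a simplified illustration of how a navigation overlay can be placed in the view, the sketch below (hypothetical; a flat, small-angle approximation rather than any production SDK) converts a GPS waypoint into a bearing relative to the device's compass heading and maps it to a horizontal screen position. Real systems also account for camera intrinsics, pitch and roll, and map-matched routes.

```python
# Hypothetical AR-navigation sketch: place a waypoint marker on screen
# from device GPS position, compass heading, and camera field of view.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def screen_x(device_heading_deg, waypoint_bearing_deg, image_width_px, hfov_deg=60.0):
    """Horizontal pixel position of the waypoint, or None if outside the view."""
    rel = (waypoint_bearing_deg - device_heading_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > hfov_deg / 2.0:
        return None  # waypoint is outside the camera's horizontal field of view
    return int((rel / hfov_deg + 0.5) * image_width_px)

# Example: waypoint bearing computed from two coordinates, device facing
# roughly toward it, 1280 px wide camera frame.
b = bearing_deg(48.8584, 2.2945, 48.8606, 2.3376)
print(screen_x(b - 10.0, b, 1280))  # marker appears right of centre
```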
Architecture, engineering, and construction
In the AEC sector, AR is used for design visualization, on-site verification against BIM models, clash detection, and guided assembly/inspection. Systematic reviews report benefits for communication and error reduction, while noting limits around tracking robustness and workflow integration.[109][110][111]
Archaeology
AR has been used to aid archaeological research. By augmenting archaeological features onto the modern landscape, AR allows archaeologists to formulate possible site configurations from extant structures.[112] Computer generated models of ruins, buildings, landscapes or even ancient people have been recycled into early archaeological AR applications.[113][114][115] For example, implementing a system like VITA (Visual Interaction Tool for Archaeology) will allow users to imagine and investigate instant excavation results without leaving their home. Each user can collaborate by mutually "navigating, searching, and viewing data". Hrvoje Benko, a researcher in the computer science department at Columbia University, points out that these particular systems and others like them can provide "3D panoramic images and 3D models of the site itself at different excavation stages" all the while organizing much of the data in a collaborative way that is easy to use. Collaborative AR systems supply multimodal interactions that combine the real world with virtual images of both environments.[116]
Commerce
AR is used to integrate print and video marketing. Printed marketing material can be designed with certain "trigger" images that, when scanned by an AR-enabled device using image recognition, activate a video version of the promotional material. A major difference between augmented reality and straightforward image recognition is that one can overlay multiple media at the same time in the view screen, such as social media share buttons, in-page video, audio, and 3D objects. Traditional print-only publications are using augmented reality to connect different types of media.[117][118][119][120][121]
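A hedged sketch of the underlying recognition step: the illustrative snippet below (not any vendor's actual pipeline; thresholds are assumptions) checks whether a known printed "trigger" image appears in a camera frame by matching ORB features, which is the kind of test an app could run before overlaying video or 3D content.

```python
# Illustrative trigger-image check using ORB feature matching (OpenCV).
# Assumption: "trigger.png" is the known print advertisement and frame_gray
# is a grayscale camera capture; match counts/thresholds are illustrative.
import cv2

orb = cv2.ORB_create(nfeatures=1000)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

trigger = cv2.imread("trigger.png", cv2.IMREAD_GRAYSCALE)
kp_t, des_t = orb.detectAndCompute(trigger, None)

def trigger_detected(frame_gray, min_matches=40):
    """Return True if enough ORB features of the trigger image match the frame."""
    kp_f, des_f = orb.detectAndCompute(frame_gray, None)
    if des_f is None:
        return False
    matches = bf.match(des_t, des_f)
    good = [m for m in matches if m.distance < 40]  # Hamming-distance cutoff
    return len(good) >= min_matches
```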
AR can enhance product previews such as allowing a customer to view what's inside a product's packaging without opening it.[122] AR can also be used as an aid in selecting products from a catalog or through a kiosk. Scanned images of products can activate views of additional content such as customization options and additional images of the product in its use.[123]
In 2018, Apple announced Universal Scene Description (USDZ) AR file support for iPhones and iPads with iOS 12. Apple has created an AR QuickLook Gallery that allows people to experience augmented reality through their own Apple device.[124]
In 2018, Shopify, the Canadian e-commerce company, announced AR Quick Look integration. Their merchants will be able to upload 3D models of their products and their users will be able to tap on the models inside the Safari browser on their iOS devices to view them in their real-world environments.[125]
In 2018, Twinkl released a free AR classroom application. Pupils can see how York looked over 1,900 years ago.[126] Twinkl launched the first ever multi-player AR game, Little Red[127] and has over 100 free AR educational models.[128]
Augmented reality is becoming more frequently used for online advertising. Retailers offer the ability to upload a picture on their website and "try on" various clothes which are overlaid on the picture. Even further, companies such as Bodymetrics install dressing booths in department stores that offer full-body scanning. These booths render a 3D model of the user, allowing the consumers to view different outfits on themselves without the need of physically changing clothes.[129] For example, JC Penney and Bloomingdale's use "virtual dressing rooms" that allow customers to see themselves in clothes without trying them on.[130] Another store that uses AR to market clothing to its customers is Neiman Marcus.[131] Neiman Marcus offers consumers the ability to see their outfits in a 360-degree view with their "memory mirror".[131] Makeup stores like L'Oreal, Sephora, Charlotte Tilbury, and Rimmel also have apps that utilize AR.[132] These apps allow consumers to see how the makeup will look on them.[132] According to Greg Jones, director of AR and VR at Google, augmented reality is going to "reconnect physical and digital retail".[132]
AR technology is also used by furniture retailers such as IKEA, Houzz, and Wayfair.[132][130] These retailers offer apps that allow consumers to view their products in their home prior to purchasing anything.[132][133] In 2017, Ikea announced the Ikea Place app. It contains a catalogue of over 2,000 products (nearly the company's full collection of sofas, armchairs, coffee tables, and storage units) which one can place anywhere in a room with their phone.[134] The app made it possible to have 3D and true-to-scale models of furniture in the customer's living space. IKEA realized that their customers are not shopping in stores as often or making direct purchases anymore.[135][136] Shopify's acquisition of Primer, an AR app, aims to push small and medium-sized sellers towards interactive AR shopping with easy-to-use AR integration and user experience for both merchants and consumers. AR helps the retail industry reduce operating costs. Merchants upload product information to the AR system, and consumers can use mobile terminals to search and generate 3D maps.[137]
Literature
The first description of AR as it is known today was in Virtual Light, the 1994 novel by William Gibson.
Fitness
AR hardware and software for use in fitness includes smart glasses made for biking and running, with performance analytics and map navigation projected onto the user's field of vision,[138] and boxing, martial arts, and tennis, where users remain aware of their physical environment for safety.[139] Fitness-related games and software include Pokémon Go and Jurassic World Alive.[140]
Emergency management/search and rescue
Augmented reality systems are used in public safety situations, from super storms to suspects at large.
As early as 2009, two articles from Emergency Management discussed AR technology for emergency management. The first was "Augmented Reality—Emerging Technology for Emergency Management", by Gerald Baron.[141] According to Adam Crow: "Technologies like augmented reality (ex: Google Glass) and the growing expectation of the public will continue to force professional emergency managers to radically shift when, where, and how technology is deployed before, during, and after disasters."[142]
Another early example was a search aircraft looking for a lost hiker in rugged mountain terrain. Augmented reality systems provided aerial camera operators with a geographic awareness of forest road names and locations blended with the camera video. The camera operator was better able to search for the hiker knowing the geographic context of the camera image. Once located, the operator could more efficiently direct rescuers to the hiker's location because the geographic position and reference landmarks were clearly labeled.[143]
Social interaction
AR can be used to facilitate social interaction; however, the use of an AR headset can inhibit the quality of an interaction between two people if one of them is not wearing a headset, or if the headset becomes a distraction.[144]
Augmented reality also gives users the ability to practice different forms of social interactions with other people in a safe, risk-free environment. Hannes Kauffman, Associate Professor for virtual reality at TU Vienna, says: "In collaborative augmented reality multiple users may access a shared space populated by virtual objects, while remaining grounded in the real world. This technique is particularly powerful for educational purposes when users are collocated and can use natural means of communication (speech, gestures, etc.), but can also be mixed successfully with immersive VR or remote collaboration." Kauffman cites education as a potential use of this technology.
Healthcare planning, practice and education
One of the first applications of augmented reality was in healthcare, particularly to support the planning, practice, and training of surgical procedures. As far back as 1992, enhancing human performance during surgery was a formally stated objective when building the first augmented reality systems at U.S. Air Force laboratories.[1] AR provides surgeons with patient monitoring data in the style of a fighter pilot's heads-up display, and allows patient imaging records, including functional videos, to be accessed and overlaid. Examples include a virtual X-ray view based on prior tomography or on real-time images from ultrasound and confocal microscopy probes,[145] visualizing the position of a tumor in the video of an endoscope,[146] or radiation exposure risks from X-ray imaging devices.[147][148] AR can enhance viewing a fetus inside a mother's womb.[149] Siemens, Karl Storz and IRCAD have developed a system for laparoscopic liver surgery that uses AR to view sub-surface tumors and vessels.[150] AR has been used for cockroach phobia treatment[151] and to reduce the fear of spiders.[152] Patients wearing augmented reality glasses can be reminded to take medications.[153] Augmented reality can be very helpful in the medical field.[154] It could be used to provide crucial information to a doctor or surgeon without having them take their eyes off the patient.
On 30 April 2015, Microsoft announced the Microsoft HoloLens, their first attempt at augmented reality. The HoloLens is capable of displaying images for image-guided surgery.[155] As augmented reality advances, it finds increasing applications in healthcare. Augmented reality and similar computer-based utilities are being used to train medical professionals.[156][157] In healthcare, AR can be used to provide guidance during diagnostic and therapeutic interventions, e.g. during surgery. Magee et al.,[158] for instance, describe the use of augmented reality for medical training in simulating ultrasound-guided needle placement. Recently, augmented reality began seeing adoption in neurosurgery, a field that requires extensive imaging before procedures.[159]
Smartglasses can be incorporated into the operating room to aid in surgical procedures, for example by displaying patient data conveniently while overlaying precise visual guides for the surgeon.[160][161] Augmented reality headsets like the Microsoft HoloLens have been theorized to allow efficient sharing of information between doctors, in addition to providing a platform for enhanced training.[162][161] In some situations (for example, when a patient is infected with a contagious disease), this can improve doctor safety and reduce PPE use.[163] While mixed reality has considerable potential for enhancing healthcare, it also has drawbacks.[161] The technology may never fully integrate into scenarios in which a patient is present, as there are ethical concerns surrounding the doctor not being able to see the patient.[161] Mixed reality is also useful for healthcare education. For example, according to a 2022 report from the World Economic Forum, 85% of first-year medical students at Case Western Reserve University reported that mixed reality for teaching anatomy was "equivalent" or "better" than the in-person class.[164]
Spatial immersion and interaction
Augmented reality applications running on handheld devices used as virtual reality headsets can also digitize human presence in space and provide a computer-generated model of the user in a virtual space, where they can interact and perform various actions. Such capabilities are demonstrated by Project Anywhere, developed by a postgraduate student at ETH Zurich, which was dubbed an "out-of-body experience".[165][166][167]
Flight training
Building on decades of perceptual-motor research in experimental psychology, researchers at the Aviation Research Laboratory of the University of Illinois at Urbana–Champaign used augmented reality in the form of a flight path in the sky to teach flight students how to land an airplane using a flight simulator. An adaptive augmented schedule in which students were shown the augmentation only when they departed from the flight path proved to be a more effective training intervention than a constant schedule.[36][168] Flight students taught to land in the simulator with the adaptive augmentation learned to land a light aircraft more quickly than students with the same amount of landing training in the simulator but with constant augmentation or without any augmentation.[36]
Military
The first fully immersive system was the Virtual Fixtures platform, which was developed in 1992 by Louis Rosenberg at the Armstrong Laboratories of the United States Air Force.[169] It enabled human users to control robots in real-world environments that included real physical objects and 3D virtual overlays ("fixtures") added to enhance human performance of manipulation tasks. Published studies showed that by introducing virtual objects into the real world, significant performance increases could be achieved by human operators.[169][170][171]
An early application of AR occurred when Rockwell International created video map overlays of satellite and orbital debris tracks to aid in space observations at the Air Force Maui Optical System. In their 1993 paper "Debris Correlation Using the Rockwell WorldView System", the authors describe the use of map overlays applied to video from space surveillance telescopes. The map overlays indicated the trajectories of various objects in geographic coordinates, allowing telescope operators to identify satellites and to identify and catalog potentially dangerous space debris.[43]
Starting in 2003, the US Army integrated the SmartCam3D augmented reality system into the Shadow Unmanned Aerial System to aid sensor operators using telescopic cameras to locate people or points of interest. The system combined fixed geographic information, including street names, points of interest, airports, and railroads, with live video from the camera system. It offered a "picture in picture" mode that could show a synthetic view of the area surrounding the camera's field of view. This helps solve a problem in which the field of view is so narrow that it excludes important context, as if "looking through a soda straw". The system displays real-time friend/foe/neutral location markers blended with live video, giving the operator improved situational awareness.
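The core operation behind such geo-registered overlays is projecting a known world coordinate (for example, a road intersection) into the camera image using the camera's pose and intrinsic parameters, then drawing a label at the resulting pixel. The following sketch illustrates only that projection step under assumed values; it is a hypothetical example, not the SmartCam3D implementation, and the camera parameters and landmark are invented for illustration.

```python
import numpy as np

def project_to_pixel(point_world, cam_position, cam_rotation, K):
    """Project a 3D point in a local world frame (metres) into pixel coordinates.

    cam_rotation: 3x3 matrix rotating world coordinates into the camera frame.
    K: 3x3 camera intrinsic matrix (focal lengths and principal point).
    Returns (u, v) or None if the point is behind the camera.
    """
    p_cam = cam_rotation @ (point_world - cam_position)   # world -> camera frame
    if p_cam[2] <= 0:                                      # behind the image plane
        return None
    uvw = K @ p_cam                                        # pinhole projection
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Hypothetical example: camera 120 m above the origin looking straight down,
# labelling a road intersection 30 m east and 20 m north of the origin.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
cam_position = np.array([0.0, 0.0, 120.0])
cam_rotation = np.array([[1.0,  0.0,  0.0],    # 180-degree flip about x: looking down
                         [0.0, -1.0,  0.0],
                         [0.0,  0.0, -1.0]])
landmark = {"name": "Main St / 3rd Ave", "position": np.array([30.0, 20.0, 0.0])}

pixel = project_to_pixel(landmark["position"], cam_position, cam_rotation, K)
if pixel is not None:
    print(f"Draw label '{landmark['name']}' at pixel {pixel}")
```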
Combat reality can be simulated and represented using complex, layered data and visual aids, most of which are delivered through head-mounted displays (HMDs), a category that encompasses any display technology that can be worn on the user's head.[172] Military training solutions are often built on commercial off-the-shelf (COTS) technologies, such as Improbable's synthetic environment platform, Virtual Battlespace 3 and VirTra, with the latter two platforms used by the United States Army. VirTra has been used by both civilian and military law enforcement to train personnel in a variety of scenarios, including active shooter, domestic violence, and military traffic stops.[173][174]
In 2017, the U.S. Army was developing the Synthetic Training Environment (STE), a collection of technologies for training purposes that was expected to include mixed reality. STE remained in development with no projected completion date. Stated goals of STE included enhancing realism, increasing simulation training capabilities, and making STE available to other systems.[175]
It was claimed that mixed-reality environments like STE could reduce training costs,[176][177] for example by reducing the amount of ammunition expended during training.[178] In 2018, it was reported that STE would include representation of any part of the world's terrain for training purposes.[179] STE would offer a variety of training opportunities for squad, brigade, and combat teams, including Stryker, armory, and infantry teams.[180]
Researchers at USAF Research Lab (Calhoun, Draper et al.) found an approximately two-fold increase in the speed at which UAV sensor operators found points of interest using this technology.[181] This ability to maintain geographic awareness quantitatively enhances mission efficiency. The system is in use on the US Army RQ-7 Shadow and the MQ-1C Gray Eagle Unmanned Aerial Systems.
In combat, AR can serve as a networked communication system that renders useful battlefield data onto a soldier's goggles in real time. From the soldier's viewpoint, people and various objects can be marked with special indicators to warn of potential dangers. Virtual maps and 360° camera imaging can also be rendered to aid a soldier's navigation and battlefield perspective, and this can be transmitted to military leaders at a remote command center.[182] The combination of 360° camera visualization and AR can be used on board combat vehicles and tanks as a circular review system.
AR can be an effective tool for virtually mapping out the 3D topologies of munition storage sites in the terrain, including the choice of munition combinations in stacks, the distances between them, and a visualization of risk areas.[183] The scope of AR applications also includes visualization of data from embedded munitions monitoring sensors.[183]
Script error: No such module "Labelled list hatnote".
The NASA X-38 was flown using a hybrid synthetic vision system that overlaid map data on video to provide enhanced navigation for the spacecraft during flight tests from 1998 to 2002. It used the LandForm software, which proved useful in conditions of limited visibility, including an instance when the video camera window frosted over, leaving astronauts to rely on the map overlays.[50] The LandForm software was also test flown at the Army Yuma Proving Ground in 1999, with map markers indicating runways, the air traffic control tower, taxiways, and hangars overlaid on the video.[51]
AR can augment the effectiveness of navigation devices. Information can be displayed on an automobile's windshield indicating destination directions, speed, weather, terrain, road conditions, and traffic information, as well as alerts to potential hazards in the driver's path.[184][185][186] Since 2012, the Swiss company WayRay has been developing holographic AR navigation systems that use holographic optical elements to project all route-related information, including directions, important notifications, and points of interest, directly into the driver's line of sight and far ahead of the vehicle.[187][188] Aboard maritime vessels, AR can allow bridge watch-standers to continuously monitor important information such as a ship's heading and speed while moving throughout the bridge or performing other tasks.[189]
Workplace
In a research project, AR was used to facilitate collaboration among distributed team members via conferences with local and virtual participants. AR tasks included brainstorming and discussion meetings utilizing common visualization via touch screen tables, interactive digital whiteboards, shared design spaces and distributed control rooms.[190][191][192]
In industrial environments, augmented reality is proving to have a substantial impact, with use cases emerging across all aspects of the product lifecycle, from product design and new product introduction (NPI) through manufacturing, service and maintenance, to material handling and distribution. For example, labels were displayed on parts of a system to clarify operating instructions for a mechanic performing maintenance on the system.[193][194] Assembly lines have also benefited from the use of AR. In addition to Boeing, BMW and Volkswagen were known for incorporating this technology into assembly lines for monitoring process improvements.[195][196][197] Large machines are difficult to maintain because of their multiple layers and structures; AR lets technicians look through the machine as if with an X-ray, pointing them to the problem right away.[198]
As AR technology has progressed, the impact of AR in enterprise has grown. In the Harvard Business Review, Magid Abraham and Marco Annunziata discussed how AR devices are now being used to "boost workers' productivity on an array of tasks the first time they're used, even without prior training".[199] They contend that "these technologies increase productivity by making workers more skilled and efficient, and thus have the potential to yield both more economic growth and better jobs".[199]
Machine maintenance can also be executed with the help of mixed reality. Larger companies with multiple manufacturing locations and a lot of machinery can use mixed reality to educate and instruct their employees. The machines need regular checkups and have to be adjusted every now and then; these adjustments are mostly done by humans, so employees need to be informed about the required changes. By using mixed reality, employees from multiple locations can wear headsets and receive live instructions about the changes. Instructors can operate the representation that every employee sees, glide through the production area, zoom in to technical details, and explain every change needed. Employees completing a five-minute training session with such a mixed-reality program have been shown to attain the same learning results as reading a 50-page training manual.[200] An extension of this environment is the incorporation of live data from operating machinery into the virtual collaborative space, associated with three-dimensional virtual models of the equipment. This enables training and execution of maintenance, operational, and safety work processes, which would otherwise be difficult in a live setting, while making use of experts' knowledge regardless of their physical location.[201]
Product content management
Product content management before the advent of augmented reality consisted largely of brochures, with little customer-product engagement outside of this two-dimensional realm.[202] With improvements in augmented reality technology, new forms of interactive product content management have emerged. Most notably, three-dimensional digital renderings of normally two-dimensional products have increased the reach and effectiveness of consumer-product interaction.[203]
Augmented reality allows sellers to show customers how a certain commodity will suit their demands. A seller may demonstrate how a certain product will fit into the buyer's home. With the assistance of AR, the buyer can virtually pick up an item, rotate it, and place it where desired. This improves the buyer's confidence in making a purchase and reduces the number of returns.[204] Architectural firms can allow customers to virtually visit their desired homes.
Functional mockup
Augmented reality can be used to build mockups that combine physical and digital elements. With the use of simultaneous localization and mapping (SLAM), mockups can interact with the physical world to provide more realistic sensory experiences,[205] such as object permanence, which would normally be infeasible or extremely difficult to track and analyze without the use of both digital and physical aids.[206]
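As a rough illustration of how SLAM tracking supports object permanence, the following sketch keeps a hypothetical virtual object fixed in the world frame and re-expresses it in the camera frame each time a tracker reports a new pose; the pose values and the `WorldAnchor` helper are assumptions standing in for whatever SLAM framework an application actually uses.

```python
import numpy as np

class WorldAnchor:
    """A virtual object pinned to a fixed position in the SLAM world frame."""

    def __init__(self, name, position_world):
        self.name = name
        self.position_world = np.asarray(position_world, dtype=float)

    def position_in_camera(self, R_wc, t_wc):
        """Express the anchor in the current camera frame.

        R_wc, t_wc: the rotation and translation, reported by the SLAM system,
        that map world coordinates into camera coordinates.
        """
        return R_wc @ self.position_world + t_wc

# Hypothetical anchor: a virtual gauge placed 1 m in front of the initial camera.
anchor = WorldAnchor("virtual gauge", [0.0, 0.0, 1.0])

# Two example poses as the user walks around; in a real app these come from the tracker.
poses = [
    (np.eye(3), np.zeros(3)),                              # initial pose
    (np.array([[0.0, 0.0, -1.0],                           # after turning 90 degrees
               [0.0, 1.0,  0.0],
               [1.0, 0.0,  0.0]]), np.array([0.5, 0.0, 0.2])),
]

for R_wc, t_wc in poses:
    p_cam = anchor.position_in_camera(R_wc, t_wc)
    # The renderer would draw the gauge at p_cam; because the anchor lives in
    # the world frame, it appears to stay put as the camera moves.
    print(anchor.name, "at camera-frame position", p_cam)
```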
Broadcast and live events
Weather visualizations were the first application of augmented reality in television. It has now become common in weathercasting to display full-motion video of images captured in real time from multiple cameras and other imaging devices. Coupled with 3D graphics symbols and mapped to a common virtual geospatial model, these animated visualizations constitute the first true application of AR to TV.
AR has become common in sports telecasting. Sports and entertainment venues are provided with see-through and overlay augmentation through tracked camera feeds for enhanced viewing by the audience. Examples include the yellow "first down" line seen in television broadcasts of American football games showing the line the offensive team must cross to receive a first down. AR is also used in association with football and other sporting events to show commercial advertisements overlaid onto the view of the playing area. Sections of rugby fields and cricket pitches also display sponsored images. Swimming telecasts often add a line across the lanes to indicate the position of the current record holder as a race proceeds to allow viewers to compare the current race to the best performance. Other examples include hockey puck tracking and annotations of racing car performance[207] and snooker ball trajectories.[208][209]
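A simplified way to composite such graphics, sketched below, is to render the virtual marking from the tracked camera's viewpoint and then blend it only over pixels whose color matches the playing surface, so that players appear in front of it. The color thresholds and the pre-rendered graphic layer are assumptions for illustration and do not describe any particular broadcaster's pipeline.

```python
import cv2
import numpy as np

def composite_field_graphic(frame_bgr, graphic_bgr, graphic_alpha):
    """Overlay a pre-rendered graphic (e.g., a first-down line) onto a frame,
    but only where the frame shows the playing surface, so players occlude it.

    graphic_bgr: graphic rendered from the tracked camera's viewpoint.
    graphic_alpha: per-pixel opacity of the graphic, values in [0, 1].
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough green range for grass; real systems calibrate this per venue.
    field_mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255)).astype(np.float32) / 255.0
    alpha = graphic_alpha * field_mask            # hide the graphic over non-grass pixels
    alpha = alpha[..., None]                      # broadcast over the 3 color channels
    out = frame_bgr.astype(np.float32) * (1.0 - alpha) + graphic_bgr.astype(np.float32) * alpha
    return out.astype(np.uint8)
```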
AR has been used to enhance concert and theater performances. For example, artists allow listeners to augment their listening experience by adding their performance to that of other bands/groups of users.[210][211][212]
Tourism and sightseeing
Travelers may use AR to access real-time informational displays regarding a location, its features, and comments or content provided by previous visitors. Advanced AR applications include simulations of historical events, places, and objects rendered into the landscape.[213][214][215]
AR applications linked to geographic locations present location information by audio, announcing features of interest at a particular site as they become visible to the user.[216][217][218]
Translation
AR applications such as Word Lens can interpret foreign text on signs and menus and, in a user's augmented view, re-display the text in the user's language. Spoken words in a foreign language can be translated and displayed in a user's view as printed subtitles.[219][220][221]
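At a high level, such applications detect text regions in the camera frame, run them through optical character recognition and machine translation, and paint the translated strings back over the original regions. The sketch below shows that loop in outline only; `ocr_text_regions` and `translate` are hypothetical placeholders for whatever OCR engine and translation service an app actually uses.

```python
import cv2

def ocr_text_regions(frame):
    """Placeholder: return a list of (x, y, w, h, text) for detected text regions.
    A real app would call an OCR engine here."""
    raise NotImplementedError

def translate(text, target_lang):
    """Placeholder: return `text` translated into `target_lang`.
    A real app would call a translation model or service here."""
    raise NotImplementedError

def overlay_translations(frame, target_lang="en"):
    """Replace detected foreign text with translated text drawn in place."""
    for (x, y, w, h, text) in ocr_text_regions(frame):
        translated = translate(text, target_lang)
        # Cover the original text with a plain box, then draw the translation.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 255, 255), -1)
        cv2.putText(frame, translated, (x, y + h),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 0), 2)
    return frame
```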
Music
It has been suggested that augmented reality may be used in new methods of music production, mixing, control and visualization.[222][223][224][225]
Human-in-the-loop operation of robots
Recent advances in mixed-reality technologies have renewed interest in alternative modes of communication for human-robot interaction.[226] Human operators wearing augmented reality headsets such as the HoloLens can control and monitor machines such as robots and lifting equipment[227] on site in a digital factory setup. This use case typically requires real-time data communication between the mixed-reality interface and the machine, process, or system, which could be enabled by incorporating digital twin technology.[227]
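One common pattern for such a real-time link, sketched below with Python standard-library stand-ins, is a pair of channels that carry machine telemetry to the headset and operator commands back to the machine. A production system would typically route these through a message broker or a digital twin platform rather than in-process queues; all names and values here are assumptions for illustration.

```python
import json
import queue
import threading
import time

# Stand-ins for the real-time link between the AR interface and the machine.
telemetry_channel = queue.Queue()   # machine -> AR interface
command_channel = queue.Queue()     # AR interface -> machine

def machine_side():
    """Simulated robot / digital twin: publish telemetry, execute commands."""
    height = 0.0
    for _ in range(3):
        telemetry_channel.put(json.dumps({"height_m": height, "status": "idle"}))
        try:
            cmd = json.loads(command_channel.get(timeout=0.5))
            if cmd.get("action") == "lift":
                height = cmd["height_m"]
        except queue.Empty:
            pass
        time.sleep(0.1)

def headset_side():
    """Simulated AR interface: send an operator command, render telemetry."""
    command_channel.put(json.dumps({"action": "lift", "height_m": 0.5}))
    while True:
        try:
            state = json.loads(telemetry_channel.get(timeout=1.0))
        except queue.Empty:
            break
        # In a real headset app this would be drawn next to the physical machine.
        print("overlay:", state)

t = threading.Thread(target=machine_side)
t.start()
headset_side()
t.join()
```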
Apps
Snapchat users have access to augmented reality features. In September 2017, Snapchat announced a feature called "Sky Filters" available in its app. This feature uses augmented reality to alter the look of the sky in a picture, much like how users can apply the app's filters to other pictures. Users can choose from sky filters such as starry night, stormy clouds, beautiful sunsets, and rainbow.[228]
Google launched an augmented reality feature for Google Maps on Pixel phones that identifies the user's location and places signs and arrows on the device screen to show the user navigation directions.[229]
Concerns
Reality modifications
In a paper titled "Death by Pokémon GO", researchers at Purdue University's Krannert School of Management claim the game caused "a disproportionate increase in vehicular crashes and associated vehicular damage, personal injuries, and fatalities in the vicinity of locations, called PokéStops, where users can play the game while driving."[230] Using data from one municipality, the paper extrapolates what that might mean nationwide and concluded "the increase in crashes attributable to the introduction of Pokémon GO is 145,632 with an associated increase in the number of injuries of 29,370 and an associated increase in the number of fatalities of 256 over the period of 6 July 2016, through 30 November 2016." The authors extrapolated the cost of those crashes and fatalities at between $2bn and $7.3 billion for the same period. Furthermore, more than one in three surveyed advanced Internet users would like to edit out disturbing elements around them, such as garbage or graffiti.[231] They would like to even modify their surroundings by erasing street signs, billboard ads, and uninteresting shopping windows. Consumers want to use augmented reality glasses to change their surroundings into something that reflects their own personal opinions. Around two in five want to change the way their surroundings look and even how people appear to them. Script error: No such module "Unsubst".
Privacy concerns
Augmented reality devices that use cameras for 3D tracking or video passthrough depend on the ability of the device to record and analyze the environment in real time. Because of this, there are potential legal concerns over privacy.
In late 2024, Meta's collaboration with Ray-Ban on smart glasses faced heightened scrutiny due to significant privacy concerns. A notable incident involved two Harvard students who developed a program named I-XRAY, which utilized the glasses' camera in conjunction with facial recognition software to identify individuals in real-time.[232]
According to recent studies, users are especially concerned that augmented reality smart glasses might compromise the privacy of others, potentially causing peers to become uncomfortable or less open during interactions.[233]
While the First Amendment to the United States Constitution allows for such recording in the name of public interest, the constant recording of an AR device makes it difficult to do so without also recording outside of public spaces. Legal complications would arise in areas where a right to a certain amount of privacy is expected or where copyrighted media are displayed.
In terms of individual privacy, AR makes it easy to access information that one should not readily possess about a given person, for instance through facial recognition technology. If an AR system automatically passes along information about the persons a user sees, anything from their social media activity to their criminal record or marital status could be displayed.[234]
Notable researchers
- Ronald Azuma is a scientist and author of works on AR.
- Jeri Ellsworth headed a research effort for Valve on augmented reality (AR), later taking that research to her own start-up CastAR. The company, founded in 2013, eventually shuttered. Later, she founded Tilt Five, another AR start-up based on the same technology, with the purpose of creating a device for digital board games.[235]
- Steve Mann formulated an earlier concept of mediated reality in the 1970s and 1980s, using cameras, processors, and display systems to modify visual reality to help people see better (dynamic range management), building computerized welding helmets, as well as "augmediated reality" vision systems for use in everyday life. He is also an adviser to Meta.[236]
- Dieter Schmalstieg and Daniel Wagner developed a marker tracking system for mobile phones and PDAs in 2009.[237]
- Ivan Sutherland invented the first VR head-mounted display at Harvard University.
References
- ↑ a b c d Script error: No such module "citation/CS1".
- ↑ Steuer,Script error: No such module "citation/CS1"., Department of Communication, Stanford University. 15 October 1993.
- ↑ Introducing Virtual Environments Template:Webarchive National Center for Supercomputing Applications, University of Illinois.
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Time-frequency perspectives, with applications, in Advances in Machine Vision, Strategies and Applications, World Scientific Series in Computer Science: Volume 32, C Archibald and Emil Petriu, Cover + pp 99–128, 1992.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Template:Cite magazine
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Rokhsaritalemi, S., Sadeghi-Niaraki, A., & Choi, S. M. (2020). A review on mixed reality: Current trends, challenges and prospects. Applied Sciences, 10(2), 636.
- ↑ Buhalis, D., & Karatay, N. (2022). Mixed reality (MR) for generation Z in cultural heritage tourism towards metaverse. In Information and communication technologies in tourism 2022: Proceedings of the ENTER 2022 eTourism conference, January 11–14, 2022 (pp. 16-27). Springer International Publishing.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ J. van Kokswijk, Hum@n, Telecoms & Internet as Interface to Interreality (Bergboek, The Netherlands, 2003).
- ↑ Script error: No such module "Citation/CS1".
- ↑ Repetto, C. and Riva, G., 2020. From Virtual Reality To Interreality In The Treatment Of Anxiety Disorders. [online] Jneuropsychiatry.org. Available at: https://www.jneuropsychiatry.org/peer-review/from-virtual-reality-to-interreality-in-the-treatment-of-anxiety-disorders-neuropsychiatry.pdf [Accessed 30 October 2020].
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Johnson, Joel. "The Master Key": L. Frank Baum envisions augmented reality glasses in 1901 Mote & Beam 10 September 2012.
- ↑ Script error: No such module "citation/CS1".
- ↑ a b c Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1". (context & abstract only) IBM Technical Disclosure Bulletin 1 March 1987
- ↑ Script error: No such module "citation/CS1". (image of anonymous printed article) IBM Technical Disclosure Bulletin 1 March 1987
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Louis B. Rosenberg. "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments." Technical Report AL-TR-0089, USAF Armstrong Laboratory (AFRL), Wright-Patterson AFB OH, 1992.
- ↑ a b Abernathy, M., Houchard, J., Puccetti, M., and Lambert, J,"Debris Correlation Using the Rockwell WorldView System", Proceedings of 1993 Space Surveillance Workshop 30 March to 1 April 1993, pages 189–195
- ↑ Script error: No such module "Citation/CS1".
- ↑ Barrilleaux, Jon. Experiences and Observations in Applying Augmented Reality to Live Training.
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Ramesh Raskar, Greg Welch, Henry Fuchs Spatially Augmented Reality, First International Workshop on Augmented Reality, Sept 1998.
- ↑ a b Delgado, F., Abernathy, M., White J., and Lowrey, B. Real-Time 3-D Flight Guidance with Terrain for the X-38, SPIE Enhanced and Synthetic Vision 1999, Orlando Florida, April 1999, Proceedings of the SPIE Vol. 3691, pages 149–156
- ↑ a b Delgado, F., Altman, S., Abernathy, M., White, J. Virtual Cockpit Window for the X-38, SPIE Enhanced and Synthetic Vision 2000, Orlando Florida, Proceedings of the SPIE Vol. 4023, pages 63–70
- ↑ Script error: No such module "citation/CS1".
- ↑ AviationNow.com Staff, "X-38 Test Features Use of Hybrid Synthetic Vision" AviationNow.com, 11 December 2001
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Outdoor AR. TV One News, 8 March 2004.
- ↑ Template:Cite patent Template:Webarchive
- ↑ Script error: No such module "citation/CS1".
- ↑ Wikitude AR Travel Guide. YouTube.com. Retrieved 9 June 2012.
- ↑ Cameron, Chris. Flash-based AR Gets High-Quality Markerless Upgrade, ReadWriteWeb 9 July 2010.
- ↑ Microsoft Channel, YouTube [1], 23 January 2015.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Official Blog, Microsoft [2], 24 February 2019.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ a b c Script error: No such module "Citation/CS1".
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Template:Cite magazine
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Template:Cite report
- ↑ Script error: No such module "Citation/CS1".
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Katts, Rima. Elizabeth Arden brings new fragrance to life with augmented reality Mobile Marketer, 19 September 2012.
- ↑ Meyer, David. Telefónica bets on augmented reality with Aurasma tie-in gigaom, 17 September 2012.
- ↑ Mardle, Pamela. Video becomes reality for Stuprint.com. PrintWeek, 3 October 2012.
- ↑ Giraldo, Karina. Why mobile marketing is important for brands? SolinixAR, January 2015.
- ↑ Script error: No such module "citation/CS1".
- ↑ Humphries, Mathew. [3] Geek.com, 19 September 2011.
- ↑ Netburn, Deborah. Ikea introduces augmented reality app for 2013 catalog. Los Angeles Times, 23 July 2012.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Pavlik, John V., and Shawn McIntosh. "Augmented Reality." Converging Media: a New Introduction to Mass Communication, 5th ed., Oxford University Press, 2017, pp. 184–185.
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ a b c d e Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Template:Cite magazine
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ "Augmented Reality—Emerging Technology for Emergency Management", Emergency Management 24 September 2009.
- ↑ "What Does the Future Hold for Emergency Management?", Emergency Management Magazine, 8 November 2013
- ↑ Template:Cite thesis
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Template:Trim Template:Replace on YouTubeScript error: No such module "Check for unknown parameters".
- ↑ Script error: No such module "citation/CS1".
- ↑ Template:Trim Template:Replace on YouTubeScript error: No such module "Check for unknown parameters".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b c d Script error: No such module "citation/CS1".
- ↑ M. Pell, Envisioning Holograms: Design Breakthrough Experiences for Mixed Reality, 1st ed. Berkeley, CA: Apress, 2017.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Project Anywhere at studioany.com
- ↑ Script error: No such module "Citation/CS1".
- ↑ a b Rosenberg, Louis B. (1992). "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments". Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Pandher, Gurmeet Singh (2 March 2016). "Microsoft HoloLens Preorders: Price, Specs Of The Augmented Reality Headset". The Bitbag. Archived from the original on 4 March 2016. Retrieved 1 April 2016.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Template:Cite thesisTemplate:Pn
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Shufelt, Jr., J.W. (2006) A Vision for Future Virtual Training. In Virtual Media for Military Applications (pp. KN2-1 – KN2-12). Meeting Proceedings RTO-MP-HFM-136, Keynote 2. Neuilly-sur-Seine, France: RTO.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Calhoun, G. L., Draper, M. H., Abernathy, M. F., Delgado, F., and Patzek, M. "Synthetic Vision System for Improving Unmanned Aerial Vehicle Operator Situation Awareness," 2005 Proceedings of SPIE Enhanced and Synthetic Vision, Vol. 5802, pp. 219–230.
- ↑ Cameron, Chris. Military-Grade Augmented Reality Could Redefine Modern Warfare ReadWriteWeb 11 June 2010.
- ↑ a b Script error: No such module "citation/CS1".
- ↑ GM's Enhanced Vision System. Techcrunch.com (17 March 2010). Retrieved 9 June 2012.
- ↑ Couts, Andrew. New augmented reality system shows 3D GPS navigation through your windshield Digital Trends,27 October 2011.
- ↑ Griggs, Brandon. Augmented-reality' windshields and the future of driving CNN Tech, 13 January 2012.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Office of Tomorrow, Media Interaction Lab.
- ↑ The big idea:Augmented Reality. Ngm.nationalgeographic.com (15 May 2012). Retrieved 9 June 2012.
- ↑ Script error: No such module "citation/CS1".
- ↑ Sandgren, Jeffrey. The Augmented Eye of the Beholder, BrandTech News, 8 January 2011.
- ↑ Cameron, Chris. Augmented Reality for Marketers and Developers, ReadWriteWeb.
- ↑ Dillow, Clay BMW Augmented Reality Glasses Help Average Joes Make Repairs, Popular Science September 2009.
- ↑ King, Rachael. Augmented Reality Goes Mobile, Bloomberg Business Week Technology 3 November 2009.
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Bingham and Conner "The New Social Learning" Chapter 6 - Immersive Environments Refine Learning
- ↑ Script error: No such module "citation/CS1".
- ↑ Melroseqatar.com. 2020. MELROSE Solutions W.L.L. [online] Available at: http://www.melroseqatar.com/reality-technologies.html [Accessed 25 October 2020].
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Archived at Ghostarchive and the Wayback Machine.
- ↑ Azuma, Ronald; Balliot, Yohan; Behringer, Reinhold; Feiner, Steven; Julier, Simon; MacIntyre, Blair. Recent Advances in Augmented Reality Computers & Graphics, November 2001.
- ↑ Marlow, Chris. Hey, hockey puck! NHL PrePlay adds a second-screen experience to live games, digitalmediawire 27 April 2012.
- ↑ Script error: No such module "citation/CS1".
- ↑ Broughall, Nick. Sydney Band Uses Augmented Reality For Video Clip. Gizmodo, 19 October 2009.
- ↑ Pendlebury, Ty. Augmented reality in Aussie film clip. CNet 19 October 2009.
- ↑ Saenz, Aaron Augmented Reality Does Time Travel Tourism SingularityHUB 19 November 2009.
- ↑ Sung, Dan Augmented reality in action – travel and tourism Pocket-lint 2 March 2011.
- ↑ Dawson, Jim Augmented Reality Reveals History to Tourists Life Science 16 August 2009.
- ↑ Script error: No such module "Citation/CS1".
- ↑ Benderson, Benjamin B. Audio Augmented Reality: A Prototype Automated Tour Guide, Bell Communications Research, ACM Human Computer in Computing Systems Conference, pp. 210–211.
- ↑ Jain, Puneet and Manweiler, Justin and Roy Choudhury, Romit. OverLay: Practical Mobile Augmented Reality ACM MobiSys, May 2015.
- ↑ Tsotsis, Alexia. Word Lens Translates Words Inside of Images. Yes Really. TechCrunch (16 December 2010).
- ↑ N.B. Word Lens: This changes everything The Economist: Gulliver blog 18 December 2010.
- ↑ Borghino, Dario Augmented reality glasses perform real-time language translation. gizmag, 29 July 2012.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ Miller, Chance. "Snapchat's Latest Augmented Reality Feature Lets You Paint the Sky with New Filters." 9to5Mac, 9to5Mac, 25 Sept. 2017, 9to5mac.com/2017/09/25/how-to-use-snapchat-sky-filters/.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Peddie, J., 2017, Augmented Reality, Springer.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".