ARCore
Latest revision as of 15:29, 29 November 2024
Developer(s): Google
Initial release: February 23, 2018
Stable release: 1.45.2420502[1] / August 14, 2024
Operating system: Android
Platform: Android 7.0 and later
Website: developers.google.com/ar
ARCore, also known as Google Play Services for AR, is a software development kit developed by Google that allows for augmented reality (AR) applications to be built. ARCore has been integrated into a multitude of devices.[2]
Key technologies
ARCore uses a few key technologies to integrate virtual content with the real world as seen through the camera of a smartphone or tablet.[3] Each of these technologies can be used by developers to create a high-quality, immersive AR experience.
Six degrees of freedom
- Allows the phone to understand and track its position relative to the world.
- A motion tracking process known as simultaneous localization and mapping (SLAM) uses feature points (visually distinct objects within the camera's view) as fixed landmarks from which the phone determines its position and orientation (pose).[4]
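ARCore's SLAM pipeline is proprietary, but the bookkeeping it implies (chaining incremental motion estimates into a single six-degrees-of-freedom pose) can be sketched with plain homogeneous transforms. This is an illustrative toy under invented names, not ARCore code:

```python
import math

def make_pose(yaw_deg, tx, ty, tz):
    """A 6DoF pose as a 4x4 homogeneous matrix: here a rotation about the
    vertical (y) axis plus a translation; a full pose allows any rotation."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    return [[c, 0.0, s, tx],
            [0.0, 1.0, 0.0, ty],
            [-s, 0.0, c, tz],
            [0.0, 0.0, 0.0, 1.0]]

def compose(pose, motion):
    """Chain an incremental motion estimate onto the current pose
    (plain 4x4 matrix product, pose @ motion)."""
    return [[sum(pose[i][k] * motion[k][j] for k in range(4))
             for j in range(4)] for i in range(4)]

def position(pose):
    """World-space device position encoded in the pose's last column."""
    return (pose[0][3], pose[1][3], pose[2][3])
```

Turning 90 degrees and then stepping one metre "forward" (the -z direction, a common camera convention) from a one-metre starting offset ends at (-1, 0, -1); this accumulated pose is what SLAM continually corrects against observed feature points.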
Environmental understanding
- Allows the phone to detect the size and location of flat surfaces, both vertical and horizontal, using feature points.
- A geometric plane can be calculated from the detected feature points.
- The Scene Semantics API gathers real-time semantic data about the user's surroundings and uses it to identify objects and features in view.
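As a loose illustration of the surface-detection idea (not ARCore's actual algorithm, which handles arbitrary plane orientations), a horizontal plane can be proposed wherever many feature points share nearly the same height:

```python
def detect_horizontal_plane(points, tol=0.02, min_points=3):
    """Find the largest cluster of 3D feature points (x, y, z) whose heights
    agree within tol metres; return the plane height (mean y), else None."""
    best = []
    for _, y0, _ in points:
        cluster = [p for p in points if abs(p[1] - y0) <= tol]
        if len(cluster) > len(best):
            best = cluster
    if len(best) < min_points:
        return None
    return sum(p[1] for p in best) / len(best)
```

Production systems use robust estimators (for example RANSAC-style inlier counting) and track plane extents over time; the nearest-height clustering above is only meant to show why a dense spread of feature points makes surfaces detectable.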
Light estimation
- The Lighting Estimation API allows the phone to estimate the environment's current lighting conditions and display images accurately in relation to real-world lighting.
- Lighting cues such as shadows and highlights are used to display virtual objects more convincingly.[5]
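The simplest form of the idea can be sketched as deriving one ambient-intensity scalar from the camera image and dimming virtual content by it. This is an assumption-laden illustration, not the Lighting Estimation API itself, which also reports colour correction and, in HDR mode, directional light:

```python
def ambient_intensity(pixels):
    """Mean luminance of camera pixels, given as (r, g, b) in 0-255,
    mapped to a 0.0-1.0 intensity using the Rec. 709 luma weights."""
    lum = [0.2126 * r + 0.7152 * g + 0.0722 * b for r, g, b in pixels]
    return sum(lum) / (255.0 * len(lum))

def shade(color, intensity):
    """Dim a virtual object's base colour so it sits plausibly in the scene."""
    return tuple(round(c * intensity) for c in color)
```

A virtual object rendered at full brightness in a dim room immediately reads as fake; scaling its shading to the estimated ambient level is the first step toward the immersive lighting cues described above.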
Depth analysis
- Utilizes the phone's camera to create depth maps, which let the device determine more accurately how much space lies between surfaces based on what is captured.[6]
- To assess the real world properly, depth maps measure the amount of space between objects or surfaces.
- A depth-from-motion algorithm takes motion data from the user's camera and uses it to build a more detailed depth map.[7]
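The core geometry behind depth from motion is plain triangulation: a feature's apparent pixel shift (disparity) between two camera positions is inversely proportional to its distance. A minimal sketch with invented parameter names, not ARCore's implementation:

```python
def depth_from_motion(focal_px, baseline_m, x1_px, x2_px):
    """Depth of one feature point from its pixel disparity between two views
    whose optical centres are baseline_m metres apart: z = f * b / d."""
    disparity = abs(x1_px - x2_px)
    if disparity == 0:
        return float("inf")  # no shift: the point is effectively at infinity
    return focal_px * baseline_m / disparity
```

With a 500 px focal length, a 10 cm camera movement, and a 10 px shift, the point is about 5 m away; repeating this over many feature points, and smoothing, is what yields a dense depth map.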
Geospatial capabilities
- The Geospatial API uses GPS and allows creators to give users unique experiences based on their real-world location.[8]
- Google's visual positioning system (VPS) is utilized for this process.
- Matches the user's visual data with that of Google Maps to determine a precise location.
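Anchoring content at a latitude/longitude ultimately requires converting GPS differences into local metres. The following is a common small-area approximation, a sketch of the underlying geometry rather than the Geospatial API, which fuses GPS with VPS for far better accuracy:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def gps_offset_m(lat0, lng0, lat1, lng1):
    """East/north offset in metres from fix (lat0, lng0) to (lat1, lng1),
    using an equirectangular approximation valid over short AR-scale ranges."""
    lat_mid = math.radians((lat0 + lat1) / 2.0)
    north = math.radians(lat1 - lat0) * EARTH_RADIUS_M
    east = math.radians(lng1 - lng0) * EARTH_RADIUS_M * math.cos(lat_mid)
    return east, north
```

A 0.0001-degree step in latitude works out to roughly 11 metres of northward offset, which shows why raw GPS alone is too coarse for placing AR content and why visual positioning is layered on top.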
See also
[edit]- ARKit – Augmented reality API for Apple platforms
- OpenXR – Standard for access to virtual reality and augmented reality platforms and devices
References
1. "Google Play Services for AR APKs". APKMirror. Retrieved 11 April 2024.
2. "ARCore supported devices". Google Inc. Retrieved 23 February 2020.
3. Amadeo, Ron (29 August 2017). "Google's ARCore brings augmented reality to millions of Android devices". Ars Technica. Condé Nast. Retrieved 6 November 2017.
4. "Fundamental Concepts". ARCore. Google Inc. Retrieved 22 February 2024.
5. "Get the Lighting Right". ARCore. Google Inc. Retrieved 22 February 2024.
6. "Fundamental Concepts". ARCore. Google Inc. Retrieved 22 February 2024.
7. "Depth Adds Realism". ARCore. Google Inc. Retrieved 22 February 2024.
8. "Build global-scale, immersive, location-based AR experiences with the ARCore Geospatial API". ARCore. Google Inc. Retrieved 22 February 2024.