What is Geopose?

This is the first of three parts. Part two is here.

There is no definition for 'geopose' on Wikipedia.

This post is an attempt to create a simple definition for the word.

In Bubiko Foodtour's Unusual Guide to Augmented Reality, 'geopose' is defined as: a word that, for now, “kind of means” a shared frame of reference signifying the unique positions and orientations of digital OR physical objects in real OR virtual spaces.

The Open Augmented Reality Cloud is focused on, among other things, creating a definition that everyone agrees on and can use. See also: anchor, Open Augmented Reality Cloud

The following is from a GitHub post started by blairmacintyre.

(I just discovered his blog: https://blairmacintyre.me/ Wow!)

Mr. MacIntyre wrote:

What will the relationship between WebXR and geospatial data be?

It seems that WebXR cannot entirely ignore geospatial positioning, as geospatial content will be a major use case for mobile AR (at least eventually).

The web already has a geolocation API, but it is not sufficient for these purposes: it gives position but not orientation, is of very poor quality, and is not synchronized with the WebXR frame data. The deviceorientation API cannot be relied on for orientation: it is of very poor quality, was never standardized (and may eventually be removed from existing browsers), and is also not synchronized with the WebXR frame data.

ARKit offers the option to have its local coordinate system aligned with geospatial orientation (e.g., Y up, Z south, X east). This suggests one possible direction for how geospatial might be handled: have the WebXR API expose a property that says whether the coordinate frame can be aligned with geospatial EUS coordinates, and provide a way for the developer to request this.

Crude/simple geospatial positional alignment (between the user and the local coordinates) is easier if you are guaranteed that the local device coordinates are aligned with EUS: each time a geospatial value is received, an estimate of the geospatial location of the origin of the local coordinate frame can be updated from the device's current local coordinates (based on error values, etc.). It won't be any better than the error of the geolocation API, but it can be stable, because the local coordinates are used for locating and rendering content, not the very-slowly-changing geolocation values.
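To make that origin-estimation idea concrete, here is a rough TypeScript sketch of one way it could work. It is my own illustration, not code from any WebXR proposal; the names (GeoFix, updateOriginEstimate, and so on) are invented, and it assumes the local frame is already EUS-aligned (east = +X, up = +Y, south = +Z).

```typescript
// Running estimate of the geodetic position of the local coordinate origin.
// Assumes the local frame is EUS-aligned; all names here are illustrative.

interface GeoFix {
  latitude: number;   // degrees
  longitude: number;  // degrees
  accuracy: number;   // metres (reported horizontal error)
}

interface LocalPosition { x: number; y: number; z: number; } // metres, EUS

const EARTH_RADIUS = 6378137; // WGS84 equatorial radius, metres

let origin: GeoFix | null = null;

// Each time the Geolocation API delivers a fix while the device sits at
// `local` in the EUS frame, back out where the origin must be and blend
// it into the running estimate, weighting by the reported accuracy.
function updateOriginEstimate(fix: GeoFix, local: LocalPosition): void {
  // Convert the device's east/south offset back into a lat/lon offset.
  const dLat = (local.z / EARTH_RADIUS) * (180 / Math.PI); // +Z is south
  const dLon =
    (-local.x / (EARTH_RADIUS * Math.cos((fix.latitude * Math.PI) / 180))) *
    (180 / Math.PI);                                       // +X is east

  const sample: GeoFix = {
    latitude: fix.latitude + dLat,
    longitude: fix.longitude + dLon,
    accuracy: fix.accuracy,
  };

  if (origin === null) {
    origin = sample;
    return;
  }

  // Inverse-variance weighting: accurate fixes count for more.
  const wOld = 1 / (origin.accuracy * origin.accuracy);
  const wNew = 1 / (sample.accuracy * sample.accuracy);
  origin = {
    latitude: (origin.latitude * wOld + sample.latitude * wNew) / (wOld + wNew),
    longitude: (origin.longitude * wOld + sample.longitude * wNew) / (wOld + wNew),
    accuracy: Math.sqrt(1 / (wOld + wNew)),
  };
}
```

Content can then keep being positioned and rendered in the stable local frame, with the origin estimate improving as more geolocation fixes arrive.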

Hi, I am the founder of https://open-arcloud.org/. The TL;DR AR-cloud description: a 1:1 digital map/twin of the physical world, stored in the cloud, that enables a shared programmable space attached directly to our physical surroundings, enabling multi-user AR and persistent, universally consistent placement of virtual assets in the real world.

One of the things we hope can help bring this to reality is a standard definition of geographical position and orientation that can be understood across platforms and applications, what we call "GeoPose". Each AR goggle, AR smartphone or piece of AR content could have a GeoPose at any given moment. Obtaining the GeoPose of an XR device could be achieved by matching sensor data from the device with the 1:1 map in the cloud through something like a "GeoPose" cloud service. Once the device has its GeoPose, it can display geospatial data and assets that are anchored to a GeoPose. A bit more about that here:

https://open-arcloud.org/standards

Please join our Slack channel and chime in on how you think a GeoPose should be defined:

https://join.slack.com/t/open-arcloud/shared_invite/enQtMzE4MTc0MTY2NjYwLWIyN2E4YmYxOTA4MWNkZmI5OGQ4Mjg2MGYzNTc4OTRkN2RjZGUxOTc4YjJhOTQ0Nzc3OWMxYTA3ZDMxNGEzMGE

My hope, of course, is that WebXR will support using the device GeoPose and do the proper transforms for assets that are anchored to GeoPoses or assets that are described by geospatial coordinates.

...............................................................

A bit of a bumbling attempt at describing geopose. The source material is from a PowerPoint presentation on AV + AR held at the Collider in Detroit in July 2019. The presentation is here.

The following is from a free PDF download entitled:

TITLE: Geopose Standards Working Group Charter [OGC 19-028]

Author Name(s): Jan-Erik Vinje, Christine Perey, Scott Simmons

Email: jan-erik.vinje@norkart.no

DATE: 2019-05-17

CATEGORY: SWG Charter Template

All physical world objects inherently have a geographically-anchored pose. Unfortunately, there is not a standard for universally expressing the pose in a manner which can be interpreted and used by modern computing platforms. The main purpose of this SWG will be to develop and propose a standard for geographically-anchored pose (geopose) with 6 degrees of freedom referenced to one or more standardized Coordinate Reference Systems (CRSs).

Definition of geopose

A real object in space has three components of translation – up and down (z), left and right (x), and forward and backward (y) – and three components of rotation: pitch, roll, and yaw. Hence the real object has six degrees of freedom.

The combination of position and orientation with 6 degrees of freedom of objects in computer graphics and robotics are usually referred to as the object’s “pose.” Pose can be expressed as being in relation to other objects and/or to the user. When a pose is defined relative to a geographical frame of reference or coordinate system, it will be called a geographically-anchored pose, or geopose for short.
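To make the definition a little more tangible, a geopose could be represented roughly as a geodetic position plus an orientation quaternion, as in the sketch below. This is purely illustrative: the field names are mine, not from the standard the OGC SWG is chartered to produce.

```typescript
// Illustrative 6-DoF geographically-anchored pose.
// Field names are my own; the OGC SWG will define the actual encoding.

interface GeoPose {
  // Position relative to a geodetic CRS such as WGS84.
  position: {
    latitude: number;   // degrees
    longitude: number;  // degrees
    altitude: number;   // metres above the ellipsoid
  };
  // Orientation as a unit quaternion relative to a local tangent-plane
  // frame (e.g. east-north-up), covering the three rotational degrees
  // of freedom (yaw, pitch, roll).
  orientation: { x: number; y: number; z: number; w: number };
}

// Example: a park bench in central Oslo, facing roughly north.
const bench: GeoPose = {
  position: { latitude: 59.9139, longitude: 10.7522, altitude: 12.0 },
  orientation: { x: 0, y: 0, z: 0, w: 1 },
};
```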

Uses for geopose

An object with a geopose may be any real physical object. This includes objects such as an AR display device (a proxy for a user's eyes), a vehicle, a robot, or a park bench. It may also be a digital object like a BIM model, a computer game asset, the origin and orientation of the local coordinate system of an AR device, or a point-cloud dataset.

When the geoposes of both real and virtual objects include the current position and orientation of those objects in a way that is universally understood, the interactions between objects, and between an object and its location, can be put to many uses. It is also important to note that many objects move with respect to a common frame of reference (and with respect to one another); their positions and orientations can vary over time.

The ability to specify the geopose of any object enables us to represent any object in a universally agreed upon way for real world 3D spatial computing systems, such as those under development for autonomous vehicles or those used by augmented reality (AR) or 3D map solutions. In addition, the pose of any object can be encoded consistently in a digital representation of the physical world or any part therein (i.e., digital twin).

The proposed standard will provide an interoperable way to seamlessly express, record, and share the geopose of objects in an entirely consistent manner across different applications, users, devices, services, and platforms which adopt the standard or are able to translate/exchange the geopose into another CRS.

One example of the benefit of a universally consistent geopose is a traffic situation. The same real-time geopose of a vehicle could be shared and displayed in different systems, including:


- a traffic visualization on a screen in another car,

- shown directly at its physical location in the AR glasses of a pedestrian who is around the corner from the car, or

- in the real time world model used by a delivery robot to help it navigate the world autonomously.

........................................................................................................................

argon.js documentation

This site contains documentation for web developers using the argon.js framework to add AR content to a web application.

Argon-aframe — Lesson 7: Geolocation

Key features of augmented reality include (1) the ability to associate data objects with places in the world and (2) the ability to display those objects at those places.

The <ar-geopose> primitive is Argon-aframe’s way of locating objects in physical space (of the planet). The <ar-geopose> primitive creates an entity with a referenceframe component. This component defines the position and/or rotation of the entity by an LLA (longitude, latitude, altitude), so you can locate your object anywhere on the earth relatively accurately by using this tag.

You can find the longitude and latitude through Google Maps or through a variety of mobile applications. Some apps will also give you the altitude of a location.
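As a rough sketch of how placing content with the primitive might look, here is a small TypeScript snippet that creates an <ar-geopose> element and drops a box inside it. The lla attribute name and its "longitude latitude altitude" ordering are assumptions based on my reading of argon-aframe examples, so verify them against the lesson itself.

```typescript
// Sketch: place an A-Frame box at a geodetic location using argon-aframe's
// <ar-geopose> primitive. The `lla` attribute name and argument order are
// assumptions; check the Lesson 7 source before relying on them.

function placeBoxAt(longitude: number, latitude: number, altitude = 0): void {
  const scene = document.querySelector('ar-scene');
  if (!scene) return;

  const geopose = document.createElement('ar-geopose');
  geopose.setAttribute('lla', `${longitude} ${latitude} ${altitude}`);

  const box = document.createElement('a-box');
  box.setAttribute('color', '#4CC3D9');
  box.setAttribute('scale', '10 10 10'); // metres, so it stays visible from a distance

  geopose.appendChild(box);
  scene.appendChild(geopose);
}

// Example coordinates (roughly the Georgia Tech campus).
placeBoxAt(-84.398881, 33.778463);
```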
.........................................................................................................................................

The Open Augmented Reality Cloud glossary does not have a definition for 'geopose' as of September 19, 2019.

  1. Geo Spatial Pose: [please define]
  2. Pose: [please define]
  3. Position: [please define]

It does, however, have the following post:

Reference Implementations for Conversion Between Geopose and Cartesian Coordinate Systems

There will be a need throughout many parts of the technological ecosystem to convert object poses between a geospatial coordinate system and local ones (typically Cartesian x, y, z in metres, as used in most AR SDKs). This repository is the first step toward creating reference libraries in different programming languages, available for everyone to use for free.

Contributions are welcome.
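The repository is the place to look for the actual reference implementations. As a flavour of what such a conversion involves, here is a minimal, self-contained sketch of the standard WGS84 geodetic-to-ECEF (Earth-Centred, Earth-Fixed) Cartesian conversion, typically the first step before transforming into a local tangent-plane or AR-SDK frame. It is not taken from the Open AR Cloud repository.

```typescript
// Minimal sketch: WGS84 geodetic coordinates (degrees, metres) to
// Earth-Centred, Earth-Fixed (ECEF) Cartesian coordinates (metres).
// Standard textbook formulae; not from the Open AR Cloud repository.

const WGS84_A = 6378137.0;                 // semi-major axis, metres
const WGS84_F = 1 / 298.257223563;         // flattening
const WGS84_E2 = WGS84_F * (2 - WGS84_F);  // first eccentricity squared

function geodeticToEcef(
  latitudeDeg: number,
  longitudeDeg: number,
  altitude: number,
): { x: number; y: number; z: number } {
  const lat = (latitudeDeg * Math.PI) / 180;
  const lon = (longitudeDeg * Math.PI) / 180;

  // Prime vertical radius of curvature at this latitude.
  const n = WGS84_A / Math.sqrt(1 - WGS84_E2 * Math.sin(lat) ** 2);

  return {
    x: (n + altitude) * Math.cos(lat) * Math.cos(lon),
    y: (n + altitude) * Math.cos(lat) * Math.sin(lon),
    z: (n * (1 - WGS84_E2) + altitude) * Math.sin(lat),
  };
}

// Example: the approximate geodetic position of the Eiffel Tower.
const ecef = geodeticToEcef(48.8584, 2.2945, 330);
console.log(ecef);
```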

Geopose: take 2. Comments greatly appreciated, as I will redo the visuals used in the presentation and improve my speaking skills. Hopefully, anyway.

