The two previous posts on this blog are the result of trying to create a definition of geopose. I still have not come up with a twenty-word-or-less definition. However, the following is very helpful towards achieving that goal.
What you are about to read is a reply sent to me by Jan-Erik Vinje, the Managing Director of the Open Augmented Reality Cloud. Jan-Erik and I will both be giving presentations at the 2nd Open Augmented Reality Cloud Symposium in Munich on October 16th.
Download the First State of the AR Cloud report as well. You can find it here.
Jan-Erik Vinje's thoughts on the differences between GPS and Geopose:
GPS is a specific technology, mostly used for obtaining more or less accurate geospatial positions relative to a geospatial coordinate reference system. Currently an ellipsoid that approximates the earth is used as the reference for altitude. The ellipsoid is a bit crude, but it normally deviates by less than 100 meters from where the mean ocean surface is or would have been.
GeoPose is intended to relate to the same type of geospatial reference, but adds geospatial orientation to the geospatial position. All real objects on our planet could in principle be said to have both a position and an orientation. GPS provides position. If you have a stream of GPS positions you could derive a direction vector, which is almost an orientation but not quite. Imagine walking down the street with a smartphone, collecting a GPS location (lat, lng, alt) every second. You can create a path through space telling you the direction you have moved the phone. What the GPS locations cannot give you is the orientation in which you have held your device. Did you point it towards the ground, the sky, or a wall across the street? And, importantly, how was your device rotated around that direction?
For that you would need another system, or set of systems, to provide you with your orientation. AR Cloud visual positioning systems can provide both the position and the orientation of your device. This is what we call a pose. If that position and orientation are in a geospatial frame of reference equivalent to the one used for GPS, one can call the pose a geopose.
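The position/orientation distinction can be sketched as a minimal data record. The field names below and the choice of a quaternion for orientation are illustrative assumptions on my part, not taken from any published standard:

```python
# A hypothetical sketch of what a GeoPose record might contain.
from dataclasses import dataclass

@dataclass
class GeoPose:
    # Geospatial position: the part a GPS fix can supply
    lat: float   # degrees
    lon: float   # degrees
    alt: float   # meters above the reference ellipsoid
    # Orientation relative to a local geospatial frame,
    # stored as a unit quaternion (w, x, y, z)
    qw: float = 1.0
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0

# A GPS fix alone fills only the first three fields; the orientation
# is what turns a mere position into a pose.
phone = GeoPose(lat=48.137, lon=11.575, alt=520.0)
```

The defaults make the point: a stream of GPS fixes leaves the quaternion at the identity, because GPS simply has no orientation information to put there.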
(SB: Hmmm, that is great, but the last three lines will require a bit of serious thought... I am not 100% clear I understand. When I do, I hope to make educational drawings, or better, 3D models in AR.)
Jan-Erik Vinje, the Managing Director of the Open Augmented Reality Cloud, responded to my previous post about a definition of geopose. His reply is reproduced with his permission. Thanks to Jan-Erik and all of the team involved in this project. The following exemplifies all of the diligent hard work going on to create standards for AR.
Thanks 🙂 If you would like to update the geopose page: we have made some headway with our partner, the Open Geospatial Consortium (a standards development organization in the geospatial sector).
Open AR Cloud put together a draft charter for a Standards Working Group for geopose at the Open Geospatial Consortium. It has been out for public comment for a while and is currently being voted on by members.
If the vote succeeds, an official Standards Working Group will be created in the Open Geospatial Consortium (OGC) to develop the geopose standard.
If we are successful in creating such a standard, we will in effect have created the equivalent of the URL for real-world spatial computing, allowing the geopose of both real and virtual objects to be universally captured, stored, shared and understood.
I liken it to the URL because it can be seen as a link between the physical world and the digital world.
A URL links information to more information. Geopose links the world to digital information, and digital information to the world.
It can even be used to create a "persistent portal" between a physical space and a virtual space, where people from across the world who are experiencing a virtual space can go through such a portal to experience a real space (by streaming real-time reality-capture data to everyone in the virtual space). At the same time, people in the real world can walk into the virtual reality space, or see virtual objects, scenes and avatars of virtual reality users projected into their physical space using AR.
Stephen Black: There is so much information here that I need to review it carefully. It will be a wonderful challenge to come up with a definition of 'geopose' that is 20 words or less!
What will the relationship between WebXR and geospatial data be?
It seems that WebXR cannot entirely ignore geospatial positioning, as geospatial content will be a major use case for mobile AR (at least eventually).
The web already has a geolocation API, but it is not sufficient for these purposes: it gives position but not orientation, is of very poor quality, and is not synchronized with the WebXR frame data. The deviceorientation API cannot be relied on for orientation: it is of very poor quality, was never standardized (and may be removed from existing browsers), and is also not synchronized with the WebXR frame data.
ARKit offers the option to have its local coordinate system be aligned with geospatial orientation (e.g., Y up, Z south, X east). This suggests a possible direction for how geospatial data might be handled: have the WebXR API expose a property that says whether the coordinate frame can be aligned with geospatial EUS (East-Up-South) coordinates, and provide a way for the developer to request this. Crude/simple geospatial positional alignment (between the user and the local coordinates) is easier if you are guaranteed that the local device coordinates are aligned with EUS: each time a geospatial value is received, an estimate of the geospatial location of the origin of the local coordinate frame can be updated (based on error values, etc.). It won't be any better than the error of the geolocation API, but it can be stable (because the local coordinates are used for locating and rendering content, not the very-slowly-changing geolocation values).
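The origin-estimation idea above can be sketched with a simple accuracy-weighted (inverse-variance) fusion of geolocation fixes. The function name and the weighting scheme are my own illustrative assumptions, not part of any WebXR proposal:

```python
# Sketch: refine an estimate of where the local coordinate frame's
# origin sits geospatially, as new geolocation fixes arrive.
# Each fix carries a value and a variance (derived from the reported
# accuracy); better fixes pull the estimate harder.
def fuse(est, est_var, fix, fix_var):
    """Combine the current estimate with a new fix by inverse-variance weighting."""
    w = 1.0 / est_var + 1.0 / fix_var
    value = (est / est_var + fix / fix_var) / w
    return value, 1.0 / w  # fused value, and its (reduced) variance

# Example: the origin's longitude estimate refined by a second fix
# of equal accuracy; the result is simply the mean, with less variance.
lon, var = fuse(11.5750, 25.0, 11.5754, 25.0)
```

Each coordinate (latitude, longitude, altitude) would be fused independently this way; content keeps rendering against the stable local frame while the origin estimate slowly improves underneath it.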
Hi, I am the founder of https://open-arcloud.org/ . The TL;DR AR Cloud description: a 1:1 digital map/twin of the physical world, stored in the cloud, that enables a shared programmable space attached directly to our physical surroundings, enabling multiuser AR and persistent, universally consistent placement of virtual assets in the real world.
One of the things we hope can help bring this to reality is a standard definition of geographical position and orientation that can be understood across platforms and applications, which we call "GeoPose". Each AR goggle, AR smartphone or piece of AR content could have a GeoPose at any given moment. Obtaining the GeoPose of an XR device could be achieved by matching sensor data from the device against the 1:1 map in the cloud, through something like a "GeoPose" cloud service. Once the device has its GeoPose, it can display geospatial data and assets that are anchored to a GeoPose. A bit more about that here: https://open-arcloud.org/standards
Please join our Slack channel and chime in on how you think a GeoPose should be defined: https://join.slack.com/t/open-arcloud/shared_invite/enQtMzE4MTc0MTY2NjYwLWIyN2E4YmYxOTA4MWNkZmI5OGQ4Mjg2MGYzNTc4OTRkN2RjZGUxOTc4YjJhOTQ0Nzc3OWMxYTA3ZDMxNGEzMGE
My hope, of course, is that WebXR will support using the device GeoPose and do the proper transforms for assets that are anchored to GeoPoses, or assets that are described by geospatial coordinates.
TITLE: Geopose Standards Working Group Charter [OGC 19-028]
Author Name (s): Jan-Erik Vinje, Christine Perey, Scott Simmons
CATEGORY: SWG Charter Template
All physical world objects inherently have a geographically-anchored pose. Unfortunately, there is no standard for universally expressing that pose in a manner which can be interpreted and used by modern computing platforms. The main purpose of this SWG will be to develop and propose a standard for geographically-anchored pose (geopose) with 6 degrees of freedom, referenced to one or more standardized Coordinate Reference Systems (CRSs).
Definition of geopose
A real object in space has three components of translation – up and down (z), left and right (x), and forward and backward (y) – and three components of rotation – pitch, roll and yaw. Hence the real object has six degrees of freedom.
The combination of position and orientation with 6 degrees of freedom of objects in computer graphics and robotics are usually referred to as the object’s “pose.” Pose can be expressed as being in relation to other objects and/or to the user. When a pose is defined relative to a geographical frame of reference or coordinate system, it will be called a geographically-anchored pose, or geopose for short.
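One common way to encode the three rotational degrees of freedom is as a unit quaternion. The conversion below from yaw/pitch/roll angles is a standard textbook formula (Z-Y-X rotation order), shown only as an illustration; it is not a representation chosen by the proposed SWG:

```python
import math

# Convert yaw, pitch, roll (radians, Z-Y-X order) to a unit
# quaternion (w, x, y, z) -- one standard encoding of the three
# rotational degrees of freedom of a pose.
def euler_to_quaternion(yaw, pitch, roll):
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return w, x, y, z

# No rotation at all yields the identity quaternion (1, 0, 0, 0)
identity = euler_to_quaternion(0.0, 0.0, 0.0)
```

Together with the three translation components, four such numbers (or the three angles they encode) give the full six-degree-of-freedom pose described above.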
Uses for geopose
An object with a geopose may be any real physical object, such as an AR display device (a proxy for a user's eyes), a vehicle, a robot, or a park bench. It may also be a digital object, like a BIM model, a computer game asset, the origin and orientation of the local coordinate system of an AR device, or a point-cloud dataset.
When the geoposes of both real and virtual objects include the current position and orientation of the objects in a way that is universally understood, the interactions between objects, and between an object and its location, can be put to many uses. It is also important to note that many objects move with respect to a common frame of reference (and one another): their positions and orientations can vary over time.
The ability to specify the geopose of any object enables us to represent any object in a universally agreed upon way for real world 3D spatial computing systems, such as those under development for autonomous vehicles or those used by augmented reality (AR) or 3D map solutions. In addition, the pose of any object can be encoded consistently in a digital representation of the physical world or any part therein (i.e., digital twin).
The proposed standard will provide an interoperable way to seamlessly express, record, and share the geopose of objects in an entirely consistent manner across different applications, users, devices, services, and platforms which adopt the standard or are able to translate/exchange the geopose into another CRS.
One example of the benefit of a universally consistent geopose is in a traffic situation. The same real-time geopose of a vehicle could be shared and displayed in different systems including:
- a traffic visualization on a screen in another car,
- shown directly at its physical location in the AR glasses of a pedestrian around the corner from the car, or
- in the real time world model used by a delivery robot to help it navigate the world autonomously.
This site contains documentation for web developers using Design Tools to add AR content to a web application.
Argon-aframe — Lesson 7: Geolocation
Key features of augmented reality include (1) the ability to associate data objects with places in the world and (2) the ability to display those objects at those places.
The <ar-geopose> primitive is Argon-aframe’s way of locating objects in physical space (of the planet). The <ar-geopose> primitive creates an entity with a referenceframe component. This component defines the position and/or rotation of the entity by an LLA (longitude, latitude, altitude), so you can locate your object anywhere on the earth relatively accurately by using this tag.
You can find the longitude and latitude through Google Maps or through a variety of mobile applications. Some apps will also give you the altitude of a location.
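As a sketch only, placing an entity at a fixed location might look like the markup below. This is illustrative: the exact tag and attribute names should be verified against the Argon-aframe lesson code itself.

```html
<!-- Illustrative sketch: verify attribute names against the Argon-aframe docs -->
<a-scene>
  <ar-geopose id="landmark" lla="-84.3963 33.7756">
    <a-box color="red" scale="10 10 10"></a-box>
  </ar-geopose>
</a-scene>
```

The idea is that the child entities are positioned relative to the geospatial reference frame the primitive defines, rather than relative to the camera.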
Reference Implementations for Conversion Between Geopose and Cartesian Coordinate Systems
There will be a need throughout many parts of the technological ecosystem to convert object poses between a geospatial coordinate system and local ones (typically Cartesian x, y, z in meters, as used in most AR SDKs). This repository is the first step toward creating reference libraries in different programming languages, available for everyone to use for free.
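As an illustration of the kind of conversion such a library would provide, here is a sketch of the standard WGS84 geodetic-to-Cartesian (ECEF: Earth-Centered, Earth-Fixed) formula in Python. This is not code from the repository itself:

```python
import math

# WGS84 ellipsoid constants
WGS84_A = 6378137.0                    # semi-major axis, meters
WGS84_F = 1.0 / 298.257223563          # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)   # first eccentricity squared

def lla_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert latitude/longitude (degrees) and altitude (meters above
    the ellipsoid) to ECEF x, y, z in meters."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    # Prime-vertical radius of curvature at this latitude
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat)
    return x, y, z

# The equator/prime-meridian point sits one semi-major axis from the
# earth's center along the x axis
x, y, z = lla_to_ecef(0.0, 0.0, 0.0)  # → (6378137.0, 0.0, 0.0)
```

Going the other direction (ECEF back to latitude/longitude/altitude) has no closed form this simple and is usually done iteratively, which is exactly why shared reference implementations are valuable.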
Would you like to support THE GREATEST AR TOUR IN HISTORY? Do you want to learn the latest news from AWE, Austin, Japan, Detroit, Turin, Munich and Shenzhen!?
Stephen Black and Bubiko Foodtour are about to go global: learning, educating and networking at some of the biggest AR events in the world. We just need a little financial fuel... we are movin' and groovin', but our app isn't out yet! Startup blues!
The following video is a rehearsal. The starting point is a presentation I did in Detroit in July 2019. That presentation was about Autonomous Vehicles and Augmented Reality. Eventually I will have about eight videos based on that Detroit presentation. You can find the Powerpoint for that here.
It will be obvious that I am not the smoothest person to ever be on a stage. I am a photographer, a cinematographer, an artist and a writer. I am used to having time to think about what I am expressing. As I will soon be making presentations in Europe, I am trying to get through the learning curve of being onstage, asap.
In this presentation, I often refer to the Open Augmented Reality Cloud. If you are serious about AR, you must learn about their work, and download the free State of the Augmented Reality Cloud Report which is available on their website. (I am honored and humbled to say that I will be speaking about Bubiko at the OARC Symposium in Munich on October 16!)
The ARShow produced this informative podcast with Jan-Erik Vinje, Christine Perey, Jason Fox and Colin Steinman, the key contributors to the OARC.
From the start, Bubiko Foodtour was meant to be a cross between Hello Kitty and Anthony Bourdain, with a dash of Charlie Chaplin.
Bubiko is a little chef from northeastern Thailand, just outside of Udon Thani. There, growing up, she learned how to make Issan-style food. However, she is also familiar with Lanna cuisine and other regional cooking styles of Thailand. Her favorite dessert is mango sticky rice. (Visit Bubiko's Instagram account to see the many locations where she sampled mango sticky rice.)
Bubiko has researched the food cultures of Thailand, Malaysia, Singapore, Bali and Shenzhen, China. She has also spent time researching in Vientiane, Yangon, Jakarta, Bandung and Hong Kong, as well as Japan and the US. Again, you are invited to look at her Instagram, or search "Bubiko" on this blog.
As AR technology continues to improve, Bubiko will make educational AR projects for people of all ages about spices, vitamins, geography, cooking techniques and other concepts related to food.
Bubiko also really likes butterfly pea flowers, and one day will have an AR lesson about them.
Butterfly pea flower (Clitoria ternatea) is found all over Southeast Asia. It is used to make many types of foods and is also great in drinks, by itself or mixed with other ingredients like matcha. It is also used in soap, shampoo and other health care products.
Here is a great recipe for "Blue Surf Cake" on the Unconventional Baker blog. Additionally, the blog has a number of great resources on where to buy butterfly pea flowers, as well as other natural colorings. Recommended!
A recipe for Butterfly Pea Flower Crepe cake can be found here, on the Mario'z Eats blog. Also informative, and the instructions are clear and well illustrated.
Bubiko is a little chef who has been having some exciting adventures in AR, Augmented Reality. Bubiko was born at Novaby and made her public debut at Tech in the Tenderloin.
Stephen Black is a writer, a visual artist and marching along the path of Spoken Word.
This post is exploratory.
Stephen Black and Bubiko Foodtour are now looking for opportunities to make presentations, or do collaborations with musicians, artists, writers or AR practitioners in these cities.
(BLAM is also a reference to SLAM: Simultaneous Localization And Mapping, a process associated with AR.)
Stephen Black has given presentations at Sasin School of Business, MIT Media Lab, Hong Kong PolyU, TechCrunch Shenzhen and the Collider at Alitimetrik (Detroit). This post is about his projects and this post is about Bubiko's.
What we hope to do, and are open to:
1. Find AR collaborators. Do you have AR software that could use a cute little chef? (Bubiko has been in Facebook Spark and Facebook Camera projects, as well as an AR game demo made by Dominique Wu from Hummingbirdsday Studios. Two projects using Artive are here and here.)
2. Consultations regarding AR, or projects needing a producer.
I am now working on a book about AR, roads and transportation. The following is from a section about AR and the Tour de France.
In Spring of 2019, I was excited to discover that the world of cycling had a growing number of AR apps and products. Excited because, previous to these discoveries, I could only find three successful examples of AR.
The first was Pokemon Go but, as huge as it was, it created the misleading impression that AR was only for games. Next: 19 Crimes, an Australian wine company that achieved massive success largely due to their series of AR "living wine labels”. The third example was Ikea, whose AR app allowed buyers to "insert" digital models of furniture into their real homes, so as to visualise the ideal purchase.
Also, there were many exciting AR projects in medicine and industry, but these uses required expensive viewing devices, like the Hololens.
So, when I discovered cycling goggles with AR functionality? Great! An AR app that allowed users to customize a bike and then order it? Yes! An AR bike repair manual? Yes, again! These simple uses were practical, but with hints of openness, adventure and excitement. Vitality!
And then I thought I had a good idea. To break up all of the technical ideas, and to inject some excitement into the book, I decided to write about futuristic uses of AR in the Tour de France. I came up with ideas like “fatigue hawks”, “blood hawks” and “wind hawks", terms that would be applied to specialized data scientists who coached cycling teams.
Although the NTT website doesn’t use exciting names like “blood hawks”, they do use data in exciting ways, especially in the areas of data collection/processing, AR, 3D mapping and AI. NTT is making media history. Pioneers, they are leaving the safe continent of broadcast television to venture into the uncharted islands of billions of mobile devices.
And no, NTT is not sponsoring this (nor is anyone else). Having said that, I would be happy to write about NTT's London headquarters, or their facility dedicated to the Tour de France in Mulhouse, in eastern France.
The recent release of citywide 3D mapping technologies allows AR objects to be placed permanently and accurately. Two companies to watch: Scape and 6D.Ai.
With an established background in visual arts, music, the performing arts and AR, I have been waiting for the opportunity to create very large scale AR artworks. Years ago I developed plans for an AR sound artwork for Singapore's Tiong Bahru, an estate composed of historic Art Deco buildings. In 2018, in Shenzhen, a citywide light show rekindled my interest in large scale AR projects.
This year, while preparing for a presentation in Detroit, another idea: an Augmented Reality project to be viewed, and experienced, from inside Autonomous Vehicles; bROADWAY
Although the ideas of bROADWAY could work in any city, Detroit is associated with the automobile industry, including AV. Detroit is the home of legends like Motown, Aretha Franklin, Stevie Wonder, the Jacksons, Detroit techno (Juan Atkins, Kevin Saunderson and Derrick May +), Bob Seger, Jack White, Alice Cooper and Iggy Pop. These points, plus distinguished architecture and public spaces, mean there could be no better city for the world’s first AR+AV citywide experience!
AVs as performers!
If anyone in Detroit knows how to make this project happen, please get in touch.
(And, as AVs are becoming more common worldwide, the idea could happen anywhere.)
This blog post is a sample of the guide, which is available on Amazon.
This is a response to an urgent need to share
Influences: a. “make it fast, not
c. ash from chaos
d. Chungking Express
e. The Zone System by Ansel Adams and Minor White
No person or company has paid me to have their name or product included in this document. I am now notifying the companies and individuals mentioned. If your work is here, and I have not yet contacted you, I apologize. If you prefer that I do not share your work, let me know and I will remove it immediately.
No one has received a promotional copy. If you have bought this, and we meet, I will buy you a beverage. Or two. If you bought this and it seems unlikely that we will meet, I will send you my other ebooks or find a way to make sure this purchase is something you are very happy with. Hint, hint: Bubiko and I are working on an app. (Many times in the past, I have given out free ebooks, only to be surprised that people did not even open them. Even my bestseller.)
If you use photos or text from this document, please credit accordingly:
Stephen Black from Bubiko’s Unusual Guide
Stephen Black from Bubiko’s Unusual Guide to AR
Have a nice day.
9/3/2019 4th edition
Anchor The three points used to describe the location, in the real world, where an AR object has been placed. The more one understands about 3D geometry, the more one is prepared for AR. See also billboarding, geopost.
Augmented Driving Will Feel Like, a SXSW presentation by Theo Calvin.
Avatar A digital creation used to represent a person, possibly resembling a human, possibly not.
Augmented Reality 'Augment' means to make something greater; to give it more power. The platform called 'augmented reality' is a network that adds digital information to real objects. Machines are needed to do this, and to see the results. Thus, augmented reality is the real world and the digital information added to it, as well as the machines enabling us to experience both at the same time. See R3. The digital information could be in the form of a 3D model of a real object made by a computer, like the furniture in AR apps made by Ikea, Wayfair and other companies. Pokemon Go and Snapchat are other examples of AR. However, the digital information could also be live or pre-recorded video, music, podcasts, medical imagery, industrial blueprints, text information or many other types of content. A phone, tablet, HMD (head mounted device) or eyewear is needed to see/hear/feel the digital content. AR will change everything, even more than radio, television, computers and mobile phones.
Autonomous Vehicles Vehicles capable of sensing their environment, making decisions and navigating without human input. AVs require the safest, most efficient AR data networks possible.
Billboarding The term used to describe a common procedure when positioning models in AR. Billboarding means that the front view of the 3D model faces the viewer. Most AR apps allow the user to rotate the model so that another side of the model is presented to the viewer.
Bubiko in the starting position; ie
Rabbit The AR masterpiece of 2019. Occlusion, light estimation, voice commands and more: Patched Reality and 6d allow you to learn three years' worth of ARness, and see the future.
Bubiko Foodtour AR's first superstar, a character created by Stephen Black and Sayuri Okayama. Bubiko is one of the results of a two-year food/AR research trip in Southeast Asia. Bubiko is a trailblazer who shares her AR experiences with the general public as well as with AR practitioners. Bubiko often forms partnerships, such as one with Green Bean Boy, a character made by Dominique Wu at Hummingbirdsday Studios (http://www.hummingbirdsday.com/). Other collaborations are planned with the Dundercats, by Six Cat Studios (https://www.sixcatstudios.com/journal/2018/4/24/the-dundercats), and creations by David Severn (http://david-severn.com/). The 3D version of Bubiko was created by Novaby (https://www.novaby.com/).
Charlie Fink's Metaverse - An AR Enabled Guide to AR & VR
Cloud See R3.
Computer vision Computers use lenses, radar and many kinds of sensors to learn about the world. These different ways of "seeing" are often combined with Artificial Intelligence (AI). The end result is that computers recognize objects, as well as the many kinds of information connected to them.
Convergence Author and Forbes columnist Charlie Fink tells the story of Augmented Reality (AR), a new technology that's already seeping into every smartphone and every workplace. AR's merger with new 5G and AI technologies will unleash a wave of innovation that will enable wearable, invisible, latency-free and ubiquitous computing. The book uses a kind of mobile AR called "marker AR" to allow readers to use their smartphones to bring pages to life, demonstrating with art and entertainment how the world, and every person, place, and thing, will be painted with data. https://www.amazon.com/Convergence-World-Will-Painted-Data/dp/0578460556/
Depth sensing Recording scenes in three dimensions. See volumetric video.
Events on Hi-Techs4Humans: Workshops, Seminars, Lectures, etc. (Facebook)
Edge computing There are advantages to processing data as close to the user as possible, especially in regards to the Internet of Things. This means, to a large extent, a decentralized system. This localized/decentralized approach is called edge computing.