It was unexpected, suddenly having a brief discussion about cancer treatments with Professor Daniel Zajfman, the President of the Weizmann Institute of Science. Unplanned as well; Professor Zajfman was being interviewed right in front of me. What I happened to overhear was a phrase that went something like this: “Historically, cancer was perceived as one disease. Eventually, we understood that there are many kinds of cancer. Now, we realize that cancer, just like every living being, seems to be unique.”
At first, the presence of Professor Zajfman at VIEW was surprising. The VIEW website is a striking collection of the latest animated movies, CG creations and award-winning Hollywood technical talents. Why would a physicist whose career focuses on atomic and molecular physics be making a presentation? He seeks to understand the astrophysical conditions found in star-forming regions, and works to solve the riddle of star formation. VIEW, meanwhile, focuses on exploring the increasingly fluid boundary between real and digital worlds. Professor Zajfman listened attentively, gave me his card, and that was that. Unfortunately, I had no chance to attend his talk, which was entitled What is Why and Why is it Important?
A woman warrior riding upon a flying dragon the size of a 747.
A rat who cooks in a three star restaurant in Paris.
A mouse whose best friends are a dog and a duck.
The tortured soul of a giant robot.
A captured clownfish.
Animation is the art form in which ideas like these become the foundations of global economic powerhouses.
Besides toys, the animation industry sells food products, books, health products, clothing, banking services and more. At VIEW 2019, I was very fortunate to listen to, and interact with, some of the most successful men and women in contemporary animation.
It is difficult to say who impressed me the most, but Thomas Schelesny would be at the top of the list. His presentation about the dragons created for Game of Thrones was extremely impressive.
After his talk, he spoke with a few people, including students, about what it means to be an artist. Egos were discussed. Mr. Schelesny did not mention his Emmy. What he did speak about was the professionalism needed to keep things moving. The dragons of GoT were an extremely expensive undertaking that required seamless interaction between a number of companies all over the world; companies that, for the most part, rarely work with one another. The production schedule was dangerously short.
Mr. Schelesny exemplified the traits that are usually found in the best soldiers fighting the worst wars. He was a leader who stayed focused and thought of all members on his team.
Those watching Game of Thrones saw a beautifully choreographed and realistic battle of flying dragons. Those aware of what was happening behind the scenes, with Thomas and Image Engine Design, saw a very different battle; one equally thrilling, but real.
I met Jim Simons, the force behind Tranzient, at the November 2019 edition of the London Augmenting Reality Meetup. He did a live demo of the Tranzient VR music-making app. Not only did the app perform beautifully, Jim composed some great little beats; on the spot, of course.
The Tranzient app allows musicians to collaborate in VR, in real time. Tranzient is impressive on many levels. I wondered if Jim would consider adding Bubiko to his "band".
Jim was open to the idea!
What you see above is less than an hour's work. Now that we know Bubiko is functional within Tranzient, we can literally play around.
Tranzient is highly recommended: https://www.aliveintech.com/ . I only just met Jim, but he is genuinely enthusiastic about making music in VR. If you are a musician working in VR, you should definitely check out Tranzient. Jim is a bit mysterious about his musical past, but I am sure he was involved with some great projects. He sure makes Bubiko look good!
This version of Bubiko was made by Novaby, art directed by Stephen Black and Sayuri Okayama.
Tony, Antony Vitillo, is the AR/VR consultant who runs Skarred Ghost, one of the top blogs for VR and AR. I met Tony online in 2016, and we have shared our problems and joys since then. I was invited to speak in Munich and Paris, and hoped that somehow I could finally meet Tony in person, in Europe.
However, Tony was going to be in China until he returned to Italy, to give a presentation at VIEW, in Torino, Italy. I had never heard of VIEW. When I saw it, I immediately wanted to go. But my trip was on a budget far below “low budget”. Perhaps I could volunteer, like I would do at AWE, in Munich. Tony said he would ask the organizers, and that was how we left it.
I arrived in Turin on the morning of Sunday, October 21. I’d taken an overnight bus from Munich; the city felt both friendly and alien. Two officers in a polizia car asked me if I knew where I was going.
Though the officers described the bus and train options very well, I didn’t pay too much attention. Bubiko and I were going to walk.
And, walk we did.
The address turned out to be the administrative office for VIEW, not the venue. I was relieved when someone answered the intercom; and happy when Ricardo, a handsome young man, walked out. He attentively listened to my volunteering idea and seemed genuinely interested in my Bubiko demo. Like the police officers, Ricardo gave me directions to the venue, a place called OGR, which seemed to be near where the bus from Munich had dropped me off. I thanked Ricardo, slung my bag over my shoulder, and began rolling my little suitcase down the road.
A blonde young woman called out to me! She was the driver for OGR, and she was going to the venue. As she drove me to OGR, we talked. She lived outside of Turin, and told me how beautiful the golds and reds of Autumn were beneath the snow-capped peaks of the Alps.
We arrived at OGR, once a train station, now an art center.
I didn’t know what would happen inside, but I pretended that I did.
This post provides notes on the above PowerPoint. The actual presentation elaborated on the points much more than I do here. Please feel free to ask questions. Also, as always, I recommend joining the Open Augmented Reality Cloud. They are working to create open, interoperable standards to encourage the growth, use and understanding of AR.
And yes, another resource is Bubiko Foodtour's Unusual Guide to Augmented Reality, which is reviewed here.
The following presentation uses references from the State of the AR Cloud Report published May 28, 2019 by the Open AR Cloud. All rights reserved. http://stateofthearcloud.com/ With the exception of Novaby, the 3D model making company, I have no relationships with any companies mentioned in this presentation. This research has been self-funded.
The above link exemplifies a lot of the ideas about the future of AR in general, as well as the potential applications of AR and AVs (Autonomous Vehicles). The entire article is good, but the embedded video is excellent.
I cannot recommend the work of the OARC highly enough. Visit their site for the download of the first OARC Symposium Report.
8. www.blacksteps.tv At least three posts about 'geopose'
Yes, on this blog, I presently have a few posts about 'geopose'.
9. From correspondence with Jan-Erik Vinje, Managing Director of Open Augmented Reality Cloud:
○ If we are successful in creating such a standard we will in effect have created the equivalent of the URL for real world spatial computing. Allowing the geopose of both real and virtual objects to be universally captured, stored, shared and understood.
This comment by Jan-Erik, and those that follow, are self-explanatory. However, one week ago, at the second OARC Symposium, the definition of 'geopose' was discussed at length. In fact, the goal of the OARC for the next year is to focus upon creating a final definition of 'geopose', and to work towards a working model.
10. ○ I liken it to the URL because it can be seen as a link between the physical world and the digital world. ○ URL links information to more information. Geopose links the world to digital information and digital information to the world.
11. ● It can even be used to create a "persistent portal" between a physical space and a virtual space. Where people from across the world who are experiencing a virtual space can go through such a portal to experience a real space (by streaming realtime reality capture data to all those in the virtual reality space).
12. ● At the same time people in the real world can walk into the virtual reality space or see virtual objects, scenes and avatars of virtual reality users projected into their physical space using AR.
14. To create a GEOPOSE: 1. Physical space owner 2. AR data / SLAM (Simultaneous Localization and Mapping) 3. AR Cloud space 4. End user(s)
This slide features a photo of the Fox Theatre in Detroit. The front of the theatre is copyrighted. Anyone planning to use the front of the Fox would need to obtain permission to do so.
This situation would be true for many buildings, copyrighted or not.
15. How to find the location? (the need for interoperability): Google Maps, Mapillary, military maps, LADOT e-bike monitoring maps (for example), satellite-based maps monitoring floods, snow, ice, fire, crowds, etc., city/state/federal maps, company-owned routing maps, bus routes, ride-sharing maps, Superworld, emergency routes, bike lanes, construction, traffic signals
The companies and organizations listed above would all benefit from an open AR Cloud. The present state of the AR Cloud is sometimes compared to the early, nonstandardized gauges of railroad tracks. Once the tracks became standardized, society benefited.
Once there are interoperable standards for the AR Cloud, groups like those listed above will benefit, as will education, medicine, science and, indeed, all of society.
I arrived in New Orleans on the morning of October 1, 2019. The very air-conditioned overnight bus ride from Austin resulted in a bit of a raspy throat, but overall, I was excited. At 9AM, I was to meet with Charles Carriere, the President and CEO of Scandy, a company working with cutting edge 3D ideas, like scanning and volumetric video.
Charles was friendly and informative. Our conversation was stimulating. At one point he pulled out his iPhone, waved it around me, and then pushed a button. Voila: my first portrait in 3D styleee!
My takeaway: AR will be defined by 3D objects. Yes, I knew this before, but now I have internalized that fact. Scandy's work is hugely important.
The two previous posts on this blog are the result of trying to create a definition of geopose. I still have not come up with a twenty-word-or-less definition. However, the following is very helpful towards achieving that goal.
What you are about to read is a reply sent to me by Jan-Erik Vinje, the Managing Director of the Open Augmented Reality Cloud. Jan-Erik and I will both be giving presentations at the 2nd Open Augmented Reality Cloud Symposium in Munich on October 16th.
Download the First State of the AR Cloud report as well. You can find it here.
Jan-Erik Vinje's thoughts on the differences between GPS and Geopose:
GPS is a specific technology, mostly used for obtaining more or less accurate geospatial positions relative to a geospatial coordinate reference system. Currently, an ellipsoid that approximates the Earth is used as the reference for latitude, longitude and altitude. The ellipsoid is a bit crude, but it is normally off by less than 100 meters as a representation of where the mean ocean surface is or would have been.
GeoPose is intended to relate to the same type of geospatial reference, but adds geospatial orientation to the geospatial position. All real objects on our planet could in principle be said to have both a position and an orientation. GPS provides position. If you have a stream of GPS positions, you could derive a direction vector that is almost an orientation, but not quite. Imagine walking down the street with a smartphone and collecting a GPS location (lat, lng, alt) every second. You can create a path through space telling you the direction you have moved with the phone. What the GPS locations cannot give you is the orientation in which you held your device. Did you point it towards the ground, to the sky, or towards a wall across the street? And, importantly, how was your device rotated around that direction?
For that you would need another system, or set of systems, to provide you with your orientation. AR Cloud visual positioning systems can provide both the position and the orientation of your device. This is what we call a pose. If that position and orientation is in a geospatial frame of reference equivalent to the one used for GPS, one can call the pose a geopose.
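Jan-Erik's point about the missing half of a pose can be sketched in a few lines of Python (a hypothetical illustration of mine, not OARC code): two consecutive GPS fixes give you a direction of travel, but say nothing about which way the phone was pointed.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing (degrees clockwise from north) between two GPS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

# Two fixes one second apart while walking east along the equator:
heading = bearing_deg(0.0, 0.0, 0.0, 0.0001)  # ~90 degrees, due east
```

The derived heading tells us which way the phone moved, but not whether it was aimed at the ground, the sky, or a wall, nor how it was rotated around that direction; that remaining orientation is exactly what a visual positioning system adds to make a full pose.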
(SB: Hmmm, that is great, but the last three lines will require a bit of serious thought... I am not 100% clear that I understand. When I do, I hope to make educational drawings or, better, 3D models, in AR.)
Jan-Erik Vinje, the Managing Director of the Open Augmented Reality Cloud, responded to my previous post about a definition of geopose. His reply is reproduced with his permission. Thanks to Jan-Erik and all of the team involved in this project. The following exemplifies all of the diligent hard work going on to create standards for AR.
Thanks 🙂 If you'd like to update the geopose page: we have made some headway with our partner, the Open Geospatial Consortium (a standards development organization in the geospatial sector).
Open AR Cloud put together a draft for a Standards Working Group for geopose at the Open Geospatial Consortium. It has been out for public hearing for a while and is currently being voted on by members.
If the vote succeeds, an official Standards Working Group will be created in the Open Geospatial Consortium (OGC) to develop the geopose standard.
If we are successful in creating such a standard we will in effect have created the equivalent of the URL for real world spatial computing. Allowing the geopose of both real and virtual objects to be universally captured, stored, shared and understood.
I liken it to the URL because it can be seen as a link between the physical world and the digital world.
URL links information to more information. Geopose links the world to digital information and digital information to the world.
It can even be used to create a "persistent portal" between a physical space and a virtual space. Where people from across the world who are experiencing a virtual space can go through such a portal to experience a real space (by streaming realtime reality capture data to all those in the virtual reality space). At the same time people in the real world can walk into the virtual reality space or see virtual objects, scenes and avatars of virtual reality users projected into their physical space using AR.
Stephen Black: There is so much information here, that I need to review it carefully. It will be a wonderful challenge to come up with a definition of 'geopose' that is 20 words or less!
What will the relationship between WebXR and geospatial data be?
It seems that WebXR cannot entirely ignore geospatial positioning, as geospatial content will be a major use case for mobile AR (at least eventually).
The web already has a geolocation API, but it is not sufficient for these purposes: it gives position but not orientation, is of very poor quality, and is not synchronized with the WebXR frame data. The deviceorientation API cannot be relied on for orientation: it is of very poor quality, was never standardized (and is potentially going to be removed from existing browsers), and is also not synchronized with the WebXR frame data.
ARKit offers the option to have its local coordinate system aligned with geospatial orientation (e.g., Y up, Z south, X east). This provides a possible direction for how geospatial might be handled: have the WebXR API expose a property that says if the coordinate frame can be aligned with geospatial EUS coordinates, and provide a way for the developer to request this. Crude/simple geospatial positional alignment (between the user and the local coordinates) is easier if you are guaranteed to have the local device coordinates aligned with EUS: each time a geospatial value is received, an estimate of the geospatial location of the origin of the local coordinate frame can be updated (based on error values, etc.). It won't be any better than the error of the geolocation API, but it can be stable (because the local coordinates are used for locating and rendering content, not the very-slowly-changing geolocation values).
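The origin-estimation idea in that last paragraph can be sketched roughly in Python (my own illustration, under the stated EUS assumption, using a crude flat-earth meters-per-degree approximation; the function name is hypothetical):

```python
import math

M_PER_DEG_LAT = 111_320.0  # rough meters per degree of latitude

def estimate_origin(fix_lat, fix_lon, local_x_east, local_z_south):
    """Estimate the geodetic location of the local frame's origin, given a
    geolocation fix and the device's current position in an EUS-aligned
    local frame (X east, Y up, Z south)."""
    meters_north = -local_z_south  # +Z points south in EUS
    dlat = meters_north / M_PER_DEG_LAT
    dlon = local_x_east / (M_PER_DEG_LAT * math.cos(math.radians(fix_lat)))
    # The origin is the fix minus the device's offset from the origin.
    return fix_lat - dlat, fix_lon - dlon

# Device has walked ~111.32 m north of the origin (z = -111.32) when a
# fix of latitude 0.001 arrives; the origin estimate lands back at 0.
origin_lat, origin_lon = estimate_origin(0.001, 0.0, 0.0, -111.32)
```

In practice each new fix would update a running, error-weighted estimate of this origin, while rendering continues against the stable local coordinates.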
Hi, I am the founder of https://open-arcloud.org/ . The TL;DR AR Cloud description: a 1:1 digital map/twin of the physical world, stored in the cloud, that enables a shared programmable space attached directly to our physical surroundings; this enables multiuser AR and persistent, universally consistent placement of virtual assets in the real world.

One of the things we hope can help bring this to reality is a standard definition of geographical position and orientation that can be understood across platforms and applications: what we call "GeoPose". Each AR goggle, AR smartphone or piece of AR content could have a GeoPose at any given moment. Obtaining the GeoPose of an XR device could be achieved by matching sensor data from the device with the 1:1 map in the cloud, through something like a "GeoPose" cloud service. Once the device has its GeoPose, it can display geospatial data and assets that are anchored to a GeoPose. A bit more about that here: https://open-arcloud.org/standards

Please join our Slack channel and chime in on how you think a GeoPose should be defined: https://join.slack.com/t/open-arcloud/shared_invite/enQtMzE4MTc0MTY2NjYwLWIyN2E4YmYxOTA4MWNkZmI5OGQ4Mjg2MGYzNTc4OTRkN2RjZGUxOTc4YjJhOTQ0Nzc3OWMxYTA3ZDMxNGEzMGE

My hope, of course, is that WebXR will support using device GeoPose and do the proper transforms for assets that are anchored to GeoPoses, or assets that are described by geospatial coordinates.
TITLE: Geopose Standards Working Group Charter [OGC 19-028]
Author Name (s): Jan-Erik Vinje, Christine Perey, Scott Simmons
CATEGORY: SWG Charter Template
All physical world objects inherently have a geographically-anchored pose. Unfortunately, there is not a standard for universally expressing the pose in a manner which can be interpreted and used by modern computing platforms. The main purpose of this SWG will be to develop and propose a standard for geographically-anchored pose (geopose) with 6 degrees of freedom referenced to one or more standardized Coordinate Reference Systems (CRSs).
Definition of geopose
A real object in space can have three components of translation, up and down (z), left and right (x), and forward and backward (y), and three components of rotation: pitch, roll and yaw. Hence a real object has six degrees of freedom.
The combination of position and orientation with 6 degrees of freedom of objects in computer graphics and robotics are usually referred to as the object’s “pose.” Pose can be expressed as being in relation to other objects and/or to the user. When a pose is defined relative to a geographical frame of reference or coordinate system, it will be called a geographically-anchored pose, or geopose for short.
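As a sketch, the charter's definition could be modeled as a simple data structure. This is purely illustrative; the field names and the quaternion representation are my assumptions, not the OGC standard:

```python
from dataclasses import dataclass

@dataclass
class GeoPose:
    """A 6-DoF geographically-anchored pose: a geodetic position plus an
    orientation, here expressed as a unit quaternion (x, y, z, w)."""
    lat: float  # degrees
    lon: float  # degrees
    alt: float  # meters above the reference ellipsoid
    qx: float
    qy: float
    qz: float
    qw: float

# A park bench somewhere in Turin, with identity orientation for simplicity:
bench = GeoPose(lat=45.0628, lon=7.6787, alt=240.0,
                qx=0.0, qy=0.0, qz=0.0, qw=1.0)
```

The same structure could equally describe an AR headset, a vehicle, or the origin of a device's local coordinate system, which is exactly the generality the charter is after.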
Uses for geopose
An object with a geopose may be any real physical object. This includes objects such as an AR display device (a proxy for a user's eyes), a vehicle, a robot, or a park bench. It may also be a digital object, like a BIM model, a computer game asset, the origin and orientation of the local coordinate system of an AR device, or a point-cloud dataset.
When the geopose of both real and virtual objects includes the current position and orientation of the objects in a way that is universally understood, the interactions between objects, and between an object and its location, can be put to many uses. It is also important to note that many objects move with respect to a common frame of reference (and one another). Their positions and orientations can vary over time.
The ability to specify the geopose of any object enables us to represent any object in a universally agreed upon way for real world 3D spatial computing systems, such as those under development for autonomous vehicles or those used by augmented reality (AR) or 3D map solutions. In addition, the pose of any object can be encoded consistently in a digital representation of the physical world or any part therein (i.e., digital twin).
The proposed standard will provide an interoperable way to seamlessly express, record, and share the geopose of objects in an entirely consistent manner across different applications, users, devices, services, and platforms which adopt the standard or are able to translate/exchange the geopose into another CRS.
One example of the benefit of a universally consistent geopose is in a traffic situation. The same real-time geopose of a vehicle could be shared and displayed in different systems including:
- a traffic visualization on a screen in another car,
- shown directly at its physical location in the AR glasses of a pedestrian around the corner from the car, or
- in the real time world model used by a delivery robot to help it navigate the world autonomously.
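The "same real-time geopose shared across systems" idea might look like this minimal wire format (a hypothetical sketch; the JSON field names are mine, not a standard): one vehicle geopose is serialized once and consumed unchanged by each subscriber.

```python
import json

# One vehicle's geopose: geodetic position plus quaternion orientation.
vehicle_geopose = {
    "position": {"lat": 45.0703, "lon": 7.6869, "alt": 239.0},
    "orientation": {"qx": 0.0, "qy": 0.7071, "qz": 0.0, "qw": 0.7071},
    "timestamp": "2019-10-21T09:00:00Z",
}

message = json.dumps(vehicle_geopose)

# Every subscriber parses the same bytes back into the same pose:
for subscriber in ("car_dashboard", "ar_glasses", "delivery_robot"):
    pose = json.loads(message)
    assert pose == vehicle_geopose
```

Interoperability here means each consumer only needs to agree on the encoding and the coordinate reference system, not on each other's internal formats.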
This site contains documentation for web developers using Design Tools to add AR content to a web application.
Argon-aframe — Lesson 7: Geolocation
Key features of augmented reality include (1) the ability to associate data objects with places in the world, and (2) the display of those objects at those places.
The <ar-geopose> primitive is Argon-aframe’s way of locating objects in physical space (of the planet). The <ar-geopose> primitive creates an entity with a referenceframe component. This component defines the position and/or rotation of the entity by an LLA (longitude, latitude, altitude), so you can locate your object anywhere on the earth relatively accurately by using this tag.
You can find the longitude and latitude through Google Maps or through a variety of mobile applications. Some apps will also give you the altitude of a location.
Reference Implementations for Conversion Between Geopose and Cartesian Coordinate Systems
There will be a need throughout many parts of the technological ecosystem to convert object poses between a geospatial coordinate-system and local ones (typically the Cartesian x,y,z in metric as used in most AR SDKs). This repository is the first step to create reference libraries in different programming languages, available for everyone to use for free.
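The classic first step in such a conversion is geodetic coordinates (latitude, longitude, altitude) to Earth-Centered Earth-Fixed (ECEF) Cartesian coordinates on the WGS84 ellipsoid. Here is a minimal Python sketch of that step (my own illustration, not code from the repository mentioned above):

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0             # semi-major axis (meters)
F = 1 / 298.257223563     # flattening
E2 = F * (2 - F)          # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert a geodetic position to ECEF Cartesian coordinates (meters),
    the usual intermediate frame before transforming into a local tangent
    frame such as an AR SDK's x, y, z."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + alt_m) * math.sin(lat)
    return x, y, z

# Sanity check: on the equator at the prime meridian, ECEF X equals the
# semi-major axis and Y and Z are zero.
x, y, z = geodetic_to_ecef(0.0, 0.0, 0.0)
```

A full geopose conversion would also rotate the orientation quaternion between frames, but the positional half above is the part every implementation shares.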
Would you like to support THE GREATEST AR TOUR IN HISTORY? Do you want to learn the latest news from AWE, Austin, Japan, Detroit, Turin, Munich and Shenzhen!?
Stephen Black and Bubiko Foodtour are about to go global: learning, educating and networking at some of the biggest AR events in the world. We just need a little financial fuel...we are movin’ and groovin’, but our app isn’t out yet! Startup blues!
The following video is a rehearsal. The starting point is a presentation I did in Detroit in July 2019. That presentation was about Autonomous Vehicles and Augmented Reality. Eventually I will have about eight videos based on that Detroit presentation. You can find the Powerpoint for that here.
It will be obvious that I am not the smoothest person to ever be on a stage. I am a photographer, a cinematographer, an artist and a writer. I am used to having time to think about what I am expressing. As I will soon be making presentations in Europe, I am trying to get through the learning curve of being onstage, asap.
In this presentation, I often refer to the Open Augmented Reality Cloud. If you are serious about AR, you must learn about their work, and download the free State of the Augmented Reality Cloud Report which is available on their website. (I am honored and humbled to say that I will be speaking about Bubiko at the OARC Symposium in Munich on October 16!)
The ARShow produced this informative podcast with Jan-Erik Vinje, Christine Perey, Jason Fox and Colin Steinman, the key contributors to the OARC.