Although I mention companies in my presentations, no one is paying me to have their product or service mentioned. I am thrilled to say that Novaby is helping me with Bubiko, and I have spent a little quality time with Tranzient and Scandy (listed below); other than that, I am just sharing info gathered online and at trade shows and events.
Stephen Black is a consultant/creative director/bestselling writer/director/producer (Fox, Cartoon Network, CNN, Fuji TV) who’s spoken about AR at MIT, Hong Kong PolyU, TechCrunch Shenzhen, PARIS (Paris Augmented Reality Influencers Show) and the Open Augmented Reality Cloud Symposium (Munich). Beach Road, his VR (360 video) movie, was featured at VR festivals in Singapore, Las Vegas and Brisbane.
Also an established visual artist, he has exhibited at art spaces worldwide, including the Singapore Biennale (with Michael Lee), Image Forum Experimental Film and Video Festival (Tokyo), and numerous venues in the legendary art scene of the Rivington School/Lower East Side of New York City.
Stephen is available for creating AR/VR/traditional video content, as well as for consulting, creative direction and public speaking opportunities. His latest book, Bubiko Foodtour’s Ununusual Guide to Augmented Reality (reviewed here), is available on Amazon.
PRO TIPS/NEWS STORIES
Geopose. As the URL is to the internet, geopose is to AR. Interested? This blog post is a must-read.
Geopose and GPS virtual sculpture (1st location: Paris) now in production by Vincent Trastour (Flamingo Studios/PARIS event) and Stephen Black.
What will the relationship between WebXR and geospatial data be?
It seems that WebXR cannot entirely ignore geospatial positioning, as geospatial content will be a major use case for mobile AR (at least eventually).
The web already has a geolocation API, but it is not sufficient for these purposes: it gives position but not orientation, is of very poor quality, and is not synchronized with the WebXR frame data. The deviceorientation API cannot be relied on for orientation: it is also of very poor quality, was never standardized (and may be removed from existing browsers), and is likewise not synchronized with the WebXR frame data.
ARKit offers the option to have its local coordinate system aligned with geospatial orientation (e.g., Y up, Z south, X east). This suggests one possible direction for handling geospatial data: have the WebXR API expose a property that says whether the coordinate frame can be aligned with geospatial EUS coordinates, and provide a way for the developer to request this alignment. Crude/simple geospatial positional alignment between the user and the local coordinates is easier if the local device coordinates are guaranteed to be aligned with EUS: each time a geolocation value is received, an estimate of the geospatial location of the origin of the local coordinate frame can be updated (weighted by the reported error values, etc.). The result won't be any more accurate than the geolocation API itself, but it can be stable, because content is located and rendered in the local coordinates rather than against the very slowly changing geolocation values.
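To make the estimation idea concrete, here is a rough JavaScript sketch of refining an estimate of the local origin's geospatial location from incoming geolocation fixes, assuming the local frame is EUS-aligned (x = east, z = south). The class name `OriginEstimator` and the weighting scheme are illustrative assumptions, not part of any WebXR or Geolocation API proposal:

```javascript
// Hypothetical sketch: maintain an estimate of the geospatial location of the
// local (EUS-aligned) coordinate origin, refined as geolocation fixes arrive.

const METERS_PER_DEG_LAT = 111320; // rough meters per degree of latitude

class OriginEstimator {
  constructor() {
    this.estimate = null; // { lat, lon } of the local origin
    this.weight = 0;      // accumulated confidence (sum of 1 / accuracy)
  }

  // fix: a Geolocation API position object; localPos: the device's position
  // in the EUS-aligned local frame (x = east, z = south), in meters.
  update(fix, localPos) {
    const latScale = METERS_PER_DEG_LAT;
    const lonScale =
      METERS_PER_DEG_LAT * Math.cos((fix.coords.latitude * Math.PI) / 180);
    // Back out where the local origin must be, given the device's local offset.
    const originLat = fix.coords.latitude + localPos.z / latScale;  // +z is south
    const originLon = fix.coords.longitude - localPos.x / lonScale; // +x is east
    const w = 1 / Math.max(fix.coords.accuracy, 1); // weight better fixes more
    if (!this.estimate) {
      this.estimate = { lat: originLat, lon: originLon };
      this.weight = w;
    } else {
      // Weighted running average: new fixes nudge the stable estimate.
      const t = w / (this.weight + w);
      this.estimate.lat += t * (originLat - this.estimate.lat);
      this.estimate.lon += t * (originLon - this.estimate.lon);
      this.weight += w;
    }
    return this.estimate;
  }
}
```

Content would then be anchored in the stable local coordinates, with only the origin's geospatial estimate drifting slowly as fixes accumulate.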
Hi, I am the founder of https://open-arcloud.org/. The TL;DR description of the AR cloud: a 1:1 digital map/twin of the physical world, stored in the cloud, that enables a shared programmable space attached directly to our physical surroundings, with multiuser AR and persistent, universally consistent placement of virtual assets in the real world.

One of the things we hope can help bring this to reality is a standard definition of geographical position and orientation that can be understood across platforms and applications: what we call "GeoPose". Each AR goggle, AR smartphone or piece of AR content could have a GeoPose at any given moment. Obtaining the GeoPose of an XR device could be achieved by matching sensor data from the device against the 1:1 map in the cloud, through something like a "GeoPose" cloud service. Once the device has its GeoPose, it can display geospatial data and assets that are anchored to a GeoPose. A bit more about that here: https://open-arcloud.org/standards

Please join our Slack channel and chime in on how you think a GeoPose should be defined: https://join.slack.com/t/open-arcloud/shared_invite/enQtMzE4MTc0MTY2NjYwLWIyN2E4YmYxOTA4MWNkZmI5OGQ4Mjg2MGYzNTc4OTRkN2RjZGUxOTc4YjJhOTQ0Nzc3OWMxYTA3ZDMxNGEzMGE

My hope, of course, is that WebXR will support using the device GeoPose and do the proper transforms for assets that are anchored to GeoPoses or described by geospatial coordinates.
TITLE: Geopose Standards Working Group Charter [OGC 19-028]
Author Name(s): Jan-Erik Vinje, Christine Perey, Scott Simmons
CATEGORY: SWG Charter Template
All physical world objects inherently have a geographically-anchored pose. Unfortunately, there is not a standard for universally expressing the pose in a manner which can be interpreted and used by modern computing platforms. The main purpose of this SWG will be to develop and propose a standard for geographically-anchored pose (geopose) with 6 degrees of freedom referenced to one or more standardized Coordinate Reference Systems (CRSs).
Definition of geopose
A real object in space has three components of translation: up and down (z), left and right (x), and forward and backward (y), plus three components of rotation: pitch, roll and yaw. Hence a real object has six degrees of freedom.
In computer graphics and robotics, the combination of position and orientation with 6 degrees of freedom is usually referred to as the object’s “pose.” A pose can be expressed relative to other objects and/or to the user. When a pose is defined relative to a geographical frame of reference or coordinate system, it is called a geographically-anchored pose, or geopose for short.
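As an illustration only, and not the encoding the working group below is chartered to define, a 6DoF geopose record might be sketched in JavaScript like this; the field names and the choice of a quaternion for orientation are assumptions:

```javascript
// Illustrative only: one possible shape for a 6DoF geopose record,
// pairing a geographic position with an orientation quaternion.
function makeGeoPose(latitude, longitude, altitude, quaternion) {
  return {
    position: { latitude, longitude, altitude }, // WGS 84 degrees / meters
    orientation: quaternion,                     // { x, y, z, w }, unit length
  };
}

// Pitch/roll/yaw can be packed into the quaternion; for example, a pure
// yaw (heading) rotation about the vertical axis:
function yawToQuaternion(yawDegrees) {
  const half = (yawDegrees * Math.PI) / 180 / 2;
  return { x: 0, y: Math.sin(half), z: 0, w: Math.cos(half) };
}
```

A quaternion avoids the gimbal-lock problems of storing pitch/roll/yaw directly, which is why graphics and robotics systems tend to prefer it for poses.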
Uses for geopose
An object with a geopose may be any real physical object, such as an AR display device (a proxy for a user’s eyes), a vehicle, a robot, or a park bench. It may also be a digital object, such as a BIM model, a computer game asset, the origin and orientation of the local coordinate system of an AR device, or a point-cloud dataset.
When the geoposes of both real and virtual objects capture the objects' current positions and orientations in a universally understood way, the interactions between objects, and between an object and its location, can be put to many uses. It is also important to note that many objects move with respect to a common frame of reference (and to one another): their positions and orientations can vary over time.
The ability to specify the geopose of any object enables us to represent any object in a universally agreed upon way for real world 3D spatial computing systems, such as those under development for autonomous vehicles or those used by augmented reality (AR) or 3D map solutions. In addition, the pose of any object can be encoded consistently in a digital representation of the physical world or any part therein (i.e., digital twin).
The proposed standard will provide an interoperable way to seamlessly express, record, and share the geopose of objects in an entirely consistent manner across different applications, users, devices, services, and platforms which adopt the standard or are able to translate/exchange the geopose into another CRS.
One example of the benefit of a universally consistent geopose is in a traffic situation. The same real-time geopose of a vehicle could be shared and displayed in different systems including:
- a traffic visualization on a screen in another car,
- shown directly at its physical location in the AR glasses of a pedestrian around the corner from the car, or
- in the real-time world model used by a delivery robot to help it navigate the world autonomously.
This site contains documentation for web developers using Design Tools to add AR content to a web application.
Argon-aframe — Lesson 7: Geolocation
Key features of augmented reality include (1) the ability to associate data objects with places in the world and (2) the ability to display those objects at those places.
The <ar-geopose> primitive is Argon-aframe’s way of locating objects in the physical space of the planet. It creates an entity with a referenceframe component, which defines the position and/or rotation of the entity by an LLA (longitude, latitude, altitude), so you can locate an object anywhere on Earth relatively accurately using this tag.
You can find the longitude and latitude through Google Maps or through a variety of mobile applications. Some apps will also give you the altitude of a location.
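A minimal sketch of how the primitive described above might appear in an A-Frame scene. The `lla` attribute name and value format here are assumptions based on the lesson's description and may differ from the current Argon-aframe API; the coordinates are an arbitrary spot in Atlanta:

```html
<a-scene>
  <!-- Anchor a box at a fixed longitude, latitude, altitude (LLA) -->
  <ar-geopose id="landmark" lla="-84.3963 33.7756 300">
    <a-box color="red" scale="10 10 10"></a-box>
  </ar-geopose>
</a-scene>
```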
Reference Implementations for Conversion Between Geopose and Cartesian Coordinate Systems
There will be a need throughout many parts of the technological ecosystem to convert object poses between a geospatial coordinate-system and local ones (typically the Cartesian x,y,z in metric as used in most AR SDKs). This repository is the first step to create reference libraries in different programming languages, available for everyone to use for free.
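To show the shape of the conversion such libraries provide, here is a minimal tangent-plane sketch in JavaScript: geodetic (lat/lon/alt) to a local east-north-up (ENU) frame in meters, and back. This flat-Earth approximation is only valid over small areas; real reference implementations go through ECEF coordinates and a proper ellipsoid model, and the function names here are illustrative:

```javascript
// Small-area approximation of geodetic <-> local ENU conversion.
const DEG = Math.PI / 180;
const EARTH_RADIUS = 6378137; // WGS 84 semi-major axis, meters

// origin and point: { lat, lon, alt } in degrees / meters.
function llaToEnu(origin, point) {
  const dLat = (point.lat - origin.lat) * DEG;
  const dLon = (point.lon - origin.lon) * DEG;
  return {
    east: dLon * EARTH_RADIUS * Math.cos(origin.lat * DEG),
    north: dLat * EARTH_RADIUS,
    up: (point.alt || 0) - (origin.alt || 0),
  };
}

function enuToLla(origin, enu) {
  return {
    lat: origin.lat + enu.north / EARTH_RADIUS / DEG,
    lon: origin.lon + enu.east / (EARTH_RADIUS * Math.cos(origin.lat * DEG)) / DEG,
    alt: (origin.alt || 0) + enu.up,
  };
}
```

An AR SDK's local x/y/z frame can then be mapped onto ENU (or EUS) by a fixed axis permutation, which is exactly the kind of bookkeeping a shared reference library would standardize.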
Would you like to support THE GREATEST AR TOUR IN HISTORY? Do you want to learn the latest news from AWE, Austin, Detroit, Turin, Munich, Shenzhen and Japan!?
Stephen Black and Bubiko Foodtour are about to go global: learning, educating and networking at some of the biggest AR events in the world. We just need a little financial fuel...we are movin’ and groovin’, but our app isn’t out yet! Startup blues!
As I become more involved with AR's potential impact on traffic, I see roads, vehicles, pedestrians and bicycles differently. The following documentary photographs are study aids, barely related to my previous street photography projects, such as the Bus Stopping series.
The goal is to prepare for the possibilities of AR cinema.
This post documents a simple test. The Bubiko model used for the Tech in the Tenderloin event was placed in two locations: a garage and an open piece of land. The objective: to gain an understanding of what a “stage” can be in AR. AR is a new medium; to use the established techniques of theatre, television and movies is to fail to grasp the uniqueness of AR. Performance art and dance provide clues.
Notes: Spark used
Occlusion not a concern at this time
Ambient light a constant
Size and scaling of Bubiko purposely varied
Bubiko was created by Stephen Black and Sayuri Okayama
iPad used; no manual controls nor color correction
ARBICLE - A piece of bicycle equipment with partial or total augmented reality functionality. A bicycle light with some AR functions is an arbicle. A device which acts as an "AR computer", exclusively managing AR functions for the bike, traffic and rider, is also an arbicle.
The word "arbicle" was coined by Stephen Black in 2019, in response to a perceived need for AR-compatible bicycle gear.Stephen Black is a writer, a visual artist and an AR/VR producer.
The chart above is a hypothetical overview; no such network or devices exist today. LIDAR is used as an example of a traffic monitoring system. In reality, LIDAR is heavy, delicate and extremely expensive. Unlike autonomous vehicles, bicycles must have an attentive human steering and braking; thus, LIDAR would be unnecessary. However, sensors detecting objects within a range of 50 feet around the bike would be advantageous. An arbicle featuring a rear-view camera with a feed into the visor or smartglasses is an obvious need. Arbicles specializing in emergencies would be another obvious need.
Also, if you are near San Francisco's Tenderloin District on June 28th, you are warmly invited to the world premiere of Bubiko Foodtour!
Bubiko will be at the Novaby booth at the Tech in the Tenderloin Fair! Her app will not be on display, as we haven't made it yet, but Bubiko will be hosting an interactive event that is sure to be a crowd-pleaser.
2002: After years of working in art, photography and network TV (Cartoon Network, Fuji TV, Fox, CNN+), SB became creative director at the 3D gamemaking company WalkerAsia.
Walker/Asia was also doing something like YouTube, three years before YouTube.
The CEO's unexpected demise results in inaction, despite great interest from the Singapore Ministry of Education, the Singapore Science Center, publishers in Hong Kong and universities in Japan.
SB, son of a book salesman, decides to write books and wait for VR/AR/spatial computing to become mainstream. Inspirations: his father, and the release of the Kindle, Amazon's pioneering ebook reader.
2016: Oculus is released. SB re-enters the world of spatial computing and begins discussing his VR ideas with receptive VCs.
Visa complications result in what will become the Bubiko Orwell Tour of Southeast Asia. SB pivots from VR to AR.
AR cinema becomes the focus, starring the following characters: the Dundercats (collaboration with Six Cat Studio), Secret Donut World (collaboration with David Severn) and Bubiko Foodtour.
AR presentations/workshops at Hong Kong PolyU, a presentation at Le Wagon and pitches to VCs at TechCrunch.
An invitation to participate in ARIA at MIT becomes the inspiration for the ARphabet Tour. Debuting at ARIA will be an ARKit/Unity collaboration with Dominique Wu/Hummingbirdsday Studios: a game featuring Green Bean Boy and co-starring Bubiko Foodtour.
The ARphabet Tour will also be a platform for Stephen Black to do readings and writing workshops.
The ARphabet Tour also aims to educate people about mango sticky rice, Bubiko's favorite food.