‘Apple Glass’ could use AR 3D mapping data sourced from an iPhone

The Apple Glass smartglasses, or an Apple-produced VR or AR headset, could take advantage of other hardware to determine where it is and how it moves in three-dimensional space, by sharing data about the local environment.

One of the problems VR and AR headset producers have to contend with is the need to know where the head-mounted hardware is in an environment. This is especially important for augmented reality applications: viewing systems that overlay a digital object on a real-world scene have to position the object in the user's view absolutely correctly, to sell the illusion that it exists in the real world.

Headsets have several ways to track their position, including accelerometers and outward-facing cameras that track nearby items in the environment.

However, in cases where there are multiple people with head-mounted vision systems, or multiple iPhone users with ARKit apps as currently observable, each device typically maintains its own coordinate-tracking system. Devices generally don't share coordinates with each other during use, and instead work independently.

In a patent granted by the US Patent and Trademark Office on Tuesday titled “Method and device for synchronizing augmented reality coordinate systems,” Apple suggests a way to get all devices to metaphorically sing from the same songbook and to work with each other, by synchronizing data.

The idea initially starts with one electronic device handing over multiple items in a “feature set” to a second device, such as an iPhone or a base station transmitting data to an AR headset. This feature set can include a reference location in 3D space, coordinates for each of the devices, and in some versions, a map of the 3D space and points of interest within it that could be used to place virtual