Apple’s Quietly Making an Augmented Reality Headset
Andrew Wheeler posted on September 05, 2018

Magic Leap finally released a headset with a developer kit, and the reviews mostly express the same sentiment: the hype was so great that the headset can only be described as disappointing. Rony Abovitz recently said that Magic Leap was “arrogant” in its approach, but he stopped short of confirming what many journalists suspected: that Magic Leap had essentially misled the public about the capabilities of its technology. Hence the disappointment. (Image courtesy of Magic Leap.)

Apple never discusses the products it is working on, and maybe that’s the safer play when it comes to an infant technology with an uncertain future like augmented reality glasses. Apple is an extremely profitable company with a huge amount of cash at its disposal. As such, it can buy up smaller companies whose work accelerates its own research and development projects. During Steve Jobs’s two tenures at the helm, Apple earned a reputation for delivering mostly hit products, with a few misses now and again. Jobs was a product fanatic, while Tim Cook comes from the logistics world of supply chain management, where meeting deadlines is the standing order of the day. All this may mean is that Cook would be more likely than Jobs to hit a deadline with a slightly less-perfect product.

Why People Think Apple is Creating an Augmented Reality Headset

The short answer is because of its acquisitions and Tim Cook’s public comments expressing interest in augmented reality. Plus, it released ARKit, an API for developing augmented reality apps for the iPhone.

In 2013 Apple bought PrimeSense, the company that developed software for the Kinect 3D depth sensor. Now iPhone X users have Face ID as a result. Face ID would likely be a key component to a working augmented reality wearable from Apple. (Image courtesy of Microsoft.)
In 2015, Apple acquired Metaio for its augmented reality SDK, which is now the basis of ARKit, Apple’s augmented reality API for iOS 11 and the impending iOS 12. (Image courtesy of Apple.)

In 2017, Apple stepped up its purchases of companies with potentially viable augmented reality assets. In a single year, it acquired Vrvana for its augmented reality head-mounted display manufacturing capabilities, SensoMotoric Instruments for its eye-tracking software and hardware, Regaind for its computer vision technology and InVisage Technologies for its quantum dot-based image sensor manufacturing technologies.

And just recently, in 2018, Apple bought Akonia Holographics, acquiring its augmented reality display manufacturing capabilities. With Akonia, Apple gains access to more than 200 patents covering thin, transparent smart glass lenses with full color and a wide field of view.

CEO of Akonia in 2016 with the company’s HoloMirror display lens, transformed from years of holographic storage research and development. (Image courtesy of Dean Takahashi.)

Apple is the world’s most valuable company, with a market value that recently crossed the trillion-dollar mark. It can buy startups and spend years in research and development, blowing through hundreds of millions of dollars to develop its next hit product. The iPhone, Apple’s flagship product, is over ten years old, and the Apple Watch is nowhere near the same kind of cultural phenomenon.

But Can Apple’s Engineers Solve Motion-to-Photon Latency Issues that Plague Augmented Reality?

Motion-to-photon latency refers to the amount of time between a user’s movement and that movement being fully reflected on an augmented reality head-mounted display, like DAQRI Smart Glasses or Microsoft HoloLens.

The way augmented reality headsets are currently designed involves some pretty fantastic engineering maneuvers to decrease overall motion-to-photon latency. To consider the problem, start with the fact that in augmented reality, digital content is overlaid onto the real world by optical see-through (OST) systems. The real world seen through the optics has no latency at all, so any lag in the overlaid virtual content is far more pronounced than latency in the fully occluded virtual environment of VR. Add the fact that mobile AR headsets do not have the computing power of a tethered workstation or laptop, and it’s easy to understand why latency is a bigger engineering challenge for AR than for VR. It’s a complicated challenge, but limitations of the displays used by DAQRI and Microsoft are a key part of the problem.
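To make the problem concrete, the path from head movement to emitted photons can be sketched as a sum of pipeline stages. The stage names and timings below are illustrative assumptions for a rough budget, not measured figures from any actual headset:

```python
# Rough motion-to-photon budget for an OST AR headset.
# Stage timings are illustrative assumptions, not vendor figures.
stages_ms = {
    "IMU sampling": 1.0,            # time until the motion is sensed
    "sensor fusion / pose": 2.0,    # estimating the new head pose
    "render": 11.0,                 # roughly one frame at 90 Hz
    "compositor / late warp": 2.0,  # last-moment reprojection
    "display scan-out": 5.0,        # pixels actually emitting light
}

total_ms = sum(stages_ms.values())
print(f"estimated motion-to-photon latency: {total_ms:.1f} ms")
# prints: estimated motion-to-photon latency: 21.0 ms
```

Even under optimistic assumptions like these, the overlay trails the zero-latency real-world view by roughly 20 ms, which is why late-stage warping is applied as close to scan-out as possible.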

Liquid crystal on silicon (LCoS) displays are currently used in the Microsoft HoloLens and DAQRI Smart Glasses (pictured above) and are only capable of showing one color at a time. To show a full RGB (red, green, blue) frame, the LCoS display matrix first receives data for the red pixels, then the red LED is turned on to show the red image. The red LED is then turned off, and the same process is repeated for green and blue, depending on the RGB data. To refresh the display, complicated sequencing and “late-warping” techniques are needed to prevent unwanted artifacts and the color separation known as the “rainbow effect.” (Image courtesy of DAQRI.)
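In code, that field-sequential scheme looks roughly like the loop below, with the head pose re-sampled and the image re-warped immediately before each color field is scanned out. This is a minimal sketch; the function names are hypothetical placeholders, not any vendor’s API:

```python
import time

# Field-sequential color on an LCoS panel: one color field at a time,
# with a late warp re-applied per field so the three fields stay
# registered as the head moves (otherwise: the "rainbow effect").
FIELDS = ("red", "green", "blue")
FIELD_TIME_S = 1.0 / (60 * len(FIELDS))  # 60 Hz frames -> 180 Hz fields

def latest_head_pose():
    return (0.0, 0.0, 0.0)   # placeholder: freshest tracker pose

def warp(field_pixels, pose):
    return field_pixels      # placeholder: 2D late-stage reprojection

def scan_out(color, field_pixels):
    pass                     # placeholder: load panel, flash that LED

def present_frame(rgb_fields):
    for color in FIELDS:
        pose = latest_head_pose()   # re-sample right before display
        scan_out(color, warp(rgb_fields[color], pose))
        time.sleep(FIELD_TIME_S)    # hold the field with its LED on

present_frame({"red": [], "green": [], "blue": []})
```

The key design point is that warping happens per field, not per frame: if all three fields were warped with one pose, any head motion within the frame would smear the colors apart.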

It’s also important to remember that Apple’s augmented reality head-mounted display is likely going to be a mobile unit. And again, mobile units are hindered by a lack of processing power compared to systems tethered to powerful workstations and laptops. Striking the right balance between requirements like high throughput, low latency and low power consumption is tricky enough, but connecting a GPU to an LCoS display still requires proprietary display protocols. There are no standards that would make component integration easier. Perhaps recent Apple acquisition Akonia has made strides in this area, but one can only speculate.

Without standard GPU-to-LCoS protocols, improving motion-to-photon latency in mobile AR will require proprietary extensions between the GPU and the LCoS display.

But a closer look at Akonia reveals research on what could turn out to be an entirely new type of augmented reality display: one that transforms holography’s ability to store information into a better augmented reality lens, and one that may be capable of meeting Apple’s high standards for product design.

Apple’s Latest Acquisition, Akonia Holographics: Holographic Storage Transformed into HoloMirror Lenses for the Masses

The company whose acquisition most recently sent AR enthusiasts into a tizzy of speculation was founded with a mission of using light to record information far more efficiently.

Akonia developed a fascinating technique for recording bits: it starts with a single laser split into two beams. One is a data-carrying signal beam, and the other is a reference beam. A spatial light modulator then encodes data into the signal beam, converting electrical signals of 0s and 1s “into an optical checkerboard pattern of over one million light and dark bits.” When the signal beam and the reference beam intersect in its recording material, a hologram is formed. Data is stored.
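A toy model of the spatial light modulator step is shown below, laying a bit stream out as a two-dimensional page of light and dark pixels. This is purely illustrative; Akonia’s actual page format and encoding are not public:

```python
def bits_to_page(bits, width):
    """Lay a flat bit stream out as rows of an SLM 'page'."""
    return [bits[i:i + width] for i in range(0, len(bits), width)]

def render_page(rows):
    # '#' = light pixel (1), '.' = dark pixel (0)
    return "\n".join("".join("#" if b else "." for b in row) for row in rows)

data = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]
print(render_page(bits_to_page(data, width=4)))
# prints:
# #.##
# ..#.
# ###.
```

A real page would be on the order of a thousand pixels on a side (about one million bits), and the whole page is written into the medium in a single exposure when the two beams intersect.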

Much as a deep learning GPU uses its innate parallelism to process many small matrix calculations simultaneously, while a single-threaded CPU handles them one at a time, holographic storage achieves faster transfer rates than traditional storage: its laser records more than one million bits of data in a millisecond, where conventional storage records one bit at a time.
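The arithmetic behind that comparison is simple. Taking the figure quoted above at face value, one page of about one million bits per millisecond works out to roughly a gigabit per second per beam:

```python
bits_per_page = 1_000_000  # one SLM page written per exposure
page_time_s = 1e-3         # one exposure per millisecond

rate_bps = bits_per_page / page_time_s
print(f"raw rate: {rate_bps / 1e9:.0f} Gbit/s")
```

A serial recorder writing one bit per exposure at the same exposure time would need a million times longer for the same page; that page-versus-bit parallelism is the same advantage a GPU has over a single CPU thread on matrix math.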

It turns out that holography as a data storage technology has been promising for quite some time, but finding a storage medium was the real engineering challenge to overcome. So, what does this have to do with augmented reality lenses?

First, Akonia’s “two-chemistry” photopolymer holographic storage material came out of decades of research and development. It is a highly photosensitive material, suitable for AR lenses, that scores high in optical flatness, stability and manufacturability. That research also gave the company the ability to combine a large number of holograms in a single layer of polymer, for more control over the quality and size of the images projected toward a user’s eye. Akonia dubbed this holographic-storage-turned-augmented-reality-lens its “HoloMirror” technology.

Second, if Akonia has succeeded in getting its HoloMirror lens into a form that is easy to manufacture, inexpensive enough to sell at Apple’s standards, and capable of a large field of view (perhaps 60-90 degrees) with the full color spectrum and minimal motion-to-photon latency, vis-à-vis a proprietary GPU-to-lens connection that could become an industry standard, then Apple might indeed be on the verge of releasing a pair of thin iGlasses. Maybe within the next two years?

Now perhaps all someone at Apple has to do is create the augmented reality equivalent of the multi-touch keyboard that so sharply distinguished the iPhone from the BlackBerry, and the company will be ready to unleash what will likely be a new augmented reality product.

But Apple certainly doesn’t have to. It could scrap the whole thing if it felt like it. That’s what you can do when you’re worth a trillion dollars.




