Preview for Google’s ARCore for Android Released

Rebutting Apple’s ARKit: open system versus closed system.

The debate has raged throughout the history of personal computing: open system versus closed system. Apple’s Steve Jobs was a proponent of the closed system, which gave the company greater control over its ecosystem. Steve Wozniak, the engineer who created the Apple I and Apple II, was against curtailing hobbyists’ options to add on hardware.

Bill Gates, founder of the first giant software company, Microsoft, was a proponent of open-system architecture, which allowed him to develop and standardize an operating system that ran on off-the-shelf computer hardware and supported third-party applications, eventually evolving into Windows.

The original Android OS was spearheaded by Andy Rubin, who worked for Apple as a manufacturing engineer in the 1980s and for Google after it acquired Android in 2005. When the Android OS hit the mobile device market, Jobs famously declared “thermonuclear war” on Android and Google.

Built with Unity and ARCore, Morph Face is one of many experiments developers have created. (Image courtesy of Google.)


Google’s Preview of ARCore

In the months leading up to the launch of the new iPhone 8, arriving 10 years after the first iPhone debuted in 2007, Apple released ARKit so its legion of developers could build augmented reality software for the new device.

This week, Google released a preview for its ARCore SDK, emphasizing three major capabilities:

  1. Light estimation: ARCore estimates the lighting of a given environment, giving developers the ability to match the behavior of light on digital objects overlaid in that environment.
  2. Horizontal surface detection: Many AR demonstrations show digital objects overlaid on a desktop, table, podium or floor. ARCore detects clusters of feature points and lets developers place digital 3D models on these detected horizontal planes.
  3. Motion tracking: ARCore leverages input from the smartphone’s camera to detect feature points in the room. By combining observed feature points with Inertial Measurement Unit (IMU) sensor data, it tracks the spatial position and orientation of the phone, allowing virtual models to remain in place relative to the user.
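The motion-tracking idea in point 3 can be illustrated with a simple complementary filter: the gyroscope gives fast but drifting orientation updates, while the camera’s feature points provide a slower, drift-free correction. This is a conceptual sketch only, not the ARCore API; the class and parameter names here are illustrative, and production systems use a full 6-DoF visual-inertial filter rather than a single heading angle.

```java
// Conceptual sketch: fusing gyroscope-integrated heading with a drift-free
// visual heading estimate (e.g. derived from camera feature points).
// Not the ARCore API; all names are illustrative.
public class HeadingFuser {
    private double heading;      // estimated phone yaw, in degrees
    private final double alpha;  // weight on the gyro path, e.g. 0.98

    public HeadingFuser(double initialHeading, double alpha) {
        this.heading = initialHeading;
        this.alpha = alpha;
    }

    // gyroRateDegPerSec: angular rate from the IMU
    // dtSec: time since the last update
    // visualHeading: absolute heading recovered from tracked feature points
    public double update(double gyroRateDegPerSec, double dtSec, double visualHeading) {
        // Integrate the gyro: responsive, but accumulates drift over time.
        double gyroHeading = heading + gyroRateDegPerSec * dtSec;
        // Blend in the visual estimate to pull the drift back toward truth.
        heading = alpha * gyroHeading + (1 - alpha) * visualHeading;
        return heading;
    }
}
```

With `alpha` near 1, short-term motion comes almost entirely from the IMU, while the visual term slowly corrects any accumulated gyro drift, which is why the fused pose stays stable even when the camera briefly loses features.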

It seems like Google might be combining ARCore with Project Tango, since it gave a preview of Tango’s Visual Positioning Service (VPS), an inside-out tracking system hooked into Google Maps, this past May.

There is no word yet on how long the “preview” will last, but it is surely a rebuttal to Apple’s ARKit and an attempted distraction from Apple’s big iPhone release coming Sept. 12. To see more ARCore-based creations, click here.