13 February 2020
Augmented Reality (AR) is one of the next big things in the technology world, which is why major tech giants are competing with each other to lead the market. Google is one of those competitors, and it has been working on AR through its own platform, ARCore. ARCore debuted in early 2018, and now Google has announced the ARCore Depth API to further improve AR experiences.
Google says the ARCore Depth API will allow developers to create depth maps using a single RGB camera instead of relying on multiple cameras or a dedicated depth sensor. The API is based on Google's depth-from-motion algorithms and creates depth maps by "taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel".
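The geometric idea behind estimating distance from images taken at different positions can be sketched with basic triangulation. The toy function below is only an illustration of that principle, not Google's actual depth-from-motion pipeline; the focal length, baseline, and disparity values are made-up examples.

```python
# Toy illustration of depth from two views (NOT Google's proprietary
# depth-from-motion algorithm): with a known camera focal length and a
# known baseline between two capture positions, the pixel shift
# (disparity) of a point between the two images converts to depth
# via triangulation: depth = focal_length * baseline / disparity.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Return the depth in metres implied by one pixel's disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A point that shifts 20 px between two frames taken 5 cm apart,
# with a 500 px focal length, sits 1.25 m from the camera.
print(depth_from_disparity(20, 500, 0.05))  # → 1.25
```

Nearby points shift more between frames than distant ones, which is why the phone's own motion can stand in for a second camera.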
This will make virtual objects look more realistic in their surroundings. It will eventually make AR experiences even more immersive, as it also improves occlusion. Occlusion is the ability of a digital object to blend with real-world objects, correctly appearing in front of or behind them. Here is a preview of how the ARCore Depth API improves this significantly.
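Depth-based occlusion can be sketched as a per-pixel comparison: the renderer draws a virtual object's pixel only where the object is closer to the camera than the real scene at that pixel. This is an assumed, simplified rendition of the idea, not ARCore's actual rendering code.

```python
# Sketch of depth-based occlusion (an assumed approach, not ARCore's
# exact implementation): a virtual object's pixel is visible only
# where the object is closer to the camera than the real-world depth
# map says the scene is at that pixel.
import numpy as np

def occlusion_mask(real_depth, virtual_depth):
    """Boolean mask: True where the virtual object should be drawn."""
    return virtual_depth < real_depth

real = np.array([[2.0, 2.0],
                 [0.5, 2.0]])   # scene depth in metres (e.g. a table leg at 0.5 m)
virt = np.full((2, 2), 1.0)    # a virtual object placed 1 m from the camera
print(occlusion_mask(real, virt))
# The pixel where the real object sits at 0.5 m hides the virtual one.
```

Without a depth map, AR apps typically have to draw virtual objects on top of everything, which is exactly the artifact this API addresses.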
The API is further said to be helpful in a wide range of other situations, including AR-based games, because it enables more accurate path planning, realistic physics, and surface interaction, among other benefits. In other words, virtual objects will interact with real-world objects in much the same way a real object would.
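One way a depth map could feed into path planning can be sketched as follows. This is purely an illustrative use of depth data, not an ARCore API: a virtual character moving across one row of the image at a fixed depth is blocked wherever the real scene is closer to the camera than the character.

```python
# Hedged sketch of depth-map-assisted path planning (an illustrative
# use, not an ARCore function): given scene depths along one image row,
# find the columns where real geometry would block a virtual character
# travelling at a fixed depth.

def blocked_columns(scene_depth_row, character_depth_m):
    """Column indices where the real scene is in front of the character."""
    return [i for i, d in enumerate(scene_depth_row) if d < character_depth_m]

# Made-up scene: open space at 3 m, a chair leg at 0.8 m, a box at 1.2 m.
row = [3.0, 3.0, 0.8, 3.0, 1.2]
print(blocked_columns(row, 1.5))  # → [2, 4]
```

A game could route a character around those columns, or let a physics engine treat them as collision geometry, which is the kind of "realistic physics and surface interaction" Google describes.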
The best part of the ARCore Depth API is that it works without requiring dedicated sensors or hardware, though it will only get better on devices that have them. Google is now inviting collaborators to take the API for a spin and see what use cases they can come up with.