Yesterday Google showcased some very interesting new features at its I/O conference. The company has now launched its voice assistant on the iPhone and is in the process of launching standalone virtual reality headsets. The announcement that stood out for us was the introduction of the new Google Lens feature.
The new feature combines the smartphone's camera with image recognition to display context-related information on screen.
The Google Lens app uses image recognition to identify objects appearing in your camera view in real time. The app analyzes your surroundings and displays relevant information on your screen.
Point your phone at a storefront and you'll see the restaurant's rating, menu, and instant reviews; aim it at a flower and it will identify the species; you can even pull up a band's music or videos by pointing Lens at a concert poster.
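At its core, a Lens-style feature is a recognize-then-annotate loop: classify what the camera sees, look up context for the recognized label, and overlay that context on the frame. The sketch below illustrates only the general pattern; the classifier and knowledge lookup are toy stubs (all names and data here are hypothetical), whereas a real system would run an on-device vision model and query a large knowledge base.

```python
# Minimal sketch of a recognize-then-annotate loop, with toy stand-ins
# for the vision model and the knowledge lookup (both hypothetical).

def classify_frame(frame):
    """Stub classifier: map a camera frame to a recognized label."""
    toy_model = {"frame_of_flower": "rose", "frame_of_storefront": "cafe"}
    return toy_model.get(frame, "unknown")

def lookup_context(label):
    """Stub knowledge lookup: return display info for a recognized label."""
    knowledge = {
        "rose": {"kind": "flower", "species": "Rosa rubiginosa"},
        "cafe": {"kind": "place", "rating": 4.5, "reviews": 132},
    }
    return knowledge.get(label, {})

def annotate(frame):
    """One pass of the loop: classify a frame, then attach overlay info."""
    label = classify_frame(frame)
    return {"label": label, "overlay": lookup_context(label)}

print(annotate("frame_of_storefront"))
```

In a production pipeline this loop runs per frame against live camera input, with the classification step handled by a trained neural network rather than a dictionary.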
Given the processing power of next-generation smartphones along with increasingly capable camera lenses, this could open up a whole new world of human-computer interaction and put augmented reality front and center in our smartphone experience.
As Apple prepares to launch the iPhone 8 and iOS 11, we wonder whether it has had ample time to integrate one of its key acquisitions from a few years back to provide similar experiences on the iPhone.
Apple acquired Metaio in May 2015. As with Apple's other small acquisitions, there was no fanfare as Metaio was absorbed into Apple's operations. Metaio was a startup with some history: it was launched in 2003 as an offshoot of a Volkswagen project. At the time of the acquisition, Metaio had close to 1,000 customers and 150,000 users, with most offerings focused on B2B use cases.
The company was one of the first small yet successful players in the world of augmented reality.
The platform worked very much like what you find in Google Lens: point your iOS camera at a physical object and the system would layer additional context-sensitive information over the object captured in the lens, in real time.
A classic use case of the platform can be seen in this video, where visitors to the Berlin Wall memorial point their iPads at the site and the app overlays historical footage on the screen.
The company also provided a demo that showcased use cases along the lines of what you see in Google Lens. For example, point your camera at a retail store and it will show you the various promotions and discounts the store is currently offering.
Metaio provided an easy, seamless platform that allowed users to create augmented reality applications within minutes. With the Metaio Creator, you could develop and deploy apps on macOS as well as iOS. The company also offered a cloud-based system called Metaio CVS, which allowed for cloud-based image matching.
Apple's acquisition of Metaio in 2015 was definitely a strategic play. When you combine what this small company had built with the large research and development operation Apple has in-house, it is hard not to get excited thinking about the potential offerings.
Apple has been working with computer vision, pattern matching, and machine learning for quite some time now. Peter Meier, co-founder of Metaio, still works in Apple's engineering department, along with fellow co-founder Thomas Alt, who is currently a member of Apple's strategic deals team.
Last year, when iOS 10 was showcased at WWDC, we were hoping to see some mixed reality capabilities. It didn't quite happen: much of last year's WWDC focused on the re-engineered iMessage capabilities along with the redesigned iOS interface.
We hope that this year's WWDC has more to offer, especially in this emerging area of image recognition. This is not just wishful thinking, since Apple has already been working on the technology through acquisitions and in-house research.
Now that Google Lens has been showcased, it's high time for Apple to put on its best show at the June WWDC fun fest.
We will be waiting!