In its pursuit of keeping things simple yet cutting edge, Apple has been exploring ways to make a few camera and image features more innovative and easy to use.
In a patent granted today (No. 9,423,873), Apple explores the possibility of rendering dynamic three-dimensional imagery on a two-dimensional user interface.
Portable electronic devices, generally designed for use by a single person, typically display two-dimensional imagery on the user interface screen.
This patent provides methods to render three-dimensional images on the user interface screen. To make these images more engaging, it explores techniques that would allow a user to “look around” an image, making it far more interesting to view.
So if you were looking at a 3D-rendered image of a sculpture on your iPhone and wanted to see the details on its right side, you would simply tilt your head to the right; the image would adjust to your movement and reveal that side of the sculpture.
The disclosure capitalizes on this natural behavior: when three-dimensional-appearing images are displayed, their presentation is governed by the position and orientation of the user’s head and/or eyes relative to the user interface screen.
According to the patent, accelerometers or similar motion sensors can be incorporated into handheld versions of the device to help track the relative position of the user’s head and eyes as the device is moved about.
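The idea of coupling the rendered view to head position can be sketched with a little geometry. The snippet below is a minimal illustration, not the patent's actual algorithm: the function names (`parallax_angles`, `camera_offset`) and the coordinate conventions are assumptions for the sake of example, assuming the tracker reports the head's position in meters relative to the screen's center.

```python
import math

def parallax_angles(head_x, head_y, head_z):
    """Yaw and pitch (radians) of the head relative to the screen
    normal, from the head's position in screen-centered coordinates
    (meters). A renderer could rotate its virtual camera by these
    angles so the user appears to 'look around' the 3D object."""
    yaw = math.atan2(head_x, head_z)
    pitch = math.atan2(head_y, head_z)
    return yaw, pitch

def camera_offset(head_x, head_y, head_z, scene_depth=0.5):
    """Translate the virtual camera opposite the head's offset,
    scaled by the scene's apparent depth behind the screen -- a
    common approximation for head-coupled parallax on a flat
    display. Returns an (x, y) camera translation in meters."""
    scale = scene_depth / head_z
    return (-head_x * scale, -head_y * scale)
```

With the head centered in front of the screen, both angles are zero and the camera stays put; moving the head to the right yields a positive yaw, swinging the rendered view to expose the object's right side, which is the behavior the sculpture example describes.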
The techniques described in this patent point to an early attempt to showcase mixed-reality applications.
What stands out in this disclosure is that it is not another example of showing 3D images on your iPhone through special 3D glasses. Rather, it explores ways to render a 3D image onto a 2D surface and let the user interact with it through simple head movements.
We have already seen emerging iPhone/iOS applications that track user eye movement without the need for any special gear.
A feature that conveys the depth and other characteristics of a 3D object in a 2D rendering, and that lets the user interact with the image, could have broader implications, particularly if applied to online retail and similar use cases.
Obsessed with tech since the early arrival of A/UX on Apple, Sudz (SK) is responsible for the editorial direction of AppleToolBox. He is based out of Los Angeles, CA.
Sudz specializes in covering all things macOS, having reviewed dozens of OS X and macOS developments over the years.
In a former life, Sudz helped Fortune 100 companies with their technology and business transformation aspirations.