Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a simple eye-tracking application for devices running Apple’s iOS operating system. Eye tracking on mobile devices has been widely researched but has long been an expensive proposition. This new research from Aditya Khosla and his team is expected to change that.
“The field is kind of stuck in this chicken-and-egg loop,” says Aditya Khosla, an MIT graduate student in electrical engineering and computer science and co-first author on the paper. “Since few people have the external devices, there’s no big incentive to develop applications for them. Since there are no applications, there’s no incentive for people to buy the devices. We thought we should break this circle and try to make an eye tracker that works on a single mobile device, using just your front-facing camera.”
Using the front-facing camera on an iPhone, the team was able to conduct numerous eye-tracking experiments and collect data. They built their eye tracker using machine learning, a technique in which computers learn to perform tasks by looking for patterns in large sets of training examples.
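The team's actual system is far more sophisticated, but the underlying idea of supervised learning can be illustrated with a toy example. The sketch below (a hypothetical simplification, not the MIT method, which trains a deep neural network on camera images) fits a linear calibration map from a made-up two-dimensional "pupil offset" feature to on-screen coordinates using ordinary least squares: the program learns the mapping purely from example pairs.

```python
# Toy supervised-learning sketch: learn a mapping from an eye feature
# to screen coordinates from training examples. This is a hypothetical
# illustration of the general technique, not the paper's model.

def solve3(A, y):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    n = 3
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_affine(samples):
    """samples: list of ((fx, fy), (sx, sy)) training pairs.
    Returns coefficient triples (a, b, c) for each screen axis so that
    screen_x ~= a*fx + b*fy + c (and likewise for screen_y),
    found by solving the least-squares normal equations."""
    X = [(fx, fy, 1.0) for (fx, fy), _ in samples]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    coeffs = []
    for dim in (0, 1):
        t = [target[dim] for _, target in samples]
        Xtt = [sum(X[k][i] * t[k] for k in range(len(X))) for i in range(3)]
        coeffs.append(solve3(XtX, Xtt))
    return coeffs

def predict(coeffs, fx, fy):
    """Apply the learned map to a new feature vector."""
    return tuple(a * fx + b * fy + c for a, b, c in coeffs)

# Synthetic training data from a known (hypothetical) calibration:
# screen_x = 100*fx + 50, screen_y = 80*fy + 30.
train = [((fx, fy), (100 * fx + 50, 80 * fy + 30))
         for fx in (-1, 0, 1) for fy in (-1, 0, 1)]
model = fit_affine(train)
print(predict(model, 0.5, -0.25))  # recovers (100.0, 10.0)
```

The point of the sketch is only that the gaze model is fit from example data rather than hand-coded; the real system replaces this linear map with a convolutional network and the synthetic pairs with crowdsourced recordings.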
Such advances in machine learning and neural networks could let Apple continue innovating on its iOS platform. Imagine the possibilities if the device could track your eye movements and positions and offer context-based information.
In addition to making existing applications of eye-tracking technology more accessible, the system could enable new computer interfaces or help detect signs of incipient neurological disease or mental illness.
The team is expected to present its findings at the Computer Vision and Pattern Recognition conference on June 28.