The Eye Control feature lets Windows 10 users navigate the operating system just by looking around the screen. It provides an on-screen mouse and keyboard, as well as text-to-speech, all controlled with a glance. The eye-tracking technology uses a camera mounted on a laptop or PC monitor to follow where the user is looking, so you can launch apps, type text, and perform other common tasks within the OS.
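For developers curious about how a gaze-driven interface like this behaves under the hood, the core interaction is usually "dwell to activate": the pointer follows the gaze point reported by the tracker, and resting your eyes on a target for a short time counts as a click. The sketch below illustrates that idea in Python; get_gaze_point(), the Rect target, and the timing values are hypothetical placeholders for whatever your eye-tracker SDK exposes, not part of any Microsoft or Tobii API.

```python
# Sketch of the "dwell to activate" idea behind gaze-driven UIs:
# if the gaze point stays inside a target region long enough, treat it as a click.
import time
from dataclasses import dataclass


@dataclass
class Rect:
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)


def get_gaze_point() -> tuple[float, float]:
    """Hypothetical: return the current (x, y) gaze position in screen pixels."""
    raise NotImplementedError("Replace with a call into your eye-tracker SDK")


def dwell_click(target: Rect, dwell_seconds: float = 0.8, poll_hz: float = 60.0) -> None:
    """Fire a 'click' once the gaze has rested on `target` for `dwell_seconds`."""
    dwell_start = None
    while True:
        gx, gy = get_gaze_point()
        if target.contains(gx, gy):
            if dwell_start is None:
                dwell_start = time.monotonic()  # gaze just entered the target
            elif time.monotonic() - dwell_start >= dwell_seconds:
                print(f"Activated target at ({target.x}, {target.y})")
                return
        else:
            dwell_start = None  # gaze left the target; reset the timer
        time.sleep(1.0 / poll_hz)
```

The dwell timeout is the key tuning knob: too short and users trigger targets accidentally just by reading them, too long and the interface feels sluggish.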
Eye Control originated at a Microsoft Hackathon back in 2014 as a project to help people with ALS steer and maneuver a wheelchair using only eye movement. When a Microsoft research team discovered the project, they saw the broader potential of eye tracking. More recently, Microsoft CEO Satya Nadella returned to the company's one-week Hackathon to share inspiring projects and announced that Windows 10 will provide built-in eye tracking support through its accessibility settings.
At the time of this writing, Eye Control is still in beta, and if you are a Windows Insider, you can test it out (provided you have the Tobii Eye Tracker 4C or a similar supported device). It will obviously take some time for more devices with this kind of technology to reach the market, but when they do, Windows 10 will become more accessible to a much larger number of users. Game and app developers are already looking for ways to build this new technology into their software for the people who need it. What’s your take on this latest Windows 10 announcement? Do you see potential for other types of systems or applications that could use this technology? Leave a comment below and let us know your thoughts.