Tuesday, October 5, 2010

PC Controlled By User's Eyes | Science and Technology

What if you could dramatically speed up your computing by moving your cursor exclusively with your eyes? A company called Tobii is transforming the way we interact with our screens. By using your eyes instead of your mouse, you can select whatever you're looking at almost instantaneously. Not only does this speed up a tremendous number of computing tasks, but it has the potential to reduce repetitive stress injuries.

But Does It Really Work?

Seeing is believing, so I had to test it for myself. I have to admit I was a skeptic. Most gesture and touch controls I've tried in the past at the Consumer Electronics Show have been a little clunky, so I figured something as sophisticated as gaze recognition wouldn't work very well. Boy, was I wrong.

After a one-time calibration that took all of 10 seconds, I started looking around the screen. I expected the cursor to go crazy as I scanned from side to side, but the cursor never moved. Instead, as Tobii CEO and co-founder Henrik Eskilsson explained to me, the eye tracking only registers when you hit a function key on the keyboard, which they had outfitted with a blue sticker. As soon as I found a program I wanted to open, I looked at it on the screen and hit the blue button. Boom: the application opened. No mouse, just eye control.

As I zipped around the computer, I very quickly got the hang of it: look, blue button. Find an icon, stare at it, hit the blue button. Hit the Windows key to go back to the Windows 8 Home screen of tiles, look at something new, hit the blue button. You get the idea.

Navigating the operating system was pretty easy, so then I dug into a web page. Look at a link, hit the blue button, and the link opens. What surprised me was that when I read a long article, my gaze didn't move the page or the cursor at all until I reached the lowest line of text on the page. Just as I was about to reach for the mouse to scroll down, the web page automatically scrolled.
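The "look, then press the blue button" interaction can be sketched in a few lines. This is a minimal illustration of the model, not Tobii's actual API: gaze_point() and click_at() are hypothetical stand-ins for an eye-tracker SDK call and an OS input event.

```python
# Hypothetical sketch of gaze-plus-trigger-key selection.
# The cursor never follows the eyes; only when the dedicated key is
# pressed does the current gaze position become a click target.

def gaze_point():
    # Stand-in for polling a real eye tracker; returns a fixed
    # screen coordinate for illustration.
    return (640, 360)

clicks = []

def click_at(x, y):
    # Stand-in for dispatching a click/activation event to the OS.
    clicks.append((x, y))

def on_trigger_key():
    # Fired when the user hits the "blue button": sample the gaze
    # and activate whatever is under it.
    x, y = gaze_point()
    click_at(x, y)

on_trigger_key()
print(clicks)  # [(640, 360)]
```

The key design point is that gaze only *targets*; a deliberate key press *confirms*, which is why the cursor never jitters as the eyes scan the screen.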
“How’d it know to do that?” I asked. Henrik explained that the tracker can tell you are reading from the motion of your eyes; as your gaze nears the lower edge of the page, it is set to scroll automatically.

I used the calculator, doing the arithmetic entirely by gaze: look at 7, hit the blue button; look at the + sign, hit the blue button; look at 8, hit the blue button; look at the = sign, hit the blue button; and 15 appeared in the result field. It sounds laborious, but it’s much faster than mousing through the numbers. It felt like keyboard shortcuts without having to memorize the shortcut keys: you just look at what you want and keep hitting the same blue button.

Broad Applications

Tobii’s eye-tracking technology was initially designed as a research tool and as an assistive communication device for people with disabilities. Someone without the ability to speak, for example, could communicate by looking at sounds or words on a screen. Now the company is venturing into broad consumer applications. The first-generation product that I tried, the Tobii Rex, works only with Windows 8 machines and costs about $1,000 for a USB add-on. But as with most new technologies, costs are sure to come down quickly with mass adoption – and I see the potential. I tried a variety of computing tasks – reading e-mail, mapping, using a calculator, and gaming (blowing up Asteroids without a mouse or keyboard) – and was impressed by all of them. Plus, companies like Haier are licensing Tobii’s underlying technology and developing prototype eye-tracking TV controls. Good news, couch potatoes: soon you won’t even have to move your hands to change the channel.

Predictions

I see a lot of demos, but this one is the real deal. I predict that eye-tracking technology will be baked into the computers we see rolled out at the next Consumer Electronics Show in 2014.

PC CONTROLLED BY USER'S EYES
New System Developed by Japanese Professor (March 19, 2007)

[Photo: An illustration of how the system works.]
Professor Arai Kohei of Saga University's Faculty of Science and Engineering has developed a system that lets PC users input text simply by looking at an on-screen keyboard. When the user gazes at a character for one second, the system, which uses a miniature camera, detects their line of sight and inputs that character. The system is called Mitsumeru Dake in Japanese, which means "Just Look." Professor Arai expects the system to be useful to people with disabilities and for a range of medical and social-welfare applications. At present, many people with disabilities are unable to use personal computers unless they have expensive special equipment.

Reading the User's Line of Sight

In Professor Arai's system, a miniature camera attached to the computer tracks the positions of three points for each eye: the inner corner of the eye, the inner end of the eyebrow, and the center of the pupil. On the basis of these six positions the system determines the direction in which the face is turned, and by following the line of sight it recognizes the exact location on the screen that the user is looking at. According to Professor Arai, even people wearing glasses can use the system.

In the early stages of development, the positions of the outer corners of the eyes and eyebrows were used. However, for users whose faces are constantly in motion, such as people with cerebral palsy, the camera was unable to locate these points when the face was turned sideways. Once the inner corners of the eyes and eyebrows were taken as the coordinates instead, development advanced by leaps and bounds. Professor Arai says that at a distance of about 30 cm from the on-screen keyboard, with the characters 2.5 cm apart, his system provides a very accurate method of inputting text.
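The one-second dwell rule that triggers character input can be sketched as a small state machine. This is an illustrative reconstruction, not Professor Arai's actual code; the sample timestamps and key names are invented for the example.

```python
# Sketch of dwell-based input: a character is "typed" when the gaze
# rests on the same on-screen key for one full second.

DWELL_SECONDS = 1.0  # dwell threshold described in the article

def dwell_select(samples):
    """samples: time-ordered list of (timestamp, key) gaze samples.
    Returns the keys entered by dwelling for DWELL_SECONDS or longer."""
    typed = []
    start_t, start_key = None, None
    for t, key in samples:
        if key != start_key:
            # Gaze moved to a different key: restart the dwell timer.
            start_t, start_key = t, key
        elif t - start_t >= DWELL_SECONDS:
            # Gaze has rested long enough: input the character.
            typed.append(key)
            start_t, start_key = None, None  # reset after a selection
    return typed

# The gaze flicks past 'A', then rests on 'B' for over a second.
samples = [(0.0, 'A'), (0.3, 'B'), (0.8, 'B'), (1.4, 'B')]
print(dwell_select(samples))  # ['B']
```

Dwell timing is what separates deliberate selection from ordinary scanning: a glance that merely passes over a key never accumulates the full second, so it is ignored.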
Low Cost, High Performance

Professor Arai was prompted to develop his system by the arrival five years ago of a student at the university who had cerebral palsy. The university revamped the toilet facilities and installed ramps throughout the campus, but the student's mother had to operate the computer. "I thought then that I've got to do something about this," says Professor Arai.

Until that time, similar input systems for disabled people involved attaching electrodes to the face to detect eyeball movement, or wearing special goggles with infrared cameras to analyze the image on the retina, which made the systems expensive and cumbersome. Professor Arai makes the software available free of charge, so the only cost is the ¥3,000 (about $25 at ¥120 to the dollar) price of the camera. Bedridden people or those with impaired use of their hands can easily use the system to communicate their needs - to ask for a nurse, for example, or to indicate that they are thirsty. This system is another example of how Japanese researchers and companies are putting technology to use in their quest to improve the lives of elderly and disabled people.

(Source: Trends in Japan | Web Japan)
