With a wave of eyes: computer control systems will become more accurate with a glance
For the first time, scientists have described the neural mechanisms that allow a person to control a computer by intentionally holding their gaze. During the recordings, the study participants played EyeLines, a game designed specifically for such experiments. The data obtained will help improve gaze-control technologies for people with limited mobility, as well as make virtual and augmented reality systems more convenient and immersive. Details are in this Izvestia article.
Computer control with a glance
Scientists from Moscow State Psychological and Pedagogical University (MGPPU), Lomonosov Moscow State University and the Higher School of Economics have investigated the mechanisms of intentional gaze holding that people use to control computers. To do this, the authors recorded 306-channel magnetoencephalograms from 32 people — signals that describe in great detail the magnetic fields produced by the activity of brain neurons. During the recordings, the study participants played EyeLines, a game developed specifically for such studies, in which a person must move objects with their gaze and form combinations of them.

In parallel with the recordings, the scientists tracked the players' eye movements. A specially developed technique made it possible to distinguish gaze dwells used intentionally for control from unintentional, spontaneous ones that occurred, for example, while a player was studying the board. By comparing the data on brain activity and eye movements, the researchers determined how these two types of dwell differ.
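Detecting gaze dwells in raw eye-tracker data is commonly done with a dispersion-threshold fixation detector (I-DT). The sketch below illustrates this standard approach only; the thresholds and data format are illustrative assumptions, not the parameters used in the study.

```python
# Minimal dispersion-threshold (I-DT) fixation detector: a gaze dwell is a
# window of samples whose spread stays below a dispersion threshold for at
# least a minimum duration. Threshold values here are illustrative.

def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """samples: list of (t, x, y) gaze points (t in seconds, x/y in pixels).
    Returns a list of fixations as (t_start, t_end, center_x, center_y)."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        # Grow the window while the gaze stays within the dispersion threshold.
        while j + 1 < n:
            window = samples[i:j + 2]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
            if dispersion > max_dispersion:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration:
            window = samples[i:j + 1]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1  # continue after the fixation
        else:
            i += 1  # slide past a sample that breaks the window
    return fixations
```

Classifying which of the detected dwells are intentional is exactly the hard part the study addresses with MEG data; the detector above only finds where the dwells are.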

It turned out that intentional dwells are associated with two sequentially unfolding processes in the brain. First, an inhibitory signal appears in the areas responsible for eye movements, apparently linked to suppressing automatic (unconscious) gaze shifts. Then the temporal regions of the cortex, which are responsible for attending to an object's position in space, become active. Interestingly, these processes begin even before gaze fixation is complete — within the first 0.2–0.3 seconds after the eyes land on an object. Spontaneous dwells were not accompanied by such processes, the scientists said.
— We managed to get an idea of what processes in the human brain make it possible to voluntarily hold one's gaze in order to control electronic devices with it. Until now, gaze-based computer control technologies were developed without relying on knowledge of the physiological mechanisms that make gaze holding possible, said Sergey Shishkin, a leading researcher at the MGPPU Center for Neurocognitive Research (MEG Center).
Neural interfaces of the future
The new data obtained using a unique multichannel magnetoencephalogram recording facility located at the MGPPU MEG Center will be useful for improving such interfaces.
— It is possible that they will also be useful to researchers in other fields of science. People intentionally use their gaze when communicating with each other — for example, we can direct it at the objects we are talking about. Such gaze communication is very similar to how a person selects objects in our game. So it is possible that our results, and the methods we developed for comparing intentional and spontaneous gaze dwells, will be used by those who study human interaction, said Sergey Shishkin.
Such technology can indeed help in the development of gaze-based human-computer interaction systems, said Anton Averyanov, CEO of the ST IT group of companies and a TechNet NTI market expert.
— Currently, gaze control relies on an eye-tracking system that "observes" the position of the eyes and highlights the corresponding area of the screen. However, such systems may not be accurate or reliable enough. The proposed technology will help improve virtual reality systems, making them more intuitive, the specialist said.

Gaze-based computer control technologies have already reached a level where they are used effectively in VR/AR and in systems for people with reduced mobility. Modern gaze trackers can accurately record the direction and duration of fixations, recognizing user commands when the gaze is held on virtual elements. However, until recently these interfaces took almost no account of the neurophysiological aspects of conscious gaze control, said Yaroslav Seliverstov, a leading AI expert at University 2035.
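The dwell-based command recognition described above — a "click" fires when the gaze is held on an element long enough — can be sketched as follows. This is a minimal illustration of the general technique, not any specific product's API; the class name, 500 ms threshold, and element labels are all assumptions.

```python
# Minimal dwell-time selection loop, as used in gaze interfaces: a command
# fires once the gaze stays on one interface element for the threshold time.
# Threshold and element naming are illustrative assumptions.

class DwellSelector:
    def __init__(self, threshold=0.5):
        self.threshold = threshold  # seconds of continuous gaze to "click"
        self.current = None         # element the gaze is currently on
        self.dwell_start = None     # when the current dwell began

    def update(self, t, element):
        """Feed one gaze sample: time t (seconds) and the element under the
        gaze, or None. Returns the element when a dwell selection fires."""
        if element != self.current:
            # Gaze moved to a new element (or off all elements): restart.
            self.current = element
            self.dwell_start = t if element is not None else None
            return None
        if element is not None and t - self.dwell_start >= self.threshold:
            # Dwell completed: fire once, then require a fresh dwell.
            self.current = None
            self.dwell_start = None
            return element
        return None
```

A false positive in such a system is exactly a spontaneous dwell misread as intentional — which is why the neural markers described in the study could reduce error rates.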
"This discovery is an important step towards creating biologically based user interfaces. The prospects include the development of brain—computer interfaces without additional devices, based only on an intentional look. It also opens the way to reducing the number of false positives and to more natural interaction in VR/AR environments. In the future, it is possible to combine gaze data with direct brain signals to create hybrid neural interfaces. This is especially true for people with speech and motor impairments," he added.

The results of the study, supported by a grant from the Russian Science Foundation (RSF), are published in the journal Cortex.
Translated by the Yandex Translate service