Abstracts

  • Tomasz M. Rutkowski, PhD, assistant professor & BCI-lab-group PI at University of Tsukuba (Japan)

    Augmented informative ERS/ERD of sensorimotor EEG rhythms in planning movements for the control of the brain-computer interface

    State-of-the-art stimulus-driven brain-neural computer interface (BNCI) paradigms rely mostly on the visual, auditory or somatosensory (tactile or haptic) modalities. These three sensory modalities also constitute human spatial awareness of the surrounding environment. Recently, multimodal approaches have been proposed that offer alternative ways to deliver sensory stimulation, which could be crucial for patients suffering from neurodegenerative diseases or for healthy users in need of alternative communication channels. Several techniques have already been developed to connect a BNCI to traditional visual, auditory or tactile interfaces, or to utilize those interfaces as stimulation sources. The talk will present recent developments and discuss the pros and cons of the traditional approaches as well as the newly developed airborne ultrasonic tactile display (AUTD). More traditional vibrotactile stimulation, on the other hand, also makes it possible to create a bone-conduction auditory effect when the transducers are applied to the head area. This concept creates a new possibility for users with hearing problems to enjoy the advantages of BNCI communication, and it opens a very interesting way to deliver combined multimodal (somatosensory and auditory) stimuli to locked-in users at very fast information transfer rates. The talk will review recent BNCI developments that allow direct brain-machine interaction requiring only intentional thought control. Possible future applications of neurotechnology-based devices will also be presented. The new BNCI paradigms also call for novel data-driven signal processing and machine learning methods, which will be summarized and discussed at the end of the talk.
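
    As a point of reference for the transfer rates mentioned above, BCI performance is conventionally quantified with the Wolpaw information transfer rate. The sketch below shows the standard calculation; the function name, parameters and the example figures are our illustrative assumptions, not values from the talk.

        import math

        def wolpaw_itr(n_classes, accuracy, selections_per_min):
            """Wolpaw information transfer rate of an N-class BCI, in bits per minute."""
            p, n = accuracy, n_classes
            if p >= 1.0:
                bits_per_selection = math.log2(n)  # perfect accuracy: log2(N) bits
            else:
                bits_per_selection = (math.log2(n)
                                      + p * math.log2(p)
                                      + (1 - p) * math.log2((1 - p) / (n - 1)))
            return bits_per_selection * selections_per_min

        # Example: a hypothetical 8-class tactile BNCI at 90% accuracy,
        # 10 selections per minute -> roughly 22.5 bits/min.
        print(wolpaw_itr(8, 0.90, 10))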

  • Alexander Kaplan, PhD DrSc, Head of Brain-Machine Interfaces and Applied Neuroengineering lab, Center of Development Biotechnology, Institute of Biology and Biomedicine, Lobachevsky State University of Nizhny Novgorod (Russia)

    New experiments with neurointerfaces: waiting for the results

    Several experimental neurointerface paradigms, currently in operation, will be offered for consideration; their results are still unpredictable and may have far-reaching consequences.

  • Sergey Shishkin, PhD, Head of Neuroergonomics and Brain-Machine Interfaces lab, National Research Centre “Kurchatov Institute” (Russia)

    Gaze Touch and Gaze Talk: New vistas for eye-brain-computer interfaces

    Eye movements are involved in many of our everyday activities and often precede our physical actions, including the actions used to control a computer. For example, we typically fixate a virtual button or a link on the computer screen with our gaze before approaching it with the cursor and making a click. It is natural to consider this gaze behavior a good basis for a human-machine interface based solely on gaze fixations and requiring no manual actions. One may expect such an interface to be useful not only for persons whose ability to use skeletal muscles is impaired but also for healthy people, because bypassing manual activity may make interaction with computers and robots fast and fluent. Indeed, such interfaces exist, but their performance is limited by the difficulty of differentiating gaze behavior related to machine control from ordinary gaze activity.

    Recent studies by our laboratory demonstrated that such differentiation can be achieved within the framework of Eye-Brain-Computer Interfaces (EBCIs), i.e., hybrid interfaces that jointly use eye movement data and brain signals such as the EEG to determine the user’s intention. In our experiments, participants played a game using their gaze fixations alone. Fixations used to make moves were detected by the BCI classifier from EEG segments several hundred milliseconds long.
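
    A minimal sketch of this idea, assuming a fixation-onset-locked EEG epoch fed to a simple linear classifier; the sampling rate, epoch length and all names below are our illustrative assumptions, not the laboratory’s actual pipeline.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        FS = 500        # EEG sampling rate in Hz (assumed)
        EPOCH_MS = 300  # "several hundred milliseconds" after fixation onset

        def fixation_epoch(eeg, onset_sample):
            # Cut the EEG segment that follows a detected gaze fixation.
            # eeg: array of shape (n_channels, n_samples).
            n = int(EPOCH_MS * FS / 1000)
            segment = eeg[:, onset_sample:onset_sample + n]
            return segment.reshape(-1)  # channels x time flattened into features

        # Train on labeled fixations: intentional "control" fixations (label 1)
        # versus spontaneous visual-exploration fixations (label 0), then
        # classify each new fixation online.
        clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
        # clf.fit(X_train, y_train)
        # is_control = clf.predict(fixation_epoch(eeg, onset)[None, :])[0] == 1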

    Moreover, we suggest that gaze interaction technology, so far developed mainly by engineers, can be further improved by applying relevant knowledge from psychology. In our Gaze Touch approach, vibrotactile feedback is used to reduce the burden on visual attention and to enable fast switching to the next step in sequential fixation-based control. Compared to the standard protocol, the time to proceed to the next control fixation was reduced by 60 ms.

    Tactile (or haptic) feedback is very common in our everyday use of tools, but it can be irrelevant in the case of anthropomorphic and/or autonomous robots, because such robots may be perceived more as partners than as tools. For such cases we proposed the Gaze Talk approach, based on the psychological notion of “joint attention”. In this protocol tactile feedback is absent, and the natural mechanisms of intensive gaze-based communication are used instead.

    Although neither the Gaze Touch nor the Gaze Talk protocol has yet been implemented within the EBCI framework, they seem to possess a number of useful features that can be exploited there. In the presentation we will discuss the use of these features and other prospects for the further development of EBCI protocols as a possible way to revolutionize our interaction with computers and robots.

  • Vasily Pyatin, Doctor of Medical Sciences, Professor, Department of Neurointerface and Applied Neurophysiology, Samara State Medical University (Russia)

    Augmented informative ERS/ERD of sensorimotor EEG rhythms in planning movements for the control of the brain-computer interface
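
    For reference, the ERS/ERD measure named in the title is conventionally computed, following Pfurtscheller, as the percentage change of band power relative to a pre-movement baseline. The sketch below shows this standard calculation with assumed parameter values; it is an illustration, not the speaker’s own analysis code.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def erd_ers_percent(epochs, fs, band=(8.0, 12.0), baseline=(0.0, 1.0)):
            # epochs: (n_trials, n_samples) single-channel EEG, time-locked to
            # movement planning; fs: sampling rate in Hz; band: rhythm of
            # interest (mu, 8-12 Hz, assumed here); baseline: reference
            # interval in seconds within the epoch.
            b, a = butter(4, [f / (fs / 2.0) for f in band], btype="band")
            filtered = filtfilt(b, a, epochs, axis=1)
            power = filtered ** 2            # instantaneous band power
            mean_power = power.mean(axis=0)  # average over trials
            i0, i1 = int(baseline[0] * fs), int(baseline[1] * fs)
            ref = mean_power[i0:i1].mean()   # baseline reference power
            # Negative values indicate ERD (desynchronization),
            # positive values indicate ERS (synchronization).
            return 100.0 * (mean_power - ref) / ref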

  • Victor Kazantsev, PhD DrSc, Head of Neurotechnologies Department, Institute of Biology and Biomedicine, Vice-Rector for Research and Innovation, Lobachevsky State University of Nizhny Novgorod (Russia)