Brain-computer interface (BCI)-based robotic telepresence enables people with disabilities to control robots remotely without any physical movement. However, traditional BCI systems typically require the user to select the navigation direction from visual stimuli presented on a fixed background, which makes it difficult to control the robot in a dynamic environment during locomotion. In this paper, a novel steady-state visual evoked potential (SSVEP)-based BCI stimulus system is proposed for robotic telepresence. The system uses the live video streamed from the robot's onboard camera as its input. By flickering the detected objects in the scene at different frequencies, predefined according to their relative positions on the screen, the robot can be navigated dynamically based on the object the user attends to. To better differentiate multiple objects when they outnumber the predefined frequencies, a task-related component analysis (TRCA) model was trained on offline experimental data collected a priori, so that objects in front are selected with priority. Experiments were conducted to validate the proposed system. Using the system, four human subjects were able to control a humanoid robot to navigate through multiple objects and reach the desired goal, with an average success rate of 87.5%.
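The core idea described above, tagging each detected object with a flicker frequency chosen from a predefined set according to where the object appears on the screen, can be illustrated with a minimal sketch. This is not the authors' implementation; the frequency values, the bounding-box representation, and the region-based assignment rule are assumptions for illustration only.

```python
# Minimal sketch: map detected objects to predefined SSVEP flicker frequencies
# based on their horizontal position on screen. Frequencies, object fields,
# and the region split are hypothetical choices, not taken from the paper.

from dataclasses import dataclass
from typing import Dict, List

FLICKER_FREQS_HZ = [8.0, 10.0, 12.0, 15.0]  # hypothetical predefined set


@dataclass
class DetectedObject:
    label: str
    x_center: float  # horizontal center of the bounding box, in pixels


def assign_frequencies(objects: List[DetectedObject],
                       screen_width: int) -> Dict[str, float]:
    """Assign each object the frequency of the horizontal screen region
    that its bounding-box center falls into."""
    n_regions = len(FLICKER_FREQS_HZ)
    region_width = screen_width / n_regions
    mapping: Dict[str, float] = {}
    for obj in objects:
        region = min(int(obj.x_center // region_width), n_regions - 1)
        mapping[obj.label] = FLICKER_FREQS_HZ[region]
    return mapping


if __name__ == "__main__":
    scene = [DetectedObject("door", 150.0),
             DetectedObject("chair", 700.0),
             DetectedObject("person", 1100.0)]
    print(assign_frequencies(scene, screen_width=1280))
    # e.g. {'door': 8.0, 'chair': 10.0, 'person': 15.0}
```

In the proposed system, the frequency detected in the user's EEG (via the trained TRCA model) would then be mapped back to the attended object, and hence to a navigation target for the robot.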