The Sasai and Kai teams at Araya's R&D Department have successfully operated a robotic arm remotely using ultra-high-density EEG and AI.
In this research, non-invasive ultra-high-density EEG was used to record brainwave data while participants spoke color names. The data was then used to train an AI model that identifies the spoken words from brainwaves alone. The remote operation of the robotic arm was in turn realized through imitation learning.
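To make the word-identification step concrete, here is a minimal, purely illustrative sketch of classifying which color word was spoken from EEG-derived feature vectors. Everything in it is an assumption for illustration: the channel count, the synthetic data, and the nearest-centroid model are hypothetical stand-ins, not Araya's actual pipeline, which is not described in this announcement.

```python
# Hypothetical sketch: guessing the spoken color word from EEG features.
# The channel count, data, and model below are illustrative stand-ins only.
import random

random.seed(0)

COLORS = ["red", "green", "blue"]
N_CHANNELS = 8  # stand-in for the ultra-high-density electrode count

def synthetic_epoch(color):
    """Fake EEG feature vector: each color gets a distinct mean pattern plus noise."""
    base = COLORS.index(color)
    return [base + random.gauss(0, 0.1) for _ in range(N_CHANNELS)]

# "Training": average the feature vectors recorded for each spoken color,
# giving one centroid per word (a nearest-centroid classifier).
train = {c: [synthetic_epoch(c) for _ in range(20)] for c in COLORS}
centroids = {
    c: [sum(ep[i] for ep in eps) / len(eps) for i in range(N_CHANNELS)]
    for c, eps in train.items()
}

def classify(epoch):
    """Predict the spoken color as the word whose centroid is nearest."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(epoch, centroids[c]))
    return min(COLORS, key=dist)

print(classify(synthetic_epoch("blue")))  # → blue
```

In a real system the feature vectors would come from preprocessed EEG epochs and the classifier would be far more capable, but the train-then-classify structure is the same.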
Photographs of the Experiment
This experiment was conducted in collaboration with the Kai team as the next step of the Sasai team's "ChatGPT empowered Brain Gmail Interface."
Schematic diagram of the experiment
By advancing this research, we will continue to develop a Brain-Machine Interface (BMI) with broader capabilities. Our goal is to help a diverse range of people overcome societal barriers and expand their options for social participation and communication.
Further details about the research will be announced as progress is made.