Motion artifact-controlled micro-brain sensors between hair follicles for persistent augmented reality brain-computer interfaces

Publication type:
Article
Authors:
Kim, Hodam; Kim, Ju Hyeon; Lee, Yoon Jae; Lee, Jimin; Han, Hyojeong; Yi, Hoon; Kim, Hyeonseok; Kim, Hojoong; Kang, Tae Woog; Chung, Suyeong; Ban, Seunghyeb; Lee, Byeongjun; Lee, Haran; Im, Chang-Hwan; Cho, Seong J.; Sohn, Jung Woo; Yu, Ki Jun; Kang, June; Yeo, Woon-Hong
Affiliations:
University System of Georgia; Georgia Institute of Technology; University System of Georgia; Georgia Institute of Technology; Yonsei University; Inha University; University System of Georgia; Georgia Institute of Technology; Hanyang University; Kumoh National Institute of Technology; Chungnam National University; Kumoh National Institute of Technology; Yonsei University; University System of Georgia; Georgia Institute of Technology; Emory University; University System of Georgia; Georgia Institute of Technology
Journal:
PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA
ISSN:
0027-8424
DOI:
10.1073/pnas.2419304122
Publication date:
2025-04-15
Keywords:
poly(3,4-ethylenedioxythiophene) (PEDOT); dry electrode; microneedles; fabrication; finger force; skin
Abstract:
Modern brain-computer interfaces (BCIs), which use electroencephalograms for bidirectional human-machine communication, face significant limitations from movement-vulnerable rigid sensors, inconsistent skin-electrode impedance, and bulky electronics, all of which diminish continuous use and portability. Here, we introduce motion artifact-controlled micro-brain sensors placed between hair strands, enabling ultralow impedance density at the skin contact for a long-term usable, persistent BCI with augmented reality (AR). An array of low-profile microstructured electrodes coated with a highly conductive polymer is seamlessly inserted into the spaces between hair follicles, capturing high-fidelity neural signals for up to 12 h while maintaining the lowest contact impedance density (0.03 kΩ·cm⁻²) reported to date. The implemented wireless BCI, detecting steady-state visually evoked potentials, achieves 96.4% accuracy in signal classification with a training-free algorithm, even during the subject's vigorous motions, including standing, walking, and running. A demonstration captures this system's capability, showing AR-based video calling with hands-free controls driven by brain signals, transforming digital communication. Collectively, this research highlights the pivotal role of integrated sensor and flexible electronics technology in advancing BCI applications for interactive digital environments.