Integrating Large Language Model, EEG, and Eye-Tracking for Word-Level Neural State Classification in Reading Comprehension

Yuhong Zhang, Qin Li, Sujal Nahata, Tasnia Jamal, Shih-Kuen Cheng, Gert Cauwenberghs, Tzyy-Ping Jung

Research output: Contribution to journal › Article › peer-review

Abstract

With the recent proliferation of large language models (LLMs), such as Generative Pre-trained Transformers (GPT), there has been a significant shift in exploring human and machine comprehension of semantic language meaning. This shift calls for interdisciplinary research that bridges cognitive science and natural language processing (NLP). This pilot study aims to provide insights into individuals’ neural states during a semantic inference reading-comprehension task. We propose jointly analyzing LLM outputs, eye-gaze, and electroencephalographic (EEG) data to study how the brain processes words with varying degrees of relevance to a keyword during reading. We also use feature engineering to improve the classification of fixation-related EEG data recorded while participants read words with high versus low relevance to the keyword. The best validation accuracy in this word-level classification exceeds 60% across 12 subjects. Words highly relevant to the inference keyword received significantly more eye fixations per word than low-relevance words (1.0584 vs. 0.6576, with words receiving no fixations included in the counts). This study represents the first attempt to classify brain states at the word level using LLM-generated labels. It provides valuable insights into human cognitive abilities and Artificial General Intelligence (AGI), and offers guidance for developing potential reading-assisted technologies.
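The abstract describes the pipeline only at a high level. The sketch below is a minimal illustration, under stated assumptions, of how such a word-level analysis could be assembled: an LLM relevance query (stubbed here with a trivial string-overlap heuristic so the code runs) assigns high/low relevance labels to each word relative to the inference keyword, fixation-locked EEG epochs are reduced to simple band-power features, and a linear classifier is cross-validated per subject. All function names (score_relevance, bandpower_features, evaluate_subject), the feature choices, and the classifier are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch: word-level EEG classification driven by LLM-generated
# relevance labels. Illustrative only; not the authors' actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def score_relevance(word: str, keyword: str) -> float:
    """Stand-in for an LLM relevance query (e.g., prompting GPT to rate how
    relevant `word` is to the inference `keyword` on a 0-1 scale).
    A trivial character-overlap heuristic keeps the sketch runnable."""
    w, k = set(word.lower()), set(keyword.lower())
    return len(w & k) / max(len(w | k), 1)


def label_words(words, keyword, threshold=0.5):
    """Binarize relevance scores into high (1) / low (0) relevance labels."""
    return np.array([int(score_relevance(w, keyword) >= threshold) for w in words])


def bandpower_features(epochs, sfreq, bands=((4, 8), (8, 13), (13, 30))):
    """Per-channel log band-power features from fixation-locked EEG epochs.

    epochs: array of shape (n_words, n_channels, n_samples), one epoch per
    fixated word, time-locked to fixation onset.
    """
    n_words, n_channels, n_samples = epochs.shape
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / sfreq)
    psd = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, :, mask].mean(axis=-1))  # (n_words, n_channels)
    return np.log(np.concatenate(feats, axis=1) + 1e-12)


def evaluate_subject(epochs, words, keyword, sfreq=250.0):
    """Cross-validated accuracy for one subject's high- vs. low-relevance words."""
    X = bandpower_features(epochs, sfreq)
    y = label_words(words, keyword)
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
```

In an actual study, the relevance scores would come from prompting an LLM rather than a heuristic, and the per-subject cross-validated accuracies would be aggregated across participants, analogous to the reported validation accuracy of over 60% across 12 subjects.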

Original language: English
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering
State: Accepted/In press - 2024

Keywords

  • Accuracy
  • Brain modeling
  • Brain-Computer Interface
  • Cognitive Computing
  • Computational Linguistics
  • EEG
  • Electroencephalography
  • Electronic mail
  • Eye-fixation
  • Gaze tracking
  • Human-Computer Interface
  • Large Language Model
  • Pattern Recognition
  • Reading Comprehension
  • Semantics
  • Task analysis
