Knowledge Vault 3/87 - G.TEC BCI & Neurotechnology Spring School 2024 - Day 9
Real-time assessment of concentration performance
Martin Walchshofer, g.tec medical engineering GmbH (RO)
<Resume Image>

Concept Graph & Resume using Claude 3 Opus | ChatGPT-4 | Llama 3:

graph LR
  classDef experiment fill:#f9d4d4, font-weight:bold, font-size:14px;
  classDef eeg fill:#d4f9d4, font-weight:bold, font-size:14px;
  classDef test fill:#d4d4f9, font-weight:bold, font-size:14px;
  classDef results fill:#f9f9d4, font-weight:bold, font-size:14px;
  classDef demo fill:#f9d4f9, font-weight:bold, font-size:14px;
  A[Martin Walchshofer] --> B[concentration performance experiment. 1]
  A --> C[EEG device: Unicorn Hybrid Black, 8 electrodes, 250Hz. 1]
  C --> D[Electrodes: occipital area, minimize artifacts. 2]
  A --> E[Experiment: 23 subjects, D2 focus test. 3]
  E --> F[Baseline EEG vs D2 test EEG. 3]
  E --> G[System validated by repeating test. 4]
  G --> H[Classifier applied to Tetris, video tasks. 4]
  E --> I[Questionnaires: perceived vs measured focus/engagement. 5]
  A --> J[EEG bands reflect mental states. 5]
  J --> K[Theta, alpha changes examined during tasks. 6]
  K --> L[D2: theta increase, alpha decrease vs rest. 7]
  K --> M[Tetris speed impacted theta/alpha similarly. 7]
  K --> N[Videos: small band power changes. 7]
  A --> O[Real-time EEG analysis: feature extraction, ML. 8]
  O --> P[D2 interface: highlighted correct/incorrect responses. 8]
  A --> Q[Test videos: relaxing landscapes, action sports. 9]
  A --> R[Test, training runs boosted result confidence. 10]
  R --> S[Rest vs D2: clearly distinguishable. 10]
  A --> T[90%+ accuracy distinguishing rest vs D2. 11]
  A --> U[Classifier applied to Tetris at different speeds. 12]
  U --> V[Game speeds identifiable from EEG. 12]
  A --> W[Perceived engagement matched measured EEG engagement. 13]
  W --> X[Engagement increased with Tetris speed. 13]
  A --> Y[Similar results for relaxing vs engaging videos. 14]
  A --> Z[Perceived aligned with measured EEG engagement across tasks. 15]
  A --> AA[Study published in Journal of Neural Engineering. 16]
  A --> AB[Live demo with colleague. 17]
  AB --> AC[EEG electrodes attached, gel applied. 17]
  AB --> AD[Eye, muscle artifacts observed, signal confirmed. 18]
  AB --> AE[Training begins with D2 test. 19]
  AE --> AF[Subject clicks 'd' with two marks, avoids others. 19]
  AE --> AG[Two training rounds with relax phases. 20]
  AG --> AH[Classifier generated to differentiate states. 20]
  AB --> AI[80%+ accuracy for live subject. 21]
  AI --> AJ[Subject given D2, rest tasks for real-time prediction. 21]
  AJ --> AK[Classifier score monitored, rises for focusing, falls for rest. 22]
  AB --> AL[Tetris, video tasks performed, engagement score monitored. 23]
  AL --> AM[Relaxing video: scores around minimum. 24]
  AL --> AN[Engaging video: scores towards maximum, Bluetooth issues. 24]
  AL --> AO[Fast Tetris: high engagement, overwhelming. 25]
  AL --> AP[Slow Tetris: low engagement, like resting. 25]
  AL --> AQ[Medium Tetris: scores between high and low. 26]
  AB --> AR[Live demo replicates study's key results. 27]
  A --> AS[Tetris initially used for calibration, proved difficult. 28]
  AS --> AT[D2 test performed better for calibration. 28]
  A --> AU[Next speaker: Steven Laureys, previous: Damien Coyle. 29]
  A --> AV[Takeaways: EEG reliably measures task engagement... 30]
  AV --> AW[Classifiers trainable on tests, applicable to other tasks. 30]
  class A,B,E,G,R,AA,AB,AE,AG,AI,AJ,AL,AR,AS experiment;
  class C,D,F,J,K,L,M,N,O,T,U,V,Y,Z,AC,AD eeg;
  class H,I,P,Q,AK,AM,AN,AO,AP,AQ test;
  class S,W,X,AH,AT,AV,AW results;
  class AU demo;


1.- Martin Walchshofer presents an experiment measuring concentration performance using the Unicorn Hybrid Black EEG device with 8 electrodes at 250 Hz sampling rate.

2.- Electrodes were placed mainly on the occipital area to minimize artifacts from eye movements, muscle activity, and motor areas.

3.- The experiment used 23 subjects who trained on the standardized D2 focus test. Baseline resting EEG was compared to EEG recorded during the D2 test.

4.- After training, the system was validated by repeating the test. The classifier was then applied to Tetris gameplay and video watching tasks.

5.- Subject questionnaires were used to compare perceived focus/engagement to EEG measurements. Different EEG frequency bands reflect different mental states.

6.- Theta and alpha band power changes were examined during baseline, D2 test, Tetris at different speeds, and engaging vs non-engaging videos.
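
The talk does not detail how band power was computed; a minimal sketch using Welch's PSD estimate with commonly used band limits (theta 4-8 Hz, alpha 8-13 Hz, both assumptions, not g.tec's published pipeline) could look like this:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # Unicorn Hybrid Black sampling rate (Hz)

def band_power(x, fs, lo, hi):
    """Average power spectral density within [lo, hi] Hz (Welch's method)."""
    f, psd = welch(x, fs=fs, nperseg=fs * 2)
    mask = (f >= lo) & (f <= hi)
    return psd[mask].mean()

# Synthetic 10 s "relaxed" signal: strong 10 Hz alpha rhythm plus noise
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
x = 5 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)

theta = band_power(x, FS, 4, 8)
alpha = band_power(x, FS, 8, 13)  # dominates here, as expected at rest
```

With per-window values like these, the theta increase and alpha decrease reported for the D2 test become simple comparisons against the baseline recording.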

7.- D2 test showed theta increase and alpha decrease compared to rest. Tetris speed impacted theta/alpha similarly. Videos showed small band power changes.

8.- Real-time feature extraction and machine learning enabled online EEG analysis. The D2 test interface highlighted correct and incorrect responses.
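
The specific classifier is not named in the talk; as one plausible sketch, a Fisher linear discriminant trained on hypothetical [theta, alpha] band-power features (the class means and spreads below are invented for illustration) separates rest from D2 windows:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical band-power features per analysis window: [theta, alpha]
# Rest: lower theta, higher alpha; D2: higher theta, lower alpha (point 7)
rest = rng.normal([1.0, 3.0], 0.3, size=(100, 2))
d2 = rng.normal([2.0, 1.5], 0.3, size=(100, 2))

X = np.vstack([rest, d2])
y = np.array([0] * 100 + [1] * 100)

# Fisher linear discriminant: w = Sw^-1 (mu1 - mu0), boundary at the midpoint
mu0, mu1 = rest.mean(0), d2.mean(0)
Sw = np.cov(rest.T) + np.cov(d2.T)
w = np.linalg.solve(Sw, mu1 - mu0)
b = -0.5 * (mu0 + mu1) @ w

scores = X @ w + b  # > 0 -> focused (D2), < 0 -> rest
acc = ((scores > 0) == y).mean()
```

In a real-time loop the same `w` and `b` would be applied to each incoming feature window, yielding the continuously updated score described later in the demo.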

9.- Two test videos, relaxing landscapes and action sports, were used to compare EEG responses to engagement level.

10.- Test and training runs of the experiment paradigm were conducted to boost confidence in the results. Rest vs D2 test was clearly distinguishable.

11.- Over 90% classification accuracy was achieved across 23 subjects for distinguishing rest from D2 test, with an average of 94.6% accuracy.

12.- After training, the classifier was applied to Tetris at slow, medium and fast speeds. Different game speeds were clearly identifiable from EEG.

13.- Perceived engagement from questionnaires matched measured EEG engagement, with both increasing from slow to medium to fast Tetris speeds.

14.- Similar results were found for relaxing vs engaging videos, with a smaller but measurable difference in engagement detected by EEG.

15.- Perceived engagement aligned well with measured EEG engagement across all 23 subjects and different task difficulties, supporting the system's validity.

16.- The published study "Real-time estimation of EEG-based engagement scores during different tasks" in the Journal of Neural Engineering details the results.

17.- A live demo is conducted with a colleague, first attaching EEG electrodes and applying gel to improve signal quality and stability.

18.- Eye blinks, teeth clenching, and eye closure are performed to elicit the corresponding EEG artifacts and alpha waves, confirming good signal quality.

19.- The training experiment begins with the D2 test, where the subject must click on "d" characters with exactly two marks while avoiding others.
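
The D2 selection rule is simple enough to encode directly (a toy illustration, with `marks` counting the dashes printed above and below each character):

```python
def is_d2_target(letter: str, marks: int) -> bool:
    """A D2 item is a target only if it is a 'd' with exactly two marks."""
    return letter == "d" and marks == 2

# Targets vs. typical distractors: 'p' with any marks, 'd' with 1 or 3 marks
assert is_d2_target("d", 2)
assert not is_d2_target("d", 1)
assert not is_d2_target("p", 2)
```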

20.- Two rounds of training are conducted, with relax phases in between. The system then generates a classifier to differentiate the two states.

21.- Over 80% accuracy is achieved for the live subject, who is then given the D2 and rest tasks again for real-time prediction.

22.- The classifier output score is monitored, rising above zero for focusing and decreasing towards the minimum during rest phases.
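
How the displayed score is stabilised is not stated; a common choice for this kind of real-time readout is an exponential moving average over the raw per-window classifier outputs, sketched here on simulated scores (the phase durations and score distributions are invented):

```python
import numpy as np

def smooth_scores(raw, alpha=0.1):
    """Exponential moving average to stabilise noisy per-window outputs."""
    out = np.empty(len(raw))
    acc = 0.0
    for i, r in enumerate(raw):
        acc = (1 - alpha) * acc + alpha * r
        out[i] = acc
    return out

rng = np.random.default_rng(2)
# Hypothetical raw scores: 30 windows of rest (negative mean),
# then 30 windows of D2 focus (positive mean)
raw = np.concatenate([rng.normal(-1, 0.5, 30), rng.normal(1, 0.5, 30)])
s = smooth_scores(raw)  # stays below zero at rest, climbs above zero for D2
```

The smoothing trades responsiveness for stability: larger `alpha` tracks state changes faster but lets more window-to-window noise through.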

23.- Tetris gameplay at different difficulties and video watching tasks are then performed while monitoring the real-time EEG engagement score.

24.- The relaxing video shows scores around the minimum, while the engaging video trends more towards the maximum, but Bluetooth connection issues occur.

25.- Fast Tetris gameplay shows high engagement scores as it is overwhelming, while slow Tetris is underwhelming with scores similar to resting.

26.- Medium speed Tetris results in scores between the high and low speeds, matching the pattern observed in the 23-person study.

27.- The live demo replicates the key results from the full study, demonstrating measurable differences in EEG engagement across tasks and difficulties.

28.- Tetris was initially used for calibrating the BCI system but proved difficult due to rapidly increasing difficulty. The D2 test performed better.

29.- The presentation ends with a brief mention of the next keynote speaker, Steven Laureys, and the previous speaker, Damien Coyle.

30.- Key takeaways: EEG can reliably measure task engagement in real-time, classifiers can be trained on standardized tests and applied to other tasks.

Knowledge Vault built by David Vivancos, 2024