Knowledge Vault 3/26 - G.TEC BCI & Neurotechnology Spring School 2024 - Day 3
Auditory BCI research at Shibaura Institute of Technology
Shin'ichiro Kanoh, Shibaura Institute of Technology (JP)
[Resume image]

Concept Graph & Resume using Claude 3 Opus | ChatGPT-4 | Llama 3:

graph LR
  classDef campus fill:#f9d4d4, font-weight:bold, font-size:14px;
  classDef research fill:#d4f9d4, font-weight:bold, font-size:14px;
  classDef bci fill:#d4d4f9, font-weight:bold, font-size:14px;
  classDef erp fill:#f9f9d4, font-weight:bold, font-size:14px;
  classDef other fill:#f9d4f9, font-weight:bold, font-size:14px;
  A[Shin'ichiro Kanoh] --> B[Shibaura Institute: Tokyo campuses, Toyosu near fish market. 1]
  A --> C[Auditory BCI: brain activity, sounds, bi-directional communication. 2]
  C --> D[Simple BCI demos possible, much research needed. 3]
  C --> E[Task variations enrich BCI, auditory understudied vs visual. 4]
  E --> F[Visual BCI issue: screen focus, interrupts other activities. 5]
  E --> G[Auditory BCI: leverage sound location, eyes closed. 6]
  C --> H[ERPs: brain responses to stimuli, mismatch negativity, P300. 7]
  H --> I[P300 modulated by attention, appears for attended deviants. 8]
  C --> J[Early auditory BCIs: binary decisions, left/right attention. 9]
  C --> K[Extend auditory BCI dimensionality: location, motion, frequency, timbre, melody. 10]
  K --> L[Identify sound locations, features with eyes closed. 11]
  C --> M[Auditory stream segregation illusion: alternating tones, separate streams. 12]
  M --> N[Auditory BCIs: focus on deviant in one stream, P300. 13]
  C --> O[4-class auditory BCI: focus on one of four streams. 14]
  O --> P[Offline analysis: accurately detect attended stream. 15]
  O --> Q[4-class BCI: first step toward 4D, 5D, 6D sound space. 16]
  C --> R[BCI: 5 real speakers, 5 virtual sound sources. 17]
  R --> S[Real, virtual sound locations elicit P300 in ERP study. 18]
  A --> T[Other BCI research: multi-modal visual/tactile, gamma band activity. 19]
  T --> U[Spatial-temporal analysis: map EEG to cortical currents, single finger decoding. 20]
  T --> V[Detect driver drowsiness: EEG, ECG, driving simulator. 21]
  V --> W[Sudden targets while driving: detectable brain responses, auto-brake potential. 22]
  V --> X[Toll gate opening speed affects driver stress, detectable in brain. 23]
  C --> Y[Expanding auditory BCI to richer stimuli, exploring virtual reality combination. 24]
  H --> Z[P300: visual and auditory stimuli, largest at midline parietal electrodes. 25]
  H --> AA[Channel selection for ERPs: machine learning, subject-specific, anatomical knowledge. 26]
  H --> AB[MMN, P300, N400 ERPs: different neural generators, sensory detection, attention, cognition. 27]
  H --> AC[P300 for BCI spellers: depends on attention, large for attended targets. 28]
  A --> AD[Humorous video: speaker demonstrates what to do when BCIs fail. 29]
  A --> AE[Speaker hopes to attend BCI conference in Graz. 30]
  class B campus;
  class C,D,E,F,G,J,K,L,M,N,O,P,Q,R,S,Y bci;
  class H,I,Z,AA,AB,AC erp;
  class T,U,V,W,X other;


1.-Shibaura Institute of Technology has two campuses in Tokyo, Japan. The Toyosu campus is near a popular fish market.

2.-The talk focuses on auditory BCI research, extracting information from brain activity in response to sounds to enable bi-directional communication.

3.-Simple BCI demonstrations are now possible, thanks to improvements in technology like g.tec amplifiers. However, much research still needs to be done.

4.-The talk discusses task variations to enrich BCI functionality, especially in the understudied area of auditory BCI compared to visual BCI.

5.-A problem with visual BCI spellers is that the user must focus on the screen, interrupting other visual activities like watching presentations.

6.-Auditory BCI aims to enable using a BCI while performing other activities by leveraging the ability to locate sounds with eyes closed.

7.-Event-related potentials (ERPs) are brain responses to stimuli. Infrequent deviant tones among frequent standard tones elicit mismatch negativity and P300 ERPs.
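
The oddball paradigm and ERP averaging described here can be sketched in a few lines. This is a toy simulation, not the speaker's pipeline: the sampling rate, deviant probability, noise level, and the Gaussian "P300-like" bump are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                       # sampling rate in Hz (assumed)
t = np.arange(0, 0.8, 1 / fs)  # 800 ms epoch

def epoch(is_deviant):
    """Synthetic single-trial EEG: noise plus a P300-like bump for deviants."""
    x = rng.normal(0, 5, t.size)  # background EEG noise (uV), assumed level
    if is_deviant:
        # Idealised positive deflection peaking ~300 ms post-stimulus
        x += 8 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return x

# Roughly 80/20 standard/deviant sequence, as in a typical oddball block
labels = rng.random(200) < 0.2
trials = np.array([epoch(d) for d in labels])

erp_dev = trials[labels].mean(axis=0)    # average over deviant trials
erp_std = trials[~labels].mean(axis=0)   # average over standard trials
diff = erp_dev - erp_std                 # difference wave (MMN/P300 stand-in)

peak_ms = 1000 * t[np.argmax(diff)]
print(f"difference-wave peak at ~{peak_ms:.0f} ms")
```

Averaging cancels the noise while the stimulus-locked response survives, which is why the deviant-minus-standard difference wave isolates the ERP components.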

8.-The P300 ERP component is modulated by attention - it appears in response to attended deviant stimuli but not unattended deviants.

9.-Early auditory BCIs enabled binary decisions based on attention to sounds from the left or right. Later, up to 6 sound locations were used.

10.-The speaker aims to extend the dimensionality of auditory BCI by leveraging abilities to detect sound features like location, motion, frequency, timbre, and melody.

11.-Even with closed eyes, humans can identify sound locations and features like individual instruments playing in an orchestra based on auditory information alone.

12.-Auditory stream segregation is an auditory illusion where alternating high and low tones are perceived as two separate "streams" or melodies.

13.-Auditory BCIs can leverage stream segregation by having subjects focus on deviant target tones in one of two streams, evoking a P300.
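
A stream-segregation stimulus of this kind can be synthesised directly: alternating high and low tones, with occasional frequency deviants inserted into the low stream for the subject to attend. The tone frequencies, durations, and deviant rate below are illustrative guesses, not values from the talk.

```python
import numpy as np

fs = 44100                  # audio sampling rate (Hz)
tone_len, gap = 0.06, 0.06  # 60 ms tones, 60 ms gaps (assumed timing)
f_high, f_low = 1000.0, 400.0
rng = np.random.default_rng(1)

def tone(freq):
    t = np.arange(int(fs * tone_len)) / fs
    w = np.hanning(t.size)  # fade in/out to avoid clicks
    return w * np.sin(2 * np.pi * freq * t)

silence = np.zeros(int(fs * gap))
seq, markers = [], []
for i in range(40):                    # alternate high/low: A B A B ...
    if i % 2 == 0:
        seq += [tone(f_high), silence]
        markers.append("H")
    else:
        deviant = rng.random() < 0.15  # occasional deviant in the low stream
        seq += [tone(f_low * (1.2 if deviant else 1.0)), silence]
        markers.append("D" if deviant else "L")

audio = np.concatenate(seq)
print(len(audio) / fs, "s,", markers.count("D"), "deviants")
```

With a large enough frequency gap and fast presentation, listeners perceive the high and low tones as two independent melodies, and attention can be directed to the deviants in just one of them.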

14.-In a recent 4-class auditory BCI, participants focused on one of 4 tone streams. Deviants in the attended stream evoked a P300.

15.-Offline analysis of the 4-class auditory BCI data showed it could accurately detect which stream the participant was attending to.
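
The offline decoding step can be illustrated with a deliberately simple decision rule, not the classifier the lab actually used: average the deviant-locked epochs of each stream and pick the stream with the largest mean amplitude in a P300-like window. All shapes and noise levels are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 250
t = np.arange(0, 0.8, 1 / fs)
p300 = 6 * np.exp(-((t - 0.35) ** 2) / (2 * 0.06 ** 2))  # idealised P300

def deviant_epochs(attended, n=30):
    """Simulated EEG epochs time-locked to deviants in one stream."""
    noise = rng.normal(0, 5, (n, t.size))
    return noise + (p300 if attended else 0)  # P300 only if attended

attended_stream = 2  # ground truth for the simulation
epochs = [deviant_epochs(k == attended_stream) for k in range(4)]

# Decision rule: mean amplitude in a 250-450 ms window, per stream
win = (t >= 0.25) & (t <= 0.45)
scores = [e.mean(axis=0)[win].mean() for e in epochs]
decoded = int(np.argmax(scores))
print("decoded stream:", decoded)
```

Real systems typically replace the window-mean score with a trained classifier, but the logic is the same: only the attended stream's deviants carry the attention-dependent P300.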

16.-The 4-class BCI is a first step toward extending auditory BCI to use "4D, 5D or 6D sound space" as used by animals for hunting.

17.-Another auditory BCI uses an array of 5 real speakers plus 5 virtual sound sources generated between speaker pairs, for 10 total choices.
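
The talk summary does not say how the virtual sources are generated; a common technique for placing a phantom source between a speaker pair is constant-power amplitude panning, sketched below as an assumption.

```python
import numpy as np

def pan_gains(theta):
    """Constant-power panning between a speaker pair.
    theta in [0, 1]: 0 = left speaker only, 1 = right speaker only,
    0.5 = phantom (virtual) source midway between the pair."""
    angle = theta * np.pi / 2
    return np.cos(angle), np.sin(angle)

gl, gr = pan_gains(0.5)  # virtual source between the two speakers
print(round(gl, 3), round(gr, 3), round(gl**2 + gr**2, 3))
```

Equal gains on both speakers place the perceived source at the midpoint, and the cosine/sine law keeps total power constant (gl² + gr² = 1) as the source moves, which is why 5 real speakers can yield 5 extra virtual locations between adjacent pairs.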

18.-Both real and virtual sound locations elicited P300 responses in an ERP study, showing virtual sound sources can be used in auditory BCIs.

19.-The talk briefly covers other BCI research at the presenter's lab, including a multi-modal visual/tactile BCI targeting gamma band activity.

20.-Spatial-temporal analysis is used to map scalp EEG to cortical currents to improve spatial resolution. This may enable single finger movement decoding.
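
One standard way to map scalp EEG to cortical currents is a regularised minimum-norm inverse of the forward (lead-field) model; the talk does not specify the lab's method, so the sketch below uses a random stand-in lead field and an assumed regularisation value purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_ch, n_src = 32, 200               # electrodes, cortical sources (assumed)
L = rng.normal(size=(n_ch, n_src))  # lead field (forward model), stand-in values

# Minimum-norm estimate: s_hat = L^T (L L^T + lam*I)^-1 x
lam = 0.1                           # Tikhonov regularisation (assumed)
W = L.T @ np.linalg.inv(L @ L.T + lam * np.eye(n_ch))

s_true = np.zeros(n_src)
s_true[50] = 1.0                            # one active cortical source
x = L @ s_true + rng.normal(0, 0.01, n_ch)  # simulated scalp EEG
s_hat = W @ x                               # estimated cortical currents
print("strongest estimated source:", int(np.argmax(np.abs(s_hat))))
```

Because the problem is underdetermined (many more sources than electrodes), the regularised pseudo-inverse picks the smallest-norm current distribution consistent with the scalp data, which sharpens spatial resolution enough that finer distinctions such as single-finger movements become plausible decoding targets.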

21.-Another project aims to detect driver drowsiness using EEG and ECG, tested in a driving simulator designed to induce sleepiness.

22.-Sudden target appearance while driving elicits detectable brain responses that could potentially be used to automatically brake in dangerous situations.

23.-Differences in toll gate opening speed affect driver stress levels, which can be detected in brain activity.

24.-The presenter concludes that expanding auditory BCI to richer stimuli is key to improving performance. They are exploring combining it with virtual reality.

25.-P300 ERPs occur for both visual and auditory stimuli. Auditory P300s are largest at midline parietal electrodes.

26.-Channel selection for different ERP components can leverage machine learning but is subject-specific. Anatomical knowledge can guide electrode placement.
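
A simple data-driven channel-selection score consistent with this idea is the point-biserial r² between target and non-target trials, computed per channel; the feature, effect size, and "channel 3 carries the effect" setup below are simulated assumptions, not results from the talk.

```python
import numpy as np

rng = np.random.default_rng(4)
n_ch, n_trials = 8, 200
labels = rng.random(n_trials) < 0.25    # target vs non-target trials
X = rng.normal(0, 1, (n_trials, n_ch))  # one feature per channel,
                                        # e.g. mean 300-400 ms amplitude
X[labels, 3] += 1.5                     # simulated: channel 3 carries the ERP

def r_squared(x, y):
    """Point-biserial r^2: how well the feature separates the two classes."""
    n1, n2 = y.sum(), (~y).sum()
    num = np.sqrt(n1 * n2) / (n1 + n2) * (x[y].mean() - x[~y].mean())
    return (num / x.std()) ** 2

scores = np.array([r_squared(X[:, c], labels) for c in range(n_ch)])
ranking = np.argsort(scores)[::-1]      # best channels first
print("best channel:", ranking[0])
```

Because such rankings are computed from each subject's own data, the selected channels differ between subjects, which is why anatomical priors (e.g. midline parietal sites for P300) remain useful as a starting point.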

27.-Mismatch negativity, P300, and N400 ERPs have different neural generators. MMN reflects sensory change detection, P300 reflects higher-order attention and cognition.

28.-P300 is used for BCI spellers because it strongly depends on attention - it's large for attended targets and small or absent for unattended stimuli.

29.-The talk ended with a humorous old video of the speaker demonstrating what to do when BCIs fail - "harakiri" (staged).

30.-The speaker hopes to attend the BCI conference in Graz in September and looks forward to further discussions with the hosts and attendees.

Knowledge Vault built by David Vivancos 2024