Concept Graph & Resume using Claude 3 Opus | Chat GPT4 | Llama 3:
Resume:
1.-Bernard, a software developer at G-Tech, presented on developing VR/AR games using brain-computer interface (BCI) technology and Unity.
2.-G-Tech offers different products for developing VR/AR games with BCI: a Unity Speller app, a Simulink model, and a Unity package.
3.-The Unity Speller app lets the user select commands using the P300 protocol and sends them over UDP to control a separate, standalone game application.
4.-The P300 protocol uses an oddball paradigm with infrequent target stimuli among frequent non-target stimuli to elicit a brain response.
5.-The Unity Speller interface has a selectable grid of characters. Users focus on a target character while rows/columns flash.
6.-During training, the user silently counts the flashes of the target character; the recorded responses train a classifier to detect the P300 brain response, which then lets the user spell characters (see the epoch-averaging sketch after this list).
7.-The spelled characters/commands can control aspects of a separate Unity game application connected via UDP (see the UDP sketch after this list).
8.-In one demo, the Unity Speller selected colors to remove obstacles in a Mario-style 2D game running separately.
9.-The Simulink model integrates the BCI stimuli and signal processing with the actual 3D game environment in one app.
10.-An example Simulink game has the user focus on colors of 3D shapes to guess a target shape's hidden color.
11.-The Simulink model handles data acquisition, filtering, and classification of the BCI data to control the game (see the band-pass filter sketch after this list).
12.-A Unity package SDK allows integrating BCI stimuli, signal processing, and game logic flexibly into one cohesive app.
13.-The Unity package currently supports event-related potential (ERP) and code-based visual evoked potential (cVEP) BCI protocols.
14.-SSVEP (steady-state visual evoked potential) support, which enables continuous BCI control, is planned for the Unity SDK in the future.
15.-The Unity SDK enables creating VR games with integrated BCI control using the Oculus Quest 2 headset.
16.-An ADHD game is being developed with the Unity BCI SDK in which users focus on puzzle pieces.
17.-Focusing on the puzzles to reveal an underlying image may help improve concentration skills in children with ADHD.
18.-Unity BCI games aim to make the BCI control an added "superpower" rather than replacing standard controls completely.
19.-Important considerations for Unity BCI games include using shorter training times and minimizing the number of separate applications needed.
20.-Environmental lighting should be bright enough for the visual BCI stimuli to be effective in evoking the brain responses.
21.-A stable frame rate is critical in Unity BCI games to enable precisely timed stimuli and accurate brain-response classification (see the frame-timing sketch after this list).
22.-G-Tech's BCI game development has evolved to better integrate stimuli into the actual game environment within one application.
23.-Bluetooth allows directly connecting G-Tech's Unicorn BCI headset to Unity without streaming data through a separate interface.
24.-The Unicorn Unity integration will soon support the Oculus/Meta Quest 2/3 and Pico 4 virtual reality headsets.
25.-The BCI signal processing introduces a roughly 3-5 second delay for commands, suitable for occasional game interactions but not continuous control.
26.-Lengthening classifier training from 30 seconds to 1 minute may reduce false positives when shifting focus between selectable objects.
27.-The Unicorn Unity asset will be available in the Unity Asset Store to enable easy drag-and-drop integration into games.
28.-Linux and Steam Deck support for the Unicorn Unity SDK are being explored but are not yet fully implemented.
29.-A Unicorn simulator allows developing BCI games without a physical headset to facilitate collaboration between remote developers.
30.-G-Tech's VR/AR BCI game development tools have evolved to be more user-friendly and cohesively integrated to facilitate easier adoption.
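Epoch-averaging sketch (items 4-6). A minimal, illustrative Python sketch of the idea behind P300 detection: EEG is cut into fixed-length epochs around flash onsets, and the amplitude in an assumed P300 latency window (roughly 250-450 ms) is compared between infrequent target flashes and frequent non-target flashes. The sampling rate, epoch length, window, channel count, and threshold are assumptions for illustration, not G-Tech's actual classifier.

```python
# Illustrative P300 target detection via epoch averaging.
# All numbers (250 Hz rate, 0.25-0.45 s window, 8 channels) are assumptions.
import numpy as np

FS = 250                      # assumed EEG sampling rate in Hz
P300_WINDOW = (0.25, 0.45)    # assumed latency window of the P300 peak, in seconds

def epoch(eeg, onsets, fs=FS, length=0.8):
    """Cut fixed-length epochs (epochs x samples x channels) at each stimulus onset."""
    n = int(length * fs)
    return np.stack([eeg[o:o + n] for o in onsets if o + n <= len(eeg)])

def p300_score(epochs, fs=FS, window=P300_WINDOW):
    """Mean amplitude in the P300 window, averaged over epochs and channels."""
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    return epochs[:, lo:hi, :].mean()

def classify_flash(eeg, target_onsets, nontarget_onsets, margin=0.0):
    """Attended row/column should show a larger P300-window amplitude than non-targets."""
    return (p300_score(epoch(eeg, target_onsets))
            - p300_score(epoch(eeg, nontarget_onsets))) > margin

# Toy usage: 60 s of synthetic 8-channel noise plus a bump after "target" flashes.
rng = np.random.default_rng(0)
eeg = rng.normal(0, 1, size=(60 * FS, 8))
targets = np.arange(2 * FS, 50 * FS, 5 * FS)    # infrequent (oddball) stimuli
nontargets = targets + int(2.5 * FS)            # frequent stimuli, offset in time
for o in targets:                               # inject a crude P300-like deflection
    eeg[o + int(0.3 * FS): o + int(0.4 * FS)] += 2.0
print("attended target detected:", classify_flash(eeg, targets, nontargets))
```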
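UDP sketch (items 7-8). Spelled commands are sent from the speller to a separate game over UDP. The sketch below shows the generic pattern with Python sockets; the port number and plain-text message format are assumptions, and an actual Unity game would implement the listener as a C# script rather than Python.

```python
# Illustrative UDP link between a speller/selection app and a separate game process.
# The address, port, and message format are assumptions for illustration only.
import socket

GAME_ADDR = ("127.0.0.1", 5555)   # assumed address the game listens on

def send_command(command: str) -> None:
    """Speller side: send one spelled character/command as a UDP datagram."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("utf-8"), GAME_ADDR)

def run_game_listener() -> None:
    """Game side: block on the socket and map received commands to game actions.
    In a real Unity game this would be a C# script polled from the game loop."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(GAME_ADDR)
        while True:
            data, _ = sock.recvfrom(1024)
            command = data.decode("utf-8").strip()
            print(f"received command: {command}")   # e.g. trigger "remove red obstacle"
            if command == "QUIT":
                break

# Example: the speller process calls send_command("RED") after a selection,
# while the game process runs run_game_listener() in the background.
```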
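Band-pass filter sketch (item 11). The Simulink model's chain is acquisition, filtering, and classification; the sketch below shows only the filtering step as a zero-phase Butterworth band-pass. The 0.5-30 Hz band and 250 Hz sampling rate are common EEG choices assumed here, not the settings of G-Tech's Simulink model.

```python
# Illustrative filtering step in a BCI processing chain (acquisition -> filter -> classify).
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250          # assumed sampling rate in Hz
BAND = (0.5, 30)  # assumed pass band in Hz

def bandpass(eeg, fs=FS, band=BAND, order=4):
    """Zero-phase band-pass filter applied along the time axis (samples x channels)."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=0)

# Toy usage: filter 10 s of synthetic 8-channel noise.
raw = np.random.default_rng(1).normal(size=(10 * FS, 8))
clean = bandpass(raw)
print(raw.shape, clean.shape)
```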
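Frame-timing sketch (item 21). Stimulus flashes can only be presented on frame boundaries, so frame-time jitter becomes stimulus-timing jitter and degrades classification. The toy sketch below counts how often a render loop exceeds its frame budget; the 60 Hz target and jitter tolerance are illustrative assumptions.

```python
# Illustrative frame-budget check: late frames would delay any flash scheduled for them.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # about 16.7 ms per frame at 60 Hz

def measure_frame_jitter(render_frame, n_frames=600):
    """Run n_frames of the given render callable and count frames over 1.5x the budget."""
    late = 0
    last = time.perf_counter()
    for _ in range(n_frames):
        render_frame()
        now = time.perf_counter()
        if now - last > FRAME_BUDGET * 1.5:   # assumed tolerance: 50% over budget
            late += 1
        last = now
    return late

# Toy usage: a fake render step that just sleeps for one frame budget.
dropped = measure_frame_jitter(lambda: time.sleep(FRAME_BUDGET))
print(f"frames over budget: {dropped} / 600")
```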
Knowledge Vault built by David Vivancos 2024