
Abstract

The growing field of neurotechnology aims to meld minds with machines through programs that extract biodata from an individual’s nervous system activity. BrainBrush is one such interactive system, built to deepen understanding of human emotions through machine learning and graphic design. Using a Muse headset to record brainwave activity, raw EEG data was gathered from six participants as they watched emotion-inducing video clips. After each video, participants rated their emotional states on scales from high to low valence (positivity), arousal (excitement), and dominance (control over the experienced emotion). The collected data was then used to train a neural network to perform linear classification along these emotional scales, with the goal of establishing a model that can classify raw brainwave data from any user and predict new users’ emotional states. The model output was then fed into the multimedia design software TouchDesigner, where it altered a series of parameters on a pre-set torus design, including rotational speed, color palette, and edge jaggedness. The resulting image brought each participant’s unique brainwave data to life as a color-specific, torus-shaped piece of modern artwork. BrainBrush has proven to be a powerful program that identifies patterns in EEG data for emotion classification, while also providing individuals with a personalized artistic display of their feelings.
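
To illustrate the classification step described above, the following is a minimal sketch of a linear neural-network classifier that predicts high/low valence, arousal, and dominance labels from EEG-derived features. The feature layout, network size, training hyperparameters, and the placeholder data are all assumptions for illustration and are not taken from the BrainBrush implementation.

```python
# Hypothetical sketch of the emotion-classification step (not the BrainBrush source).
# Assumes EEG has already been reduced to a fixed-length feature vector, e.g.
# band-power values per Muse channel, and that each self-report has been
# binarized into high (1) / low (0) valence, arousal, and dominance labels.
import torch
import torch.nn as nn

N_FEATURES = 20   # assumed: 5 frequency bands x 4 Muse channels
N_TARGETS = 3     # valence, arousal, dominance

class EmotionClassifier(nn.Module):
    """Single linear layer producing one logit per emotional scale."""
    def __init__(self, n_features: int, n_targets: int):
        super().__init__()
        self.linear = nn.Linear(n_features, n_targets)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.linear(x)  # raw logits; apply sigmoid for probabilities

# Placeholder random data standing in for real EEG features and ratings.
X = torch.randn(120, N_FEATURES)
y = torch.randint(0, 2, (120, N_TARGETS)).float()

model = EmotionClassifier(N_FEATURES, N_TARGETS)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()  # independent binary decision per scale

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# Predict high/low probabilities for a new (here, random) EEG feature vector.
with torch.no_grad():
    probs = torch.sigmoid(model(torch.randn(1, N_FEATURES)))[0]
    print({name: float(p) for name, p in zip(["valence", "arousal", "dominance"], probs)})
```

In the full pipeline, the sigmoid outputs of such a model would be mapped to TouchDesigner parameters (rotational speed, color palette, edge jaggedness) to drive the torus visualization; the exact mapping depends on how the TouchDesigner project is configured.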
