Having spent quite a bit of time working with EEG in various projects, I was eager to take my knowledge a step further into the world of EEG-based brain-computer interfaces (BCIs). I found BCIs based on Steady State Visually Evoked Potentials (SSVEP) particularly intriguing, as they are known to be more robust than those relying on motor imagery. With this in mind, I teamed up with three amazing individuals, Joseph González Núñez, Inés Martín Muñoz, and Ritika Gupta, for an elective course titled “Biosignal Processing and Modeling.”

In our project, we used the wet EEG system Smarting 24 to record SSVEP signals. These signals arise in the occipital cortex when a person focuses on a visual stimulus flickering at a fixed frequency: neurons in the occipital cortex synchronize their activity with the flicker, so the EEG shows spectral peaks at the stimulus frequency (and often its harmonics).
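To illustrate the idea, here is a minimal sketch of how such a spectral peak can be picked out of a single occipital channel. The sampling rate, candidate flicker frequencies, and function names are illustrative assumptions, not the actual values or code from our project:

```python
import numpy as np

# Assumed parameters for illustration only (not the project's real setup)
FS = 250                               # sampling rate in Hz
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]   # candidate flicker frequencies

def detect_ssvep(eeg, fs=FS, freqs=STIM_FREQS, bandwidth=0.5):
    """Return the candidate frequency with the strongest spectral peak."""
    # Windowed power spectrum of the EEG segment
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
    bins = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Average power in a narrow band around each candidate frequency
    powers = [spectrum[np.abs(bins - f) <= bandwidth].mean() for f in freqs]
    return freqs[int(np.argmax(powers))]

# Synthetic demo: a 10 Hz oscillation buried in noise
t = np.arange(0, 4, 1.0 / FS)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 1.0, t.size)
print(detect_ssvep(signal))  # prints 10.0
```

In practice, methods such as canonical correlation analysis (CCA) across several channels are more robust than a single-channel FFT, but the core principle is the same: the attended flicker frequency dominates the spectrum.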

Leveraging these SSVEP signals, we developed a semi-real-time system that allowed a user to navigate through a virtual maze using their brain activity. The participant could control the movement of a character on the screen by simply focusing on the corresponding visual stimulus.
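The control scheme can be sketched as a simple loop: classify the most recent EEG window and translate the detected flicker frequency into a maze command. The frequencies and command names below are assumptions for illustration, not the project's actual mapping:

```python
# Hypothetical frequency-to-command mapping (illustrative values)
COMMANDS = {8.0: "left", 10.0: "up", 12.0: "right", 15.0: "down"}

def next_command(window, classify):
    """Classify an EEG window and map the result to a movement command."""
    # `classify` is any SSVEP detector returning a stimulus frequency;
    # unrecognized frequencies leave the character in place.
    return COMMANDS.get(classify(window), "stay")

# Demo with a stub classifier that always reports 12 Hz
print(next_command([0.0] * 1000, lambda w: 12.0))  # prints "right"
```

The "semi-real-time" aspect comes from processing the signal in windows: each decision is based on a few seconds of data, so commands arrive with a short but acceptable delay.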

Looking back, this project was as challenging as it was rewarding. It was a fantastic opportunity to apply theoretical knowledge in a practical setting and to see firsthand the potential of BCI technology. We’ve documented our journey and findings in our final report, which you can check out for a deeper dive into our work: