Privacy-preserving EEG data frameworks for brain-computer interfaces
Abstract
Non-invasive brain-computer interface (BCI) systems rely on brainwave activity, predominantly captured through electroencephalography (EEG), to enable seamless interaction with digital platforms. Throughout their development, EEG-driven BCIs have found applications in industries as diverse as entertainment, healthcare, and cybersecurity. However, despite improvements in functionality and accuracy, the critical issue of securing the vast amounts of sensitive EEG data these systems collect has remained largely overlooked, posing significant privacy risks. While techniques such as data anonymization, encryption, masking, and
perturbation aim to protect privacy, they often degrade the quality of the data and fail to
fully eliminate the risk of re-identification. In response, we have developed multiple privacy-
preserving frameworks: a quantum-inspired Differential Privacy-based generative model, a
Rényi Differential Privacy (RDP)-based Federated model, and a privacy-adaptive Federated
Split Learning framework, featuring Secure Aggregation and Autoencoders. Each framework
is designed to generate synthetic EEG data that comply with privacy protection standards
while ensuring robust data utility for downstream analysis. Existing privacy-focused defenses frequently sacrifice performance or depend on large amounts of external data, which limits their practicality. Our frameworks not only mitigate these limitations but also significantly strengthen defenses against membership inference and reconstruction attacks.