My future research questions focus on how emerging technologies such as mobile devices and VR/AR, combined with EEG/ERP-based brain–computer interfaces (BCIs) and neurofeedback, can enhance children’s cognitive and emotional development.
I ask whether neurofeedback delivered through tablets or smartphones can improve attention and inhibitory control, and whether VR- or AR-integrated BCIs can strengthen executive functions such as working memory and cognitive flexibility more effectively than traditional training methods.
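To make the kind of protocol I have in mind concrete, the sketch below shows a minimal closed-loop band-power neurofeedback cycle: estimate alpha power in a sliding window and reward the child whenever it exceeds an adaptive baseline. The sampling rate, band edges, window length, and median-based threshold are all placeholder assumptions for illustration, not a validated pediatric protocol.

```python
# A minimal sketch of a band-power neurofeedback loop, assuming a
# single-channel EEG stream at 256 Hz; the feedback rule, band edges,
# and adaptive threshold are illustrative placeholders.
import numpy as np
from scipy.signal import welch

FS = 256             # assumed sampling rate (Hz)
WINDOW_S = 2.0       # sliding analysis window (s)
ALPHA = (8.0, 13.0)  # alpha band used for feedback (assumption)

def band_power(segment, fs, band):
    """Mean power spectral density inside a frequency band (Welch estimate)."""
    freqs, psd = welch(segment, fs=fs, nperseg=len(segment))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def feedback_loop(stream, n_updates=20):
    """Yield binary reward signals; the threshold adapts to a running baseline."""
    win = int(FS * WINDOW_S)
    baseline = []
    for i in range(n_updates):
        segment = stream[i * win:(i + 1) * win]
        p = band_power(segment, FS, ALPHA)
        baseline.append(p)
        threshold = np.median(baseline)  # simple adaptive criterion
        yield p > threshold              # True -> rewarding feedback on screen

# Usage with synthetic noise standing in for a real EEG stream:
rng = np.random.default_rng(0)
fake_eeg = rng.standard_normal(int(FS * WINDOW_S * 20))
rewards = list(feedback_loop(fake_eeg))
print(f"reward delivered on {sum(rewards)}/{len(rewards)} updates")
```

In a tablet or smartphone deployment, the yielded reward signal would drive the game-like visual feedback; the open research question is whether training against such a signal transfers to behavioral measures of attention and inhibitory control.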
Another key question is which ERP components, such as the N2, P3, or error-related negativity (ERN), can serve as reliable biomarkers of improvement following neurofeedback, and how gamified BCI tasks might influence children’s motivation, engagement, and transfer of skills into daily academic and social contexts. I also ask whether adaptive, real-time EEG algorithms can personalize interventions to optimize outcomes, and how immersive neurofeedback in VR/AR environments could support emotion recognition and regulation in children with conditions such as anxiety, ADHD, or autism.
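A biomarker analysis of this kind reduces, at its simplest, to comparing a component’s trial-averaged amplitude before and after training. The sketch below illustrates this for a P3-like component as mean amplitude in a 300–500 ms post-stimulus window; the epoch array, sampling rate, latency window, and simulated pre/post gain are assumptions standing in for real recordings.

```python
# A minimal sketch of quantifying a P3-like ERP component as a candidate
# biomarker: mean amplitude in a 300-500 ms post-stimulus window,
# averaged over trials. Epoch layout and parameters are assumptions.
import numpy as np

FS = 256                      # assumed sampling rate (Hz)
EPOCH_START_S = -0.2          # epoch begins 200 ms before stimulus onset
P3_WINDOW_S = (0.300, 0.500)  # typical P3 latency range (assumption)

def mean_component_amplitude(epochs, fs, epoch_start_s, window_s):
    """epochs: (n_trials, n_samples) array for one channel, in microvolts.
    Returns the trial-averaged mean amplitude inside the latency window."""
    erp = epochs.mean(axis=0)                        # average over trials
    start = int((window_s[0] - epoch_start_s) * fs)  # window -> sample indices
    stop = int((window_s[1] - epoch_start_s) * fs)
    return erp[start:stop].mean()

# Synthetic pre/post-training epochs standing in for real data;
# the +0.3 offset simulates a hypothetical training-related gain.
rng = np.random.default_rng(1)
pre = rng.standard_normal((40, int(0.9 * FS)))
post = rng.standard_normal((40, int(0.9 * FS))) + 0.3
delta = (mean_component_amplitude(post, FS, EPOCH_START_S, P3_WINDOW_S)
         - mean_component_amplitude(pre, FS, EPOCH_START_S, P3_WINDOW_S))
print(f"pre-to-post change in mean P3 amplitude: {delta:.2f} uV")
```

The research question is whether such pre-to-post changes are reliable and specific enough in children to serve as outcome biomarkers, rather than which window or channel to use.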
In addition, I explore how affective neuroscience markers, such as frontal alpha asymmetry or the late positive potential, might be combined with BCI-driven feedback to enhance self-regulation, and whether multisensory feedback (visual, auditory, or tactile) increases the effectiveness of these interventions. I also consider the long-term impacts of BCI training on emotional resilience, its integration into classrooms as a real-time support tool for teachers, and the ethical, developmental, and privacy implications of applying such systems in child populations.
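Frontal alpha asymmetry (FAA) is conventionally computed as the difference of log alpha power between right and left frontal electrodes, ln(P_F4) - ln(P_F3). The sketch below implements that index; the channel pair, band edges, and synthetic signals are assumptions, and a BCI-driven feedback loop would map this index onto an affective feedback display.

```python
# A minimal sketch of the frontal alpha asymmetry (FAA) index:
# ln(alpha power at F4) - ln(alpha power at F3). Channel names,
# band edges, and the synthetic signals are assumptions.
import numpy as np
from scipy.signal import welch

FS = 256             # assumed sampling rate (Hz)
ALPHA = (8.0, 13.0)  # alpha band (assumption)

def alpha_power(signal, fs=FS, band=ALPHA):
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def frontal_alpha_asymmetry(f3, f4):
    """Positive values indicate relatively greater right-frontal alpha,
    conventionally read as relatively greater left-frontal activity."""
    return np.log(alpha_power(f4)) - np.log(alpha_power(f3))

rng = np.random.default_rng(2)
f3_signal = rng.standard_normal(FS * 30)  # 30 s of fake F3 data
f4_signal = rng.standard_normal(FS * 30)  # 30 s of fake F4 data
print(f"FAA index: {frontal_alpha_asymmetry(f3_signal, f4_signal):+.3f}")
```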
Finally, I ask whether hybrid BCI systems that combine EEG with other data sources, such as eye-tracking or voice, or machine learning models trained on EEG/ERP features, could predict responsiveness to interventions, and whether VR-based social scenarios could be designed to improve both executive functions and emotion regulation in children who face social difficulties.
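The prediction question can be framed as a standard supervised learning problem: baseline EEG/ERP features in, a responder/non-responder label out, evaluated with cross-validation. The sketch below shows this framing with a cross-validated logistic regression; the feature set, labels, and data are hypothetical placeholders for a real dataset, not results.

```python
# A minimal sketch of predicting intervention responsiveness from
# baseline EEG/ERP features with a cross-validated classifier.
# Features, labels, and data are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n_children = 80
# Hypothetical baseline features: e.g., P3 amplitude, N2 latency,
# theta/beta ratio, frontal alpha asymmetry (columns are illustrative).
X = rng.standard_normal((n_children, 4))
# Hypothetical binary outcome: responder (1) vs. non-responder (0),
# simulated so that two features carry signal.
y = (X[:, 0] + 0.5 * X[:, 3] + rng.standard_normal(n_children) > 0).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f} +/- {scores.std():.2f}")
```

A hybrid system would simply widen the feature matrix with eye-tracking or voice measures; the substantive question is whether any such model generalizes across children well enough to guide intervention assignment.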