Welcome to the Journal of Analytical Neuroscience

Memory-Efficient SEEG Data Processing Pipeline for Large Datasets

Abstract

Stereoelectroencephalography (SEEG) is a powerful technique for intracranial recording of brain activity, crucial for localizing epileptic foci and studying neural dynamics. However, interpreting SEEG data is computationally demanding due to the high spatial and temporal resolution of recordings, significant data volume, electrode placement variability, intrinsic noise, and complex, non-stationary signals. This paper proposes a memory-efficient SEEG data processing pipeline designed to manage large datasets effectively while preserving critical signal information. The preprocessing pipeline includes loading and inspecting raw data, applying zero-phase FIR band-pass filters (1-50 Hz) to eliminate noise without distorting phase relationships, and segmenting data for frequency-specific analysis. Frequency domain analysis is conducted using the Fast Fourier Transform (FFT), Short-Time Fourier Transform (STFT), Morlet wavelet transforms, and Welch’s method for Power Spectral Density (PSD) estimation. These methods enable robust exploration of frequency dynamics, capturing both transient and stable oscillatory brain activity. The presented pipeline maintains computational efficiency through optimized windowing parameters and filtering strategies, ensuring high-quality data interpretation without extensive resource demands. While primarily linear and limited by fixed parameters and potential redundancy in time-frequency overlaps, the approach successfully addresses common challenges in SEEG interpretation, including artifact reduction, spectral clarity, and reproducibility. This work offers a practical framework for large-scale SEEG analysis, facilitating clinical decision-making and advanced neurophysiological research.
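The core preprocessing and spectral steps the abstract describes can be sketched in Python with SciPy. This is a minimal illustration, not the authors' implementation: the sampling rate, filter length, window sizes, and the synthetic single-channel signal are all illustrative assumptions.

```python
import numpy as np
from scipy import signal

fs = 1000.0  # sampling rate in Hz (assumed; SEEG systems vary)
rng = np.random.default_rng(0)
x = rng.standard_normal(10 * int(fs))  # stand-in for one SEEG channel

# Zero-phase FIR band-pass (1-50 Hz): filtfilt runs the filter forward
# and backward, so the net phase response is zero and phase
# relationships in the signal are preserved.
numtaps = 1001  # odd length gives a symmetric, linear-phase FIR filter
taps = signal.firwin(numtaps, [1.0, 50.0], pass_zero=False, fs=fs)
x_filt = signal.filtfilt(taps, [1.0], x)

# Welch PSD: averaging over overlapping windowed segments trades
# frequency resolution for a lower-variance spectral estimate.
freqs, psd = signal.welch(x_filt, fs=fs, nperseg=2048, noverlap=1024)

# STFT for time-frequency structure (transient oscillatory activity).
f_stft, t_stft, Zxx = signal.stft(x_filt, fs=fs, nperseg=512)
```

Because `filtfilt` doubles the effective filter order, the stopband attenuation is applied twice, which is why power outside the 1-50 Hz band drops sharply in the Welch estimate.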

DOI

https://doi.org/10.64045/6mgb-4jdy

Cite
Nashman Z, Belhadj A. Memory-Efficient SEEG Data Processing Pipeline for Large Datasets. Journal of Analytical Neuroscience [Internet]. 2025 May 25;1(1). Available from: https://www.janeuro.org/2513660_memory-efficient-seeg-data-processing-pipeline-for-large-datasets

 


About us

Journal of Analytical Neuroscience conveys research and developments that require novel techniques for data analysis pertaining to all aspects of neuroscience. Original articles on any aspect of neuroscience or related work in allied fields are invited for publication.

 

The primary goal of JAN is to further the analytical tools and approaches to data pertaining to neuroscience, including but not limited to: scRNA-seq, EEG, quantitative EEG (QEEG), evoked potentials, magnetoencephalography (MEG), electroconvulsive therapy (ECT), transcranial magnetic stimulation (TMS), deep brain stimulation (DBS), polysomnography (sleep EEG), stereoelectroencephalography (SEEG), and EEG neurofeedback. We publish both translational and clinical work.

 

JAN believes that exorbitant fees should not have to be paid by authors just to have their work known to the wider scientific community. Nor do we believe in a paywall that prevents people from accessing knowledge that can be used immediately for the benefit of humanity. Hence, we charge a reasonable sum to keep our journal running while producing quality articles in both print and online versions of the works submitted. All of our articles are open access.

 

Our journal employs best practices to reduce prejudice in publishing, such as double-anonymous peer review, constructive feedback for all submissions, and transparent editorial practices (e.g., published guides). Consistent with this, we accept submissions reporting negative results, provided they maintain the same level of scientific rigor.

 

Finally, despite our best efforts, we know we will make mistakes along the way. To that end, we commit to acknowledging our mistakes and learning from them.

 

We will be open, honest, and clear about any areas in which we fall short and invite feedback from our authors, reviewers, and partners about this at any time by contacting our chief coordinator at: