laMEG: laminar inference with MEG
Introduction
A toolbox for laminar inference with MEG, powered by FreeSurfer (https://surfer.nmr.mgh.harvard.edu/fswiki) and SPM (https://www.fil.ion.ucl.ac.uk/spm/), for Python 3.7.
The source code of the project is hosted on GitHub at https://github.com/danclab/laMEG.
To get started, follow the installation instructions.
Available modules
Here is a list of the modules available in laMEG:

| Module | Description |
| --- | --- |
| lameg.invert | Tools for the coregistration and source reconstruction of MEG data using SPM (Statistical Parametric Mapping). |
| lameg.laminar | Tools for performing laminar analysis of MEG signals. |
| lameg.surf | Tools for handling and manipulating surface mesh data. |
| lameg.viz | Tools for converting and visualizing data. |
| lameg.simulate | Tools for simulating MEG data using the SPM toolbox. |
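As a quick orientation, the sketch below simply imports each module. The module names are an assumption inferred from the descriptions above and the layout of the danclab/laMEG repository; check them against your installed version.

    # Assumed module names (inferred from the table above); verify
    # against your installed laMEG version.
    from lameg import invert    # coregistration and source reconstruction (SPM)
    from lameg import laminar   # laminar analysis of MEG signals
    from lameg import surf      # surface mesh handling and manipulation
    from lameg import viz       # data conversion and visualization
    from lameg import simulate  # MEG data simulation via SPM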
Tutorials
A collection of tutorials is available.
Operating system
Windows: tested on WSL (Ubuntu 24.04.1); follow Microsoft's WSL setup instructions.
Mac: may work, but has not been tested.
Linux: tested on Ubuntu and Debian.
Requirements
Python version 3.7
Anaconda (or Miniconda)
Installation
Create a conda environment:

    conda create -n <env name> python=3.7

replacing <env name> with the name of the environment you would like to create (e.g. 'lameg', or the name of your project).

Activate the environment:

    conda activate <env name>

replacing <env name> with the name of the environment you created.

Install FreeSurfer, following the instructions on the FreeSurfer wiki (https://surfer.nmr.mgh.harvard.edu/fswiki); this typically involves setting FREESURFER_HOME and sourcing SetUpFreeSurfer.sh.
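Because laMEG depends on FreeSurfer, a quick sanity check like the following (a hypothetical snippet, not part of laMEG; it assumes only FreeSurfer's standard FREESURFER_HOME environment variable) can confirm the setup before continuing:

    import os

    # FREESURFER_HOME is FreeSurfer's standard environment variable,
    # normally set by sourcing SetUpFreeSurfer.sh.
    fs_home = os.environ.get("FREESURFER_HOME")
    if not fs_home or not os.path.isdir(fs_home):
        raise RuntimeError(
            "FreeSurfer does not appear to be configured: set FREESURFER_HOME "
            "and source SetUpFreeSurfer.sh before using laMEG."
        )
    print("FreeSurfer found at", fs_home)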
To install laMEG, run:

    pip install lameg
This also installs the SPM standalone and MATLAB Runtime, which can take some time depending on your connection speed.
Before using, deactivate and reactivate the environment for changes to environment variables to take effect:
    conda deactivate
    conda activate <env name>
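To verify the installation, a minimal smoke test (assuming only that the package imports cleanly) is:

    # Minimal smoke test: confirms that laMEG is importable from the
    # reactivated environment.
    import lameg
    print("laMEG imported from", lameg.__file__)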
If you want to run the tutorials, download and extract the test data.
Funding
Supported by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement 864550), and by a seed grant from the Fondation pour l'Audition.