Installation

Basic Install

pip install neuralset neuralfetch

neuralset is the data pipeline; neuralfetch adds the dataset catalog. You need both to follow the tutorials.

Verify it works:

import neuralset as ns
# Lists the datasets available through the neuralfetch catalog
print(ns.Study.catalog())

To follow the tutorials (which use spacy embeddings and audio data):

pip install 'neuralset[tutorials]'

Some extractors require heavier dependencies. Install everything at once with:

pip install 'neuralset[all]'

Key extras in [all] beyond [tutorials]: transformers (HuggingFace text/image/audio models), torchaudio, torchvision, nilearn (fMRI).
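If you are unsure which of these heavier dependencies are already present, a stdlib check can list what is missing without importing anything. A minimal sketch (the import names below are assumed to match the package names, which holds for these five):

```python
import importlib.util

# Optional dependencies pulled in by the [all] extra, as listed above.
optional = ["transformers", "torchaudio", "torchvision", "nilearn", "spacy"]

# find_spec probes availability without actually importing the package.
missing = [name for name in optional if importlib.util.find_spec(name) is None]
print("missing optional dependencies:", ", ".join(missing) or "none")
```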

Developer Install

For active development or to try the latest features:

Using uv

Step 1: Create and activate a virtual environment:

uv venv .venv
source .venv/bin/activate
uv pip install pip   # spacy auto-download requires pip inside the venv
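Before installing anything, it is worth confirming that the virtual environment is actually active. Inside an active venv, the interpreter's `sys.prefix` differs from `sys.base_prefix`:

```shell
# Prints True inside an active virtual environment, False otherwise
python3 -c 'import sys; print(sys.prefix != sys.base_prefix)'
```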

Step 2: Install packages in editable mode:

# from the repo root — core pipeline + datasets
uv pip install -e 'neuralset-repo/.[dev,all]'
uv pip install -e 'neuralfetch-repo/.'
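With an editable install, each package should resolve to its repo checkout rather than to site-packages. A quick sanity check (the two import names come from the install commands above; before installation, `find_spec` simply returns None):

```python
import importlib.util

# After `pip install -e`, spec.origin should point into the repo checkout.
for pkg in ("neuralset", "neuralfetch"):
    spec = importlib.util.find_spec(pkg)
    print(pkg, "->", spec.origin if spec else "not installed yet")
```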

Step 3: For strict type checking (optional):

uv pip install --config-settings editable_mode=strict -e 'neuralset-repo/.[dev]'

Step 4: Set up pre-commit hooks:

pre-commit install

Using conda

Step 1: Create and activate a conda environment:

conda create -n neuralset python=3.12
conda activate neuralset

Step 2: Install pip and uv in the environment:

conda install pip
pip install uv

Step 3: Install packages in editable mode:

# from the repo root — core pipeline + datasets
uv pip install -e 'neuralset-repo/.[dev,all]'
uv pip install -e 'neuralfetch-repo/.'

Step 4: For strict type checking (optional):

uv pip install --config-settings editable_mode=strict -e 'neuralset-repo/.[dev]'

Step 5: Set up pre-commit hooks:

pre-commit install