# LUNA
`NtLunathorir/LUNA` (HuggingFace Hub, manual download)

LUNA is a topology-invariant EEG foundation model that processes signals from varying numbers of channels using a learned channel-unification mechanism based on cross-attention queries. Because it is topology-agnostic, it accepts arbitrary montages without requiring a fixed channel set. Three pretrained checkpoints are available (Base, Large, Huge); the NeuralBench default uses the Large variant. The model was pretrained at 256 Hz on the Temple University Hospital EEG corpus (TUEG) plus the Siena Scalp EEG Database.
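The channel-unification idea can be illustrated with a minimal sketch: a small set of learned queries cross-attends over a variable number of per-channel embeddings, producing a fixed-size representation regardless of montage. All names, dimensions, and the single-head attention below are illustrative assumptions, not LUNA's actual implementation.

```python
import numpy as np


def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def unify_channels(channel_tokens, queries):
    """Single-head cross-attention from fixed learned queries onto a
    variable-length set of per-channel embeddings.

    channel_tokens: (n_channels, d) -- n_channels varies per montage
    queries:        (n_queries, d)  -- learned, fixed across montages
    returns:        (n_queries, d)  -- montage-independent shape
    """
    d = queries.shape[-1]
    attn = softmax(queries @ channel_tokens.T / np.sqrt(d), axis=-1)
    return attn @ channel_tokens


rng = np.random.default_rng(0)
queries = rng.standard_normal((4, 16))  # hypothetical: 4 learned queries, d=16

# Two different montages (19 vs 32 channels) map to the same output shape.
out19 = unify_channels(rng.standard_normal((19, 16)), queries)
out32 = unify_channels(rng.standard_normal((32, 16)), queries)
assert out19.shape == out32.shape == (4, 16)
```

This is what makes the encoder indifferent to the input channel count: the downstream transformer only ever sees the fixed number of query outputs.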
## Pretraining data overlap
!!! warning
    LUNA was pretrained on TUEG plus the Siena Scalp EEG Database (Section 4.1
    of the paper). The paper explicitly excluded TUAB, TUAR, TUSL, and SEED-V
    from pretraining, but did not exclude TUEV. As a result, `clinical_event`
    has a direct overlap, while `pathology` and `artifact` have corpus-level
    overlap (same TUEG parent corpus, but the specific recordings were
    excluded).
| Pretraining dataset | NeuralBench task | Overlap type | NeuralBench study |
|---|---|---|---|
| TUEV (not excluded) | `clinical_event` | direct | Harati2015 |
| TUEG (TUAB excluded) | `pathology` | corpus | Lopez2017 |
| TUEG (TUAR excluded) | `artifact` | corpus | Hamid2020 |
## Known limitations
**Patch size padding** – `n_times` is automatically zero-padded to the next multiple of `patch_size` (40 samples at 256 Hz) when needed. This is handled transparently by `_LunaEncoderWrapper`, but it may introduce a small number of padding tokens at the end of the sequence.
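The padding rule can be sketched as follows (the helper name is hypothetical; the wrapper applies the equivalent arithmetic internally):

```python
def pad_to_patch_multiple(n_times: int, patch_size: int = 40) -> tuple[int, int]:
    """Return (padded length, number of zero samples appended)."""
    pad = (-n_times) % patch_size  # samples needed to reach the next multiple
    return n_times + pad, pad


# A 10 s window at 256 Hz is already a multiple of 40: no padding needed.
print(pad_to_patch_multiple(2560))  # (2560, 0)

# An odd-length window is padded up to the next patch boundary.
print(pad_to_patch_multiple(2500))  # (2520, 20)
```

Since at most `patch_size - 1` zero samples are appended, the padding adds at most one extra token to the patch sequence.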