Glossary
Key concepts in the neuralset pipeline, from data source to batched tensors. See the tutorials for how they fit together.
- Chain
Sequences Steps — each step’s output feeds the next. Auto-created when a list is assigned to a Step-typed field. Resumes from the latest cached intermediate result. → Chain · Tutorial
- Event
Something that happens in time: a stimulus, a neural recording, a word. Core fields: `start`, `duration`, `timeline`. Concrete subclasses (`Meg`, `Image`, `Word`, `Fmri`, …) add modality-specific fields. Events are collected into a pandas DataFrame for bulk processing. → Event · Tutorial
- EventsTransform
A processing step that takes an events DataFrame and returns a modified one (filtering, chunking, linking word context, …). Subclasses override `_run(events) -> events`. → EventsTransform · Tutorial
- Extractor
Converts events into numeric arrays for a given time window — e.g. `MegExtractor` returns MEG time-series at a given frequency. → BaseExtractor · Tutorial
- Frequency
A float subclass with helpers for converting between seconds and sample indices (`to_ind`, `to_sec`). → Frequency
- infra
Parameter controlling disk caching and cluster execution. Its type varies by component: `Backend` on Steps and Chains (whole-step caching), `MapInfra` on extractors and `Study.infra_timelines` (per-item caching and batch dispatch), `TaskInfra` on neuraltrain experiments (full computation and job arrays). · Caching & Cluster Execution
- prepare
Method on extractors that precomputes results for all events in one pass, triggering disk caching and optional cluster dispatch. Called automatically by `Segmenter.apply()`. · Caching & Cluster Execution
- Segment
A time window (`start`, `duration`) within a Timeline, anchored to a Trigger. Defines which slice of data each Extractor reads. → Segment · Tutorial
- SegmentDataset
A torch Dataset that pairs segments with extractors. Each `__getitem__` reads data for one segment across all extractors. → SegmentDataset · Tutorial
- Segmenter
Creates segments from an events DataFrame and wraps them into a SegmentDataset. Configured with a trigger query, a time window, and a dict of extractors. → Segmenter · Tutorial
- Step
A composable pipeline unit — a pydantic model with a `_run()` method and an optional `infra` parameter for caching and cluster execution. Base class for Study, EventsTransform, and Chain. → Step · Caching & Cluster Execution
- Study
An interface to an external dataset that knows how to iterate timelines and load events from raw data files. Subclasses implement `iter_timelines()` and `_load_timeline_events()`. No data is bundled — each study points to an externally hosted repository. All studies can share a single `path` — each resolves its own subfolder automatically. → Study · Tutorial
- TimedArray
A numpy array with time metadata (`frequency`, `start`, `duration`). Used by dynamic extractors for time-aligned slicing. → TimedArray
- Timeline
A recording session or run within a Study, typically identified by BIDS entities (`subject`, `session`, `task`, `run`). `Study.iter_timelines()` yields one dict per timeline; events loaded from it share that timeline identifier.
- Trigger
The Event that anchors a Segment — for example a word onset or an image presentation. Selected by the `trigger_query` parameter of the Segmenter.
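To make the EventsTransform contract concrete, here is a minimal sketch of the pattern: a subclass overrides `_run(events) -> events` to return a modified DataFrame. The base class `EventsTransformSketch` and the `DropZeroDuration` filter are illustrative stand-ins (the real base class is a pydantic Step with caching infra), not the library's code:

```python
import pandas as pd


class EventsTransformSketch:
    """Stand-in for the EventsTransform base class (illustrative only)."""

    def __call__(self, events: pd.DataFrame) -> pd.DataFrame:
        return self._run(events)

    def _run(self, events: pd.DataFrame) -> pd.DataFrame:
        raise NotImplementedError


class DropZeroDuration(EventsTransformSketch):
    """Filter out events with non-positive duration."""

    def _run(self, events: pd.DataFrame) -> pd.DataFrame:
        return events[events["duration"] > 0].reset_index(drop=True)


events = pd.DataFrame({"start": [0.0, 1.0, 2.5], "duration": [0.5, 0.0, 1.0]})
filtered = DropZeroDuration()(events)
print(len(filtered))  # 2
```

Because each transform maps a DataFrame to a DataFrame, transforms compose naturally in a Chain, each output feeding the next.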
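The Segmenter's core job — one (`start`, `duration`) window per trigger event — can be sketched as below. `build_segments` is a hypothetical helper for illustration only; the actual Segmenter is configured with a trigger query, a time window, and a dict of extractors, and wraps the result in a SegmentDataset:

```python
import pandas as pd


def build_segments(events: pd.DataFrame, trigger_query: str,
                   tmin: float, duration: float) -> pd.DataFrame:
    """Sketch: one (start, duration) window per trigger (illustrative only).

    `trigger_query` is assumed to be a pandas query string selecting
    the anchor events within the events DataFrame.
    """
    triggers = events.query(trigger_query)
    return pd.DataFrame({
        "start": triggers["start"] + tmin,   # window opens tmin before/after the trigger
        "duration": duration,                # fixed window length in seconds
        "timeline": triggers["timeline"],    # segments stay anchored to their timeline
    }).reset_index(drop=True)


events = pd.DataFrame({
    "kind": ["word", "sound", "word"],
    "start": [1.0, 1.2, 2.0],
    "timeline": ["run1"] * 3,
})
segments = build_segments(events, "kind == 'word'", tmin=-0.5, duration=1.0)
print(segments["start"].tolist())  # [0.5, 1.5]
```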
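The second-to-sample conversion that Frequency provides can be sketched as a small float subclass. This is an illustration of the described semantics, not the library's implementation; in particular, rounding to the nearest sample in `to_ind` is an assumption:

```python
class Frequency(float):
    """A sampling rate in Hz with second <-> sample-index helpers (sketch)."""

    def to_ind(self, sec: float) -> int:
        # Seconds to sample index; nearest-sample rounding is assumed here.
        return int(round(sec * self))

    def to_sec(self, ind: int) -> float:
        # Sample index back to seconds.
        return ind / self


sfreq = Frequency(100.0)      # 100 Hz sampling rate
print(sfreq.to_ind(0.25))     # 25
print(sfreq.to_sec(50))       # 0.5
```

Subclassing float keeps the value usable anywhere a plain sampling rate is expected, while the helpers remove off-by-one index arithmetic from extractor code.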
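Finally, a sketch of what "time-aligned slicing" means for a TimedArray: the time metadata lets a Segment window, expressed in seconds, be translated into sample indices on the array's last axis. `TimedArraySketch` and its `slice` method are illustrative stand-ins, not the library's API:

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class TimedArraySketch:
    """Stand-in for TimedArray: a numpy array plus time metadata (sketch)."""
    data: np.ndarray   # shape (..., n_samples); the last axis is time
    frequency: float   # sampling rate in Hz
    start: float       # timestamp of the first sample, in seconds

    @property
    def duration(self) -> float:
        return self.data.shape[-1] / self.frequency

    def slice(self, start: float, duration: float) -> np.ndarray:
        # Translate the requested window (seconds) into sample indices
        # relative to this array's own start time.
        i0 = int(round((start - self.start) * self.frequency))
        i1 = i0 + int(round(duration * self.frequency))
        return self.data[..., i0:i1]


meg = TimedArraySketch(data=np.zeros((2, 1000)), frequency=100.0, start=10.0)
window = meg.slice(start=10.5, duration=1.0)
print(window.shape)  # (2, 100)
```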