1. Install

pip install neuroencoder
2. Request access

The MRL model is gated. Request access, then log in:
huggingface-cli login
3. Embed your EEG (one call)

Works with any number of channels and any sampling rate. Returns a NumPy array.
import mne
from neuroencoder import MRL

raw = mne.io.read_raw_edf("recording.edf", preload=True)
model = MRL.from_pretrained()

embeddings = model.embed(
    raw.get_data(),
    sfreq=raw.info["sfreq"],
    channel_names=raw.ch_names,
    dim=192,
)
# -> numpy array, shape [N, 192], L2-normalized
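Because the rows come back L2-normalized, the dot product of two embeddings already is their cosine similarity. A quick self-contained check of that property, using random unit-norm rows with the documented [N, 192] shape as a stand-in for real model output:

```python
import numpy as np

# Stand-in for model output: random rows normalized to unit length,
# mirroring the [N, 192] L2-normalized shape documented above.
rng = np.random.default_rng(0)
emb = rng.normal(size=(10, 192))
emb /= np.linalg.norm(emb, axis=1, keepdims=True)

# For unit-norm rows, the dot product equals the full cosine formula.
dot = emb @ emb.T
norms = np.linalg.norm(emb, axis=1)
cos = dot / (norms[:, None] * norms[None, :])
print(np.allclose(dot, cos))  # True
```

This is why the similarity snippets later on this page skip the normalization step entirely.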
4. Use the embeddings

import neuroencoder as ne
ne.explore(embeddings)   # interactive Atlas
ne.plot(embeddings)      # static UMAP
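The embeddings are plain NumPy arrays, so standard persistence works without anything library-specific. A minimal sketch with a synthetic array (no neuroencoder calls needed), using NumPy's native .npy format for lossless round-tripping:

```python
import os
import tempfile

import numpy as np

# Synthetic stand-in for real embeddings.
embeddings = np.random.default_rng(0).normal(size=(100, 192)).astype(np.float32)

# Save and reload losslessly with NumPy's native format.
path = os.path.join(tempfile.mkdtemp(), "embeddings.npy")
np.save(path, embeddings)
restored = np.load(path)
print(np.array_equal(embeddings, restored))  # True
```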

Common patterns

Find similar epochs

# embeddings are L2-normalized, so cosine similarity is just a dot product
import numpy as np
sim = embeddings @ embeddings.T
top5 = np.argpartition(-sim, 5, axis=1)[:, :5]   # 5 most similar per epoch (includes the epoch itself)
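Since self-similarity is 1.0, each epoch shows up among its own nearest neighbors. One way to drop the self match is to mask the diagonal before partitioning; a self-contained sketch with synthetic unit vectors standing in for real embeddings:

```python
import numpy as np

# Synthetic L2-normalized embeddings (20 epochs, 64 dims).
rng = np.random.default_rng(1)
embeddings = rng.normal(size=(20, 64))
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

sim = embeddings @ embeddings.T
np.fill_diagonal(sim, -np.inf)                   # mask self-similarity
top5 = np.argpartition(-sim, 5, axis=1)[:, :5]   # 5 nearest, self excluded

# No row contains its own index any more.
print(any(i in top5[i] for i in range(len(top5))))  # False
```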

Truncate without re-running the model

# Compute once at the highest dim, then slice and re-normalize
full = model.embed(eeg, sfreq=256, channel_names=ch_names, dim=768)
compact = full[:, :48]
compact = compact / np.linalg.norm(compact, axis=1, keepdims=True)
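The slice-then-renormalize step works at any prefix length and needs nothing from the model. A quick check on synthetic unit vectors that the truncated rows come out unit-norm again, so dot products remain valid cosine similarities:

```python
import numpy as np

# Synthetic stand-in for a full-dim (768) embedding matrix.
rng = np.random.default_rng(2)
full = rng.normal(size=(50, 768))
full /= np.linalg.norm(full, axis=1, keepdims=True)

# Keep the first 48 dims, then re-normalize each row.
prefix = full[:, :48]
compact = prefix / np.linalg.norm(prefix, axis=1, keepdims=True)

print(np.allclose(np.linalg.norm(compact, axis=1), 1.0))  # True
```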

Non-overlapping epochs (for classification with epoch-level labels)

# Default is a 1s sliding window. For one embedding per 30s window:
embeddings = model.embed(eeg, sfreq=256, channel_names=ch_names,
                         dim=192, stride_seconds=30.0)
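As a sanity check on output sizes: with non-overlapping 30 s windows, the number of rows should be roughly the recording length divided by 30 s. A sketch of that arithmetic, under the assumption (not a documented guarantee) that a trailing partial window is dropped:

```python
sfreq = 256
n_samples = 10 * 60 * sfreq        # a 10-minute recording at 256 Hz
window_s, stride_s = 30.0, 30.0    # non-overlapping 30 s windows

# Expected row count if trailing partial windows are dropped
# (an assumption about the library's behavior, for illustration only).
n_windows = int((n_samples - window_s * sfreq) // (stride_s * sfreq)) + 1
print(n_windows)  # 20
```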

Skip preprocessing for already-clean data

# filter=False skips the filtering step; epochs go straight to model-ready images
images = ne.preprocess(epochs, sfreq=250, filter=False)   # [N, 8, 7500] -> images
embeddings = model.predict(images, dim=192)

See Embeddings for the model class details and Visualization for the Atlas.