AlphaHand

Individual Finger Movement Identification From Muse 2 EEG

A lightweight EEG pipeline for decoding individual finger intent with Muse 2 signals.

AlphaHand is an EEG finger movement identification project that uses Muse 2 signals to decode individual finger intent for practical BCI and HCI applications.

Pipeline at a glance

Capture multi-channel Muse 2 EEG, process in real time, and decode finger intent with a lightweight neural stack.

Adaptive filtering · Temporal features · Deployment ready
[Figure: AlphaHand EEG capture and decoding workflow]
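The capture-process-decode flow above can be sketched as a minimal preprocessing front-end. This is an illustrative assumption, not the project's published code: Muse 2 streams 4 EEG channels (TP9, AF7, AF8, TP10) at 256 Hz, and the band edges, filter order, and window length below are placeholder choices.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Illustrative constants; Muse 2 provides 4 EEG channels at 256 Hz.
FS = 256          # sampling rate, Hz
N_CHANNELS = 4    # TP9, AF7, AF8, TP10
WIN_S = 1.0       # analysis window length, seconds

def bandpass(eeg, lo=8.0, hi=30.0, fs=FS):
    """Zero-phase band-pass over the mu/beta range (channels x samples)."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

def epochs(eeg, win=int(WIN_S * FS), step=None):
    """Slice a continuous recording into (n_windows, channels, win) epochs."""
    step = step or win // 2  # 50% overlap between consecutive windows
    starts = range(0, eeg.shape[-1] - win + 1, step)
    return np.stack([eeg[:, s:s + win] for s in starts])

# Synthetic stand-in for a live Muse 2 stream: 10 s of noise.
raw = np.random.randn(N_CHANNELS, 10 * FS)
windows = epochs(bandpass(raw))
print(windows.shape)  # (n_windows, 4, 256)
```

Each 1 s window is then ready for feature extraction and classification downstream.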

Key Results

TODO_ACCURACY_PERCENT% cross-subject accuracy (TODO_CV_PROTOCOL)

Macro F1: TODO_F1_MACRO

Subjects: TODO_N_SUBJECTS | Trials: TODO_N_TRIALS | Latency: TODO_LATENCY_MS ms

Capture

Muse 2 headset + finger prompts with synchronized annotations.

Decode

Feature extraction + temporal model trained for individual intent.
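A minimal sketch of the decode step, with log band-power features and a toy nearest-centroid classifier standing in for the project's temporal model. The frequency bands, class count, and classifier are illustrative assumptions, not the AlphaHand architecture.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # Muse 2 EEG sampling rate, Hz

def band_power(epoch, bands=((8, 13), (13, 30)), fs=FS):
    """Log band power per channel for each band -> flat feature vector."""
    freqs, psd = welch(epoch, fs=fs, nperseg=min(epoch.shape[-1], 128))
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, mask].mean(axis=-1)))
    return np.concatenate(feats)

class NearestCentroid:
    """Toy stand-in for the temporal model: one centroid per finger class."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None] - self.centroids_[None], axis=-1)
        return self.classes_[d.argmin(axis=1)]

# Synthetic training set: 40 one-second 4-channel epochs, 5 finger labels.
rng = np.random.default_rng(0)
X = np.stack([band_power(rng.standard_normal((4, FS))) for _ in range(40)])
y = rng.integers(0, 5, size=40)
clf = NearestCentroid().fit(X, y)
preds = clf.predict(X)
```

The feature vector here is 4 channels x 2 bands = 8 values per window; a real temporal model would consume the raw windowed signal instead.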

Deploy

Low-latency inference ready for assistive and HCI systems.

Read the paper

Publication-ready research

The manuscript covers experimental design, preprocessing, model architecture, and evaluation results.

Highlights

  • Multi-subject evaluation with robust cross-validation.
  • Feature visualization aligned with neurophysiological markers.
  • Portable inference pipeline for real-world deployments.

Media highlights

Latest visuals & demos

Signal Capture

Muse 2 EEG acquisition with motion prompts.

Model Training

Temporal convolution + transformer classifier.
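A "temporal convolution + transformer" classifier could look roughly like the following PyTorch sketch. The layer sizes, head counts, downsampling factor, and 5-class output are illustrative assumptions, not the published AlphaHand architecture.

```python
import torch
import torch.nn as nn

class TemporalConvTransformer(nn.Module):
    """Illustrative sketch: Conv1d front-end + transformer encoder + linear head."""
    def __init__(self, n_channels=4, n_classes=5, d_model=32, n_heads=4, n_layers=2):
        super().__init__()
        # Temporal convolutions embed each time step and downsample 4x.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, d_model, kernel_size=7, stride=2, padding=3),
            nn.GELU(),
            nn.Conv1d(d_model, d_model, kernel_size=7, stride=2, padding=3),
            nn.GELU(),
        )
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=64, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                    # x: (batch, channels, samples)
        z = self.conv(x)                     # (batch, d_model, samples // 4)
        z = self.encoder(z.transpose(1, 2))  # (batch, time, d_model)
        return self.head(z.mean(dim=1))      # (batch, n_classes) logits

model = TemporalConvTransformer()
logits = model(torch.randn(8, 4, 256))  # batch of 1 s, 4-channel Muse 2 windows
```

Mean-pooling over the encoded time axis keeps the head input size fixed regardless of window length, which simplifies low-latency deployment.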