Neurofeedback Platform: Engineering Stack & SDK

High-Level Stack Overview

This platform is the backbone for building personalized neurofeedback applications. We separate core engineering from student-facing creativity. The result is a modular system: engineers build robust core libraries, while PhD and Master's students (or other developers) build innovative apps on top.

  1. Core platform for real-time signals, features, and states.
  2. Construct axes as reusable dimensions.
  3. SDK for student-led app development—defining new states, feedback, and interaction designs.

Core Platform Engineering Stack

Signal Acquisition & Processing

  • Integrate EEG first as the MVP modality, with future support for fNIRS.
  • Handle device streams, timestamps, channel metadata, and signal quality (impedance, artifacts).
  • Preprocessing includes filtering, re-referencing, artifact detection, and signal reliability scoring.
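The preprocessing steps above can be sketched as a small pipeline. This is illustrative only: the function name `preprocess`, the 1–40 Hz band, and the 150 µV artifact threshold are assumptions, not the platform's actual API.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(eeg, fs, band=(1.0, 40.0), artifact_uv=150.0):
    """Band-pass filter, common-average re-reference, and flag artifacts.

    eeg: array of shape (n_channels, n_samples), values in microvolts.
    Returns (clean, artifact_mask), where artifact_mask marks samples whose
    absolute amplitude exceeds `artifact_uv` on any channel.
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, eeg, axis=1)          # zero-phase band-pass
    clean = filtered - filtered.mean(axis=0, keepdims=True)  # common average reference
    artifact_mask = np.any(np.abs(clean) > artifact_uv, axis=0)
    return clean, artifact_mask

# Example: 4 channels, 2 seconds of synthetic data at 250 Hz
rng = np.random.default_rng(0)
data = rng.normal(0.0, 10.0, size=(4, 500))
clean, bad = preprocess(data, fs=250)
```

A common average reference is only one re-referencing choice; a real deployment would pick the reference scheme per montage and marker.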

Marker/Feature Extraction

  • Core library of evidence-backed markers (e.g. SMR (sensorimotor rhythm), individualized alpha, theta/beta ratio, and slow cortical potentials (SCPs)).
  • Expandable registry of features (e.g. band power, coherence).
  • Each marker is well-defined: modality, channels, latency, and evidence level.
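One way to make "each marker is well-defined" concrete is a registry of marker specs. The names here (`MarkerSpec`, `register`, the SMR entry and its 12–15 Hz band) are a hypothetical sketch of such a registry, not the platform's real schema.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple
import numpy as np

@dataclass(frozen=True)
class MarkerSpec:
    name: str
    modality: str            # e.g. "EEG"
    channels: Tuple[str, ...]  # channel labels the marker needs
    latency_ms: int          # minimum window required to compute it
    evidence_level: str      # e.g. "strong", "moderate", "exploratory"
    compute: Callable        # (window: np.ndarray, fs: float) -> float

REGISTRY: Dict[str, MarkerSpec] = {}

def register(spec: MarkerSpec) -> None:
    REGISTRY[spec.name] = spec

def band_power(window, fs, lo, hi):
    """Mean periodogram power in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window, axis=-1)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return float(psd[..., mask].mean())

register(MarkerSpec(
    name="smr",
    modality="EEG",
    channels=("C3", "C4"),
    latency_ms=1000,
    evidence_level="strong",
    compute=lambda w, fs: band_power(w, fs, 12, 15),  # sensorimotor rhythm band
))
```

New features (band power variants, coherence) would slot in as further `register` calls, which is what makes the registry "expandable."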

Construct Axes Calculation

  • Axes are high-level dimensions derived from markers (e.g. Calm Focus, Task Engagement, Cognitive Control).
  • Each axis fuses multiple markers.
  • Axes are reusable across apps, acting as stable “control knobs.”
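As a sketch of how an axis might fuse multiple markers: z-score each marker against a per-user baseline, combine with signed weights, and squash to a bounded score. The weights, baselines, and the "Calm Focus" recipe below are illustrative assumptions, not calibrated values.

```python
import numpy as np

def axis_score(marker_values, weights, baselines):
    """Fuse marker values into one axis score in [-1, 1].

    Each marker is z-scored against a per-user baseline (mean, std),
    combined with signed weights, then squashed with tanh.
    """
    z = [(marker_values[name] - mu) / sd for name, (mu, sd) in baselines.items()]
    w = [weights[name] for name in baselines]
    return float(np.tanh(np.dot(w, z)))

# Hypothetical "Calm Focus" axis: more SMR and alpha, less theta/beta ratio
weights = {"smr": 0.5, "alpha": 0.3, "theta_beta": -0.2}
baselines = {"smr": (1.0, 0.2), "alpha": (2.0, 0.5), "theta_beta": (3.0, 1.0)}
score = axis_score({"smr": 1.2, "alpha": 2.5, "theta_beta": 2.0},
                   weights, baselines)
```

Keeping the fusion behind a single `axis_score`-style function is what lets axes act as stable "control knobs": apps see one bounded number per axis, while the marker recipe underneath can evolve.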

Task-Specific States

  • Pre-defined states are combinations of axes (e.g. calm-focused, distracted, over-aroused).
  • States are what apps respond to.
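A minimal sketch of states as combinations of axes, assuming axis scores in [-1, 1] and simple per-axis bounds (the rule table and state names are illustrative):

```python
def classify_state(axes, rules):
    """Return the first state whose axis conditions all hold, else 'neutral'.

    `rules` maps state name -> {axis name: (lo, hi) bounds}.
    """
    for state, bounds in rules.items():
        if all(lo <= axes[name] <= hi for name, (lo, hi) in bounds.items()):
            return state
    return "neutral"

rules = {
    "calm_focused": {"calm_focus": (0.3, 1.0), "task_engagement": (0.3, 1.0)},
    "over_aroused": {"calm_focus": (-1.0, -0.3)},
}
state = classify_state({"calm_focus": 0.6, "task_engagement": 0.5}, rules)
# state == "calm_focused"
```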

SDK for Student-Led Development

Defining New States

  • Students combine existing axes to create new states relevant to their domain.
  • Example: Define a “flow state” as a combination of high Task Engagement and Calm Focus.
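The flow-state example might look like this in student code. `StateDef` and the 0.5 / 0.4 thresholds are hypothetical, not part of the real SDK surface:

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class StateDef:
    """A student-defined state: minimum scores on one or more axes."""
    name: str
    conditions: Dict[str, float]  # axis name -> minimum score

    def matches(self, axes: Dict[str, float]) -> bool:
        return all(axes.get(a, 0.0) >= lo for a, lo in self.conditions.items())

# "Flow" = high Task Engagement AND high Calm Focus
flow = StateDef("flow", {"task_engagement": 0.5, "calm_focus": 0.4})
flow.matches({"task_engagement": 0.7, "calm_focus": 0.6})  # True
```

Because the state is defined purely in terms of existing axes, it inherits their calibration and validation for free.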

Feedback Policy & Interactive Design

  • The SDK exposes state listeners; students define how feedback adapts when states change.
  • Students can build interactive experiences—games, tasks, or interfaces that shift based on user brain state.
  • Feedback can be visual, auditory, or task difficulty changes.
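A state-listener pattern could look like the toy dispatcher below. `StateBus`, `on_state`, and `publish` are invented names sketching the idea, not the SDK's actual API; a key design choice shown is firing callbacks only on state *transitions*, not on every sample.

```python
class StateBus:
    """Tiny dispatcher: callbacks fire when the classified state changes."""

    def __init__(self):
        self._listeners = {}
        self._current = None

    def on_state(self, state, callback):
        self._listeners.setdefault(state, []).append(callback)

    def publish(self, state):
        if state != self._current:            # fire only on transitions
            self._current = state
            for cb in self._listeners.get(state, []):
                cb(state)

events = []
bus = StateBus()
bus.on_state("calm_focused", lambda s: events.append("expand visuals"))
bus.on_state("distracted", lambda s: events.append("lower task difficulty"))
bus.publish("calm_focused")
bus.publish("calm_focused")   # duplicate state: no second event
bus.publish("distracted")
# events == ["expand visuals", "lower task difficulty"]
```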

Vibecoding in the SDK

  • Students create new neurofeedback apps by:
    • Subscribing to states or axes.
    • Designing novel feedback rules (e.g. when calm, show visual expansion).
    • Creating interactive games or tasks that adapt to brain state shifts.
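The "when calm, show visual expansion" feedback rule from the list above can be sketched as a mapping from an axis score to a rendering parameter; the function name and constants are illustrative.

```python
def visual_scale(calm_focus: float, base: float = 1.0, gain: float = 0.5) -> float:
    """Map a Calm Focus score in [-1, 1] to a visual expansion factor.

    Visuals expand only above the calm baseline; below it they stay at base size.
    """
    return base + gain * max(0.0, calm_focus)

# Simulated stream of Calm Focus scores -> per-frame scale factors
samples = [-0.4, 0.1, 0.8]
scales = [visual_scale(s) for s in samples]
```

An app would subscribe to the Calm Focus axis and feed each incoming score through a rule like this, so the same axis can drive visual, auditory, or difficulty feedback with different mappings.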