- 🐍 Python SDK (nimbus-bci): sklearn-compatible classifiers with MNE-Python integration
- ⚡ Julia SDK (NimbusSDK.jl): Built on RxInfer.jl for maximum performance
Why Nimbus BCI
Brain-Computer Interfaces face unique challenges: noisy neural signals, inherent uncertainty, real-time requirements, and the need for explainability in medical applications.
Current BCI Challenges:
- High Latency: Standard processing takes 200ms+ for trial classification
- No Uncertainty Quantification: Deterministic outputs without confidence measures
- Limited Adaptability: Cannot handle changing brain states or signal quality
- Black Box Models: Deep learning lacks explainability for FDA approval
How Nimbus BCI Helps:
- ✅ Fast Inference: 10-20ms per trial with Bayesian models
- ✅ Uncertainty Quantification: Full posterior distributions, not just point estimates
- ✅ Training & Calibration: Subject-specific personalization in minutes
- ✅ Explainable: White-box probabilistic models for medical compliance
- ✅ Production-Ready: Batch and streaming modes, quality assessment, performance tracking
Python SDK Quickstart
Get started with nimbus-bci in Python
Julia SDK Quickstart
Get started with NimbusSDK.jl in Julia
Core Features
Fast Bayesian Inference
Bayesian LDA and Bayesian GMM models with 10-20ms inference latency using RxInfer.jl reactive message passing
Training & Calibration
Train custom models on your data or calibrate pre-trained models with 10-20 trials per subject
Real-time Streaming
Process EEG chunks as they arrive with chunk-by-chunk inference and weighted aggregation
Uncertainty Quantification
Full posterior distributions with confidence scores for quality assessment and rejection
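To illustrate how chunk-by-chunk inference, weighted aggregation, and confidence-based rejection can fit together, here is a minimal Python sketch. It assumes only an sklearn-compatible classifier exposing predict_proba (e.g. a trained NimbusLDA); the linear weighting scheme and the 0.7 rejection threshold are illustrative choices, not the SDK's actual streaming implementation.

```python
import numpy as np

def aggregate_chunk_posteriors(clf, chunks, reject_below=0.7):
    """Chunk-by-chunk inference with weighted aggregation and rejection.

    clf    : any sklearn-compatible classifier exposing predict_proba,
             e.g. a trained NimbusLDA / NimbusGMM instance.
    chunks : list of per-chunk feature vectors, each of shape (n_features,).
    Returns (predicted_class, confidence); predicted_class is None when the
    aggregated confidence falls below the rejection threshold.
    """
    # Weight later chunks more heavily (illustrative choice).
    weights = np.linspace(0.5, 1.0, num=len(chunks))
    posteriors = np.stack([clf.predict_proba(c[None, :])[0] for c in chunks])
    aggregated = np.average(posteriors, axis=0, weights=weights)

    confidence = float(aggregated.max())
    if confidence < reject_below:
        return None, confidence          # reject trial: confidence too low
    return int(aggregated.argmax()), confidence
```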
Implemented Models
Both SDKs provide three production-ready Bayesian models:
Bayesian LDA
Bayesian Linear Discriminant Analysis with shared covariance. Fast (10-15ms), efficient, ideal for well-separated classes like motor imagery. Python: NimbusLDA, Julia: RxLDAModel.
Bayesian GMM
Bayesian Gaussian Mixture Model with class-specific covariances. More flexible (15-25ms), better for overlapping distributions like P300. Python: NimbusGMM, Julia: RxGMMModel.
Bayesian Softmax/MPR
Bayesian Multinomial Logistic Regression with continuous transitions. Most flexible (15-25ms), ideal for complex multinomial classification tasks. Python: NimbusSoftmax, Julia: RxPolyaModel.
BCI Paradigms Supported
Motor Imagery
2-4 class classification
- Left/right hand
- Hands/feet/tongue
- 70-90% accuracy
- RxLDA recommended
P300
Binary classification
- Target/non-target
- Speller applications
- 80-95% accuracy
- RxGMM recommended
SSVEP
Multi-class frequency classification
- 2-6 target frequencies
- High accuracy (85-98%)
- Works well with either RxLDA or RxGMM
Quick Example
- Python
- Julia
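A minimal Python sketch of the sklearn-style workflow (the Julia flow is analogous with NimbusSDK.jl). The nimbus_bci import path, the synthetic data, and the NimbusLDA constructor defaults are assumptions for illustration; see the Python SDK Quickstart for the exact API.

```python
import numpy as np
from nimbus_bci import NimbusLDA  # import path assumed from the package name

# Synthetic stand-in for preprocessed motor-imagery features
# (n_trials, n_features), e.g. CSP or band-power features from MNE-Python.
rng = np.random.default_rng(42)
X = rng.normal(size=(60, 8))
y = rng.integers(0, 2, size=60)      # 0 = left hand, 1 = right hand

clf = NimbusLDA()                    # sklearn-compatible Bayesian LDA
clf.fit(X[:50], y[:50])              # train / calibrate on labelled trials

proba = clf.predict_proba(X[50:])    # full class posteriors per trial
preds = clf.predict(X[50:])          # hard labels (argmax of the posterior)
confidence = proba.max(axis=1)       # per-trial confidence for rejection
print(preds, confidence)
```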
Use Cases
Assistive Technologies
Wheelchair control, prosthetic limbs, communication devices with explainable, FDA-ready probabilistic models
Research Platforms
Academic research with training capabilities, subject-specific calibration, and comprehensive performance metrics
Neurofeedback
Real-time brain state monitoring with streaming inference and confidence-based quality control
Gaming & Wellness
Low-latency brain control for immersive experiences with adaptive difficulty based on confidence
SDK Architecture
- Local Inference: All processing on your machine (Python SDK is fully local)
- Privacy: Your EEG data never leaves your computer
- Speed: No network latency, consistent <20ms performance
- Offline Capable: Python SDK works completely offline
The Nimbus API is used only for:
- Authentication and licensing
- Pre-trained model distribution
- Optional analytics logging
Getting Started
- Python
- Julia
1. Install Python SDK
2. Preprocess EEG: Use MNE-Python to extract features (CSP, bandpower, ERP)
3. Train & Predict: Use the sklearn-compatible API to train models and make predictions (see the sketch below)
4. Deploy: Integrate with sklearn pipelines, streaming inference, or real-time applications
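To make steps 2-4 concrete, here is a minimal sketch combining MNE-Python's CSP transformer with a scikit-learn Pipeline. The mne.decoding.CSP and sklearn calls are standard; the pip package names in the comment, the nimbus_bci import path, and the NimbusLDA defaults are assumptions taken from these docs rather than verified commands.

```python
# pip install nimbus-bci mne scikit-learn   (package names assumed from these docs)
import numpy as np
from mne.decoding import CSP
from sklearn.pipeline import Pipeline
from nimbus_bci import NimbusLDA  # import path assumed

# epochs: band-pass filtered EEG of shape (n_trials, n_channels, n_times),
# e.g. from mne.Epochs(...).get_data(); a synthetic stand-in is used here.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(40, 16, 250))
labels = rng.integers(0, 2, size=40)           # left vs. right hand motor imagery

pipeline = Pipeline([
    ("csp", CSP(n_components=4, log=True)),    # spatial filters -> log-variance features
    ("clf", NimbusLDA()),                      # Bayesian LDA, sklearn-compatible
])

pipeline.fit(epochs, labels)                       # train / calibrate
posteriors = pipeline.predict_proba(epochs[-5:])   # deploy: posteriors for new trials
print(posteriors)
```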
Documentation
Python SDK
sklearn-compatible Bayesian classifiers
Julia SDK
RxInfer.jl-based inference engine
Model Training
Train custom models on your data
Preprocessing Guide
Required EEG preprocessing steps
Code Examples
Working Python and Julia examples
API Reference
Authentication and model registry
Performance
| Metric | RxLDA | RxGMM | Bayesian MPR |
|---|---|---|---|
| Inference Latency | 10-15ms | 15-25ms | 15-25ms |
| Training Time | 10-30s | 15-40s | 15-40s |
| Calibration Time | 5-15s | 8-20s | 8-20s |
| Memory Usage | Low | Moderate | Moderate |
| Accuracy | 70-90% (MI) | 80-95% (P300) | 70-85% (Complex) |
Technology Stack
Python SDK:
- Core: Python 3.10+, NumPy, JAX, NumPyro
- Integration: scikit-learn pipelines, MNE-Python
- Models: NimbusLDA, NimbusGMM, NimbusSoftmax (Polya-Gamma VI)
Julia SDK (NimbusSDK.jl):
- Core: Julia 1.9+
- Inference Engine: RxInfer.jl - Reactive message passing
- Models: RxLDAModel, RxGMMModel, RxPolyaModel
- Preprocessing: MNE-Python, EEGLAB, BrainFlow
- API: TypeScript/Vercel serverless (Julia SDK authentication)
What’s Included
Python SDK (nimbus-bci):
- ✅ sklearn-compatible classifiers: NimbusLDA, NimbusGMM, NimbusSoftmax
- ✅ MNE-Python integration for EEG preprocessing
- ✅ Streaming inference for real-time BCI
- ✅ Online learning with partial_fit()
- ✅ Comprehensive metrics and diagnostics
Julia SDK (NimbusSDK.jl):
- ✅ RxInfer.jl-based models: RxLDAModel, RxGMMModel, RxPolyaModel
- ✅ Pre-trained model distribution
- ✅ Batch and streaming inference
- ✅ Training and calibration
- ✅ Quality assessment and ITR calculation
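For context on the ITR figure mentioned above: BCI information transfer rate is conventionally computed with the Wolpaw formula from the number of classes, the classification accuracy, and the trial duration. A minimal Python version of that textbook formula (not necessarily the SDK's exact implementation):

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, trial_seconds: float) -> float:
    """Information transfer rate in bits per minute (Wolpaw formula)."""
    n, p = n_classes, accuracy
    bits = math.log2(n)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits * (60.0 / trial_seconds)

# Example: 4-class motor imagery at 85% accuracy with 3-second trials
print(round(wolpaw_itr(4, 0.85, 3.0), 1), "bits/min")
```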
Support
Email Support
[email protected] - Technical support and inquiries
Book a Demo
See Nimbus in action with live demonstration
API Status
Check API availability and version
GitHub
NimbusSDK.jl source code, issues, and examples
License
Python SDK (nimbus-bci): Proprietary license with free evaluation and academic tiers. See Python SDK Installation for details.
Julia SDK (NimbusSDK.jl): Commercial software with tiered licensing:
- Free: 10K monthly inferences, basic models
- Research: 50K monthly inferences, all features
- Commercial: 500K monthly inferences, priority support
- Enterprise: Unlimited, custom models, on-premise deployment
Built with ❤️ for the neurotechnology community
Nimbus BCI Engine - Bringing Bayesian inference to Brain-Computer Interfaces