Beyond the Joystick

Update 1 · May 10th 2026

A development log for an in-home, personalized brain-and-bio-signal interface for power wheelchair control. We update this page as the project moves forward.

In This Update

Hardware integration is validated end to end. The classification model reaches 92.9 percent cross-session accuracy, and its highest-confidence predictions reach 96 percent. The R-Net joystick command format has been decoded directly from captured chair traffic. A high-fidelity wheelchair simulator is now running on real captured data. Physical chair drive is held until an independent hardware emergency stop is in place.

First End-to-End Closed-Loop Session

It worked. The operator drove the simulated chair through the full pipeline today for the first time. Forward, left, right, and a sequence of turns and forwards strung together into actual navigation. No false fires, no watchdog stops, no commands the operator did not intend. The path came back near the origin by the end. Forward, around, and home. A user can do that with a hand joystick when the hand still works. The point of this project is that a user can now do that with a face. The session ran in our data-grounded simulator, built from real captured bus traffic from the actual F5 Corpus VS we work with, so the chair's response curve is the chair's, not a guess. Physical-chair driving remains held until the independent hardware emergency stop is in place. Everything upstream of the wheels is now demonstrated end to end with a real user in the loop.

Where We Are Right Now

Active phase: hardware integration validated, chair-side software in development.

Working today:

  • Real-time EEG capture with per-electrode signal-quality monitoring.

  • Trained four-class model with cross-session validation, serving live inferences.

  • Confidence-thresholded state machine that fires commands only after consistent multi-window agreement (see the sketch after this list).

  • Decoded R-Net joystick command format.

  • High-fidelity wheelchair simulator built from manufacturer specifications and captured bus traffic.
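
The confidence-thresholded state machine above is the piece that keeps one noisy prediction from moving the chair. Below is a minimal sketch of the idea in Python; the window count, confidence threshold, and class labels are placeholder values for illustration, not the ones running in the system.

    from collections import deque

    # Placeholder values -- the deployed threshold and window count differ.
    CONFIDENCE_THRESHOLD = 0.90   # minimum classifier confidence to count a window
    AGREEMENT_WINDOWS = 3         # consecutive agreeing windows required to fire
    NEUTRAL = "neutral"           # the held neutral state never fires a drive command

    class CommandGate:
        """Fire a command only after several consecutive confident, agreeing windows."""

        def __init__(self):
            self.recent = deque(maxlen=AGREEMENT_WINDOWS)

        def update(self, label, confidence):
            """Feed one classifier output per inference window; return a command or None."""
            if confidence < CONFIDENCE_THRESHOLD:
                self.recent.clear()          # a low-confidence window breaks the streak
                return None
            self.recent.append(label)
            if len(self.recent) == AGREEMENT_WINDOWS and len(set(self.recent)) == 1:
                self.recent.clear()          # require a fresh streak for the next command
                return None if label == NEUTRAL else label
            return None

Fed window by window, a single misclassified or low-confidence window resets the streak instead of reaching the chair, which is the behavior behind the "no false fires" result described above.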

Not yet complete:

  • Physical chair drive. Held until an independent hardware emergency stop is in place.

  • Population validation. Current results are from one operator.

  • Longitudinal stability across weeks of use.

What's Next

  1. Hardware emergency stop. A safety device in line with the control bus that cuts drive power if commands stop arriving; a sketch of the timing logic follows this list. Prerequisite for any physical-chair driving.

  2. Full chair integration. The captured boot data is in hand. The safety hardware needs to be in place first.

  3. Multi-subject validation. Recruit users across the intended population, with appropriate review and informed consent.

  4. Longitudinal study. Track classification stability across weeks rather than days.
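
For the emergency stop in item 1, the core behavior is a heartbeat timeout. The sketch below shows only the timing logic, in Python for readability; the real device is independent hardware, and the timeout value and relay interface here are assumptions for illustration.

    import time

    HEARTBEAT_TIMEOUT_S = 0.25   # placeholder cutoff interval

    class DriveCutoff:
        """Open the drive-power relay if command heartbeats stop arriving."""

        def __init__(self, open_relay):
            self.open_relay = open_relay          # callable that physically cuts drive power
            self.last_heartbeat = time.monotonic()

        def heartbeat(self):
            """Call each time a valid command reaches the chair."""
            self.last_heartbeat = time.monotonic()

        def poll(self):
            """Call continuously; trips the relay once the heartbeat goes stale."""
            if time.monotonic() - self.last_heartbeat > HEARTBEAT_TIMEOUT_S:
                self.open_relay()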

For New Readers: What Beyond the Joystick Is

Beyond the Joystick is a power wheelchair control system for users whose hands can no longer reliably operate a hand joystick, including those with high-level spinal cord injury, advanced ALS, severe muscular dystrophy, and locked-in syndrome. We use a four-electrode consumer EEG headband to read intentional facial gestures (a wink, a jaw clench, a held neutral state), classify them with a neural network trained in the user's home, and send drive commands to a Permobil power wheelchair through its R-Net control bus.

Our Approach: Hybrid Bio-Signal Control

Traditional brain-computer interfaces classify pure brain activity. They are slow, fatiguing, and sensitive to electrode contact quality. Our system is different by design. We treat the eye movements and small facial muscle activations that consumer EEG hardware picks up as the signal, not as artifacts to filter out. The literature calls this a hybrid bio-signal interface, and it works well for the population that needs assistive technology most.

The classifier is an advanced neural network designed for short-window bio-signal data. On cross-session validation, the four-class system reaches 92.9 percent accuracy. With a confidence threshold applied, accuracy rises to 96.0 percent on the 88 percent of windows where the system is confident enough to issue a decision. Inference runs in about 5 milliseconds, well within the latency budget for real-time control.
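
The accuracy and coverage figures above are two views of the same thresholding step: keep only the windows the model is most confident about. Here is a minimal sketch of how both numbers fall out of held-out predictions, assuming standard softmax outputs; the threshold value is illustrative.

    import numpy as np

    def thresholded_accuracy(probs, labels, threshold=0.9):
        """Accuracy and coverage when only confident predictions may fire.

        probs:  (n_windows, n_classes) softmax outputs from the classifier
        labels: (n_windows,) true class indices
        """
        confidence = probs.max(axis=1)
        predicted = probs.argmax(axis=1)
        kept = confidence >= threshold                 # windows where the system decides
        coverage = kept.mean()                         # fraction of windows with a decision
        accuracy = (predicted[kept] == labels[kept]).mean() if kept.any() else float("nan")
        return accuracy, coverage

Raising the threshold trades coverage for accuracy; the 96.0 percent accuracy over 88 percent coverage reported above is one operating point on that curve.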

The chair side speaks through R-Net, the control-bus protocol used across most Permobil power wheelchairs. We have decoded the joystick command format, characterized the chair's response from captured traffic, and recorded the boot sequence required for full integration.
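
For readers who want a concrete picture of what "sending a joystick command" looks like on a bus like this, here is a sketch using python-can, assuming a CAN-based transport and a Linux SocketCAN interface. The arbitration ID and byte layout are deliberately placeholder values, not the decoded R-Net format.

    import can

    # Placeholder framing only -- NOT the decoded R-Net layout.
    JOYSTICK_ARBITRATION_ID = 0x000   # placeholder CAN ID
    CHANNEL = "can0"                  # assumes a SocketCAN interface

    def send_joystick(bus, x, y):
        """Send one joystick sample; x and y are signed percentages (-100..100)."""
        msg = can.Message(
            arbitration_id=JOYSTICK_ARBITRATION_ID,
            data=[x & 0xFF, y & 0xFF],    # placeholder two-byte payload
            is_extended_id=True,
        )
        bus.send(msg)

    if __name__ == "__main__":
        with can.Bus(channel=CHANNEL, interface="socketcan") as bus:
            send_joystick(bus, 0, 50)     # half-speed forward in this toy encoding

The real command format, timing, and boot handshake come from the captured traffic; the sketch only shows the shape of the plumbing.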

No surgical implantation. No cloud connectivity. No institutional infrastructure. The whole pipeline runs locally on portable hardware.

For Healthcare Staff and Caregivers

The headband. No electrode gel, no skin preparation, no clinical infrastructure. It rests on the forehead and temporal areas like a pair of glasses. Fitting takes under 60 seconds with assistance.

The control gestures. Winks and jaw clenches are intentional facial actions most users retain through significant disease progression, often long after a hand joystick has become unreliable. They are natural enough that user training is minimal compared to motor-imagery systems, which can take weeks of practice.

The model is trained for one person at a time, in their home. Most medical AI is trained on thousands of patients, deployed to the cloud, and used by everyone the same way. Our model is the opposite. Every user trains their own copy, on their own signals, in their own living room. A user wears the headband and performs each gesture on cue while a laptop records the data. After a couple of sessions across two days, the model is ready.

This matters because each person's signal is genuinely their own. A wink in one user looks different from a wink in another. The same user produces slightly different signals after a few weeks of disease progression. A personalized model adapts. A generic one fights those differences.

Rural patients, veterans, and remote access. Removing the clinic from the loop changes who this technology can reach. A patient three hours from the nearest specialty rehab center is in the same position as one next door, because setup, training, and use all happen at home on equipment the family can purchase directly. Veterans living with service-connected spinal cord injury or ALS, including those served by VA home-based primary care or living far from a polytrauma center, gain access without the travel and scheduling burden that often gates current options. No facility visit to recalibrate. No internet dependency, so it works without broadband. No cloud account, no recurring fee, no third-party data flow.

The system is additive, not a replacement. A user keeps their existing chair and seating. The BCI is an alternative input the user or caregiver can engage and disengage as needed.

If you have a patient who could benefit from following this work, or practical questions about what an assistive BCI needs to do to be useful in your setting, we would like to hear from you. The contact form below reaches us directly.

For Researchers and Developers

A pre-print describing the architecture, training methodology, and integration work is available on request. The project is in active development, and we are open to research collaboration, particularly with partners working on assistive-technology-grade safety architectures and multi-subject validation protocols.