Graduate coursework · Biofeedback Innovations for Musculoskeletal Systems

Biofeedback Rehab Study

Motion-capture study testing whether real-time audio feedback helps athletes hold knee alignment under fatigue. Led a 4-person team through study design, test execution, and data analysis. Essentially usability testing for a biofeedback system, dressed up as a kinesiology experiment.

Role Study & iteration lead · 4-person team
Timeline Sep – Dec 2024
Type Motion capture study / Biofeedback system
Participants 3 athletes · 15 kick trials
Read time 10 min

Visual coaching cues break down under fatigue

In musculoskeletal rehab, knee alignment during loaded movement is one of the strongest predictors of injury. Clinicians coach it visually: watch the patient, call out when the knee drifts, demonstrate again. That works for the first few reps. It stops working when the patient is winded, the rep count climbs, or the clinician is running three patients at once.

The question the study asked: can a real-time audio cue, triggered when the knee angle leaves a safe band, replace or augment that visual coaching? If it can, then the same idea scales to solo home exercise, post-discharge maintenance, and return-to-sport training where no clinician is watching at all.

Visual coaching only

Clinician eye + verbal cue

Works for the first few reps. Degrades as the patient fatigues and the clinician splits attention. Impossible at home, on the track, or in a busy clinic. No record of how often the knee actually drifted out of safe range across a session.

With real-time audio biofeedback

Motion-cap + Max/MSP audio cue

OptiTrack streams joint angles into Max/MSP. The moment the knee leaves its safe band, the system plays a cue the athlete can hear without looking. Every deviation is logged. Clinician can watch the session, or watch the data later, or both.

How the system works

OptiTrack captures joint positions at 120 Hz. Custom MATLAB scripts compute knee flexion and valgus angles frame by frame. When an angle leaves the safe band, a trigger goes to Max/MSP, which plays a short audio cue the athlete can hear without breaking form. EMG on the quadriceps runs in parallel so we can correlate muscle activation with alignment failures.
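The closed-loop logic above can be sketched in a few lines. This is an illustrative Python translation of the MATLAB/Max pipeline, not the study's actual code; the function names, the safe-band structure, and the edge-trigger rule (fire once on exiting the band, not on every frame outside it) are assumptions.

```python
def knee_in_safe_band(flexion_deg, valgus_deg, band):
    """Return True when both angles sit inside the athlete's safe band."""
    lo_f, hi_f = band["flexion"]
    return lo_f <= flexion_deg <= hi_f and abs(valgus_deg) <= band["valgus_max"]

def process_frame(flexion_deg, valgus_deg, band, state, fire_cue, log):
    """Fire the audio cue on the transition out of the safe band,
    and log every deviation event for offline review."""
    inside = knee_in_safe_band(flexion_deg, valgus_deg, band)
    if state["inside"] and not inside:
        fire_cue()  # e.g. a message out to the Max/MSP patch
        log.append((state["frame"], flexion_deg, valgus_deg))
    state["inside"] = inside
    state["frame"] += 1
    return state
```

Edge-triggering matters here: a cue on every out-of-band frame at 120 Hz would be a continuous tone, not a correction prompt.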

System Architecture
OptiTrack Capture: Retro-reflective markers, 120 Hz
MATLAB Processing: Joint angles, 4–8 Hz bandpass, EMG sync
Max/MSP Audio Engine: Triggered cue when alignment exits safe band
Athlete Correction: Closed-loop feedback, next rep
Session Log: Alignment events, EMG, kick quality

Key design decisions

Audio instead of visual. A HUD or a light is useless during a Taekwondo kick. The athlete is looking at a target, not a screen. Audio leaves the visual field alone, which matters as soon as you imagine the same idea applied to a runner, a post-ACL patient doing squats, or anyone whose eyes need to be somewhere other than a screen.

Taekwondo kicks as the test movement. High load, high speed, high knee valgus risk, and repeatable across participants. If audio biofeedback can land inside that movement's rhythm, it can land in almost any rehab movement. And the participants were willing. That matters.

Ground-truth validation before any claims about athletes improving. Before we looked at whether participants moved better with the cue, we validated that the Max/MSP system was actually firing when and only when the knee angle crossed the threshold. That's the usability-testing mindset. You don't evaluate the intervention until you've verified the instrument.

Signal processing detail

EMG was bandpass-filtered 4 to 8 Hz to isolate the muscle activation envelope from motion artifact and high-frequency noise. Joint angles were computed from marker positions using standard vector math in MATLAB. The trigger logic compared the filtered knee angle to athlete-specific safe-band thresholds calibrated from a slow baseline kick at the start of each session. Every trigger event and every raw sample was logged to disk alongside the corresponding timestamp in the Max/MSP patch so we could reconstruct any trial offline and verify the system behaved the way the participant thought it did.
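The two offline steps in that paragraph, the 4–8 Hz envelope filter and the baseline-calibrated safe band, can be sketched as follows. This is a Python/SciPy stand-in for the MATLAB scripts; the filter order, the fixed-margin calibration rule, and all names are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 120  # capture / sync rate, Hz

def emg_envelope(emg, fs=FS):
    """4-8 Hz bandpass to isolate the activation envelope
    from motion artifact (below) and high-frequency noise (above)."""
    b, a = butter(4, [4.0, 8.0], btype="bandpass", fs=fs)
    return filtfilt(b, a, emg)  # zero-phase, so EMG stays time-aligned

def calibrate_safe_band(baseline_valgus, margin_deg=3.0):
    """Athlete-specific valgus threshold from a slow baseline kick:
    peak baseline deviation plus a fixed margin (assumed rule)."""
    return float(np.max(np.abs(baseline_valgus))) + margin_deg
```

Zero-phase filtering (`filtfilt`) is the natural choice offline because it avoids shifting the EMG envelope relative to the motion-capture timeline; the real-time path would need a causal filter instead.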

Research foundation

Biofeedback for musculoskeletal rehab sits at the intersection of motor learning, signal processing, and human factors. These are the concepts the study was grounded in.

Knowledge of Performance (KP)

External feedback about movement quality, delivered after or during the rep. Audio cues are a KP delivery mechanism. Drives motor learning when timed correctly.

Knee Valgus & Injury Risk

Dynamic knee valgus under load is one of the strongest predictors of ACL and patellofemoral injury. Keeping the knee tracked over the foot is the target behavior.

EMG Envelope Analysis

Bandpass-filtered EMG isolates the activation pattern from motion artifact. Tells you whether the right muscles are firing at the right time during a movement.

Usability Testing Mindset

Before evaluating whether the feedback helps athletes, validate that the system is firing correctly. Instrument first, intervention second. Borrowed straight from IEC 62366-1.

Study numbers

Small, fast, academic. The study was one semester and four people. Within that envelope, here's what we ran.

3 athletes recruited
15 kick trials analyzed
4–8 Hz EMG envelope bandpass
120 Hz OptiTrack capture rate

Built with

OptiTrack / Motive · MATLAB · Max/MSP · EMG Processing · Signal Filtering · Statistical Analysis · Study Design · Motion Capture

Why these choices?

OptiTrack + MATLAB. OptiTrack is the gold standard for research-grade motion capture. MATLAB handles the marker position math, joint angle computation, and EMG filtering in one environment. For an academic study on a one-semester timeline, that speed mattered.

Max/MSP for the audio side. Max/MSP is built for real-time audio triggered by data streams. It can accept a UDP message or a MIDI trigger and fire a sample with negligible latency. No need to write a custom audio engine from scratch when a mature tool already does exactly this.
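For concreteness, here is what firing Max/MSP over the network can look like. Max's `udpreceive` object parses OSC-formatted packets, so this hand-packs a minimal OSC message; the `/cue` address, port 7400, and the float payload are assumptions, not the study's actual patch.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to 4-byte boundaries."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def make_cue_packet(deviation_deg: float) -> bytes:
    """Pack '/cue <float32>' as a single OSC message."""
    return osc_pad(b"/cue") + osc_pad(b",f") + struct.pack(">f", deviation_deg)

def send_cue(deviation_deg, host="127.0.0.1", port=7400):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(make_cue_packet(deviation_deg), (host, port))
    sock.close()
```

UDP is a fit here precisely because the feedback loop tolerates a dropped packet better than a delayed one: a late cue is worse than a missed cue.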

EMG bandpass 4 to 8 Hz. That band captures the activation envelope of voluntary muscle contractions while rejecting motion artifact on one side and high-frequency noise on the other. It's the same band the tremor-suppression cuff project uses, just applied to the opposite problem.

What it would take to deploy this

From lab study to rehab clinic

The study used an OptiTrack rig that costs more than a car and takes an hour to calibrate. No clinic is going to deploy that. A realistic clinical version would swap OptiTrack for a consumer depth camera (Azure Kinect or similar) or a small IMU pod on the thigh and shin, with the audio cue running on a phone or a Bluetooth earbud. The usability finding the study produced (audio beats visual for coaching under load) is what transfers. The hardware is replaceable. That's the same pattern as PhysioRep, which took pose estimation out of the motion capture lab and put it on a phone.

What I learned

Ground-truth validation before claims. Before we ran any analysis on whether athletes moved better with the cue, we verified that the audio cue fired when and only when the knee actually left the safe band. That's usability testing logic applied to a research rig. You don't ask "did the intervention work" until you've confirmed the instrument is behaving the way you think it is.
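That instrument check is mechanical enough to sketch: replay a logged angle trace against the safe band, diff the result with the cue log, and report missed and false cues. A hypothetical Python version, with the frame-alignment tolerance and edge-trigger rule as assumptions about how the logs line up:

```python
def expected_cue_frames(valgus_trace, threshold):
    """Frames where the angle crossed out of the safe band (edge-triggered)."""
    frames, outside = [], False
    for i, v in enumerate(valgus_trace):
        now_outside = abs(v) > threshold
        if now_outside and not outside:
            frames.append(i)
        outside = now_outside
    return frames

def validate_cues(valgus_trace, threshold, logged_cue_frames, tol=2):
    """Match expected cues to logged cues within `tol` frames; anything
    unmatched is a missed cue (system failure) or a false cue (spurious fire)."""
    expected = expected_cue_frames(valgus_trace, threshold)
    logged = list(logged_cue_frames)
    missed = [f for f in expected if not any(abs(f - g) <= tol for g in logged)]
    false = [g for g in logged if not any(abs(f - g) <= tol for f in expected)]
    return {"missed": missed, "false": false}
```

Only when both lists come back empty across the validation trials does it make sense to start asking whether athletes respond to the cue.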

The clinician in the room is the bottleneck the study exists to remove. Watching a single clinician coach three patients at once made the point clear. Verbal cues are scarce when attention is split. A well-timed audio cue is cheap to deliver and never fatigues. That shifts the role of the clinician from reactive corrector to program designer. Same idea, applied through different hardware, is what PhysioRep does with a phone camera.

Audio is under-used in rehab tech. Most commercial rehab tools either do nothing during the rep or flash a screen the patient has to stop and look at. The screen-based cue is useless the moment the movement is complex or fast. Audio fills that gap and almost nobody is building for it. That was the biggest takeaway from this study, and it's shaped how I think about every feedback-driven device project since.

What I got wrong.

The study ran. The system worked. But I'd do several things differently if I ran it again.

01
n=3 is not enough to say anything definitive.

Three athletes across fifteen kick trials is enough to validate that the system fires correctly and that a cue can be delivered mid-kick without disrupting the athlete's rhythm. It is not enough to claim that audio biofeedback improves outcomes. I was careful not to overclaim in the final report, but the study design itself was underpowered for any statistical claim about performance. If I ran it again, I'd either recruit ten participants or reframe the study explicitly as a usability and feasibility pilot, not a performance study. I also wouldn't pick kicks. I'd pick a slower, more controllable movement where a larger sample could be run in a single afternoon.

02
I didn't pilot the audio cues with non-athletes.

The cues were designed for and tested on Taekwondo athletes who already had strong body awareness. A rehab patient recovering from ACL surgery has much less body awareness, much more fear of pain, and very different expectations about what sound means during movement. A beep that feels like a helpful nudge to a trained athlete might feel like a judgment to a post-op patient. I should have tested the cue design with a few non-athletes, including someone who had recently had a knee injury, to see whether the sound was interpreted the way I hoped. That would have caught the emotional tone of the feedback as an issue early, not late.

03
The data logging was not set up for reuse.

Every trial produced OptiTrack marker files, EMG channels, and a Max/MSP trigger log. I saved them all but I didn't build a consistent naming convention or a manifest file tying them together. Six months later, when I wanted to look back at a specific trial to pull a figure, I had to open three folders and manually cross-reference timestamps. A fifteen-minute investment in a file naming scheme and a CSV index at the start of the study would have saved hours of reconstruction work. Same lesson I learned on the capstone pneumothorax project, just earlier this time.
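The fifteen-minute fix described above is roughly this: one manifest row per trial tying the three raw files together. A sketch, with the column names and file layout as hypothetical choices:

```python
import csv
from pathlib import Path

def write_manifest(trials, out_path="manifest.csv"):
    """trials: list of dicts mapping each trial to its three raw files,
    e.g. {"trial_id": "T01", "athlete": "A1", "optitrack_file": ...}."""
    fields = ["trial_id", "athlete", "optitrack_file", "emg_file", "trigger_log"]
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(trials)
    return Path(out_path)
```

Appending a row at capture time costs seconds; reconstructing the same mapping from timestamps six months later cost hours.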

Answers before the interview.

If I were screening this portfolio, these are the three questions I'd ask. So here they are, answered.

Q1
Why call it usability testing and not a sports biomechanics study?

Because the primary question wasn't "do these athletes move differently with the cue" in any sports-performance sense. The primary question was "can this system deliver feedback the athlete understands and acts on, in the middle of a complex movement, without disrupting the movement." That's a human factors question about the usability of the biofeedback device. Framing it as sports biomechanics would have oversold what a three-athlete study can actually conclude. Framing it as usability testing is honest about what we were doing and maps directly onto how I'd evaluate a real rehab product before handing it to patients.

Q2
How would this hold up as a medical device?

As a research rig, it would not pass as a device. No risk analysis per ISO 14971, no formal software verification, no labeling, no clinical validation. That's expected for a coursework study. But the architecture separates the parts that would matter: the motion capture subsystem, the feedback logic, the audio delivery subsystem, and the data logging are all cleanly decoupled. If I were formalizing this into a device candidate, each block has a clear analog in 62366-1 usability evaluation, and the audio delivery channel has a clean failure mode analysis (missed cue, false cue, wrong-timing cue) that maps onto an FMEA. The research rig is not the device. It's the first iteration of the thinking that would eventually become one.

Q3
Why is this on an engineering portfolio and not a biomechanics portfolio?

Because the interesting part, for an employer evaluating engineering fit, is how the system was architected and validated, not the sports physiology result. I ran an end-to-end pipeline from motion capture through signal processing, trigger logic, real-time actuation, and logging. I led a four-person team through study design, execution, and analysis. I applied the same validation mindset I'd apply to a formal device (instrument first, intervention second, log everything, fail safely). That's the story an R&D or human factors or validation hiring manager cares about. The fact that it happened inside a kinesiology course is incidental.

Interested in this work?

Human factors and biofeedback for physical rehab is where I want to work. If you're building wearables, rehab devices, or coaching tools where the feedback loop matters, I want to hear from you.