Conducting Tutor

A research-style tool that ingests video of a conductor, tracks pose landmarks over time, and turns movement into beat detection, cycle analysis, and publication-ready plots.

Timeline
Fall term — analysis pipeline, UI, and graph outputs
Role
Author — computer vision pipeline, algorithms, and interface
Stack
Python · MediaPipe · NumPy · Matplotlib · custom analysis modules
[Screenshot: Conducting Tutor menu screen with Live, Video, and Settings]

Figma exploration

Layout and interaction concepts for the analysis workflow and visualization surfaces, paired with the Python pipeline and MediaPipe-backed processing described below.

What it does

The program runs a multi-stage pipeline: frame extraction, landmark detection with MediaPipe, and signal processing that finds peaks, beats, and expressive motion such as sway, mirroring, and cueing.
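As a sketch of the signal-processing step, beat candidates can be found as local extrema of a smoothed wrist trajectory. The function names, smoothing width, and thresholds here are illustrative, not the project's actual code; note that in image coordinates y grows downward, so the bottom of a conducting stroke (the ictus) is a local maximum of raw y:

```python
import numpy as np

def moving_average(x, w=5):
    """Smooth a 1-D signal with a simple box filter."""
    kernel = np.ones(w) / w
    return np.convolve(np.asarray(x, dtype=float), kernel, mode="same")

def detect_beats(wrist_y, fps=30.0, min_interval_s=0.25):
    """Return frame indices of beat candidates: local maxima of the
    smoothed wrist-y trace, spaced at least min_interval_s apart."""
    y = moving_average(wrist_y)
    min_gap = int(min_interval_s * fps)  # reject double-triggers on one stroke
    half = 2                             # box-filter half-width; skip edge-affected samples
    beats = []
    for i in range(half, len(y) - half - 1):
        if y[i] > y[i - 1] and y[i] >= y[i + 1]:  # local maximum
            if not beats or i - beats[-1] >= min_gap:
                beats.append(i)
    return beats

# Synthetic wrist trace: a 2 Hz bounce sampled at 30 fps for 3 seconds.
t = np.arange(0, 3, 1 / 30)
trace = 0.5 + 0.1 * np.sin(2 * np.pi * 2 * t)
print(detect_beats(trace))
```

On this synthetic trace the detector fires once per 2 Hz cycle, i.e. every 15 frames; a real MediaPipe wrist stream would be noisier and need the filtering described under Capabilities.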

Outputs include annotated frames, BPM-style summaries, and graphs for beat position, hand paths, and time-signature estimates, useful for studying how conducting maps to measurable motion.
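A BPM-style summary reduces to averaging the spacing between detected beats. This helper is a minimal sketch (the name and signature are assumptions, not the project's API):

```python
def bpm_from_beats(beat_frames, fps=30.0):
    """Estimate tempo from beat frame indices by averaging inter-beat intervals."""
    if len(beat_frames) < 2:
        return None  # not enough beats to estimate a tempo
    intervals = [(b - a) / fps for a, b in zip(beat_frames, beat_frames[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

# 15-frame gaps at 30 fps are 0.5 s per beat, i.e. 120 BPM.
print(bpm_from_beats([4, 19, 34, 49], fps=30.0))  # 120.0
```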

How the pipeline fits together

Core stages are split into dedicated modules (segment processing, cycle analysis, and graph generation) so experiments can swap algorithms without rewriting the whole stack.
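One common way to get that swap-ability is to treat each stage as a function over a shared context and compose them in order. This is a sketch of the pattern, with hypothetical stage bodies that only echo the module names above:

```python
from typing import Any, Callable, Dict, List

Stage = Callable[[Dict[str, Any]], Dict[str, Any]]

def run_pipeline(stages: List[Stage], ctx: Dict[str, Any]) -> Dict[str, Any]:
    """Run each stage in order; stages read and write a shared context dict."""
    for stage in stages:
        ctx = stage(ctx)
    return ctx

def segment_processing(ctx):
    # Toy stand-in: chop the signal into fixed-length segments.
    ctx["segments"] = [ctx["signal"][i:i + 3] for i in range(0, len(ctx["signal"]), 3)]
    return ctx

def cycle_analysis(ctx):
    # Toy stand-in: summarize each segment.
    ctx["cycle_means"] = [sum(s) / len(s) for s in ctx["segments"]]
    return ctx

result = run_pipeline([segment_processing, cycle_analysis], {"signal": [1, 2, 3, 4, 5, 6]})
print(result["cycle_means"])  # [2.0, 5.0]
```

Swapping an algorithm then means substituting one function in the stage list rather than editing the stack.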

Capabilities

Beat detection

Filtering and peak logic tuned for conducting gestures rather than generic motion.
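Two gesture-aware filters fit this description: a plausible-tempo gate (conducting rarely exceeds a few beats per second) and a prominence check that separates deliberate strokes from pose-landmark jitter. This sketch assumes illustrative thresholds and names, not the project's tuned values:

```python
import numpy as np

def filter_peaks(peak_idx, signal, fps=30.0, min_bpm=30, max_bpm=240, min_prominence=0.02):
    """Keep peaks consistent with plausible conducting tempi and with
    enough local prominence to be a deliberate stroke, not jitter."""
    signal = np.asarray(signal, dtype=float)
    max_gap = 60.0 / min_bpm * fps  # slowest plausible beat spacing, in frames
    min_gap = 60.0 / max_bpm * fps  # fastest plausible beat spacing, in frames
    kept = []
    for p in peak_idx:
        # Prominence relative to the local neighbourhood.
        lo, hi = max(0, p - int(max_gap)), min(len(signal), p + int(max_gap))
        if signal[p] - signal[lo:hi].min() < min_prominence:
            continue  # too shallow: landmark noise
        if kept and (p - kept[-1]) < min_gap:
            continue  # too close: likely a double-trigger on one stroke
        kept.append(p)
    return kept

sig = np.zeros(100)
for i, h in [(10, 1.0), (12, 0.6), (40, 1.0), (55, 0.01), (70, 1.0)]:
    sig[i] = h
print(filter_peaks([10, 12, 40, 55, 70], sig))  # [10, 40, 70]
```

Here the peak at frame 12 is rejected as a double-trigger and the one at 55 as jitter.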

Hand & body cues

Mirror detection, elbow tracking, and cueing classifiers layered on pose streams.

Visualization

Configurable graphs and an interface layer for running analyses end to end.
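One way to make graphs configurable is a registry: each analysis registers its plotting function under a name, and the interface layer runs only the graphs a user config enables. This is a sketch of the pattern with hypothetical graph names, not the project's actual interface:

```python
# Hypothetical graph registry keyed by name.
GRAPHS = {}

def register_graph(name):
    def wrap(fn):
        GRAPHS[name] = fn
        return fn
    return wrap

@register_graph("beat_positions")
def beat_positions(data):
    return f"plotting {len(data['beats'])} beats"

@register_graph("hand_path")
def hand_path(data):
    return f"plotting path of {len(data['points'])} points"

def run_graphs(config, data):
    """Run only the graphs the user enabled in the config."""
    return [GRAPHS[name](data) for name in config["enabled"]]

out = run_graphs({"enabled": ["beat_positions"]}, {"beats": [4, 19, 34], "points": []})
print(out)  # ['plotting 3 beats']
```

Adding a new visualization is then a decorated function, with no changes to the runner.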
