Beat detection
A research-style tool that ingests video of a conductor, tracks pose landmarks over time, and turns movement into beat detection, cycle analysis, and publication-ready plots.
Design
Layout and interaction concepts for the analysis workflow and visualization surfaces—paired with the Python pipeline and MediaPipe-backed processing below.
Overview
The program runs a multi-stage pipeline: frame extraction, landmark detection with MediaPipe, then signal processing to find peaks, beats, and expressive motion like sway, mirror movement, and cueing.
Outputs include annotated frames, BPM-style summaries, and graphs for beat position, hand paths, and time-signature estimates—useful for studying how conducting maps to measurable motion.
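The beat-finding step described above can be sketched as a low-pass filter followed by peak picking on a wrist trajectory. This is a minimal illustration, not the project's actual code: the function name, the 5 Hz cutoff, and the use of the wrist's vertical coordinate are assumptions layered on standard SciPy calls.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def estimate_bpm(wrist_y, fps=30.0):
    """Estimate beats per minute from a per-frame wrist-height signal.

    `wrist_y` stands in for a MediaPipe landmark's y coordinate over
    time; the name and parameters here are illustrative assumptions.
    """
    # Low-pass filter to suppress landmark jitter; conducting gestures
    # rarely exceed a few Hz, so a 5 Hz cutoff keeps the beat content.
    b, a = butter(2, 5.0 / (fps / 2.0), btype="low")
    smooth = filtfilt(b, a, np.asarray(wrist_y, dtype=float))

    # Beats show up as local minima of wrist height (the ictus), so
    # detect peaks on the negated signal, at least 0.25 s apart.
    peaks, _ = find_peaks(-smooth, distance=int(0.25 * fps))
    if len(peaks) < 2:
        return None
    intervals = np.diff(peaks) / fps           # seconds between beats
    return 60.0 / float(np.median(intervals))  # BPM-style summary
```

Using the median inter-beat interval rather than the mean keeps a single missed or spurious peak from skewing the tempo estimate.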
Architecture
Core stages are split into dedicated modules—segment processing, cycle analysis, and graph generation—so experiments can swap algorithms without rewriting the whole stack.
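One way to read the module split is as a sequence of interchangeable stages that each transform a shared analysis context. The sketch below is a hypothetical illustration of that idea, not the project's real interfaces; the stage names and the dict-based context are assumptions.

```python
from typing import Any, Callable, Dict, List

# A stage is any callable that takes the analysis context and returns
# an updated one, so swapping an algorithm means swapping one function.
Stage = Callable[[Dict[str, Any]], Dict[str, Any]]

def run_pipeline(stages: List[Stage], context: Dict[str, Any]) -> Dict[str, Any]:
    for stage in stages:
        context = stage(context)
    return context

# Illustrative stand-ins for segment processing and landmark detection.
def extract_frames(ctx: Dict[str, Any]) -> Dict[str, Any]:
    ctx["frames"] = ["frame0", "frame1"]  # placeholder for video decoding
    return ctx

def detect_landmarks(ctx: Dict[str, Any]) -> Dict[str, Any]:
    ctx["landmarks"] = [f"pose({f})" for f in ctx["frames"]]
    return ctx

result = run_pipeline([extract_frames, detect_landmarks], {})
```

Because each stage only depends on the context it receives, an experimental peak detector can replace the default one without touching frame extraction or graph generation.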
Highlights
Filtering and peak logic tuned for conducting gestures rather than generic motion.
Mirror detection, elbow tracking, and cueing classifiers layered on pose streams.
Configurable graphs and an interface layer for running analyses end to end.
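The mirror-detection highlight above can be approximated with a correlation measure between the two hands' horizontal trajectories: strongly negative correlation suggests mirrored (symmetric) motion, strongly positive suggests parallel motion. The function name, the choice of the x axis, and the scoring convention are illustrative assumptions, not the project's actual classifier.

```python
import numpy as np

def mirror_score(left_x, right_x):
    """Pearson-style correlation of left/right wrist x-trajectories.

    Returns a value in [-1, 1]: near -1 means the hands move in
    opposite horizontal directions (mirrored conducting), near +1
    means they move together. Purely a sketch on pose streams.
    """
    left = np.asarray(left_x, dtype=float)
    right = np.asarray(right_x, dtype=float)
    # Subtract each hand's mean position so only motion is compared,
    # not where each hand happens to sit in the frame.
    left -= left.mean()
    right -= right.mean()
    denom = np.linalg.norm(left) * np.linalg.norm(right)
    if denom == 0.0:
        return 0.0  # a stationary hand carries no mirroring evidence
    return float(np.dot(left, right) / denom)
```

A threshold on this score (say, below -0.8 over a sliding window) would give a simple mirrored/non-mirrored label per segment.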