experiment 01
What if gestures were just math?
Most gesture libraries are welded to the DOM. This one isn't. A gesture is just a pattern in coordinates.
{ x, y, t }[] → classification → SWIPE RIGHT (94% confidence)
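Concretely, the pipeline above consumes nothing but timestamped coordinates. A minimal sketch of that input shape in TypeScript (the `Point` name is ours; only the `x`, `y`, `t` fields come from the signature above):

```typescript
// The input shape the engine consumes: position plus timestamp.
// (Field names x, y, t come from the { x, y, t }[] signature above.)
type Point = { x: number; y: number; t: number };

// A synthetic rightward swipe: x grows quickly, y stays flat,
// sampled every 16 ms (one frame at 60 fps).
const swipeRight: Point[] = Array.from({ length: 10 }, (_, i) => ({
  x: i * 40,
  y: 100,
  t: i * 16,
}));
```

Any source that can produce such an array — pointer events, touch events, logged session data — can feed the engine.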
02
Try it
Click a preset to see the engine classify it step-by-step, or draw your own gesture on the canvas.
try: swipe, tap, hold, circle, flick
04
Three functions. That's the entire API.
No classes. No lifecycle. No configuration objects. Feed coordinates in, get classifications out.
recognize(points)
Classify a completed gesture. Call when the user lifts their finger.

```js
const gesture = recognize(points);
// { type: "swipe", direction: "right",
//   confidence: 0.92, velocity: 0.78, ... }
```

predict(points)
Classify mid-gesture from partial data. Enables predictive UI.

```js
const { likely, alternatives } = predict(points);
// likely: { type: "swipe", confidence: 0.7 }
// alternatives: [{ type: "pan", ... }]
```

recognizeDoubleTap(a, b)
Detect a double-tap from two separate tap sequences.

```js
const dblTap = recognizeDoubleTap(first, second);
// { type: "double-tap", interval: 180,
//   confidence: 0.95, center: {x, y} }
```

05
What you could build
Pure-math recognition unlocks places DOM-based libraries can't reach.
Canvas / WebGL Games
Gesture controls without DOM elements. Detect swipes, flicks, and holds directly from pointer data in your render loop.
Session Replay Analytics
Classify gestures from logged pointer data on a server. Find rage-clicks, hesitant scrolls, confused navigation.
Gesture Prediction
Call predict() mid-gesture to start UI responses before the gesture completes. Interfaces that feel like they read your mind.
Accessibility Tools
Classify shaky or imprecise input from users with motor impairments. Distinguish intent from tremor-induced movement.
Custom Gesture Vocabularies
Combine recognized gestures into compound patterns. Swipe-then-hold = drag mode. Double-tap-then-swipe = selection.
Cross-Platform Input
Same recognition on web, mobile, and desktop. Process raw coordinates from any source: mouse, touch, pen, or trackpad.
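One way the compound vocabularies above could be wired: a small sequencer that pairs consecutive classification results. Everything here (the `Recognized` shape, the 500 ms window, the `compound` helper) is a hypothetical sketch, not part of the engine:

```typescript
// Hypothetical compound-gesture sequencer: pairs two consecutive
// classification results into a named pattern. The objects here are
// plain stand-ins for what recognize() returns.
type Recognized = { type: string; endedAt: number };

const MAX_GAP_MS = 500; // assumed window between the two gestures

function compound(prev: Recognized, next: Recognized): string | null {
  if (next.endedAt - prev.endedAt > MAX_GAP_MS) return null;
  if (prev.type === "swipe" && next.type === "hold") return "drag-mode";
  if (prev.type === "double-tap" && next.type === "swipe") return "selection";
  return null;
}
```

Because gestures arrive as plain data, compound patterns are just pattern matching over a short history buffer.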
06
Under the hood
Ten math functions. Threshold-based classification. No neural networks, no training data, no black boxes.
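To make "threshold-based classification" concrete, here is a toy classifier in that spirit. The cutoff values are invented for illustration and are not the engine's real thresholds:

```typescript
// Toy threshold classifier: compare a few scalar features of the
// point stream against fixed cutoffs. All thresholds are invented
// for illustration, not the engine's actual values.
type Point = { x: number; y: number; t: number };

const dist = (a: Point, b: Point) => Math.hypot(b.x - a.x, b.y - a.y);

function classify(pts: Point[]): "tap" | "hold" | "swipe" | "unknown" {
  const travel = dist(pts[0], pts[pts.length - 1]); // net displacement, px
  const elapsed = pts[pts.length - 1].t - pts[0].t; // duration, ms
  if (travel < 10) return elapsed < 250 ? "tap" : "hold"; // barely moved
  if (travel / elapsed > 0.3) return "swipe"; // fast and far
  return "unknown";
}
```

No model, no training: each gesture type is a region in a small feature space, and classification is a handful of comparisons.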
math primitives
| function | description |
|---|---|
| distance(a, b) | Euclidean distance between two points |
| straightLineDistance(pts) | Distance from first to last point |
| totalPathLength(pts) | Sum of all segment distances |
| duration(pts) | Time elapsed from first to last point |
| velocity(pts) | Average speed across the gesture |
| curvature(pts) | How much the path deviates from straight |
| maxDrift(pts) | Max distance any point strays from centroid |
| centroid(pts) | Average position of all points |
| angle(a, b) | Direction angle between two points |
| predictEnd(pts) | Extrapolate endpoint with friction decay |
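As a sketch of how small these primitives can be, here are possible implementations of several rows from the table. These are assumptions matching the descriptions above, not the engine's actual code; `predictEnd`'s friction constant in particular is a guess:

```typescript
type Point = { x: number; y: number; t: number };

// Euclidean distance between two points.
const distance = (a: Point, b: Point) => Math.hypot(b.x - a.x, b.y - a.y);

// Distance from first to last point, ignoring the path taken.
const straightLineDistance = (pts: Point[]) =>
  distance(pts[0], pts[pts.length - 1]);

// Sum of all segment distances along the path.
const totalPathLength = (pts: Point[]) =>
  pts.slice(1).reduce((sum, p, i) => sum + distance(pts[i], p), 0);

// Time elapsed from first to last sample, in ms.
const duration = (pts: Point[]) => pts[pts.length - 1].t - pts[0].t;

// Average speed: path length over elapsed time (px/ms).
const velocity = (pts: Point[]) => totalPathLength(pts) / duration(pts);

// Deviation from straightness: 0 for a straight line, approaching 1
// as the path folds back on itself.
const curvature = (pts: Point[]) =>
  1 - straightLineDistance(pts) / totalPathLength(pts);

// Extrapolated endpoint: project the last segment forward under a
// geometric friction decay (the 0.9 default is an assumed constant).
const predictEnd = (pts: Point[], friction = 0.9) => {
  const a = pts[pts.length - 2];
  const b = pts[pts.length - 1];
  const k = friction / (1 - friction); // sum of friction^n for n >= 1
  return { x: b.x + (b.x - a.x) * k, y: b.y + (b.y - a.y) * k };
};
```

Each classifier threshold is then a simple predicate over these scalars, e.g. a tap is small `straightLineDistance` plus short `duration`, a flick is high `velocity` plus low `curvature`.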
built by aumiqx labs — this is an experiment, not a product (yet)