Storytelling with AI: Grand Gestures

Objective

After completing this lesson, you will be able to implement, train, and use a gesture recognizer, built with supervised learning and a nearest-neighbor algorithm, to illustrate stories.

Content

Video summary:

The video shows how to build a simple sketch-and-animate tool using the $1 unistroke gesture recognizer: capture a single-line drawing as a sequence of mouse positions, normalize it by resampling to a fixed number of evenly spaced points, compare it with labeled examples using summed Euclidean (nearest-neighbor) distances, and automatically trigger the matching animation via broadcasts. The video also covers adding training examples, handling bias in training data, and demos with animated hearts, stars, and a short story.
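The resampling step described above can be sketched in a few lines of Python. This is a minimal, hypothetical helper (the video builds the equivalent in a block-based environment): it walks along the captured stroke and emits a point every fixed interval of path length, so every gesture ends up with the same number of evenly spaced points.

```python
import math

def resample(points, n=64):
    """Resample a stroke (a list of (x, y) tuples) to n evenly spaced points.

    Sketch of the $1 recognizer's normalization step: measure the total
    path length, then walk the path emitting a point every `interval` units.
    """
    path_len = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = path_len / (n - 1)
    resampled = [points[0]]
    pts = list(points)
    d = 0.0          # distance accumulated since the last emitted point
    i = 1
    while i < len(pts):
        seg = math.dist(pts[i - 1], pts[i])
        if d + seg >= interval:
            # interpolate the new point along the current segment
            t = (interval - d) / seg
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            d = 0.0
        else:
            d += seg
        i += 1
    if len(resampled) < n:    # guard against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]
```

Because every normalized stroke has exactly `n` points, two gestures can be compared point by point regardless of how quickly or slowly they were drawn.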

Key points:

  • Capture each unistroke as a list of mouse positions while the pointer is down (one gesture at a time).
  • Normalize sketches by resampling to a fixed number of evenly spaced points so comparisons are consistent.
  • Recognize gestures with a nearest-neighbor check using summed Euclidean distances to labeled examples (the $1 algorithm); add more examples to train.
  • Broadcast the recognized label to trigger animation scripts; ensure diverse training data to avoid bias.