newt: I saw on IRC you proposed an idea to compare old/new points, but I'm not sure where you want to add that.
If the idea is to simplify new gestures before adding them to the library, that's not necessary. You can feed it a raw bunch of points (even if some of them overlap); the lib does the cleaning itself (a kind of Procrustes analysis that normalizes the input gesture so it can be compared against the references).
Some screenshots from the paper:
1: you can see there's some variability in the input gesture
<img src="http://dl.dropbox.com/u/1412774/oneStrokeGestureRecognizerC2Plugin/01.png" border="0" />
2: so the gesture is resampled internally to a predefined number of equidistant points
<img src="http://dl.dropbox.com/u/1412774/oneStrokeGestureRecognizerC2Plugin/02.png" border="0" />
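Roughly, that resampling step looks like this in JS (just a sketch of the classic technique, not the plugin's actual code; the `{x, y}` point format, the function names, and `n` are my own choices):

```javascript
// Total length of a polyline made of {x, y} points.
function pathLength(points) {
  var d = 0;
  for (var i = 1; i < points.length; i++) {
    var dx = points[i].x - points[i - 1].x;
    var dy = points[i].y - points[i - 1].y;
    d += Math.sqrt(dx * dx + dy * dy);
  }
  return d;
}

// Resample a stroke to n equidistant points by walking the path
// and interpolating a new point every `interval` units.
function resample(points, n) {
  var interval = pathLength(points) / (n - 1); // target spacing
  var acc = 0;                                 // distance walked since last sample
  var pts = points.slice();                    // work on a copy
  var out = [pts[0]];
  for (var i = 1; i < pts.length; i++) {
    var dx = pts[i].x - pts[i - 1].x;
    var dy = pts[i].y - pts[i - 1].y;
    var d = Math.sqrt(dx * dx + dy * dy);
    if (acc + d >= interval) {
      // Interpolate a point exactly `interval` along the path...
      var t = (interval - acc) / d;
      var q = {
        x: pts[i - 1].x + t * dx,
        y: pts[i - 1].y + t * dy
      };
      out.push(q);
      pts.splice(i, 0, q); // ...and make it the start of the next segment
      acc = 0;
    } else {
      acc += d;
    }
  }
  // Floating-point rounding can leave us one point short.
  if (out.length === n - 1) out.push(pts[pts.length - 1]);
  return out;
}
```

That's why overlapping or unevenly spaced input points don't matter: everything gets redistributed along the path anyway.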
3: the gesture is rotated, always in the same way, to make matching rotation-independent
<img src="http://dl.dropbox.com/u/1412774/oneStrokeGestureRecognizerC2Plugin/03.png" border="0" />
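The rotation step is simple too: rotate the stroke around its centroid so the angle from the centroid to the first point is always zero (the "indicative angle" trick from the paper). A sketch, with my own names and the same assumed `{x, y}` point format:

```javascript
// Average of all points in the stroke.
function centroid(points) {
  var x = 0, y = 0;
  for (var i = 0; i < points.length; i++) {
    x += points[i].x;
    y += points[i].y;
  }
  return { x: x / points.length, y: y / points.length };
}

// Rotate the stroke around its centroid so that the angle from the
// centroid to the first point becomes zero, making every gesture
// start in the same canonical orientation.
function rotateToZero(points) {
  var c = centroid(points);
  var theta = Math.atan2(points[0].y - c.y, points[0].x - c.x);
  var cos = Math.cos(-theta), sin = Math.sin(-theta);
  return points.map(function (p) {
    return {
      x: (p.x - c.x) * cos - (p.y - c.y) * sin + c.x,
      y: (p.x - c.x) * sin + (p.y - c.y) * cos + c.y
    };
  });
}
```

After this, two copies of the same shape drawn at different angles line up (the full algorithm also does a small angular search around this position for the best fit, but this is the core of it).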
4: finally, recognition error rates depending on the number of available templates to compare against (you can see there that 3 is the minimum effective number). Rubine is another algorithm from the field, and DTW is Dynamic Time Warping: more powerful, but way more resource-hungry (too much for quick JavaScript, at least).
<img src="http://dl.dropbox.com/u/1412774/oneStrokeGestureRecognizerC2Plugin/04.png" border="0" />
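And the final matching against those templates is just an average point-to-point distance, best score wins. A minimal sketch (I'm leaving out the golden-section angle search the real algorithm uses, and the template structure here is invented for the example):

```javascript
// Average distance between two strokes that have already been
// resampled to the same number of points and normalized.
function pathDistance(a, b) {
  var d = 0;
  for (var i = 0; i < a.length; i++) {
    var dx = a[i].x - b[i].x, dy = a[i].y - b[i].y;
    d += Math.sqrt(dx * dx + dy * dy);
  }
  return d / a.length;
}

// Pick the stored template closest to the normalized input.
// Templates are assumed to be { name, points } objects.
function recognize(input, templates) {
  var best = null, bestDist = Infinity;
  for (var i = 0; i < templates.length; i++) {
    var d = pathDistance(input, templates[i].points);
    if (d < bestDist) {
      bestDist = d;
      best = templates[i].name;
    }
  }
  return { name: best, distance: bestDist };
}
```

Which is also why having at least 3 samples per gesture helps so much: more templates means more chances that one of them sits close to the user's particular way of drawing the shape.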