Here's an interesting benchmark comparing Flash and WebGL.
(Let's ignore the fact that the article itself is two years old and, as such, is totally wrong about the state of WebGL today.)
My testing (on an Intel integrated graphics machine) puts the WebGL benchmark at around 10 fps. The Flash demo is a solid 60 fps.
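For anyone who wants to reproduce the numbers on their own machine, here's a rough sketch of how a reading like that can be taken: just count requestAnimationFrame callbacks per second from the browser console on the benchmark page. This is my own quick-and-dirty counter, not anything built into the benchmark itself.

```typescript
// Rough FPS counter: count requestAnimationFrame callbacks per second
// and log the result. Paste into the dev-tools console on the page
// you want to measure. (Hypothetical helper, not part of the benchmark.)
let frames = 0;
let last = performance.now();

function tick(now: number): void {
  frames++;
  if (now - last >= 1000) {
    console.log(`~${frames} fps`);
    frames = 0;
    last = now;
  }
  requestAnimationFrame(tick);
}

requestAnimationFrame(tick);
```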
Hmm, maybe Flash is the future after all, eh?
What I find really remarkable is that nobody working with WebGL seems to have noticed this issue in the four or so years it's been out. Like, do developers not test on integrated graphics systems? Or is this just a recent development? Or, by some bizarre coincidence, do all six of the machines I've tested, plus the additional machines tried by helpful forum members, share the same unique problem, while it works fine on other iGPU machines?
None of those options seems plausible to me. This issue is seriously weirding me out. Ashley, I'd love it if you could do some testing on your end.
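One thing that might be worth checking as part of that testing (this is just my own guess at a diagnostic, not something from the article): whether the browser is actually using the Intel GPU for WebGL or has fallen back to a software rasterizer. Something along these lines, run from the console, would show it:

```typescript
// Check which renderer the browser is using for WebGL. If this reports a
// software rasterizer (e.g. "SwiftShader") instead of the Intel GPU, that
// could explain the low frame rate on integrated graphics machines.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl") as WebGLRenderingContext | null;

if (!gl) {
  console.log("WebGL context could not be created at all.");
} else {
  const info = gl.getExtension("WEBGL_debug_renderer_info");
  const renderer = info
    ? gl.getParameter(info.UNMASKED_RENDERER_WEBGL)
    : gl.getParameter(gl.RENDERER);
  console.log("WebGL renderer:", renderer);
}
```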