Oh, sorry if I confused things! I was intrigued by this part of the blog post:
"The webkitAudioContext element was introduced to HTML a while back, and is now available in the standard as AudioContext. The API is fairly flexible, and one of the things you can do with it is get a Javascript callback which writes samples more or less directly to the output device, just like we used to do with DMA loop interrupts back in the day. It’s called createScriptProcessor." I had just read the HTML standard, and createScriptProcessor wasn't in it. So createScriptProcessor is already part of Web Audio, not HTML proper? Sorry then. Damn, and it looked so easy to implement...
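For what it's worth, here is a minimal sketch of what the blog post describes: a ScriptProcessorNode whose callback fills the output buffer sample by sample. The 440 Hz tone, the buffer size, and the helper name `fillSine` are my own choices, not from the post, and note that `createScriptProcessor` is deprecated in favor of AudioWorklet.

```javascript
// Pure sample generator: fills `out` with a sine wave at `freq` Hz
// and returns the updated phase, so consecutive buffers stay continuous.
function fillSine(out, phase, freq, sampleRate) {
  const step = (2 * Math.PI * freq) / sampleRate;
  for (let i = 0; i < out.length; i++) {
    out[i] = Math.sin(phase);
    phase = (phase + step) % (2 * Math.PI);
  }
  return phase;
}

// Browser glue (only runs in a page with Web Audio available):
// hook the generator into the ScriptProcessorNode's onaudioprocess callback.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  // 4096-sample buffer, 0 input channels, 1 output channel (arbitrary choices)
  const node = ctx.createScriptProcessor(4096, 0, 1);
  let phase = 0;
  node.onaudioprocess = (e) => {
    phase = fillSine(e.outputBuffer.getChannelData(0), phase, 440, ctx.sampleRate);
  };
  node.connect(ctx.destination); // start pulling samples to the speakers
}
```

The callback is invoked on the main thread every time the output device needs another buffer, which is exactly the "write samples more or less directly" model the post compares to DMA loop interrupts; it is also why the API was deprecated, since main-thread jank causes audible glitches.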