Out of curiosity, I just did a bit of testing in a different level of the full game.
I disabled all the layer shaders (there's still an adjust HSL shader on the player, but its effect should be minimal).
I'm not even talking jank at the moment, just raw fps.
At 1080p in Chrome, with webGL ON, I got around 40fps.
At 1080p in Chrome, with webGL OFF, I got 60fps.
I'll say it again:
At 1080p in Chrome, with webGL ON, I got around 40fps.
At 1080p in Chrome, with webGL OFF, I got 60fps.
This is actually sort of exciting for me because it means that there's definitely something wacky going on somewhere. Ashley, any ideas?
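For reference, "raw fps" here just means counting frames per second in the browser; a minimal sketch of that kind of counter is below (everything in it is a placeholder, nothing is taken from the actual project):

```typescript
// Count frames delivered by requestAnimationFrame over one-second windows.
// This is only a sketch of the measurement, not the game's own fps readout.
let frames = 0;
let windowStart = performance.now();

function tick(now: number): void {
  frames++;
  if (now - windowStart >= 1000) {
    console.log(`fps: ${frames}`); // frames rendered in the last ~second
    frames = 0;
    windowStart = now;
  }
  requestAnimationFrame(tick);
}

requestAnimationFrame(tick);
```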
EDIT: I did some further testing in my fillrate tester (the same one used in this topic) and found that webGL performance was essentially identical to non-webGL. I've tried to research whether this is expected behaviour or not, but there doesn't seem to be much written on the subject. Apparently fillrate usually isn't an issue for most games?
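To be clear about what I mean by a fillrate tester: it just redraws a screen-sized image over and over every frame and watches what happens to the fps as the overdraw goes up. A bare-bones canvas2D version would look something like this (the overdraw count, canvas size and image path are placeholders, not my actual project):

```typescript
// Stress pixel fill rate: draw a screen-sized image many times per frame.
// OVERDRAW, the canvas size and the image path are all placeholders.
const OVERDRAW = 20;

const canvas = document.createElement("canvas");
canvas.width = 1920;
canvas.height = 1080;
document.body.appendChild(canvas);

const ctx = canvas.getContext("2d")!;
const img = new Image();
img.src = "bigtexture.png"; // placeholder asset

function frame(): void {
  for (let i = 0; i < OVERDRAW; i++) {
    ctx.drawImage(img, 0, 0, canvas.width, canvas.height);
  }
  requestAnimationFrame(frame);
}

img.onload = () => requestAnimationFrame(frame);
```

The webGL side of the comparison is the same loop with the drawing done through textured quads instead of the 2D context, which is presumably why the results come out the same if pixel fill really is the bottleneck.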
EDIT 2: OK, maybe I'm going crazy. I looked up some benchmarks:
Renderperf3: WebGL disabled
Renderperf3: WebGL enabled
I went around to every computer in the house and tested them, on Chrome and IE. There were four laptops with different varieties of Intel integrated graphics, and every single one had non-webGL either perform identically to webGL or surpass it by a significant margin. The only exception was a five-year-old Mac, which crushed all the other computers when it came to webGL performance and was about the same as them for non-webGL.
So... what's going on here? Why am I the only one who's noticed that webGL performs no better than canvas2D on every Intel integrated graphics machine I've tried? Or is there something I'm missing?
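For what it's worth, one thing worth ruling out on those Intel machines is whether Chrome is actually running webGL on the GPU at all, or silently falling back to a software renderer. The WEBGL_debug_renderer_info extension exposes the real vendor/renderer strings where the browser allows it; a rough sketch of the check:

```typescript
// Query the unmasked GPU vendor/renderer strings for a WebGL context.
// If this reports a software renderer rather than the Intel GPU,
// that could explain webGL being no faster than canvas2D.
// (Older browsers may need "experimental-webgl" as the context name.)
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl");

if (!gl) {
  console.log("No webGL context available");
} else {
  const dbg = gl.getExtension("WEBGL_debug_renderer_info");
  if (dbg) {
    console.log("Vendor:  ", gl.getParameter(dbg.UNMASKED_VENDOR_WEBGL));
    console.log("Renderer:", gl.getParameter(dbg.UNMASKED_RENDERER_WEBGL));
  } else {
    console.log("Renderer info extension not exposed by this browser");
  }
}
```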