I'd say it's reliable, but I believe the debugger itself costs a bit of performance (though I'm not 100% sure, and if it does, I don't know whether that overhead is reflected in the CPU % it reports).
It may be cleaner to add a Text/SpriteFont object and set its text to cpuutilisation, then preview without the debugger and see what you get.
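If you'd rather do it from the scripting side (assuming this is Construct 3, going by the cpuutilisation/gpuutilisation expressions), a rough equivalent is to update a Text object every tick. This is just a sketch: the "PerfText" object name is made up, and it shows frame time / approximate FPS rather than the engine's own CPU measurement, which in the event sheet you'd get by simply setting the text to cpuutilisation.

```ts
// Minimal sketch using the Construct 3 JS scripting API (e.g. in main.js).
// "PerfText" is a hypothetical Text object placed in the layout just for this readout.
declare function runOnStartup(cb: (runtime: any) => void | Promise<void>): void;

runOnStartup(async (runtime) => {
  runtime.addEventListener("tick", () => {
    const readout = runtime.objects.PerfText.getFirstInstance();
    if (!readout) return;
    // runtime.dt is the last frame's delta time in seconds; 1/dt approximates the current FPS.
    const dtMs = runtime.dt * 1000;
    readout.text = runtime.dt > 0
      ? `~${Math.round(1 / runtime.dt)} FPS / ${dtMs.toFixed(1)} ms`
      : "";
  });
});
```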
For the FPS, I've found Chrome is consistently more performant, whilst Firefox lags in the exact same game.
As for the FPS dropping randomly, I've only ever seen that on mobile devices. My phone has a 120Hz display, and even when I force it into "gaming mode", it drops to 60fps when it's overwhelmed, then jumps back up to 120fps after a few seconds. Not sure whether any desktop computers also do this, but I wouldn't have thought so.
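If you want to see what refresh rate the browser is actually running at on a given device, you can time requestAnimationFrame deltas from the devtools console (standard browser APIs, nothing engine-specific): roughly 8.3 ms per frame means 120Hz, roughly 16.7 ms means 60Hz.

```ts
// Rough refresh-rate probe: averages ~120 requestAnimationFrame deltas.
const deltas: number[] = [];
let last: number | null = null;

function probe(now: number) {
  if (last !== null) deltas.push(now - last);
  last = now;
  if (deltas.length < 120) {
    requestAnimationFrame(probe);
  } else {
    const avg = deltas.reduce((a, b) => a + b, 0) / deltas.length;
    console.log(`avg frame delta: ${avg.toFixed(2)} ms (~${Math.round(1000 / avg)} Hz)`);
  }
}
requestAnimationFrame(probe);
```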
As for the whole "nearly 100% on one device but barely 15% on the older one" thing, my guess is it comes down to rendering: is the device that hits nearly 100% actually using WebGL 2, and is it using its GPU at all? The debugger will tell you whether WebGL 2 is being used, and you could also create a Text object that outputs gpuutilisation - if you get a value, it's using the GPU; if you get "NaN", then the GPU work may be getting offloaded onto the CPU, which would explain the 100% CPU usage.
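If you want to double-check the WebGL 2 / GPU question outside the debugger, something like this can be pasted into the devtools console on the running game's page (again just standard browser APIs). A renderer string like "SwiftShader" or "llvmpipe" means the browser is software-rendering on the CPU, which would line up with the 100% CPU usage.

```ts
// Quick check: is WebGL 2 available, and which renderer is the browser actually using?
const canvas = document.createElement("canvas");
const gl2 = canvas.getContext("webgl2");
const gl = gl2 ?? canvas.getContext("webgl");
console.log("WebGL2 available:", !!gl2);
if (gl) {
  // WEBGL_debug_renderer_info exposes the unmasked GPU/renderer name in most browsers.
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  const renderer = ext
    ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
    : gl.getParameter(gl.RENDERER);
  console.log("Renderer:", renderer);
} else {
  console.log("No WebGL context at all - rendering is falling back to the CPU.");
}
```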
Just some thoughts, hope it points you in the right direction!