The problem was a function in the runtime - "GetObjectTypeByName" - that loops over every single object type to find the one with the given name (case-insensitive). For behavior management, this was called once every tick, so if your project was huge (hundreds if not thousands of different object types), it was doing a fair amount of work per tick. I came up with a faster way that doesn't need to loop over every object type, so now it barely does anything each tick.
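The actual runtime change isn't shown here, so purely as an illustration of the general idea (every name below is made up), here's a sketch in C++: build a case-insensitive lookup table once when the project loads, then query it in roughly constant time instead of scanning the whole list on every call.

```cpp
#include <algorithm>
#include <cctype>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical stand-in for the runtime's object type record.
struct ObjectType { std::string name; };

// Lowercase a name so lookups behave case-insensitively.
static std::string ToLower(std::string s)
{
    std::transform(s.begin(), s.end(), s.begin(),
                   [](unsigned char c) { return static_cast<char>(std::tolower(c)); });
    return s;
}

class ObjectTypeIndex {
public:
    // Build the lookup table once (e.g. when the project loads).
    // Assumes the list of object types stays put afterwards.
    explicit ObjectTypeIndex(std::vector<ObjectType>& types)
    {
        for (ObjectType& t : types)
            byName[ToLower(t.name)] = &t;
    }

    // Average O(1) lookup per call, instead of scanning every object type.
    ObjectType* Find(const std::string& name) const
    {
        auto it = byName.find(ToLower(name));
        return it == byName.end() ? nullptr : it->second;
    }

private:
    std::unordered_map<std::string, ObjectType*> byName;
};
```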
Now, for various reasons, I think:
- a large project still won't reach the 11,000 fps of an empty layout in a new game, and
- it doesn't matter.
In the runtime there's a certain amount of work that must be done every tick, and some of these jobs invariably take slightly longer for large projects. However, the difference is (now) incredibly tiny. Remember: 11,000 fps means one tick lasts about 91 microseconds. That's obscenely fast. If that drops by half to 5,500 fps, each tick is now taking about 182 microseconds (less than a fifth of a millisecond).
At the super-high framerates you get with Classic, tiny, insignificant amounts of CPU work are vastly over-represented. For real games, a change of 91 microseconds per tick has no effect. None. However, when Classic is essentially doing nothing per tick, such a tiny change looks like it's trashing your framerate. It doesn't matter, though.
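To put rough numbers on that (just back-of-the-envelope arithmetic, nothing taken from the runtime), here's the same fixed 91-microsecond cost applied to an empty layout running at 11,000 fps and to a real game running at 60 fps:

```cpp
#include <cstdio>

int main()
{
    const double overheadUs = 91.0;  // a fixed per-tick cost in microseconds

    // Empty layout: ~11,000 fps means each tick is ~91 microseconds.
    const double emptyTickUs = 1e6 / 11000.0;                       // ~90.9 us
    const double emptyFpsAfter = 1e6 / (emptyTickUs + overheadUs);  // ~5,500 fps

    // Real game: 60 fps means each tick is ~16,667 microseconds.
    const double gameTickUs = 1e6 / 60.0;                           // ~16,666.7 us
    const double gameFpsAfter = 1e6 / (gameTickUs + overheadUs);    // ~59.7 fps

    std::printf("Empty layout: 11000 fps -> %.0f fps (framerate halved)\n", emptyFpsAfter);
    std::printf("Real game:    60 fps    -> %.1f fps (barely measurable)\n", gameFpsAfter);
}
```

The same per-tick cost halves the first number and is practically invisible in the second, which is exactly the over-representation effect described above.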
So I think the big one has been fixed - but there shouldn't be any other significant performance issues. If in doubt, calculate your delta-times instead of framerates, and you should see you're doing just fine.
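For example (again just illustrative arithmetic, not runtime code), working backwards from two framerate readings gives you the actual cost added per tick, which is the number worth comparing:

```cpp
#include <cstdio>

// Per-tick cost implied by a drop from fpsBefore to fpsAfter, in microseconds.
static double AddedCostUs(double fpsBefore, double fpsAfter)
{
    return 1e6 / fpsAfter - 1e6 / fpsBefore;
}

int main()
{
    // A "huge" fps drop on an empty layout...
    std::printf("%.0f us per tick\n", AddedCostUs(11000.0, 5500.0));  // ~91 us
    // ...costs about the same per tick as a barely visible drop at 60 fps.
    std::printf("%.0f us per tick\n", AddedCostUs(60.0, 59.7));       // ~84 us
}
```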