From what I've read (I'm no expert on this, though), 'true' fullscreen boosts fps a bit because the graphics card can skip some steps, ignore every other application, and send the rendered image directly to the monitor. With a fullscreen borderless window it still has to check whether any other windows, UI elements, OS stuff, etc. are overlapping it. Some also theorize that the graphics card could be drawing the desktop and then the window on top of it, which eats into pixel fill rate and so on. (I'm honestly not sure of the real reasons; what I've read came from people theorizing/discussing it on the internet, so the specific explanations could be wrong.)
However, it's not a placebo effect, as I've seen it on my computer as well: playing the new XCOM in a fullscreen borderless window results in a choppy framerate, while using the 'actual' fullscreen setting makes it much, much smoother, easily perceptibly so. It could just be the difference that tips it a bit past the point where it can't hold 60 fps, but it shows the mode can have a noticeable effect, and we're talking about low-end laptops here, where it's likely to be a problem.
I don't know if it's possible or not, but it would be helpful if it is.
Edit: From gaming.stackexchange.com/questions/107028/is-there-a-difference-between-running-games-in-windowed-or-fullscreen-mode:
"When an application runs in fullscreen mode, it runs in "exclusive mode". That means it has full and direct control over the screen output.
But when it runs in window mode, it needs to send its output to the window manager (Windows Explorer), which then manages where on the screen that output is drawn. This takes some additional performance. The performance penalty, however, is greatly reduced in newer versions of Windows."
That might explain why it's an issue on my machine, as it's running Vista; in newer versions of Windows the difference might be imperceptible.
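To illustrate the difference the quote is talking about, here's a rough sketch of how a game might offer both modes through SDL2. This is just an illustration of the general idea (and obviously not what XCOM actually uses); the window title and resolution are made up:

```c
/* Hypothetical sketch: exclusive fullscreen vs. borderless fullscreen in SDL2. */
#include <SDL2/SDL.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    /* Start as a normal window (title and size are placeholders). */
    SDL_Window *win = SDL_CreateWindow("Game",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        1920, 1080, SDL_WINDOW_OPENGL);
    if (win == NULL) {
        SDL_Quit();
        return 1;
    }

    /* "Fullscreen (windowed)" / borderless: the window simply covers the
       desktop at its current resolution, and the OS compositor still has to
       composite it with everything else on screen. */
    SDL_SetWindowFullscreen(win, SDL_WINDOW_FULLSCREEN_DESKTOP);

    /* "True" fullscreen: the display mode is switched and the game gets
       (more or less) direct control of the output — the "exclusive mode"
       described in the quote above. */
    SDL_SetWindowFullscreen(win, SDL_WINDOW_FULLSCREEN);

    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

In the borderless case on Vista/7, the desktop compositor sits between the game and the screen, which (as far as I understand it) is where the extra overhead the quote mentions comes from.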