A CVS repository stores details of every revision of every file in the project, so if you're using a website stripper to pull down a copy of the entire Construct CVS site, you may well be pulling down copies of every single revision of every single file ever uploaded to the CVS.
If you have a 'main.c' or something that has been revised 50 times, the CVS system will hold the current version of the file along with details of all changes made since the original upload, in terms of additions and deletions. This is stored as a series of 'commits', likely as diff files. The online viewer for the CVS has links to every revision, each pointing to a dynamically generated copy of the file at that revision. Your site downloader follows all of those links too, so it ends up downloading all those old versions you probably don't need.
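If it helps to picture what "a series of commits stored as diffs" means, here's a toy Python sketch of the idea. It's not CVS's actual on-disk format (CVS/RCS does its own delta scheme), and the file contents and revision numbers are made up; it just shows how keeping one full copy plus a chain of diffs lets you rebuild any revision on demand, which is exactly what the web viewer does when you click an old revision link.

```python
import difflib

# Three made-up revisions of a file (illustration only, not real repo data).
rev1 = ["int main() {\n", "    return 0;\n", "}\n"]
rev2 = ["#include <stdio.h>\n", "int main() {\n", "    return 0;\n", "}\n"]
rev3 = ["#include <stdio.h>\n", "int main() {\n", "    printf(\"hi\\n\");\n", "    return 0;\n", "}\n"]

# Each 'commit' is stored as a delta against the previous revision.
deltas = [
    list(difflib.ndiff(rev1, rev2)),
    list(difflib.ndiff(rev2, rev3)),
]

def checkout(base, deltas, revision):
    """Rebuild revision N (1-based) by replaying the stored deltas in order."""
    text = base
    for delta in deltas[: revision - 1]:
        # Sanity check: this delta must be based on the text we currently hold.
        assert list(difflib.restore(delta, 1)) == text
        text = list(difflib.restore(delta, 2))  # take the 'after' side of the diff
    return text

assert checkout(rev1, deltas, 3) == rev3
print("".join(checkout(rev1, deltas, 3)))
```

The point of the sketch: the repository only ever keeps one full copy plus small diffs, but the viewer can hand your downloader a fully regenerated file for every single revision, which is why the site dump balloons.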
Do yourself a favour and follow lucid's advice - get a CVS client like TortoiseCVS, and check the source out as a normal set of folders rather than as an offline site dump.