I'm taking over a web site with (I think) about 150 pages, but the retiring web-master can't/won't provide a CD. He suggests I download it page by page, following the links manually. That would leave me with a folder full of pages plus, for every page, a separate folder of its supporting files, needlessly duplicated. An awful prospect: deadly tedious and wide open to error. I don't have FTP access to the server.
Can anyone recommend a program that will crawl a web site and compile a report for me listing all the unique pages by URL?
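To pin down what I mean by a 'report', here's a rough sketch of the kind of thing I'm imagining, assuming Python with the third-party 'requests' and 'beautifulsoup4' packages (the example.com address is just a placeholder for the real site):

    from urllib.parse import urljoin, urldefrag, urlparse

    import requests
    from bs4 import BeautifulSoup

    START_URL = "https://example.com/"   # placeholder for the actual site
    DOMAIN = urlparse(START_URL).netloc

    def crawl(start_url):
        """Breadth-first crawl; returns the set of unique page URLs found."""
        seen = {start_url}
        queue = [start_url]
        while queue:
            url = queue.pop(0)
            try:
                response = requests.get(url, timeout=10)
            except requests.RequestException:
                continue  # skip pages that fail to load
            if "text/html" not in response.headers.get("Content-Type", ""):
                continue  # only follow links out of HTML pages
            soup = BeautifulSoup(response.text, "html.parser")
            for anchor in soup.find_all("a", href=True):
                # Resolve relative links and drop #fragments so the same
                # page reached by different routes is counted only once.
                link, _ = urldefrag(urljoin(url, anchor["href"]))
                if urlparse(link).netloc == DOMAIN and link not in seen:
                    seen.add(link)
                    queue.append(link)
        return seen

    if __name__ == "__main__":
        for page in sorted(crawl(START_URL)):
            print(page)

A ready-made program that does essentially this and prints the list would suit me fine; I'd rather not maintain a script myself.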
I've found a program called 'Web2Disk' which will download the site so that it can be browsed offline, but it's not very good at putting the supporting files (images, CSS, etc.) into the correct folders. Does anyone know of something better?
I've also tried 'PowerMapper' to good effect. It produces a great map, but many pages appear more than once if they can be reached by more than one link sequence. 'SiteSort' would do the job, but not in its evaluation mode, and I'm not yet in a position to commit to a purchase.
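For what it's worth, I suspect the PowerMapper duplicates come down to URL normalisation: the same page reached as /index.html and /index.html#top counts as two entries unless the crawler normalises URLs before comparing them. A minimal sketch of the normalisation I mean, in the same vein as the script above (the example URLs are made up):

    from urllib.parse import urldefrag, urlparse, urlunparse

    def normalise(url):
        """Reduce trivially-different URLs to one canonical form."""
        url, _ = urldefrag(url)          # drop any #fragment
        parts = urlparse(url)
        return urlunparse((
            parts.scheme.lower(),        # scheme and host are case-insensitive
            parts.netloc.lower(),
            parts.path or "/",           # treat 'example.com' and 'example.com/' alike
            parts.params,
            parts.query,
            "",                          # fragment already removed
        ))

    # Made-up illustration: both forms should reduce to the same URL.
    assert normalise("http://Example.com/index.html#top") == \
           normalise("http://example.com/index.html")

If any of the tools mentioned can be told to dedupe this way, that alone might solve my problem.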