Is there any software or web app that can map out the site architecture of a website?
I'm doing a redesign of a site that has a lot of pages. I was using a spreadsheet to list and map out the major sections and the pages within each section, along with their respective URLs. But that is time consuming.
I don't know of any software explicitly for mapping out the architecture of a website, though you could probably adapt software for other mapping systems like UML for your purposes.
I always just like to use the low-tech version... lots of paper.
Another idea would be along the same lines as your spreadsheet. Try to group the various pages into logical categories. Then, instead of having to map out every single individual page, you can just map out how each category fits into the site.
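If you go the grouping route, the spreadsheet could even be replaced with a simple data structure. A minimal sketch in Python; the section names and paths here are made up for illustration:

```python
# Hypothetical site map: one entry per category rather than per page.
# All section names and URL paths below are invented examples.
site_map = {
    "Products": ["/products/widgets", "/products/gadgets"],
    "Support": ["/support/faq", "/support/contact"],
    "About": ["/about", "/about/team"],
}

# Summarize the architecture at the category level.
for section, pages in site_map.items():
    print(f"{section}: {len(pages)} pages")
```

That keeps the high-level map small even when each category contains dozens of pages.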
Depending on how the site is structured, you could also use a spidering utility to crawl it and pull out a list of page titles and URLs.
The site doesn't necessarily need text-based navigation or a sitemap to be spidered (though both should be there as good practice). A spider works by collecting all the local links on a starting page, then visiting each of those pages and collecting their links, and repeating until it runs out of pages it hasn't checked.
So it won't find pages that aren't linked from anywhere (or are linked only from pages that require a login), but it will get everything else.
You can also write your own spidering software pretty easily. Have it read in the text from a page you specify and find all the anchor elements. Add the hrefs from those to a queue, and record whatever information you want about the current page (title, URL) in another list. Then go to the next page in the queue and do the same thing, making sure you never add the same link twice.
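A minimal sketch of that approach using only the Python standard library. The `fetch` parameter stands in for however you retrieve pages (e.g. `urllib.request`); here it's injectable so the example runs without a live site, and the example URLs are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Pulls the page title and all anchor hrefs out of an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def local_links(page_url, html):
    """Return (title, absolute URLs of links that stay on the same host)."""
    parser = LinkCollector()
    parser.feed(html)
    host = urlparse(page_url).netloc
    urls = [urljoin(page_url, h) for h in parser.links]
    return parser.title, [u for u in urls if urlparse(u).netloc == host]

def crawl(start_url, fetch):
    """Breadth-first crawl; fetch is a callable url -> HTML text."""
    seen = {start_url}          # never queue the same link twice
    queue = [start_url]
    site_map = {}               # url -> page title
    while queue:
        url = queue.pop(0)
        title, links = local_links(url, fetch(url))
        site_map[url] = title
        for link in links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return site_map
```

Feeding it a real fetch function would walk the whole linked portion of the site and give you the URL-to-title map to build your architecture diagram from.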