We're a website design, development and hosting company. Every so often we inherit a client who wants us to replicate their existing site on one of our servers but can't provide us with an FTP username and password (for one reason or another).

So, assuming there are no legal reasons why we shouldn't do this (i.e. the client assures us that it's their site), and assuming we're talking about a totally static/brochure site (no server-side scripting), what's the best tool for grabbing all, or as close to all as possible, of the elements of such a site?

Can DownThemAll! (the Firefox extension) or the Linux command line `wget -r www.domainname.com` be improved upon?
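For context, the most complete invocation we've come up with so far is a sketch along these lines (flags as documented in the GNU Wget manual; `www.example.com` is a placeholder for the client's domain):

```shell
# Mirror a static site as completely as possible (www.example.com is a placeholder).
# --mirror           recursion with infinite depth and timestamping
# --page-requisites  also fetch CSS, images, and other embedded assets
# --convert-links    rewrite links so the local copy browses offline
# --adjust-extension save pages with .html extensions where needed
# --no-parent        never ascend above the starting path
# --wait=1           pause between requests to be polite to the origin server
wget --mirror --page-requisites --convert-links --adjust-extension \
     --no-parent --wait=1 http://www.example.com/
```

Is there anything beyond that, or a different tool entirely, that would capture more of the site?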