I would like to make local cache copies of some webpages, but I'm having some trouble. I can run wget from the shell on my web host, but wget is limited in its ability to fetch linked stylesheets and images embedded via CSS.
Is there a PHP equivalent I can use? Basically, I'd give it the URL of a page, like http://site.com/article.php?a=1234, and it would download everything required to view the page locally.
You would have to fetch the HTML file, parse it to collect all the referenced URLs (stylesheets, images, scripts), then go out again and fetch each of those items, and repeat for anything the CSS itself references. This is essentially how a browser does it.
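A rough, untested sketch of that loop in PHP, using cURL to fetch and DOMDocument to parse. It only handles `<img src>` and `<link rel="stylesheet">` tags; parsing `url(...)` references inside the CSS files themselves (the part wget struggles with) would be an extra pass over each downloaded stylesheet, omitted here for brevity. The URL is the example from the question.

```php
<?php
// Fetch a URL and return its body, or false on failure.
function fetchUrl($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

$pageUrl = 'http://site.com/article.php?a=1234';
$html = fetchUrl($pageUrl);

$doc = new DOMDocument();
@$doc->loadHTML($html); // @ suppresses warnings from messy real-world markup

// Collect asset URLs: images and linked stylesheets.
$assets = [];
foreach ($doc->getElementsByTagName('img') as $img) {
    $assets[] = $img->getAttribute('src');
}
foreach ($doc->getElementsByTagName('link') as $link) {
    if (strtolower($link->getAttribute('rel')) === 'stylesheet') {
        $assets[] = $link->getAttribute('href');
    }
}

foreach (array_unique($assets) as $asset) {
    // Naively resolve relative URLs against the page's directory;
    // a real crawler needs proper handling of "../" and "//host" forms.
    $absolute = parse_url($asset, PHP_URL_SCHEME)
        ? $asset
        : rtrim(dirname($pageUrl), '/') . '/' . ltrim($asset, '/');
    $data = fetchUrl($absolute);
    if ($data !== false) {
        file_put_contents(basename(parse_url($absolute, PHP_URL_PATH)), $data);
    }
}
```

To view the page offline you would also rewrite the `src`/`href` attributes in the saved HTML to point at the local copies, which is what wget's `--convert-links` option does.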