Save a website in a database

Hi.
What is the best way to save a web page into a MySQL database?
I need to download all of it (images, JS, CSS …) so I can view it even if the site closes.

I am thinking of using curl, or creating an MHT file from the URL.

What do you think is best?
Also, does anybody know a good PHP class for creating MHT files?

Well, I would imagine you could use curl to get the source of the HTML pages. But then you'd have to read through the HTML and retrieve the images and JavaScript files: basically anything that's referenced that isn't HTML.
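
Fetching the source itself is the easy part. A minimal sketch, assuming PHP's curl extension is enabled (the URL is just a placeholder):

```php
<?php
// Fetch the raw HTML of a page with curl.
$url = 'http://example.com/'; // placeholder URL

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$html = curl_exec($ch);

if ($html === false) {
    die('curl error: ' . curl_error($ch));
}
curl_close($ch);

// $html now holds the page source, ready to parse for images, JS and CSS.
```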

In terms of a systematic way to store all of those external files, I'm not sure. I guess you could replace each file reference in the HTML (once it's stored in your database) with the ID of the row in a separate table where the image or JS file is kept.
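
Something along these lines, as a rough sketch only. The `assets` table and the `asset.php?id=` script that would serve the stored blobs back out are both made up for the example, and it skips details like resolving relative URLs against the page's base URL:

```php
<?php
// Rough sketch: store each referenced image in an assets table and
// rewrite its src to point at a local script that serves the stored copy.
// Table name (assets) and serving script (asset.php) are hypothetical.
$pdo = new PDO('mysql:host=localhost;dbname=archive', 'user', 'pass');

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from sloppy real-world markup

foreach ($doc->getElementsByTagName('img') as $img) {
    $src = $img->getAttribute('src'); // assumes absolute URLs for simplicity
    $data = file_get_contents($src);  // download the image itself
    if ($data === false) {
        continue; // skip anything that fails to download
    }

    $stmt = $pdo->prepare('INSERT INTO assets (url, content) VALUES (?, ?)');
    $stmt->execute([$src, $data]);

    // Point the page at the stored copy instead of the original URL.
    $img->setAttribute('src', 'asset.php?id=' . $pdo->lastInsertId());
}

$archivedHtml = $doc->saveHTML(); // this rewritten HTML is what you'd store
```

The same loop would apply to `<script src>` and `<link href>` tags to pick up the JS and CSS.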

If this is just a way to back up your own site, I can think of much easier ways to do it with FTP. :slight_smile: