So far I have been getting websites with a reasonable amount of code and a modest database, so I could work on them on localhost, make changes, and upload the entire code and database via FTP.
Now I've got a website with millions of pages and a huge database. To work on localhost, I'd need to download the database. How can I download such a huge database? How do you guys work on a big website? I've been asked to work directly over FTP, but I'm a bit scared to do so…
When you're trying to export the structure and data and you're seeing “The page cannot be displayed”, are there any entries in the error logs for either PHP or MySQL?
In the server's phpMyAdmin, while exporting, I selected ‘zipped’. But it's not working: it shows ‘The page cannot be displayed’ within the phpMyAdmin window.
If I select only the ‘structure’ of the database, only the structure is exported. I want some data along with the structure, to get a feel for what each table contains. Is it possible to export part of the data (say, 100 rows per table) along with the structure?
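If you have SSH access to the server, mysqldump can do something like this from the command line. A sketch, not tested against your setup: `myuser` and `mydb` are placeholders for your own credentials, and the `--where` text is appended verbatim to every table's SELECT, so each table contributes at most 100 rows:

```shell
# Dump the full structure plus roughly the first 100 rows of each table.
# mysqldump appends the --where string after WHERE in the SELECT it runs
# per table, so "1 LIMIT 100" means: match every row, but stop at 100.
# myuser / mydb are placeholders -- substitute your own credentials.
mysqldump -u myuser -p --where="1 LIMIT 100" mydb > sample.sql
```

You can then load `sample.sql` into your local MySQL (e.g. `mysql -u root -p local_mydb < sample.sql`) to browse the sample data. One caveat: because the rows are truncated arbitrarily, foreign-key relationships between tables may not line up in the sample.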
Personally, in the early stages of building a site I work on localhost (for speed as much as anything else), then transfer to the remote server and work via FTP — but I guess everyone has different ways of working.
As for the database, I'd say it depends on the client you're using, but phpMyAdmin's export feature has an option to zip the output, which makes downloading a large database a lot quicker and easier.
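If phpMyAdmin keeps timing out with “The page cannot be displayed”, the command line sidesteps the web server's time and memory limits entirely. Again a sketch, assuming SSH access and placeholder credentials (`myuser` / `mydb`):

```shell
# Dump the whole database and compress it on the fly, so the file
# you download over the network is a fraction of the raw SQL size.
# myuser / mydb are placeholders -- substitute your own credentials.
mysqldump -u myuser -p mydb | gzip > mydb.sql.gz

# To restore on localhost (local credentials are placeholders too):
#   gunzip < mydb.sql.gz | mysql -u root -p local_mydb
```

Since nothing goes through PHP, the export is limited only by disk space and the time mysqldump takes, not by the web server's request timeout.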