Deployment Issues

I was talking to my boss about the following issue that seems to be, as far as we can tell, unsolved.

Set up a git repo. Install a CMS (say WordPress, but any number of PHP solutions face this problem). Get the thing to launch and release to the client. Now the fun starts: how do you move module and page installs from a test server to a live machine efficiently? If you install a new module in development, test it, and/or write patches for it, you can't simply port the code over; the database entries created at install time must be repeated on the live machine.

It’s a headache.

Yes it is.

With Drupal, any database structure changes would be defined in an update hook. Then, when the module is pushed to production, the update would be run, modifying the database as necessary. That is really only one part of the problem, though.
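For context, here is a minimal sketch of what such an update hook looks like in Drupal 7; the module name, table, and field are made up for illustration:

```php
<?php
/**
 * Implements hook_update_N() in mymodule.install (Drupal 7).
 *
 * Runs exactly once, when update.php or `drush updatedb` is executed
 * after the new code is deployed, so the schema change made in dev is
 * repeated on production automatically.
 */
function mymodule_update_7001() {
  // Add the column the new feature needs to an existing table.
  db_add_field('mymodule_data', 'status', array(
    'type' => 'int',
    'not null' => TRUE,
    'default' => 0,
    'description' => 'Hypothetical publication-status flag.',
  ));
}
```

The numeric suffix (7001, 7002, ...) is how Drupal tracks which updates have already run on a given site.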

The other part is that, at least when it comes to Drupal, much of a site's structure is stored within the database itself in endless serialized arrays. So if you create fields, content types, taxonomy, panels, views, etc., all of which live in the database, how do you transition them to production? In most cases the answer is to carefully document the manual steps that must be repeated on production to replicate dev. However, there are some tools in Drupal that can aid in the process.

The Features module allows a developer to bundle up the structure of a site and install it on another instance, such as production. This can be quite an effective method, so long as Features works properly and integrates with the entities being used; it has been known to act erratically under certain conditions.
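Assuming drush and the Features module are both available, the round trip looks roughly like this (the feature name is hypothetical):

```shell
# On dev: bundle the relevant content types / fields / views into a
# feature module (my_site_feature is a made-up name).
drush features-export my_site_feature

# Commit the generated module and deploy the code, then on production:
drush pm-enable my_site_feature        # first-time install
drush features-revert my_site_feature  # on later deploys, push code into the DB
```
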

Ctools exports are another option, though much less automated than Features. Any module that implements the Ctools export API can have its database-stored entities exported as code and imported into another site via an import form where you simply copy and paste the exported code. But if, in dev, you were to create say 40 views and needed to transfer them to production, you would have to export and import each one separately, which is not very fun. Then you also run into the obvious problem where someone on production might have changed an existing entity without doing so through dev. The last module that can be useful is Strongarm, which is used to transfer various configuration between different sites.

The good thing about Drupal is that Drupal 8 has a very sophisticated configuration management system that will make it much easier to sync changes between different environments. The Drupal 8 Configuration Management Initiative (CMI) focused squarely on exactly the problem you're describing.
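In Drupal 8 that workflow reduces, roughly, to exporting the active configuration to YAML on dev and importing it on production with drush's config commands. A sketch (the sync directory path depends on your settings; `config/sync` is just a common choice):

```shell
# On dev: dump the active configuration to YAML files.
drush config-export
git add config/sync && git commit -m "Export config changes"

# On production, after pulling the code and config directory:
drush config-import
```

Because the configuration now lives in files, it can be version-controlled, diffed, and rolled back like any other code.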

I know I talked a lot about Drupal, but that's what I'm most familiar with. I can only imagine how big a pain this would be in something like WordPress, especially a large WordPress site.

Here are the Drupal 8 docs for CMI:

Very powerful, can’t wait until we can actually move to using Drupal 8.

I know that might not be all that useful, but all CMSes suffer from the same basic problem outlined on the CMI website. Seeing how one CMS has addressed it might motivate others to do so as well (cough, WP), but I'm not holding my breath.

Currently there is no good way to move Drupal configuration information between environments, because this data is scattered throughout the database in a variety of formats, oftentimes intermingled with content. This also makes it impossible to version control this information, to store history, and to be able to roll back changes. Every module stores its configuration data in a different format; there is no standardization at all, even within core. There is also no standard API for saving this information (aside from the simple case of the variables table), so developers often roll their own solutions. The entire contents of the variables table is loaded on each page request, even for rarely-accessed data, leading to memory bloat. It is cumbersome to manage information that differs between server environments for the same project (database information, API keys, etc.).

If all you need to do is update the database structure, you can use a program like Navicat for this. Navicat will allow you to do an ssh to ssh connection and sync just the structure of the databases, without changing the data in either if that’s what you want (always make a complete backup first, of course).

It's not that simple, though. While development of the feature is under way, the client makes ongoing additions to production. It isn't possible to sync production to dev, because that would destroy the client's work, and syncing the other way destroys the database entries made for the new features.

Unless you have a very specific situation, data sync between development and production shouldn't be your worry: you just need the same schema in both places, and you can work with dummy data. For the schema changes, you should have migrations.

If you’re using WordPress:

  1. wp-cli should become your best friend: it's a very easy way to run migrations and any other scripts you want to run to move/port stuff.
  2. imho, altering the DB schema is a bad idea. If you're running a website, you can work within the DB schema that WordPress provides, with some creative thinking. But if you have specific needs that are better served by altering the DB schema, then that's something else.
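For example, a few wp-cli commands that tend to come up in a deploy (the URLs and file names below are placeholders):

```shell
wp db export backup.sql                  # snapshot before touching anything
wp search-replace 'http://dev.example.com' 'http://www.example.com'
wp plugin activate my-plugin             # enable the newly deployed module
wp eval-file migrate.php                 # run a one-off migration script
```

`wp search-replace` is the usual answer to WordPress's habit of storing absolute URLs (including inside serialized data) when moving between environments.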

If you're switching over from one module to another module that stores data differently, then you need to deploy code with a fallback: first check for the existence of data via the new module, and if it's there, use it; otherwise fall back to the current module. Run your data migration (whichever way you want), and once it's done and verified, you can deploy a code patch that removes the now-defunct fallback.
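A minimal sketch of that fallback in PHP (the arrays stand in for the new and old modules' storage, and every name here is hypothetical, not a WordPress API):

```php
<?php
// Read-with-fallback: prefer the new module's data when it already
// holds this key, otherwise fall back to the legacy module's copy.
function get_setting($key, array $new_store, array $old_store) {
  if (array_key_exists($key, $new_store)) {
    return $new_store[$key];
  }
  return isset($old_store[$key]) ? $old_store[$key] : null;
}

$new_store = array('title' => 'Migrated title');                // partly migrated
$old_store = array('title' => 'Old title', 'color' => 'blue');  // legacy data

echo get_setting('title', $new_store, $old_store);  // "Migrated title"
echo get_setting('color', $new_store, $old_store);  // "blue"
```

Once the migration has copied everything into the new store and been verified, the fallback branch can be deleted in a follow-up deploy.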