The point of a test site is to have a place to experiment or develop new content without impacting your users or giving away your latest gee-whiz before its time. But it is also important to make sure that you always have a good working version of your live site on your test site.

Let's ignore, for the moment, that in an "ideal" world you will have a staging site which is always identical to the live site and nothing goes live without being staged.

There are two basic techniques for keeping your sites synchronized:

  • Manually making sure that the content of the two sites is the same.
  • Frequent uploads or downloads of database backups.

The first technique is quite time-consuming, and it makes it difficult to keep user comments on your test site. It also relies on your memory to know what has changed, unless you update both sites at the same time. If there are things that depend on other things (sorry for the technical terminology), you have to be careful to update them in the right order so that a random visitor doesn't run into something broken.

The second technique allows for user comments to be downloaded to your test site. From there they can be incorporated into the content and deleted, or left as is. You just have to be careful that you don't lose new, experimental, or incomplete content that hasn't been put onto the production site yet.
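For example, on a typical shared LAMP host the second technique can be as simple as dumping the live database and loading the dump into the test database. This is only a sketch; the database names, user names, and file names below are placeholders for whatever your host actually uses:

    # Dump the live database (ideally with the live site in maintenance mode).
    mysqldump -u live_user -p live_db > live_backup.sql

    # Keep a dated copy so the test site can be rolled back later if needed.
    cp live_backup.sql live_backup-$(date +%Y%m%d).sql

    # Load the dump into the test database, replacing its current contents.
    mysql -u test_user -p test_db < live_backup.sql

Run in the other direction (dump the test database, load it into the live one), the same sketch becomes the "upload" variant, which is exactly where you risk overwriting comments posted on the live site in the meantime.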

Which is better? I can't answer that question for you. You need to weigh the risks and benefits and decide that for yourself. Personally, at my age and with my memory, the second technique is probably better for me, at the risk of losing something not yet uploaded. I also have the bad habit of updating things on the live site and not bringing the changes back to the test site.

NOTE: the following is more advanced and not part of the original Cookbook:

You may want to review the following modules which can assist with migrating database changes:

You might also consider development using tools which allow feature sets to be stored in files so they can be applied to other site databases:

Some modules, like CCK and Views, provide a means to export their settings so that they can be applied to other sites. In combination with custom implementations of hook_update_N, this can provide an automated way to migrate some kinds of database changes.

For more discussion on this topic, see this post by Larry Garfield.

Comments

authentictech

It is sometimes a good idea to create your test site on the same server as your live site, especially if you cannot guarantee that your local test system has the same set-up as your remote live system. For example, if your live web server still uses MySQL 3.23 with Apache 1.1 running on Linux, but your local test site uses MySQL 5.1 with Apache 2.2 on Windows, you cannot always guarantee that Drupal, or the modules and themes you install, will work on your remote server just because they work on your local test server, because of version incompatibilities.

I usually do this by creating a test directory on the same server my live site is using. I password-protect that folder so that it is inaccessible to the public (I use .htaccess to do this; see the Apache documentation for how it is done, or check your server control panel for a shortcut).
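In case it helps, here is a rough sketch of that .htaccess approach using HTTP basic authentication. The paths, folder name, and user name are only placeholders; check your own server's layout before copying anything:

    # Create a password file outside the web root (you will be prompted for a password).
    htpasswd -c /home/youraccount/.htpasswd testuser

    # Write an .htaccess file in the test directory that requires that login.
    {
        echo 'AuthType Basic'
        echo 'AuthName "Test site"'
        echo 'AuthUserFile /home/youraccount/.htpasswd'
        echo 'Require valid-user'
    } > /home/youraccount/public_html/test/.htaccess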

It is much better to use a different database for testing so there is no chance of your test site corrupting your live site's database; but if you cannot do that, you will have to use a table prefix (such as "test_") to differentiate between the test and live databases. Be sure to configure this correctly, though, in your /sites/yoursite.com.testfolder/settings.php file.

You could just make sure that the software running on your local server is the same as on your remote server, but if you have accounts with several different hosting service providers, or are doing work for several different clients, it is sometimes impractical to maintain multiple versions of software on your local test system to accommodate this. There is peace of mind in knowing for sure that the installation and modules that you are testing on your test site will definitely work on your live site as well.

kwinters

This is the general idea behind a "staging" site. You would want it to be as identical to the production (live) environment as possible, but with its own folders and data. This allows you to avoid situations where a module works in your development environment but not on the live site (a fairly common issue in load-balanced, multi-server environments).

I wholeheartedly recommend keeping an entirely separate development environment on a completely different and isolated server (your own personal computer if need be). It's entirely possible for a misbehaving module, or something similar, to trash the entire server and not just your "test" environment. Something like rm -rf * can happen, and there's a good chance it eventually will. Restoring a development snapshot is a far better situation than trying to bring production servers back online.

graphicdefine

Using a tool like SVN or CVS can help keep local and live builds correctly synced. These tools are especially useful if you are building a website with multiple developers working at the same time. Also, the beauty of these tools is that they keep archived versions of the files to go back to should anything go wrong.
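For anyone new to these tools, a typical SVN round trip looks something like the following; the repository URL and commit message are made-up examples:

    # Check out a working copy of the site code.
    svn checkout http://svn.example.com/repos/mysite/trunk mysite
    cd mysite

    # ...edit modules, themes, settings...

    # Review what changed, then commit it to the repository.
    svn status
    svn diff
    svn commit -m "Tweak custom theme"

    # On the other machine (e.g. the live server), pull down the committed changes.
    svn update

Keeping the database in sync is a separate problem, as discussed above; SVN and CVS only handle the files.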

Useful Links:
CVS Homepage
Subversion Homepage
Windows SVN Client
Mac SVN Client
Eclipse SVN Plug-in

big_a_from_pa

Hello - we are looking to set up an offline test platform and implement version control with SVN. Can you point me to any posts/how-tos/articles on how to use SVN to keep the test and live sites in sync? Any help is tremendously appreciated!

tcarnell

Thanks for the original article - it is a breath of fresh air that someone has actually thought about the real-world practical issues of using Drupal. However, no practical solutions are actually given here.

It seems to me that a PHP script could be written that points to a 'source' installation and a 'target' installation and, when run:

1. creates a backup of the 'target' database
2. exports .sql files for the 'content'
3. drops the 'target' database
4. creates a new 'target' database as a copy of the 'source' database (without content)
5. inserts the live data into the new 'target' database
6. recurses over all files and folders in the source directory and overwrites/copies them to the target directory.

But with the myriad modules, is such a script even possible with Drupal? Has anybody attempted this, or even published this?
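For what it's worth, the mechanical parts of that list (steps 1, 3, 4 and 6) are easy enough to sketch as a shell script; it is steps 2 and 5, separating out the 'content' tables, that depend on which of the myriad modules are installed, and that is where such a script gets hard. A rough sketch of the easy parts, with all database names, credentials, and paths as placeholders:

    # 1. Back up the 'target' database first.
    mysqldump -u target_user -p target_db > target_backup-$(date +%Y%m%d).sql

    # 3 and 4. Replace the target database with a copy of the source database
    # (the dump includes DROP TABLE statements, so existing tables are replaced).
    mysqldump -u source_user -p source_db > source_copy.sql
    mysql -u target_user -p target_db < source_copy.sql

    # 6. Copy the source file tree over the target one (trailing slashes matter to rsync).
    rsync -av /path/to/source/ /path/to/target/

Steps 2 and 5 (exporting the content tables and re-inserting them) are deliberately left out, since which tables count as 'content' varies from site to site.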

Doing absolutely anything in Drupal would be 10,000 times easier if it had been designed to have both a 'content' database and a 'system' database. If Drupal had been designed this way, systems could be upgraded, migrated, synchronized, and replicated completely independently of the actual content.

So, has anybody actually successfully automated the synchronization of two Drupal installations (or at least something that fulfills the 'staging' to 'live' change propagation requirement)?

tom

NancyDru

I use this process to keep my local sites synced with my live sites and it works fine for me.

dman

Yeah, it's called drush sql-sync
You need to set up the login and path details for the respective installations (drush calls these configs site aliases), but once that's done, syncing them is a one-liner.
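For example (a minimal sketch; the alias names @live and @test are placeholders for aliases you define yourself, e.g. by copying drush's example.aliases.drushrc.php):

    # Pull the live database down into the test site (source first, then target).
    drush sql-sync @live @test

    # Optionally copy the public files directory the same way.
    drush rsync @live:%files @test:%files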