This article is part of a series about my deployment workshop at T3DD13. Make sure to read the other posts, too.
An example deployment setup
During the workshop, we first talked about the deployment setups and tools we use. I presented a setup I recently worked with.
We had all code versioned in Git, using the „Git flow“ approach for branching and releasing. If you don’t know it yet, I highly recommend reading „A successful Git branching model“ and checking out this Git flow cheat sheet. In short, it means we had a stable, always-ready-to-release master branch, a release branch, and a develop branch with the latest changes.
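The branch dance behind that model looks roughly like this — a runnable sketch in a throwaway repository (the version number 1.2.0 and all commit messages are made up for illustration):

```shell
#!/bin/sh
# Git-flow release cycle, demonstrated in a temporary repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name  Demo
git commit -q --allow-empty -m "initial commit"
git branch -M master                        # master = stable, releasable

git checkout -q -b develop                  # everyday work lands here
git commit -q --allow-empty -m "feature work"

git checkout -q -b release/1.2.0 develop    # stabilise the next release
git commit -q --allow-empty -m "release fixes"

git checkout -q master
git merge -q --no-ff -m "Release 1.2.0" release/1.2.0   # master = latest stable
git tag -a 1.2.0 -m "Release 1.2.0"

git checkout -q develop
git merge -q --no-ff -m "Back-merge 1.2.0" release/1.2.0  # fixes flow back
git branch -q -d release/1.2.0

git tag   # prints: 1.2.0
```

The `--no-ff` merges keep the branch history visible, which is what makes the Git flow graph readable later on.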
Besides multiple development installations on the developers’ local machines, we had a live, a latest, and a staging system.
On live, the master branch was checked out (i.e. the latest stable release). live is also the system where the editors edit the actual content.
On staging, the HEAD of the release branch was checked out. This branch contains the latest changes and features that are due to be released to live next. Once in a while, a manually triggered script (see below) copied the MySQL database and the asset files (such as fileadmin/ and uploads/, using rsync) from live to staging. This means all changes on staging get overwritten eventually. That way the newest features and fixes could be tested against current live data before being released to the actual live system. The customer also had access to this system to test and accept the new changes.
On latest, the HEAD of develop was checked out. The changes of all developers come together on this one system to be tested, which also means the system might be broken from time to time. As on staging, the database and assets (uploads/ and fileadmin/) came from live, although far less frequently.
Unfortunately, we didn’t have a real integration system, where a developer or editor could prepare a new feature (create pages and content), test it, and easily publish it to the live system (content deployment is still an unsolved issue). We had to do this on staging (for testing and acceptance) and then once again on live after releasing. For complex setups (multiple pages, lots of content) we used .t3d exports to get the content to the live system. From my personal experience, you really need to be careful when exporting/importing .t3d files!
We had a Jenkins instance that took care of deploying code and data to the different systems. These are the Jenkins jobs we set up:
deploy to latest
This job was triggered on each commit (actually, whenever someone pushed to the central Git repository, via a Git post-receive hook). It started a TYPO3 Surf deployment, which deployed the develop branch to latest and ran „EXT:coreapi“ on the CLI to clear the caches and do a „DB compare“ on latest.
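A post-receive hook for this could look like the following sketch. The Jenkins URL, the job name, and the printed „trigger:“ line are placeholders, not the workshop’s actual configuration:

```shell
#!/bin/sh
# Sketch of a post-receive hook on the central repository (names hypothetical).
JENKINS_URL="${JENKINS_URL:-https://jenkins.example.com}"

trigger_deploy() {
  # $1 is the pushed ref; only develop should kick off "deploy to latest".
  case "$1" in
    refs/heads/develop)
      echo "trigger: ${JENKINS_URL}/job/deploy-to-latest/build"
      # A real hook would notify Jenkins here, e.g.:
      #   curl -fsS "${JENKINS_URL}/job/deploy-to-latest/build?token=..."
      ;;
  esac
}

# Git feeds "<old-sha> <new-sha> <refname>" lines to the hook on stdin:
#   while read oldrev newrev refname; do trigger_deploy "$refname"; done
trigger_deploy refs/heads/develop   # prints the trigger URL
trigger_deploy refs/heads/master    # other branches: no deployment
```

Filtering on the ref name matters: without it, every push to any branch would redeploy latest.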
TYPO3 Surf was installed on the same machine Jenkins was running on. TYPO3 Surf and all Surf deployment configurations (for other projects, too) were versioned in a separate Git repository.
deploy to staging
This job was only triggered manually, never automatically. It also ran TYPO3 Surf to deploy the HEAD of release to staging, clear the caches, and do a „DB compare“.
deploy to live
This is almost the same as „deploy to staging“, except it deploys the HEAD of master to live instead.
fetch live data to testing or staging
This manually triggered job did a mysqldump on the live system (excluding the caching and log tables) and applied it to the testing or staging system. All asset files (e.g. images) from live were synced using rsync. The database was just a few hundred MB, and all asset files together summed up to some 5–6 GB (but most of them never changed, so rsync was pretty fast).
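The core of such a job can be sketched as below. The host names, database name, and table list are assumptions (cache table names differ between TYPO3 versions); the actual transfer commands are shown as comments only:

```shell
#!/bin/sh
# Sketch of the "fetch live data" job (all names hypothetical).
set -e
LIVE_HOST=live.example.com
DB=typo3_live

# Build --ignore-table options so caching and log tables stay out of the dump:
ignore_opts() {
  db="$1"; shift
  for t in "$@"; do printf ' --ignore-table=%s.%s' "$db" "$t"; done
}
OPTS=$(ignore_opts "$DB" cf_cache_hash cf_cache_pages sys_log)
echo "mysqldump options:$OPTS"

# The job would then run, roughly:
#   ssh "$LIVE_HOST" "mysqldump --single-transaction$OPTS $DB" > live.sql
#   mysql typo3_staging < live.sql
#   rsync -az "$LIVE_HOST":/var/www/fileadmin/ /var/www/fileadmin/
#   rsync -az "$LIVE_HOST":/var/www/uploads/   /var/www/uploads/
```

Skipping the cache and log tables keeps the dump at a fraction of its full size, and rsync only transfers changed files, which is why the 5–6 GB of assets were not a problem in practice.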
Whenever a developer wanted to update their local system, they manually fetched a database dump from staging on the CLI and, if necessary, some asset files via rsync, too.
Deployment setups of others in the workshop
During the T3DD13 workshop we also discussed the setups and tools the participants are using:
Using Git seems to be very common these days; only a few are (still?) using SVN. When using Git, most people either prefer submodules to include extensions and other repositories, or just keep „everything in one repository“. Using multiple branches (like master/staging/production or master/release/develop) seems to be good practice.
Using Composer instead seems to be a very promising alternative, but for TYPO3 CMS no one had used it in production yet (at the time of asking). In the meantime, Lightwerk/Felix Oertel set up a TYPO3 CMS Composer repository containing all extensions not flagged as insecure, the TYPO3 CMS core, and a custom Composer installer. Using this repository, one can set up a TYPO3 installation with Composer instead of Git submodules.
Jenkins is also very widely used to automate deployments. Depending on personal preference, it triggers „TYPO3 Surf“, „Ant“, or „phing“ to do the actual deployment.
I was very pleased to hear that EXT:coreapi has found its place in many deployment setups to automatically clear caches or do database schema updates (DB compare). As an alternative (and maybe for historical reasons only ;-) some are using EXT:t3deploy and EXT:cleartypo3cache.