Dockerized FTP deploy of Drupal

Submitted by Stopka

Over time, the list of websites I manage has grown quite long. I built all of them on top of the open-source project Drupal, so whenever a new release of Drupal or of one of its extensions arrives, a new round of updating work begins for me.

Updating an extension used to be an easy task, because Drupal has a built-in extension installer. But the PHP world has adopted dependency installation via Composer, and several Drupal extensions need to download such dependencies. This complicates the update, because you have to download the extension and then extend and rebuild the vendor directory of PHP libraries. And the same goes for updating Drupal itself: Drupal can't update its own core, so you have to download the new version, rebuild the vendor directory for the extensions, and then deploy.

And the deployment itself is also problematic. A few of my websites run on my own server, where it is easy, because I run them in Docker and an update is just a matter of rebuilding the containers. But most of my websites run on some LAMP webhosting, where the application files have to be uploaded over FTP (FTPS, to be precise). And that is a very annoying job. You have to upload the new application files, pay attention not to overwrite configuration or user data, and then delete the old files that are no longer needed. Uff, this is not the right way of doing it.

Build

I was thinking that this can't bother only me, so I started looking for a solution. I found out that nowadays Drupal can be installed entirely with Composer, extensions included. All Drupal core components are published in Composer repositories and can be downloaded together by adding a single Composer dependency on "drupal/core-recommended". Furthermore, all official Drupal extensions published on drupal.org are available in a Composer repository under the "drupal/" vendor. And it gets even better, because there is a Composer plugin, "drupal/core-composer-scaffold", which (together with the "composer/installers" plugin) automatically places all the downloaded content into the proper Drupal directory structure: extension modules go to the www/modules directory, themes to www/themes, the core to www/core, and so on. All this reduces the whole website definition to a single Composer file. Perfect.
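To give an idea, here is a minimal sketch of such a composer.json. The drupal/core-recommended, drupal/core-composer-scaffold and composer/installers packages and the packages.drupal.org repository are real; the project name, the version constraints and the pathauto module are just placeholders for this example:

```json
{
    "name": "stopka/example-drupal-site",
    "type": "project",
    "repositories": [
        { "type": "composer", "url": "https://packages.drupal.org/8" }
    ],
    "require": {
        "composer/installers": "^2",
        "drupal/core-composer-scaffold": "^10",
        "drupal/core-recommended": "^10",
        "drupal/pathauto": "^1"
    },
    "config": {
        "allow-plugins": {
            "composer/installers": true,
            "drupal/core-composer-scaffold": true
        }
    },
    "extra": {
        "drupal-scaffold": {
            "locations": { "web-root": "www/" }
        },
        "installer-paths": {
            "www/core": ["type:drupal-core"],
            "www/modules/contrib/{$name}": ["type:drupal-module"],
            "www/themes/contrib/{$name}": ["type:drupal-theme"]
        }
    }
}
```

A single composer install on a file like this then produces the whole www/ directory tree with the core, modules and themes in place.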

I refactored all my Drupal websites to this new structure. I prepared the composer.json file and added a Dockerfile, which installs Nginx, PHP and Composer and builds the website app. It is worth mentioning that there are already images on the internet that do the same: they download Drupal and its extensions and run the app with the full webserver stack. But they often use Apache with the PHP module, while I would rather use Nginx with php-fpm; the installation is often managed by RUN commands in the Dockerfile, while I prefer to define my own composer.json file... In short, none of the prebuilt Docker images fully suited me, so it was easier for me to put together my own Dockerfile from scratch.
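For illustration, here is a stripped-down sketch of such a Dockerfile. It is not my exact file; the base image, the extension list and the paths are assumptions for the example:

```dockerfile
# Sketch: php-fpm base image with Nginx and Composer on top
FROM php:8.2-fpm

# Nginx plus the PHP extensions a typical Drupal site needs
RUN apt-get update && apt-get install -y nginx unzip \
    && docker-php-ext-install pdo_mysql opcache

# Composer binary copied from the official Composer image
COPY --from=composer:2 /usr/bin/composer /usr/bin/composer

# Build the whole app from the single Composer file
WORKDIR /var/www/app
COPY composer.* ./
RUN composer install --no-dev --optimize-autoloader

# Wire up the webserver and run php-fpm behind Nginx
COPY nginx.conf /etc/nginx/sites-enabled/default
CMD ["sh", "-c", "php-fpm -D && nginx -g 'daemon off;'"]
```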

Every website I make has, of course, its own custom-made design theme. I added the theme module to the repository and created a Composer file for it, so Composer can install it automatically as one of the theme packages. Themes also always need to be built with frontend tools and packages from the npm registry, so I extended the Dockerfile with a CSS and JavaScript build phase.
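The frontend build fits naturally into a separate stage of the same Dockerfile, roughly like this (the theme path and the npm scripts are hypothetical):

```dockerfile
# Hypothetical frontend build stage for the custom theme
FROM node:20 AS theme-assets
WORKDIR /theme
COPY themes/mytheme/package*.json ./
RUN npm ci
COPY themes/mytheme/ ./
RUN npm run build

# ...and in the final image, only the compiled assets are copied in:
# COPY --from=theme-assets /theme/dist /var/www/app/www/themes/custom/mytheme/dist
```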

At this point, the websites I run in Docker are done. But as I mentioned, most of my websites still have to be deployed to a webhosting over FTP somehow.

Deploy

In this step I used another simple tool I discovered: "dg/ftp-deployment", a command-line application written in PHP by David Grudl and also published in the Composer repository. The app is designed in a clever way: together with all the files it also uploads a list of those files with their hashes, so the next time it can upload only the changed or new files and delete the old files that are no longer needed.

It's also easy to configure: it reads an ini or PHP file. I have chosen the PHP config, because this way I can read several config values from environment variables. The configuration file defines which directories should be uploaded, which server they should be uploaded to, and the credentials to use for the upload. All the private values are loaded from environment variables and stored in a separate env file, so they never end up in the git repository.
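Just as a sketch, assuming the option names from ftp-deployment's README (remote, local, ignore, allowDelete) and my own made-up FTP_* variable names, such a deploy.php can look roughly like this:

```php
<?php
// Sketch of a deploy.php for dg/ftp-deployment. The FTP_* variable
// names are my own convention, not something the tool prescribes.
return [
    'website' => [
        // FTPS URL assembled from environment variables kept out of git
        'remote' => 'ftps://' . getenv('FTP_USER') . ':' . getenv('FTP_PASSWORD')
            . '@' . getenv('FTP_HOST') . '/' . getenv('FTP_PATH'),
        'local' => '/var/www/app',
        // never touch user uploads or the live site configuration
        'ignore' => '
            www/sites/*/files
            www/sites/*/settings.php
        ',
        // let the tool delete remote files that are no longer needed
        'allowDelete' => true,
    ],
];
```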

I extended the Dockerfile once more and added a deploy phase, which installs the ftp-deployment application and adds the configuration file to the container. At that point everything is prepared and all I have to do is call the deploy app.
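The deploy phase can be as small as a few extra Dockerfile lines, for example:

```dockerfile
# Hypothetical deploy additions: install the tool globally and ship the config
ENV COMPOSER_HOME=/opt/composer
RUN composer global require dg/ftp-deployment
ENV PATH="/opt/composer/vendor/bin:${PATH}"
COPY deploy.php /var/www/deploy.php
```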

After all this, an update is performed almost automatically. First I rebuild the container, which gives me a working copy of the website on localhost, where I can test that everything still works as expected. Then I connect to the container and call the deploy app. That's all. This is a huge improvement for me.
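In practice the whole round then boils down to a few commands like these (the image and container names are made up for the example):

```sh
# rebuild the image: downloads Drupal + extensions, builds the theme
docker build -t mysite .

# run it locally and check everything at http://localhost:8080
docker run -d --name mysite --env-file deploy.env -p 8080:80 mysite

# once it looks good, run the deploy inside the container
docker exec -it mysite deployment /var/www/deploy.php
```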

Where can you have a look?

I have published the updated sources of one of my websites on GitHub; the link is below. It's a small, simple presentation site of the florist and jewellery maker Petra Šípková.