Weekly MediaWiki and Extension updates

The stable (and/or LTS) release of a piece of software should be used by third-party customers who want reasonably stable software and don’t want the high maintenance cost of continuous updates, but rather a plannable time frame for security fixes and new releases. This allows system administrators to plan the amount of maintenance work they have to do for a software installation.

However, this also means that the users of, for example, your MediaWiki site do not get all the cool new features (and, of course, bugs and bug fixes ;)) of the latest MediaWiki build, e.g. functions they know from Wikipedia or other big MediaWiki sites. On the other hand, a release that isn’t marked as stable isn’t guaranteed to work, may contain critical bugs or may include major changes your extensions no longer work with. Using a release that isn’t marked as stable is a high risk from an operational point of view and may result in problems and high maintenance costs.

On droidwiki.org, however, you’ll find a MediaWiki release that is also among the latest versions installed on the big Wikipedias (like the English or German Wikipedia). That means droidwiki.org gets the latest wmf release, which is available every Friday night, and thereby follows the update cycle of Wikipedia. The staff of the Wikimedia Foundation (the Release Engineering team) creates a new wmf release every week (currently on Tuesday) and deploys it to all MediaWiki-based sites operated by the team: on Tuesday to group 0, on Wednesday to group 1 and finally on Thursday to the Wikipedias (group 2); see e.g. the Release Roadmap of the upcoming 1.29 release of MediaWiki. On Friday, after critical bugs found during the deployment have been fixed, droidwiki.org gets this version of MediaWiki. This blog post should give some insight into how this is done, which steps are required to achieve it and what could probably be improved.

Preparation and update of the droidwiki.org codebase

The MediaWiki code and the code of its extensions are mainly kept in mirror-like git repositories in the Gerrit installation used for droidwiki.org (and other projects). This means that the code deployed on droidwiki.org and the other MediaWiki sites needs to go through these git repositories, as they are the single defined state of the software. As a preparation step for upgrading to the latest wmf release, the code in these repositories needs to be updated.

On the staging and upgrading server for droidwiki.org, there’s a shell script which does this automatically for MediaWiki core and all WMF-maintained extensions (as only these get a wmf release branch). In fact, the script iterates over all known extensions that should be updated and updates each git repository by downloading the latest commit from the Wikimedia Gerrit code-review tool and committing it to gerrit.go2tech.de. This, of course, takes some time, depending on the number of extensions.
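
To illustrate the idea, here is a minimal sketch of such a mirror update, written as Python for readability (the actual script is a shell script); the repository URLs, the local clone path, the branch name and the extension list are placeholders, not the real configuration:

#!/usr/bin/env python3
# Sketch only: mirror the latest wmf branch of each extension from the
# Wikimedia Gerrit into the local Gerrit for review. All URLs and paths
# are placeholders.
import subprocess

WMF_GERRIT = 'https://gerrit.wikimedia.org/r/mediawiki/extensions/'
LOCAL_GERRIT = 'ssh://gerrit.go2tech.de:29418/mediawiki/extensions/'  # assumed remote
BRANCH = 'wmf/1.29.0-wmf.X'  # the wmf release branch to mirror
EXTENSIONS = ['Cite', 'ParserFunctions', 'Echo']  # example list only

def run(cmd, cwd):
    subprocess.check_call(cmd, cwd=cwd)

for ext in EXTENSIONS:
    clone = '/data/staging/mirrors/' + ext  # assumed local clone location
    # Fetch the latest commit of the wmf branch from the Wikimedia Gerrit ...
    run(['git', 'fetch', WMF_GERRIT + ext, BRANCH], cwd=clone)
    run(['git', 'checkout', 'FETCH_HEAD'], cwd=clone)
    # ... and push it to the go2tech.de Gerrit as a change for review.
    run(['git', 'push', LOCAL_GERRIT + ext, 'HEAD:refs/for/master'], cwd=clone)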

The changes are not pushed directly to the master branches of the gerrit.go2tech.de clones of the repositories, but have to go through code review, mainly to have the possibility to run some sanity code checks with Jenkins, which, in fact, runs the same test suite that runs when the code is submitted in the WMF Gerrit. Running the tests on the go2tech.de infrastructure ensures, at a very basic level, that at least the fundamental things work. In addition, if something weird is spotted in a specific extension, it’s easier to implement new tests against it for go2tech.de. On the other hand, this means that each change has to be approved by someone with +2 rights in the go2tech.de Gerrit. This does usually not require a manual code review of each change, which is why there’s a Python script that reviews each of these changes automatically through the Gerrit REST API.
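
Such an auto-review script could look roughly like the following sketch; the base URL, the credentials and the change query are assumptions, and error handling is omitted:

#!/usr/bin/env python3
# Sketch only: approve and submit the automatically created update changes
# via the Gerrit REST API. Host, credentials and query are placeholders.
import json
import requests

GERRIT = 'https://gerrit.go2tech.de'                      # assumed base URL
AUTH = ('update-bot', 'http-password')                    # hypothetical HTTP credentials
QUERY = 'status:open owner:update-bot label:Verified=+1'  # changes created by the updater

def gerrit_get(path, **params):
    r = requests.get(GERRIT + '/a' + path, params=params, auth=AUTH)
    r.raise_for_status()
    # Gerrit prefixes every JSON response with ")]}'" to prevent XSSI.
    return json.loads(r.text[len(")]}'"):])

def gerrit_post(path, payload):
    requests.post(GERRIT + '/a' + path, json=payload, auth=AUTH).raise_for_status()

for change in gerrit_get('/changes/', q=QUERY):
    change_id = change['id']
    # Vote Code-Review +2 on the current revision ...
    gerrit_post('/changes/%s/revisions/current/review' % change_id,
                {'labels': {'Code-Review': 2}, 'message': 'Automatic weekly update'})
    # ... and submit (merge) the change afterwards.
    gerrit_post('/changes/%s/submit' % change_id, {})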

After all changes have been checked by the CI and merged into their git repositories, the real work can begin. In fact, this only means that the staging area for MediaWiki needs to be updated. The staging area is a full copy of the MediaWiki release that should be deployed to the droidwiki.org wikis, including the MediaWiki core and extension code itself, the configuration for MediaWiki and some services code (e.g. for the jobrunner and jobchron). The MediaWiki core code is fetched directly from the Wikimedia Gerrit (the copy in the go2tech.de Gerrit is only updated for the CI system and will probably be replaced at some point, too), whereas the other code is fetched from the master branches of the go2tech.de Gerrit installation. Updating MediaWiki core is a completely manual step (e.g. git fetch && git checkout wmf/1.29.0-wmf.X), whereas for the extensions an update shell script exists. This shell script iterates over all directories, checks whether each one is a git clone and updates it to master.
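
In essence, that extension update does something like the following (again shown as a Python sketch rather than the real shell script; the staging path is an assumption):

#!/usr/bin/env python3
# Sketch only: bring every extension clone in the staging area up to date
# with the master branch of its go2tech.de mirror.
import os
import subprocess

STAGING_EXTENSIONS = '/data/staging/mediawiki/extensions'  # assumed path

for name in sorted(os.listdir(STAGING_EXTENSIONS)):
    path = os.path.join(STAGING_EXTENSIONS, name)
    # Skip everything that is not a git clone.
    if not os.path.isdir(os.path.join(path, '.git')):
        continue
    subprocess.check_call(['git', 'fetch', 'origin'], cwd=path)
    subprocess.check_call(['git', 'checkout', '-B', 'master', 'origin/master'], cwd=path)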

Because MediaWiki and some extensions require dependencies that are fetched from packagist.org using composer, these dependencies need to be updated, too. Instead of going into each extension’s directory and installing the composer dependencies again and again, droidwiki.org uses the composer merge plugin, which is provided by MediaWiki and merges a specified set of composer.json requirements into a single one, so that only this one needs to be updated. The relevant lines are located in the composer.local.json file in the MediaWiki core clone and look roughly like this (with the include list shortened to the usual wildcard pattern):

"extra": {
	"merge-plugin": {
	        "include": [

This merges the composer.json of each extension into the MediaWiki core one. Because MediaWiki automatically loads the composer autoloader during its initialization, it’s enough to run composer update in the MediaWiki core clone only in order to update all composer requirements.

Updating the code on all application servers

Once the staging area is built, the actual deployment can start. The MediaWiki code is stored on each application server in the /data/mediawiki/ directory, which means that each server needs to be updated in order to deploy the code. The WMF uses rsync for this, and so does droidwiki.org. However, the WMF has an rsync server from which each app server can fetch the changes, which allows the app servers to update in parallel. For droidwiki.org, as there aren’t many app servers, a more serial approach is used: instead of having one rsync server which the app servers can fetch from, the deployment script (written in Python and oriented on how scap3 works) pushes the changes via rsync to all app servers. This takes some more time and will not scale to many production servers, as the time the deployment needs increases with each new app server. But for now, it works pretty well.
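
The sync step of such a push-based deployment boils down to something like this sketch; the host names and the staging directory are placeholders, and the real script additionally takes care of ordering and error handling:

#!/usr/bin/env python3
# Sketch only: push the staging area to every app server, one after another.
import subprocess

STAGING = '/data/staging/mediawiki/'                     # assumed staging directory
TARGET = '/data/mediawiki/'                              # code directory on the app servers
APP_SERVERS = ['app1.example.org', 'app2.example.org']   # placeholder host names

for host in APP_SERVERS:
    # rsync only transfers the files that actually changed since the last deployment.
    subprocess.check_call([
        'rsync', '--archive', '--delete', '--compress',
        STAGING, '%s:%s' % (host, TARGET),
    ])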

After the files on the app servers have been updated from the staging area, the deployment script runs the update.php script for each site (using the MWScript.php script file from the minimized multiversion setup) as well as the rebuildLocalisationCache.php maintenance script. For the latter, there’s one problem: between the time the files are deployed and the time the localisation cache is updated (even if it’s rebuilt directly after the sync of the files), there can be fatal errors which make the MediaWiki sites inaccessible. The reason is that, e.g., magic words need to exist in the language cache, and new magic words do not yet. That means that whenever an extension or MediaWiki core introduces a new magic word, droidwiki.org will break for the time it takes to deploy the new version of the code. This could be resolved by rebuilding the localisation cache before the code is deployed and deploying the new cache files together with the MediaWiki code. This, however, requires loading all extensions during the rebuild of the localisation cache. For now, I haven’t had time to implement such functionality, and there aren’t many new magic words, so it’s only a minor problem.
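
For completeness, the post-sync maintenance step looks roughly like the following sketch; the multiversion path and the wiki IDs are assumptions:

#!/usr/bin/env python3
# Sketch only: run the maintenance scripts for every wiki after the sync.
import subprocess

MWSCRIPT = '/data/mediawiki/multiversion/MWScript.php'   # assumed path
WIKIS = ['droidwiki', 'datawiki']                        # placeholder wiki IDs

for wiki in WIKIS:
    # Apply pending database schema changes for this wiki ...
    subprocess.check_call(['php', MWSCRIPT, 'update.php', '--wiki=' + wiki, '--quick'])
    # ... and rebuild the localisation cache so new messages and magic words exist.
    subprocess.check_call(['php', MWSCRIPT, 'rebuildLocalisationCache.php', '--wiki=' + wiki])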

After all of this, the MediaWiki sites, including droidwiki.org, run the new MediaWiki code. Of course, this blog post has not covered backing up the code files and the database, which is also done before a new deployment starts.
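
That backup can be as simple as the following sketch; the paths, the database selection and the credentials handling are assumptions, not the actual backup setup:

#!/usr/bin/env python3
# Sketch only: archive the deployed code and dump the databases before deploying.
import subprocess
import time

STAMP = time.strftime('%Y%m%d-%H%M%S')
BACKUP_DIR = '/data/backup'                              # assumed backup location

# Archive the currently deployed code ...
subprocess.check_call([
    'tar', '-czf', '%s/mediawiki-code-%s.tar.gz' % (BACKUP_DIR, STAMP),
    '/data/mediawiki/',
])

# ... and dump the wiki databases (credentials read from a defaults file here).
with open('%s/mediawiki-db-%s.sql' % (BACKUP_DIR, STAMP), 'wb') as dump:
    subprocess.check_call(
        ['mysqldump', '--defaults-file=/etc/mysql/backup.cnf', '--all-databases'],
        stdout=dump,
    )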
