Using docker-compose-ci
These notes supplement Automated deployment of MediaWiki and PHPUnit/VSCode.
I added docker-compose-ci to the CrawlerProtection extension, and here is how to do it.
Initial setup
- create a Makefile in the project root
- create an .env file in the project root
- create a docker-compose-override.yml file in the 'build' directory (the docker-compose-ci submodule; see the note just below on registering it)
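The Makefile below assumes that build/ is already registered as a git submodule tracking docker-compose-ci. If your repository does not have it yet, something like the following registers it (the upstream URL shown is my assumption; adjust it to the docker-compose-ci repository you actually use):
# Register docker-compose-ci as the 'build' submodule (URL assumed; adjust if needed)
git submodule add https://github.com/gesinn-it-pub/docker-compose-ci.git build
git commit -m "Add docker-compose-ci as build/ submodule"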
Fine-tuning
- add 'scripts' to composer.json
- add a .phpcs.xml
The necessary files are shown below.
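For orientation, here is a rough sketch of the resulting layout for CrawlerProtection (entries not listed above, such as extension.json, are just illustrative):
CrawlerProtection/                  # extension root (git repository)
├── Makefile                        # thin wrapper that includes build/Makefile
├── .env                            # key variables (MW_VERSION, PHP_VERSION, ...)
├── .phpcs.xml                      # ruleset for phpcs/phpcbf
├── composer.json                   # with the 'scripts' stanza added
├── extension.json                  # the extension itself (illustrative)
└── build/                          # docker-compose-ci git submodule
    └── docker-compose-override.yml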
Results
Bottom line up front: After this is set up, you can effortlessly test and check your code from your laptop/workstation before even making a commit, much less a pull request. You can avoid the work of creating GitHub Actions on your repo fork, though you can still do that as an intermediate quality filter.
# From your laptop/workstation (host)
make bash

# Inside container:
composer phpcs     # Check code style
composer phpcbf    # Fix code style
composer phpunit   # Run tests
composer test      # Run phpcs + phpunit
Makefile
Do not specify COMPOSER_EXT=false or NODE_JS=false. Any non-empty value, including "false", will cause that code branch to run, potentially causing a build failure.
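This is the usual Make emptiness test at work. A generic illustration of the pattern (not necessarily the exact code in build/; check there if in doubt):
# Generic Make pattern: only an empty/unset variable skips the branch,
# so COMPOSER_EXT=false still counts as "enabled".
ifneq ($(COMPOSER_EXT),)
    # "composer update" steps would run here
endif
The Makefile I use for CrawlerProtection follows.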
-include .env
export
# setup for docker-compose-ci build directory
ifeq (,$(wildcard ./build/))
$(shell git submodule update --init --remote)
endif
EXTENSION=CrawlerProtection
# docker images
MW_VERSION?=1.43
PHP_VERSION?=8.2
DB_TYPE?=mysql
DB_IMAGE?="mariadb:11.2"
# composer
# Enables "composer update" inside of extension
# Leave empty/unset to disable, set to "true" to enable
COMPOSER_EXT?=
# nodejs
# Enables node.js related tests and "npm install"
# Leave empty/unset to disable, set to "true" to enable
NODE_JS?=
include build/Makefile
.env
Here's the .env file I used to set key variables[1] relevant to my extension.
# .env file example
MW_VERSION=1.43
PHP_VERSION=8.1
DB_TYPE=mysql
SMW_VERSION=6.0.1
PF_VERSION=5.7 # compatible with MW 1.43
# COMPOSER_EXT=true # Uncomment to enable "composer update" inside extension
# NODE_JS=true # Uncomment to enable Node.js and "npm install"
docker-compose-override.yml
This file is critical to making life easier. Without it, you would have to resort to hacks like connecting VSCode to the container with Remote Explorer, configuring git for 'root', and adding root's .ssh key to your GitHub repo. That's a lot of wasted effort when you can simply mount the host directory into the container and edit/commit using VSCode as usual.
services:
  wiki:
    volumes:
      # Mount your local extension directory into the container
      # Changes in container will reflect on host and vice versa
      - ../:/var/www/html/extensions/CrawlerProtection
    ports:
      # Optional: expose wiki on localhost:8080
      - "8080:8080"
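A quick way to confirm the mount is working once the container is up: create a scratch file on the host and look for it from a shell in the container (the host path and file name are just placeholders):
# host
touch /path/to/CrawlerProtection/MOUNT_TEST    # your working copy on the host
make bash
# container
ls /var/www/html/extensions/CrawlerProtection/MOUNT_TEST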
Beautiful Code
Using the code beautifier (phpcbf) to enforce MediaWiki coding standards.
If you haven't yet installed MediaWiki in the container, all the code will be there, but there will not be any LocalSettings.php file, and some tools won't work like they would on an installed wiki.
You can cd /var/www/html and issue phpcbf commands directly, rather than going through the 'scripts' definitions in MediaWiki core's composer.json (which define, e.g., composer fix).
Direct usage
./vendor/bin/phpcbf -d memory_limit=512M --standard=MediaWiki --ignore=*/build/*,*/vendor/* /var/www/html/extensions/CrawlerProtection/
Or, you can create a .phpcs.xml configuration file for phpcbf that points to the vendored sniffs, and modify the extension's composer.json to include a 'scripts' stanza that also points to MediaWiki's vendored binaries.
With the .phpcs.xml in place, you point to it as the 'standard' in your 'scripts' configuration, or on the command line.
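On the command line, that looks like this when run from the extension directory inside the container (these are the same invocations the 'scripts' stanza below wraps):
cd /var/www/html/extensions/CrawlerProtection
../../vendor/bin/phpcs -sp --standard=.phpcs.xml
../../vendor/bin/phpcbf --standard=.phpcs.xml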
Fully-configured usage
composer phpcbf
In the Docker container:
- Extension is at /var/www/html/extensions/CrawlerProtection/
- MediaWiki vendor is at /var/www/html/vendor/
- So ../../vendor/bin/phpcs resolves correctly
In GitHub Actions:
- Extension is at $GITHUB_WORKSPACE/extensions/CrawlerProtection/
- MediaWiki vendor is at $GITHUB_WORKSPACE/vendor/
- The same ../../vendor/bin/phpcs path works
Testing:
With everything in place, we should be able to run make bash to get a command line in the container, and composer phpcs or composer phpcbf to run the tools. Plus, fixes made in the container can be viewed/edited/committed from VSCode on the host, thanks to the docker-compose-override.yml filesystem mounts.
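A typical round trip looks like this (the host-side path is hypothetical; use wherever you cloned the extension):
# host: open the working copy and get a shell in the container
code ~/src/CrawlerProtection        # hypothetical clone location
make bash
# container: run the checks against the mounted sources
cd /var/www/html/extensions/CrawlerProtection
composer phpcbf && composer test
# host: review and commit as usual
git status
git commit -am "Fix coding standards violations"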
.phpcs.xml
<?xml version="1.0"?>
<ruleset name="CrawlerProtection">
<description>MediaWiki coding standards for CrawlerProtection extension</description>
<rule ref="../../vendor/mediawiki/mediawiki-codesniffer/MediaWiki">
<exclude name="MediaWiki.Commenting.FunctionComment.MissingDocumentationProtected" />
<exclude name="MediaWiki.Commenting.FunctionComment.MissingDocumentationPublic" />
</rule>
<!-- Paths to check -->
<file>.</file>
<!-- Paths to ignore -->
<exclude-pattern>*/build/*</exclude-pattern>
<exclude-pattern>*/vendor/*</exclude-pattern>
<exclude-pattern>*/node_modules/*</exclude-pattern>
<exclude-pattern>*.phan/*</exclude-pattern>
<exclude-pattern>*/tests/phpunit/stubs.php</exclude-pattern>
<exclude-pattern>*/tests/phpunit/namespaced-stubs.php</exclude-pattern>
<!-- Show progress -->
<arg name="colors"/>
<arg name="encoding" value="UTF-8"/>
<arg name="extensions" value="php"/>
<arg value="sp"/>
<!-- Memory limit -->
<ini name="memory_limit" value="512M"/>
</ruleset>
scripts
Here is the section to add to your extension's composer.json so that it behaves like MediaWiki core for continuous integration.
"scripts": {
"test": [
"@phpcs",
"@phpunit"
],
"phpcs": "../../vendor/bin/phpcs -sp --standard=.phpcs.xml",
"phpcbf": "../../vendor/bin/phpcbf --standard=.phpcs.xml",
"phpunit": "../../vendor/bin/phpunit tests/phpunit"
}