
Abstract Wikipedia/Abstract developer cheatsheet


Welcome to our Wikifunctions development cheatsheet!

Wikifunctions Repositories


The Wikifunctions stack consists of five repositories, currently spread across Gerrit and GitLab:

WikiLambda


WikiLambda is the MediaWiki extension for Wikifunctions and contains:

  • the data persistence layer
  • the front-end Vue interface
  • the interface via APIs to make requests to the function orchestrator backend

WikiLambda is currently hosted on Gerrit: https://gerrit.wikimedia.org/r/admin/repos/mediawiki/extensions/WikiLambda

To clone this repository, follow the instructions in the Gerrit page:

git clone "ssh://gengh@gerrit.wikimedia.org:29418/mediawiki/extensions/WikiLambda"

This repository depends on the function-schemata submodule.

More documentation can be found on:

Continuous Integration for WikiLambda in Gerrit


We have two CI systems for WikiLambda; to merge, code should pass both CI systems:

  • Wikimedia's "legacy" CI, which tests our PHP, JS, etc. code for linting, and runs our unit and integration tests, including those of our platform.
  • DUCT, a custom system for end-to-end testing our code along with our services via our browser tests. (This is planned to be replaced by Catalyst.)

To trigger a re-run, comment recheck on the patch in Gerrit (see "Force Zuul to run tests" below).

Function Orchestrator


Function Orchestrator is a Node service that manages the execution of Function Calls (Z7s). It is the point of interoperation between MediaWiki/WikiLambda and the Function Evaluator, which executes native code in various programming languages.
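
For orientation, a function call in canonical JSON form looks roughly like the sketch below, which calls the built-in Echo function (Z801) with a string argument. Treat this as illustrative only; the authoritative shape is defined by the function model and the schemata:

{
    "Z1K1": "Z7",
    "Z7K1": "Z801",
    "Z801K1": "Hello, world"
}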

The function orchestrator repository is hosted on GitLab: https://gitlab.wikimedia.org/repos/abstract-wiki/wikifunctions/function-orchestrator

To clone this repository, follow the instructions in the README file:

git clone --recurse-submodules git@gitlab.wikimedia.org:repos/abstract-wiki/wikifunctions/function-orchestrator.git

This repository depends on the function-schemata submodule.

More documentation can be found on:

Function Evaluator


The Evaluator service executes user-written 'native' code in a variety of programming languages. The repository consists of the evaluator service and a variety of language-specific executors.

The function evaluator repository is hosted on GitLab: https://gitlab.wikimedia.org/repos/abstract-wiki/wikifunctions/function-evaluator

To clone this repository, follow the instructions in the README file:

git clone --recurse-submodules git@gitlab.wikimedia.org:repos/abstract-wiki/wikifunctions/function-evaluator.git

This repository depends on the function-schemata submodule.

More documentation can be found on:

Function Schemata


This repository is a shared set of JSON schemata for the Wikifunctions project, to achieve a "single version of the truth" on what counts as a structurally valid ZObject. It is used as a git submodule for the function-orchestrator and function-evaluator services, and the WikiLambda MediaWiki extension. The function schemata repository is hosted on GitLab:

https://gitlab.wikimedia.org/repos/abstract-wiki/wikifunctions/function-schemata

To clone this repository, do:

git clone git@gitlab.wikimedia.org:repos/abstract-wiki/wikifunctions/function-schemata.git

To update the repository as submodules of each project, go to each project root directory and run:

git submodule update --init --recursive

WikifunctionsClient


WikifunctionsClient will be the MediaWiki extension for embedding Wikifunctions calls.

WikifunctionsClient is currently hosted on Gerrit: https://gerrit.wikimedia.org/r/admin/repos/mediawiki/extensions/WikifunctionsClient

To clone this repository, follow the instructions in the Gerrit page:

git clone "ssh://gengh@gerrit.wikimedia.org:29418/mediawiki/extensions/WikifunctionsClient"

Gerrit Cheatsheet


(Unfortunately, we use both Gerrit and GitLab because the org-wide effort to fully migrate to GitLab did not finish. A hope for the distant future, perhaps.)

These are the Gerrit and git-review summaries that I created for my own use and reference, but you can find all this information covered more extensively in:

Perhaps the biggest difference here is the flow of pushing code!

Flow for creating and pushing a new patch:

$ git checkout -b mynewfeature
# do stuff and stage
$ git commit
# Write message and description with Bug: <Phabricator task> at the bottom
$ git review
# Maybe do some more modifications on your patch
$ git commit --amend
$ git review

Git will retain the commit history of the current branch when you create a new branch. If you want a "clean" branch, create it from origin/master.

$ git fetch  # to make sure you have the most up-to-date version of master
$ git checkout origin/master
$ git checkout -b my-new-branch

Flow for reviewing or amending a patch:

$ git review -d <gerrit patch ID> # you can see it on the gerrit url of a given patch
# do stuff and stage
$ git commit --amend
# modify the commit message or leave it the same
$ git review

Rebases don't work? Same flow: do a normal rebase, then amend the commit and send it for review:

$ git checkout master
$ git pull origin master
$ git checkout <your branch>
$ git rebase master
$ git commit --amend
$ git review

Other git review stuff to remember:

$ git review -s --verbose  # Setup git review
$ git review -d 643568     # -d change (--download=change) downloads a change ID into a local branch.
$ git review
$ git review -R            # Git review but without performing rebase (--no-rebase)
$ git review -f            # Submit a change for review and close local branch (--finish)

Updating the MediaWiki installation


We often find ourselves needing to update our local environment by reinstalling virtually everything, so we routinely come back here to execute the flow below.

(Environment installation instructions from https://gerrit.wikimedia.org/g/mediawiki/core/+/HEAD/DEVELOPERS.md)

  • Go to mediawiki core and do git pull
  • This does not update the submodules directly, and you shouldn't update them if you don't want to lose your extension changes. You might need to update your skin version once in a while; for that, go to skins/Vector and do git pull
  • Remove directory cache/sqlite
  • Rename LocalSettings.php (to keep a backup of your settings)
  • Run docker compose up -d
  • Run docker compose exec mediawiki composer update
  • Run the installation script docker compose exec mediawiki /bin/bash /docker/install.sh
  • Copy your personal changes to the newly generated LocalSettings.php
  • Run docker compose exec mediawiki php maintenance/run.php update
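
For convenience, here is the same flow as a copy-pasteable sketch, assuming the MediaWiki core checkout as the current directory and LocalSettings.php.bak as the backup name:

# from the mediawiki core directory
git pull
( cd skins/Vector && git pull )  # only when the skin needs updating
rm -rf cache/sqlite
mv LocalSettings.php LocalSettings.php.bak
docker compose up -d
docker compose exec mediawiki composer update
docker compose exec mediawiki /bin/bash /docker/install.sh
# copy your personal changes from LocalSettings.php.bak into the new LocalSettings.php
docker compose exec mediawiki php maintenance/run.php update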

Rights and Privileges


As you all probably know, we have landed the first version of user rights and privileges. There are two user groups with different degrees of authority, so WikiLambda thinks twice before saving edits.

The WikiLambda user groups are:

  • functioneers: they are contributors to functions.
  • functionmaintainers: the function keepers, very smart, we can trust them.

To develop and test locally as we've done until now, we have to add our user to both groups. To give your Admin user the special rights for creating and editing ZObjects, run:

$ php maintenance/run.php createAndPromote --custom-groups functioneer,functionmaintainer --force Admin
$ # or for docker compose environment:
$ docker compose exec mediawiki php maintenance/run.php createAndPromote --custom-groups functioneer,functionmaintainer --force Admin

You can also edit your own user rights using the Special:UserRights page in your localhost installation: http://localhost:8080/wiki/Special:UserRights

Wikifunctions development workflow tips


WikiLambda starting guide


https://www.mediawiki.org/wiki/MediaWiki-Docker/Extension/WikiLambda

Vue Devtools


To work with our Vue (front-end) code, it's useful to employ the Vue Devtools. To help ensure smooth functioning of the Vue Devtools, add the following lines to mediawiki/LocalSettings.php:

$wgVueDevelopmentMode = true;
require "$IP/includes/DevelopmentSettings.php";

And set the following to provide source maps so that files can be understood even though they're minified:

$wgResourceLoaderEnableSourceMapLinks = true;

It can also be really helpful to check Disable cache under the Devtools Network tab.

Auxiliary extensions


You may run into errors saying that you are missing an installation of a MediaWiki extension, such as WikimediaMessages, EventLogging, UniversalLanguageSelector,...

At the moment, it seems that UniversalLanguageSelector is the extension we need to install manually. Note that it should live alongside your WikiLambda repo (in /mediawiki/extensions/).

1. Clone said extension into that directory (a sample clone command is shown below)

2. Add the line below to your LocalSettings.php (also mentioned in the starting guide linked above)

wfLoadExtension( 'UniversalLanguageSelector' );
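
For step 1, a sample clone command, assuming the standard anonymous Gerrit clone URL and mediawiki/extensions/ as the current directory:

git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/UniversalLanguageSelector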

Set up mobile front-end in Dev environment


Download the MobileFrontend extension and MinervaNeue skin, and add them to LocalSettings.php with appropriate config (sample clone commands below):

wfLoadExtension( 'MobileFrontend' );
wfLoadSkin( 'MinervaNeue' );
$wgDefaultMobileSkin = 'minerva';
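
If you don't have them yet, here is a sketch of fetching both, assuming the standard anonymous Gerrit clone URLs and the MediaWiki core directory as the current directory:

git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/MobileFrontend extensions/MobileFrontend
git clone https://gerrit.wikimedia.org/r/mediawiki/skins/MinervaNeue skins/MinervaNeue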

Unminified Front-End JS code


It can also be quite useful to debug using the unminified JavaScript source code on the front end. To do so, just add &debug=true to the URL and check your browser's JavaScript debugger again.
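
For example, on a local Docker installation (use ? instead of & when the URL has no query string yet):

http://localhost:8080/wiki/Main_Page?debug=true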

Git Submodules

git submodule update --init --recursive

Updating Submodules


Once there are new changes in function-schemata, it is convenient to keep the other projects updated to the schemata's most recent version. To do so, there is a convenience script in all three projects (WikiLambda, function-orchestrator and function-evaluator). To do a function-schemata pull-through, run the following commands from the updated master branch of each of these repositories:

$ ./bin/updateSubmodule.sh
$ git review

This will generate a patch for each repo with a schemata pull-through to the latest version in master and a commit summary containing details of every new update that the new schemata version includes.

WikiLambda PHPunit tests


About PHP unit testing: https://www.mediawiki.org/wiki/Manual:PHP_unit_testing/Running_the_tests

# mediawiki should be current directory for all these commands

# Run all test suite using local settings:
$ docker compose exec mediawiki composer phpunit:entrypoint

# Run WikiLambda tests:
$ docker compose exec mediawiki composer phpunit:entrypoint extensions/WikiLambda/tests/phpunit/

# Run a particular test file:
$ docker compose exec mediawiki composer phpunit:entrypoint extensions/WikiLambda/tests/phpunit/integration/ZObjectStoreTest.php

# Filter by test class or method name:
$ docker compose exec mediawiki composer phpunit:entrypoint extensions/WikiLambda -- --filter "ZObjectAuthorization|ZObjectStore"

# Run a particular test method:
$ docker compose exec mediawiki composer phpunit:entrypoint -- extensions/WikiLambda/tests/phpunit/integration/API/ApiFunctionCallTest.php --filter testExecuteSuccessfulViaBetaCluster

# Run tests and generate coverage in HTML
$ docker compose exec mediawiki composer phpunit:entrypoint --coverage-html="./docs/coverage" extensions/WikiLambda/tests/phpunit/ --whitelist="./extensions/WikiLambda"

# Run tests and generate coverage in txt (and print it)
$ docker compose exec mediawiki composer phpunit:entrypoint --coverage-text="./docs/coverage.txt" extensions/WikiLambda/tests/phpunit/ --whitelist="./extensions/WikiLambda" ; cat ./docs/coverage.txt

PHP linting

# Apply automatic fixes
docker compose exec mediawiki composer fix extensions/WikiLambda/

# Check other PHP CodeSniffer errors (fix manually)
docker compose exec mediawiki composer phpcs extensions/WikiLambda/

Or, from inside your WikiLambda check-out:

  1. Run the standard CI linting tests locally

docker run --rm -it -u "$(id -u):$(id -g)" -v "$PWD/.git:/src/.git:ro" -v "$PWD/:/src" -w /src docker-registry.wikimedia.org/releng/composer-php81:latest test

  2. Try to automatically fix any linting issues

docker run --rm -it -u "$(id -u):$(id -g)" -v "$PWD/.git:/src/.git:ro" -v "$PWD/:/src" -w /src docker-registry.wikimedia.org/releng/composer-php81:latest fix

  3. Run the Phan CI linting tests locally (on PHP 7.4, as Phan has issues on 8.1)

docker run --rm -it --volume "$PWD/../..":/src -v "$PWD/.git:/src/.git:ro" docker-registry.wikimedia.org/releng/composer-php74:latest --working-dir=/src/extensions/WikiLambda phan

WikiLambda e2e tests


To run the Selenium tests, do the following from the WikiLambda extension directory:

# From the WikiLambda directory, source the .env file
# from the mediawiki docker compose environment
$ source ../../.env

# Unset the DISPLAY environment variable
$ unset DISPLAY
# Or set it starting with a colon if you want to trigger ffmpeg recording
$ export DISPLAY=:1

# And run the tests
$ npm run browser-test

# To run a particular test file
$ clear; npm run browser-test -- --spec tests/selenium/specs/function.js

# Or to run a specific test by test name
$ clear; npm run browser-test -- --spec tests/selenium/specs/function.js --mochaOpts.grep CUJ1

For more scripts on how to run specific tests, see the README.md file in the tests/selenium folder

To get a more detailed debugging log, edit the file "tests/selenium/wdio.conf.js" and uncomment the line "logLevel: 'info'"
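
For instance, a quick one-liner (assuming the line is commented exactly as // logLevel: 'info'):

sed -i "s|// logLevel: 'info'|logLevel: 'info'|" tests/selenium/wdio.conf.js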

WikiLambda front-end tests


(Avoid using `npm install` because it changes state; use `npm ci` instead!)

# With mediawiki/extensions/WikiLambda as current directory
# Install npm dependencies
npm ci

# Run tests
npm test

# Run linter
npm run lint:fix

# Run unit tests
npm run test:unit

# Run a particular test file
clear; npm run test:unit tests/jest/components/widgets/FunctionEvaluator.test.js 

# Run one particular test
clear; npm run test:unit tests/jest/store/modules/zobject.test.js -- -t 'currentZFunctionInvalidInputs'
clear; npm run test:unit -- -t 'currentZFunctionInvalidInputs'

Function-schemata tests

# Install npm dependencies
npm ci

# Run tests
npm test

To run the tests without the linting step:

npm run test:nolint

If you want to run a specific test, you can do:

npm run test:nolint -- -f <substring matching test name>

# For example:
npm run test:nolint -- -f "canonical lists"


If you want to run a specific test suite, you can do:

npm run test:unit tests/jest/integration/EditFunction.test.js

npm run test:unit tests/jest/store/modules/zobject.test.js

Other entry points and composer commands


Collected by James: https://phabricator.wikimedia.org/T285275

Debugging and logs


Logs are written to the default directory ./cache (mw-error.log, mw-dberror.log, etc.). You can write to these logs using the default groups (error, exception, …) with wfDebugLog:

wfDebugLog("error", "Here's some error information about the variable $var" );

You can also create your custom log groups following https://www.mediawiki.org/wiki/Manual:How_to_debug#Creating_custom_log_groups

Debugging and logs in NodeJS services


Logging messages in function-orchestrator, function-evaluator and function-schemata can be done simply using console.log or console.error. To see the messages, you must rebuild the project with Blubber and reinitialize the Docker containers.

For example, after adding console.log statements in function-orchestrator or in its submodule function-schemata, run in the function-orchestrator root directory:

# For function orchestrator
docker build -f .pipeline/blubber.yaml --target development -t local-orchestrator .

# For function evaluator (javascript)
docker build -f .pipeline/blubber.yaml --target development-javascript-all-wasm -t local-evaluator-js .

# For function evaluator (python)
docker build -f .pipeline/blubber.yaml --target development-python3-all-wasm -t local-evaluator-py .

You can save the above commands as a shell script, since you might use them dozens of times a day :).

And once built, restart your MediaWiki Docker containers:

docker compose up -d

Use docker compose logs (or docker-compose logs if you are using Docker Compose v1) to view the logs:

# Show all container logs and follow
docker compose logs -f

# Show only function-orchestrator logs and follow
docker compose logs function-orchestrator -f

# Alternative logs command: runs from any directory with cleaner output, but is less comprehensive.
docker logs mediawiki-function-orchestrator-1

To log exceptions from the python executor (function-evaluator/executors/python3/executor.py):

import logging
logging.exception("this is some error")

Then, in function-evaluator/executor-classes/Python3ExecutorBase.js, in the callback for this.childProcess_.stderr.on( 'data', ... ), there is an else clause which should be populated with console.log( data );.

Finally, as before, rebuild the function-evaluator with blubber and reinitialize the MediaWiki docker compose, then view the logs for mediawiki-function-evaluator-1.

Testing NodeJS Services


function-orchestrator


The easiest way I have found is to install the orchestrator locally:

# from function-orchestrator directory
npm install

# make mocha easy to find (this can go in .bashrc or OS equivalent)
alias mocha='./node_modules/mocha/bin/mocha'

# run tests (you can filter like "mocha -g 'substring of test name'")
mocha

Note: if "mocha" gives a "Cannot find module" error, run the tests using "npm run test". (This "test" script, which also runs lint, is defined in package.json. "npm run" appends a few more directories onto $PATH.)

IMPORTANT: You need to stop the Docker containers for these tests to run properly:

docker-compose down

To run only one test, you can use the flag -g TEXT, where TEXT is a string matched against the names of the describe(name, callback) and it(name, callback) calls:

# If you are running tests with npm:
npm run test:nolint -- -g 'TEXT'

# If you are running tests directly with mocha:
mocha -g 'TEXT'

function-evaluator


It's best to test using the Docker images. First, I have a shell command build-run-blubber <Blubber variant> <Docker image name>:

#!/bin/bash
# place in /usr/local/bin/build-run-blubber or elsewhere on $PATH

set -e

if [ -z "$2" ]; then
    echo "Please invoke as $ build-run-blubber <Blubber variant> <image name>."
    exit 1
fi

docker build -f .pipeline/blubber.yaml --target "$1" -t "$2" . && \
    docker run "$2"

The function-evaluator has four test variants which run in CI; they can be run as follows:

# run these commands from function-evaluator directory
# run the full test suite for the Node services, including eslint for JavaScript files
build-run-blubber test-python3-all-evaluator testpyall
build-run-blubber test-python3-all-wasm-evaluator testwhyall
build-run-blubber test-javascript-all-evaluator testjsall
build-run-blubber test-javascript-all-wasm-evaluator testjswall

# run the tests for the Python executor
build-run-blubber test-python3-executor testpy

# run the tests for the JS executor
build-run-blubber test-javascript-executor testjs

# run format testing for python files
build-run-blubber format-python3 formatpy

carbon-neutral, packet-negative Green Anarchist coder life


Every time you run Blubber, you are calling out to an external service. This is 1) wasteful and 2) not a good thing to depend on if you intend to write code e.g. while traveling. For that reason, I recommend saving all generated Dockerfiles locally (e.g. blubber .pipeline/blubber.yaml somevariant > SOMEVARIANT.DOCKER) and using SOMEVARIANT.DOCKER in the above commands.

For example, instead of

blubber .pipeline/blubber.yaml development | docker build -t local-orchestrator -f - .

you can save a Dockerfile as above and then run

docker build -t local-orchestrator -f SOMEVARIANT.DOCKER .


Maintenance scripts


Reload built-in data


function-schemata/data/definitions contains all the JSON files for the canonical built-in ZObjects that a blank Wikifunctions installation must have. When following the general installation instructions, you will be asked to run the MediaWiki maintenance/run.php update script, which loads all the built-in data into the database if it is not there yet. However, the update script will not overwrite any existing data.

If you need to restore all the built-in files to their original state (for example, after changes in the function model that update all the data definitions), you can do a totally blank installation (as explained here).

If you wish to push the built-in data again (partially or forcefully), or merge the built-in data changes with the already stored data, you will need to use the loadPreDefinedObjects.php maintenance script.

Follow the loadPreDefinedObjects.php script's detailed usage documentation.
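
As a starting point, all MediaWiki maintenance scripts accept --help; here is a hedged sketch for the Docker environment (the script's path under extensions/WikiLambda/maintenance/ is an assumption; see its documentation for the actual options):

docker compose exec mediawiki php extensions/WikiLambda/maintenance/loadPreDefinedObjects.php --help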

Load production dump


Sometimes it might be necessary to replicate locally the current state of the Wikifunctions.org production database. This can be useful for data debugging purposes, analytics, etc. To do this, you will first need to download all the ZObjects in production, and then use the loadJsonDump.php maintenance script to push them into your local database.

To download the ZObjects in production, use the wikifunctions-content-download script. It can be used for a one-time download of the whole dump of objects, but it can also be run periodically to keep a local copy updated. A one-time download takes quite a while (~30 minutes for ~10,000 objects), so it is a good idea to keep a local copy regularly updated.

1. Download wikifunctions-content-download and follow the guide in its README.md file.

2. Follow the loadJsonDump.php script's detailed usage documentation.

Supporting Contributors


Allow list for new contributors


To prevent a user from uploading malicious code that can be executed by the CI servers, code contributors need to be added to an Allow list before their patches get executed by the pipelines. You can read more info in the MediaWiki Continuous integration documentation.

To add a new user to the allow list, add the user's primary Gerrit e-mail address and push a patch like this one adding our dearest contributor Lindsay. Then you can ping the RelEng team via the #wikimedia-releng IRC channel.

Force Zuul to run tests


If you are on the list, you can force Zuul to run all tests on a patchset by adding a comment beginning with the word recheck in Gerrit.

