
Grants:IEG/Lua libs for behavior-driven development

From Meta, a Wikimedia project coordination wiki
status: selected
summary: Finalize the necessary libs for testing of Lua modules in a behavior-driven development style, using spec-like tests.
target: All Wikimedia projects that use Lua modules.
strategic priority: improve content quality (code quality)
amount: Total amount requested is USD 12,400
grantee: Jeblad
contact: jeblad(_AT_)gmail.com
created on: 09:08, 5 April 2016 (UTC)

Project idea


What is the problem you're trying to solve?


When a user tries to develop a module on a Wikimedia project, s/he has to do so in a very terse environment. There is no debugger and only a very limited console. The developer ends up developing by trial and error, coding is slow, and code quality suffers. After a while most developers start adding lots of output statements to see what is going on, cluttering the code, and adding lots of comments that try to describe what the code attempts to do and how it does it.

The limited development environment makes it hard to develop quickly, with confidence, and with few defects (bugs). Instead the process becomes slow, both the developer and the community lack confidence in the result, and there are often hidden defects.

A common workaround is to add tests, but we have none of the standard test harnesses (automated testing frameworks). We use a few rather non-standard test libs for Lua, which makes it cumbersome to import and test external libs such as Roland Yonaba's Moses (table) and Allen (string) libraries in the Wikimedia universe. Those libs are written in Lua and inspired by Underscore, and both come with spec-like tests.

What is your solution?


To write good code, in Lua or any other language, it is important to verify that the code implements the expected behavior. To do that the programmer resorts to testing, prodding in a console, and using a debugger. In our Lua environment the debugger is defunct (it is turned off), but the console works, and we have a few non-standard test libs.

Testing should be an integral part of our on-line support for Lua libs, and it should be very easy to set up and use. Ideally tests should follow a standard-ish pattern, at least to the point where a user can get a grip on how they are expected to work.

There are a few different test patterns in use today, but the patterns from RSpec in particular have gained popularity within behavior-driven development. By supporting those patterns we could create (and reuse) readable tests that are easy to maintain.
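
As an illustration, a spec-like test in the RSpec pattern could look roughly like the self-contained sketch below. The `describe`/`it`/`expect` helpers here are minimal stand-ins written for this example; they are not the API of the proposed extension.

```lua
-- Minimal, self-contained sketch of a spec-style (RSpec-like) test runner.
-- All helper names are illustrative, not the proposed extension's API.

local results = {}

local function describe(label, fn)
  -- a real framework would group and report the examples under `label`
  fn()
end

local function it(label, fn)
  local ok, err = pcall(fn)
  results[#results + 1] = { label = label, ok = ok, err = err }
end

local function expect(actual)
  return {
    toBe = function(expected)
      if actual ~= expected then
        error(string.format('expected %s, got %s',
          tostring(expected), tostring(actual)), 2)
      end
    end
  }
end

describe('string.upper', function()
  it('uppercases ASCII letters', function()
    expect(string.upper('lua')).toBe('LUA')
  end)
end)

for _, r in ipairs(results) do
  print((r.ok and 'pass: ' or 'fail: ') .. r.label)
end
```

The point of the pattern is that the test reads as a behavior description ("string.upper uppercases ASCII letters"), which is what makes spec-style tests easy to review and maintain.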

A good testing environment should not only produce a report after a lengthy test process; it should also be possible to run the tests interactively. That is the core idea behind test-driven development. By making the cycle very tight we will no longer be so dependent on a working debugger.

The tests should not only show failures on interactive runs; they should also trigger tracking categories so that failing Lua modules can be identified. It is too inefficient to wait for someone to report an article with a failing module; failing modules should trigger warnings at once.

Proof of concept


I have made a module (w:no:Spesial:PrefiksIndeks/User:Jeblad/Module:BDD, about to be moved to w:en:Special:PrefixIndex/Module:BDD, with docs at mw:Help:Spec) as a proof of concept for how to do online spec-like testing in MediaWiki. The module partly uses ideas from Busted [1] and partly from RSpec [2]. In its present state it almost makes it possible to test the Allen and Moses libs in our on-line environment.

The existing libs are partially done, but are not ready for production. They need cleanup/refactoring and documentation, should support multiple test "files", should be easier to rerun during development, and should integrate better with the documentation.

The BDD module started as a local project on nowiki, and not as a project on mediawiki.org as would be more natural for an extension. That is also the reason why it avoids Lua functions that could otherwise be interesting to use in an extension. On the positive side this makes the present module "safe", as it only uses functionality already deemed safe.

Note that the existing code does not run at its present location; the paths are wrong, etc. The present code also uses the "expect" style from RSpec, while Busted uses the "assert" style.
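
To illustrate the difference between the two styles, the sketch below contrasts them with minimal stand-in helpers; neither helper mirrors the exact API of the BDD module or of Busted.

```lua
-- Sketch contrasting the two assertion styles named above.
-- Both helpers are illustrative stand-ins, not the real BDD-module
-- or Busted APIs.

local function add(a, b) return a + b end

-- RSpec-like "expect" style: the subject comes first, then a matcher.
local function expect(actual)
  return {
    toBe = function(expected)
      assert(actual == expected,
        'expected ' .. tostring(expected) .. ', got ' .. tostring(actual))
    end
  }
end
expect(add(1, 2)).toBe(3)

-- Busted-like "assert" style: chained matchers on an assert object;
-- here `are` stands in for Busted's `assert.are`.
local are = {
  equal = function(expected, actual)
    assert(expected == actual, 'values differ')
  end
}
are.equal(3, add(1, 2))
```

The expect style reads subject-first, which matches how the behavior is described in the spec text; the assert style is closer to traditional xUnit assertions.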

Project goals


Make it easier for an on-wiki user to develop code reasonably fast and with confidence, increase the overall code quality, and make it possible to maintain a minimum code quality for our Lua modules. As such, this project is about building a testing environment for the developers of those modules.

By tracking the state of the tests, and which modules have tests, the community will start to expect modules to have them, and then expect them to go green. That will make the community more confident that a module is in fact of good quality.

The hidden goal is that developers and users should want to test their code, unlike the current situation where major libs have no tests at all. It must be worthwhile for the developer to add those extra lines of code to make a test fixture, because the development process goes faster and s/he can be more confident in the code.

Risks

Developers of Lua-code might not accept new solutions
This is probably my biggest concern. It is often rephrased as "not invented here". To avoid this it could be an idea to make some showcases that directly compare different ways to write tests. One thing in favor of the proposed solution is that the number of tests is still pretty low on most projects.
The new code might not be finished within time
Always a possibility, and I'm very good at overestimating my own work progress. It seems to me that most of this should be doable within the given time.
The new code might pose a security risk
Always a possibility, but the real security risk is within the PHP extension code, which will be limited. The Lua-code itself should pose no more risk than existing code. There will also be a limited security risk within the Javascript code.
The new code might create too high a load on the system
The libs as proposed will test individual modules, not uses of the modules on individual pages, so the load should be limited. Like any code, a test could be unbounded, but since the runtime for a single invocation is limited, the overall runtime for a limited number of modules should be bounded. The biggest problem is probably if someone uses a bot to create tests for a lot of configured instances of a module instead of testing the module itself.

Project plan


Activities

Functional
  1. Create proper entries at the statistics page for source modules and their test modules
    • this must be done to be able to get valid indicators about the impact of later activities
    • indicator for done is existence of subentries at the special page
    • it could be necessary to implement a simplified initial solution
  2. Create a bare minimum portal about Spec-style testing on Mediawiki
    • this should be done to get sufficient user satisfaction
    • this is partially done already as Spec portal, but must be adjusted and extended as the extension is finalized
    • post a note about the existence of the portal on several wikis technical forums (done)
    • indicator for partially done is the existence of the portal; by that measure it is already partially done
  3. Create a bare minimum extension for Spec-style testing
    • this must be done to have a working solution
    • when it's done, post a note on non-technical forums
    • indicator for done is that the tests in Hello World run
  4. Extend the extension with test doubles
    • this should be done as it is assumed to be important for ordinary development
    • when it's done, post a note on technical forums
    • indicator for done is that the tests in Signature run
  5. Extend the extension with spies
    • this should be done as it is assumed to be important for ordinary development
    • when it's done, post a note on technical forums
    • indicator for done is that the tests in Timing run
  6. Extend the extension with coverage of public interfaces
    • this could be done as it is assumed to be important for confidence-building in the community
    • when it's done, post a note on technical forums
    • indicator for done is not clear at this point, it could be as simple as existence of an indicator
    • it is important that this is only on public interfaces, as it will otherwise be a hard problem
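
As a rough illustration of what activities 4 and 5 would provide, the sketch below shows a hand-rolled test double and spy for a signature-like module; all names here are hypothetical and do not come from the planned extension.

```lua
-- Hypothetical sketch of the two facilities named in activities 4 and 5:
-- a test double that replaces a dependency, and a spy that records calls.
-- None of these helper names come from the actual extension.

-- A spy wraps a function and records every call made to it.
local function makeSpy(fn)
  local spy = { calls = {} }
  spy.wrapped = function(...)
    spy.calls[#spy.calls + 1] = { ... }
    return fn(...)
  end
  return spy
end

-- Module under test: formats a signature using a clock dependency.
local function makeSigner(clock)
  return function(name)
    return name .. ' ' .. clock()
  end
end

-- A double replaces the real clock so the test output is deterministic.
local fixedClock = function() return '12:00' end

-- The spy wraps the double so the test can also verify it was called.
local spy = makeSpy(fixedClock)
local sign = makeSigner(spy.wrapped)

print(sign('Jeblad'))   --> Jeblad 12:00
print(#spy.calls)       --> 1
```

With a double the test no longer depends on the actual time, and with the spy it can assert not just the output but also that the dependency was exercised as expected.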
Socializing
  1. Maintain a list of non-techie updates at Rollout
  2. At startup, add a note about the existence of the Spec project at the biggest Wikimedia projects' community portals
  3. Mark introduction and core pages at Spec portal for translation as early as possible
  4. Write a monthly newsletter for the wikitech mailing list, mostly about technical progress
  5. Write a non-techie newsletter for the community portals when the extension has the first on-wiki running tests for the examples

Budget


The estimated workload is about 3 full-time person-months for an experienced developer; or 6 person-months at 50 %. This workload estimation is based on the main developer's previous experience with similar projects.

Budget breakdown

Item            Description                             Commitment        Person-months  Cost
Main developer  Developing and releasing proposed code  Part time (50 %)  6              USD 12,400
Total                                                                                    USD 12,400

There is no co-funding.

The item costs are computed as follows: the main developer's gross salary (including 35 % Norwegian income tax) is estimated from pay given to similar projects using Norwegian standard salaries,[1] given the current exchange rate of 1 NOK = 0.120649 USD, and a quarter of a year's full-time work.

Community engagement


Other than code review, the community in general is not expected to participate much in the initial development up to the baseline, that is #Activities point 3. It will, however, be possible for other developers to provide patches for the published code.

It is expected that feedback on the very limited UI will be necessary, as will help with translation of system messages. The messages are quite simple, even if they use a patchwork approach. Translation of pages at the Spec portal is already under way as of May 2016.

A few examples will be made, but those are close to the bare minimum. If time permits, tests could be written for a few additional core modules. That would be very interesting, as it would enable the community to test other modules, and it would also make it possible to get feedback at an early stage on what needs clarification.

Sustainability


The code will be available on-wiki, and possibly also in the code repo if an extension is necessary. Because of this the code is assumed to be maintained by the community.

Code will be developed with re-usability and maintainability in mind. Thus the code will be documented, and manuals, initial tutorials, and examples will be made available at the Spec portal.

Measures of success

  1. Extension is finalized and functional at the repo within six (6) months
    • the Hello World example has functional tests, that is #Activities point 3 (this indicates a working solution)
    • the Signature example has functional tests, that is #Activities point 4 (this indicates completion of development)
    • the Timing example has functional tests, that is #Activities point 5 (this is a minimum to do effective testing)
  2. Extension is available and functional on-wiki within three (3) months after baseline (#Activities point 3) is functional at the repo
    • this step is crucial for the following
  3. After extension is available on-wiki, and within three (3) months, five (5) additional modules shall have Spec-style tests at enwiki
    • this indicates sufficient interest in the solution
  4. After three (3) months, and within next three (3) months, ten (10) additional modules shall have Spec-style tests at enwiki
    • this indicates continued interest in the solution

Get involved


Participants

  • Jeblad – I'm a wikipedian with a cand.sci. in mathematical modeling, and started editing on Wikipedia during the summer of 2005.
  • (needs one or more reviewers and a co-maintainer so the extension isn't abandoned)

Community notification


Please paste links below to where relevant communities have been notified of your proposal, and to any other relevant community discussions. Need notification tips?

Endorsements


Do you think this project should be selected for an Individual Engagement Grant? Please add your name and rationale for endorsing this project below! (Other constructive feedback is welcome on the discussion page).

  • My tech knowledge is close to zero, so I can say nothing of the quality and value of this project. However, this lack of knowledge is why I – for close to a decade – have had to rely on Jeblad on a lot of tech issues at the Norwegian wikis. He is a real senior in that area and has produced a bunch of good solutions to both known and unknown problems. I therefore feel certain that this will be money well spent. Regards, GAD (talk) 14:44, 7 April 2016 (UTC)
  • Seems like this can make it easier to develop modules, so I'm all for it. - Soulkeeper (talk) 15:57, 7 April 2016 (UTC)
  • Jeblad is a very active editor and contributor to the Norwegian wikis, as already mentioned above. Wikimedia Norge endorses this project on making it easier to write good Lua code and increase the overall code quality. WMNOastrid (talk) 13:04, 8 April 2016 (UTC)
  • As Wikimedia's dependence on LUA will become increasingly important, so will good interoperability and well documented, "clean" (i.e. refactored where necessary) code. Given that a transparent, comprehensive project of this kind also enhances overall security and code integrity, I endorse this proposal. Asav (talk) 07:08, 11 April 2016 (UTC)
  • I don't know much about the Lua module system, but I do know that Jeblad has been a very active Wikipedia member for quite some time and is heavily invested in Wikipedia, which I believe is an indicator of success. And good testing is important in dynamically typed programming languages like Lua. This proposal looks well thought-out and has my endorsement. --Unhammer (talk) 16:39, 12 April 2016 (UTC)
  • I support this project, as others signing on my tech knowledge is limited but Jeblad has produced a lot of handy solutions for the versions of Wikipedia in Norwegian. Ulflarsen (talk) 11:35, 14 April 2016 (UTC)
  • I support this project. Lua code will probably tend to become increasingly complex with new use cases, for example Lua infoboxes that pull and process data from Wikidata, generate graphs, and so on. It is now possible to develop on-wiki modules that in the past would have had to be coded in a PHP extension to MediaWiki, which is way easier: just a browser, time, and an internet connection are needed. But with increasing complexity and power come quality problems that need to be addressed in order to find and debug issues. This plan seems to me a very good move in that direction. TomT0m (talk) 18:37, 21 May 2016 (UTC)

References
