On Wed, Jun 7, 2017 at 2:29 PM, Brion Vibber <bvibber@wikimedia.org> wrote:
On Wed, Jun 7, 2017 at 10:18 AM, Joaquin Oltra Hernandez <jhernandez@wikimedia.org> wrote:
*Context*
We'd like to have a build script/process for an extension so that we can run certain commands to install dependencies and perform optimizations on the extension's sources, for example on front-end sources.
Some examples could be (see the sketch after this list):
- Installing libraries from Bower or npm and bundling them into the resources folder
- Applying post-processing steps to CSS with something like PostCSS
- Optimizing images
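As a rough sketch, such a build script might look like the following. The tool choices (npm, the PostCSS CLI, optipng) and all the paths are illustrative assumptions, not an existing convention:

    #!/bin/sh
    # Hypothetical extension build script; tool names and paths are
    # assumptions for illustration only.
    set -e

    # Install front-end dependencies declared in package.json.
    npm install

    # Bundle an installed library into the resources folder.
    mkdir -p resources/lib
    cp node_modules/somelib/dist/somelib.js resources/lib/

    # Post-process CSS (here: autoprefixing) with the PostCSS CLI.
    ./node_modules/.bin/postcss resources/src/*.css --use autoprefixer --dir resources/dist

    # Optimize images in place.
    optipng -quiet resources/images/*.png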
We are aware of other projects that have build processes for producing deployables, though not extensions. Such projects deal with this in different ways. A common one is having a repository called <Project>/deploy into which you pull <Project> and run the build scripts; that is the repository that gets deployed.
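For illustration, that pattern might be driven by something like this (repository URLs and the build entry point are placeholders):

    #!/bin/sh
    # Sketch of the <Project>/deploy pattern: the built artifacts live
    # in a separate repository, and that repository is what gets deployed.
    set -e
    git clone https://git.example.org/Project/deploy.git
    cd deploy
    git pull https://git.example.org/Project.git master  # bring in the latest sources
    ./build.sh                                           # run the build scripts
    git add -A
    git commit -m "Build Project for deployment"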
*Current system*
The current way we usually do this (when we do it at all) is to run those build scripts/jobs on developers' machines and commit the output to the git repository on master.
With this system, if you don't enforce anything in CI, build processes may be skipped (human error).
If you do enforce it (by running the process in CI and comparing the result with what has been committed), then patches merged to master that touch the same files will produce merge conflicts with existing open patches, forcing a rebase and rebuild of every open patch each time one is merged to master.
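The enforcement check described above could look roughly like this in a CI job ("npm run build" is an assumed entry point, not a shared convention):

    #!/bin/sh
    # CI check: re-run the build and fail if the output differs from
    # the artifacts committed with the patch.
    set -e
    npm install
    npm run build
    # Exits non-zero (failing the job) if rebuilt files differ from HEAD.
    git diff --exit-code -- resources/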
*Questions*
Can we have a shared configuration/convention/system for having a build step in MediaWiki extensions?
- So that a build process is run
  - on CI jobs that require production assets, like the Selenium jobs
  - on the deployment job that deploys the extension to the beta cluster and to production
What would it look like? Are any extensions doing a pre-deployment build step?
For JS dependencies, image optimization, etc., the state of the art still seems to be a local one-off script, with the build artifacts committed into the repo. (For instance, TimedMediaHandler fetches some JS libs via npm and copies/patches them into the resources/ dir.)
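A stripped-down version of that kind of one-off script might look like this; the library name, paths, and patch file are made up for the example and don't match TimedMediaHandler's actual script:

    #!/bin/sh
    # One-off vendoring script: fetch JS libs with npm, then copy and
    # patch them into resources/. All names here are illustrative.
    set -e
    npm install
    mkdir -p resources/lib
    cp node_modules/somelib/dist/somelib.js resources/lib/
    # Apply any local modifications kept as a patch file.
    patch resources/lib/somelib.js < patches/somelib.patch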
For PHP deps we've got composer dependency installation for extensions, so it seems like there's an opportunity to do other build steps at this stage...
Not sure offhand whether that can be snuck into composer directly, or whether we'd need to replace the "run composer" step with "run this script, which runs composer and also does other build steps".
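The wrapper-script option could be a thin shell script along these lines (a sketch; "npm run build" is an assumed front-end entry point, and Composer's "scripts" hooks in composer.json might cover the sneak-it-into-composer variant):

    #!/bin/sh
    # Hypothetical replacement for the bare "run composer" step:
    # run composer, then the other build steps.
    set -e
    composer install --no-dev --optimize-autoloader
    npm install
    npm run build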
When I first joined the Foundation and started working with MediaWiki on a daily basis, I wondered about the lack of a build process. At past jobs I had built PHP application environments that had a "run from version control" mode for local development, but always included a build step for packaging and deployment that did the sort of things Joaquin is talking about. When I was in the Java world, Ant and later Maven 2 were the tools of choice for this work. Later, at a PHP shop, I selected Phing as the build tool and even committed some enhancements upstream to make it work more nicely with the type of projects I was managing.
I helped get Composer into MediaWiki core, and that added a post-deploy build step for MediaWiki, but one that is pretty limited in what it can do easily. Composer is mostly a tool for installing PHP library dependencies; most of the attempts I have seen to make it do things beyond that are clunky uses of the tool. I can certainly still see the possible benefit of having a full-fledged build step for core, skins, and extensions. It is something that should be thought about a bit before diving into an implementation, though. One thing to consider is whether what would work best is a packaging step that produces a tarball or similar artifact that can be dropped into a MediaWiki runtime environment, or instead a unified post-deploy build step that operates across MediaWiki core and the entire collection of optional extensions and skins deployed to create a particular wiki.
The Foundation's production deployment use case will always be an anomaly. It should be considered, but really in my opinion only to ensure that nothing absolutely requires external network access in the final build. For Composer this turned out to be as easy as maintaining a submodule with all the vendored libraries included.
The two main use cases to consider for build tooling are (in this order) 3rd party deployers of MediaWiki and local developers. 3rd party users are the most important because they are the largest group of people who will be impacted by tooling changes. In an ideal world, all or most of the changes could be hidden behind ExtensionDistributor or similar tooling that makes it easy to create a download-and-run tarball.
One of the awesome features of working on a PHP codebase is the quick cycle of making a change and seeing it live in your test environment. Today that is mostly a matter of saving an edit and hitting refresh in a browser. It would be sad to lose that, so whatever build system is devised should also provide a path that allows a git clone to be a viable wiki. This runtime doesn't need to be the best that the wiki could be, however. It's usually OK if a local dev environment needs to do a bit more work than a prod deployment would to gather l10n resources and run other dynamic processes that would be expected to be baked into artifacts for a production deployment.
$0.02 USD, Bryan