Hi there,
I've been playing this morning with gitlab.w.o and CI, and here is my proposal on how to work with it:
* have all gitlab-ci related stuff consolidated into a single repository [0], this includes:
  ** generic gitlab-ci.yaml config files as required [1]
  ** Dockerfiles for the images used in the above gitlab-ci.yaml files [2]
* in each repo that requires CI (likely all), instead of having to re-write the gitlab-ci.yaml file every time, "include" it from the repo above [3] (a minimal sketch follows this list).
* due to upstream docker registry rate limits, we need to do some heavy caching in our docker registry (docker-registry.tools.wmflabs.org), which even involves scp'ing the base docker images from one's laptop (something like [4]; rough sketch after this list), because you can't even pull the base images from docker.io on tools-docker-imagebuilder-01.
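To make the "include" idea concrete, here is a minimal sketch. The shared filename and the image name below are only illustrative (the real paths in [1] and [2] are truncated above); the include syntax itself is standard gitlab-ci:

  # In repos/cloud/cicd/gitlab-ci, a generic config file, e.g. py3.9-tox.yaml (name illustrative):
  test-tox:
    image: docker-registry.tools.wmflabs.org/py39-build:latest   # image name is an assumption
    script:
      - tox

  # In each consumer repo's .gitlab-ci.yml:
  include:
    - project: 'repos/cloud/cicd/gitlab-ci'
      ref: main
      file: 'py3.9-tox.yaml'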
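And a rough, untested sketch of the manual base-image seeding mentioned in the last bullet, along the lines of [4]; the image name and the target repository path are only examples:

  # On a laptop that can reach docker.io:
  docker pull python:3.9-bullseye
  docker save python:3.9-bullseye | gzip > py39.tar.gz
  scp py39.tar.gz tools-docker-imagebuilder-01:

  # On tools-docker-imagebuilder-01 (docker load understands gzipped archives directly):
  docker load < py39.tar.gz
  docker tag python:3.9-bullseye docker-registry.tools.wmflabs.org/python:3.9-bullseye
  docker push docker-registry.tools.wmflabs.org/python:3.9-bullseye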
To see a live demo/example of this, here is a successful tox job for a python 3.9 repository: https://gitlab.wikimedia.org/repos/cloud/toolforge/jobs-framework-api/-/jobs...
I'd prefer that we keep our own CI/CD stuff in a separate repo/docker registry for now. I don't think there is a unified effort for this, unlike in gerrit.
I think this kind of CI stuff is one of the main missing pieces that was previously preventing us from adopting gitlab for good.
PS: there are a bunch of things to automate here, like base image maintenance and such. I'll wait to see if this proposed workflow is something we're interested in.
regards.
[0] https://gitlab.wikimedia.org/repos/cloud/cicd/gitlab-ci
[1] https://gitlab.wikimedia.org/repos/cloud/cicd/gitlab-ci/-/blob/main/py3.9-bu...
[2] https://gitlab.wikimedia.org/repos/cloud/cicd/gitlab-ci/-/blob/main/py3.9-bu...
[3] https://gitlab.wikimedia.org/repos/cloud/toolforge/jobs-framework-api/-/blob...
[4] https://stackoverflow.com/questions/23935141/how-to-copy-docker-images-from-...
+100 for trying to get this going
On 03/25 14:44, Arturo Borrero Gonzalez wrote:
> Hi there,
> I've been playing this morning with gitlab.w.o and CI, and here is my proposal on how to work with it:
> * have all gitlab-ci related stuff consolidated into a single repository [0], this includes:
>   ** generic gitlab-ci.yaml config files as required [1]
>   ** Dockerfiles for the images used in the above gitlab-ci.yaml files [2]
This repository is now under the common cloud group, where volunteers (toolforge roots) don't have merge rights. Should we have one per project? Or is the idea to have several of these gitlab-ci.yaml files, like:
* python39_tox
* golang1_17
That sounds good to me (though it would be nice to make sure the volunteers of the given project are ok with losing merge rights there). Random question: do they have MR rights to propose changes in that cicd repo? (/me is still a bit confused about what the current rights are).
> * in each repo that requires CI (likely all), instead of having to re-write the gitlab-ci.yaml file every time, "include" it from the repo above. [3]
> * due to upstream docker registry rate limits, we need to do some heavy caching in our docker registry (docker-registry.tools.wmflabs.org), which even involves scp'ing the base docker images from one's laptop (something like [4]), because you can't even pull the base images from docker.io on tools-docker-imagebuilder-01.
Hopefully this will become a bit easier once we have Harbor.
On 25/03/2022 15:58, David Caro wrote:
> +100 for trying to get this going
Indeed, thank you!
> On 03/25 14:44, Arturo Borrero Gonzalez wrote:
>> Hi there,
>> I've been playing this morning with gitlab.w.o and CI, and here is my proposal on how to work with it:
>> * have all gitlab-ci related stuff consolidated into a single repository [0], this includes:
>>   ** generic gitlab-ci.yaml config files as required [1]
>>   ** Dockerfiles for the images used in the above gitlab-ci.yaml files [2]
> This repository is now under the common cloud group, where volunteers (toolforge roots) don't have merge rights. Should we have one per project? Or is the idea to have several of these gitlab-ci.yaml files, like:
> * python39_tox
> * golang1_17
> That sounds good to me (though it would be nice to make sure the volunteers of the given project are ok with losing merge rights there). Random question: do they have MR rights to propose changes in that cicd repo? (/me is still a bit confused about what the current rights are).
I currently have 'owner' rights on /repos/cloud/*, and as far as I'm aware we can grant a few different access levels to an individual repository or to a subgroup, which then applies to everything below it. Not as customizable as Gerrit, but it should be good enough for our needs.
Gerrit has CI configuration in a centralized repo (integration/config) and requires production shell access in order to deploy updates; having it in a repo controlled by us is already an improvement.
Everyone can create an MR to any repository they can see by creating a fork, I believe.
>> * in each repo that requires CI (likely all), instead of having to re-write the gitlab-ci.yaml file every time, "include" it from the repo above. [3]
>> * due to upstream docker registry rate limits, we need to do some heavy caching in our docker registry (docker-registry.tools.wmflabs.org), which even involves scp'ing the base docker images from one's laptop (something like [4]), because you can't even pull the base images from docker.io on tools-docker-imagebuilder-01.
> Hopefully this will become a bit easier once we have Harbor.
Caching images on Toolforge registries sounds fine to me. Just note that by default everything available on docker-registry.tools.wmflabs.org is also available for Toolforge Kubernetes users.
On 3/25/22 15:27, Taavi Väänänen wrote:
> On 25/03/2022 15:58, David Caro wrote:
>> Hopefully this will become a bit easier once we have Harbor.
> Caching images on Toolforge registries sounds fine to me. Just note that by default everything available on docker-registry.tools.wmflabs.org is also available for Toolforge Kubernetes users.
It turns out gitlab includes a built-in docker container registry:
https://gitlab.wikimedia.org/help/user/packages/container_registry/index.md
I need to research/PoC this more, but we may not need the tools registry after all.
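For reference, here's a hedged sketch of what a build-and-push job against the built-in registry usually looks like, based on the generic upstream docs; whether docker-in-docker is available depends on how our shared runners are configured, so treat this as untested on gitlab.wikimedia.org:

  # .gitlab-ci.yml job pushing to the instance's built-in registry.
  # $CI_REGISTRY* and $CI_COMMIT_SHORT_SHA are predefined gitlab-ci variables.
  build-image:
    image: docker:20.10
    services:
      - docker:20.10-dind   # assumes the runners allow docker-in-docker
    script:
      - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
      - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
      - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"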
On 3/25/22 14:58, David Caro wrote:
> This repository is now under the common cloud group, where volunteers (toolforge roots) don't have merge rights. Should we have one per project? Or is the idea to have several of these gitlab-ci.yaml files, like:
> * python39_tox
> * golang1_17
Exactly.
As many centrally-managed gitlab-ci.yaml files as we need. One per lang/release combo, and the code required to maintain the associated container images in which we run the tests.
Then each individual repo can override as required.
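As an illustration of the override part: with gitlab-ci includes, a job defined locally with the same name as one in the included file is merged with it, and the local keys win. The filename and variable below are made up for the example:

  # Consumer repo's .gitlab-ci.yml:
  include:
    - project: 'repos/cloud/cicd/gitlab-ci'
      ref: main
      file: 'py3.9-tox.yaml'   # illustrative filename

  # Same job name as in the included file; only the keys given here are overridden,
  # the shared image/stage/etc. are kept.
  test-tox:
    variables:
      TOX_ENV: "py39,flake8"   # hypothetical variable consumed by the shared job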
I've asked previously but haven't got much of an answer: are we going to upset people by moving away from github? Particularly PAWS people?
What would the shared CI stuff look like? I'll describe more on Thursday, but PAWS is now set up to do pretty much everything on a PR. The main reusable thing is the container build/push bits. Would we be trying to coalesce around a standard way of doing that, along with other shared checks and the like?
On 03/25 10:03, Vivian Rook wrote:
> I've asked previously but haven't got much of an answer: are we going to upset people by moving away from github? Particularly PAWS people?
We can always ask, though I would wait until we have a clear idea and a tested process/setup.
> What would the shared CI stuff look like? I'll describe more on Thursday, but PAWS is now set up to do pretty much everything on a PR. The main reusable thing is the container build/push bits.
From what I'm seeing (and guessing), the idea would be to have some skeleton gitlab-ci tasks for common stuff (run tox, run npm ci, docker build, push to toolforge, ...) and then include each of them from the project's gitlab-ci config, overriding whatever specifics the project needs.
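One possible shape for those skeleton tasks, sketched here with made-up names: keep them as hidden jobs (prefixed with a dot) in the central repo, and have each project pull them in with `extends`:

  # Central repo, e.g. templates.yaml (name illustrative):
  .tox:
    image: docker-registry.tools.wmflabs.org/py39-build:latest   # image name is an assumption
    script:
      - tox

  # A project's .gitlab-ci.yml:
  include:
    - project: 'repos/cloud/cicd/gitlab-ci'
      ref: main
      file: 'templates.yaml'

  tests:
    extends: .tox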
> Would we be trying to coalesce around a standard way of doing that, along with other shared checks and the like?
I'd say that would be the biggest benefit, allowing reuse and easing maintenance.
On 3/25/22 15:03, Vivian Rook wrote:
> I've asked previously but haven't got much of an answer: are we going to upset people by moving away from github? Particularly PAWS people?
I think gitlab can cover pretty much all the main reasons we started using github in the first place. The whole gitlab setup is pretty new anyway; I don't think we should rush any migration.
That being said, you have been the person most involved with PAWS recently. Honestly, I think you plus those interested in the community should get to decide what to do with it!