Neon/Builder

Neon uses a Jenkins continuous integration system to build its packages.

The Setup

One of KDE's servers acts as the Neon master and runs a Jenkins instance, a continuous integration site at build.neon.kde.org, with many jobs that build the packages and perform other functions, either on demand or at pre-scheduled intervals.

The code behind build.neon is from pangea-tooling, which also runs the code for DCI (Debian CI), KCI (Kubuntu CI), and MCI (Mobile Neon Plasma CI).

The Jenkins jobs farm off the hard build work to a number of DigitalOcean slave servers. Most jobs run inside a Docker container to give a fresh build environment.

After checking out pangea-tooling, add the submodule for the CI config with git submodule update. This pulls in https://github.com/blue-systems/pangea-conf-projects.git, which contains the files that list the jobs to be created.
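
A minimal sketch of that checkout, assuming a fresh clone (git submodule update needs --init the first time):

git clone https://github.com/blue-systems/pangea-tooling.git
cd pangea-tooling
git submodule update --init   # pulls in pangea-conf-projects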

To use the scripts that access Jenkins you need to create ~/.config/pangea-jenkins.json with the API key, which is available to administrators inside Jenkins under User (top-right menu) -> Configure -> API Key.

https://github.com/blue-systems/pangea-tooling/wiki/Jenkins-Config

The setup of the various machines, which are provided by Blue Systems, is maintained in pangea-kitchen, which uses Chef to set up the servers with all software configured.

For more information see pangea-tooling/Getting-Started.

The Packaging

Our packaging is kept in Git repositories at invent.kde.org/neon; see Neon/Git.

The packaging is for .deb packages and the Git repos contain a single debian/ directory which defines how the .deb is made. We try to keep the packaging in sync with the Debian pkg-kde team's Git repositories and keep the diff with them as small as possible.

Neon/unstable is for Developer Edition Unstable Branches; its packaging is combined with the master branches of the KDE projects.

Neon/stable is for Developer Edition Stable Branches; its packaging is combined with the stable branches of the KDE projects, which are defined in overrides/base.yaml. Stable branches also include branches released as Beta (so the name is not quite logical).

When a project makes a new (non-bugfix) release, you should merge Neon/unstable into Neon/stable and update the stable branch in overrides.
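
A minimal sketch of that merge in a checkout of the packaging repository (updating the stable branch in overrides/base.yaml is a separate edit):

git checkout Neon/stable
git merge Neon/unstable
git push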

Neon/release is for User Edition; the code gets built with release tars.

Neon/release-lts is for User LTS Edition; the code gets built with release tars, except Plasma, which uses the LTS tars.

Neon/mobile is used by the mobile CI and is not available in all repos. This branch has patches applied which are required only for Plasma Mobile.

When moving files between packages in the same source package you can use the version constraint (<< ${source:Version}~ciBuild) for your Breaks/Replaces, where ~ciBuild gets replaced on merge into Neon/release.
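
For illustration, hypothetical Breaks/Replaces fields in debian/control for a package that took over files from libexample-old1 (the package name is made up; the version constraint is the one described above):

Breaks: libexample-old1 (<< ${source:Version}~ciBuild)
Replaces: libexample-old1 (<< ${source:Version}~ciBuild)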

See New Repositories for new packages.

The build overrides file is used to define jobs which need a particular branch or tar to build from.

We don't use debian/changelog files; they just add merge conflicts and we already log changes in the Git history.

The repos are kept in sub-directories which match those the Debian pkg-kde team uses. Ones we add are neon-packaging/ for things we package but don't expect Debian to use, neon/ for distro-specific packages such as neon-settings, and forks/ for repos packaged elsewhere that we want to base on.

The Jobs

The Jenkins jobs are created by running the pangea-tooling script jenkins_jobs_update_nci.rb. This creates some manually specified jobs, such as the ISO jobs, but mostly uses factories to create batches of jobs based on the archives. Use NO_UPDATE=1 to speed up running it by not updating the Git checkouts. As with other scripts it needs the Gem versions provided by Bundler, so run it with bundle exec jenkins_jobs_update_nci.rb.
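
For example, to regenerate the jobs from a pangea-tooling checkout without refreshing the Git checkouts:

NO_UPDATE=1 bundle exec jenkins_jobs_update_nci.rb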

The YAML files in pangea-conf-projects define what jobs get created.

For each package there is a parent MultiJob which runs some sub-jobs.

parent job is set to check out the relevant repository from KDE Git as source/ (for Developer Editions), then check out the relevant packaging repository from KDE neon Git as packaging/. It then runs a number of child jobs...

src will create the source package. For User Edition this means running uscan to use the debian/watch file to download the relevant tar; for Dev Editions it uses the source the parent job checked out. It then builds the source package.
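
To reproduce the tar download step locally you can run uscan by hand; a minimal sketch, assuming the packaging checkout is in packaging/:

cd packaging/
uscan --verbose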

If the epoch has changed it will fail here. Log onto the build server (charlotte) and, under /home/neon/data/jobs, remove the last_version files from all the build jobs for that package, for example:

find *pulseaudio-qt* -name last_version

If the version has been downgraded you will need a pin file in /etc/apt/preferences.d in the neon-settings package, such as the one at 97fdd3e7818a7bf00e60f5e2094798390de232dd. For version and epoch downgrades you will need to delete the existing packages in the archive first.
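
A hypothetical pin file for /etc/apt/preferences.d (the package name and version pattern are made up; the neon-settings commit referenced above is the real example):

Package: example-pkg
Pin: version 5.24.*
Pin-Priority: 1001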

bin job will extract the source, install the build dependencies and compile the package. It finishes by checking the output from lintian and fails on any errors; you can override errors with lintian-overrides files in the normal .deb packaging way (see dh_lintian). It also checks for any list-missing files and fails if there are any; override this by adding a debian/not-installed file.
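
A hypothetical lintian override, placed in debian/libexample1.lintian-overrides (the package name and tag are only examples):

libexample1: package-name-doesnt-match-sonames

and a hypothetical debian/not-installed entry marking a file as deliberately left out of any package:

usr/lib/*/libexample.la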

It also fails if cmake reports build dependencies it is missing; override this with debian/meta/cmake-ignore. cmake-ignore can be a plain list of the missing dependencies as output by cmake, or a YAML list which allows setting the ignore only for specific releases, e.g.:

- QCH , API documentation in QCH format (for e.g. Qt Assistant, Qt Creator & KDevelop):
  series: xenial

KCrash Validator adds a test to executables to ensure they link to KCrash. It fails the build if they do not explicitly link to KCrash. If you come across this problem, add KCrash::initialize() in the same place as the KApplication is set up.

adt job runs Debian's test framework, autopkgtest. See the Ubuntu guide for some details. It runs adt-run on the binaries, which installs them and runs the relevant test suite as defined in debian/tests/. It doesn't fail if tests fail.
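
A hypothetical debian/tests/control entry (the test name and extra dependency are made up; the named script has to exist in debian/tests/):

Tests: testsuite
Depends: @, xvfb
Restrictions: allow-stderr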

pub job will upload to aptly; see The Archive below.

lintqml job will scan for QML dependencies which have not been satisfied by the package dependencies; it will print JSON output of any missing QML modules. The packager should add these to the packaging manually and rebuild. Any false positives can be overridden, see Kubuntu/CI/QMLIgnore.

lintcmake job will install the built packages (plus dependencies), scan for CMake Config files and then get cmake to try to use them. This will show if any dependencies are missing. Using the cmake file in isolation may also reveal problems in the file itself, such as a missing include of CMakeFindDependencyMacro.

snap job will package it up as a Snappy snap package. This is experimental; you can see the output at distribute.kde.org.

Other Jobs

watcher jobs are made for packages in User Edition. They use debian/watch files to check for new releases and, if one is found, add a new changelog entry, merge from Neon/stable, then run the release build job. See man uscan for info on watch files.
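
A hypothetical debian/watch file for a package released on download.kde.org (the project name is made up and real watch files often need extra opts= mangling):

version=4
https://download.kde.org/stable/example/([\d.]+)/ example-([\d.]+)\.tar\.xz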

It will fail if it finds an "unstable" line in the watch file, as we don't include these in User Edition.

It will mangle the watch file to use the neon-sftp-bridge (https://cgit.kde.org/sysadmin/neon-sftp-bridge.git) we have running on the Jenkins master; this bridges HTTP to SFTP to expose the contents of download.kde.org even when they are hidden from the web server, so we can get previews of tars before they are released. When building unreleased packages make sure not to snapshot them into User until they get released. Consider disabling the Snapshot job to avoid mistakes.

mgmt jobs run various management tasks.

  • mgmt_appstream-generator_bionic and friends use asgen (appstream-generator) to generate appstream data from the data in the repos. It gets run after the snapshot.
  • mgmt_appstream-health checks the above has run correctly and sends an e-mail if not
  • mgmt_aptly sets up the aptly archives
  • mgmt_build_bionic_release and friends are the nightly jobs that make all the build jobs poll for changes in KDE Git and run the jobs if there is new code
  • mgmt_daily_promotion_bionic_release et al. get run by the snapshot jobs to make sure the current 'user' archive can install everything, then that it can upgrade to the new 'release' archive successfully, and then that it can purge all the packages. If it fails then the snapshot does not run.
  • mgmt_digital_ocean updates the VM images used to create cloud servers on DigitalOcean. This needs to be run manually after updating pangea-tooling and after mgmt_tooling; you will need to wait for all the existing DO servers to die before it actually has an effect, so don't run lots of jobs.
  • mgmt_digital-ocean_dangler removes old DigitalOcean droplets (cloud servers) that for some reason have not removed themselves
  • mgmt_docker used to be run by mgmt_tooling and updates and pushes the docker images used for builds. It has now moved to https://build.plasma-mobile.org/view/mgmt/ which runs it automatically instead; you then need http://xenon.pangea.pub/job/mgmt_tooling_deploy to run to get tooling updated in the images.
  • mgmt_docker_hub_check checks all the neon images on hub.docker.com are built correctly
  • mgmt_docker_hub_rebuild runs daily to ping hub.docker.com to rebuild the neon docker images
  • mgmt_germinate updates the Neon/release branch in our seed package
  • mgmt_git-semaphore pushes out updates to git-semaphore, our wrapper around git which limits simultaneous connections to servers
  • mgmt_jenkins_archive archives old builds onto the slower but larger disk
  • mgmt_jenkins_prune_parameter-files removes old parameter files used to pass status between sub-jobs
  • mgmt_job-updater runs jenkins_job_updater_nci which updates or adds all the build jobs according to pangea-conf-projects settings
  • mgmt_merger runs all the merger jobs each night
  • mgmt_merger_debian-frameworks merges the Debian branches into the Neon/unstable branches for Frameworks
  • mgmt_pause_integration can be run manually and just blocks jobs from starting; remember to kill it when you're done.
  • mgmt_progenitor runs the mgmt_build jobs each night
  • mgmt_repo_cleanup removes old snapshots of user edition (we keep the most recent 4)
  • mgmt_repo_divert_stable_bionic is used when new Qt is built to allow temporary copies of dev-stable and dev-unstable repos for testing and rebuilding bits
  • mgmt_repo_metadata_check checks for changes in repo-metadata in the last day and e-mails them out. Likely changes are a new stable branch, new repos or a moved repo
  • mgmt_repo_test_versions_release-lts_bionic checks that all the packages in our archive have larger version numbers than the Ubuntu archive
  • mgmt_repo_undo_divert_stable_bionic undoes mgmt_repo_divert_stable_bionic
  • mgmt_snapshot_bionic_user snapshots the release repo to the user repo
  • mgmt_tooling is run whenever there is a commit made to pangea-tooling to update tooling on the jenkins master. It fails if ruby testing fails.
  • mgmt_workspace_cleaner cleans the build workspace on build servers

iso jobs build the installable ISOs. See Neon/InstallableImages. They are run weekly and should be run manually after significant updates such as a new Plasma release.

The Archive

archive.neon.kde.org is our .deb package archive. For your sources.list you need one of the following lines.

deb http://archive.neon.kde.org/unstable xenial main
deb http://archive.neon.kde.org/testing xenial main
deb http://archive.neon.kde.org/user xenial main
deb http://archive.neon.kde.org/user/lts xenial main

It runs on the KDE server racnoss and its packages are mirrored via CDN77.

It is an aptly instance and may be running the Blue Systems Aptly fork.

Admins can access it using the repo console from pangea-tooling: ./ci-tooling/nci/repo_console.rb

Repo.list
repo = Repo.get("unstable_xenial")
repo.packages()

This makes the Aptly API available through the Ruby gem written by Harald and Rohan: https://github.com/KDEJewellers/aptly-api/

pangea-tooling/ci-tooling/nci/repo_cleanup.rb can be run to delete all but the latest version of each package and save some disk space on racnoss.

User Repo

To allow for extra QA, the packages built for User Edition are uploaded to the secret release repo:

deb http://archive.neon.kde.org/release xenial main

You can test this manually and, when happy, run mgmt_snapshot to copy the packages to the user repo. This will first run mgmt_daily_promotion_xenial_release (slow, takes ~30 minutes) which installs the existing packages and attempts to upgrade them to the new packages; if there are any problems it will stop the snapshot. It also runs mgmt_appstream-generator which creates the Appstream data files used by the archive.