Jenkins

MPICH has a relatively recent (as of early 2013) Jenkins continuous integration server set up at https://jenkins.mpich.org/. This page describes how we use this service in MPICH, how it works, and how we might use it in the future.

Executive Summary

If you don't have time to read the lovely prose below, at least internalize this info:

  • we now have a continuous integration server at https://jenkins.mpich.org/
  • this system runs automated build-and-test runs at regular intervals or whenever a commit is pushed to the revision control system
  • most of the jobs are set up to run automatically whenever code is pushed to http://git.mpich.org/mpich.git
  • the MPICH jobs are unsurprisingly named "mpich" or "mpich-SOMETHING"
  • build+test results are sent to builds@mpich.org; sign up for that list if you want to receive build status emails
  • configuration is all done through the web interface (no more cron jobs!)

The Details

Goals

What are we even trying to accomplish by using Jenkins or any other continuous integration system?

  • reduce developer time and effort spent running tests by hand
  • reduce developer time spent fiddling with our existing automated testing systems
  • tighten the automated testing feedback loop from O(1 day) to O(1 hour) or better
  • improve accountability for "breaking the build" (low priority goal, given the current team)
  • improve software (MPICH) quality in several dimensions:
    • ensure correctness on multiple platforms (Linux, OS X, etc.)
    • ensure correctness with multiple compilers (GNU, Clang, Intel, PGI, etc.)
    • ensure correctness with multiple configure options and debugging levels
    • prevent performance regressions
  • track historical software quality information further back than just "last night"
  • reduce average build times, partly by tracking this information historically

Some of this is handled by the existing Nightly Tests infrastructure, though the "old nightlies" have a number of problems:

  • They are fragile. They are a cobbled-together series of numerous shell scripts run as cron jobs by several different users on the team. It can be very confusing to trace the entire flow of a test run.
  • They are not flexible. Adding a new test suite or configuration can be difficult.
  • They only run nightly.
  • They require the MCS NFS system.
  • They provide no way to suppress known test issues without completely disabling tests or platforms.
  • State from one build or day of testing is not always correctly cleaned up, leading to false positives and false negatives in some cases.

The short-term goal is to augment the "old nightlies". In the longer term it would be good to replace them with an all-Jenkins solution, provided that Jenkins remains stable and can provide all of the important features that are currently offered by the "old nightlies".

The MPICH Jenkins CI Server and General Jenkins Overview

Visit https://jenkins.mpich.org/ to access the Jenkins server. You should log in with your MCS username and password.

After logging in, on the home page you'll see a list of menu options on the left-hand side with an "executor status" table listed below that. In the main central/right-hand panel you'll see a list of jobs which you are able to view. I (goodell@) do not know how to filter this list automatically at this stage. There is a concept of "views", but that doesn't seem to quite solve the problem in general. Look at the list for jobs named "mpich" or "mpich-SOMETHING".

Helpful Jenkins Terminology
job (or sometimes "project")
    a logically related set of operations which are executed in order to test a particular piece of software
build
    a particular execution of a job
workspace
    the working directory where a build executes
master (or sometimes "server")
    the Jenkins server which orchestrates builds, reports results, and manages configuration
slave (or "build executor")
    a host on which builds actually execute (that is, the job actions run on that host)
build status
    one of STABLE, UNSTABLE, or FAILED (colors are right for our server; STABLE is blue on stock Jenkins servers)

In order to control what happens in a build, you need to find your way to the "configure" panel for a given job. If you don't have the right permissions for the job, any link related to the job will probably yield an HTTP 404 for you.

Build Slave Details

Jenkins utilizes BreadBoard hardware for build testing. All nodes are in the .mcs.anl.gov domain. The platforms, Jenkins node names, and hostnames are:

  • Ubuntu 12.04 64-bit: ubuntu64-1 (bb87), ubuntu64-2 (bb88), ubuntu64-3 (bb94), ubuntu64-4 (bb85), ubuntu64-5 (bb76), ubuntu64-6 (bb67), ubuntu64-7 (bb66), ubuntu64-8 (bb65)
  • Ubuntu 12.04 32-bit: ubuntu32-1 (bb93), ubuntu32-2 (bb90), ubuntu32-3 (bb83), ubuntu32-4 (bb84), ubuntu32-5 (bb62), ubuntu32-6 (bb64)
  • Ubuntu 12.04 64-bit with IB and MXM: ib-1 (bb73), ib-2 (bb74), ib-3 (bb72), ib-4 (bb75)
  • FreeBSD 9.1 64-bit: freebsd64-1 (bb95), freebsd64-2 (bb91)
  • FreeBSD 9.1 32-bit: freebsd32-1 (bb92), freebsd32-2 (bb86)
  • OS X 10.8.5 64-bit: osx-1 (mpich-mac1), osx-2 (mpich-mac2), osx-3 (mpich-mac3)
  • Solaris x86 (OpenIndiana): solaris-1 (bb69)

If for some reason you want to log into these machines, use the 'autotest' user. In the /sandbox/jenkins-ci/workspace/ directory you will find a forest of directories corresponding to the configurations Jenkins displays. For example, mpich-review-tcp/compiler/gnu/jenkins_configure/strict/label/solaris/ contains the working directory for the gnu, strict, solaris configuration.
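
A minimal sketch of such a session (assuming the autotest account accepts your SSH key or password; the node and configuration path are just the examples mentioned above):

  # log into one of the Solaris build slaves as the shared test user
  ssh autotest@bb69.mcs.anl.gov

  # each matrix configuration has its own workspace under /sandbox/jenkins-ci/workspace/
  cd /sandbox/jenkins-ci/workspace/mpich-review-tcp/compiler/gnu/jenkins_configure/strict/label/solaris/
  ls   # the working directory left behind by the most recent build of that configuration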

Jenkins nightly jobs

In the "nightly" view, there are some dependencies between jobs. The dependency means that some jobs are triggered when the dependent upstream job is successfully completed. The following illustrates dependencies between jobs:

mpich-tarball --> armci-mpi
              --> mpich-abi-prolog --> mpich-abi
              --> mpich-master-freebsd
              --> mpich-master-mxm
              --> mpich-master-ofi
              --> mpich-master-osx
              --> mpich-master-portals4
              --> mpich-master-solaris
              --> mpich-master-special-tests
              --> mpich-master-ubuntu

'A --> B' indicates that job B on the right depends on job A on the left. For example, mpich-abi-prolog depends on mpich-tarball, and its build is triggered only when the mpich-tarball build completes successfully.

mpich-tarball creates a tarball of the MPICH master branch using release.pl, and all of the downstream jobs use that tarball. Therefore, the MPICH master repository is pulled only once (in mpich-tarball), and autogen.sh is generally not executed anywhere except in mpich-tarball.
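
To make that flow concrete, here is a rough sketch of what a downstream job's shell build step might look like; the artifact name, install prefix, and configure options here are made up for illustration and vary from job to job and matrix axis:

  # unpack the tarball produced upstream by mpich-tarball (artifact name is hypothetical)
  tar xzf mpich-master.tar.gz
  cd mpich-*/

  # the tarball already contains a generated configure script, so no autogen.sh is needed
  ./configure --prefix=$PWD/_inst
  make -j4 && make install

  # run the test suite (MPICH provides a 'make testing' target)
  make testing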

Jenkins mpich-review details

The mpich-review repository is used both for Jenkins testing and for human reviews/signoffs.

There are several Jenkins jobs that pull from mpich-review.

The following branches are tested by mpich-review-tcp:

mpich-review/jenkins/all/*
mpich-review/jenkins/tcp/*

The following branches are tested by mpich-review-mxm:

mpich-review/jenkins/all/*
mpich-review/jenkins/mxm/*

The following branches are tested by mpich-review-portals4:

mpich-review/jenkins/portals4/*
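
For example, to get a topic branch tested you would push it into one of these branch namespaces (a sketch; "mpich-review" is assumed here to be a git remote that points at the review repository, and "my-topic" is a placeholder branch name):

  # run the full set of review jobs (anything watching jenkins/all/*)
  git push mpich-review my-topic:jenkins/all/my-topic

  # or run only the MXM review job
  git push mpich-review my-topic:jenkins/mxm/my-topic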


Possible Future Uses of Jenkins in MPICH

  • run the other test suites as well (MPICH1, Intel, C++, LLNL I/O)
  • automated performance regression testing, including historical performance trend plotting
  • packaging our nightly snapshot tarballs
  • packaging our final release tarballs
  • write a script to filter TAP results for more sophisticated xfail criteria, possibly based on machine or test environment (e.g., exclude bcast2 failures due to MPIEXEC_TIMEOUT on shared machines); see the sketch after this list
  • gate pushes to origin on 100% clean tests
  • automated builds on platforms that are harder to integrate with the old nightlies (BG/Q, niagara machines, etc.)
  • multi-machine tests
  • build an extreme feedback device (google for more ideas) :)
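
For the TAP-filtering idea above, a rough sketch of what such a post-processing step could look like (the file name and environment variable are hypothetical; "# TODO" is the TAP directive for an expected failure):

  # mark known-flaky bcast2 timeouts as expected failures ("# TODO") when the
  # build ran on a shared machine, so they count as xfails rather than errors
  if [ -n "$SHARED_MACHINE" ]; then
      sed -e '/^not ok .*bcast2/ s/$/ # TODO MPIEXEC_TIMEOUT on shared machine/' \
          summary.tap > summary.filtered.tap
  else
      cp summary.tap summary.filtered.tap
  fi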