
benchmark on c3potato


Notes on how to run the benchmarks on c3potato.

Set up

Ideally, I would like to run asv from outside the source tree. However, that ran into some issues, possibly related to the non-standard layout of the MDAnalysis/mdanalysis repo: we need the "repo_subdir": "package" setting in the JSON config file (from the merged asv PR 611) so that asv can find package/setup.py. At the moment I run the benchmarks from inside the checked-out repo (in the benchmarks directory) and store the results and environments elsewhere; the relevant configuration settings are sketched after the directory layout below. This makes it possible to keep the results in a separate repo without fear of interference and without having to use git submodules.

benchmarking/
   benchmarks/       # the MDAnalysis/benchmarks repo
       asv.conf.json
       results/      # all benchmark results
   env/              # cache with asv environments 
   html/             # html output, becomes the MDAnalysis/benchmarks gh-pages branch
repositories/
   asv/              # official asv
   mdanalysis/       # MDAnalysis/mdanalysis
       benchmarks/   # asv benchmark files; benchmarks are run in this directory
       package/      # source code including setup.py
       testsuite/    # unit tests
miniconda3/          # anaconda environment
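
For reference, the relevant parts of the asv configuration might look roughly like this. This is only a sketch: "repo_subdir" is the setting discussed above, while the project/repo values and the relative paths are assumptions that mirror the directory layout (asv resolves paths relative to the config file, assumed here to live in benchmarking/benchmarks/; // comments are allowed in asv config files).

{
    // project name and repository (values assumed)
    "project": "mdanalysis",
    "repo": "../../repositories/mdanalysis",  // could also be the GitHub URL
    // build from package/setup.py inside the repo (asv PR 611)
    "repo_subdir": "package",
    // keep environments, results, and html output in separate directories
    "env_dir": "../env",
    "results_dir": "results",
    "html_dir": "../html"
}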

I added the function

function add_miniconda () {
   echo ">> adding miniconda3 to PATH"
   # added by Miniconda3 installer
   export PATH="${HOME}/MDA/miniconda3/bin:$PATH"
}

to .bashrc so that I can add the miniconda environment on demand.

Python environment

I installed miniconda3. It is not enabled by default, so to get started do

add_miniconda
source activate benchmark

and work in the benchmark environment.
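
For completeness, the benchmark environment can be created along these lines (a sketch: the Python version is an assumption; asv itself is installed from PyPI):

add_miniconda
conda create --name benchmark python=3.6   # Python version is an assumption
source activate benchmark
pip install asv                            # asv is distributed on PyPI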

Run benchmark

asv run --config asv_c3potato.conf.json -e -j 4 "release-0.11.0..HEAD --merges" 2>&1 | tee asv_log.txt

We run the benchmarks starting from release 0.11.0 because the transition from 0.10 to 0.11 broke so much of the API that it is too painful (at least for now) to make the performance tests also cater to pre-0.11.0 code.
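
For a quick sanity check before (or instead of) the full sweep, asv can run the suite once on just the current commit (a sketch; --quick runs each benchmark only once, so the timings are not statistically meaningful):

asv run --config asv_c3potato.conf.json --quick -e HEAD^!

The ^! suffix is git rev-list syntax for "this commit only".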

Process and publish results

We want to create html pages of the results and push them to the gh-pages branch of the MDAnalysis/benchmarks repo. In principle asv gh-pages should do this automatically. In practice it fails for this setup with a cryptic error at the push step ("asv.util.ProcessError: Command '/usr/bin/git push origin gh-pages' returned non-zero exit status 1").

For now, force-push manually:

cd benchmarking/benchmarks
asv publish     # might be superfluous
asv gh-pages --no-push    # prepare the gh-pages branch without pushing
git push origin +gh-pages # force-push, since asv's own push fails here
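
To inspect the generated pages locally before force-pushing, asv can serve the html output with its built-in web server (assuming the same config file as above):

cd benchmarking/benchmarks
asv preview --config asv_c3potato.conf.json

asv prints the local address it serves on; stop the server with Ctrl-C when done.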