Split the GitHub workflow into CI and CD #1063
Conversation
I like the idea of running the slow tests during CD to make sure they are run at all.
I very much agree. I am thinking of putting the coverage back in for CI, especially since the CD will only run once a PR is merged (if it stays as I intended). I did not check the difference in coverage with the slow tests included.
Even though this PR is not marked as a draft any more, it still is one. The reason is that I want to debug the if clause for ignoring draft PRs, but first I need to make sure that the respective workflow triggers at all xD
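For context, such a skip condition could look like this hypothetical workflow fragment (the workflow and job names are illustrative, not the actual file):

```yaml
# Hypothetical CD workflow fragment: skip the job while the PR is a draft.
name: CD

on:
  pull_request:
    branches: [main]

jobs:
  cd:
    # `github.event.pull_request.draft` is false once the PR leaves draft state.
    if: github.event.pull_request.draft == false
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
```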
Force-pushed from a6260ac to 5f8266d
Ignoring draft PRs works
@tomMoral @janfb @Baschdl I now configured it to run only the fast tests for every push and the fast+slow tests for every push on a PR. Note that this will lead to double execution of tests. I don't have a strong opinion on this and can easily revert it. As said before, I don't believe that we will catch too many errors with this, but the coverage estimate will be better for PRs.
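A minimal sketch of that split, assuming slow tests are deselected via their marker (the step names are illustrative):

```yaml
# Illustrative steps; `github.event_name` distinguishes plain pushes from PRs.
- name: Run fast tests (every push)
  if: github.event_name == 'push'
  run: pytest -m "not slow"

- name: Run fast and slow tests (pushes on a PR)
  if: github.event_name == 'pull_request'
  run: pytest
```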
To quote myself (#1063 (comment)):
and increasing it to over 30 min is quite a lot. I would be in favor of running the slow tests, as well as tests on other Python versions, somewhere; even running them only once a day would already be an improvement over the current state.
Yeah sure, I mean my original proposal was to only run them once a PR is merged (not ideal, but minimally invasive from the current state).
Quick summary for everyone who doesn't want to look at CI: this test fails for sbi/tests/multiprocessing_test.py, lines 28 to 31 in c383d7f.
I also had a look and wanted to open an issue for it. I could imagine multiple reasons for it: the overhead of creating 10 workers (on a machine with only 2 cores) is not worth it if we only have 1 sample per batch, or problems with running this test with …
It is a bit unintuitive, though, that it fails for the smallest batch size.
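The worker-overhead hypothesis can be illustrated with a small self-contained sketch (not the actual failing test; the function names and numbers are made up): spawning a 10-worker pool for a single trivial sample costs far more than computing it serially.

```python
import time
from concurrent.futures import ProcessPoolExecutor


def work(x):
    # Stand-in for the per-sample work; trivially cheap on purpose.
    return x * x


def time_serial(batch_size):
    t0 = time.perf_counter()
    [work(x) for x in range(batch_size)]
    return time.perf_counter() - t0


def time_parallel(batch_size, num_workers=10):
    # Pool creation alone typically dwarfs the work for batch_size == 1.
    t0 = time.perf_counter()
    with ProcessPoolExecutor(max_workers=num_workers) as pool:
        list(pool.map(work, range(batch_size)))
    return time.perf_counter() - t0


if __name__ == "__main__":
    print(f"serial:   {time_serial(1):.6f} s")
    print(f"parallel: {time_parallel(1):.6f} s")
```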
A bit late to the party but here is my input:
Yes, they aren't run anywhere currently. We could set up a self-hosted runner with a GPU, but we would need to think about long-term maintenance.
OK, so this PR is almost ready then. Just a few questions / points from my side. I added the verbose …
I would say it's done but we should first fix #1111, otherwise we'll get a failing main.
Great! Thanks for adding this and discussing it in detail!
One question: the main difference between CI and CD is the slow tests and building the docs, right?
I.e., codecov is checked in both?
I suggest that we merge this now, just see how it works in practice, and then adapt. The CI-CD separation gives a lot of flexibility.
Yes, but there is also a difference in when they are triggered:
Both compute and upload the coverage, but under different (new) names, i.e., … Moreover, CI runs the OS-PyTorch matrix, while CD only runs the combination of Python 3.8 and pip's choice for the torch version. CD also builds the docs to check that that process runs without any errors; however, it does not upload them.
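Uploading the coverage under separate names can be done via the codecov action's `flags` input; a sketch (the flag values here are illustrative, not necessarily what the workflows actually use):

```yaml
# In ci.yml (illustrative):
- name: Upload coverage
  uses: codecov/codecov-action@v3
  with:
    flags: ci

# In cd.yml (illustrative):
- name: Upload coverage
  uses: codecov/codecov-action@v3
  with:
    flags: cd
```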
What does this implement/fix? Explain your changes
The general idea is that we want a CI workflow that is more lightweight and hence allows for faster turnaround, and a CD workflow that runs more tests, computes the coverage, and maybe, in the future, directly pushes the latest package version to PyPI.
Does this close any currently open issues?
No.
Any relevant code examples, logs, error output, etc?
See the actions.
Any other comments?
I chose to run the pre-commit hooks in a different way than in test.yml, such that we only need to specify one container and do everything in it. For the first checks, the CD workflow will run on draft PRs. This should be changed when this PR is close to acceptance.
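One way to run the hooks inside the same container as the tests, instead of via a dedicated action, is a plain step like this (a hedged sketch; the actual step may differ):

```yaml
- name: Check style with pre-commit
  run: |
    pip install pre-commit
    pre-commit run --all-files --show-diff-on-failure
```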
Let's debate whether the CD workflow should run the slow tests.
Checklist
Put an x in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your code.
- I have followed the guidelines.
- Slow tests are marked with pytest.mark.slow.
- The branch is up to date with main (or there are no conflicts with main).
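For reference, marking a test as slow so that `pytest -m "not slow"` deselects it looks like this (the test name and body are hypothetical):

```python
import time

import pytest


@pytest.mark.slow
def test_expensive_check():
    # Stand-in for a long-running computation.
    time.sleep(0.01)
    assert 1 + 1 == 2
```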