
[HPC] Rolling Submissions Proposal #538

Open
prodromou87 opened this issue Apr 1, 2024 · 0 comments

Intro:
The MLPerf HPC working group is proposing an update to our submission deadlines. This issue aims to centralize the proposal and conversation around implementing this change.

Timeline:
We are hoping to get this new system in place before our next planned submission round (v4.0). If we are not ready on time, we will proceed with the currently established system.

Goals:

  1. Submissions arrive whenever they are ready instead of waiting for a predetermined deadline.
  • The MLPerf HPC benchmark suite seems most useful to run during initial system power-on, and recurring submissions are less valuable to the submitter (each one blocks user access for the duration of the benchmark, among other overheads).
  2. Submission results carry over for as long as they remain valid, without requiring the submitter to resubmit old results.
  • Submitters may drop their results from the table if they wish.

Proposal details:

  1. Submission Frequency: Any time of the year
  2. Review period: 4 weeks, roughly following the current schedule. All submissions that arrive before the beginning of the review period will be considered; submissions that arrive after the start of the review process will be considered in the next review period.
  3. Review committee: Anyone who submitted in the last 12 months.
  4. Publication Frequency: Quarterly (aligned with MLCommons community meetings). Note that we expect most of these rounds to have no submissions. Publication will take the form of updating the rolling results table.
  5. Benchmark rules remain unchanged.
  6. Migration rules: v3.0 results carry over. Older results that may qualify for inclusion in the table will be considered by the WG at the submitter's request. As long as the results are comparable to v3.0, they can be included in the table.
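Points 2 and 4 above imply a simple quarterly cadence: a review window opens once a quarter and runs for four weeks, and anything arriving after a window opens rolls into the next one. A minimal sketch of that bucketing, assuming (hypothetically) that each review window opens on the first day of the quarter — the actual opening dates would be set by the WG:

```python
from datetime import date, timedelta

def review_windows(year):
    """4-week review windows, one per quarter.

    The opening dates (first day of each quarter) are an assumption
    for illustration, not the WG's actual schedule.
    """
    starts = [date(year, month, 1) for month in (1, 4, 7, 10)]
    return [(start, start + timedelta(weeks=4)) for start in starts]

def review_period_for(submitted, year):
    """Return the window that will review a submission: the first
    window that has not yet opened when the submission arrives."""
    for start, end in review_windows(year):
        if submitted < start:
            return (start, end)
    # Arrived after Q4's window opened: rolls into next year's Q1.
    return review_windows(year + 1)[0]

# A submission arriving mid-February (after Q1's window has already
# opened) is reviewed in the Q2 window.
print(review_period_for(date(2024, 2, 15), 2024))
```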

Implementation
1. We will need a "live" results repository.
- Publications will be represented by Git tags on the main branch of the repository.
- Only one results repository will exist (not one per publication).
2. Submission UI needs to enable submissions at any time.
- Behind the submission UI there will be a submission repository, potentially one per quarter.
- A new submission repository will need to be created the day each review period starts (once a quarter), so that submissions can be accepted at all times.
3. Results presentation:
- The results table will typically append new results to the existing ones.
- Removing results should be a rare event (only when a submitter specifically asks for it).
- Presentation details can be decided by the WG and MLCommons (one table per benchmark? One with all benchmarks? etc). We should do whatever communicates results most efficiently. We may need to drop some of the currently displayed fields to simplify.
4. Rules updates:
- The WG reviewed all policy documents. It's unclear what exactly needs to change, because in their current form the policy documents already allow for our proposal with practically no changes, especially if we treat the new submission style as 4 deadlines per year with submissions allowed at any time.
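To make implementation point 1 concrete, the tag-per-publication scheme can be exercised end to end with plain Git. This is only a sketch: the repository location, file layout, and tag name (`hpc-2024q2`) are hypothetical stand-ins, not decided conventions.

```python
import pathlib
import subprocess
import tempfile

def git(repo, *args):
    """Run a git command in `repo` and return its stdout."""
    result = subprocess.run(["git", "-C", str(repo), *args],
                            check=True, capture_output=True, text=True)
    return result.stdout

# Stand-in for the single "live" results repository.
repo = pathlib.Path(tempfile.mkdtemp())
git(repo, "init", "-q")
git(repo, "config", "user.email", "wg@example.org")
git(repo, "config", "user.name", "MLPerf HPC WG")

# Reviewed submissions are appended to the rolling results table...
(repo / "results.csv").write_text("submitter,benchmark,score\n")
git(repo, "add", "results.csv")
git(repo, "commit", "-q", "-m", "Append reviewed 2024Q2 submissions")

# ...and a quarterly publication is just a tag marking that commit,
# so each publication stays addressable while results accumulate on
# the branch.
git(repo, "tag", "hpc-2024q2")
print(git(repo, "tag", "--list"))
```

Because only one repository exists, `git tag --list` enumerates every publication to date, and checking out a tag reproduces the results table exactly as it stood at that publication.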

Adding people for visibility, feedback, and planning/execution. Please add more people who may be relevant. We should get eyes on this. @TheKanter @sparticlesteve @memani1 @nv-rborkar
