Intro:
The MLPerf HPC working group is proposing an update to our submission deadlines. This issue aims to centralize the proposal and conversation around implementing this change.
Timeline:
We are hoping to get this new system in place before our next planned submission round (v4.0). If we are not ready in time, we will proceed with the currently established system.
Goals:
Submissions arrive whenever they are ready instead of waiting for a predetermined deadline.
The MLPerf HPC benchmark suite seems to be most useful to run during initial system power-on; recurring submissions are less valuable to the submitter, since they require blocking user access for the duration of the benchmark, among other overheads.
Submission results carry over for as long as they are valid, without requiring the submitter to resubmit old results.
Submitters are allowed to drop their results from the table if they want to.
Proposal details:
Submission Frequency: Any time of the year
Review period: 4 weeks, roughly following the current schedule. All submissions that arrive before the beginning of a review period will be considered in that period; submissions that arrive after the review process starts will be considered in the next review period (see the sketch after this list).
Review committee: Anyone who submitted in the last 12 months.
Publication Frequency: Quarterly (aligns with MLCommons community meetings). Note that we expect most of these rounds to have no submissions. Publication will take the form of updating the rolling results table.
Benchmark rules remain unchanged
Migration rules: v3.0 results carry over. Older results that may qualify for inclusion in the table will be considered by the WG at the submitter's request; as long as the results are comparable to v3.0, they can be included in the table.
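
To make the cadence concrete, here is a minimal Python sketch of how a submission's arrival date could map to a review period and publication date. The quarter-aligned review-start dates and the "publish immediately after the 4-week review" assumption are placeholders for illustration; the actual calendar would be set by the WG and MLCommons.

```python
from datetime import date, timedelta

# Hypothetical review-period start dates for one year; the real calendar is
# decided by the WG and MLCommons, these are placeholders for illustration.
REVIEW_STARTS = [date(2024, 1, 1), date(2024, 4, 1),
                 date(2024, 7, 1), date(2024, 10, 1)]
REVIEW_LENGTH = timedelta(weeks=4)


def review_period_for(arrival: date):
    """Return (review_start, publication_date) for a submission.

    A submission that arrives before a review period begins is reviewed in
    that period; one that arrives later rolls into the next period. The
    publication is assumed to follow the 4-week review immediately.
    """
    for start in REVIEW_STARTS:
        if arrival <= start:
            return start, start + REVIEW_LENGTH
    # Arrived after the year's last review start: rolls into the next
    # year's first period (placeholder date).
    next_start = date(REVIEW_STARTS[0].year + 1, 1, 1)
    return next_start, next_start + REVIEW_LENGTH


# A submission arriving in mid-February waits for the April review period.
print(review_period_for(date(2024, 2, 15)))
# -> (datetime.date(2024, 4, 1), datetime.date(2024, 4, 29))
```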
Implementation:
1. We will need a "live" results repository.
- Publications will be represented by Git tags on the main branch of the repository.
- Only one results repository will exist (not one per publication).
2. The submission UI needs to accept submissions at any time.
- Behind the submission UI there will be some submission repository, potentially one per quarter.
- A new submission repository will need to be created on the day the review period starts (once a quarter) so that submissions are accepted at all times.
3. Results presentation:
- The results table will typically append new results to the existing ones (a rough sketch of this append-and-tag flow follows this list).
- Removing results should be a rare event, done only when a submitter specifically asks for it.
- Presentation details can be decided by the WG and MLCommons (one table per benchmark? one table with all benchmarks? etc.). We should do whatever communicates results most effectively. We may need to drop some of the currently displayed fields to simplify.
4. Rules updates:
- The WG reviewed all policy documents. It is unclear what exactly needs to change, because in their current form the policy documents allow for our proposal with practically no changes, especially if we treat the new submission style as four deadlines per year with submissions allowed at any time.
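
To illustrate points 1 and 3, here is a minimal Python sketch of the append-and-tag flow for the live results repository. The CSV rolling table, its columns, and the tag naming scheme are assumptions made for this sketch only; the WG has not decided the actual layout.

```python
import csv
import subprocess
from pathlib import Path

# The file name, column names, and tag format below are assumptions for
# this sketch; the WG would define the real layout of the live results repo.
RESULTS_TABLE = Path("results/rolling_results.csv")
FIELDS = ["submitter", "system", "benchmark", "score", "first_published"]


def append_results(new_rows):
    """Append newly accepted results to the rolling table.

    Appending is the common case; removing rows stays a rare, manual step
    done only on a submitter's explicit request, so this helper only appends.
    """
    write_header = not RESULTS_TABLE.exists()
    RESULTS_TABLE.parent.mkdir(parents=True, exist_ok=True)
    with RESULTS_TABLE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerows(new_rows)


def tag_publication(label):
    """Record a publication as an annotated Git tag (run on the main branch)."""
    subprocess.run(["git", "tag", "-a", label, "-m",
                    f"MLPerf HPC results publication {label}"], check=True)
    subprocess.run(["git", "push", "origin", label], check=True)


# Example quarter with a single new submission: append one row, then tag.
append_results([{"submitter": "example-org", "system": "example-system",
                 "benchmark": "example-benchmark", "score": 0.0,
                 "first_published": "2024-Q3"}])
tag_publication("publication-2024-Q3")
```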
Adding people for visibility, feedback, and planning/execution. Please add more people who may be relevant. We should get eyes on this. @TheKanter @sparticlesteve @memani1 @nv-rborkar