
[unet3D]: Rounding up epochs with mixed-batches and learning_rate schedules #427

Open
mndevec opened this issue Mar 4, 2021 · 1 comment

Comments


mndevec commented Mar 4, 2021

There are 168 images in the Unet3D dataset, which is fewer than in the other benchmarks. Based on the rules here, if we use a batch size of 128, we can use the mixed-batch approach and merge images from two epochs into a single step.

This makes it a bit complicated to satisfy mathematical equivalence for learning_rate schedules. The closest mathematically equivalent approach would be to scale step numbers by (256 / 168) to match the reference epoch number. As in other models, this still applies a differing learning_rate to partial epochs, but the difference might be more visible here because of the smaller dataset.
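To make the intended scaling concrete, here is a minimal sketch (the constants and helper names are hypothetical, not taken from the reference implementation; it assumes the reference schedule is keyed by epoch boundaries and shows the epoch-to-step direction of the conversion):

```python
import math

# Assumed values for illustration only.
DATASET_SIZE = 168        # images in the Unet3D training set
GLOBAL_BATCH_SIZE = 128   # hypothetical global batch size

def epoch_to_step(epoch: float) -> int:
    """Map a reference epoch boundary to a global step index.

    With mixed batches, a single step can contain samples from two
    consecutive epochs, so the boundary is scaled by
    DATASET_SIZE / GLOBAL_BATCH_SIZE and rounded up.
    """
    return math.ceil(epoch * DATASET_SIZE / GLOBAL_BATCH_SIZE)

def lr_at_step(step: int, base_lr: float, decay_epochs: list, decay_factor: float) -> float:
    """Piecewise-constant schedule evaluated on step boundaries.

    The learning rate drops by `decay_factor` at each reference epoch
    boundary, translated to steps via epoch_to_step(); all samples in a
    mixed batch see the learning rate of the step they fall in.
    """
    lr = base_lr
    for e in decay_epochs:
        if step >= epoch_to_step(e):
            lr *= decay_factor
    return lr
```

The point of the sketch is that the partial-epoch samples inside a mixed batch cannot get their "own" learning rate; they inherit the rate of the step, which is where the small deviation from the reference schedule comes from.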

So I wanted to confirm that this still satisfies mathematical equivalence under the current rules. Is that right?


mndevec commented Mar 8, 2021

@johntran-nv @sergey-serebryakov
Would it be possible to include this in the next working group meeting?
