
Batch Size impact on model performance #25

Open
KJGithub2021 opened this issue Feb 29, 2024 · 0 comments

Comments

@KJGithub2021

Hi @JasonForJoy, can you please confirm whether reducing the batch size (because only a low-end GPU is available) can affect the model's reported performance?

Secondly, no batch_size, including the recommended one (i.e. 96), is a divisor of the total number of training, validation, or test samples. As a result, the samples in the last, partial batch are dropped. Does this contribute to lowering the recall and other metric values?
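To make the second point concrete, here is a minimal, framework-agnostic sketch of how many evaluation examples are lost when the trailing partial batch is discarded. The test-set size used below is a hypothetical placeholder, not a value taken from this repo:

```python
# Sketch: examples silently dropped when batch_size does not divide the dataset size.

def dropped_samples(num_examples: int, batch_size: int) -> int:
    """Examples lost if the trailing partial batch is discarded."""
    return num_examples % batch_size

if __name__ == "__main__":
    n_test = 1000                      # hypothetical test-set size
    for bs in (32, 64, 96):
        lost = dropped_samples(n_test, bs)
        print(f"batch_size={bs}: {lost} of {n_test} test examples "
              f"({lost / n_test:.1%}) are never scored")
```

If the data pipeline supports it, keeping the partial batch (e.g. drop_last=False in a PyTorch DataLoader, or drop_remainder=False in TensorFlow's Dataset.batch) would avoid this entirely; I'm not sure which option this repo's input pipeline uses.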
