Replies: 1 comment
-
I realize someone has already raised this as an issue, but it is still relevant.
-
Hello,
I would like to better understand the need for a batch size in the config.
The actual batch size is always defined by the dataloader's iterator, which is independent of the DS config.
So why can't the forward of the model_engine take the length of the batch directly?
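For illustration, here is a minimal sketch of the setup I mean (toy model and data, launched with the deepspeed launcher; the config keys are the standard ones, everything else is made up for the example):

```python
import torch
import torch.nn.functional as F
import deepspeed
from torch.utils.data import DataLoader, TensorDataset

# Toy model and data, for illustration only.
model = torch.nn.Linear(10, 2)
dataset = TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))

# The DS config declares a batch size here...
ds_config = {
    "train_micro_batch_size_per_gpu": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
}

model_engine, _, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

# ...but the batches the engine actually sees are whatever the
# DataLoader yields, and its batch_size is set independently.
loader = DataLoader(dataset, batch_size=8)
for x, y in loader:
    x, y = x.to(model_engine.device), y.to(model_engine.device)
    loss = F.cross_entropy(model_engine(x), y)
    model_engine.backward(loss)
    model_engine.step()
```

Here the two sizes happen to agree, but nothing ties them together: the engine's forward just sees whatever tensor the loader yields.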
I also have a follow-up question:
Is there a way, with the current framework, to use a variable batch size for memory optimization (e.g., batching by number of tokens rather than by number of sentences in seq2seq NLP tasks)?
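For concreteness, this is the kind of thing I have in mind; a minimal sketch of a token-budget batch sampler (the class name is hypothetical, and nothing here is DeepSpeed-specific):

```python
from torch.utils.data import Sampler

class TokenBudgetBatchSampler(Sampler):
    """Hypothetical batch sampler: packs examples until a token budget
    is reached, so the number of sentences per batch varies."""

    def __init__(self, lengths, max_tokens):
        self.lengths = lengths        # token length of each example
        self.max_tokens = max_tokens  # token budget per batch

    def __iter__(self):
        # Sort indices by length so similarly sized sentences share a
        # batch and padding waste stays small.
        order = sorted(range(len(self.lengths)), key=self.lengths.__getitem__)
        batch, tokens = [], 0
        for idx in order:
            n = self.lengths[idx]
            if batch and tokens + n > self.max_tokens:
                yield batch
                batch, tokens = [], 0
            batch.append(idx)
            tokens += n
        if batch:
            yield batch

    def __len__(self):
        return sum(1 for _ in iter(self))
```

One would pass this to the loader via `DataLoader(dataset, batch_sampler=TokenBudgetBatchSampler(lengths, max_tokens=4096), collate_fn=pad_collate)` (with `pad_collate` a hypothetical padding collate function). Each batch then holds a different number of sentences, which is exactly where a fixed batch size in the DS config seems to get in the way.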
Many thanks.