
support stbe length rebatching and remove stbe output padding for MTIA #2523

Open

seanx92 wants to merge 1 commit into base: main
Conversation

@seanx92 (Contributor) commented Oct 25, 2024

Summary:

  1. When rebatching stbe lengths without output, the lengths must be a 2D tensor of shape [F x B], so the per-batch tensors can be concatenated directly along dim 1.
  2. For MTIA inference, when the stbe runs remotely its output is padded to the max batch size, which breaks the downstream split. In that case we remove the padding and restore the original batch size. (A sketch of both operations follows the summary.)

Differential Revision: D64914077
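
A minimal sketch of the two operations described above, assuming the lengths are `[F x B]` integer tensors, the padding is along the batch dimension of the output, and the original batch size is known at rebatch time. The helper names are illustrative only, not the actual torchrec/MTIA API from this diff.

```python
import torch

# Sketch only: illustrative helpers, not the torchrec/MTIA implementation.

def rebatch_stbe_lengths(lengths_per_batch):
    # Each entry is a 2D lengths tensor of shape [F x B_i] (features x local batch).
    # Since the feature dimension is shared, the batches can be concatenated
    # directly along dim 1.
    return torch.cat(lengths_per_batch, dim=1)

def strip_stbe_output_padding(padded_output, original_batch_size):
    # Remote MTIA stbe output is padded up to the max batch size; slice it back
    # to the original batch size so the downstream split works again.
    return padded_output[:original_batch_size]

# Example: two requests, 3 features, local batch sizes of 2 and 4.
lengths_a = torch.ones(3, 2, dtype=torch.int32)
lengths_b = torch.ones(3, 4, dtype=torch.int32)
print(rebatch_stbe_lengths([lengths_a, lengths_b]).shape)  # torch.Size([3, 6])

padded = torch.randn(8, 16)  # output padded to max batch size 8
print(strip_stbe_output_padding(padded, 6).shape)          # torch.Size([6, 16])
```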

@facebook-github-bot added the CLA Signed label (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Oct 25, 2024
@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D64914077

Labels
CLA Signed, fb-exported
2 participants