
pass transforms around instead of making duplicates #416

Closed · 2 commits

Conversation

JasonKChow (Contributor)

Summary: Instead of creating a duplicate transform each time one is needed, we create a single transform from the config and initialize the wrapped model and wrapped generators with that one transform. Passing the same transform object around lets the transforms learn parameters while staying in sync across the wrapped objects.

Differential Revision: D65155103
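The sharing pattern described in the summary can be sketched in a few lines of Python. The class and attribute names below are illustrative stand-ins, not AEPsych's actual API; the point is only that both wrappers hold a reference to one object, so a parameter learned by the transform is visible everywhere.

```python
class Transform:
    """A parameter transform whose parameters can be learned from data."""

    def __init__(self):
        self.scale = 1.0  # a learned parameter

    def update(self, scale):
        # e.g. refit the parameter from observed data
        self.scale = scale


class WrappedModel:
    def __init__(self, transform):
        self.transform = transform


class WrappedGenerator:
    def __init__(self, transform):
        self.transform = transform


# Before this PR: each wrapper built its own copy of the transform, so
# learned parameters could diverge between model and generators.
# After: one transform is built from the config and shared by reference.
transform = Transform()
model = WrappedModel(transform)
generator = WrappedGenerator(transform)

transform.update(scale=2.0)

# Both wrappers see the update because they hold the same object.
assert model.transform is generator.transform
assert generator.transform.scale == 2.0
```

Because Python passes objects by reference, no explicit synchronization step is needed: any wrapper that mutates the transform's learned parameters implicitly updates all the others.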

facebook-github-bot added the CLA Signed label on Oct 29, 2024. (This label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed.)
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D65155103

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Oct 29, 2024
…#416)

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Oct 30, 2024
…#416)

JasonKChow force-pushed the export-D65155103 branch 2 times, most recently from b85b9fb to fb39cf3 on October 30, 2024 22:16

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Oct 31, 2024
…#416)
JasonKChow force-pushed the export-D65155103 branch 2 times, most recently from b300da4 to f503e5f on October 31, 2024 23:28

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Nov 1, 2024
…#416)

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Nov 8, 2024
…#416)

…earch#443)

Summary: All parameters share some options that are always the same. We add a private function that unifies how these common options are built.

Differential Revision: D65695127
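The commit message above is terse; one plausible reading is a single private helper that builds the options every parameter type shares, with each parameter type layering its own options on top. The sketch below illustrates that idea only — the function name, config keys, and option names are entirely hypothetical, not AEPsych's actual implementation.

```python
def _get_base_options(name: str, config: dict) -> dict:
    # Options every parameter type shares, collected in one place
    # instead of being rebuilt separately by each parameter type.
    return {
        "name": name,
        "lb": config[name]["lower_bound"],
        "ub": config[name]["upper_bound"],
    }


def make_continuous_param(name: str, config: dict) -> dict:
    # Start from the unified common options...
    options = _get_base_options(name, config)
    # ...then add only what is specific to this parameter type.
    options["value_type"] = "float"
    return options


config = {"contrast": {"lower_bound": 0.0, "upper_bound": 1.0}}
param = make_continuous_param("contrast", config)
```

Centralizing the shared options means a new common option (or a rename) is changed in one place rather than in every parameter class.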
…#416)

Differential Revision: D65155103

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Nov 9, 2024
…#416)

JasonKChow added a commit to JasonKChow/aepsych that referenced this pull request Nov 11, 2024
…#416)
@facebook-github-bot (Contributor)

This pull request has been merged in 901de00.

crasanders pushed a commit that referenced this pull request Nov 14, 2024
Summary:
Pull Request resolved: #416


Reviewed By: crasanders

Differential Revision: D65155103

fbshipit-source-id: 6350857e70d8abc6229bb556f746ad53ea4abe30
Labels: CLA Signed, fb-exported, Merged

Projects: None yet

2 participants