
[fbsync] Replace TransformerEncoder in torchtext with better transformer #1703

Merged
merged 2 commits on May 6, 2022

Conversation

parmeet (Contributor) commented on May 6, 2022

Summary:
X-link: facebookresearch/multimodal#34

Pull Request resolved: #1700

Replace the usage of TransformerEncoder with BetterTransformerEncoder.
In theory, we should be able to remove torchtext.TransformerEncoderLayer after this diff.

Reviewed By: parmeet

Differential Revision: D36084653

fbshipit-source-id: 64ed3810e809fc1db840e75e2e05783089ff31d2
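
The diff itself is not shown here, but the summary describes swapping torchtext's own TransformerEncoder for an encoder that can use PyTorch's native "BetterTransformer" fastpath. The sketch below is only a hedged illustration of that pattern: a thin wrapper around torch.nn.TransformerEncoder. The class name BetterTransformerEncoder comes from the summary, while the constructor arguments and forward signature are assumptions and do not necessarily match torchtext's actual module.

```python
# Minimal sketch, not the code from this diff: shows the general idea of
# delegating to torch.nn.TransformerEncoder, whose layers can take the
# native "BetterTransformer" fastpath during inference on recent PyTorch.
from typing import Optional

import torch
import torch.nn as nn


class BetterTransformerEncoder(nn.Module):
    """Hypothetical encoder that wraps nn.TransformerEncoder."""

    def __init__(self, embedding_dim: int, num_heads: int, num_layers: int,
                 ffn_dim: int, dropout: float = 0.1) -> None:
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=embedding_dim,
            nhead=num_heads,
            dim_feedforward=ffn_dim,
            dropout=dropout,
            batch_first=True,  # the fastpath expects batch-first inputs
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, tokens: torch.Tensor,
                padding_mask: Optional[torch.Tensor] = None) -> torch.Tensor:
        # src_key_padding_mask marks padded positions (True) to be ignored.
        return self.encoder(tokens, src_key_padding_mask=padding_mask)


if __name__ == "__main__":
    encoder = BetterTransformerEncoder(
        embedding_dim=64, num_heads=4, num_layers=2, ffn_dim=128
    ).eval()  # the fastpath only activates outside of training
    x = torch.randn(8, 16, 64)  # (batch, sequence, embedding_dim)
    with torch.inference_mode():
        out = encoder(x)
    print(out.shape)  # torch.Size([8, 16, 64])
```

Since nn.TransformerEncoder is a drop-in stack of nn.TransformerEncoderLayer modules, a wrapper like this lets downstream code keep its existing call sites while the fastpath handles the fused attention kernels internally.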
