
Commit 3816c9a

Update xFormers docs (huggingface#2208)

1 parent 8267c78

1 file changed (+9 −6 lines)

docs/source/en/optimization/xformers.mdx
````diff
@@ -14,13 +14,16 @@ specific language governing permissions and limitations under the License.
 
 We recommend the use of [xFormers](https://github.com/facebookresearch/xformers) for both inference and training. In our tests, the optimizations performed in the attention blocks allow for both faster speed and reduced memory consumption.
 
-Installing xFormers has historically been a bit involved, as binary distributions were not always up to date. Fortunately, the project has [very recently](https://github.com/facebookresearch/xformers/pull/591) integrated a process to build pip wheels as part of the project's continuous integration, so this should improve a lot starting from xFormers version 0.0.16.
-
-Until xFormers 0.0.16 is deployed, you can install pip wheels using [`TestPyPI`](https://test.pypi.org/project/formers/). These are the steps that worked for us on a Linux computer to install xFormers version 0.0.15:
+Starting from version `0.0.16` of xFormers, released in January 2023, installation can be easily performed using pre-built pip wheels:
 
 ```bash
-pip install pyre-extensions==0.0.23
-pip install -i https://test.pypi.org/simple/ formers==0.0.15.dev376
+pip install xformers
 ```
 
-We'll update these instructions when the wheels are published to the official PyPI repository.
+<Tip>
+
+The xFormers `pip` package requires the latest version of PyTorch (1.13.1 as of xFormers 0.0.16). If you need to use a previous version of PyTorch, we recommend installing xFormers from source using [the project instructions](https://github.com/facebookresearch/xformers#installing-xformers).
+
+</Tip>
+
+After xFormers is installed, you can use `enable_xformers_memory_efficient_attention()` for faster inference and reduced memory consumption, as discussed [here](fp16#memory-efficient-attention).
````
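
For context, here is a minimal sketch (not part of the commit) of how the `enable_xformers_memory_efficient_attention()` call mentioned in the last added line is typically used with a `diffusers` pipeline. The model ID and prompt are placeholders, and it assumes a CUDA GPU with xFormers installed:

```python
import torch
from diffusers import StableDiffusionPipeline

# Illustrative example: the model ID below is only a placeholder.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Swap the attention blocks for xFormers' memory-efficient implementation;
# this raises an error if xFormers is not installed or not usable here.
pipe.enable_xformers_memory_efficient_attention()

image = pipe("an astronaut riding a horse").images[0]
```

Calling `pipe.disable_xformers_memory_efficient_attention()` reverts to the default attention implementation.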
