Distributed and Parallel Training Tutorials
===========================================

This page lists all of the distributed and parallel training tutorials
available on the pytorch.org website.

Getting Started with Distributed Data-Parallel Training (DDP)
--------------------------------------------------------------

.. grid:: 3

    .. grid-item-card:: Getting Started with PyTorch Distributed
       :shadow: none
       :link: https://example.com
       :link-type: url

       This tutorial provides a gentle introduction to PyTorch
       DistributedDataParallel.

    .. grid-item-card:: Single Machine Model Parallel Best Practices
       :shadow: none
       :link: https://example.com
       :link-type: url

       In this tutorial you will learn about best practices for
       using model parallelism.

    .. grid-item-card:: Writing Distributed Applications with PyTorch
       :shadow: none
       :link: https://example.com
       :link-type: url

       This tutorial demonstrates how to write a distributed application
       with PyTorch.

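Before diving into the cards above, a minimal sketch of a DDP training loop
may help orient you. It is illustrative only and is not taken from any single
tutorial: it assumes a launch with ``torchrun`` and the CPU-friendly ``gloo``
backend, and the toy model, data, and hyperparameters are placeholders.

.. code-block:: python

    import torch
    import torch.distributed as dist
    import torch.nn as nn
    from torch.nn.parallel import DistributedDataParallel as DDP


    def main():
        # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for every process.
        dist.init_process_group(backend="gloo")

        model = nn.Linear(10, 10)   # toy model; replace with your own
        ddp_model = DDP(model)      # gradients are averaged across all ranks

        optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
        loss_fn = nn.MSELoss()

        for _ in range(5):
            optimizer.zero_grad()
            outputs = ddp_model(torch.randn(20, 10))
            loss = loss_fn(outputs, torch.randn(20, 10))
            loss.backward()         # backward() triggers the gradient all-reduce
            optimizer.step()

        dist.destroy_process_group()


    if __name__ == "__main__":
        main()

You would run such a script with, for example,
``torchrun --nproc_per_node=2 ddp_sketch.py`` (the file name is just a
placeholder).
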
Learn FSDP
----------

Fully Sharded Data Parallel (FSDP) shards model parameters across multiple
workers, which enables you to train larger models.

.. grid:: 3

    .. grid-item-card:: Getting Started with FSDP
       :shadow: none
       :img-top: ../_static/img/thumbnails/cropped/pytorch-logo.png
       :link: https://example.com
       :link-type: url

       This tutorial provides a gentle introduction to Fully Sharded
       Data Parallel (FSDP).

    .. grid-item-card:: Single Machine Model Parallel Best Practices
       :shadow: none
       :img-top: ../_static/img/thumbnails/cropped/pytorch-logo.png
       :link: https://example.com
       :link-type: url

       In this tutorial you will learn about best practices for
       using model parallelism.

    .. grid-item-card:: Writing Distributed Applications with PyTorch
       :shadow: none
       :img-top: ../_static/img/thumbnails/cropped/pytorch-logo.png
       :link: https://example.com
       :link-type: url

       This tutorial demonstrates how to write a distributed application
       with PyTorch.

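As with DDP, a minimal, hypothetical sketch may help before you read the full
tutorials. It assumes one GPU per process, the ``nccl`` backend, and a launch
with ``torchrun``; the toy model and optimizer choices are placeholders.

.. code-block:: python

    import os

    import torch
    import torch.distributed as dist
    import torch.nn as nn
    from torch.distributed.fsdp import FullyShardedDataParallel as FSDP


    def main():
        # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for every process.
        dist.init_process_group(backend="nccl")
        local_rank = int(os.environ["LOCAL_RANK"])
        torch.cuda.set_device(local_rank)

        model = nn.Sequential(
            nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 1024)
        ).cuda()
        # Wrapping the model in FSDP shards its parameters across ranks
        # instead of replicating them, freeing memory for larger models.
        sharded_model = FSDP(model)

        optimizer = torch.optim.AdamW(sharded_model.parameters(), lr=1e-3)
        loss = sharded_model(torch.randn(8, 1024, device="cuda")).sum()
        loss.backward()
        optimizer.step()

        dist.destroy_process_group()


    if __name__ == "__main__":
        main()
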
Learn RPC
---------

The Distributed Remote Procedure Call (RPC) framework provides mechanisms for
multi-machine model training.
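
The tutorials in this section cover the details; as a rough orientation, here
is a minimal sketch of a two-worker RPC call. The worker names, the port, and
the ``add_tensors`` helper are made up for illustration only.

.. code-block:: python

    # Start this script twice, e.g. ``python rpc_sketch.py 0`` and
    # ``python rpc_sketch.py 1`` (the file name is a placeholder).
    import os
    import sys

    import torch
    import torch.distributed.rpc as rpc


    def add_tensors(a, b):
        # Runs on whichever worker the RPC is sent to.
        return a + b


    def main(rank):
        os.environ.setdefault("MASTER_ADDR", "localhost")
        os.environ.setdefault("MASTER_PORT", "29500")
        rpc.init_rpc(f"worker{rank}", rank=rank, world_size=2)

        if rank == 0:
            # Run add_tensors on worker1 and wait for the result.
            result = rpc.rpc_sync(
                "worker1", add_tensors, args=(torch.ones(2), torch.ones(2))
            )
            print(result)  # tensor([2., 2.])

        rpc.shutdown()  # blocks until every worker is done


    if __name__ == "__main__":
        main(int(sys.argv[1]))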