Commit b3b7a41

sharing -> sharding (#1787) (#1788)

Authored by parmeet and Sebastian Raschka
Co-authored-by: Sebastian Raschka <[email protected]>
1 parent 56f9826 commit b3b7a41

File tree: 1 file changed (+1, -1 lines changed)


docs/source/datasets.rst

Lines changed: 1 addition & 1 deletion
@@ -42,7 +42,7 @@ torchtext.datasets
 - All workers (DDP workers *and* DataLoader workers) see a different part
   of the data. The datasets are already wrapped inside `ShardingFilter
   <https://pytorch.org/data/main/generated/torchdata.datapipes.iter.ShardingFilter.html>`_
-  and you may need to call ``dp.apply_sharing(num_shards, shard_id)`` in order to shard the
+  and you may need to call ``dp.apply_sharding(num_shards, shard_id)`` in order to shard the
   data across ranks (DDP workers) and DataLoader workers. One way to do this
   is to create ``worker_init_fn`` that calls ``apply_sharding`` with appropriate
   number of shards (DDP workers * DataLoader workers) and shard id (inferred through rank
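The doc text in this diff describes deriving the shard count and shard id from the DDP rank and the DataLoader worker id. A minimal sketch of that arithmetic is below; `shard_params` is a hypothetical helper name, not torchtext or torchdata API, and the `dp.apply_sharding` call it feeds is the one named in the docs.

```python
# Hypothetical sketch of the sharding arithmetic described in the docs:
# every (DDP rank, DataLoader worker) pair must map to a distinct shard.

def shard_params(rank, world_size, worker_id, num_workers):
    """Return (num_shards, shard_id) for combined DDP + DataLoader sharding.

    num_shards = DDP workers * DataLoader workers, and shard_id is the
    flat index of this (rank, worker) pair, so ids cover 0..num_shards-1
    exactly once across all processes.
    """
    num_shards = world_size * num_workers
    shard_id = rank * num_workers + worker_id
    return num_shards, shard_id

# In a real worker_init_fn you would then call, roughly:
#   num_shards, shard_id = shard_params(rank, world_size, worker_id, num_workers)
#   dp.apply_sharding(num_shards, shard_id)
```

In practice, `rank`/`world_size` would come from the distributed runtime and `worker_id`/`num_workers` from the DataLoader's worker info inside `worker_init_fn`; the helper just shows that the flat index assigns each worker a unique, non-overlapping shard.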
