Update distribute_pyg.rst
ZhengHongming888 authored Jan 18, 2024
1 parent 14f00f5 commit 91d6188
Showing 1 changed file with 1 addition and 6 deletions.
7 changes: 1 addition & 6 deletions docs/source/tutorial/distribute_pyg.rst
@@ -271,11 +271,6 @@ At the same time we also store the partition information like num_partitions, pa
 3. Torch RPC and dist Context
 ---------------------------------------------------
 
-.. figure:: ../_static/thumbnails/distribute_torch_rpc.png
-   :align: center
-   :width: 90%
-
-
 In the distributed pyg two torch.distributed parallel technologies are used:
 
 + ``torch.distributed.ddp`` used for data parallel on training side
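
For context on the two ``torch.distributed`` technologies this hunk refers to, here is a minimal sketch (our illustration, not part of the committed file) of how DDP and the RPC framework are typically initialized side by side. The function name ``init_worker``, the ``gloo`` backend, and the environment setup are assumptions, not the tutorial's actual code:

.. code-block:: python

    # Minimal sketch: DDP handles gradient synchronization on the training
    # side, while torch.distributed.rpc carries remote sampling / feature
    # requests between nodes. Assumes MASTER_ADDR and MASTER_PORT are
    # already set in the environment.
    import torch
    import torch.distributed as dist
    import torch.distributed.rpc as rpc
    from torch.nn.parallel import DistributedDataParallel as DDP

    def init_worker(rank: int, world_size: int, model: torch.nn.Module):
        # Process group for gradient all-reduce (data parallel training).
        dist.init_process_group("gloo", rank=rank, world_size=world_size)
        ddp_model = DDP(model)

        # RPC agent used for remote neighbor sampling / feature fetching.
        rpc.init_rpc(f"worker{rank}", rank=rank, world_size=world_size)
        return ddp_model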
@@ -330,7 +325,7 @@ The working flow is from load partition into graphstore/featurestore, distNeighb
 
 .. figure:: ../_static/thumbnails/distribute_distloader.png
    :align: center
-   :width: 90%
+   :width: 50%
 
 Distributed class ``DistLoader`` is used to create distributed data loading routines like initializing the parameters of current_ctx, rpc_worker_names, master_addr/port, channel, num_rpc_threads, num_workers, etc and then at the same time will initialize the context/rpc for distributed sampling based on ``worker_init_fn``.
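
The context line above describes how ``DistLoader`` hooks RPC initialization into ``worker_init_fn``. A hypothetical, simplified sketch of that pattern using a plain ``DataLoader`` follows; the names ``sampler_worker_{id}`` and ``NUM_WORKERS`` are illustrative, not the actual ``torch_geometric.distributed`` API:

.. code-block:: python

    # Hypothetical sketch of the worker_init_fn pattern: each spawned loader
    # worker initializes its own RPC context before it starts producing
    # batches. Assumes MASTER_ADDR/MASTER_PORT are set in the environment.
    import torch
    import torch.distributed.rpc as rpc
    from torch.utils.data import DataLoader

    NUM_WORKERS = 2  # each loader worker gets its own RPC agent

    def worker_init_fn(worker_id: int):
        # Runs once inside each worker process; this is the hook where a
        # distributed loader would set up its sampling RPC context.
        rpc.init_rpc(
            name=f"sampler_worker_{worker_id}",  # illustrative naming
            rank=worker_id,
            world_size=NUM_WORKERS,
        )

    loader = DataLoader(
        torch.arange(1000),           # stand-in dataset
        batch_size=128,
        num_workers=NUM_WORKERS,
        worker_init_fn=worker_init_fn,
    )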

