|
def __init__(self, module, device_ids=None, output_device=None, dim=0, broadcast_buffers=True)

def __getstate__(self)

def __setstate__(self, state)

def forward(self, inputs, kwargs)

def scatter(self, inputs, kwargs, device_ids)

def parallel_apply(self, replicas, inputs, kwargs)

def gather(self, outputs, output_device)

def train(self, mode=True)
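The forward/scatter/parallel_apply/gather methods above follow the usual data-parallel pipeline: the input batch is split into one chunk per device, each module replica runs on its chunk, and the per-device outputs are merged back on the output device. A minimal, torch-free sketch of that flow (the helper bodies here are illustrative stand-ins, not the actual implementation; real devices, streams, and module replication are omitted):

```python
def scatter(inputs, device_ids):
    # Split a batch (a list of samples) into one chunk per device.
    n = len(device_ids)
    chunk = (len(inputs) + n - 1) // n
    return [inputs[i * chunk:(i + 1) * chunk] for i in range(n)]

def parallel_apply(replicas, input_chunks):
    # Run each replica on its own chunk; a real implementation would
    # dispatch these in parallel, one worker per device.
    return [replica(chunk) for replica, chunk in zip(replicas, input_chunks)]

def gather(outputs, output_device):
    # Concatenate the per-device outputs back into one batch.
    result = []
    for out in outputs:
        result.extend(out)
    return result

def forward(module, inputs, device_ids):
    chunks = scatter(inputs, device_ids)
    replicas = [module] * len(chunks)   # stand-in for module replication
    outputs = parallel_apply(replicas, chunks)
    return gather(outputs, device_ids[0])

def double(batch):
    return [2 * x for x in batch]

print(forward(double, [1, 2, 3, 4, 5], device_ids=[0, 1]))  # [2, 4, 6, 8, 10]
```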
|
|
dim

module

device_ids

output_device

broadcast_buffers

need_reduction

broadcast_bucket_size

nccl_reduce_bucket_size

bucket_sizes

bucket_map

buckets

bucket_events

reduced

dispatch_lock

nccl_reduction_group_id
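The bucket_sizes, bucket_map, and buckets attributes suggest that gradients are packed into fixed-size buckets before reduction, so each collective call moves one large contiguous message instead of many small ones. A hypothetical sketch of such greedy packing (build_buckets, bucket_cap, and the size values are assumptions for illustration, not taken from the source):

```python
def build_buckets(param_sizes, bucket_cap):
    # Greedily pack parameter indices into buckets no larger than bucket_cap.
    buckets, current, current_size = [], [], 0
    for idx, size in enumerate(param_sizes):
        if current and current_size + size > bucket_cap:
            buckets.append(current)
            current, current_size = [], 0
        current.append(idx)
        current_size += size
    if current:
        buckets.append(current)
    # bucket_map: parameter index -> index of the bucket it was packed into.
    bucket_map = {p: b for b, bucket in enumerate(buckets) for p in bucket}
    return buckets, bucket_map

buckets, bucket_map = build_buckets([4, 4, 4, 4], bucket_cap=8)
print(buckets)      # [[0, 1], [2, 3]]
print(bucket_map)   # {0: 0, 1: 0, 2: 1, 3: 1}
```

Each bucket would then be reduced as a unit (with per-bucket events tracking completion, as bucket_events hints), amortizing the per-call latency of the collective.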
|
Definition at line 27 of file distributed.py.
The documentation for this class was generated from the following file: distributed.py