Caffe2 - Python API
A deep learning, cross platform ML framework
torch.multiprocessing Namespace Reference


def set_sharing_strategy (new_strategy)
def get_sharing_strategy ()
def get_all_sharing_strategies ()

Detailed Description

torch.multiprocessing is a wrapper around the native :mod:`multiprocessing`
module. It registers custom reducers that use shared memory to provide shared
views on the same data in different processes. Once a tensor/storage is moved
to shared memory (see :func:`~torch.Tensor.share_memory_`), it can be sent to
other processes without making any copies.
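A minimal sketch of the move described above, using the public :func:`~torch.Tensor.share_memory_` and :meth:`~torch.Tensor.is_shared` methods:

```python
import torch

# A freshly created CPU tensor lives in ordinary process-private memory.
t = torch.zeros(3, 3)
print(t.is_shared())  # False: storage not yet in shared memory

# share_memory_() moves the underlying storage to shared memory in place,
# after which the tensor can be sent to other processes without copying.
t.share_memory_()
print(t.is_shared())  # True: storage is now shared
```

Note that `share_memory_` is an in-place operation (trailing underscore): the tensor itself is modified, and calling it again on an already-shared tensor is a no-op.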

The API is 100% compatible with the original module - it's enough to change
``import multiprocessing`` to ``import torch.multiprocessing`` to have all the
tensors sent through the queues or shared via other mechanisms, moved to
shared memory.
Because the APIs are so similar, we do not document most of this package's
contents, and we recommend referring to the excellent docs of the original
module.
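A small sketch of the drop-in usage described above: a child process created through `torch.multiprocessing` writes into a shared tensor, and the writes are visible to the parent because both processes view the same storage.

```python
import torch
import torch.multiprocessing as mp

def fill(t):
    # The worker writes into the same shared storage the parent holds.
    t.fill_(1.0)

if __name__ == "__main__":
    t = torch.zeros(5)
    t.share_memory_()  # move storage to shared memory before handing it off

    # mp.Process has the same interface as multiprocessing.Process.
    p = mp.Process(target=fill, args=(t,))
    p.start()
    p.join()

    print(t)  # the child's writes are visible: a tensor of ones
```

The ``if __name__ == "__main__":`` guard follows the same rule as the stock :mod:`multiprocessing` module, where it is required on platforms that use the ``spawn`` start method.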

Function Documentation

def torch.multiprocessing.get_all_sharing_strategies ( )
Returns a set of sharing strategies supported on the current system.

Definition at line 73 of file

def torch.multiprocessing.get_sharing_strategy ( )
Returns the current strategy for sharing CPU tensors.

Definition at line 68 of file

def torch.multiprocessing.set_sharing_strategy ( new_strategy )
Sets the strategy for sharing CPU tensors.

Arguments:
    new_strategy (str): Name of the selected strategy. Should be one of
        the values returned by :func:`get_all_sharing_strategies()`.

Definition at line 56 of file
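The three functions above compose as follows. This sketch assumes only that ``'file_system'`` appears in the returned set, which holds on the supported platforms (on Linux it coexists with the default ``'file_descriptor'`` strategy):

```python
import torch.multiprocessing as mp

# Discover which strategies this system supports.
strategies = mp.get_all_sharing_strategies()
print(strategies)  # e.g. {'file_descriptor', 'file_system'} on Linux

# Remember the current strategy so it can be restored afterwards.
old = mp.get_sharing_strategy()

# Switch to the 'file_system' strategy; the name must come from
# get_all_sharing_strategies(), otherwise set_sharing_strategy raises.
mp.set_sharing_strategy('file_system')
assert mp.get_sharing_strategy() == 'file_system'

# Restore the previous strategy.
mp.set_sharing_strategy(old)
```

The setting is process-global: it determines how CPU tensor storages are shared with child processes from that point on.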