#include "caffe2/operators/tt_linear_op.h"

REGISTER_CPU_OPERATOR(TT, TTLinearOp<float, CPUContext>);
REGISTER_CPU_OPERATOR(TTLinearGradient, TTLinearGradientOp<float, CPUContext>);
OPERATOR_SCHEMA(TT)
    .NumInputs(3)
    .NumOutputs(1)
    .SetDoc(R"DOC(
The TT-layer serves as a low-rank decomposition of a fully connected layer.
The inputs are the same as to a fully connected layer, but the number of
parameters is greatly reduced, and forward computation time can be drastically
reduced, especially for layers with large weight matrices. The multiplication
is computed as a product of the input vector with each of the cores that make
up the TT layer. Given the input sizes (inp_sizes), output sizes (out_sizes),
and the ranks of each of the cores (tt_ranks), the ith core will have size:

    inp_sizes[i] * tt_ranks[i] * tt_ranks[i + 1] * out_sizes[i].

The complexity of the computation is dictated by the sizes of inp_sizes,
out_sizes, and tt_ranks, where there is a trade-off between the accuracy of
the low-rank decomposition and the speed of the computation.
)DOC")
    .Arg(
        "inp_sizes",
        "(int[]) Input sizes of cores. Indicates the input size of "
        "the individual cores; the size of the input vector X must match the "
        "product of the inp_sizes array.")
    .Arg(
        "out_sizes",
        "(int[]) Output sizes of cores. Indicates the output size "
        "of the individual cores; the size of the output vector Y must match "
        "the product of the out_sizes array.")
    .Arg(
        "tt_ranks",
        "(int[]) Ranks of cores. Indicates the ranks of the "
        "individual cores; lower ranks mean larger compression and faster "
        "computation, but reduced accuracy.")
    .Input(
        0,
        "X",
        "Input tensor from previous layer with size (M x K), where "
        "M is the batch size and K is the input size.")
    .Input(1, "b", "1D blob containing the bias vector")
    .Input(
        2,
        "cores",
        "1D blob containing each individual core, with sizes specified above.")
    .Output(
        0,
        "Y",
        "Output tensor from previous layer with size (M x N), "
        "where M is the batch size and N is the output size.");
OPERATOR_SCHEMA(TTLinearGradient);

GRADIENT_NOT_IMPLEMENTED_YET(TT);