AutogradCrypTensors allow CrypTensors to store gradients and thus enable backpropagation. AutogradCrypTensors can be created from CrypTensors and are only needed when backpropagation needs to be performed. For all other operations on encrypted tensors, CrypTensors are sufficient.

To create an AutogradCrypTensor:

import torch
import crypten
from crypten.autograd_cryptensor import AutogradCrypTensor

crypten.init()

# Create a CrypTensor from a plaintext torch tensor
x = torch.tensor([1.0, 2.0, 3.0])
x_enc = crypten.cryptensor(x)

# Create an AutogradCrypTensor from the CrypTensor
x_enc_auto = AutogradCrypTensor(x_enc)

For an example of backpropagation using AutogradCrypTensor, please see Tutorial 7.

class crypten.autograd_cryptensor.AutogradCrypTensor(tensor, requires_grad=True)

CrypTensor with support for autograd, akin to the Variable formerly used in PyTorch's autograd.

backward(grad_input=None, top_node=True)

Backpropagates gradients through the computation graph. Gradients are retained only in the leaf nodes of the graph.
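To illustrate the pattern this describes (a wrapper that records how each value was computed and accumulates gradients only at leaf nodes), here is a minimal plain-Python sketch. This is an illustrative analog only, not CrypTen's implementation; the class, attribute, and method names below are hypothetical.

```python
# Illustrative sketch of an autograd wrapper: NOT CrypTen's implementation.
class AutogradValue:
    """Wraps a value, records its parents in the computation graph, and
    stores a gradient on leaf nodes only."""

    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value
        self.parents = parents    # upstream AutogradValue nodes
        self.grad_fns = grad_fns  # one local-gradient function per parent
        self.grad = None          # populated only on leaf nodes

    @property
    def is_leaf(self):
        return not self.parents

    def mul(self, other):
        # d(a * b)/da = b and d(a * b)/db = a
        return AutogradValue(
            self.value * other.value,
            parents=(self, other),
            grad_fns=(lambda g: g * other.value, lambda g: g * self.value),
        )

    def backward(self, grad_input=1.0):
        if self.is_leaf:
            # Only leaf nodes accumulate gradients.
            self.grad = (self.grad or 0.0) + grad_input
            return
        # Interior nodes pass gradients upstream without retaining them.
        for parent, grad_fn in zip(self.parents, self.grad_fns):
            parent.backward(grad_fn(grad_input))


x = AutogradValue(3.0)
y = AutogradValue(4.0)
z = x.mul(y)   # z = x * y
z.backward()   # dz/dx = y = 4.0, dz/dy = x = 3.0; z itself keeps no grad
```

After backward(), x.grad is 4.0 and y.grad is 3.0, while z.grad remains None because z is an interior node of the graph, mirroring the leaf-only behavior described above.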

property tensor

Returns the underlying (non-autograd) CrypTensor.