:py:mod:`kosmos.ml.models.vqc.circuit.qiskit_circuit.autograd_function`
=======================================================================

.. py:module:: kosmos.ml.models.vqc.circuit.qiskit_circuit.autograd_function

Classes
-------

.. py:class:: AutogradCtx

   Bases: :py:class:`Protocol`

   Autograd context for the :py:class:`QiskitAutogradFunction`.

----

.. py:class:: QiskitAutogradFunction(*args, **kwargs)

   Bases: :py:class:`torch.autograd.Function`

   Custom autograd bridge between PyTorch and Qiskit for variational quantum circuits.

   .. rubric:: Methods

   .. py:method:: forward(ctx: torch.autograd.function.FunctionCtx, x: torch.Tensor, weights: torch.Tensor, evaluator: collections.abc.Callable, gradient_fn: collections.abc.Callable) -> torch.Tensor

      Perform the forward pass by evaluating the quantum circuit via the provided evaluator.

      :param ctx: Autograd context to save information for the backward pass.
      :type ctx: FunctionCtx
      :param x: Input batch of shape (B, input_dim).
      :type x: torch.Tensor
      :param weights: Trainable weights of the circuit.
      :type weights: torch.Tensor
      :param evaluator: Callable that evaluates the quantum circuit and returns the output values for the given inputs and weights.
      :type evaluator: Callable
      :param gradient_fn: Callable computing gradients of the circuit output with respect to its parameters, e.g., via the parameter-shift rule.
      :type gradient_fn: Callable
      :returns: Output batch of shape (B, output_dim).
      :rtype: torch.Tensor

   .. py:method:: backward(ctx: AutogradCtx, *grad_outputs: torch.Tensor) -> tuple[None, torch.Tensor, None, None]

      Compute gradients using the provided quantum gradient function.

      :param ctx: Autograd context with saved tensors from the forward pass.
      :type ctx: AutogradCtx
      :param \*grad_outputs: Gradient of the loss with respect to the forward output, shape (B, output_dim).
      :type \*grad_outputs: torch.Tensor
      :returns: Gradients for each forward input. Only the gradient with respect to ``weights`` is returned; the gradients for ``x``, ``evaluator``, and ``gradient_fn`` are ``None``.
      :rtype: tuple[None, torch.Tensor, None, None]
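The forward/backward contract above can be sketched with a minimal stand-in. The names ``ToyQuantumFunction``, ``evaluator``, and ``gradient_fn`` below are hypothetical: the real evaluator and gradient function are supplied by the surrounding Qiskit circuit code, so this sketch substitutes a toy analytic "circuit" (one cosine "rotation" per feature) whose parameter-shift gradient is exact.

```python
import torch

class ToyQuantumFunction(torch.autograd.Function):
    """Hypothetical sketch of the QiskitAutogradFunction contract."""

    @staticmethod
    def forward(ctx, x, weights, evaluator, gradient_fn):
        # Save what backward needs; the callable is stashed on ctx directly.
        ctx.save_for_backward(x, weights)
        ctx.gradient_fn = gradient_fn
        return evaluator(x, weights)            # (B, output_dim)

    @staticmethod
    def backward(ctx, *grad_outputs):
        x, weights = ctx.saved_tensors
        grad_out = grad_outputs[0]              # (B, output_dim)
        jac = ctx.gradient_fn(x, weights)       # (B, output_dim, n_weights)
        # Chain rule: contract the upstream gradient with the Jacobian.
        grad_weights = torch.einsum("bo,bow->w", grad_out, jac)
        # Only `weights` receives a gradient; x, evaluator, gradient_fn get None.
        return None, grad_weights, None, None

def evaluator(x, w):
    # Toy stand-in for a circuit expectation value: one "rotation" per feature.
    return torch.cos(x + w)                     # (B, d), so output_dim == d

def gradient_fn(x, w):
    # Parameter-shift rule: df/dtheta = (f(theta + pi/2) - f(theta - pi/2)) / 2,
    # exact for sinusoidal parameter dependence such as cos(x + w).
    shift = torch.pi / 2
    cols = []
    for j in range(w.numel()):
        e = torch.zeros_like(w)
        e[j] = shift
        cols.append((evaluator(x, w + e) - evaluator(x, w - e)) / 2)
    return torch.stack(cols, dim=-1)            # (B, output_dim, n_weights)

x = torch.randn(4, 3)
w = torch.randn(3, requires_grad=True)
out = ToyQuantumFunction.apply(x, w, evaluator, gradient_fn)
out.sum().backward()
# w.grad now matches the analytic gradient of cos(x + w).sum().
```

Invoking the bridge through ``.apply(...)`` (rather than calling ``forward`` directly) is what registers the custom ``backward`` with PyTorch's autograd graph, so gradients flow into ``weights`` even though the circuit evaluation itself is opaque to autograd.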