Hi, when I use autograd, is it possible to see its gradient function? In other words, is it possible to see the derivative of that function, or the computational graph it builds?
For example, I want to see the grad_tanh function:
import autograd.numpy as np   # Thinly-wrapped numpy
from autograd import grad     # The only autograd function you may ever need

def tanh(x):                  # Define a function
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

grad_tanh = grad(tanh)        # Obtain its gradient function
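Calling grad_tanh only gives back numbers (the point 1.0 below is chosen arbitrarily, values approximate), not the formula or graph behind them:

print(grad_tanh(1.0))                          # ~0.419974
print((tanh(1.0001) - tanh(0.9999)) / 0.0002)  # finite-difference check, ~0.419974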
Thank you
I do not know whether this is possible with autograd, but if your function is simple enough, https://www.matrixcalculus.org/ can give you both the mathematical formula and a Python program to compute the gradient.
Using your function, entered on that site as (vector(1) - exp(-2 * x)) ./ (vector(1) + exp(-2 * x)):
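The site's generated output is not reproduced here, but for this elementwise map the Jacobian works out to a diagonal matrix with entries 4*exp(-2x) / (1 + exp(-2x))**2, i.e. 1 - tanh(x)**2. A minimal NumPy sketch (my own names, not the site's generated code):

import numpy as np

def jacobian_tanh(x):
    # Diagonal Jacobian of the elementwise map (1 - exp(-2x)) / (1 + exp(-2x))
    y = np.exp(-2.0 * x)
    return np.diag(4.0 * y / (1.0 + y) ** 2)

x = np.array([0.5, 1.0, 1.5])
print(jacobian_tanh(x))       # 3x3 diagonal matrix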
I used the Jacobian here since grad only works for scalar-valued functions, but I wanted to show a more general case. You can also select x is a [scalar] from the list on the website to get the gradient for a scalar value.
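For comparison, autograd itself can also produce these Jacobian values numerically (though it does not expose a symbolic formula). A short sketch reusing the tanh definition from the original post:

import autograd.numpy as np
from autograd import jacobian

def tanh(x):
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

jac_tanh = jacobian(tanh)                    # handles vector inputs/outputs
print(jac_tanh(np.array([0.5, 1.0, 1.5])))   # diagonal 3x3 Jacobian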