python - Computing Jacobian matrix in TensorFlow

I want to compute a Jacobian matrix with TensorFlow.

What I have:

import tensorflow as tf

def compute_grads(fn, vars, data_num):
    grads = []
    for n in range(data_num):
        for v in vars:
            # Gradient of the n-th row of fn with respect to variable v
            grads.append(tf.gradients(tf.slice(fn, [n, 0], [1, 1]), v)[0])
    return tf.reshape(tf.stack(grads), shape=[data_num, -1])

fn is the per-example loss tensor (one row per data point), vars are all trainable variables, and data_num is the number of data points.
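
For context, here is a minimal sketch of how I call it; the model and loss below are placeholders, not my real setup:

import tensorflow as tf

# Placeholder model: a tiny linear regression whose per-example loss
# has shape [data_num, 1], matching what compute_grads slices into.
data_num = 4
x = tf.placeholder(tf.float32, shape=[data_num, 3])
y = tf.placeholder(tf.float32, shape=[data_num, 1])
w = tf.Variable(tf.zeros([3, 1]))
fn = tf.square(tf.matmul(x, w) - y)

jac = compute_grads(fn, tf.trainable_variables(), data_num)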

But as the number of data points grows, compute_grads takes an enormous amount of time to run. Any ideas?


1 Reply

Assuming that X and Y are TensorFlow tensors and that Y depends on X:

from tensorflow.python.ops.parallel_for.gradients import jacobian

J = jacobian(Y, X)

The result has the shape Y.shape + X.shape and provides the partial derivative of each element of Y with respect to each element of X.
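
For instance, a minimal sketch assuming TensorFlow 1.x graph mode; the small matmul below is only an illustration:

import tensorflow as tf
from tensorflow.python.ops.parallel_for.gradients import jacobian

# Toy example: Y = X @ W, so the Jacobian has shape
# Y.shape + X.shape = (2, 2) + (2, 2) = (2, 2, 2, 2).
X = tf.constant([[1.0, 2.0], [3.0, 4.0]])
W = tf.constant([[5.0, 6.0], [7.0, 8.0]])
Y = tf.matmul(X, W)

J = jacobian(Y, X)

with tf.Session() as sess:
    print(sess.run(J).shape)  # (2, 2, 2, 2)

Because jacobian is built on parallel_for, it vectorizes the per-element gradient computations instead of issuing one tf.gradients call per output element, so it scales much better than a manual Python loop.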

