torch.no_grad() in TensorFlow

TensorFlow has no direct equivalent of PyTorch's `torch.no_grad()` context manager, but two mechanisms cover the same use cases: `GradientTape.stop_recording()` temporarily pauses recording on a tape, and `tf.stop_gradient()` treats a tensor as a constant during backpropagation.
With `tape.stop_recording()`, operations inside the block still execute, but the tape does not record them, so no gradient flows through them:

```python
import tensorflow as tf

x = tf.Variable(2.0)
y = tf.Variable(3.0)

with tf.GradientTape() as t:
    x_sq = x * x
    with t.stop_recording():
        y_sq = y * y  # executed, but not recorded on the tape
    z = x_sq + y_sq

grad = t.gradient(z, {'x': x, 'y': y})
print('dz/dx:', grad['x'])  # 2*x => 4
print('dz/dy:', grad['y'])  # None: y_sq was not recorded
```
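A common reason to pause recording is to save memory when doing side computations mid-tape. A minimal sketch (following the pattern in the `GradientTape.stop_recording` docstring): computing an intermediate gradient inside the paused block, so the gradient computation itself is not traced.

```python
import tensorflow as tf

x = tf.Variable(4.0)

with tf.GradientTape() as tape:
    y = x ** 2
    with tape.stop_recording():
        # The gradient computation below is not itself taped,
        # which avoids recording extra operations.
        dy_dx = tape.gradient(y, x)

print(dy_dx.numpy())  # 2*x => 8.0
```

Note that calling `tape.gradient()` consumes a non-persistent tape, so here it is called only once; pass `persistent=True` to `tf.GradientTape` if you need further gradient calls afterwards.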
With `tf.stop_gradient()`, a single tensor is treated as a constant during backpropagation; everything else is recorded normally:

```python
import tensorflow as tf

x = tf.Variable(2.0)
y = tf.Variable(3.0)

with tf.GradientTape() as t:
    y_sq = y ** 2
    z = x ** 2 + tf.stop_gradient(y_sq)  # y_sq acts as a constant

grad = t.gradient(z, {'x': x, 'y': y})
print('dz/dx:', grad['x'])  # 2*x => 4
print('dz/dy:', grad['y'])  # None: gradient blocked at stop_gradient
```
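Unlike `stop_recording()`, `tf.stop_gradient()` only blocks the path through one specific tensor, so gradients still flow along other paths that use the same variable. A minimal sketch of this distinction, with variable names chosen here for illustration:

```python
import tensorflow as tf

w = tf.Variable(1.0)

with tf.GradientTape() as tape:
    pred = w * 3.0
    target = tf.stop_gradient(pred * 2.0)  # treated as a constant (value 6.0)
    loss = (target - pred) ** 2

# Only the direct path through `pred` contributes:
# d(loss)/dw = 2 * (target - pred) * d(-pred)/dw = 2 * 3.0 * (-3.0) = -18.0
grad = tape.gradient(loss, w)
print(grad.numpy())  # -18.0
```

This "frozen copy" pattern is why `tf.stop_gradient()` appears in places like reinforcement-learning target values, where one branch of a computation must not be trained through.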