How to handle None in tf.clip_by_global_norm?

I have read in answers to this question here that tf.clip_by_global_norm() handles None values by simply ignoring them (per @danijar's comment on the answer), but when I try to apply it I seem to be doing something wrong, as it throws

ValueError: None values not supported.

tf.reset_default_graph()
z = tf.get_variable(name='z', shape=[1])
b = tf.get_variable('b', [1])
c = b*b - 2*b + 1
optimizer = tf.train.AdamOptimizer(0.1)
gradients, variables = zip(*optimizer.compute_gradients(c))
gradients = tf.clip_by_global_norm(gradients, 2.5)
train_op = optimizer.apply_gradients(zip(gradients, variables))

Can somebody please tell me what I am doing wrong, or whether tf.clip_by_global_norm() does not handle None gradients and I have to take care of them manually?

The official documentation seems to agree with @danijar's comment; see here.

Any of the entries of t_list that are of type None are ignored.
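For context, the None gradient in the snippet above comes from z: it is declared but never used in the loss c, so compute_gradients returns a (None, z) pair. A minimal sketch that makes this visible (assuming TensorFlow 1.x, as in the question):

import tensorflow as tf

tf.reset_default_graph()
z = tf.get_variable(name='z', shape=[1])
b = tf.get_variable('b', [1])
c = b*b - 2*b + 1
optimizer = tf.train.AdamOptimizer(0.1)

# z does not appear in c, so its gradient is None;
# b does, so its gradient is a Tensor.
for grad, var in optimizer.compute_gradients(c):
    print(var.name, grad)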

Best answer

There's a small problem in your code: you're assigning the return value of tf.clip_by_global_norm to a single variable, when this function returns a pair of values.

The documentation says:

Returns:

list_clipped: A list of Tensors of the same type as list_t.

global_norm: A 0-D (scalar) Tensor representing the global norm.

Hence, the problem arises in the next line, when you try to apply the gradients to the variables.

You can easily fix your code by ignoring the returned global_norm value:

gradients, _ = tf.clip_by_global_norm(gradients, 2.5)
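Putting it together, a corrected version of the question's snippet (a sketch assuming TensorFlow 1.x, where apply_gradients skips (None, variable) pairs, so z's None gradient needs no extra handling):

import tensorflow as tf

tf.reset_default_graph()
z = tf.get_variable(name='z', shape=[1])
b = tf.get_variable('b', [1])
c = b*b - 2*b + 1
optimizer = tf.train.AdamOptimizer(0.1)
gradients, variables = zip(*optimizer.compute_gradients(c))

# tf.clip_by_global_norm returns (list_clipped, global_norm);
# unpack the pair and discard the global norm.
gradients, _ = tf.clip_by_global_norm(gradients, 2.5)
train_op = optimizer.apply_gradients(zip(gradients, variables))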
