Is .data still useful in PyTorch?

Problem Description

I'm new to PyTorch. I've read a lot of PyTorch code that makes heavy use of a tensor's .data member, but searching for .data in the official documentation and on Google turns up very little. I assume .data holds the data in the tensor, but when do we need it and when not?

Recommended Answer

.data was an attribute of Variable (an object wrapping a Tensor with history tracking, e.g. for automatic updates), not of Tensor. In fact, .data gave access to the Variable's underlying Tensor.

However, since PyTorch version 0.4.0, Variable and Tensor have been merged (into an updated Tensor structure), so .data disappeared along with the previous Variable object (Variable is still there for backward compatibility, but is deprecated).
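A minimal sketch of what the merge means in practice: wrapping a tensor in the deprecated Variable class today just gives you back a plain Tensor.

    import torch
    from torch.autograd import Variable

    t = torch.ones(2)
    v = Variable(t, requires_grad=True)  # still accepted, but deprecated
    print(type(v))           # <class 'torch.Tensor'> -- no separate Variable class anymore
    print(v.requires_grad)   # True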

A paragraph from the release notes for version 0.4.0 (I recommend reading the whole section about the Variable/Tensor updates):

What about .data?

.data was the primary way to get the underlying Tensor from a Variable. After this merge, calling y = x.data still has similar semantics. So y will be a Tensor that shares the same data with x, is unrelated to the computation history of x, and has requires_grad=False.
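For example (a minimal sketch; the tensor values are arbitrary):

    import torch

    x = torch.ones(3, requires_grad=True)
    y = x.data               # y shares storage with x
    print(y.requires_grad)   # False -- y is cut off from x's history
    y[0] = 5.0               # modifies x too, but autograd never sees it
    print(x)                 # tensor([5., 1., 1.], requires_grad=True)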

However, .data can be unsafe in some cases. Any changes to x.data wouldn't be tracked by autograd, and the computed gradients would be incorrect if x is needed in a backward pass. A safer alternative is to use x.detach(), which also returns a Tensor that shares data and has requires_grad=False, but will have its in-place changes reported by autograd if x is needed in backward.
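The difference is easy to demonstrate. In the sketch below (values chosen arbitrarily), mutating the output through .data makes backward() silently compute a wrong gradient, while the same mutation through .detach() makes backward() raise a RuntimeError instead:

    import torch

    # Unsafe: an in-place change via .data is invisible to autograd.
    x = torch.tensor([2.0], requires_grad=True)
    out = x.sigmoid()        # sigmoid's backward reuses the output value
    out.data.zero_()         # mutate the saved output behind autograd's back
    out.sum().backward()     # runs without complaint
    print(x.grad)            # tensor([0.]) -- wrong; sigmoid'(2) is about 0.105

    # Safer: the same change via .detach() bumps the tensor's version counter,
    # so autograd notices the modification and refuses to compute a bad gradient.
    x = torch.tensor([2.0], requires_grad=True)
    out = x.sigmoid()
    out.detach().zero_()
    try:
        out.sum().backward()
    except RuntimeError as e:
        print("caught:", e)  # "... modified by an inplace operation"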
