Location Sensitive Attention and Attention Weights
Location Sensitive Attention is a mechanism that helps the decoder focus on different parts of the input sequence as it generates each output. Specifically, it takes the previous decoder state and a representation of the encoder outputs (the "keys") as input, and produces a set of weights indicating which parts of the encoder output are most relevant to the current decoding step. What makes it location-sensitive is an additional time-varying location feature, computed from the attention weights of the previous step, which helps the decoder move its attention across the input over time instead of getting stuck or skipping ahead.
The Attention Weights are the set of weights produced by the Location Sensitive Attention mechanism; they indicate the relative importance of different parts of the encoder output for the current decoding step. These weights are used to compute a weighted sum of the encoder outputs (the context vector), which is then fed to the decoder.
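The steps above can be sketched in NumPy. This is a minimal illustration, not a trained model: all dimensions and parameter matrices (`W`, `V`, `U`, `conv_filters`, `v`) are hypothetical random values standing in for learned weights, following the additive-attention-plus-location-features scheme described in the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only)
T, d_enc, d_dec, d_attn = 6, 8, 8, 16   # input steps, encoder/decoder/attention dims
n_filters, width = 4, 3                 # location-feature convolution

encoder_outputs = rng.standard_normal((T, d_enc))  # the "keys", one per input step
query = rng.standard_normal(d_dec)                 # current decoder state
prev_weights = np.full(T, 1.0 / T)                 # attention weights from the previous step

# Stand-ins for learned parameters (trained jointly in a real model)
W = rng.standard_normal((d_dec, d_attn))       # query projection
V = rng.standard_normal((d_enc, d_attn))       # key projection
U = rng.standard_normal((n_filters, d_attn))   # location-feature projection
conv_filters = rng.standard_normal((n_filters, width))
v = rng.standard_normal(d_attn)                # scoring vector

# 1. Location features: convolve the previous attention weights so each
#    input step carries a summary of where the decoder attended last time.
loc = np.stack(
    [np.convolve(prev_weights, f, mode="same") for f in conv_filters],
    axis=1,
)  # shape (T, n_filters)

# 2. Additive energies with the extra location term.
energies = np.tanh(query @ W + encoder_outputs @ V + loc @ U) @ v  # shape (T,)

# 3. Softmax -> attention weights; weighted sum -> context vector.
weights = np.exp(energies - energies.max())
weights /= weights.sum()                 # the "attention weights", summing to 1
context = weights @ encoder_outputs      # context vector fed to the decoder

print(weights.round(3))
```

In a real decoder this runs once per output step, with `weights` feeding back in as `prev_weights` for the next step, which is exactly how the mechanism tracks its position in the input.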