Accessing neural network weights and neuron activations


After training a network using Keras:

I want to access the final trained weights of the network in some order.

I want to know the neuron activation values for every input passed. For example, after training, if I pass X as my input to the network, I want to know the neuron activation values for that X for every neuron in the network.

Does Keras provide API access to these things? I want to do further analysis based on the neuron activation values.

Update: I know I can do this purely in Theano, but Theano requires more low-level coding. And, since Keras is built on top of Theano, I think there could be a way to do this?

If Keras can't do this, then among Tensorflow and Caffe, which can? Keras is the easiest to use, followed by Tensorflow/Caffe, but I don't know which of these provides the network access I need. The last option for me would be to drop down to Theano, but I think it'd be more time-consuming to build a deep CNN with Theano.

Accepted answer


This is covered in the Keras FAQ: you basically want to compute the activations of each layer, which you can do with this code:

from keras import backend as K

n = 3  # the layer number

# With a Sequential model:
get_nth_layer_output = K.function([model.layers[0].input],
                                  [model.layers[n].output])
layer_output = get_nth_layer_output([X])[0]

Unfortunately you would need to compile and run a function for each layer, but this should be straightforward.
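To make concrete what each per-layer function returns, here is a backend-free numpy sketch (not from the original answer) of a forward pass that records every layer's activations. The network, weights, and `relu` choice are all made up for illustration; the `(W, b)` pairs mirror what `get_weights()` returns for Dense layers.

```python
import numpy as np

def relu(z):
    # Element-wise ReLU activation.
    return np.maximum(z, 0.0)

def forward_with_activations(X, weights):
    """Run a dense net layer by layer, recording each layer's activations.

    `weights` is a list of (W, b) pairs, mirroring the [kernel, bias]
    list that model.layers[n].get_weights() returns for a Dense layer.
    """
    activations = []
    a = X
    for W, b in weights:
        a = relu(a @ W + b)   # one layer's pre-activation, then ReLU
        activations.append(a)
    return activations

# Made-up weights for a 2 -> 3 -> 1 network (illustrative only).
rng = np.random.default_rng(0)
weights = [(rng.standard_normal((2, 3)), np.zeros(3)),
           (rng.standard_normal((3, 1)), np.zeros(1))]

X = rng.standard_normal((5, 2))            # a batch of 5 inputs
acts = forward_with_activations(X, weights)
print([a.shape for a in acts])             # [(5, 3), (5, 1)]
```

Each entry of `acts` plays the role of one `layer_output` from the answer's `K.function` call: the activation of every neuron in that layer, for every input in the batch.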

To get the weights, you can call get_weights() on any layer.

nth_weights = model.layers[n].get_weights()
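As a minimal sketch of what that call returns, assuming the tf.keras API and a toy 2→3→1 Dense model standing in for the trained network (the model here is made up for illustration): for a Dense layer, `get_weights()` returns a `[kernel, bias]` list of numpy arrays.

```python
import numpy as np
from tensorflow import keras

# Toy model standing in for the trained network (illustrative only).
model = keras.Sequential([
    keras.layers.Input(shape=(2,)),
    keras.layers.Dense(3, activation='relu'),
    keras.layers.Dense(1),
])

n = 0  # the layer number
params = model.layers[n].get_weights()  # [kernel, bias] for a Dense layer
print([p.shape for p in params])        # [(2, 3), (3,)]
```

The kernel has shape `(input_dim, units)` and the bias has shape `(units,)`, so iterating over `model.layers` gives you the full set of trained weights in layer order.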
