I'm learning tensorflow and I'm having some trouble understanding how to regularize the cost function. I've looked and I'm finding a lot of different answers. Could someone please tell me how to regularize the cost function?
I took Andrew Ng's machine learning course on Coursera, and one thing seems to be different when I look on forums: it seems like most people regularize each weight individually as well as regularizing the final cost function, but the course doesn't mention that. Which one is correct?
Answer: In TensorFlow, L2 (Tikhonov) regularization with regularization parameter lambda_ could be written like this:
# Assuming you defined a graph, placeholders and a logits layer.
# Using cross-entropy loss:
lambda_ = 0.1
xentropy = tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=logits)
ys = tf.reduce_mean(xentropy)
l2_norms = [tf.nn.l2_loss(v) for v in tf.trainable_variables()]
l2_norm = tf.reduce_sum(l2_norms)
cost = ys + lambda_ * l2_norm
# From here, define the optimizer, the train operation, and train ... :-)
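To see what the graph above actually computes, here is a minimal NumPy sketch with hypothetical toy values (the weights and the data-loss number are made up for illustration). It relies on the fact that tf.nn.l2_loss(v) computes sum(v ** 2) / 2, so the regularized cost is the mean cross entropy plus lambda_ times half the sum of all squared weights:

```python
import numpy as np

# Hypothetical toy weights standing in for tf.trainable_variables()
weights = [np.array([1.0, -2.0]), np.array([[0.5], [0.5]])]

lambda_ = 0.1
data_loss = 0.7  # stand-in for the mean cross-entropy term (ys above)

# tf.nn.l2_loss(v) is sum(v ** 2) / 2, computed per variable
l2_norms = [np.sum(v ** 2) / 2.0 for v in weights]
l2_norm = np.sum(l2_norms)  # 2.5 + 0.25 = 2.75

cost = data_loss + lambda_ * l2_norm
print(cost)  # 0.7 + 0.1 * 2.75 = 0.975
```

Note this also answers the question in the post: you don't regularize "each weight" separately from the cost; you add one penalty term, built from all trainable weights, to the single cost that the optimizer minimizes.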