## Cross-Entropy by Example: TensorFlow's Cross-Entropy Functions


### 1. Overview

- tf.nn.sparse_softmax_cross_entropy_with_logits
- tf.nn.softmax_cross_entropy_with_logits

In both functions, the `labels` argument carries the training-data targets; the two differ only in the form `labels` takes.
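The difference comes down to how the target is encoded. As a minimal sketch (using numpy's `eye` to build the one-hot vector; nothing here is TensorFlow-specific):

```python
import numpy as np

# The same target written in the two label formats the two functions expect.
sparse_label = 1                         # class index (for the sparse variant)
one_hot_label = np.eye(3)[sparse_label]  # one-hot vector (for the dense variant)
print(one_hot_label)  # [0. 1. 0.]
```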

### 2. Example

| item  | class 0 | class 1 | class 2 |
|-------|---------|---------|---------|
| label | 0       | 1       | 0       |
| logit | 0.1     | 0.8     | 0.2     |

### 3. Formulas

#### 3.1. softmax

softmax transforms an array $[a, b, c]$ into a probability distribution. If the result is $[A, B, C]$, it satisfies:

1. $A$, $B$, $C$ each lie between 0 and 1, and
2. $A + B + C = 1$.

$$softmax = \left[ \frac{e^a}{e^a + e^b + e^c}, \; \frac{e^b}{e^a + e^b + e^c}, \; \frac{e^c}{e^a + e^b + e^c} \right]$$
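Applied to the logits from the example above, the formula can be checked with a few lines of numpy (this is just the expression above, vectorized):

```python
import numpy as np

x = np.array([0.1, 0.8, 0.2])
p = np.exp(x) / np.exp(x).sum()  # the softmax formula above
print(p)        # three values, each in (0, 1); the largest logit gets the largest probability
print(p.sum())  # 1.0 (up to floating-point rounding)
```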

#### 3.2. cross entropy

1. Compute the softmax of the logits: $softmax([a,b,c]) = [S_a, S_b, S_c]$;
2. Take the natural (base-$e$) logarithm of the softmax: $\log([S_a, S_b, S_c]) = [LS_a, LS_b, LS_c]$;
3. With the label $[l_1, l_2, l_3]$, compute $cross\ entropy = -(l_1 \times LS_a + l_2 \times LS_b + l_3 \times LS_c)$.
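The three steps, applied to the example row (`logit = [0.1, 0.8, 0.2]`, `label = [0, 1, 0]`), can be traced directly in numpy:

```python
import numpy as np

logit = np.array([0.1, 0.8, 0.2])
label = np.array([0.0, 1.0, 0.0])

s = np.exp(logit) / np.exp(logit).sum()  # step 1: softmax
ls = np.log(s)                           # step 2: natural log
ce = -np.sum(label * ls)                 # step 3: negative weighted sum
print(ce)  # ≈ 0.7156
```

Because the label is one-hot, the sum reduces to $-LS_b$, the negative log-probability the model assigns to the true class.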

### 4. Code

NOTE: The two functions from the overview differ in the form of the `labels` argument. For a target such as `[0, 1, 0]`, the `sparse`-prefixed function takes the class index (here `1`, e.g. via `tf.argmax`), while the other takes the one-hot vector itself.

```python
#!/usr/bin/env python3

import numpy as np
import tensorflow as tf_old

tf = tf_old.compat.v1
tf.disable_eager_execution()  # needed so the v1 graph/Session API works under TF2

logit = [0.1, 0.8, 0.2]
label = [0.0, 1.0, 0.0]

def softmax(X):
    # exponentiate each entry, then normalize by the total
    total_exp = 0.0
    xLen = len(X)
    y = np.zeros(xLen)
    for i in range(xLen):
        exp = np.exp(X[i])
        y[i] = exp
        total_exp += exp
    y = y / total_exp
    return y

def cross_entropy(logit_, label_):
    # -sum(label * log(softmax(logit)))
    the_label = np.asarray(label_)
    current_output_softmax = softmax(logit_)
    current_output_softmax_log = np.log(current_output_softmax)
    result = -np.sum(the_label * current_output_softmax_log)
    return result

def main():
    y = tf.constant([logit])
    y_ = tf.constant([label])
    # sparse variant: labels are class indices
    cross_entropy1 = tf.nn.sparse_softmax_cross_entropy_with_logits(
        logits=y, labels=tf.argmax(y_, 1))
    # dense variant: labels are one-hot vectors
    cross_entropy2 = tf.nn.softmax_cross_entropy_with_logits(
        logits=y, labels=y_)

    with tf.Session() as sess:
        tf.global_variables_initializer().run()
        print(sess.run(cross_entropy1))
        print(sess.run(cross_entropy2))

if __name__ == '__main__':
    main()
    # main prints:
    # [0.7155919]
    # [0.7155919]
    print(cross_entropy(logit, label))
    # prints 0.7155918732927512 (positive: the minus sign is already applied above)
```

