Paddle Deep Learning Basics: Training Debugging and Optimization
Preface
In the previous article we discussed four different optimization algorithms. In this one we look at how to debug and optimize the training process itself. All of the code changes below are made on the convolutional neural network model used earlier.
Table of Contents
Network structure
Optimization ideas
- Compute the classification accuracy to observe how well the model is training
- Inspect the training process by printing selected parameters or intermediate results to spot potential problems
- Add a validation or test stage to evaluate the model more reliably
- Add a regularization term to avoid overfitting
- Visualize and analyze the training
1. Computing the model's classification accuracy
Computing the accuracy during training gives a fairly direct measure of how precise the model is.
The Paddle framework provides a built-in accuracy operator:
fluid.layers.accuracy(input=prediction, label=label)
The first argument is the prediction, the second is the ground-truth label. Below is the part of the code that needs to change:
def forward(self, inputs, label=None):
    conv1 = self.conv1(inputs)
    pool1 = self.pool1(conv1)
    conv2 = self.conv2(pool1)
    pool2 = self.pool2(conv2)
    pool2 = fluid.layers.reshape(pool2, [pool2.shape[0], -1])
    outputs = self.linear(pool2)
    if label is not None:                                         # added
        acc = fluid.layers.accuracy(input=outputs, label=label)   # added
        return outputs, acc                                       # added
    else:
        return outputs
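The surrounding training loop stays the same as before. For reference, a minimal sketch of such a loop is shown below; the SGD optimizer, the train_loader generator, and the batch shapes are assumptions for illustration rather than part of the original post. The output listing that follows was produced by a loop of roughly this shape.
with fluid.dygraph.guard():
    model = MNIST()
    model.train()
    optimizer = fluid.optimizer.SGDOptimizer(learning_rate=0.01,
                                             parameter_list=model.parameters())
    for epoch_id in range(5):
        # train_loader is an assumed generator yielding numpy batches:
        # images [N, 1, 28, 28] float32, labels [N, 1] int64
        for batch_id, (images, labels) in enumerate(train_loader()):
            images = fluid.dygraph.to_variable(images)
            labels = fluid.dygraph.to_variable(labels)
            predicts, acc = model(images, labels)      # forward now also returns accuracy
            loss = fluid.layers.cross_entropy(predicts, labels)
            avg_loss = fluid.layers.mean(loss)
            if batch_id % 200 == 0:
                print("epoch: {}, batch: {}, loss is: {}, acc is {}".format(
                    epoch_id, batch_id, avg_loss.numpy(), acc.numpy()))
            avg_loss.backward()
            optimizer.minimize(avg_loss)
            model.clear_gradients()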
Output:
epoch: 0, batch: 0, loss is: [2.796657], acc is [0.04]
epoch: 0, batch: 200, loss is: [0.50403804], acc is [0.88]
epoch: 0, batch: 400, loss is: [0.2659506], acc is [0.92]
epoch: 1, batch: 0, loss is: [0.22079289], acc is [0.92]
epoch: 1, batch: 200, loss is: [0.23240374], acc is [0.92]
epoch: 1, batch: 400, loss is: [0.16370663], acc is [0.95]
epoch: 2, batch: 0, loss is: [0.37291032], acc is [0.92]
epoch: 2, batch: 200, loss is: [0.23772442], acc is [0.92]
epoch: 2, batch: 400, loss is: [0.18071894], acc is [0.95]
epoch: 3, batch: 0, loss is: [0.15938215], acc is [0.95]
epoch: 3, batch: 200, loss is: [0.21112804], acc is [0.92]
epoch: 3, batch: 400, loss is: [0.05794979], acc is [0.99]
epoch: 4, batch: 0, loss is: [0.24466723], acc is [0.93]
epoch: 4, batch: 200, loss is: [0.14045799], acc is [0.96]
epoch: 4, batch: 400, loss is: [0.12366832], acc is [0.94]
2. Inspecting the training process
When training a model, the results sometimes differ a lot from what we expect, and we then want to see how the data changes during training. The Paddle framework supports exactly this; let's see how it is done:
class MNIST(fluid.dygraph.Layer):
    def __init__(self):
        super(MNIST, self).__init__()
        self.conv1 = Conv2D(num_channels=1, num_filters=20, filter_size=5, stride=1, padding=2, act='relu')
        self.pool1 = Pool2D(pool_size=2, pool_stride=2, pool_type='max')
        self.conv2 = Conv2D(num_channels=20, num_filters=20, filter_size=5, stride=1, padding=2, act='relu')
        self.pool2 = Pool2D(pool_size=2, pool_stride=2, pool_type='max')
        self.linear = Linear(input_dim=980, output_dim=10, act='softmax')

    def forward(self, inputs, label=None, check_shape=False, check_content=False):
        conv1 = self.conv1(inputs)
        pool1 = self.pool1(conv1)
        conv2 = self.conv2(pool1)
        pool2 = self.pool2(conv2)
        pool2_flat = fluid.layers.reshape(pool2, [pool2.shape[0], -1])
        outputs = self.linear(pool2_flat)

        if check_shape:
            # Print the hyper-parameters configured for each layer
            print("\n------------ hyper-parameters of each layer -------------")
            print("conv1-- kernel_size:{}, padding:{}, stride:{}".format(self.conv1.weight.shape, self.conv1._padding, self.conv1._stride))
            print("conv2-- kernel_size:{}, padding:{}, stride:{}".format(self.conv2.weight.shape, self.conv2._padding, self.conv2._stride))
            print("pool1-- pool_type:{}, pool_size:{}, pool_stride:{}".format(self.pool1._pool_type, self.pool1._pool_size, self.pool1._pool_stride))
            print("pool2-- pool_type:{}, pool_size:{}, pool_stride:{}".format(self.pool2._pool_type, self.pool2._pool_size, self.pool2._pool_stride))
            print("linear-- weight_size:{}, bias_size:{}, activation:{}".format(self.linear.weight.shape, self.linear.bias.shape, self.linear._act))
            # Print the output shape of each layer
            print("\n------------ output shape of each layer -------------")
            print("inputs_shape: {}".format(inputs.shape))
            print("outputs1_shape: {}".format(conv1.shape))
            print("outputs2_shape: {}".format(pool1.shape))
            print("outputs3_shape: {}".format(conv2.shape))
            print("outputs4_shape: {}".format(pool2.shape))
            print("outputs5_shape: {}".format(outputs.shape))

        if check_content:
            # Print the convolution kernels; there are many weights, so only a slice is printed
            print("\n########## print convolution layer's kernel ###############")
            print("conv1 params -- kernel weights:", self.conv1.weight[0][0])
            print("conv2 params -- kernel weights:", self.conv2.weight[0][0])
            # Pick a random channel of each convolution output to print
            idx1 = np.random.randint(0, conv1.shape[1])
            idx2 = np.random.randint(0, conv2.shape[1])
            # Print the conv/pool results for the first image of the batch only
            print("\nThe {}th channel of conv1 layer: ".format(idx1), conv1[0][idx1])
            print("The {}th channel of conv2 layer: ".format(idx2), conv2[0][idx2])
            print("The output of last layer:", outputs[0], '\n')

        if label is not None:
            acc = fluid.layers.accuracy(input=outputs, label=label)
            return outputs, acc
        else:
            return outputs
Output:
------------ hyper-parameters of each layer -------------
conv1-- kernel_size:[20, 1, 5, 5], padding:[2, 2], stride:[1, 1]
conv2-- kernel_size:[20, 20, 5, 5], padding:[2, 2], stride:[1, 1]
pool1-- pool_type:max, pool_size:[2, 2], pool_stride:[2, 2]
pool2-- pool_type:max, pool_size:[2, 2], pool_stride:[2, 2]
linear-- weight_size:[980, 10], bias_size:[10], activation:softmax
------------ output shape of each layer -------------
inputs_shape: [20, 1, 28, 28]
outputs1_shape: [20, 20, 28, 28]
outputs2_shape: [20, 20, 14, 14]
outputs3_shape: [20, 20, 14, 14]
outputs4_shape: [20, 20, 7, 7]
outputs5_shape: [20, 10]
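The debug output above is typically requested only once, because printing shapes and kernel contents on every batch floods the log. A small helper sketch of this pattern is below; it assumes `model` is the MNIST network above and that `images` and `labels` are dygraph variables for one batch.
def debug_first_batch(model, images, labels, epoch_id, batch_id):
    # Turn on the inspection switches only for the very first batch
    if epoch_id == 0 and batch_id == 0:
        return model(images, labels, check_shape=True, check_content=True)
    return model(images, labels)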
3. Adding validation or testing to evaluate the model better
When we obtain a dataset, we usually split it into three parts: a training set, a validation set, and a test set. A small evaluation sketch follows this list.
- Training set: used to fit the model parameters; this is the main work done during training.
- Validation set: used to choose the model's hyperparameters, such as adjustments to the network structure or the weight of the regularization term.
- Test set: used to simulate the model's real-world performance after deployment. Because the test set never takes part in any optimization or parameter training, its samples are completely unseen by the model. If the validation data is not used to tune the network structure or hyperparameters, validation and test results behave similarly, and both reflect the model's true performance.
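As a concrete illustration, a held-out set can be scored with a small helper like the one below. This is a sketch: the `data_loader` generator yielding numpy (images, labels) batches is an assumption, and the model's forward must return accuracy as shown in section 1.
def evaluate(model, data_loader):
    # Run the model over a held-out (validation or test) loader
    # and average the per-batch accuracies.
    model.eval()                      # switch to evaluation mode
    accuracies = []
    for images, labels in data_loader():
        images = fluid.dygraph.to_variable(images)
        labels = fluid.dygraph.to_variable(labels)
        _, acc = model(images, labels)
        accuracies.append(float(acc.numpy()))
    model.train()                     # switch back to training mode
    return sum(accuracies) / len(accuracies)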
4. Adding a regularization term to avoid overfitting
The overfitting phenomenon
For complex tasks with limited data but a powerful model, the model easily overfits: the loss is small on the training set but noticeably larger on the validation or test set, as shown in Figure 2.
[Figure 2: Overfitting - the training error keeps decreasing while the test error first decreases and then increases]
Conversely, if the model has a large loss on both the training and the test set, it is underfitting. Overfitting means the model is overly sensitive and has learned noise in the training data that does not reflect the true, generalizable pattern (one that carries over to the test set). Underfitting means the model is not powerful enough to fit even the known training samples, let alone the test samples. Underfitting is easy to observe and to fix: as long as the training loss is poor, keep switching to a stronger model. In practice, therefore, overfitting is the problem we really need to handle.
Causes of overfitting
Overfitting happens when the model is overly sensitive while the training data is too scarce or too noisy.
As shown in Figure 3, the ideal regression model is a gently sloped parabola. The underfitted model only fits a straight line and clearly misses the real pattern, while the overfitted model fits a curve with many turning points: it is overly sensitive and also fails to express the real pattern.
[Figure 3: Overfitted, ideal, and underfitted states of a regression model]
As shown in Figure 4, the ideal classification model is a semicircular boundary. The underfitted model uses a straight line as the decision boundary and clearly misses the true boundary, while the overfitted model produces a highly contorted boundary: it classifies every training sample correctly, but the concessions it makes for a few outlier samples are very unlikely to reflect the real pattern.
[Figure 4: Underfitted, ideal, and overfitted states of a classification model]
The regularization term
As mentioned above, overfitting occurs because the model is too complex and too sensitive to the data. When we cannot obtain more data, the only option is to reduce the model's complexity.
Concretely, we add a penalty on the size of the parameters to the optimization objective (the loss). The more parameters there are, or the larger their values, the larger the penalty. By tuning the weight of this penalty, the model strikes a balance between "minimizing the training loss" and "keeping its generalization ability", where generalization means the model still works on samples it has never seen. The regularization term does increase the loss on the training set.
Code
optimizer = fluid.optimizer.AdamOptimizer(learning_rate=0.01,
                                          regularization=fluid.regularizer.L2Decay(regularization_coeff=0.1),
                                          parameter_list=model.parameters())
The key part is this argument:
regularization=fluid.regularizer.L2Decay(regularization_coeff=0.1)
The weight of the regularization term is controlled by the value of regularization_coeff.
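For intuition, L2 weight decay adds a penalty proportional to the squared weights to the objective, so larger weights cost more. A rough NumPy illustration follows; the 0.5 factor and the exact scaling applied by Paddle's L2Decay should be treated as an assumption of this sketch rather than a guarantee.
import numpy as np

def l2_penalty(weight_arrays, regularization_coeff=0.1):
    # Extra loss contributed by an L2 penalty over all trainable weights:
    # roughly 0.5 * coeff * (sum of squared weights).
    return 0.5 * regularization_coeff * sum(float((np.asarray(w) ** 2).sum()) for w in weight_arrays)

# Example with two small weight matrices:
penalty = l2_penalty([np.ones((3, 3)), np.full((2, 2), 2.0)], regularization_coeff=0.1)
print(penalty)  # 0.5 * 0.1 * (9 + 16) = 1.25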
5. Visualization libraries
While training a model we often need to watch its evaluation metrics and analyze the optimization process to make sure training is effective. Two tools are available for visual analysis: the Matplotlib library and tb-paddle.
- Matplotlib: the most widely used 2D plotting library in Python. It provides a plotting interface closely modeled on MATLAB's function style, and drawing with the lightweight plt interface is very simple.
- tb-paddle: for a more specialized plotting tool, try tb-paddle. It can display the computation graph built by the Paddle framework at run time, how various metrics change over time, and the data used during training.
Plotting with Matplotlib
Typically we store the training loss and the iteration count in lists, then visualize them with the relevant Matplotlib functions.
# Inside the training loop: record the iteration count and loss every 100 batches
# (iters and losses start as empty lists, iter starts at 0 before training)
if batch_id % 100 == 0:
    print("epoch: {}, batch: {}, loss is: {}, acc is {}".format(epoch_id, batch_id, avg_loss.numpy(), acc.numpy()))
    iters.append(iter)
    losses.append(avg_loss.numpy())
    iter = iter + 100

# After training: plot the loss curve over the course of training
plt.figure()
plt.title("train loss", fontsize=24)
plt.xlabel("iter", fontsize=14)
plt.ylabel("loss", fontsize=14)
plt.plot(iters, losses, color='red', label='train loss')
plt.grid()
plt.show()
[Figure: the "train loss" curve (loss vs. iter) produced by the code above]
tb-paddle
- Step 1: import the tb_paddle library and define where the plotting data will be stored (used again in step 3); in this example the path is "log/data".
from tb_paddle import SummaryWriter
data_writer = SummaryWriter(logdir="log/data")
- Step 2: insert logging statements into the training loop. After every 100 batches, store the current loss as a new data point (a scalar_x-to-loss pair) in the location configured in step 1. The variable scalar_x records how many batches have been trained and serves as the X-axis of the plot.
data_writer.add_scalar("train/loss", avg_loss.numpy(), scalar_x)
data_writer.add_scalar("train/accuracy", avg_acc.numpy(), scalar_x)
scalar_x = scalar_x + 100
- Step 3: launch TensorBoard from the command line.
Start TensorBoard with "tensorboard --logdir [path to the data folder]". Once it is running, the command line prints the URL at which the plots can be viewed in a browser.
$ tensorboard --logdir log/data
- Step 4: open the browser and view the plots, as shown in Figure 6.
The URL is printed after the launch command in step 3 (for example "TensorBoard 2.0.0 at http://localhost:6006/"). Entering it into the browser's address bar and refreshing shows the plots on the right, plus a control panel on the left that can adjust many plotting details.
Note: TensorBoard must be installed, otherwise this page cannot be opened.
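Putting steps 1 and 2 together, the logging calls sit in the training loop roughly as below. The snippet is a self-contained sketch that logs dummy loss values in place of avg_loss.numpy() from the real loop, so it can be run on its own to produce an event file for TensorBoard.
from tb_paddle import SummaryWriter
import numpy as np

data_writer = SummaryWriter(logdir="log/data")
scalar_x = 0
for step in range(10):
    dummy_loss = float(np.exp(-step / 4.0))    # stand-in for the real training loss
    data_writer.add_scalar("train/loss", dummy_loss, scalar_x)
    scalar_x = scalar_x + 100
data_writer.close()                            # flush the event file so TensorBoard can read it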