Batch

In the batch method, each weight update is calculated over all errors of the training data, and the average of those weight updates is used to adjust the weights. This method uses all of the training data and updates the weights only once per pass. Figure 2-16 illustrates the weight update calculation and training process of the batch method.

The batch method calculates the weight update as:

Δw_ij = (1/N) Σ_{k=1}^{N} Δw_ij(k)

where N is the number of training data points and Δw_ij(k) is the weight update from the k-th data point. Because of this averaged weight update calculation, the batch method consumes a significant amount of time for training.

Mini Batch

The mini batch method is a blend of the SGD and batch methods. It selects a part of the training dataset and applies the batch method to that subset: it calculates the weight updates of the selected data points and trains the neural network with their averaged weight update. For example, if 20 arbitrary data points are selected out of 100 training data points, the batch method is applied to those 20 points. In this case, a total of five weight adjustments completes the training process for all the data points (5 = 100/20). Figure 2-17 shows how the mini batch scheme selects training data and calculates the weight update.

When it selects an appropriate number of data points, the mini batch method obtains the benefits of both methods: speed from SGD and stability from the batch method. For this reason, it is often used in Deep Learning, which manipulates large amounts of data. Now, let's look more closely at SGD, batch, and mini batch in terms of the epoch.
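The three update schemes above can be sketched in a few lines of code. The following is a minimal illustration (not from the book; the single-layer delta-rule network, the learning rate of 0.9, and the function names are all assumptions for the example) showing how SGD applies each point's update immediately, batch averages all updates into one adjustment, and mini batch averages within groups of 20 so that 100 points yield five adjustments:

```python
import numpy as np

def delta(w, x, d, alpha=0.9):
    """Weight update for one training point via the delta rule (assumed model)."""
    v = w @ x
    y = 1.0 / (1.0 + np.exp(-v))      # sigmoid activation
    e = d - y                          # output error
    g = y * (1.0 - y) * e              # delta
    return alpha * g * x

def sgd_epoch(w, X, D):
    # SGD: adjust the weights immediately after every data point.
    for x, d in zip(X, D):
        w = w + delta(w, x, d)
    return w

def batch_epoch(w, X, D):
    # Batch: average the updates over all N points, adjust once per epoch.
    dw = np.mean([delta(w, x, d) for x, d in zip(X, D)], axis=0)
    return w + dw

def minibatch_epoch(w, X, D, m=20):
    # Mini batch: split the shuffled data into groups of m points and
    # apply the batch (averaged) update within each group.
    idx = np.random.permutation(len(X))
    for start in range(0, len(X), m):
        sel = idx[start:start + m]
        dw = np.mean([delta(w, X[i], D[i]) for i in sel], axis=0)
        w = w + dw                     # e.g., 100/20 = 5 adjustments per epoch
    return w
```

With 100 data points and m = 20, `minibatch_epoch` performs exactly the five weight adjustments described in the text, while `batch_epoch` performs one and `sgd_epoch` performs one hundred.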
— Translated from Matlab Deep Learning by Phil Kim.