
Publish to my blog (weekly)

Difference Between a Batch and an Epoch in a Neural Network

The batch size is a hyperparameter that defines the number of samples to work through before updating the internal model parameters. When the batch is the size of one sample, the learning algorithm is called stochastic gradient descent. When the batch size is more than one sample and less than the size of the training dataset, the learning algorithm is called mini-batch gradient descent.

Batch Gradient Descent: Batch Size = Size of Training Set
Stochastic Gradient Descent: Batch Size = 1
Mini-Batch Gradient Descent: 1 < Batch Size < Size of Training Set

...
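To make the three variants concrete, here is a minimal sketch (not from the original post) of mini-batch gradient descent on a toy linear-regression problem. The data, the `train` function, and parameters such as `learning_rate` are illustrative assumptions; the point is that the batch size controls how many parameter updates happen within one epoch.

```python
# Minimal sketch, assuming a toy least-squares problem (not the post's code).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def train(batch_size, epochs=50, learning_rate=0.1):
    w = np.zeros(3)
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)         # shuffle once per epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            error = X[batch] @ w - y[batch]
            grad = X[batch].T @ error / len(batch)  # mean-squared-error gradient
            w -= learning_rate * grad    # one parameter update per batch
    return w

# batch_size = len(X) -> batch gradient descent (1 update per epoch)
# batch_size = 1      -> stochastic gradient descent (n updates per epoch)
# 1 < batch_size < n  -> mini-batch gradient descent
print(train(batch_size=32))
```

Setting `batch_size` to `len(X)`, `1`, or something in between reproduces batch, stochastic, and mini-batch gradient descent respectively, without changing any other part of the loop.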