Explain one advantage and one disadvantage of online learning

Question: An extreme version of gradient descent is to use a mini-batch size of just 1. That is, given a training input x, we update our weights and biases according to the rules w_k → w'_k = w_k − η ∂C_x/∂w_k and b_l → b'_l = b_l − η ∂C_x/∂b_l. Then we choose another training input and update the weights and biases again, and so on, repeatedly. This procedure is known as online, on-line, or incremental learning. In online learning, a neural network learns from just one training input at a time (just as human beings do). Name one advantage and one disadvantage of online learning, compared to stochastic gradient descent with a mini-batch size of, say, 20.
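The trade-off the question points at can be seen directly in code: with a mini-batch size of 1 the network takes many cheap, immediate updates per pass over the data (an advantage), but each update follows a noisy single-example gradient and gives up the vectorization that larger batches allow (a disadvantage). The sketch below is an illustrative assumption, not part of the exercise: it fits a toy 1-D linear model y = w·x + b by squared error, using the exercise's update rule with batch size 1 versus batch size 20. The dataset, learning rate eta, and epoch count are all made up for the example.

```python
import random

# Toy comparison of online learning (batch size 1) vs mini-batch SGD
# (batch size 20). Model, data, and hyperparameters are illustrative
# assumptions, not from the exercise itself.

random.seed(0)
# Noiseless data on the line y = 2x + 1, with x in [0, 1).
data = [(i / 100.0, 2.0 * (i / 100.0) + 1.0) for i in range(100)]

def avg_grad(w, b, batch):
    # Average gradient of C = (1/2)(w*x + b - y)^2 over the batch.
    gw = gb = 0.0
    for x, y in batch:
        err = w * x + b - y
        gw += err * x
        gb += err
    return gw / len(batch), gb / len(batch)

def train(batch_size, epochs=200, eta=0.5):
    w = b = 0.0
    for _ in range(epochs):
        random.shuffle(data)
        for i in range(0, len(data), batch_size):
            gw, gb = avg_grad(w, b, data[i:i + batch_size])
            # The exercise's update rule: w_k -> w_k - eta * dC_x/dw_k
            w -= eta * gw
            b -= eta * gb
    return w, b

w1, b1 = train(batch_size=1)     # online: 100 noisy updates per epoch
w20, b20 = train(batch_size=20)  # mini-batch: 5 smoother updates per epoch
print(w1, b1, w20, b20)          # both should approach w = 2, b = 1
```

Both settings recover the underlying line here, but the online run performs 20 times as many parameter updates per epoch, each based on a single, higher-variance gradient estimate; the size-20 run takes fewer, smoother steps and, in a real implementation, could compute each one as a single vectorized matrix operation.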
