Problem

We now consider how to use the interpretation of EM as maximizing an energy functional to allow partial or incremental updates over the instances. In the standard EM algorithm, the Compute-ESS procedure collects the statistics from all the instances, which requires running inference on every instance. We now consider a procedure that performs partial updates, updating the expected sufficient statistics for some, but not all, of the instances. In particular, suppose we replace this procedure with one that runs inference on a single instance and uses the result to replace that instance's old contribution with a new one. Instead of computing all the expected sufficient statistics in each E-step, this procedure caches each instance's contribution to the sufficient statistics and updates only a single one in each iteration.

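To make the caching idea concrete, here is a minimal sketch of incremental EM for a one-dimensional Gaussian mixture. This is only an illustration under assumed details: the class name IncrementalEM, the method update_instance, and the choice of a Gaussian mixture are not part of the exercise. Each instance caches its contribution to the expected sufficient statistics (responsibilities, weighted sums, weighted squared sums); one iteration runs inference for a single instance, swaps its cached contribution into the running totals, and then re-runs the cheap M-step on those totals.

import numpy as np

class IncrementalEM:
    """Incremental EM for a 1-D Gaussian mixture (illustrative sketch only)."""

    def __init__(self, data, k, seed=0):
        rng = np.random.default_rng(seed)
        self.x = np.asarray(data, dtype=float)
        self.k = k
        n = len(self.x)
        # Parameters: mixing weights, means, variances.
        self.pi = np.full(k, 1.0 / k)
        self.mu = rng.choice(self.x, size=k, replace=False)
        self.var = np.full(k, self.x.var() + 1e-6)
        # Cached per-instance contributions to the expected sufficient
        # statistics: responsibilities r, r*x, and r*x^2 (each n-by-k).
        self.r = np.full((n, k), 1.0 / k)
        self.rx = self.r * self.x[:, None]
        self.rx2 = self.r * (self.x ** 2)[:, None]
        # Running totals of the ESS: sums of the cached contributions.
        self.R, self.RX, self.RX2 = self.r.sum(0), self.rx.sum(0), self.rx2.sum(0)

    def _responsibilities(self, xi):
        # Inference ("E-step") for a single instance: posterior over components.
        log_p = (np.log(self.pi)
                 - 0.5 * np.log(2 * np.pi * self.var)
                 - 0.5 * (xi - self.mu) ** 2 / self.var)
        p = np.exp(log_p - log_p.max())
        return p / p.sum()

    def update_instance(self, i):
        # Replace instance i's cached contribution with a freshly computed one,
        # keeping the running totals consistent.
        r_new = self._responsibilities(self.x[i])
        new = (r_new, r_new * self.x[i], r_new * self.x[i] ** 2)
        self.R += new[0] - self.r[i]
        self.RX += new[1] - self.rx[i]
        self.RX2 += new[2] - self.rx2[i]
        self.r[i], self.rx[i], self.rx2[i] = new
        # M-step from the updated totals (cheap relative to inference).
        n = len(self.x)
        self.pi = self.R / n
        self.mu = self.RX / self.R
        self.var = self.RX2 / self.R - self.mu ** 2 + 1e-6

    def log_likelihood(self):
        dens = (self.pi / np.sqrt(2 * np.pi * self.var)
                * np.exp(-0.5 * (self.x[:, None] - self.mu) ** 2 / self.var))
        return np.log(dens.sum(axis=1)).sum()

# Usage: one partial E-step plus an M-step per iteration, sweeping over instances.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 0.5, 200)])
    em = IncrementalEM(data, k=2)
    for sweep in range(20):
        for i in rng.permutation(len(data)):
            em.update_instance(i)
        print(f"sweep {sweep}: log-likelihood = {em.log_likelihood():.2f}")
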
a. Show that the incremental EM algorithm converges to a fixed point of the log-likelihood function. To do so, show that each iteration improves the EM energy functional.
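
For part (a), it may help to recall one standard form of the EM energy functional (free energy). The notation below is an assumption for illustration, not the exercise's own: M training instances o[m] with hidden variables H[m], and a distribution Q_m over each H[m]. The second form, per-instance log-likelihood minus a KL divergence, is what drives the monotonicity argument.

\[
F[\theta, Q]
  = \sum_{m=1}^{M} \Big( \mathbb{E}_{Q_m}\big[\log P(o[m], H[m] \mid \theta)\big]
      + \mathbb{H}_{Q_m}(H[m]) \Big)
  = \sum_{m=1}^{M} \Big( \log P(o[m] \mid \theta)
      - D\big(Q_m(H[m]) \,\|\, P(H[m] \mid o[m], \theta)\big) \Big)
\]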

b. How would this analysis generalize if, in each iteration, the algorithm performs a partial update for k instances (instead of one)?

c. Assume that the computations in the M-step are relatively negligible compared to the inference in the E-step. Would you expect the incremental EM to be more efficient than standard EM? If so, why?
