

Textbook - Algorithm Design by Jon Kleinberg and Eva Tardos

Chapter 12 - Local Search

Exercises

Q1. Consider the problem of finding a stable state in a Hopfield neural network, in the special case when all edge weights are positive. This corresponds to the Maximum-Cut Problem that we discussed earlier in the chapter: For every edge e in the graph G, the endpoints of e would prefer to have opposite states.

Now suppose the underlying graph G is connected and bipartite; the nodes can be partitioned into sets X and Y so that each edge has one end in X and the other in Y. Then there is a natural "best" configuration for the Hopfield net, in which all nodes in X have the state +1 and all nodes in Y have the state -1. This way, all edges are good, in that their ends have opposite states.

The question is: In this special case, when the best configuration is so clear, will the State-Flipping Algorithm described in the text (as long as there is an unsatisfied node, choose one and flip its state) always find this configuration? Give a proof that it will, or an example of an input instance, a starting configuration, and an execution of the State-Flipping Algorithm that terminates at a configuration in which not all edges are good.
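To experiment with this question, here is a minimal Python sketch of the State-Flipping Algorithm, assuming the graph is given as an edge list with positive weights; the function name state_flipping and the dictionary-based representation are choices made for illustration, not part of the textbook's presentation.

from collections import defaultdict

def state_flipping(nodes, edges, state):
    # Sketch of the State-Flipping Algorithm for a Hopfield net with positive weights.
    # nodes: iterable of node names; edges: list of (u, v, w) with w > 0;
    # state: dict mapping each node to +1 or -1 (the starting configuration).
    adj = defaultdict(list)
    for u, v, w in edges:
        adj[u].append((v, w))
        adj[v].append((u, w))

    def unsatisfied(u):
        # A node is unsatisfied if the total weight of its bad edges (both ends
        # in the same state) exceeds the total weight of its good edges.
        good = sum(w for v, w in adj[u] if state[u] != state[v])
        bad = sum(w for v, w in adj[u] if state[u] == state[v])
        return bad > good

    flipped = True
    while flipped:
        flipped = False
        for u in nodes:
            if unsatisfied(u):
                state[u] = -state[u]   # flip the state of one unsatisfied node
                flipped = True
                break                  # rescan; the choice of node is arbitrary
    return state

Running this on small connected bipartite graphs, such as an even cycle, from various starting configurations is one way to probe whether the algorithm always reaches the all-good configuration.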

Q2. Recall that for a problem in which the goal is to maximize some underlying quantity, gradient descent has a natural "upside-down" analogue, in which one repeatedly moves from the current solution to a solution of strictly greater value. Naturally, we could call this a gradient ascent algorithm. (Often in the literature you'll also see such methods referred to as hill-climbing algorithms.)

By straightforward symmetry, the observations we've made in this chapter about gradient descent carry over to gradient ascent: For many problems you can easily end up with a local optimum that is not very good. But sometimes one encounters problems, as we saw, for example, with the Maximum-Cut and Labeling Problems, for which a local search algorithm comes with a very strong guarantee: Every local optimum is close in value to the global optimum. We now consider the Bipartite Matching Problem and find that the same phenomenon happens here as well.

Thus, consider the following Gradient Ascent Algorithm for finding a matching in a bipartite graph.

As long as there is an edge whose endpoints are unmatched, add it to the current matching. When there is no longer such an edge, terminate with a locally optimal matching.
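For concreteness, a minimal Python sketch of this gradient ascent matching follows; the edge-list input and the left-to-right scan order are assumptions made for illustration, since any execution of the algorithm corresponds to some ordering of the edge choices.

def gradient_ascent_matching(edges):
    # Greedy sketch: repeatedly add an edge whose two endpoints are both unmatched.
    matched = set()
    matching = []
    for u, v in edges:        # the scan order stands in for the algorithm's choices
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching

Trying this on a small path graph with different edge orderings is a useful warm-up for part (a).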

(a) Give an example of a bipartite graph G for which this gradient ascent algorithm does not return the maximum matching.

(b) Let M and M' be matchings in a bipartite graph G. Suppose that |M'| > 2|M|. Show that there is an edge e' ∈ M' such that M ∪ {e'} is a matching in G.

(c) Use (b) to conclude that any locally optimal matching returned by the gradient ascent algorithm in a bipartite graph G is at least half as large as a maximum matching in G.

Q3. Suppose you're consulting for a biotech company that runs experiments on two expensive high-throughput assay machines, each identical, which we'll label M1 and M2. Each day they have a number of jobs that they need to do, and each job has to be assigned to one of the two machines. The problem they need help on is how to assign the jobs to machines to keep the loads balanced each day. The problem is stated as follows. There are n jobs, and each job j has a required processing time tj. They need to partition the jobs into two groups A and B, where set A is assigned to M1 and set B to M2. The times needed to process all of the jobs on the two machines are T1 = ∑j∈A tj and T2 = ∑j∈B tj. The problem is to have the two machines work for roughly the same amounts of time, that is, to minimize |T1 - T2|.

A previous consultant showed that the problem is NP-hard (by a reduction from Subset Sum). Now they are looking for a good local search algorithm. They propose the following. Start by assigning jobs to the two machines arbitrarily (say jobs 1, . . . , n/2 to M1, the rest to M2). The local moves are to move a single job from one machine to the other, and we only move jobs if the move decreases the absolute difference in the processing times. You are hired to answer some basic questions about the performance of this algorithm.
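For reference, the proposed local search can be written as the short Python sketch below; the job indexing, the initial split, and the function name balance_two_machines are illustrative choices, and part (b)'s variant would additionally prefer the movable job with the largest tj.

def balance_two_machines(times):
    # Local search sketch for two machines: times[j] is the processing time t_j.
    n = len(times)
    A = set(range(n // 2))           # jobs initially on M1
    B = set(range(n // 2, n))        # jobs initially on M2
    T1 = sum(times[j] for j in A)
    T2 = sum(times[j] for j in B)

    improved = True
    while improved:
        improved = False
        for j in range(n):
            on_m1 = j in A
            # loads if job j were moved to the other machine
            nT1 = T1 - times[j] if on_m1 else T1 + times[j]
            nT2 = T2 + times[j] if on_m1 else T2 - times[j]
            if abs(nT1 - nT2) < abs(T1 - T2):   # only strictly improving moves
                if on_m1:
                    A.remove(j); B.add(j)
                else:
                    B.remove(j); A.add(j)
                T1, T2 = nT1, nT2
                improved = True
                break
    return A, B, T1, T2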

(a) The first question is: How good is the solution obtained? Assume that there is no single job that dominates all the processing time, that is, that tj ≤ ½(t1 + ... + tn) for all jobs j. Prove that for every locally optimal solution, the times the two machines operate are roughly balanced: ½T1 ≤ T2 ≤ 2T1.

(b) Next you worry about the running time of the algorithm: How often will jobs be moved back and forth between the two machines? You propose the following small modification in the algorithm. If, in a local move, many different jobs can move from one machine to the other, then the algorithm should always move the job j with maximum tj. Prove that, under this variant, each job will move at most once. (Hence the local search terminates in at most n moves.)

(c) Finally, they wonder if they should work on better algorithms. Give an example in which the local search algorithm above will not lead to an optimal solution.

Q4. Consider the Load Balancing Problem from Section 11.1. Some friends of yours are running a collection of Web servers, and they've designed a local search heuristic for this problem, different from the algorithms described in Chapter 11.

Recall that we have m machines M1,...,Mm, and we must assign each job to a machine. The load of the ith job is denoted ti. The makespan of an assignment is the maximum load on any machine:

max over machines Mi of ∑j assigned to Mi tj.

Your friends' local search heuristic works as follows. They start with an arbitrary assignment of jobs to machines, and they then repeatedly try to apply the following type of "swap move."

Let A(i) and A(j) be the jobs assigned to machines Mi and Mj, respectively. To perform a swap move on Mi and Mj, choose subsets of jobs B(i) ⊆ A(i) and B(j) ⊆ A(j), and "swap" these jobs between the two machines. That is, update A(i) to be A(i) ∪ B(j) - B(i), and update A(j) to be A(j) ∪ B(i) - B(j). (One is allowed to have B(i) = A(i), or to have B(i) be the empty set; and analogously for B(j).)

Consider a swap move applied to machines Mi and Mj. Suppose the loads on Mi and Mj before the swap are Ti and Tj, respectively, and the loads after the swap are T'i and T'j. We say that the swap move is improving if max(T'i, T'j) < max(Ti, Tj); in other words, the larger of the two loads involved has strictly decreased. We say that an assignment of jobs to machines is stable if there does not exist an improving swap move, beginning with the current assignment.

Thus the local search heuristic simply keeps executing improving swap moves until a stable assignment is reached; at this point, the resulting stable assignment is returned as the solution.
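To make the heuristic concrete, here is a brute-force Python sketch that is practical only for small instances, since it enumerates all subset pairs; the representation (a dict of job loads and a list of per-machine job sets) and the helper names subsets and find_improving_swap are assumptions made for illustration.

from itertools import chain, combinations

def subsets(jobs):
    # Every subset of a machine's job set, including the empty set and the full set.
    jobs = list(jobs)
    return chain.from_iterable(combinations(jobs, r) for r in range(len(jobs) + 1))

def find_improving_swap(t, assignment):
    # Return (i, j, new_Ai, new_Aj) for one improving swap move, or None if the
    # assignment is stable.  t maps each job to its load; assignment[i] is A(i).
    def load(S):
        return sum(t[x] for x in S)
    m = len(assignment)
    for i in range(m):
        for j in range(i + 1, m):
            Ti, Tj = load(assignment[i]), load(assignment[j])
            for Bi in subsets(assignment[i]):
                for Bj in subsets(assignment[j]):
                    new_i = (assignment[i] - set(Bi)) | set(Bj)
                    new_j = (assignment[j] - set(Bj)) | set(Bi)
                    if max(load(new_i), load(new_j)) < max(Ti, Tj):
                        return i, j, new_i, new_j
    return None

def swap_move_local_search(t, assignment):
    # Keep executing improving swap moves until a stable assignment is reached.
    while True:
        move = find_improving_swap(t, assignment)
        if move is None:
            return assignment
        i, j, new_i, new_j = move
        assignment[i], assignment[j] = new_i, new_j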
