Let A = [aij] be a matrix where aij are real numbers for 1 ≤ i ≤ n and 1 ≤ j ≤ m


Question: Let A = [aij] be a matrix, where aij are real numbers, for 1 ≤ i ≤ n and 1 ≤ j ≤ m. We consider two different specifications of when a subset X of entries aij of A is considered to be independent:

(a) X is independent when, for any column, there is at most one element of X in this column.

(b) X is independent when, for any column, there is at most one element of X in this column, and, for any row, there is at most one element of X in this row.

We want to find a maximal independent set X of entries of A (maximal in the sense of inclusion) that minimizes the sum of the entries in X. To this end, we use the greedy algorithm. For which of these specifications of independence is the greedy algorithm correct for every matrix A?
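Note: the question describes the greedy algorithm only informally. Below is a minimal sketch of that procedure in Python (the choice of language, the function names, and the (value, row, column) representation are assumptions for illustration, not part of the original question): entries are scanned in increasing order of value, and an entry is kept whenever adding it preserves independence under the chosen specification.

    # Sketch only: greedily build a maximal independent set of entries with
    # small sum by scanning entries in increasing order of value.
    def greedy_min_independent(A, independent):
        n, m = len(A), len(A[0])
        entries = sorted((A[i][j], i, j) for i in range(n) for j in range(m))
        chosen = []
        for value, i, j in entries:            # cheapest entries first
            if independent(chosen + [(value, i, j)]):
                chosen.append((value, i, j))
        return chosen

    def independent_a(X):
        # Specification (a): at most one chosen entry per column.
        cols = [j for _, _, j in X]
        return len(cols) == len(set(cols))

    def independent_b(X):
        # Specification (b): at most one chosen entry per column and per row.
        rows = [i for _, i, _ in X]
        cols = [j for _, _, j in X]
        return len(rows) == len(set(rows)) and len(cols) == len(set(cols))

    if __name__ == "__main__":
        A = [[1, 4],
             [2, 3]]
        print(greedy_min_independent(A, independent_a))  # one entry per column
        print(greedy_min_independent(A, independent_b))  # one per row and per column

Running the sketch on a small matrix such as A = [[1, 4], [2, 3]] only illustrates the procedure for both specifications; it does not by itself settle for which specification the greedy algorithm is correct on every matrix A.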
