How many times is the multiply routine executed in the set of programs?


Problem

In a machine M1 clocked at 100 MHz, it was observed that 20% of the computation time of integer benchmarks is spent in the subroutine multiply(A, B, C), which multiplies integers A and B and returns the result in C. Each invocation of multiply takes 800 cycles to execute. To speed up the program, it is proposed to introduce a new instruction, MULT, to improve the machine's performance on integer benchmarks. Please answer the following questions if you have enough data; if there are not enough data, simply answer "not enough data."

(a) How many times is the multiply routine executed in the set of programs?
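
The invocation count can only be pinned down relative to a total runtime, which the problem statement does not supply. As a minimal sketch of the relationship, the function below takes a hypothetical total runtime as a parameter (an assumption, not given data):

```python
def multiply_invocations(total_runtime_s, clock_hz=100e6,
                         multiply_fraction=0.20, cycles_per_call=800):
    """Invocations of multiply() implied by a given total runtime on M1."""
    total_cycles = total_runtime_s * clock_hz              # cycles executed on M1
    return multiply_fraction * total_cycles / cycles_per_call

# Example with an assumed 10-second run (the problem gives no runtime):
print(multiply_invocations(10.0))                          # 250000.0 calls
```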

(b) An implementation of the MULT instruction is proposed for a new machine M2; MULT executes a multiplication in 40 cycles (an improvement over the 800 cycles needed in M1). Aside from the multiplies, all instructions that were not part of the multiply routine in M1 have the same CPI on M1 and M2. Because of the added complexity, however, the clock rate of M2 is only 80 MHz. How much faster (or slower) is M2 than M1?
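
One way to set up this comparison, sketched below under the usual assumptions (20% of time at a fixed clock means 20% of cycles, and the non-multiply cycle count carries over unchanged to M2), is to normalize everything to M1's total cycle count:

```python
# Cycle accounting for M1 vs. M2, normalized to M1's total cycle count C1.
C1 = 1.0                                      # arbitrary normalization
mult_cycles_m1 = 0.20 * C1                    # 20% of cycles spent in multiply()
other_cycles   = 0.80 * C1                    # assumed unchanged on M2

mult_cycles_m2 = mult_cycles_m1 * (40 / 800)  # each 800-cycle call shrinks to 40 cycles

t1 = C1 / 100e6                               # M1 runs at 100 MHz
t2 = (other_cycles + mult_cycles_m2) / 80e6   # M2 runs at 80 MHz

print(f"speedup of M2 over M1 = {t1 / t2:.4f}")  # a value below 1 means M2 is slower
```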

(c) A faster hardware implementation of the MULT instruction is designed and simulated for a proposed machine M3, also clocked at 80 MHz. A speedup of 10% over M1 is observed. Is this possible, or is there a bug in the simulator? If it is possible, how many cycles does the MULT instruction take on this new machine? If it is not possible, why not?
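
To sanity-check the reported 10% figure, the same accounting as above can bound M3's best case; the zero-cycle MULT below is a deliberate limiting assumption, not a real design point:

```python
# Upper bound on M3's speedup over M1: even if MULT cost 0 cycles,
# the remaining 80% of M1's cycles would still execute at 80 MHz.
C1 = 1.0
t1      = C1 / 100e6                 # M1 at 100 MHz
t3_best = (0.80 * C1 + 0.0) / 80e6   # M3 with a hypothetical 0-cycle MULT

print(f"best possible speedup of M3 over M1 = {t1 / t3_best:.4f}")
```

Whatever that bound comes out to, it tells you directly whether a 10% speedup is reachable and, if so, how large a cycle budget the MULT implementation could consume.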
