Why use asymptotic notation instead of running time or operation counts?


Answer the following questions, justifying your answers with suitable examples.

Question 1: Why use asymptotic notation instead of running time or operation counts?
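As a concrete point of comparison for this question, the sketch below (illustrative code, not part of the original question set) shows two search algorithms whose exact running times and operation counts depend on the machine, compiler, and input, while their asymptotic classes, O(n) and O(log n), do not:

```java
public class WhyAsymptotic {
    // Linear search: up to n comparisons in the worst case -> O(n).
    static int linearSearch(int[] a, int key) {
        for (int i = 0; i < a.length; i++)
            if (a[i] == key) return i;
        return -1;
    }

    // Binary search on sorted data: about log2(n) comparisons -> O(log n).
    static int binarySearch(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;     // overflow-safe midpoint
            if (a[mid] == key) return mid;
            if (a[mid] < key) lo = mid + 1; else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 5, 7, 9, 11};
        // Same answer either way; what differs is how the work grows with n.
        System.out.println(linearSearch(a, 7));  // 3
        System.out.println(binarySearch(a, 7));  // 3
    }
}
```

Measured times for these two methods would vary from machine to machine, but the O(n) vs. O(log n) comparison holds everywhere, which is one common answer to the question.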

Question 2: When does it make more or less sense?

Question 3: Why invent a system of notation in which multiplying an algorithm's running time by two does not seem to make it asymptotically slower?
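To make the premise of this question concrete, here is a minimal sketch (illustrative names and sizes, not from the original text) of two loops whose operation counts differ by a constant factor of two yet share the same asymptotic class, O(n):

```java
public class ConstantFactorDemo {
    // Performs roughly n counted operations over the array -> O(n).
    static long countOnce(int[] a) {
        long ops = 0;
        for (int x : a) ops++;            // n operations
        return ops;
    }

    // Performs roughly 2n counted operations -> still O(n):
    // constant factors are absorbed by asymptotic notation.
    static long countTwice(int[] a) {
        long ops = 0;
        for (int x : a) { ops++; ops++; } // 2n operations
        return ops;
    }

    public static void main(String[] args) {
        int[] a = new int[1000];
        // The ratio stays 2 no matter how large the array is,
        // so both functions grow at the same rate.
        System.out.println(countTwice(a) / countOnce(a)); // prints 2
    }
}
```

The point the question probes: asymptotic notation deliberately ignores that factor of two so it can describe how cost scales with input size, independent of hardware speed.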

Question 4: How would you explain asymptotic notation to someone else?

Question 5: Would you recommend a coding standard requiring that EVERY function state its asymptotic complexity in its documentation?

Question 6: How important do you think it is to understand and be able to calculate asymptotic complexity?

Question 7: Do you think asymptotic notation is an academic's diversion or does it matter in the "real world"?

Question 8: What would be an unfair exam question involving asymptotic notation?

Question 9: Have you ever used a StringBuilder in Java? Why is it preferred over string concatenation?
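As background for this question, the sketch below contrasts the two approaches (the loop bound n is an arbitrary illustrative value). Because Java strings are immutable, each `+=` in a loop copies every character accumulated so far, giving O(n^2) total copying; `StringBuilder.append` writes into a resizable buffer for amortized O(n) work:

```java
public class ConcatDemo {
    public static void main(String[] args) {
        int n = 10_000; // illustrative size

        // String concatenation: each += allocates a new String and
        // copies all characters built so far -> O(n^2) total copying.
        String s = "";
        for (int i = 0; i < n; i++) {
            s += "x";
        }

        // StringBuilder: appends into an internal, growable buffer,
        // so building the same string is amortized O(n).
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append("x");
        }

        // Both produce the same n-character string; only the cost differs.
        System.out.println(s.equals(sb.toString())); // prints true
    }
}
```

This asymptotic gap, not any difference in the result, is the usual reason StringBuilder is preferred for building strings in loops.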
