If quicksort is so quick, why bother with anything else?


If quicksort is so quick, why bother with anything else? If bubble sort is so bad, why even mention it? For that matter, why are there so many sorting algorithms? Your mission (should you choose to accept it) is to investigate these and other questions in relation to the algorithms selection sort, insertion sort, merge sort, and quicksort.

Core Questions

1. Explain each of the algorithms in a way that would be understandable to an intelligent lay person. You should not use any code (or even pseudo code) in your explanation, but you will probably need to use general concepts such as "compare" and "swap", and you'll certainly need to use procedural words such as "if" and "repeat". You might find it helpful to consider an algorithm as if it were a game for which you need to define the rules. For example, here's how you could describe the bubble sort algorithm as if it were a solitaire game played with a deck of cards that contain the values to process.

Bubble Trouble

The playing area consists of several regions: foundation, tableau, stock, and discard. Initially, all cards are in the stock. Play consists of a number of rounds. To begin a round, place the top card of the stock face up in the tableau, then turn over the next card. If the stock card is smaller than the tableau card, place it face down on the discard pile; otherwise, place the tableau card on the discard pile and the stock card in the tableau. Play the remaining stock cards in the same way, then move the final tableau card (which will be the largest of the stock cards) to the foundation and use the discard pile as the new stock. This completes one round. Continue to play rounds until the stock is exhausted. The cards in the foundation will now be sorted with the smallest card on top.
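For comparison only (remember that your answer to question 1 must not contain code), here is a minimal C++ sketch of the bubble sort that the card game describes; the function name bubbleSort and the choice of std::vector<int> are illustrative, not prescribed by the question.

#include <cstddef>
#include <utility> // std::swap
#include <vector>

void bubbleSort(std::vector<int> &a) {
  // Each pass of the outer loop is one "round" of the game: the largest
  // remaining value survives the comparisons and finishes at the end of
  // the range, just as the largest stock card ends up in the foundation.
  for (std::size_t n = a.size(); n > 1; --n) {
    for (std::size_t i = 0; i + 1 < n; ++i) {   // play through the "stock"
      if (a[i] > a[i + 1]) {
        std::swap(a[i], a[i + 1]);              // keep the larger "card" in hand
      }
    }
  }
}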

2. Write a set of guidelines for helping someone decide which sort algorithm would be most appropriate for a particular situation. Include in your guidelines a description of the advantages and disadvantages of each algorithm, together with an indication as to why those characteristics apply. Your goal is to provide enough information so that someone not familiar with the details of each algorithm would be able to decide which algorithm is right for them.

Extension Questions

In this section, you'll need to be able to measure the speed of execution of parts of your code. You can measure how much time a section of code takes by calling the system function getrusage before and after that section. The function returns information about various aspects of resource usage, including the amount of system time (time taken by system routines that you call) and the amount of user time (time taken by your own code). Note that this is process time, not "wall-clock" time, so it's an accurate measure even if the system is busy executing other people's code as well. Consult the man page for getrusage if you need more information.

#include <sys/resource.h> // getrusage, struct rusage
#include <iostream>
using namespace std;

int main() {
  struct rusage before, after; // for recording usage stats

  // prepare the data

  getrusage(RUSAGE_SELF, &before);

  // execute the code you want to time

  getrusage(RUSAGE_SELF, &after);

  long secs = after.ru_utime.tv_sec - before.ru_utime.tv_sec;
  long usecs = after.ru_utime.tv_usec - before.ru_utime.tv_usec;
  cout << secs * 1000000 + usecs << endl; // user time, in microseconds
}

Practical sort implementations usually combine more than one sorting algorithm, attempting to take advantage of the best characteristics of each. For example, a straightforward but effective approach for general-purpose sorting is to use quicksort, but with a switch-over to insertion sort when the size of the lists that result from the partitioning falls below a threshold value. This approach is generally faster than pure quicksort because insertion sort has a lower overhead than quicksort and is thus faster, provided the length of the list is small enough.

The structure of the combined sort would be like this:

sort (...) {
  if size is less than some threshold {
    do an insertion sort
  } else { // do a quicksort
    partition
    recursively sort the first part
    recursively sort the second part
  }
}
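As a concrete point of reference, here is one possible C++ realisation of that structure. The function names (hybridSort, insertionSort), the Lomuto-style partition around the last element, and the example THRESHOLD value are all illustrative choices, not part of the assignment; question 3 asks you to determine the cutover size experimentally.

#include <utility> // std::swap
#include <vector>

const int THRESHOLD = 16; // example cutover size only

// Sort the sub-range a[lo..hi] by simple insertion.
void insertionSort(std::vector<int> &a, int lo, int hi) {
  for (int i = lo + 1; i <= hi; ++i) {
    int key = a[i];
    int j = i - 1;
    while (j >= lo && a[j] > key) {
      a[j + 1] = a[j];
      --j;
    }
    a[j + 1] = key;
  }
}

// Quicksort with a switch-over to insertion sort for small sub-ranges.
void hybridSort(std::vector<int> &a, int lo, int hi) {
  if (hi - lo + 1 <= THRESHOLD) {
    insertionSort(a, lo, hi);      // small list: lower overhead wins
  } else {
    int pivot = a[hi];             // Lomuto partition around the last element
    int i = lo - 1;
    for (int j = lo; j < hi; ++j) {
      if (a[j] < pivot) {
        std::swap(a[++i], a[j]);
      }
    }
    std::swap(a[i + 1], a[hi]);
    hybridSort(a, lo, i);          // recursively sort the first part
    hybridSort(a, i + 2, hi);      // recursively sort the second part
  }
}

A call such as hybridSort(data, 0, (int)data.size() - 1) sorts the whole vector.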

3. Design an experiment to determine the best "cutover" size for the combined "quicksort-plus-insertion-sort" implementation. You'll need to consider a range of data sizes, including both random and "worst-case" data sets. Write a program that could be used to perform the experiment. You'll need to provide the sort code itself as well as a suitable main function for testing it. Your experimental design should be sufficiently detailed that you could hand the task over to a tester who is not familiar with sorting algorithms or even with programming. Ideally, the tester should only need to run the program under specified conditions and record the results.
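One plausible way to produce the test data is sketched below: random values for the average case, and already-sorted (or reverse-sorted) values as a common quicksort worst case when the pivot is taken from one end of the range. The helper names and the value range are placeholders, not requirements.

#include <algorithm> // std::reverse
#include <numeric>   // std::iota
#include <random>
#include <vector>

// Random values, as a typical "average" case.
std::vector<int> randomData(std::size_t n, unsigned seed) {
  std::vector<int> v(n);
  std::mt19937 gen(seed);
  std::uniform_int_distribution<int> dist(0, 1000000);
  for (auto &x : v) x = dist(gen);
  return v;
}

// Already-sorted or reverse-sorted values as a candidate worst case.
std::vector<int> worstCaseData(std::size_t n, bool reversed) {
  std::vector<int> v(n);
  std::iota(v.begin(), v.end(), 0);
  if (reversed) std::reverse(v.begin(), v.end());
  return v;
}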

4. Run your proposed experiment and report on the findings. Your report should include the data you gather, an analysis of that data, and a clear recommendation as to the best cutover threshold. Consider how best to present your data. You'll certainly want to tabulate the data, but you might find it helpful to plot it as well. Because the actual times will be heavily dependent on the data size, you might find it useful to normalise the times against the "ideal" time (by dividing by n log n) before plotting them.
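As an illustration of that normalisation step, assuming you have recorded a time in microseconds for a data set of size n, the scaled figure could be computed along these lines (the function and variable names are placeholders):

#include <cmath>

// Normalise a measured time against the "ideal" n log n growth so that
// results for different data sizes can be plotted on a comparable scale.
double normalise(double micros, double n) {
  return micros / (n * std::log2(n)); // time per n log n "unit" of work
}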

 
