Which DBMS product is the fastest? Which product yields the lowest price/performance ratio?

What computer equipment works best for each DBMS product? These reasonable questions should be easy to answer. They are not. In fact, the deeper you dig, the more problems you find. To begin with, which product is fastest at doing what?

To have a valid comparison, all compared products must do the same work. So, vendors and third parties have defined benchmarks, which are descriptions of work to be done along with the data to be processed. To compare performance, analysts run competing DBMS products on the same benchmark and measure the results.

Typical measures are number of transactions processed per second, number of Web pages served per second, and average response time per user.
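To make those measures concrete, here is a minimal sketch (in Python, with entirely made-up timings and a hypothetical system price; no real DBMS is involved) of how a benchmark harness might compute throughput, average response time, and the price/performance ratio raised in the opening questions:

```python
import time
import random

def run_transaction():
    """Stand-in for one unit of benchmark work (a hypothetical
    mix of reads and writes against the system under test)."""
    time.sleep(random.uniform(0.001, 0.005))  # simulated DBMS latency

NUM_TRANSACTIONS = 1000
SYSTEM_PRICE_USD = 100_000  # hypothetical total cost of hardware + software

response_times = []
start = time.perf_counter()
for _ in range(NUM_TRANSACTIONS):
    t0 = time.perf_counter()
    run_transaction()
    response_times.append(time.perf_counter() - t0)
elapsed = time.perf_counter() - start

tps = NUM_TRANSACTIONS / elapsed                    # transactions per second
avg_response = sum(response_times) / len(response_times)
price_performance = SYSTEM_PRICE_USD / tps          # dollars per unit of throughput

print(f"Throughput:        {tps:,.0f} transactions/second")
print(f"Avg response time: {avg_response * 1000:.2f} ms")
print(f"Price/performance: ${price_performance:,.2f} per TPS")
```

A lower price/performance figure is better: it means each unit of throughput costs less, which is exactly the trade-off the opening questions ask about.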

At first, DBMS vendors set up their own benchmark tests and published those results. Of course, when vendor A used its own benchmark to claim that its product was superior to all others, no one believed the results. Clearly, vendor A had an incentive to set up the benchmark to play to its product strengths.

So, third parties defined standard benchmarks. Even that led to problems, however. According to The Benchmark Handbook: When comparative numbers were published by third parties or competitors, the losers generally cried foul and tried to discredit the benchmark.

Such events often caused benchmark wars. Benchmark wars start if someone loses an important or visible benchmark evaluation. The loser reruns it using regional specialists and gets new and winning numbers. Then the opponent reruns it using his regional specialists, and of course gets even better numbers. The loser then reruns it using some one-star gurus.

This progression can continue all the way to five-star gurus. For example, in July 2002 PC Magazine ran a benchmark using a standard benchmark called the Nile benchmark. This particular test has a mixture of database tasks that are processed via Web pages. The faster the DBMS, the more pages that can be served. The test compared five DBMS products:

• DB2 (from IBM)

• MySQL (a free, open source DBMS)

• Oracle (from Oracle Corporation)

• SQL Server (from Microsoft)

• ASE (from Sybase Corporation)

SQL Server's performance was the worst. In the magazine review, the authors stated that they believed SQL Server scored poorly because the test used a new version of a non-Microsoft driver (a program that sends requests to the DBMS and returns its results).
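The driver's role matters because the application never talks to the DBMS directly; every request and result passes through it, so swapping drivers can change measured performance without changing a single query. Here is a minimal sketch using Python's DB-API, with the standard-library sqlite3 driver standing in for whichever vendor driver a benchmark might use (the table and data are hypothetical):

```python
import sqlite3  # the driver: it ships requests to the DBMS and returns results

# In a real benchmark, this is the layer that gets swapped; the SQL below
# would stay the same while the connection module changes.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, hits INTEGER)")
cur.executemany("INSERT INTO pages (hits) VALUES (?)", [(n,) for n in range(5)])
conn.commit()

# The application sees only rows; the driver handled the wire protocol,
# parameter binding, and result marshaling in between.
cur.execute("SELECT id, hits FROM pages WHERE hits > ?", (2,))
for row in cur.fetchall():
    print(row)

conn.close()
```

Because all of the benchmark's traffic funnels through this one component, a slow or buggy driver can drag down results for a DBMS that is otherwise fast, which is precisely the complaint Microsoft raised.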

As you might imagine, no sooner was this test published than the phone lines and email servers at PC Magazine were inundated with objections from Microsoft. PC Magazine reran the tests, replacing the suspect driver with a full panoply of Microsoft products.

The article doesn't say, but one can imagine the five-star Microsoft gurus who chartered the next airplane to PC Labs, where the testing was done. (You can read about both phases of the benchmark at www.eweek.com/article2/0,4149,293,00.asp.) Not surprisingly, when the tests were rerun with Microsoft-supporting software, SQL Server performed better than all of the other products in the first test.

But that second test compares apples and oranges. The first test used standard software, and the second test used Microsoft-specific software. When the five-star gurus from Oracle or MySQL use their favorite supporting products and "tune" to this particular benchmark, their re-rerun results will be superior to those for SQL Server. And round and round it will go.

Questions

1. Suppose you manage a business activity that needs a new IS with a database. The development team is divided on which DBMS you should use. One faction wants to use Oracle, a second wants to use MySQL, and a third wants to use SQL Server.

They cannot decide among themselves, and so they schedule a meeting with you. The team presents all of the benchmarks shown in the article at www.eweek.com/article2/0,4149,293,00.asp. How do you respond?

2. Performance is just one criterion for selecting a DBMS. Other criteria are the cost of the DBMS, hardware costs, staff knowledge, ease of use, ability to tune for extra performance, and backup and recovery capabilities. How does consideration of these other factors change your answer to question 1?

3. The Transaction Processing Performance Council (TPC) is a nonprofit that defines transaction processing and database benchmarks and publishes vendor-neutral, verifiable performance data. Visit its Web site at www.tpc.org.

a. What are TPC-C, TPC-R, and TPC-W?

b. Suppose you work in the marketing department at Oracle. How would you use the TPC results in the TPC-C benchmark?

c. What are the dangers to Oracle in your answer to part b?

d. Suppose you work in the marketing department for DB2 at IBM. How would you use the TPC results in the TPC-C benchmark?

e. Do the results for TPC-C change your answer to question 1?

f. If you are a DBMS vendor, can you ignore benchmarks?

4. Reflect on your answers to questions 1 through 3. On balance, what good are benchmarks? Are they just footballs to be kicked around by vendors? Are advertisers and publishers the only true beneficiaries? Do DBMS customers benefit from the efforts of TPC and like groups? How should customers use benchmarks?
