27 June 2009 | Author: Admin | Category: Computer literature » Programming and development » Programming
Fundamentals of the Average Case Analysis of Particular Algorithms (Wiley-Teubner Series in Computer Science): Rainer Kemp
John Wiley and Sons Inc | ISBN: 0471903221 | 1985-01 | djvu (ocr) | 233 pages | 1.92 Mb
'Analysis of algorithms' is quite important in computer programming, because there are generally several algorithms available for a particular application and we would like to measure and compare their time and storage requirements. Time may be measured by counting steps, statements, or the number of times some given operation is performed; space may be measured by bits, words, or the number of registers and cells required during execution of the algorithm. The analysis usually consists of determining the behaviour of an algorithm in the best case, in the worst case, and in the average case. The best (worst) case is characterized by the minimum (maximum) total amount of time or space requirements taken over all inputs of some fixed size. To characterize the average case, it is necessary to define what we mean by the average; in general, we must make some assumptions about the expected characteristics of the inputs of some fixed size. If an input x of size n has the probability p_x and requires the total amount of time or space k_x, then the average case is characterized by the behaviour of the expected value (Appendix A) of the random variable which takes the value k_x with probability p_x. In most problems we will make the reasonable assumption that each of the inputs of size n is equally likely, but the analysis can also be carried out under other assumptions. To obtain a quantitative indication of how close to the average we may expect the amount of time or storage requirements to be, we will compute further characteristics of the given distribution such as the variance, the standard deviation, the moments about the origin, or the (cumulative) distribution function (Appendix A).
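The expected-value definition above can be sketched briefly in code. Sequential search is my own illustrative choice here (not an example drawn from the text): searching for a key equally likely to sit at any of the n positions of a list costs k_x = x comparisons for the input "key at position x".

```python
def average_case(costs, probs):
    """Expected cost E[K] = sum over inputs x of p_x * k_x."""
    return sum(p * k for p, k in zip(probs, costs))

def variance(costs, probs):
    """Var[K] = E[K^2] - (E[K])^2, a measure of spread around the average."""
    mean = average_case(costs, probs)
    return sum(p * k * k for p, k in zip(probs, costs)) - mean * mean

n = 10
costs = list(range(1, n + 1))     # k_x: comparisons if the key is at position x
probs = [1.0 / n] * n             # uniform assumption: each input equally likely

print(average_case(costs, probs))  # (n + 1) / 2 = 5.5
print(variance(costs, probs))      # (n^2 - 1) / 12 = 8.25
```

Changing `probs` to a non-uniform distribution carries out the same analysis under other assumptions about the inputs, exactly as the paragraph suggests.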
Now an important problem is to compare the time and space requirements of algorithms available for a particular application. Sometimes we want to decide which is best. But this is easier said than done. In many cases we can only compare either the time requirements or the storage requirements of two algorithms, because one algorithm requires less time but more space than the other. Similarly, when comparing two algorithms in the best, worst, or average case, the same situation can occur. For example, the sorting algorithm 'Heapsort' is faster than the algorithm 'Quicksort' in the worst case, but not in the average case. Summing up, a comparison of two algorithms should be made only for the time or storage requirements in the best or worst or average case. (Nevertheless, there are other criteria of goodness of algorithms, such as the product of time and space requirements or the adaptability to computers.) Classical complexity theory deals with the time and storage requirements of algorithms in the worst case. In practice, there are some objections to measuring the goodness of an algorithm by these quantities, quite apart from the fact that their computation can be an extremely difficult task. If an algorithm requires time or space of order O(f(n)) in the worst case, then the constant in the O-term can be fantastically large, and the result is only of theoretical interest. Furthermore, if the inputs corresponding to the worst case have a probability which tends to zero for large input sizes, then it is hard to see why the goodness of the algorithm should be measured by its worst case. Therefore, the importance of the worst case can be reduced by knowledge of its probability. But the computation of this probability can be rather difficult, if not impossible.
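The gap between worst-case and average-case behaviour can be seen empirically. The experiment below is my own illustration, not the book's: it counts key comparisons made by a simple first-element-pivot Quicksort, once on an already-sorted input (its worst case) and then averaged over random permutations.

```python
import random

def quicksort_comparisons(a):
    """Count key comparisons made by a first-element-pivot quicksort."""
    if len(a) <= 1:
        return 0
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return len(rest) + quicksort_comparisons(left) + quicksort_comparisons(right)

n = 200
worst = quicksort_comparisons(list(range(n)))  # sorted input: n(n-1)/2 comparisons
print(worst)                                    # 19900

random.seed(1)
trials = 200
avg = sum(quicksort_comparisons(random.sample(range(n), n))
          for _ in range(trials)) / trials
print(avg)  # far below the worst case; grows roughly like n log n
```

The sorted input forces a quadratic number of comparisons, while the empirical average over random permutations stays proportional to n log n, which is why the worst case alone can be a misleading measure of goodness.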
In practice, an algorithm requiring time n on the average in 99 per cent of all possible inputs of size n should be preferred to an algorithm for the same problem which needs time n² in the worst and average case, even though the former algorithm needs time n³ in the worst case.
Study of the behaviour of an algorithm on the average is accompanied by many mathematical calculations; we need to use the results of complex variable theory, number theory, probability theory, discrete mathematics and combinatorics. The principal techniques involved in the analysis of algorithms consist of counting of certain objects, solving of recurrences, working with finite summations, handling of generating functions and asymptotic evaluating of expressions. The last part of this introductory section is devoted to some simple examples elucidating the above ideas and concepts.
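Solving recurrences and working with finite summations can be combined in a short sketch. The recurrence below for the average number of comparisons of Quicksort with a random pivot over distinct keys is a standard textbook example, chosen by me as an illustration and not necessarily the book's own; its closed form, 2(n+1)H_n - 4n with H_n the n-th harmonic number, follows by telescoping the summation.

```python
# Average quicksort comparisons: C(n) = (n - 1) + (2/n) * sum_{k<n} C(k)
N = 1000
C = [0.0] * (N + 1)
S = 0.0                           # running value of sum_{k=0}^{n-1} C(k)
for n in range(1, N + 1):
    C[n] = (n - 1) + 2.0 * S / n
    S += C[n]

# Closed form obtained by telescoping the recurrence into a finite summation:
# C(n) = 2(n+1)H_n - 4n, where H_n = 1 + 1/2 + ... + 1/n
H = sum(1.0 / k for k in range(1, N + 1))
closed = 2 * (N + 1) * H - 4 * N

print(abs(C[N] - closed) < 1e-6)  # True: recurrence and closed form agree
```

Since H_n = ln n + O(1), the closed form also yields the asymptotic evaluation C(n) ~ 2n ln n, tying together three of the techniques named above.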