A very straightforward process and pretty close

to the one that I'll be talking about in the next segment of this lecture.

Now this has great benefits.

First, as I mentioned, it's the scientific foundation for

the analysis of algorithms.

It gives us a process whereby we can develop mathematical models,

develop hypotheses about performance, and compare algorithms.

And then test those hypotheses scientifically.
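To make that cycle concrete, here is a small sketch of my own (not an example from the lecture): model the cost of examining all pairs of N items, hypothesize that the count grows like N²/2, and then test the hypothesis by doubling N and checking that the cost ratio approaches 4.

```python
def count_pairs_ops(n):
    """Count inner-loop executions when examining all pairs of n items."""
    ops = 0
    for i in range(n):
        for j in range(i + 1, n):
            ops += 1  # one unit of work per pair examined
    return ops

if __name__ == "__main__":
    # Doubling test: if cost ~ N^2/2, doubling N should quadruple the cost.
    prev = count_pairs_ops(250)
    for n in (500, 1000, 2000):
        cur = count_pairs_ops(n)
        print(n, cur, round(cur / prev, 3))  # ratio approaches 4.0
        prev = cur
```

The exact count is N(N−1)/2, so the observed doubling ratio converges to 4, confirming the hypothesized quadratic growth.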

You certainly can predict performance and

compare algorithms if you analyze them in the style of Knuth.

Now it does have some drawbacks.

The first drawback is the input model might be unrealistic.

And that's certainly still a problem that plagues people today.

And the second drawback is that there might be too much detail in this analysis.

And that is a bigger problem, one that we actually try to address

with analytic combinatorics.

So that'll be a theme that we'll come back to often in this course:

how do we suppress detail while still getting accurate results?

Now in the 1970s, and even up to the present day, the study of algorithms

has revolved around books by Aho, Hopcroft, and Ullman;

Cormen, Leiserson, Rivest, and Stein; and others. They try to address

Knuth's drawbacks, first of all, by just analyzing the worst-case cost.

And what that does is it takes the model pretty much out of the picture.

What we want is a guarantee on the worst-case running time of an algorithm.

And the second thing is we just use big-O notation and

just try to get an upper bound on the worst-case cost.

And that's a way to get a lot of the detail out of the analysis.

So, we're looking at a guaranteed upper bound on the cost of an algorithm.

And that addresses both of those drawbacks of Knuth.

And then the idea is to classify algorithms according to this cost.

And that was an extremely successful approach that unleashed an age of algorithm design,

where people were able to try out all kinds of new ideas

and drive down these guaranteed worst-case costs, with the ultimate goal

of developing optimal algorithms,

where the worst-case cost is equal to the best possible.

But this approach also has a drawback.

And the serious drawback of this approach is that you cannot use it,

generally, to predict performance or compare algorithms.

That's an elementary fact that's often overlooked, but in this course,

it's important to lay out right at the outset.
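A small sketch can illustrate why (again my example, not one from the lecture): insertion sort and selection sort are both Θ(N²) in the worst case, so worst-case analysis with big-O puts them in the same class, yet their actual costs on a given input can differ enormously.

```python
def insertion_sort_compares(a):
    """Sort a copy of a with insertion sort; return the number of comparisons."""
    a = list(a)
    compares = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            compares += 1
            if a[j - 1] <= a[j]:
                break  # element already in place; inner loop stops early
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return compares

def selection_sort_compares(a):
    """Sort a copy of a with selection sort; return the number of comparisons."""
    a = list(a)
    compares = 0
    n = len(a)
    for i in range(n):
        m = i
        for j in range(i + 1, n):  # always scans the whole remainder
            compares += 1
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]
    return compares

if __name__ == "__main__":
    already_sorted = list(range(1000))
    # Same worst-case class, wildly different costs on this input:
    print(insertion_sort_compares(already_sorted))  # 999 (linear)
    print(selection_sort_compares(already_sorted))  # 499500 (quadratic)
```

On an already-sorted input, insertion sort uses N−1 comparisons while selection sort uses N(N−1)/2, a factor of about 500 at N = 1000, even though the worst-case bound cannot distinguish the two.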