
In recent years a lot of people have asked me just what the phrase analytic

combinatorics actually means. In this lecture I'm going to describe

what the field is, and tell the story of how it came into being.

I think this context is important for anyone who's interested in learning anything about the field. This lecture is dedicated to the memory of my friend and colleague Philippe Flajolet, who really was the driving force behind the development of the field, and who died suddenly in 2011.

I'll start off with a brief history to try to give some context for how we got

there. I first met Philippe in 1977.

My first research paper, which I wrote after getting my PhD, was on an algorithm called odd-even merging. In those days you would get your paper typed by a secretary, go to a conference, and present it.
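For readers unfamiliar with that algorithm: odd-even merging is Batcher's comparator network for merging two sorted sequences. Here is a minimal illustrative sketch in Python, not code from the paper; it assumes the total length is a power of two and that each half of the list is already sorted.

```python
def odd_even_merge(a, lo=0, n=None, r=1):
    """Batcher's odd-even merge: sort list a (length a power of two,
    both halves already sorted) with a fixed pattern of
    compare-exchange operations at distance r."""
    if n is None:
        n = len(a)
    step = 2 * r
    if step < n:
        odd_even_merge(a, lo, n, step)      # merge the even-indexed subsequence
        odd_even_merge(a, lo + r, n, step)  # merge the odd-indexed subsequence
        for i in range(lo + r, lo + n - r, step):
            if a[i] > a[i + r]:             # compare-exchange
                a[i], a[i + r] = a[i + r], a[i]
    else:
        if a[lo] > a[lo + r]:               # base case: a single comparator
            a[lo], a[lo + r] = a[lo + r], a[lo]

a = [1, 5, 8, 9, 2, 3, 6, 7]  # two sorted halves
odd_even_merge(a)
# a is now [1, 2, 3, 5, 6, 7, 8, 9]
```

Because the comparison pattern is fixed in advance (a sorting network), the comparators can all be laid out ahead of time, which is what makes the cost of the method amenable to precise mathematical analysis.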

And I was very proud to have developed a formula, involving the gamma function and the zeta function, that gives a precise description of the performance of this particular algorithm. Just a few months later we had a conference in Providence, Rhode Island, where I was at the time. In those days the first thing you would do at a conference was look at the table of contents of the proceedings to see what was there. And there was another paper, and I was amazed to see a formula very much like mine, involving the zeta function and the gamma function, even though it was studying a completely different problem. Just as I was realizing that, Philippe came up to me and said, "I believe that we have a formula in common." Both of us were very surprised to see the similarities between these formulas.

It might be said that we spent the rest of our careers trying to understand why.

Now, it's worth thinking about what the world was like at the time we started our research careers. We were both in the earliest part of our careers then, and the world was changing in very important ways all around us.

Without going into too much detail, it really was the case that when we started school, people wore coats and ties, even coats and ties to dinner. But by the time we got our PhDs, there was Woodstock and hippies and so forth. And with respect to technology there were huge changes. When we started school, computers were big, expensive, and rare; there was a physical device for every switch, for every bit. But not much later, when we started research and teaching, we had integrated circuits, and computers were becoming ubiquitous, fast, and cheap.

Another big thing was access to computers. For most of the time we were in college and graduate school, to develop a program you had to put each line of the program on a punched card and give a box of punched cards to a computer operator. And you would get to run your program once a day.

But not much later, when we started research and teaching, we had time-shared terminals; we were always connected, and we have been connected ever since. And as I mentioned, when we started school my thesis was typed by a secretary. You would present the result, and six months later you would sort of see what it looked like and submit it. It might be a year between the time you get the result and the time somebody sees it.

But not much later we had word processing and mathematical typesetting, and we could have much quicker and much broader communication of our research results. Another important thing is that when we were in school and graduate school, the curriculum was about math. Everybody learned lots of math: I learned PDEs and abstract algebra and probability and topology. That's what people with an interest in working in technical fields did.

But by the time we started research and teaching, there was computer science. People had to learn about compilers, algorithms, data structures, graphics, operating systems, programming languages, numerical analysis, and all kinds of other fields related to computer science.

So these are huge differences in a relatively short amount of time, and in thinking about it while preparing this talk, I really came to understand and believe that this was a profound change in the way the world worked. Maybe even more profound than the evolution of PCs and personal computing, or even the internet. The world was a vastly different place when we started to get to work. So that's the context where this

story starts.

Now, analysis of algorithms: that's the field of study that both Philippe and I were engaged in, and the questions it asks are natural and age-old ones.

And it actually started with Babbage. This is a quote from Babbage, who is widely credited with designing the first computational engine, a mechanical device that could do arithmetic computations. Even before building the thing, he said that as soon as an analytic engine exists, it will necessarily guide the future course of the science, because you would be able to do computations. But he also said: "whenever any result is sought, the question will arise: by what course of calculation can these results be arrived at by the machine in the shortest time?" That was in 1864, and you can see why it was important to Babbage: this thing actually had a crank, and the only way it could compute things was by somebody turning the crank. Obviously, you want to minimize the number of times you need to turn the crank. And computers were expensive and slow and used energy and so forth, so minimizing the cost of computation was always very important.

Even Turing, the founder of theoretical computer science, could see the importance of these kinds of practical questions: we want to have a measure of the amount of work involved in the computing process, even though it might be a crude one, so we count up the number of times that elementary operations are applied in the whole process. That is, in order to figure out how much work a computation is going to take, and to help in designing efficient computations.

But the field of analysis of algorithms was really initiated by Knuth in the 1960s. What Knuth told the world, and there was some debate about it at the time, was that classical mathematics really has the tools we need for understanding the performance of algorithms: things like recurrence relations, generating functions, and asymptotic analysis. That has the benefit of giving a scientific foundation for the analysis of algorithms. Knuth wrote a series of books, four so far; the first one came out in the late 1960s and two more came out in the early 1970s. They really set out this scientific foundation: we can use classical mathematics to understand the performance of algorithms. And with those mathematical models, we can go ahead and accurately predict performance and compare the efficiency of algorithms. And that's what we found exciting.

We could use classical mathematics to understand the cost of a computation and then test out those results: formulate a hypothesis about how long it would take to do something, and then validate that hypothesis by actually implementing and running the programs and checking them against the math. There were many, many practical applications where people needed these kinds of accurate mathematical models and predictions. And Knuth's books were very densely filled with information that helped us advance this science.
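That workflow, predict with a mathematical model and then validate against a running program, can be illustrated with a small sketch. This example is mine, not from the lecture: it instruments a textbook mergesort to count element comparisons, then checks the observed count against the classical model, which says the count is at most N lg N (and roughly N lg N on random inputs).

```python
import random
from math import log2

def merge_sort(a, stats):
    """Sort list a with top-down mergesort, tallying element
    comparisons in stats['compares']."""
    n = len(a)
    if n <= 1:
        return a
    left = merge_sort(a[:n // 2], stats)
    right = merge_sort(a[n // 2:], stats)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        stats['compares'] += 1          # one comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])             # one side is exhausted;
    merged.extend(right[j:])            # copy the rest without comparing
    return merged

# Hypothesis: for random input of size n, compares stay below n lg n.
n = 1024
stats = {'compares': 0}
data = random.sample(range(10 * n), n)
result = merge_sort(data, stats)
print(stats['compares'], "observed vs", n * log2(n), "model bound")
```

Running the experiment many times, at many sizes, and comparing against the formula is exactly the kind of hypothesis-validation loop described above.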

So that's a brief history of how we got started with analysis of algorithms.