0:00

What is the shortest path problem? We're given a graph G, that is, we're given a vertex set and an edge set: a set of nodes and a set of links. It is a weighted graph, so each link has a certain number, a weight, associated with it, interpretable as the cost of using that link. And it is a directed graph, so a link from node i to node j doesn't mean that there is also a link from node j to node i.

And our job is actually very simple. We just want to find a minimum-cost path (there might be multiple; we just need to find one) between each pair of source and destination. Okay? So that's a very simple statement, and it turns out this is one of the most extensively studied graph-theoretic optimization problems. It's got the name "the shortest path problem," which is a little bit of a misnomer, because the best name really should be the minimum-cost path problem.
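As a concrete data structure, a directed, weighted graph like this can be stored as a map from each node to its outgoing links and their costs. A minimal sketch in Python; the node names and weights here are made up for illustration, not taken from the lecture:

```python
# A directed, weighted graph: cost[i][j] is the weight of link i -> j,
# interpreted as the cost of using that link.
# Node names and weights are illustrative only.
cost = {
    "a": {"b": 2.0, "c": 5.0},
    "b": {"c": 1.0},
    "c": {},
}

# Directedness matters: a -> b exists, but b -> a does not.
print("b" in cost["a"])  # True
print("a" in cost["b"])  # False
```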

Â 1:32

So how can we solve this problem? There are so many different nice ways. We will look at a specific one based on the dynamic programming principle, or the so-called Bellman equation. If you take an optimization course, you will probably spend quite a few lectures on dynamic programming. Okay? But here we just illustrate it in one specific algorithm.

Again, notice that we're given the graph G, and for each link there is a cost, c_ij for the link from i to j. The main idea behind this Bellman equation, and the resulting Bellman-Ford algorithm, is the following. Say you are node i, and you have a destination n. You want to figure out the best way to get there, the shortest path, or minimum-cost path. Well, you have to get there through one of your outgoing neighbors. (There should be an arrow here.) Say you've got three neighbors; whatever path you take, you have to first go to one of those three neighbors. Okay, and there's a cost of going to each neighbor, c_ik here. Then, let's denote by p_i(t)

Â 2:51

the minimum cost it takes for node i to go to the destination n, from i to n. Let's assume this n is a common, fixed n for everyone. Okay. So we just suppress the n in the notation here, with the implicit understanding that the destination we're talking about is node n. So p_i(t) is the minimum cost for source i to get to destination n using t or fewer hops. You might think, "but we usually reserve the index t for time," and indeed we can view this as the iteration index over discrete time slots as well. It has both a temporal meaning, over a duration, and a spatial, physical meaning, as a hop count.

Using that notation, we can write down the following obvious statements. At time zero, every node takes infinite cost, because you cannot get to the destination using zero or fewer hops, okay. Unless node i is node n itself: if you are the destination yourself, then in that trivial case, yes, you are already there. And we can also write down that, at whatever time, or with whatever number of hops, you can always get to yourself with zero cost. Okay. So these are just obvious statements using this definition of the notation p_i(t).

Now here comes the somewhat magical part. We want to find p_i using, let's say, t plus one or fewer hops. That means we have to go through, well,

Â 4:42

one of the outgoing neighbors, indexed by k, okay, with cost c_ik. Once you get to that neighbor, one of these neighbors, you've already used up one hop. So you then need to get to the destination with t hops or fewer. And if you knew, at this point, the minimum cost it takes for node k (that's the neighbor you just picked) to get to destination n using t or fewer hops, then that's great, because you can then add these two up. Now, the question is: which outgoing neighbor k to pick? Well, I'm going to pick whichever one gives me the smallest sum. So, take the min of the sum over all the nodes indexed by k, that is, your outgoing neighbors. This is the famous Bellman equation, and we can indeed rigorously prove that this equality holds.

Now, this Bellman equation is useful for an algorithm only if you can actually compute p_k(t) efficiently; then you can compute p_i(t+1). But can you compute p_k(t) efficiently? Well, let's use the idea of recursion. Applying the Bellman equation recursively, you can keep unrolling it from t plus one to t, to t minus one, t minus two, all the way down to zero. Then you see that no node can get to the destination in zero or fewer hops, except the destination itself. And then you can recursively work your way back up, back to t plus one.
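In symbols, the update just described is p_i(t+1) = min over outgoing neighbors k of [ c_ik + p_k(t) ], with the boundary conditions p_n(t) = 0 for all t and p_i(0) = infinity for every i other than n. A minimal sketch of one such synchronous update in Python; the graph, node names, and weights below are illustrative assumptions, not the example used in the lecture:

```python
import math

def bellman_update(cost, p_prev, dest):
    """One synchronous Bellman update: p_i(t+1) = min_k (c_ik + p_k(t))."""
    p_next = {}
    for i in cost:
        if i == dest:
            p_next[i] = 0.0  # you can always reach yourself at zero cost
        else:
            p_next[i] = min(
                (c_ik + p_prev[k] for k, c_ik in cost[i].items()),
                default=math.inf,  # no outgoing neighbors: unreachable
            )
    return p_next

# Illustrative graph (not the one in the video); destination is "n".
cost = {"a": {"b": 1, "n": 5}, "b": {"n": 1}, "n": {}}
p0 = {"a": math.inf, "b": math.inf, "n": 0.0}  # t = 0: infinite unless at n
p1 = bellman_update(cost, p0, "n")             # one or fewer hops
p2 = bellman_update(cost, p1, "n")             # two or fewer hops
print(p1)  # {'a': 5.0, 'b': 1.0, 'n': 0.0}
print(p2)  # {'a': 2.0, 'b': 1.0, 'n': 0.0}
```

Note that after the second update, a's cost drops from 5 (the direct link) to 2 (the two-hop route through b), exactly the recursion at work.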

Â 6:38

This gives the Bellman-Ford algorithm. Okay. Essentially, it is an algorithm for efficient computation following this Bellman equation. Notice the min is taken over only those nodes k that are your outgoing neighbors. So, this is one of those beautiful, elegant, and yet extremely useful and often-used equations, like those we encountered in lectures one, two, and three.

So, without going into the underlying dynamic programming principles, because then we would be dragged a little bit off track, let's take a look at an example to illustrate how this can be computed, at least centrally. Okay, suppose you actually know the whole graph. How can you compute this centrally? Then we'll look at, in the next module of the video, the distributed version, where no one has a global bird's-eye view of the whole graph. Then how can you implement this computation?

Â 7:52

So here is the centralized example. This is an extremely small graph. For such a small graph, you could probably just eyeball the answers. Please don't eyeball the answers; let's follow the Bellman equation, because the purpose is not to find the shortest paths in this particular graph, who cares, it is to illustrate the Bellman-Ford algorithm. Okay?

So, notice in this graph we've got A, B, C, D, four nodes, plus a common destination node N: five nodes. We can do all-to-one routing, basically, okay? Find the shortest paths from these four nodes to the common destination N. We could also do one-to-all following a similar procedure, okay?

So, you look at these link weights, the numbers written next to the directed links in the graph, and you see: hey, there are some negative numbers. What do you mean by negative weights? The three negative numbers are really there for generality's sake. They may have some physical meaning in some contexts, but here we just want to show that it is okay to have negative weights on links and have Bellman-Ford still work out, as long as you don't have negatively weighted cycles. A cycle is a concatenation of links that ends up where it started. Okay? So for example, C to A to B to C, that's a cycle, right? It is a concatenation of three links, and you end up where you started. The total cycle weight is minus two plus eight plus four, which gives you ten, a positive number. So as long as we don't have a cycle with negative total weight, it's okay.

What's wrong with a negatively weighted cycle? Well, if the whole cycle is negatively weighted, then you can just keep looping around it an infinite number of times, and that pushes your total cost down toward minus infinity, before you take, you know, one more hop, or two more hops, to get to the destination. So, no negative cycles; otherwise the problem is ill-defined. Negatively weighted links, though, are okay. All right.
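One way to make the "no negative cycles" condition operational: after (number of nodes minus one) Bellman updates toward the destination, the costs must have converged; if one more update still lowers some node's cost, a negative-weight cycle can reach the destination. A hedged sketch; the graphs below are made-up examples, not the lecture's:

```python
import math

def has_negative_cycle(cost, dest):
    """Run |V|-1 Bellman updates toward dest; if a |V|-th update still
    improves some node's cost, a negative-weight cycle can reach dest."""
    nodes = list(cost)

    def update(p):
        return {i: 0.0 if i == dest else
                min((c + p[k] for k, c in cost[i].items()), default=math.inf)
                for i in nodes}

    p = {i: (0.0 if i == dest else math.inf) for i in nodes}
    for _ in range(len(nodes) - 1):
        p = update(p)
    p_extra = update(p)  # one update beyond the guaranteed convergence point
    return any(p_extra[i] < p[i] for i in nodes)

# x -> y -> x forms a cycle of total weight 2 + (-3) = -1: ill-defined.
bad = {"x": {"y": 2, "n": 1}, "y": {"x": -3}, "n": {}}
# Same shape, but the cycle weight is 2 + 3 = 5: fine.
good = {"x": {"y": 2, "n": 1}, "y": {"x": 3}, "n": {}}
print(has_negative_cycle(bad, "n"))   # True
print(has_negative_cycle(good, "n"))  # False
```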

So, let's run the Bellman equation now, iteration by iteration, on this graph. Just keep the graph in mind, okay? We have the following.

Â 11:05

What are these three numbers? Well, it's eight plus infinity, six plus infinity, and four plus infinity, because in the previous iteration they were all infinity. So p_A is infinity, which means you have no way to get from node A to the destination in the first iteration, i.e., using one or fewer hops. Which is obviously true; it's trivial. Of course A is separated from N by at least one node, so you need at least two hops to get to N. But again, don't eyeball the solution. We just want to illustrate the Bellman equation using as simple a numeric example as possible.

Similarly, you can write down p_B, which at this iteration is also infinity. What about p_C? Actually, p_C is not infinity anymore, because it is the min of three numbers: six plus zero if you go to node N directly, minus five plus infinity if you go through its outgoing neighbor B, and minus two plus infinity if you go through its outgoing neighbor A. Clearly you want to pick the first one, which is six. And p_D can similarly be found to be seven. So it's infinity, infinity, six, and seven at iteration one, okay? Using one or fewer hops.

Now iteration two starts. You can now see that suddenly p_A at iteration two is not infinity anymore. It is the min of three choices through its outgoing neighbors. Through B, you get eight plus infinity, because p_B at the last iteration is still infinity; eight plus infinity is no good. But then you've got six plus C's minimum cost from the last iteration, using one or fewer hops: six plus six. Very good. And you've got four plus seven if you go through D. Clearly the min of these is eleven, four plus seven, going through neighbor D. You should write down on a separate sheet of paper exactly which neighbor you are using. Okay, this computes purely the cost of the shortest path, but with that record you can also trace back the actual path itself, in addition to the cost. Good, so p_A is now eleven. Following similar calculations, p_B is now three and p_C is now two. You can easily derive these yourself or see them in the textbook. What about p_D? We're going to stop writing out p_D, because p_D is boring: D has only one outgoing neighbor, which happens to be the destination itself, so it will always stay the same number, which is seven, okay. So now it becomes eleven, three, two, seven.

Keep going. Iteration three: you see that p_A at iteration three is now the min of what? Eight plus what? What does it take for node B to get to the destination? Three. Okay, very good: eight plus three. Then six plus two, and four plus seven.

Â 14:37

Notice that the last iteration's updates to eleven, three, and two are reflected here. Now you've got eight plus three and six plus two, okay? So now p_A is seven... oh, I'm sorry, not seven, this is eight. And similarly, p_B at this iteration is minus one, and p_C is two. Okay. p_D, again, remains the same.

Keep going: t equals four. Okay? p_A, you can now compute, is seven; p_B is minus one; p_C is two; p_D is still the same. And how about we keep going, t equals five, t equals six, and so on? It turns out we don't need to do that. There are two ways to look at it. One is that the answer no longer changes. Okay? You can try it, but you will see that it no longer changes: the Bellman-Ford algorithm has converged to the right answer. And second, if you actually have a bird's-eye view and know how many nodes there are, you know there are four nodes, okay, other than yourself, out there. So you can stop at iteration four. You know this a priori, because if you keep going any further, say I give you one more hop, you can use five hops or fewer, or even a hundred hops rather than just four, I'd say: I don't care, it doesn't help me anymore. Why? Because the only way for me to use more than four hops is to go through some node twice, since there are only four other nodes out there. And that means I would have to go through a cycle. But I know the cycles here are not negatively weighted, so a cycle would only add to my cost. No cycles are needed, so I don't need to go beyond four hops. Either way, you know you can stop.

And this illustrates the Bellman-Ford algorithm. But it does not show you how to do it in a distributed way.
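The whole centralized procedure just walked through can be sketched as follows: iterate the Bellman update at most (number of nodes minus one) times, record the minimizing neighbor so the actual path can be traced back, and stop early once nothing changes. The graph below is an illustrative assumption with one negative link and no negative cycle; it is not the exact graph from the video:

```python
import math

def bellman_ford_to(cost, dest):
    """All-to-one minimum costs toward dest via the Bellman equation.
    Returns (p, next_hop): the costs and each node's chosen neighbor."""
    p = {i: (0.0 if i == dest else math.inf) for i in cost}
    next_hop = {dest: None}
    for _ in range(len(cost) - 1):       # |V|-1 iterations always suffice
        changed = False
        p_new = dict(p)
        for i in cost:
            if i == dest:
                continue
            for k, c_ik in cost[i].items():
                if c_ik + p[k] < p_new[i]:
                    p_new[i] = c_ik + p[k]
                    next_hop[i] = k      # remember which neighbor won the min
                    changed = True
        p = p_new
        if not changed:                  # converged early: stop
            break
    return p, next_hop

def trace_path(next_hop, src, dest):
    """Follow the recorded next hops from src to dest."""
    path = [src]
    while path[-1] != dest:
        path.append(next_hop[path[-1]])
    return path

# Illustrative graph: one negative link (c -> b), no negative cycle.
cost = {"a": {"b": 4, "c": 2}, "b": {"n": 3}, "c": {"b": -1, "n": 6}, "n": {}}
p, nh = bellman_ford_to(cost, "n")
print(p["a"])                    # 4.0  (a -> c -> b -> n: 2 - 1 + 3)
print(trace_path(nh, "a", "n"))  # ['a', 'c', 'b', 'n']
```

The `next_hop` record is the "separate sheet of paper" mentioned in the lecture: the cost recursion alone gives only the numbers, and tracing the winning neighbors recovers the path itself.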

Â