[MUSIC] Hello, this is Adrienne Fairhall. Welcome back for my final week of this part of Computational Neuroscience. This week we're going to be talking about computing in carbon. We're going to be zooming down from the very high-level picture that we had of neurons and what they compute to the single-neuron level, thinking about how the brain in fact instantiates the kinds of computations that we saw in previous lectures. So we're going to be talking about the basics of neuroelectronics: the membranes, the ion channels, and the wiring that make our brains work. We'll be going from there to talk about simplified single-neuron models, and thinking there about the basic dynamics of neuronal excitability. Can we, as modelers, come up with simpler models that capture the essential dynamics that neurons instantiate? Finally, we'll be talking about neuronal geometry. We see that neurons have very complex structures, so how do these matter? How do they affect the computational properties of the neuron, and in what way do we need to take them into account?

Our goal for today is going to be to take this somewhat realistic picture of a neuron, complete with a soma, with dendrites that collect inputs from other neurons, and an axon that transmits action potentials generated in or near the soma to other neurons, and turn it into a circuit diagram. We're going to start with a model of just a small patch of membrane that generates action potentials, and I hope to show you exactly what gives a neuron its very distinctive behavior, the ability to fire an action potential, using the classic Hodgkin-Huxley model. Just to note that while we've been using quite a variety of mathematics in this course so far, including linear algebra and probability, today's methods are going to be based largely on differential equations. We'll be making quite heavy use of first-order differential equations, and that should be enough to understand what's going on, which is interesting, as you'll see. What I hope you'll take from today's topics is an understanding of ion channel dynamics and a general sense of how one can make these models either more simple or more complex, depending on your goals.

So let's start with a review of simple circuits. I imagine that for some of you it's been a while, if ever, so I'll go over the basics. First, remember that any line in a circuit diagram represents a wire: current can flow freely along it, and there's no voltage drop along the wire. All voltage changes around a circuit are associated with circuit elements, such as a capacitor here, or a resistor, or a battery. A capacitor is an insulator which accumulates charge on either face. A resistor resists the flow of current, like a narrowing in a pipe which slows down the flow of water. A battery causes a potential drop. At any junction of wires, the total current is zero. This is called Kirchhoff's law. For example, here at this junction, let's imagine that we were driving the circuit with some input, let's say Iext. In that case, the sum of the currents flowing out, here through the capacitor and here through the resistor, must add up to the current that's coming in, Iext. How does the potential change across a resistor? There's a voltage drop equal to the current times the resistance. This is Ohm's law. Often in neuroscience we use an alternative measure of resistance, the conductance, which is just one over the resistance. So now let's look at a patch of the membrane that encloses our neuron. How are we going to model it using such a circuit?
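To keep these relations handy as we build the membrane circuit, here is a minimal numerical sketch of Ohm's law, the conductance, Kirchhoff's current law, and the capacitor relation IC = C dV/dt. The component values are illustrative assumptions, not values taken from the lecture slides.

```python
# Minimal sketch of the circuit relations reviewed above, with made-up values.

R = 10e6       # resistance in ohms (10 megaohm, a plausible membrane-scale value)
g = 1.0 / R    # conductance in siemens: g = 1/R
C = 100e-12    # capacitance in farads (100 pF)
V = 0.02       # voltage drop across the resistor, in volts
I_ext = 3e-9   # external current driven into the junction, in amps

I_R = V / R            # Ohm's law: current through the resistor
I_C = I_ext - I_R      # Kirchhoff's current law: the rest flows into the capacitor
dV_dt = I_C / C        # capacitor relation I_C = C dV/dt, rearranged for dV/dt

print(f"I_R = {I_R:.3e} A, I_C = {I_C:.3e} A, dV/dt = {dV_dt:.3e} V/s")
```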
Remember that the membrane consists of two layers of lipids, or fats. These are good insulators. Embedded in the membrane are ion channels, which allow ions to pass through the membrane selectively. So let's use those laws to write down a first pass at an equation for the neuron. For now we'll leave aside the ion channels; we'll come back to them. The membrane still allows some small amount of charge to flow through. But how does the membrane itself behave? The lipid bilayer behaves like a capacitor. Since some charge can pass through the membrane, we also need a resistor in parallel. This gives us this very simple circuit.

So how do we write down an equation for this circuit's behavior? We'd like to obtain an equation for the voltage here, V, across the membrane. Now we can use Kirchhoff's law to find it: all the currents at the junction have to balance. So we're putting in some external current, Iext. What is IR, the current through the resistor? We can get that from Ohm's law: it's just V/R. How about the capacitive current? We get that from the definition of the capacitance, given here: the capacitance is defined to be the charge stored on the capacitor divided by the voltage across it. Now we can take the time derivative of this expression. Let's rewrite it as Q = CV and take the derivative: dQ/dt = C dV/dt, since C is just a constant. Now dQ/dt, the time derivative of the charge, is just the current IC. So now we see that IC is just C dV/dt. We throw these terms back into Kirchhoff's law and we get the following differential equation: C dV/dt + V/R = Iext. This differential equation is linear and first order in V; that is, its dependence on V is simply linear.

So people paying sharp attention will have realized that while this might be all well and good for a little chunk of membrane floating around in some solution, it's not really what's going on with our cell. Remember from the first lecture that the membrane encloses a solution in which the concentrations of various ions are different from the outside. So this now is more like it. Outside the cell is the harsh world, basically the sea, with high sodium, high chloride, and also high calcium concentrations. Inside, the cell has hoarded some potassium, which is necessary for many of its life operations. It maintains these ionic gradients with specialized pumps that exchange sodium for potassium. Now, this concentration gradient gives us a battery and maintains a potential difference across the cell membrane. There are two opposing forces at work: by diffusion, ions tend to move down their concentration gradient, at least until that force is opposed by electrostatic forces. The potential difference at which these forces are in equilibrium is called the Nernst potential. This is given by taking the ratio of the concentration outside the cell to the concentration inside, taking the log, and multiplying by a set of constants: E = (kB T / zq) ln([outside]/[inside]), where kB is the Boltzmann constant, T is the temperature, q is the elementary charge, and z is the valence of the ion, the number of charges it carries.

Now how does our equation change in the presence of this battery? The effect is that the part of the voltage drop V that occurs across the resistor is lessened by the battery. Before, all of that voltage drop occurred across the resistor; now part of the voltage drop V is taken up by the battery, and so the part of the voltage drop that occurs across the resistor is V - Vrest. That's going to reduce the current through the resistance.
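Here's a short sketch of that Nernst formula in code. The ionic concentrations are typical textbook values for a mammalian neuron at body temperature, assumed here for illustration rather than taken from the lecture slides.

```python
import numpy as np

# Nernst potential: E = (k_B * T / (z * q)) * ln(c_out / c_in)

k_B = 1.381e-23   # Boltzmann constant, J/K
q   = 1.602e-19   # elementary charge, C
T   = 310.0       # body temperature, K

def nernst(c_out_mM, c_in_mM, z):
    """Equilibrium (Nernst) potential in millivolts for an ion of valence z."""
    return 1e3 * (k_B * T / (z * q)) * np.log(c_out_mM / c_in_mM)

# Assumed, textbook-style concentrations: (outside mM, inside mM, valence)
ions = {
    "Na+":  (145.0, 15.0,  +1),
    "K+":   (4.0,   140.0, +1),
    "Ca2+": (2.0,   1e-4,  +2),
    "Cl-":  (110.0, 10.0,  -1),
}
for name, (c_out, c_in, z) in ions.items():
    print(f"E_{name:4s} = {nernst(c_out, c_in, z):7.1f} mV")
```

With these assumed concentrations the signs come out as described above: sodium and calcium positive, potassium strongly negative, and chloride near -60 mV.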
So now our IR, the current through the resistor, is going to be (V - Vrest)/R, and here's our new equation: C dV/dt + (V - Vrest)/R = Iext. Some of you will be familiar with the solutions of such a differential equation; they take the form of exponentials. So let's put this equation in a convenient form where we can read off some important quantities. With a little bit of algebra we can rearrange it into the form tau dV/dt = V∞ - V. The constant tau here is known as the time constant; perhaps you can figure out what tau is in terms of the original constants. We also have this term V∞. What's the meaning of V∞? Let's set the time derivative to zero. That is, assume that whatever changes we might have made in Iext happened long ago and we've let the system settle down. If you set dV/dt = 0, then V = V∞. So it's just the steady state. Now imagine we're putting in a constant input current. What is V∞ in that case? Here's an example solution of this equation for a square pulse of input current. Initially V is 0, then it rises exponentially until it reaches V∞; when the current is switched off again, it falls exponentially back to 0. So the rising phase looks like V∞ (1 - e^(-t/tau)), and the falling phase looks like V∞ e^(-t/tau).

We said that the potential was given by the ratio of ionic concentrations inside and out. If ions could all flow in the same way through the membrane, we wouldn't need to distinguish between concentrations of different ionic species; we'd just consider the total charge. However, the way that ions flow into and out of the cell is via ion channels, and these do distinguish between different ions. What are ion channels, really? They are amazing little molecular devices, and there are many types. Some are voltage dependent, some are neurotransmitter dependent, some depend on calcium, some are mechanosensitive, and some are heat sensitive. The property we'll be focusing on today is voltage sensitivity. So let's look at the current that comes through a single ion channel as it opens and closes stochastically. We can still apply Ohm's law to this one tiny channel: the current is just the voltage drop across the channel divided by R, the resistance, or we can also write that as I = gV, where g is the conductance of that channel. So two factors set the size of the current: the voltage drop across the membrane and the conductance of the channel.

So it turns out that each ionic species has its own associated equilibrium potential. When the membrane allows a particular ion type to flow through it, through an appropriate channel, that current will tend to pull the membrane potential toward the equilibrium potential for that ion. Here are some examples of the values of equilibrium potentials for different ionic species. Sodium has a positive equilibrium potential, whereas potassium has a very negative equilibrium potential. Here's the one for calcium, which is strongly positive, and chloride sits at about -60 mV, which is around the rest potential. We're going to be concentrating today on these two, sodium and potassium. What you should notice here, and the most important thing to notice for the rest of our discussion today, is that sodium and potassium have opposite tendencies. Sodium currents tend to depolarize the membrane, that is, to move it to more positive potentials, while potassium currents tend to hyperpolarize it, that is, to take it toward more negative potentials.
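To see those exponential rise and fall phases concretely, here is a minimal simulation of the passive membrane equation for a square current pulse, integrated with forward Euler and checked against the exponential solution described above. The parameter values (R, C, pulse amplitude and timing) are illustrative assumptions, not the lecture's.

```python
import numpy as np

# Passive membrane, tau * dV/dt = V_inf - V, driven by a square current pulse.

R, C = 10e6, 100e-12           # membrane resistance (ohm) and capacitance (F)
tau = R * C                     # time constant (here 1 ms)
I_amp = 1e-9                    # 1 nA square pulse
dt, T = 1e-5, 10e-3             # time step and total duration (s)

t = np.arange(0.0, T, dt)
I = np.where((t >= 2e-3) & (t < 6e-3), I_amp, 0.0)   # pulse on from 2 ms to 6 ms

V = np.zeros_like(t)            # membrane potential, measured relative to rest
for k in range(1, len(t)):
    V_inf = R * I[k - 1]                         # steady state for the current input
    V[k] = V[k - 1] + dt / tau * (V_inf - V[k - 1])

# Analytic check of the rising phase: V(t) = V_inf * (1 - exp(-t/tau))
rise = (t >= 2e-3) & (t < 6e-3)
V_analytic = R * I_amp * (1.0 - np.exp(-(t[rise] - 2e-3) / tau))
print("max |numerical - analytic| =", np.max(np.abs(V[rise] - V_analytic)), "V")
```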
So while there are many different types of ion channels that allow different ion species to pass through the membrane, we're going to focus on sodium and potassium. Each ionic path through the membrane has its own battery potential, as we just learned, and its own conductance. The current through each branch of the circuit is given by Ohm's law, but written in terms of conductances, and we have to discount the membrane voltage drop V by the equilibrium potential for each ion. So now we can see that the current flowing through each of these branches, indexed by the ionic type i, is the conductance for that ion multiplied by (V - Ei), where Ei is the equilibrium potential for that ionic species: Ii = gi (V - Ei). There's a short numerical sketch of this multi-branch circuit at the end of this segment.

So this is all interesting: there are different ions, and they each have their own battery. But we haven't yet seen what makes the transformation that a neuron makes on its inputs qualify as a computation. Simply linearly transforming the current through the passive membrane properties doesn't seem to have the flavor of computing. Computing, one could say, requires doing something irreversible: selecting something, getting rid of other things, qualitatively changing something into something else. As we discussed in week two, a linear transformation simply re-represents the input in another coordinate basis. One can always invert that transformation and get back what you started with. In the coding models that we looked at, the linear filters extracted a new representation, but the nonlinearity thresholded that new representation and made the transformation non-invertible. Information about other components was irretrievably thrown away.

So here, at the single-neuron level, you can see how this linear behavior breaks down. These traces all look comfortably linear: nicely scaled versions of the same response as one changes the sign and amplitude of the current input, exactly as a linear system should do. But for some input currents something totally different happens. This is called excitability. So let's take a break here and come back to figure out exactly what's going on.
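As promised above, here is a minimal sketch of the membrane equation with several ionic branches, C dV/dt = -sum_i gi (V - Ei) + Iext, using fixed conductances chosen purely for illustration. With constant conductances the model is still linear: it simply relaxes to a conductance-weighted average of the equilibrium potentials. Letting the conductances depend on voltage is what the Hodgkin-Huxley model adds, and that is where excitability will come from.

```python
import numpy as np

# Passive multi-branch membrane with assumed, fixed conductances and batteries.

C = 100e-12                                     # membrane capacitance (F)
E = {"Na": 0.060, "K": -0.095, "L": -0.060}     # equilibrium potentials (V), textbook-style values
g = {"Na": 1e-9,  "K": 20e-9,  "L": 5e-9}       # fixed conductances (S), assumed for illustration
I_ext = 0.0                                     # no injected current

dt, T = 1e-5, 20e-3
t = np.arange(0.0, T, dt)
V = np.zeros_like(t)
V[0] = -0.070                                   # start near a typical resting potential

for k in range(1, len(t)):
    I_ion = sum(g[i] * (V[k - 1] - E[i]) for i in g)   # total ionic current
    V[k] = V[k - 1] + dt / C * (I_ext - I_ion)

# With constant conductances, V settles to a conductance-weighted average of the E_i
V_rest = sum(g[i] * E[i] for i in g) / sum(g.values())
print(f"simulated steady state = {V[-1]*1e3:.1f} mV, predicted = {V_rest*1e3:.1f} mV")
```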