National Academies Press: OpenBook

Proceedings of a Workshop on Statistics on Networks (CD-ROM) (2007)

Chapter: Keynote Address, Day 2--Variability, Homeostasis and Compensation in Rhythmic Motor Networks--Eve Marder, Brandeis University

Suggested Citation:"Keynote Address, Day 2--Variability, Homeostasis and Compensation in Rhythmic Motor Networks--Eve Marder, Brandeis University." National Research Council. 2007. Proceedings of a Workshop on Statistics on Networks (CD-ROM). Washington, DC: The National Academies Press. doi: 10.17226/12083.


Keynote Address, Day 2

Variability, Homeostasis and Compensation in Rhythmic Motor Networks

Eve Marder, Brandeis University

DR. MARDER: I'm a neuroscientist. I should say that I'm an experimental neuroscientist who has worked at various times and places with a variety of theorists, including my fellow speaker, Nancy Kopell. What I would like to do today is give you a sense of some problems and approaches where neuroscience and network science intersect. I realize that you'll think some of my take-home messages are probably obvious, even though my neuroscience colleagues often get very disturbed and distressed and start foaming at the mouth. The problem that we have been working on for the last 10 or 15 years, like all good problems, is both completely obvious and not. For this group it's basically the problem of how well-tuned the parameters that underlie brain networks have to be for them to function correctly. If you had asked most experimental neuroscientists 20 years ago how well-tuned any given synapse would have to be, or how well-tuned the number of any kind of channel in the cell would have to be, they would probably have said plus or minus 5 percent. You should know that neuroscientists have traditionally had the perspective that things have to be very tightly regulated, and we can talk later about why that is. About 10 or 15 years ago we started working on a problem that has since been renamed homeostasis and compensation. This comes back to something that is true in all biological systems, and not true in mechanical or engineering systems in the same way: almost all the neurons in your brain will have lived for your entire life, but every single membrane protein that gives rise to the ability of cells to signal is being replaced on a time scale of minutes, hours, days, or weeks.
The most long-lasting ones are sitting in the membrane for maybe a couple of weeks, which means that every single neuron and every single synapse is constantly rebuilding itself. This in turn means that you are faced with an incredible engineering task: how do you maintain stable brain function? I still remember how to name a tree and a daffodil and all the things I learned as a child, despite the fact that my brain is constantly, in the microstructure, rebuilding itself. When we started thinking about this, and this goes back a number of years, Larry Abbott and I worked on a series of models that basically used very simple negative-feedback homeostatic mechanisms to try and understand how you could get stable neuronal function despite turnover and perturbations. What that led to was going back to step one and saying, how well do negative-feedback self-tuning stability mechanisms have to work? You have

to know how well-tuned any of the parameters need to be in order to get the nervous system to do a given task. That's what I'm going to talk to you about today.[1] There has been a series of both theoretical and experimental investigations into the question of what the constraints really are. How well does a circuit have to tune all of its parameters in order to give an adequate or a good-enough output? By good enough, I mean good enough so that we can all presumably see a tree and name a tree, although we also all know that my brain and every one of your brains are different. (I would like to acknowledge the work I'll be showing you today of three post-docs in the lab—Dirk Bucher, Jean-Marc Goaillard, and Adam Taylor—who have done a lot of this work, along with two ex-post-docs who are now faculty, Astrid Prinz and Dave Schulz, and a colleague of mine, Tim Hickey, from the computer science department.) I'm going to show you a little bit about the maintenance of central pattern generators (CPGs). Those are networks that produce rhythmic movements and that function despite growth. Among adult animals, we ask how much animal-to-animal variation there is in network mechanisms and underlying parameters, and then we will talk about the problem of parameter tuning: how tightly the parameters that determine neuronal activity and network dynamics have to be tuned, and whether stable network function arises from tightly controlled elements or because elements can compensate for each other. CPGs are groups of neurons in your spinal cord or brain stem. In this case I'm going to work in lobsters and crabs, and the particular pattern we will talk about runs a portion of a lobster or crab's stomach.
As shown in Figure 1, if you drill holes in the carapace and put wires into the muscles of what is called the pylorus or the pyloric region of the stomach, you see one-two-three, one-two-three, one-two-three, a rhythmic discharge to those muscles that persists throughout the animal's life. Here you see it in these muscle recordings. When you dissect out the nervous system that produces this motor pattern, you can record intracellularly from the stomatogastric ganglion, which is responsible for producing this rhythmic discharge. In the lower left part of Figure 1 you see three intracellular electrodes from the cell bodies, which show that one-two-three, one-two-three rhythmic discharge pattern. This discharge pattern really is a constrictor-one phase, a constrictor-two phase, and a dilator phase. You can see this either in extracellular recordings from the motor nerves or in intracellular recordings of the pattern.

[1] See also a recent review article on this same material: Marder, E., and J.M. Goaillard. 2006. Variability, compensation and homeostasis in neuron and network function. Nature Reviews Neuroscience 7:563-574.

FIGURE 1

For many years people have tried to understand why this discharge exists and how the particular neurons and their connectivity give rise to things like the frequency and the phasing of all the neurons in the network. The proof that this is a central pattern generator lies in the fact that the rhythm persists even when we have discarded most of the animal and are left with just the stomatogastric ganglion. As neuroscientists we worry a lot about things like why the pattern has the frequency it has and why all these specific timing cues are there the way they are. In Figure 2 we see the circuit diagram that gives rise to this pattern. In these circuit diagrams, we use resistor symbols to mean that cells are electrically coupled by gap junctions, so current can flow in both directions. The other symbols are chemical inhibitory synapses, which means that when one cell is depolarized or fires, it will inhibit the follower cell. This is the circuit diagram for the dilator neurons and the constrictor neurons. None of you could a priori predict what this circuit would do, because there is information missing in the connectivity diagram. I think that's really an important thing for those who worry about dynamics to think about. What is missing in this connectivity diagram is all the information about the strength and time course of the synaptic inputs. Additionally, and as importantly, all the information about the dynamics of all the voltage- and time-dependent currents in each of these cells that give them their particular intrinsic firing properties, which will then be sculpted by their position in the network.

FIGURE 2

So, as neuroscientists, what we have to do if we want to understand how this connectivity diagram gives rise to this particular pattern is ask what kinds of voltage- and time-dependent currents each one of these cells has, then try to understand how all of those currents act together to give rise to that cell's own intrinsic electrophysiological signature, and also how they interact with the synaptic currents to end up producing this rhythmic pattern. I can give you the short answer. The short answer is that the AB neuron shown in Figure 2 is an oscillatory neuron that behaves a lot like this in isolation. It's electrically coupled to the PD neuron, and it's because of that electrical coupling that the PD neuron shows this depolarization burst of action potentials: hyperpolarization, depolarization, et cetera. Together these two neurons rhythmically inhibit the LP and the PY neurons, and so force them to fire out of phase. The LP neuron recovers from that inhibition first, because of the timing cues of the inhibition that it receives, and because of its own membrane conductances. I can tell you as a first approximation why that occurs, but it took a long time to even get to that first approximation. That said, the question is, for this network, how tightly tuned do these synapses have to be in terms of strength, and how tightly tuned do all the membrane currents in each of these cells have to be? What I'm going to do is first show you some biological data that will constrain the question, and we'll talk a bit about some models that frame part of the answer. Then I'll go back to some recent biological experiments that go directly after the predictions of some of the models. The first little piece of data I'm going to show you is from some juvenile and adult lobsters. As lobsters mature to adulthood, they grow considerably, and obviously their nervous

system changes size as well. Figure 3 shows a juvenile's PD neuron (on the left) and one from an adult (on the right). You can see that the cells have a certain look that has been maintained through that size change. Those of you who are neuroscientists realize that as the cell grows from small to large, every feature of its electrical properties has to change because its cables are growing—that is to say, the distance signals will travel. Positions of synapses have to change, and therefore the number of channels in the membrane has to change. All of these things have to change, just as when a 2-year-old or a 14-month-old who is learning how to walk grows into an adult, the whole peripheral plant is changing. It's a challenge to say how this happens if the cell has to maintain constant physiological function.

FIGURE 3

Here in Figure 4 are some intracellular recordings, some made from a juvenile, some from an adult. You can see by eye that the waveforms and patterns look virtually identical between that baby lobster and the big lobster, which tells you that there has to be a mechanism that maintains stable function despite a massive amount of growth, and therefore a massive change in any one of those properties.

FIGURE 4

The question this poses is that there have to be multiple solutions to producing the same pattern, or very similar patterns, because in the small animal and the big animal many of the parameters are actually different, as are the numbers of channels. This tells you that at least during natural life, the animal manages to find its way from one solution and to grow its way continuously without making mistakes that shift it into other whole classes of solutions. To go beyond making that simple assertion, I would like to show you what we realized we had to do, which was to ask what the variation is among individual animals. Basically, this defines the range of normal motor patterns in the population. What we are going to do to quantify the motor patterns is look at these extracellular recordings from the motor nerves showing one-two-three, the discharge pattern we have been looking at, that triphasic motor pattern. We can obviously measure the period. We can measure the duration of the bursts. We can measure latencies and the phase relationships of all the cells in the network. (See Figure 5.)
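Those measurements can be made concrete with a small sketch. This is my own illustrative code, not anything from the lab: it assumes each burst has been reduced to an onset and offset time in seconds, with the pacemaker's burst onsets marking the start of each cycle.

```python
# Quantifying a rhythmic motor pattern from burst times (illustrative).
# Times are in seconds; the pacemaker's burst onsets mark cycle starts.

def mean_period(pacemaker_onsets):
    """Mean cycle period from successive pacemaker burst onsets."""
    gaps = [b - a for a, b in zip(pacemaker_onsets, pacemaker_onsets[1:])]
    return sum(gaps) / len(gaps)

def burst_phases(onset, offset, cycle_start, period):
    """Onset and offset of a follower burst as fractions of the cycle.

    Phases, unlike raw latencies, stay comparable across animals whose
    periods differ two-fold.
    """
    return (onset - cycle_start) / period, (offset - cycle_start) / period
```

For a 1-second cycle starting at t = 0, a burst lasting from 0.4 s to 0.6 s has onset phase 0.4 and offset phase 0.6; it is phase values like these that can stay constant even as the period varies across animals.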

FIGURE 5

The first thing I would like to show you is that if we look at different animals, the mean cycle periods vary about two-fold. Figure 6 shows a histogram of the cycle periods in 99 animals.

FIGURE 6

DR. BANKS: I could imagine that every lobster has its own sort of starting-point cycle time, and with age that cycle time slowly increases or slowly decreases. I think that type of trend would be invisible if you do destructive testing.

DR. MARDER: We do destructive testing, so we have actually done a great deal of work on embryonic animals. That is a whole other story. This system is built in the embryo. It has to be put in place before the animals hatch, because these motor patterns need to be ready before the animal starts to eat. In the babies, we tend to see more variability, and to some degree slower patterns, which then consolidate, so there is a whole other story about how things change very early in development.

DR. BANKS: You don't know that this pattern is stable over the entire portion of its life. It could change.

DR. MARDER: There are two answers to why we think it's stable within the range of the population. I don't believe for a moment that an animal keeps an identical period over time. All I'm saying is that it's going to be within that range, and presumably as things are tuning continuously, it may wander around within the tolerance level, and we can't monitor it long enough.

FIGURE 7

Figure 7 shows the phase relationships, where we are looking at the time in a cycle where a cell starts firing and ends firing. And again, you see that over this very large range of periods, the phase relationships are remarkably constant. This in and of itself is actually a very challenging problem, because getting constant phase relationships over a large frequency range with constant-time-constant events, which is basically what you have, is tricky. Now, thanks to really beautiful work by one of my former postdocs, Farzan Nadim, I think we really do understand the mechanisms underlying this. Now, if we look at these measures in the juvenile and the adult animals (Figure 8), we see the cycle periods are basically the same. Most of the phase relationships are very close, if not perfectly the same, and we just had fewer juvenile animals than adults, because they are very hard to come by. My guess is that if we had 99 of both of them, they would all fit in totally the same parameter regime.

FIGURE 8

That sets the stage for what we eventually want to account for: motor patterns with pretty tightly constrained, although not perfectly identical, output patterns in different individuals. We now want to go after the structure of the underlying conductances. To do so, I'd like to step back and spend some time on the single neuron. A lot of what I'm going to say is true of all neurons. Neurons, when you just record from them, can have a variety of different behaviors. Some of them can just be silent, and we'll call those silent neurons. Some of them will just fire single action potentials more or less randomly, and we will call those tonically firing neurons. Other neurons will fire what I'll call bursts; that is to say, they will depolarize, fire bursts of action potentials, and then they will hyperpolarize. They will fire these clumped action potentials.

FIGURE 9

Figure 9, which was created by Zheng Liu when he was a graduate student of Larry Abbott, shows something that neuroscientists all take for granted, which is that as you change the conductance density or the channel density for all the different kinds of channels in the membrane, these behaviors can change. I should tell those of you who aren't neuroscientists that any given neuron in your brain might have—you probably all know about the classic sodium channel and the potassium channel in the Hodgkin-Huxley type axon, where sodium carries current in, and potassium is partly responsible for the repolarization of the action potential. Real neurons have 6, 8, 10, 12, even 18 different kinds of voltage- and time-dependent conductances. Some carry calcium, some carry chloride. Sometimes they are cationic conductances, sometimes they are gated by other things. Each one of these has its own characteristic voltage and time dependence that can be characterized in very much the same way, by differential equations of the same form as those with which Hodgkin and Huxley characterized the action potential. Zheng made a model to make these traces that contained a conventional sodium current, two different kinds of calcium currents, and three different kinds of potassium currents. Then he varied the relative densities of those currents, and that's what gave these different characteristic patterns of activity.
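A current of the Hodgkin-Huxley form just mentioned is typically written I = g_max · m · (V − E_rev), with the gating variable m relaxing toward a voltage-dependent steady state. Here is a minimal sketch; the function names and parameter values are illustrative, not taken from any model in the talk:

```python
import math

def m_inf(v, v_half, k):
    """Sigmoidal steady-state activation curve (illustrative parameters)."""
    return 1.0 / (1.0 + math.exp(-(v - v_half) / k))

def step_gate(m, v, v_half, k, tau, dt):
    """One forward-Euler step of the gating equation dm/dt = (m_inf(V) - m) / tau."""
    return m + dt * (m_inf(v, v_half, k) - m) / tau

def current(m, v, g_max, e_rev):
    """Instantaneous current I = g_max * m * (V - E_rev) for first-order gating."""
    return g_max * m * (v - e_rev)
```

Holding V fixed, m relaxes exponentially toward m_inf(V); summing several such currents into a membrane equation C dV/dt = −ΣI is what produces traces of the kind shown in Figure 9, with the behavior depending on the relative g_max values.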

What we experimental neuroscientists do is use a method called voltage clamp to isolate and then measure each of the currents. They are measured one by one in the cell. We can then fit those experimental data to differential equations that characterize those properties. Then we put all the currents from a cell back together again in a simulation and try to recover the properties of the cell. That's where the data in Figure 10 come from. These models can be very useful in giving you intuition about what goes on, which combinations of currents must be there to determine these properties, and so on.

FIGURE 10

The first thing that I would like to show you from these sorts of experiments is illustrated in Figure 10. These data come from Mark Goldman, who was Larry Abbott's graduate student and who is now on the faculty at Wellesley. They represent a model very much like the one I just showed you, one with a sodium current, a calcium current, and three different kinds of potassium currents. What Mark found, and this is really quite interesting, is that he could have silent cells that have either low or high sodium currents. He had tonically firing cells that could have either low or high sodium currents, and bursting cells that could be either low or high. And the same thing was true for any one of the five currents in the model. What this tells you, contrary to what most neuroscientists believe, is that knowing any single current by itself will not be adequate for telling

you what the behavior of the cell will be.

FIGURE 11

It turns out that if you plot the three potassium currents against each other on a 3-dimensional plot, as in Figure 11, you can see all these areas where you can have regions of silent, tonically firing, or bursting cells. So, even the values of those three potassium currents are insufficient to predict what the cell's behavior will be. On the other hand, the space partitions better if you plot one of those potassium currents against the calcium current and the sodium current. You can then see the space partitions nicely. This tells you that in this particular model, you have to know the correlated values of three of the five currents, and it has to be this subset of three out of the five, which is actually a very inconvenient answer, because it makes it difficult to do many kinds of experiments. We are going to take this approach and go one or two steps further. Figure 12 shows work done by Astrid Prinz. She wanted to build a really good model for the pyloric rhythm, because she wanted to build a nice model of that cell regulating itself homeostatically. She wanted good models for the PD, LP, and PY cells, so she began by hand tuning. She got really frustrated, because hand tuning is unsatisfying; it's very hard when you have eight or so conductances in the cell, as we have here.

FIGURE 12

Instead of that, she decided to use brute force. There are eight different currents in the cell. We have a leak current and a current I haven't spoken to you about before, a hyperpolarization-activated inward current, plus three different potassium currents, two different calcium currents, and a sodium current. She took six values for each of those currents, one of the six being zero in all cases. She then simulated all combinations, creating a database of model neurons to span the full range of behaviors. That was 6 to the 8th power, or about 1.7 million versions of the model. She ran them all and did this without telling me, because I would have screamed. (All good things that happen in my lab happen without me knowing about them, I kid you not.) She then wrote algorithms to search through the behavior of those cells, to classify their behaviors in terms of whether they were silent, tonically firing, or bursting, and what they did. She saved enough of the data so that we could search that database and find different neurons or different behaviors.

DR. BANKS: Is it absolutely clear that you can distinguish a tonic from a bursting neuron?

DR. MARDER: You can make up criteria that will distinguish them. We'll come back to that. What do you see when you do that analysis? Obviously you see cells with all different kinds of behaviors, as shown in Figure 13. There is a single-spike burster, which has a single spike and then a long plateau phase. There is a more conventional burster, and there are just other bizarre things.
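The brute-force construction is easy to sketch: six levels per conductance, eight conductances, every combination simulated. In the sketch below the classifier is only a stub standing in for the actual simulation and classification, which the talk does not specify in detail:

```python
from itertools import product

LEVELS = (0, 1, 2, 3, 4, 5)   # six values per conductance, one of them zero
N_CONDUCTANCES = 8

def all_models():
    """Every combination of conductance levels: 6**8 = 1,679,616 model neurons."""
    return product(LEVELS, repeat=N_CONDUCTANCES)

def classify(conductances):
    """Stub classifier; a real one simulates the cell and labels its activity
    as silent, tonically firing, or bursting."""
    if all(g == 0 for g in conductances):
        return "silent"   # no currents at all: trivially silent
    return "unclassified"
```

That 6**8 grid is the "about 1.7 million versions" in the talk; each tuple of levels indexes one simulated neuron in the database.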

FIGURE 13

Astrid spent a fair bit of time classifying those neurons, but the real question, which comes back to what we are asking, is how these partition in conductance space—what are the overall structures? What is of real importance to us, if we think about cells trying to homeostatically tune, is whether you have a single connected region—in two dimensions it's real easy to see—or multiple disconnected groups that can all be classified as bursters, which is obviously going to be more problematic than if the space were better behaved. Obviously, this you all know. What I'm going to do is very briefly show you a new way we can visualize what is going on in 8-dimensional spaces. It's useful to be able to bring eight dimensions down into 2-D, which we do through something called dimensional stacking. You start out with two conductances, everything else set to zero, which yields a plot. Then we take that grid and embed it in another grid, which has those full ranges of behaviors embedded for all the other values in two dimensions, and we repeat that grid over and over again. Now we have four dimensions represented on the plot, and then you repeat that, and repeat it again and again until you have all the conductances plotted, with one large and then smaller and smaller sets of repeating units. This gives you all of those 1.7 million neurons, in principle, represented on one plot. The real key is to use color to represent their activities. Because there are 40,000 different variations in the way you can do the stacking—e.g.,

what axis you put as the smallest or the biggest—we asked ourselves what would be the most useful stacking order. Tim Hickey and Adam Taylor, who actually did a lot of this work, decided that maximizing large blocks of color would mean there were fewer transitions, and might best let you see where the different patterns of activity were. To go back to this issue of connectedness: in this particular model, what do these parameter spaces look like? What Adam Taylor did was to ask, in the 8-dimensional space, whether he could find all the cells that were nearest neighbors of one another, varying only one value of one current at a time. He then defined those as a population. It turns out that of all the tonically firing neurons, 99.96 percent of them can be found in one connected region of parameter space. We were very surprised and pleased by that. One hundred percent of the silent neurons are found in a connected region of parameter space. Of the bursters that have 6-10 spikes per burst, 99.3 percent are in one region of connected space, and there are all these little regions where many of the others sit. Surprisingly, almost all of the models in each of the three major activity types are in single islands, which means that if a cell is going to have to tune, it actually has a much easier job, because it can keep moving around, making minor modifications of one or more currents; it doesn't have to find 4 or 12 different isolated islands, and therefore cross into very different regions of activity, in order to get there. I would now like to show you one thing about our single-spike bursters. Of all the single-spike bursters, 99-point-something percent of them are in a single island of connected space, yet the underlying conductances are in different regions of the map. Indeed, each one of these cells, which to an electrophysiologist would look virtually identical, actually varies considerably in its underlying conductances.
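The nearest-neighbor connectedness test just described can be sketched as a flood fill on the grid of conductance levels, where two models are neighbors exactly when they differ by one level in a single conductance. This is an illustrative reconstruction of the idea, not the lab's implementation:

```python
from collections import deque

def connected_region(start, same_class, n_levels=6):
    """All parameter sets reachable from `start` through `same_class`
    by changing one conductance one level at a time.

    same_class: set of tuples sharing one activity type (e.g. all bursters).
    """
    seen = {start}
    queue = deque([start])
    while queue:
        p = queue.popleft()
        for d in range(len(p)):          # try each conductance dimension
            for step in (-1, 1):         # one level up or down
                v = p[d] + step
                if 0 <= v < n_levels:
                    q = p[:d] + (v,) + p[d + 1:]
                    if q in same_class and q not in seen:
                        seen.add(q)
                        queue.append(q)
    return seen
```

If a flood fill like this grows from one seed to cover essentially every member of a class, that class occupies a single island, which is the situation reported for the tonic firers, the silent models, and the bursters.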
So if I start looking at any one of these currents, you can see that all of these models, which have very similar behavior, nonetheless have very different underlying structure. That says that widely disparate solutions producing similar behavior at the single-cell level may be found in a continuous region of parameter space, and therefore that relatively simple tuning rules that target activity levels might be used to tune cells. At the level of the network, Astrid Prinz did the same sort of thing we just did at the level of the single neuron, but she did it to try to form a pyloric rhythm. She took five candidate AB-PD cells from the single-cell database, using different ones to make sure the results were not idiosyncratic to any one model, along with six versions of the PY neuron and five of the LP neuron. She then took all the synaptic connections and simulated each of them at five or six strengths. She ended up simulating more than 20 million versions of the three-cell network. Now we can go in and see what those do.
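The scale of that screen comes straight from combinatorics over the candidate pools. In the sketch below, the cell-model counts come from the talk, but the number of synapses and the strength grid are placeholder assumptions:

```python
from itertools import product

# Candidate cell models named in the talk: 5 AB-PD, 6 PY, and 5 LP versions.
ab_pd, py, lp = range(5), range(6), range(5)
cell_combos = list(product(ab_pd, py, lp))  # 150 distinct cell-model triples

def total_networks(n_synapses, n_strengths):
    """Total simulated networks: every cell-model triple crossed with every
    assignment of a discrete strength to each synapse."""
    return len(cell_combos) * n_strengths**n_synapses

# With a handful of synapses each swept over five or six strengths, the
# count reaches tens of millions:
print(total_networks(7, 6))  # 41,990,400 under these assumed settings
```

The exact synapse count and strength grid used in the study may differ; the point is only that a three-cell circuit with a few swept parameters per connection easily exceeds the "more than 20 million" mentioned above.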

The results include some solutions where you get a tri-phasic rhythm in the right order, others where you don't get a tri-phasic rhythm at all, and then a whole variety of other kinds of mistakes: the tri-phasic rhythm is in the wrong order, or it's almost tri-phasic but one of the cells doesn't follow. You can get really bad solutions and better solutions. Here you have different model neurons with the same synaptic strengths, and again you get the full range of behaviors. This tells you that as you vary either the synaptic strengths or the intrinsic properties of the individual cells, you can get a variety of different network outputs. Now the question is how many of these meet the criteria for being a good model of the pyloric rhythm, using the biological criteria we established at the very beginning of the talk. Of the full set of networks, 19 percent were pyloric-like; that is to say, they were tri-phasic in the right order. If we then apply the criteria from the biological database and pick out all the model networks that meet all of them, 2.4 percent were in the right frequency range, had the right duty cycles, the right phase relationships, and so on; there were something like 12 or 15 criteria that led us to that subset of networks. We can now ask of that 2.4 percent, which fit within a range we would expect to be relevant to the biological system, what do they look like in their underlying structure? Analysis shows large differences between the specific conductances of two such model networks. For instance, the synaptic conductance from PD to PY is much greater in model network 2 than in model network 1. Similarly, the AB-to-LP synapse differs considerably between the two model networks, as do many of the other conductances.
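The filtering step, from all networks down to the 19 percent that are tri-phasic and the 2.4 percent that pass every test, can be pictured as a stack of range checks against the biological database. The feature names and ranges below are illustrative placeholders, not the study's actual criteria:

```python
# Each simulated network is reduced to a few summary measurements, then kept
# only if every measurement falls inside the biologically observed range.
# These names and bounds are invented for illustration.
CRITERIA = {
    "cycle_frequency_hz": (1.0, 2.5),
    "lp_duty_cycle": (0.20, 0.45),
    "py_duty_cycle": (0.25, 0.50),
    "lp_onset_phase": (0.30, 0.55),
}

def is_pyloric_like(measurements, criteria=CRITERIA):
    """True only if every measured feature lies within its allowed range;
    a missing feature yields NaN, which fails the comparison and rejects."""
    return all(
        lo <= measurements.get(name, float("nan")) <= hi
        for name, (lo, hi) in criteria.items()
    )

ok = {"cycle_frequency_hz": 1.5, "lp_duty_cycle": 0.30,
      "py_duty_cycle": 0.40, "lp_onset_phase": 0.45}
print(is_pyloric_like(ok))                                 # True
print(is_pyloric_like({**ok, "cycle_frequency_hz": 5.0}))  # False
```

The real screen used something like 12 to 15 such criteria, but the structure is the same: a conjunction of interval tests, with the surviving fraction shrinking as criteria are added.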
So, if you look at all of these, you see that even though the behavior of the two networks is very similar, very disparate sets of underlying conductances give rise to that very similar behavior. What this tells the biologist is that there is probably a whole series of compensating changes, so that one parameter over here is compensated for by a correlated or compensating change over there. The interesting question is, what set of correlations, including multi-dimensional correlations, gives rise to these viable different solutions? In the model we see parameters that can vary over very large ranges; some parameters in these successful solutions may vary 5-fold, 10-fold, or 20-fold. This says that if you have enough dimensionality, even with a very restricted output you can come up with a large range of solutions. If we look now at 633 models with very similar outputs, none of them have a small PY-to-LP synapse, and none of them have a large LP-to-PY synapse, so these strengths are obviously important to producing that particular output. On the other hand, the AB-to-PY synapse can be either small or large in this population of networks, so there are presumably some things that are much more important, or more determinative, than that conductance.

These data were very interesting to us. Now we have the responsibility to go back to the animal and measure the synapse from LP to PD in 20 different animals to see how variable it is in reality; or, if we measure the amount of potassium current in 20 animals, how variable is it? So I'm going to finish with some new experiments that basically say: given the conclusions of this sort of analysis, the prediction would be that each individual animal will have found a different solution set. Therefore, if we measure a given parameter, we should be able to see that variation in levels. That's what we are going to do, and we're going to do it in crabs. We looked at intracellular recordings from the LP neuron in two animals, with two ongoing rhythms. There is one LP neuron in each animal, so it's very easy to compare them. Superimposing them shows that their wave forms and dynamics in the network are almost identical. But when we measured the potassium currents, one can see differences in the amount of calcium-activated potassium current. Looking at 8-9 animals, we found a range of values, about 2-fold to 4-fold. This range is very similar to what most biophysicists see in most cell types. Experimentalists have always attributed that range to experimental error, but if you ask biophysicists, they will all say, oh yes, 1.5-3 or 2-4 or something like that; it's a very common range. To conclude, taken at face value these data say that it's very important for an animal to maintain stable physiological function throughout its lifetime, even though the nervous system has to constantly rebuild itself. Part of what we see is a tremendous heterogeneity in the solutions found by individual animals, and probably by the same animal at different times in its life.
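The correlation question raised above can be probed directly once the successful solutions are arranged as a data matrix (models as rows, conductance and synapse parameters as columns): compensating pairs show up as strong off-diagonal entries of the correlation matrix. A sketch on synthetic data standing in for something like the 633 similar-output models; the anti-correlated pair is fabricated purely for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_models, n_params = 633, 8
params = rng.random((n_models, n_params))  # placeholder parameter values

# Build in one compensating pair for demonstration:
# parameter 1 rises as parameter 0 falls, up to a little noise.
params[:, 1] = 1.0 - params[:, 0] + 0.05 * rng.random(n_models)

corr = np.corrcoef(params, rowvar=False)  # n_params x n_params matrix
print(f"corr(param 0, param 1) = {corr[0, 1]:.2f}")  # close to -1
```

In real data one would scan the whole matrix (and higher-order combinations) for such structure; a strongly negative entry between two conductances is exactly the signature of "one parameter over here compensated for by a change over there."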
What we eventually have to understand is which parameters are very tightly controlled, which are co-regulated, which are coupled, which are free to vary, and which don't matter. And then we have to understand the rules that allow cells to wander around in their protected regions of parameter space as they constantly rebuild themselves to keep our networks functioning throughout our lives.

QUESTIONS AND ANSWERS

DR. DING: I would like to ask two questions. One is, are there multiple PD cells? You said for each cell type there is only one?

DR. MARDER: There are two PD cells in every animal; there is one LP cell. There are four GM cells, et cetera.

DR. DING: Okay. Another question is, for each of the model neurons that you studied, over a million of them, presumably these neurons are highly nonlinear units. Couldn't they be capable of multiple behaviors?

DR. MARDER: Under different conditions, yes; if you inject current, yes. But I just showed you their unperturbed behavior. Presumably, if you apply a whole variety of different perturbations, they will change their activity patterns, and you could then select for all the cells that were bursting unless you do X, in which case they might have made a transition to bistability or something like that.

DR. KLEINFELD: I had the same question. As you cited from your work and Ron Harris-Warrick's work, when you add small amounts of modulators to the animal (and the animal has its own endogenous modulators), you change the firing properties of cells. Since the firing properties of the cells change with endogenous modulators, and you showed that only 2.4 percent of all possible solutions were biologically relevant under these steady-state conditions, do you in fact get a much smaller range of solutions if the cells actually have to track the effects of modulators?

DR. MARDER: David has asked that question much more politely than some people have. There is a really interesting question about neuromodulation. With neuromodulation you apply a chemical, and you might alter one of the conductances by 30 percent and see a very big difference in the cell's properties. One piece of the question is, if the population has a 2-fold to 4-fold range, why do you see an effect at all with a 30 percent modulation? It's because a 30 percent change from any one of those solutions can still, with all the other compensating currents, give you an effect, even though that 30 percent variation would fall within the range expected across the population.
I think the same thing is true of the network. What we know from our old data is that many neuromodulators have what we like to call state-dependent actions; that is to say, they have much stronger actions on networks that are going slowly than on networks that are going quickly. Part of the answer is that every network is in a slightly different configuration. Modulators may always take them in the same direction, but how far they take them will depend on the underlying structure. One of the things we're trying to do now is address this question in the models, to see whether many modulators go after multiple currents and multiple sites of action. What we think is going on is that modulators may target correlated sets of currents and synapses to act on, so that the whole system can go in the right direction, but each network gets more or less of that movement because of the modulator's action on multiple currents at different sites.

DR. KLEINFELD: So, you're saying the modulator should cause things to co-vary, in principle a little like you see in these analyses?

DR. MARDER: Or the modulators have been chosen to go after the right set of compensating or cooperative currents and parameters.

DR. KLEINFELD: So, the second question was, does number one imply that if you added an interfering RNA, you should be able to drive the from—

DR. MARDER: Yes, if one could get the interfering RNA to work in your lobster that is sitting at 12 degrees and has really large pools of message and protein, yes.

DR. DOYLE: I just want to make a quick comment tying this up with my talk yesterday. If you were to do the same kind of analysis on a range of clocks, you would presumably find that the parameter space that gave good behavior was very small, and so, early in the development of timepieces, it was very difficult to get clocks that were accurate. If you do the same analysis on this system, you find that almost none of the parameters matter much at all, and that there are a few that have to be more or less perfect. I have only studied bacteria, but bacteria appear to do the same thing that advanced technologies do. In advanced technologies, like my digital watch, you use the network architecture to create extreme robustness to the things that are hard to regulate well, and you make extremely fragile architectures for the things that are easy to regulate. The reason digital watches are so cheap is that there is a large amount of integrated circuitry in there that does all of the extra stuff, and it's cheap and sloppy, and there are a few little things that can be manufactured to very high tolerances, also cheaply.
By structuring the network in the right way, you can combine these very sloppy things that are cheap to do with a few things that are very precise, and the consequence is that the network then has incredible robustness. Everything that has been studied in bacteria seems to follow that pattern. I don't know if it holds here, but it looks as though this is a standard thing in biology and in advanced technologies, though not in primitive technologies; it's only in our most advanced technologies that we are doing that.

DR. MARDER: My guess is that anything useful from bacteria, the nervous system has kept and used, because that's how brains were able to develop. I think that's really cool. The advantage of working in bacteria is that you have fantastic genetics. It's going to be much harder for us neuroscientists to get some of those answers than it was for you.

DR. POULOS: I'm Steve Poulos from the NSA. What keeps these cells pyloric?

DR. MARDER: The way I like to think about it is that early in development these cells are determined to have a certain set of firing properties; that is to say, certain genes are turned on that say, in effect, I want to burst at approximately this frequency. Those then become set points for a series of homeostatic tunings.

REFERENCES

Bucher, D., A.A. Prinz, and E. Marder. 2005. "Animal-to-animal variability in motor pattern production in adults and during growth." Journal of Neuroscience 25:1611-1619.

Goldman, M.S., J. Golowasch, E. Marder, and L.F. Abbott. 2001. "Global structure, robustness, and modulation of neuronal models." Journal of Neuroscience 21:5229-5238.

Liu, Z., J. Golowasch, E. Marder, and L.F. Abbott. 1998. "A model neuron with activity-dependent conductances regulated by multiple calcium sensors." Journal of Neuroscience 18:2309-2320.

Prinz, A.A., D. Bucher, and E. Marder. 2004. "Similar network activity from disparate circuit parameters." Nature Neuroscience 7:1345-1352.
