Transcript of EP 180 – Lynne Kiesling on the Electrical Grid

The following is a rough transcript which has not been revised by The Jim Rutt Show or Lynne Kiesling. Please check with us before using any quotations from this transcript. Thank you.

Jim: Today’s guest is Lynne Kiesling. She’s an economist focusing on regulation, market design, and the economics of digitization and smart grid technologies in the electricity industry. She’s a research professor in the School of Engineering, Design and Computing at the University of Colorado Denver, and she’s co-director of the Institute for Regulatory Law and Economics. Lynne also provides advisory and analytical services as President of Knowledge Problem LLC, and she’s an adjunct professor in the Master of Science in Energy and Sustainability Program at Northwestern University. You got a lot going on there, Lynne. Welcome to the Jim Rutt Show.

Lynne: Thanks, Jim. I’m happy to be here. Do have a lot going on, lots for us to talk about.

Jim: Indeed. Yeah. I met Lynne at an event at the Santa Fe Institute where she gave a very interesting presentation, even though it was relatively short within the time constraints of the program that we were both on. So, I thought I’d invite her here, and we can do a dive into one of my favorite domains, which is the electrical grid and the things that are happening and should happen, but might not, to the electrical grid in the years ahead. One of the things that’s particularly interesting about electricity as a product is it’s got a useful life of about one microsecond. It gets generated, and then either distributed and used, or put down into the ground. At least that was the case before there was any form of storage. What are some of the other interesting things about electricity as a product category?

Lynne: Well, I think, if I can build from your physical grounding there, pun definitely intended. The fact that it is really only useful for about a microsecond means that, when you think about it at a kind of transactional layer, supply and demand have to be in real-time balance at all times. So, that has some pretty serious implications for how we designed the grid back in the late 1890s after… I don’t know if you want to talk about Tesla, and Edison, and Westinghouse.

Jim: Nah, nah, nah.

Lynne: Nah.

Jim: We don’t need that ancient history. Y’all can look it up. Tesla won. Just gave you a spoiler there, folks.

Lynne: Accompanied by one of my favorite Pittsburghers, George Westinghouse. Yeah. For the past 130 years, the architecture of the grid, because of the alternating current nature of it, requires real-time balance between supply and demand. What that meant, in a time of mechanical systems and analog controls, is that you needed to have a big central control room in order to achieve that supply/demand balance. So, you had an engineer in the control room kind of flipping switches and twiddling dials to keep that balance in check.

Jim: Yeah. That is a very interesting thing. If you think about it, say a large-scale grid, several states, there are consumption points coming on and going off all the time. You turn on your electric stove, you just put a little pull on the grid, not quite as volatile, at least not until recently. On the supply side, generators are coming on, going off, having unexpected outages, having scheduled maintenance, etc. So, there’s this very crazy dance going on between demand and supply that’s got to be balanced down to the microsecond, or whatever’s left over has to be grounded out into the ground.

Lynne: Yeah.

Jim: I don’t know how much is grounded out anymore, but I know it was a non-trivial amount back in the old days.

Lynne: Yeah. I’ve lived in Chicago for most of my adult life, and we infamously had what were two of the early, pioneering, big power plants from Sam Insull, Fisk and Crawford. By the 1990s, the city had grown out to where the plants were located. So, there were a lot of environmental problems associated with those neighborhoods. But, the other thing that was really a problem was when they had excess generation, there were periods where they would just boil the water that was in the channel going down to the Illinois River. So, yeah. I think storage is going to definitely reduce our waste because that’s really, really wasteful, physically, environmentally, and economically, to be putting all the resources into generating this valuable product. At any point in time, if you have excess, so it really isn’t valuable in that moment, then you waste it.

Jim: Yup. Then especially, again, this is the old days picture, which we still sort of live in. There were basically two kinds of power generation. There’s base load, which is big, honking power plants that take quite a while to start up and shut down. Then, there’s so-called peakers. They can be turned on and off more rapidly. Some of them quite rapidly, some of them almost instantaneously, but not quite.

Typically, the base load guys are coal plants or oil plants, not that there’s very many of those anymore. And, nuclear plants, hard to start up and shut down for a number of reasons. They’re quite economically efficient, at least if you don’t count the externality costs of CO2 for the coal plants. Then, the peakers used to be diesel engines. Now, a lot of them are natural gas. Hydro can also be a peaker, if it’s designed correctly. So, as you start thinking about the loads coming on in the afternoon, everybody’s coming home from work, getting ready to fire up their kitchens, etc., or a hot day where the air conditioning is going to be a big demand, the peakers start to come online, based on some rationale, I suppose in theory at least, on the ones that are most efficient coming on first and the least efficient coming on last. Is that kind of a tolerable cartoon version of peakers and base load?

Lynne: Yeah. I think that’s a pretty good summary of how it works. Let me layer on some of the economics of why it is that way. The big, honking power plants, the central station, large-scale generators like nuclear and big coal-fired power plants, they have very high fixed costs. They’re very capital intensive so it takes a lot of assets to build these things. Then once you’ve built them, the actual additional cost that you incur to generate an additional kilowatt-hour of electricity is really small. So, it’s basically your fuel costs, and some labor, and a little bit of some other stuff. It’s just dwarfed by the capital cost, the fixed cost of constructing the plant itself. That means that those big power plants have something called economies of scale, which means that, at least for some large amount of demand, if you have just one plant generating all that electricity, that’s going to be the lowest average cost way of doing it. So, that’s why you get this concentration in the early 20th century.

I mentioned Sam Insull. He’s one of the guys who really pioneered and pushed General Electric, and others, to engineer these big power plants. It really did transform the economics of the industry, and we have that to this day. But the trade-off of that, as you say, is that it takes a while, and it takes a lot of effort and a lot of cost, to spin those guys up. In other words, they’re what we call slow ramping generators. So, you don’t ever want to turn those things down except for when you have to for maintenance. So, you want them to run as flat out as possible because turning them on/off is extremely time-consuming and costly. Whereas for peakers, it’s more like turning on and off a jet engine. In particular, since the 1980s, the combined-cycle gas turbine, which uses natural gas, is extremely energy efficient and has about half the greenhouse gas emissions of a coal-fired power plant. It literally is a jet engine that you could just hook up in a power plant, and just turn on and off as needed.
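
The economies-of-scale point here can be made concrete with a toy calculation. This is an illustrative sketch with invented numbers, not data from any actual plant: the fixed construction cost dominates, so the average cost per kilowatt-hour falls as output rises.

```python
# Toy illustration of economies of scale: a capital-intensive plant's average
# cost per kWh falls as output rises, because the big fixed cost is spread
# over more kilowatt-hours. All numbers are made up for illustration.

def average_cost_per_kwh(fixed_cost, marginal_cost, kwh_generated):
    """Average cost = (fixed cost + variable cost) / output."""
    return (fixed_cost + marginal_cost * kwh_generated) / kwh_generated

FIXED = 5_000_000_000   # hypothetical plant construction cost, dollars
MARGINAL = 0.02         # hypothetical fuel + labor cost per kWh, dollars

for kwh in (1e9, 5e9, 20e9):
    print(f"{kwh:.0e} kWh -> ${average_cost_per_kwh(FIXED, MARGINAL, kwh):.3f}/kWh")
```

Running this shows the average cost falling from several dollars per kWh toward the two-cent marginal cost as output grows, which is why one big plant serving a large demand was the lowest-average-cost option.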

Jim: Got it. But now, things are getting more interesting. Tell us about distributed energy resources.

Lynne: We are starting to socialize the idea of calling it an energy transition. So, this energy transition that we’re in is fascinating. We’re coming from this very early 20th century notion of the electric utility as this kind of vertically integrated system. That was Edison’s idea. Edison was like, “We’re building a system. It’s all integrated, from the generator through the wires.” For him, it was even into the lighting fixtures inside the home. That, to some degree, I would argue, came from technological necessity that those mechanical and analog machines all had to work together and be coordinated with humans in the loop, so to speak.

So, there’s two big areas of technological change that we’ve been having since the ’80s. One is the big changes in generation technologies. The combined-cycle gas turbine jet engine on a platform is one of them. Then in the ’90s, there was really substantial innovation in wind turbine technologies. So, that’s one reason why wind power has proliferated the way it has, the production cost has fallen and wind turbines have become so much more energy efficient because of a lot of innovations that happened in the ’90s. With the long blades, some of the composite materials that are used to make wind turbines, the way that rotors are made, all kinds of pieces parts that go into a wind turbine, and they’ve all gotten better and improved energy efficiency. That means that we have more distributed wind turbines. That’s good from a low carbon perspective.

Then, both the gas turbine and the wind turbine are larger scale, what I guess we would call utility scale type technologies. What has really gotten interesting in, I guess I would say the past 15 years, has been the falling production costs for solar photovoltaic. Solar PV is a kind of mid-20th century technology. It started really, really small as part of the Apollo missions. It’s only since the 1970s that it’s been used to generate electricity for uses other than on spacecraft, and that’s a much more demanding use.

So, there’s been a lot of innovation recently. The production costs for PV panels have plummeted 60% in the past decade. There’s a lot of interesting dynamics there. Some of it is just kind of natural innovation and entrepreneurship. Some of it is policy driven, and those two intersect. Once you have that, and you can have more distributed rooftop solar panels, that’s when we get into what you referred to, this idea of distributed resources. So with PV, you have this distributed ability to generate low carbon energy. I say low carbon, not zero carbon, on a life cycle basis because, of course, the process of producing PV panels is energy intensive. That’s still a work in progress from a carbon perspective. So, you’ve got the distributed PV that can go on the roof of your house, the roof of your garage, as a canopy over parking lots, and kind of make use of unused roof space or airspace to generate electricity.

Then, the other really important distributed energy resource, from a complementary perspective to this, is the electric vehicle and battery storage. I put those two together because essentially an EV is just a battery on wheels that provides you two types of value streams. One, it can store energy, and two, it can transport you places you want to go. Otherwise, the EV and the battery have a lot of similarities in terms of being able to spread production and consumption, spread demand and supply, and balance them out over time so that really helps.

Storage is the holy grail in the grid. It’s the thing that gets you out of that problem that you mentioned of having to have that microsecond balance. These distributed resources are really now coming into their own economically and physically.

Jim: Of course, not only are they distributed, but they’re also intermittent. Right? There are times when the wind doesn’t blow. Guess what? All night solar doesn’t do anything. Even during the day during cloud cover, it can fall quite considerably, and it can be quite localized. Big old thunderstorm goes across a big solar farm, for instance. Right? Oops.

Lynne: Yup.

Jim: Big fallout in production. Of course, wind can vary. It’s got trend lines. You can predict it some ways in advance. It can come and go, as well. Not only are they distributed, which is quite different than the big, giant plants. Everybody knew where they were and what they were doing. And, if you had to, you could talk to them on the phone, find out whether they were up, or not. But, these things are scattered every which way, and are constantly going up and down, and losing and gaining productive capacity.

Lynne: That variability, I think, over the next few years, that will turn from a bug into a feature. If you put yourself in the shoes of kind of a 1960s era distribution system, control room engineer, operator person, and you say, “Okay. I have to control what’s going on in the grid to balance supply and demand. I have these big generators, and they’re always running. I just have to basically tell them to ramp up and down a bit over a certain amount of time.” That’s a very, very different problem from what you just articulated.

I’m not an engineer, but the engineering challenge of this is substantial. It’s one of the things that, I think, is causing a lot of consternation in the industry, and a lot of hesitation about distributed resources. We don’t know how to dispatch those things. We can’t dispatch those things because they’re intermittent. They show up when they show up, and we can’t dispatch that way. So, part of what’s going to have to happen, I think, is a cultural rethinking of the process of operating the grid, as the grid becomes more heterogeneous in its resources and in its behavior. But, that’s a real tricky challenge for the engineers.

Jim: Yeah. I understand, also, that the way the grids were designed makes perfect sense. I like to say, the lines coming into your house, the assumption was the power was going in, not coming out. But now, where people have rooftop solar, or occasionally wind, they could be moving large amounts of power, maybe even more than their house normally consumes, back out into the grid.

Lynne: Yeah.

Jim: That’s an unanticipated change in network topology.

Lynne: Yeah. Exactly. The change in the nature, if we want to think in terms of network topology, and think about the grid as a system. So, I’ll just start by saying system, although I have an idea of where I want to go with this. I’m sure we’ll get there. In the kind of 20th century architecture and 20th century network topology, producers are producers. They have these large-scale generators. All they do is they push current on this one-way flow in the direction of consumers. The consumers might be a big factory, or an office building, or a hospital, or an apartment building, or a house. There’s industrial, commercial, and residential customers, of different sizes, but that’s all they are, is consumers. Right? It’s just, as you said, the power’s coming in. I flip the switch, and the light goes on. And, in that 20th century landscape, the technology and the regulations both combined to make electric service sold to customers a commodity service. Right? I flip the switch, and the light goes on. They’re all just electrons. I don’t care as long as I have, as they say, cold beer and warm showers.

So, that’s the kind of 20th century landscape. Regulation fed into that by embedding this form of pricing to customers, that was basically a fixed rate price. What the electric utility would do, they basically estimate their costs, report their costs to the regulator, plus a rate of return on their assets. That all gets wrapped up in what’s called a revenue requirement. Then, you go through this elaborate calculation to figure out the price to charge to those three types of customers, industrial, commercial, and residential. When you add up the expected revenue, it exactly equals the utility’s total cost. So, total cost equals total revenue, average cost equals average revenue, and the prices that customers paid were fixed, and-
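
The revenue-requirement arithmetic described here can be sketched in a few lines. All numbers are invented for illustration; real rate cases allocate costs across customer classes in far more detail.

```python
# Hedged sketch of the rate-of-return calculation: the utility's reported
# costs plus an allowed return on its asset base form the "revenue
# requirement," which is divided over forecast sales to yield a fixed
# average price per kWh. All figures are illustrative, not any utility's data.

def revenue_requirement(operating_costs, asset_base, allowed_return):
    """Total costs the regulator allows the utility to recover."""
    return operating_costs + asset_base * allowed_return

def fixed_rate(rev_req, forecast_kwh_sales):
    """Average price per kWh so expected revenue equals total cost."""
    return rev_req / forecast_kwh_sales

rr = revenue_requirement(operating_costs=800e6, asset_base=2_000e6, allowed_return=0.10)
price = fixed_rate(rr, forecast_kwh_sales=14.3e9)  # dollars per kWh
print(f"Revenue requirement: ${rr/1e6:.0f}M, rate: {price*100:.1f} cents/kWh")
```

With these made-up inputs, the flat rate comes out around the 7 cents a kilowatt-hour mentioned a moment later, paid at 3:00 AM and 5:00 PM alike.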

Jim: And, by time of day, also. Right? If it came out 7 cents a kilowatt-hour, that’s what you paid, 3:00 in the morning or 5:00 on a Friday afternoon.

Lynne: Exactly. You’re mentioning the time difference for a very important reason, that’s the actual value of what you’re consuming. The value of the electricity differs considerably from 3:00 in the morning to 5:00 in the afternoon. Also, the cost of producing it. Right? At 3:00 in the morning, when most everyone’s asleep, you crank up that big, nuclear power plant, and your marginal cost is very, very low. So, the cost of producing that kilowatt-hour at 3:00 AM is really low. The value’s also low, but you’re paying an average price. At 5:00 PM, it’s the exact opposite. The cost of producing that kilowatt-hour is high, both in terms of resources and also in terms of the emissions impact. But, the value to you of that 5:00 PM kilowatt-hour is high because that’s when you need your air conditioning. You’re still paying that average price that’s lower than your value at that time. So, you have this mismatch between price and value by doing this averaging.

Jim: Yeah. Particularly now that we have these intermittent sources that are relatively strongly correlated with the diurnal cycle. The wind is, on average, stronger at night, not by a lot, but by a bit. Of course, solar goes to zippo at night. We have, now, this temporal pattern. We always had a temporal pattern of usage, and now we have a temporal pattern of supply to lay on top of the temporal pattern of usage. It’s not all that well correlated. Solar is better correlated than wind, but there are some definite non-correlations there, as well.

Lynne: Yeah. That’s one reason why storage has always been the holy grail, and continues to be. If you can capture that inexpensively produced wind power at 3:00 AM, then save it up and use it in the morning, that’s very beneficial, both economically and environmentally. The other thing that’s, I think, going to be a really seismic architectural shift is… In this 20th century world, you’ve got these big generators, it’s just this one-way flow from generators to customers. As customers, we’re all paying these fixed average rates. What that means is, from an architectural and operational perspective, the way you run the grid is that you ramp the supply up and down to make sure it meets demand. So, that means that you do a lot of demand forecasting, or what’s called load forecasting, to anticipate, as best you can, what demand is going to look like in half an hour, in a day, in a week, next year.

You do a lot of this demand forecasting so that you can get the supply resources lined up, it’s called resource adequacy, to be able to meet that demand. But now, as you say, with these intermittent resources, supply is going to be more variable. It might be both economically and environmentally beneficial to let supply vary. But then, what does that mean? That means that if supply’s varying, then you need to be able to vary demand to meet supply, as opposed to varying supply to meet demand. So, I think that’s going to be a pretty big seismic shift in our thinking of how we run the grid.

Jim: Yeah. I remember when, I think, California… Was that the first big grid to go to time of day pricing for residential customers? It was quite a while ago. I knew of people that were setting their dryers to run at 3:00 in the morning, for that reason. We can, of course, do a whole lot more than just that.

Lynne: Right. During the California energy crisis, 2000/2001 period, there was a period when San Diego Gas and Electric customers could see a real-time price. I think, as someone who is very strongly in favor of dynamic pricing, it was a little bit not ready for prime time. It was there, they did it. Yeah. You schedule what you can to happen at 3:00 in the morning. I think you learn, very quickly, just how much variation there is in the cost of generating electricity over the course of the day, which of course most people don’t think about. We’re just habituated to think, “Okay. It’s always there. I have to flip the switch, and the light goes on. I don’t have to think about it any more than that.”

Actually, in Chicago, the utility there, ComEd, has a residential real-time price rate, and that’s what we’re on. I have a digital thermostat which I could program. There’s this “If this, then that” little, plug-in algorithm that I can program into my thermostat to make it price responsive. But, I haven’t quite debugged my code, yet.

Jim: And of course, we know that only one-tenth of 1% of people are ever going to do that. So, the thermostats need to have that software built in. You just throw a switch and say, “Run algorithm B,” for instance. Right? That’s how these things will actually go to market. Toggling in the program into your thermostat, not going to happen too often.
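
A minimal sketch of what such a built-in “algorithm B” could look like, assuming a thermostat that can see a real-time price. The threshold and setpoints here are hypothetical, not any vendor’s actual defaults.

```python
# A pre-packaged price-responsive thermostat rule: relax the cooling setpoint
# when power is expensive, rather than asking the homeowner to write code.
# Threshold and setpoint values are invented for illustration.

def setpoint_for_price(price_cents_per_kwh, comfort_f=72, eco_f=78, threshold=15):
    """Return the cooling setpoint (degrees F) given the current price."""
    return eco_f if price_cents_per_kwh > threshold else comfort_f

# Simulated real-time prices over a few hours of the day:
for hour, price in [(3, 2.1), (14, 11.0), (17, 24.5)]:
    print(f"{hour:02d}:00  {price:5.1f} c/kWh -> setpoint {setpoint_for_price(price)}F")
```

The whole "program" reduces to one comparison, which is exactly why it makes sense to ship it as a switchable default rather than expect customers to debug their own if-this-then-that rules.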

Lynne: Right. This gets us, to kind of fast-forward, to the other of the two big areas of transformational technology. One is the generation technologies that we’ve been discussing, and especially the distributed resources. Then, the storage that we can use to save it up and shift it intertemporally. The other big area of technological change, that’s part of the energy transition, is digitization. Since the mid-2000s, first it started as smart grid, and then it became grid modernization. The digitization of the electric grid has been going on for about 17 years, incrementally. Digital meters, the smart meter to meter your consumption at your home, is one type of technology to do that. Then, on the inside of the grid, the guts of the grid, whether it’s in transformers, or substations, the ways that you can do fault detection and automated repair, even within the wires network itself.

This is not my area of specialty, but there’s an area, mostly high voltage focused, called dynamic line rating, which is a way that you can use digital sensors to monitor current flow in real-time and see essentially what the capacity is of a given wire. If it’s operating at less than full capacity, you can re-rate the line to be able to have more flow one way or the other. So essentially, you can increase the total flow of the wire by doing this digital dynamic line rating, so-
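
A heavily simplified sketch of the dynamic line rating idea. Real DLR implementations use detailed conductor thermal models (in the IEEE 738 style); this toy version only captures the point that cooler, windier conditions let a line safely carry more current than its conservative static rating, and the coefficients are invented.

```python
# Toy dynamic line rating: scale the static (worst-case) rating up when
# sensors report weather that cools the conductor. This is NOT a real
# thermal model; the coefficients below are illustrative assumptions.

def dynamic_rating_amps(static_rating, ambient_c, wind_m_s,
                        ref_ambient_c=40.0, temp_coeff=0.01, wind_coeff=0.05):
    """Re-rated current limit (amps) given real-time sensor readings."""
    temp_bonus = max(0.0, (ref_ambient_c - ambient_c) * temp_coeff)
    wind_bonus = wind_m_s * wind_coeff
    return static_rating * (1.0 + temp_bonus + wind_bonus)

static = 1000.0  # amps, the conservative fixed rating
print(round(dynamic_rating_amps(static, ambient_c=40, wind_m_s=0), 1))  # worst case: 1000.0
print(round(dynamic_rating_amps(static, ambient_c=10, wind_m_s=4), 1))  # cooler and breezy: 1500.0
```

The payoff is exactly what's described above: without building any new wire, sensors can unlock headroom that a one-size-fits-all static rating leaves unused.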

Jim: Wow.

Lynne: Yeah. That’s pretty cool.

Jim: Now, just for a point of information. Today, if somebody has rooftop solar at their house, is it communicating in real-time back to some control center, or are they just dumping their electrons into the wires?

Lynne: It depends. A couple of years ago, the IEEE issued a digital inverter standard. If you have solar panels on your roof, the solar panels capture the sun’s rays and basically create electrons. The current that comes out of that is direct current. In order to get the electricity from your panels onto the grid, you have to change that direct current to alternating current. So, you have to go from a flat line to a sine wave, and that means going through an inverter.

Now, IEEE has these standards for digital inverters to, as far as I understand it, basically make that physics handshake a little more straightforward and easier to monitor, easier to just interconnect those resources onto the grid without it being disruptive. If they have those digital inverters, there’s a lot more visibility into what your panels are doing. I don’t know how widespread those are in use. We’re definitely in a transition process on those. I think otherwise, you put your excess generation out on the grid. I think each panel installation has a physical limit. You can’t put out more than a certain amount. The distribution system engineers will set that so that it doesn’t cause any disruption to the rest of the grid.
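
The DC-to-AC step described above, in cartoon form. A real inverter synthesizes the waveform with switching electronics and filtering; this sketch just computes the target 60 Hz sine wave the grid expects from the flat DC line the panels produce.

```python
# The inverter's target: turn a flat DC line into a 60 Hz sine wave.
# This only shows the waveform math, not how an inverter actually does it.
import math

def ac_sample(t_seconds, rms_volts=120.0, freq_hz=60.0):
    """Instantaneous grid voltage at time t for a 120 V RMS, 60 Hz sine wave."""
    peak = rms_volts * math.sqrt(2)  # ~170 V peak for 120 V RMS
    return peak * math.sin(2 * math.pi * freq_hz * t_seconds)

# One full cycle lasts 1/60 s; the wave crosses zero at t=0 and peaks a
# quarter-cycle later.
print(round(ac_sample(0.0), 3))      # 0.0
print(round(ac_sample(1 / 240), 1))  # peak: ~169.7
```

Keeping millions of these waveforms phase-aligned with the grid is the "physics handshake" that the digital inverter standards are meant to make easier to monitor.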

Jim: Got it. That’s interesting. Now, when you think about systems and systems mediated by computers, if the rooftop solar system and its inverter is sending useful telemetry back up the network, doesn’t do any good unless there’s the right software on the other end to make sense out of it. As a person who spent much of their career building large-scale systems for industries, not the electrical industry, more financial services, and then later the internet itself, we know that sometimes these big players, their software technology can be really behind the times. How are the people that were managing the grid, these days, with respect to the currentness of their software and their ability to take advantage of this new telemetry that’s coming at them?

Lynne: Yeah. I think your use of the word telemetry is exactly right, and almost immediately gets beyond the extent of my expertise. It’s important because the coordination that has to happen in the grid is very much about phase angles, and because we’re talking about waves, keeping all these angles within certain ranges so that you don’t disrupt the sine waves. It’s been a challenge over the past 15 years. A big part within the utility focus of the grid modernization and smart grid has been building that software capability, so they can take advantage of all these digital sensors and be able to have more visibility into what’s going on in the system, and to have more fine-grained control of it. Different utilities are moving with different speeds towards this. It’s definitely still a work in progress.

At the kind of bleeding edge part of the continuum, you can have things like substation virtualization. You take your substation, and you create a digital twin of it. You can use that to manage what’s going on in your substation. It reduces the amount of time you have to spend rolling crews out to go maintain things out in the field. You can do more remotely. The famous things that people will say about the digitization of the grid is… Certainly when I was a kid, a squirrel falls in the transformer in your backyard, and your lights go out. You know it, but the utility doesn’t know it. So, you have to call them and say, “Hey, my lights are out.” At least now, we tell them on Twitter, as opposed to having to phone them up. They are getting more visibility into knowing when and where there are outages. So, the digitization is really good for reliability outcomes for customers getting a more reliable service.

Jim: I got to tell you a funny story about that. Here in Virginia, where I live, before they renamed themselves Dominion Power, our statewide electrical utility was called Virginia Power. It turned out there was a poor woman, that was her name, Virginia Power. When the lights went out someplace, people would call directory assistance. Some percentage of the operators, this is how far back in time it was, this was operators, would give them the number for poor Miss Virginia Power. I don’t remember what she did, but it was quite a mess. Right?

Lynne: Oh. After the invention of the answering machine, whenever there’s a storm, I would just turn off my ringer and put the answering machine on to say, “No. I’m not that Virginia Power.”

Jim: “Call this number, please.”

Lynne: Right.

Jim: Right?

Lynne: Right.

Jim: Yeah.

Lynne: The other area of digital technology that I think is really important, on the other side of the transaction, if you will, is on the customer side. We’ve been talking about, within the guts of the wires, the digital sensing and monitoring, and automation in substations, and transformers, and the implications of that for the control room and software. I should actually say, as long as I’m thinking about the control room, and we’ve talked about distributed energy resources, one area of a lot of research and some new investments, right now, is in an area called DERMS, which is distributed energy resource management system. This is a software platform that enables a distribution utility to manage the DER that it has interconnected on its network. It’s still very much a centrally controlled paradigm. It’s a software platform that’s intended to try to create better management, and coordination, and visibility, when you have more DER, so-

Jim: Gotcha. Before we move on to the other side, let me wrap up two other things. Then, we’ll move on to the punchline about the demand side stuff. First, let’s talk a little bit about storage. Right? As I was mentioning in our little pre-game discussion, I studied mass electrical storage back in 2004 when it was quite the hot topic. Couple of billion dollars a year were being invested into the area, but my analysis said, “Not even close to being economical to be able…” The case I studied was, take overnight nuclear power from Northern New England, store it in Brooklyn, and sell it to New York in the middle of the day. You could buy it for half a cent a kilowatt-hour. You could sell it for an average of about 10 cents a kilowatt-hour, so a 20 to one markup. In those days, the costs of the batteries were just too high.

You talked about, earlier, capital costs. This particular business model was totally dominated by the capital cost per kilowatt-hour of storage, and it was off by about a factor of two. Since then, there have been considerable advancements in battery power, and there’s been some development of pumped-water storage, etc. Of course, as you mentioned, this very interesting thing of these electric vehicles with their big batteries on wheels, that have been created for another purpose, with a different set of economics. Right? They didn’t have to make sense for storing mass power, but they might be able to be used for that. What does the storage world look like to you, right now, and where do you see it going?
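
The arbitrage model Jim describes reduces to a few lines of arithmetic. The buy and sell prices follow his half-cent and 10-cent example; the round-trip efficiency and cycle life are illustrative assumptions, not figures from his study.

```python
# Back-of-the-envelope storage arbitrage: buy off-peak, sell on-peak, and see
# how much battery capital cost the spread can pay back. The efficiency and
# cycle-life numbers are assumptions for illustration.

def daily_arbitrage_margin(buy_price, sell_price, round_trip_eff=0.85):
    """Profit per kWh cycled once per day, net of round-trip losses ($/kWh)."""
    return sell_price * round_trip_eff - buy_price

def breakeven_capital_cost(daily_margin, cycle_life_days):
    """Max capital cost per kWh of capacity the arbitrage can pay back."""
    return daily_margin * cycle_life_days

margin = daily_arbitrage_margin(buy_price=0.005, sell_price=0.10)  # $/kWh/day
print(f"margin: ${margin:.3f}/kWh/day")
print(f"breakeven capex over ~10 yrs of daily cycling: "
      f"${breakeven_capital_cost(margin, 3650):.0f}/kWh")
```

Whether the business works then comes down to whether batteries can be built for less than that breakeven figure per kilowatt-hour, which is the factor-of-two gap Jim ran into in 2004.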

Lynne: I think you’re exactly right. In 2004, the storage technologies didn’t have the energy efficiency to be economical. They definitely weren’t ready for prime time. When I teach energy economics or environmental economics and we’re talking about the cost of storage, I usually talk with students about different types of storage. So, the battery that you put in your watch, or the battery even that’s in your phone, what you pay per kilowatt-hour for the charge you get to run your device is probably one to two orders of magnitude higher than the seven to 10 cents a kilowatt-hour that you pay in your house. Getting storage to be an economical value proposition has been a real challenge. I think, just from a kind of innovation economics perspective, one of the biggest drivers of that has been the development of the lithium-ion battery.

So, I have to [inaudible 00:40:57] Tesla a lot of credit for bringing down the cost of battery production and making the economics of storage much more realistic. Storage, it’s not a generation technology. It’s basically an intertemporal storage technology. You’re just shifting so that you can save some up so that you can use it later. That in and of itself has a value, and you have to compare it to its opportunity cost. The opportunity cost is, “What’s the next best alternative?” The next best alternative is, “I consume from my grid power.” So, that’s why the comparison is always made to, “What do you pay at your house?”

The storage, it’s not just the lithium-ion battery. The electric vehicle has two value propositions. The vehicle can transport me where I want to go. That’s kind of value proposition number one. But increasingly, with a digitized grid, that intertemporal use of your car’s battery as a way to store energy for future use in some way other than driving, is another value proposition that I think maybe in a few minutes we can dig into in a little more detail. There’s all sorts of other storage innovations going on at large scales. I mean, we can think of hydro, and hydro is a millennia-old technology for providing energy that’s not human or animal. Hydro generation is itself a form of storage because you can turn it on and off, largely at will, as long as you’ve got enough water. When you think at that kind of scale, there’s other interesting technologies that people are coming up with. I don’t remember the name of the company, but there’s one that basically has big cement blocks, and when-

Jim: It’s called Energy Vault, I believe.

Lynne: Yes. Energy Vault. Yes. Excellent. You raise the blocks when electricity is cheap. And then, during the day when electricity is expensive, you lower them and recover the energy, so it’s a form of gravity storage. The other one I’m really interested in, right now, is a company called Form Energy. They are developing an iron-air battery, which basically means that the energy is stored by rusting and unrusting a block of iron. So, that’s pretty cool.

Jim: Yeah. I love them. I hope that one works. As we all know, lithium and some of the other components that are needed in lithium-ion batteries are getting hard to find. I mean, there’s enough in the earth as it turns out, but it’s going to be quite a race to get them out fast enough to reach our zero carbon goals.

Lynne: Yeah. Lithium and cobalt. There was a good article in the Economist last week about how the interest in batteries and electric vehicles is driving people to look for sources of lithium and for sources of cobalt that aren’t in places like the Congo or China. So, it’s a very dynamic space right now.

Jim: Yeah. I expect there will be breakthroughs, whether it’s the Form Energy guys. I think Bill Gates is an investor in them. Whether that works or not, somebody’s going to come up with some interesting electrochemistries.

Lynne: Yep.

Jim: There’s such a gigantic and obvious demand for it, which is very interesting. Now, before we move on to the other side of the question, one other thing I’m going to put forward here is, we’re talking about a lot of new electronics to make the grid work in this new, complex-systems environment that it’s in, where both production and consumption are changing radically all the time. It’s worth noting, and worth thinking about, and hoping that people are putting enough attention on this: all that new electronics makes the grid more subject to both cyberattack and massive failures from exogenous events like massive solar flares.

Lynne: Yes, yes. Solar flares, terrorist attacks, electromagnetic pulses. There’s substantial attention to cybersecurity in the grid modernization and smart grid focus. There’s some overlap with what I think of as good internet principles, things like having open standards. I tend to think about these questions from, I guess I would call it, a Linux perspective. Right? The idea that you have a shared set of code and common interfaces. Then, you design your security around the common interfaces. That’s a better way to achieve security than to have proprietary architectures.

Of course, the electricity industry is completely a proprietary-architecture industry, ever since 1895, when Westinghouse and Tesla went up to Niagara Falls and built the entire Niagara Falls generating station basically as a bespoke, artisanal engineering project. Ever since then, it has been more than a century of one-off design.

So, the idea of standardization, modularity, open interfaces, some of the kind of deep internet principles that we’re used to, is a cultural shift in this industry. Cybersecurity is definitely part of the planning, and the thinking, and the design principles that go into thinking about grid modernization. I mean, some of it is very deep and very important, having air gaps on certain devices that just can’t be on the internet. Some of it is as simple as… When you buy that digital thermostat, you shouldn’t be able to just put it up in your home and turn it on. The first thing you should have to do is change the default password. So, some of it’s as simple as that, but there is a very wide range of cybersecurity practice.

I think utilities are getting comfortable with it. One big challenge is this is an area that is not very familiar to regulators. When utilities come in and propose, “Okay. We want this gold-plated consulting company to come in and do our cybersecurity best practices audit.” If you’re a regulator, do you have the background to be able to assess whether or not that’s a prudent investment? So, getting the kind of cybersecurity, utility regulator, public interest, public service, constellation all lined up, I think that’s going to be a big challenge, just getting the information and awareness promulgated.

Jim: Those culture change things are harder than they sound. Right?

Lynne: Oh, yes.

Jim: They’ve been doing a good job. I mean, our electrical grid in the United States and Canada is remarkable on a worldwide basis. They can rightly pat themselves on the backs. “We’ve kept the grid up for 100 years, with a few intermittent failures. But, overall pretty damn good. And, you’re asking us to change. Mm. That could take a while.”

Lynne: Yeah, yeah. Cultural change is definitely challenging, especially when it’s in an area where you don’t have subject matter expertise. With a lot of these digital technologies, and especially the cybersecurity issues, it’s like, “We know that we don’t know that, but that means that we don’t necessarily know how best to find a path forward.”

Jim: Yeah.

Lynne: That is a challenge. Yeah.

Jim: All right. Well, let’s now move on to where the story gets even more interesting. You and some of your collaborators have worked on an idea called TESS, the Transactive Energy Service System platform. As I understand it, from your presentation at SFI and some reading I did yesterday and today, one of the key ideas here is that the demand side also becomes automated, and can communicate over the network and do stuff. Is that one of the core interesting ideas that’s added, now, to the mix?

Lynne: Yes. Exactly. I mean, just speaking as a researcher, my interest in this topic has a couple of origin stories. The first goes back to the 1960s, well before my time. Certainly by the time I was working on this in the early 2000s, there was a very rich economics literature and very rich policy conversation around dynamic pricing and the idea that it would be economically beneficial to have pricing to retail customers that more closely matches the way costs vary over time. The way we discussed it: at 3:00 in the morning, generating electricity is cheap, and at 5:00 in the afternoon it’s expensive. With the old-style peakers, there were more emissions, so it was also more environmentally costly.

So, what if we had dynamic pricing that allowed prices to vary in ways that more accurately reflected actual costs? That dynamic pricing conversation had been going on since the time of Marcel Boiteux in the early 1960s.

By the 1990s, and certainly around the period of regulatory restructuring in the mid-90s, there was a lot of conversation about fixed prices being one of the vectors of inefficiency in the regulated system: we need more dynamic pricing to give better information and better signals to consumers so that we have better demand-side incentives. At the time, we didn’t yet really have digital technology on the radar. Then, by about 2004, 2005, we were all starting to have these cell phones, not smartphones yet, but cell phones. So, you have digital technology in your pocket. So, if you’re on the train on the way to work, you can receive a text message that tells you, “Hey, tomorrow at 8:00 AM, here’s what the price of electricity is going to be.” So, you could go home that night and program your thermostat to take advantage of some of the cost-saving opportunity you have.

So, that’s, I think, one of the origin stories. The other origin story is… At about that same time period, I was working with Vernon Smith, an experimental economist who founded the field of experimental economics in the 1960s and won the Nobel Prize in economics in 2002 for his work. Shortly after that, he and I were working together and trying to work with regulators to have them use experimental economics as a way to test some of their regulatory design proposals, or market design proposals, before you actually release them in the wild. This is when we were all still very much having the hangover from the California electricity crisis, when too many market designs were released in the wild without proper testing. A lot of people suffered the consequences of that.

So, experimental economics is a paradigm. It’s a methodology in economics where you create a laboratory setting to test a specific hypothesis. In this case, it might be a market hypothesis. So, you divide half the room into buyers, half the room into sellers. You tell the buyers what their values are. You tell the sellers what their costs are. Most importantly, you tell them what the rules are in the market. Then, you let them trade and see what happens. Then, you might run another treatment where you use a different set of rules, and you see how that set of rules performs versus the first set of rules. I think if you put these together with digital technology, that’s where you get this idea of TESS, or transactive energy, that you can automate your energy-related devices in the home to respond to price signals.

The kind of received wisdom in electricity is, “Oh, people don’t want dynamic prices. They don’t want to have to sit there and twiddle the thermostat to try to save money.” The beautiful thing about digital technologies is that they reduce those transaction costs. You can automate your preferences into your thermostat, basically have some algorithm that you just plug right in and say, “Okay. Here’s how I want you to behave.” Then, it can turn that into bids. “In this particular time period, my thermostat is willing to pay 8 cents a kilowatt-hour for that electricity.” You submit that as a bid so your thermostat, your EV, your water heater, can participate in a local market on your behalf without your having to stand there and do the work manually.

Jim: Yeah. And in fact, you don’t probably even have to worry about the price too much. You might set some parameters like you’d say, “Make it so 99.99% of the time it doesn’t get any colder than 65 in my house, and you figure out how to do it.” Right?

Lynne: Right.

Jim: As an example of the kind of interface that a human could understand, which could then be converted into bids, behind the scenes, by the right software.

Lynne: Yeah. My collaborators and I haven’t done this, but I think it’s one of the next types of things that we will do. I think of it in a very Boolean way: “Here’s the temperature set point I want, and if the price goes above, say, 9 cents, then change my temperature set point by two degrees.” Right?

So, I think of it in that very Boolean way. It may well be more the way you just described, very amenable to a machine learning approach. So, I suspect we will see more of that kind of machine learning, gathering data about what’s going on in the market. Then, taking your parameters and basically fitting your parameters to get the kind of best joint outcome of whatever you want in terms of savings on your bill, or how much of what you’re consuming is low carbon, whatever your preferences are. I think machine learning will be an approach that does what you just described.
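The Boolean rule Lynne describes can be sketched in a few lines. This is a minimal illustration, not the TESS implementation; the 9-cent threshold and two-degree setback are just the example numbers from the conversation.

```python
def setpoint(base_setpoint_f, price_per_kwh, threshold=0.09, setback_f=2):
    """Boolean price-response rule: if the price exceeds the threshold,
    relax the thermostat set point by a fixed number of degrees."""
    if price_per_kwh > threshold:
        return base_setpoint_f - setback_f  # heating season: accept a cooler house
    return base_setpoint_f

# At 12 cents/kWh the thermostat backs off; at 7 cents it holds.
print(setpoint(68, 0.12))  # 66
print(setpoint(68, 0.07))  # 68
```

A machine-learning version would replace the fixed threshold with parameters fit to market data and the household's stated preferences, as discussed above.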

Jim: Yeah. That makes a lot of sense. Now, you mentioned the need to test. Right? One of the things that we know about complex systems is you can’t really get there by static mathematical analysis, in general, but even more so where the players are agentic. They have personal goals. “I want to save money on my electric bill, but I don’t want my toes to get cold in my living room.” Right? Or, the person on the other side who’s trying to maximize their profit. So, when you have network competition between agentic enterprises, and many of them coming and going through an intermediation platform like a grid and a market attached to a grid, you can’t really know what’s going to happen. Right? How do you-

Lynne: Yeah.

Jim: … test for something like that?

Lynne: I think you’re exactly right, and you said it really well. Usually, the way I say it is much more jargony. Once you have all of these agents with their own individual preferences, and their preferences are subjective and they’re private. They’re personal to them. I don’t know what the trade-off is for you between warm beer and cold showers, and saving money on your bill. I don’t know that. You know that. That’s true for every single one of us. So, when you take all of these many, many, many people with their private, subjective preferences, and their own projects, and you put them all together, and they’re all using this electric system as an input into the variety of things that they want to do, it’s a very complex system. Technically speaking, complex meaning not deterministic. You can’t necessarily begin at point A and deduce your way to a known, specific certain outcome. Right? So, how do you test in that kind of environment?

My collaborator, David Chassin, who’s at SLAC National Laboratory at Stanford, several years ago, when he was at Pacific Northwest National Laboratory, wrote an agent-based modeling platform called GridLAB-D. So that’s what he and his engineers use. They use GridLAB-D as an agent-based modeling platform. So, in agent-based modeling, what you do is define who all the agents are. In this case, it would be like, “Okay. Suppose we have 100 houses.” We can say how big the houses are, how many kilowatts of electricity they usually consume. Suppose we have a couple of office buildings and some other stuff. So, we have this system with these agents in it. Then, we put solar panels on top of half of the houses so they now have a production capability. You basically, then, set some parameters and run the model. So, it’s not closed-form. I come from economics, and for an economist, mathematically, we’re always looking for the closed-form solution. Right? “Here’s my mathematical model. I’m going to solve for p-star and q-star, for equilibrium price and quantity, and I’m going to get this nice closed-form solution.”

Whereas, complex systems are very open-form. Because they’re non-deterministic and non-linear, you can’t do that closed-form calculation of equilibrium. Instead, it’s agent-based simulation: just running thousands and thousands and thousands of iterations. Then, you change the parameters and run thousands and thousands and thousands of iterations. It’s a more inductive, or I guess, if I’m channeling my Charles Peirce late-19th-century philosophy, I would call it abductive, way. It’s not deductive because you can’t necessarily get to that single deterministic, “Here’s the solution we’re going to get.” But, you can basically get an inductive/abductive idea that, “95% of the time the solutions we’re going to get are going to be in this range from A1 to A2.”
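That many-iterations, "the answer lands between A1 and A2" style of analysis can be illustrated with a toy stand-in for an agent-based run. To be clear, this is not GridLAB-D; all the numbers (house demand ranges, solar offsets) are invented for the sketch.

```python
import random

def simulate_peak_load(n_houses=100, seed=None):
    """Toy stand-in for one agent-based run: each house draws a random
    peak demand (kW), and half the houses have solar that offsets part
    of it. Illustrative only."""
    rng = random.Random(seed)
    total = 0.0
    for i in range(n_houses):
        demand = rng.uniform(2.0, 8.0)       # house peak demand, kW
        if i < n_houses // 2:                # half the houses have solar
            demand -= rng.uniform(0.0, 3.0)  # solar offset, kW
        total += max(demand, 0.0)            # a house can't consume negative power
    return total

# Run many iterations and report an empirical 95% range, the
# inductive/abductive style of claim described above.
runs = sorted(simulate_peak_load(seed=s) for s in range(2000))
lo, hi = runs[int(0.025 * len(runs))], runs[int(0.975 * len(runs))]
print(f"95% of runs fall between {lo:.0f} and {hi:.0f} kW")
```

Changing a parameter (say, the fraction of houses with solar) and re-running the whole batch is the "change the parameters and run thousands more iterations" step.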

Jim: Gotcha. Yeah. For people who want to learn more about agent-based modeling, we had a really good episode back on EP90 with Josh Epstein. He went really deep in agent-based modeling as applied to the social sciences. There’s another resource for our listeners out there because this is a fascinating field, one that we use a lot at the Santa Fe Institute. Okay. Now the other thing, of course, you can do is smaller-scale experiments. You were involved, I believe, with the Olympic Peninsula Testbed Project.

Lynne: Yes.

Jim: Why don’t you tell the audience about that?

Lynne: Sure. This was, again, back in the mid-2000s. At the time, I was working with Vernon Smith. We were doing experimental economics, very much focused on testbedding market designs and policy designs before they go out in the wild. We went and did a study session at Pacific Northwest National Lab right at the time when they were starting to develop this idea of transactive energy. They were working with Bonneville Power Administration. Bonneville’s a big, federal, hydroelectric facility in the Northwest. They operate a big transmission network, and then connect up to a distribution network to go out to the local public utility districts out on the Olympic Peninsula, way up in the northwest corner of Washington state.

The public utility district up there, the PUD, was doing their demand forecasting and finding that they were expecting their demand to increase over time, because it’s a very beautiful place. It’s lovely, pristine. People wanted to move there. They were concerned that they would soon exceed their distribution feeder wires’ capacity. Most of the surface area of the Olympic Peninsula is also a national park, so the last thing you want to do is the traditional utility thing, which would’ve been to “put iron in the ground” and build a power plant.

So, they were working with PNNL and Bonneville to think about other approaches to better match supply and demand. They were starting to think, in what we today would think of in terms of demand flexibility, “How can we harness demand flexibility to be a better match given our supply constraints?”

We were out doing an experimental economics session and did this double auction market experiment. The PNNL engineers were like, “Okay. That looks like it would fit really well with what we’re trying to do in this project with Bonneville and the local PUD.” So, I started working with them on this Olympic Peninsula Testbed Demonstration project, and it was a field experiment.

The field experiment had two components, at least the portion of it that I’m going to describe here. So, we had 130 households participate, and every household received a digital, two-way, programmable communicating thermostat. This was in 2005, so the most common digital thermostat at the time, if you went to Home Depot and bought one, would let you program in particular time slots. Like, “I want to go to work at 8:00, so turn the heat down. I come back at 5:00, so turn the heat up.” That kind of thing. This programmable thermostat had much more digital intelligence embedded in it, so we could program in their contract choices.

That was the second component of the experiment: they got to select, from most favorite to least favorite, which contract type they wanted, whether a fixed-price contract, a time-of-use (peak/off-peak) contract, or a real-time-price contract. Then, we had a control group that only had the thermostat. The contract experiment was a layer on top of their existing bills. We allocated each of them to one of the treatment groups in the contract. Then, we had this control group that didn’t have any particular contract layer on top of their existing bill.

I mention all that just to say this is a pretty standard laboratory experimental design. What was interesting: we wanted to take the 110 households that weren’t going to be in the control group and randomize them across the three different contract types. The utility wasn’t so keen on that idea. So, we asked the households to pick, “Which one do you want to be in?” Interestingly, after learning about the options (we prepared educational materials), about two-thirds of them wanted to be in the real-time-price group. This just completely runs counter to the received wisdom in the industry. The received wisdom in the industry is, “People don’t want price variability. They want their bills to be low and stable.” Of course, what’s true now is, if you have a digital technology that you can use to control and manage your bill by responding to prices without your having to be there twiddling the thermostat yourself, you’re like, “Hey, the thermostat can do this for me, so sure, I’ll take the real-time price.”

So anyway, we divvied them up into the three groups and ran them for a year. I was in charge of designing the real-time market. We did it as a double auction, so each of the participating households could submit a bid every five minutes. The bid is essentially the thermostat communicating into the market, “Here’s what I’m willing to pay, here’s what I’m willing to pay, here’s what I’m willing to pay.” Different people would set different thresholds, so different trigger prices. You would get demand responsiveness, demand flexibility, out of having those thermostats participating on behalf of their owners in this local energy market. It worked really well. In the real-time market, they saved about 20% on their bills. They had a slight overall increase in their energy consumption. Overall, the system also did a really good job of managing the wires’ capacity constraint.

So, that was the origins of transactive energy. Ever since then, Dave and I have been working to keep moving along with this because it is a very original approach to demand flexibility. Now, with these distributed energy resources, there’s a lot more going on than just thermostats and water heaters.
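The five-minute double auction described above can be sketched as a simple uniform-price clearing. The midpoint-of-the-last-matched-pair pricing rule here is one common convention, not necessarily the rule used in the Olympic Peninsula project, and the bid and ask numbers are invented.

```python
def clear_double_auction(bids, asks):
    """Uniform-price double auction sketch: bids are buyers' willingness
    to pay (e.g. thermostats' trigger prices), asks are sellers' offers.
    Match highest bids against lowest asks; clear at the midpoint of the
    last matched pair (one common convention)."""
    bids = sorted(bids, reverse=True)
    asks = sorted(asks)
    matched = 0
    while matched < min(len(bids), len(asks)) and bids[matched] >= asks[matched]:
        matched += 1
    if matched == 0:
        return None, 0          # no bid meets any ask: no trades this interval
    price = (bids[matched - 1] + asks[matched - 1]) / 2
    return price, matched

# Five thermostats bidding, three generators offering (cents/kWh):
price, qty = clear_double_auction([9, 8, 7, 6, 5], [4, 6, 10])
print(price, qty)  # 7.0 2
```

Each five-minute interval would re-run a clearing like this with whatever bids the thermostats submitted on their owners' behalf.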

Jim: Indeed, indeed. Now, on the flip side of it, you hear horror stories like the big freeze down in Texas, where people got $30,000 electrical bills, etc. Could it be that there’s a power-law, or fat-tailed, distribution of the effects of these market-based systems, and that if you took a long enough timeframe, maybe people wouldn’t be better off, particularly those people who are averse to a big financial shock?

Lynne: Yeah. Winter Storm Uri in 2021 was a big learning event for everyone. The Texas legislature has essentially taken the type of market that I just described, that we did in the Olympic Peninsula, and made it illegal, which in my opinion is throwing the baby out with the bathwater. I have so many thoughts about this, I’m not quite sure where to start. There were really two retail companies in Texas that were offering residential customers a real-time-price, wholesale pass-through type contract. One was called Griddy and one was called Octopus Energy. Octopus Energy, I think, had a little bit of a better financial hedge situation. They were able to say to their customers, “All right. We’re going to cap your bills. You’re only going to be responsible for a certain portion of this high run-up in the bills during the winter storm.” Whereas, I don’t think Griddy was in that kind of financial position.

The thing that really breaks my heart is that Griddy had already announced to their customers that they were going to be introducing an insurance product. Right? Take the analogy between buying electricity, with that price variability, and buying a plane ticket, where you have some probability that your trip might get canceled and you don’t want to get stuck holding the bill for that plane fare. So, what do you do? You buy your plane ticket, and you buy non-refundable because it’s cheapest. We’ve all learned, since 1978, that that’s how airline customers behave: they buy the cheapest ticket. Then, you buy a travel insurance contract on top of that.

I think that’s one approach that you can take. It’s kind of like you were saying earlier, about sort of guardrails, or setting your parameters so that the temperature never goes above this or below this. You can also do it so that your spending never goes above this or below this. One way you can do that is by having this price insurance that you layer on top of the real-time market. The problem with Griddy was that they were going to start selling that insurance contract to their customers on March 1st, and the storm was February 14th. Then, they went bankrupt.

The challenge in situations like Winter Storm Uri is twofold. Number one, getting the financial instruments and financial markets right so that you have enough opportunities for parties to hedge and to basically lay off risk on people who are willing to bear the risk. Residential customers, for the most part, are not willing to bear a whole lot of risk. At the same time, create an opportunity for them to choose to accept price signals. We know that those price signals enable them to save money, for which they have to take some risk. That also better aligns their behavior and their incentives with the underlying true actual costs in the system. It’s a tricky situation, but just outlawing a wholesale, pass-through, real-time market is not the right approach.
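The insurance layer discussed above amounts to a cap on the pass-through portion of a real-time bill in exchange for a fixed premium. This sketch is purely illustrative; the cap, premium, usage, and scarcity-price figures are invented, not Griddy's actual product terms.

```python
def billed_amount(hourly_usage_kwh, hourly_prices, cap=500.0, premium=20.0):
    """Sketch of a price-insurance layer on a real-time-price contract:
    the customer pays a fixed monthly premium, and the insurer absorbs
    any pass-through charges above the cap. All numbers illustrative."""
    raw = sum(u * p for u, p in zip(hourly_usage_kwh, hourly_prices))
    return premium + min(raw, cap)

# A normal month versus a Uri-style scarcity-pricing month
# (720 hours, 1.5 kWh/hour of usage):
normal = billed_amount([1.5] * 720, [0.08] * 720)   # 8 cents/kWh
spike  = billed_amount([1.5] * 720, [9.00] * 720)   # $9/kWh scarcity pricing
print(round(normal, 2), round(spike, 2))  # 106.4 520.0
```

In the uncapped case, the spike month would have cost $9,720; the insurer pools that tail risk across many customers, which is the tail-trimming role discussed later in the conversation.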

Jim: Yeah. This is what economists would call an institutional design question. Right?

Lynne: Precisely.

Jim: Why don’t you talk about that a little bit? You, as an economist who has spent at least a fair part of the last many years working in this space, what are some of the things that regulators, practitioners, the public, should keep in mind when thinking about institutional design to be able to take advantage of this new opportunity?

Lynne: Yeah. It’s a good question. It’s a multifaceted question. Part of the challenge, I think, is that we’re coming out of our 20th-century experience. Our reaction to technological change is very historically contingent. Right? We’re used to this 20th-century, large-scale, central generation, centralized control of what goes on in the grid. The users around the distribution edge are just passive consumers. They flip the switch and the light goes on, and pay a fixed price, and don’t think about it. Whereas now, the technology landscape is so different, with much more diverse generation technology, increasingly economical storage technologies, all the digital devices that we have to control and automate our response to all sorts of things, including price signals. It’s a very different landscape. Yet, our preconceptions about regulation and regulation’s role as consumer protection, and about the utility as vertically integrated monopoly, are slower to change than the technology is. So, technology is way out ahead of us, and I think the institutions evolve more slowly. Part of that is just humans’ discomfort with risk and kind of defaulting to the status quo.

The thing that is, I think, going to be important and challenging is a kind of willingness to reevaluate those hard questions. With these new technologies and the way they change the economies of scale, and generation, and economies of scope, and the way they change what consumers can do and how consumers can use digital technologies to compare prices and protect themselves in the sense of consumer protection, how should the regulatory footprint change? What should regulation be doing differently and how should it be doing it differently? That’s, I think, one of the fundamental regulatory institutional design questions.

Then on the flip side of that, the utility business model is the vertically integrated firm, where the generation, the wires, and the retailer are all in one firm. That’s an artifact of the early 20th century. In some states, it’s been kind of chipped away on the generation side. In some states we have wholesale power markets because of technological changes, like the combined-cycle gas turbine making generation markets more competitive. In many states, they still have that vertically integrated footprint. Whereas, the underlying economics would suggest that the utility business model should really match what we call the natural monopoly footprint. Right? The places where regulation is still deemed to be an important protection against monopoly behavior.

So, the implication of this is that utilities should shrink to become the best possible wires company that they can, and the rest can all be done through competitive markets. Those markets still have to have some design. Markets require rules. Rules emerge organically, but they also are designed, and they change over time as conditions change. The institutional design is a very dynamic and fluid thing in this industry. You’re coming from this preconception that’s so heavily administered and regulated.

Jim: Yup, yup. And yet, we do know it’s possible because, as you point out, several states have gone to wholesale power generation. I think, generally, it’s worked. Hasn’t it?

Lynne: It has, generally. Nothing is ever perfect. I think this was one of the failings of the California restructuring leading into the California electricity crisis in 2000/2001: this kind of benchmarking against the notion of “perfect competition.” I think we’ve learned a lot in the 23 years since then, having a little more epistemic humility, realizing that we can’t know it all, and we can’t achieve perfection, but that we can do something. My favorite thing to do here is to quote former Texas PUC and FERC chairman Pat Wood, who is very fond of saying that, “Competition on its worst day will do a better job of protecting consumers than I did as a regulator on my best day.” For me, that’s the right benchmark. It’s not perfection, but the, “Compared to what?” Right? So, “Compared-

Jim: Yeah.

Lynne: … to regulation, how well do these markets do?” I have a litany of institutional design critiques I could make of wholesale power markets. In general, though, they have performed better than the counterfactual of having stayed in a vertically integrated structure.

Jim: Yeah, yeah. Just sort of think about this a little bit. One potential role for the regulators is to require that the institutional designs trim off the tail risk for people who aren’t capable of bearing-

Lynne: Yep.

Jim: … tail risk, the big fluctuations on the rare event. You know? Your grandmother gets a bill for $30,000. The regulator should make sure that the system has things like you described, an insurance policy that, in the aggregate, works fine for big insurance companies and Wall Street firms that want to back that pool. They can afford month-to-month, or year-to-year, fluctuation so long as the nickels and dimes add up to more than the hits over time, while grandma doesn’t get exposed to the once-in-100-years chance of a $30,000 electric bill. That doesn’t seem too hard to envision.

Lynne: Yeah. I think that’s exactly right. It’s a lesson that we learned by looking across different industries. If you look at financial markets or some other industries, you can think in terms of this tail risk framing that you just articulated, but that’s a new framing and a new mindset in an electricity regulation context. So, I think that’s exactly the right way to go. I think it’s up to those of us who work in this space to help move us there.

Jim: Yeah. Now, I’ll pound the table on one of my pet peeves. One of the reasons people in traditional analytical frames don’t get it is they tend to think in terms of Gaussian statistical distributions, bell curves. Right? You can say, “Oh, this is a 7 sigma event. It’ll happen once every 10,000 years.” Well, wrong, people. A tremendous number of emergent complex-systems fluctuations have turned out to be power-law distributions. Much fatter tail. Sorry. I still recall almost wanting to throw a knife at the TV, back in 2008, when some CEO of a big financial services company said, “So, how could we be held responsible? This was a 16 sigma event,” i.e. once in the history of the universe. Right? If you looked at it from a reasonable fat-tailed distribution analysis, it came out to about once in 100 years, which is almost exactly what it was. Right?

Lynne: Yup.

Jim: … the next biggest one since the Great Depression. People, locked into the statistics-101 Gaussian analysis, tend to not have a good intuitive feel for tail risk.
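The contrast Jim draws can be made concrete by comparing tail probabilities directly. The Pareto tail exponent of 3 below is chosen purely for illustration; real systems must have their exponents estimated from data.

```python
import math

def gaussian_tail(sigmas):
    """P(X > k sigma) for a standard normal, via the complementary
    error function."""
    return 0.5 * math.erfc(sigmas / math.sqrt(2))

def pareto_tail(x, alpha=3.0, x_min=1.0):
    """P(X > x) for a Pareto (power-law) distribution with tail
    exponent alpha; alpha = 3 is an illustrative choice."""
    return (x_min / x) ** alpha

# A "7 sigma" move: essentially impossible under a bell curve,
# merely uncommon under a fat-tailed power law.
print(f"Gaussian:  {gaussian_tail(7):.2e}")   # ~1.28e-12
print(f"Power law: {pareto_tail(7):.2e}")     # ~2.92e-03
```

That is roughly a billionfold difference in how often the "impossible" event occurs, which is why the bell-curve intuition fails so badly for tail risk.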

Lynne: I will join you in pounding the tail on tail risk, or pounding the table on tail risk.

Jim: Indeed. Well, there’s lots of other interesting things in your work, but I think we’re just about up to our time. I really want to thank you for an extraordinarily interesting session about the grid and where it may be going.

Lynne: Thanks, Jim. I really enjoyed the conversation.