
This summer’s statistics on electricity use and generation included a significant gem: Over the last 12 months, power generation from coal has dropped to a three-decade low.

That’s party-worthy news for the climate, for air quality, for folks who live near power plants and for the natural-gas industry, which is partly responsible for coal’s decline. Just days later, however, the Trump administration crashed the shindig, causing a major buzzkill.

No, the president’s attempts to revive coal have not succeeded. But on Sept. 18, the Interior Department snuffed out new rules aimed at lowering the oil and gas industry’s methane emissions, just days after the Environmental Protection Agency started the process of euthanizing its own methane regulations. This is a bummer not only for the planet, but also for the natural-gas industry’s efforts to portray its product as the clean fossil fuel.

Coal began its climb to dominate the electricity mix in the 1960s, peaking in the mid-2000s, when power plants burned about 1 billion tons per year, generating about half of the nation’s electricity—and an ongoing disaster. Donald Trump likes to talk about “clean, beautiful coal.” It’s anything but. The smokestacks that loom over coal power plants kick out millions of tons of planet-warming carbon dioxide annually, along with mercury, sulfur dioxide, arsenic and particulates, all of which wreak havoc on human health. What’s left over ends up as toxic (sometimes radioactive) piles of ash, clinkers and scrubber sludge.

When natural gas is burned to produce power, however, it emits only about half the carbon dioxide of coal, and virtually none of the other pollutants associated with burning coal. So during the 2008 election season—when climate politics were less polarized than now—both parties pushed natural gas in different ways, with Republicans chanting, “Drill, baby, drill,” and Democrats calling natural gas a “bridge” to greater reliance on renewable energy sources. At the same time, advances in drilling were unlocking vast stores of oil and gas from shale formations, driving down the price of the commodity and making it more desirable to utilities.

As a result, natural gas gobbled up a growing share of the nation’s electricity mix, while coal’s portion withered. In 2008, natural gas generated 21 percent of the electricity in the United States; now, its share is 33 percent. Coal use, meanwhile, plummeted from 48 percent to 29 percent over the same period. In consequence, the electric-power sector’s total carbon dioxide emissions have dropped by 700 million metric tons over the last decade, with an attendant decrease in other harmful pollutants. Every megawatt-hour of coal-fired electricity that is replaced by gas-fired electricity is a net win for the planet—and the humans who live on it.

Except when it’s not. Natural gas has an Achilles’ heel: When it is sucked from the earth and processed and moved around, leaks occur. The main ingredient in natural gas is methane, a greenhouse gas with 86 times the short-term warming potential of carbon dioxide. Every punctured pipeline, leaky valve and sloppy gas-well completion eats away at any climate benefits. And if methane’s leaking, so, too, are other harmful pollutants, including benzene, ethane and hydrogen sulfide. And so the fuel’s green credentials, and one of the industry’s main marketing tools, end up wafting into thin air.
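How little leakage it takes to erase that advantage can be sketched with back-of-envelope arithmetic, assuming only the 86-times short-term warming figure above and the basic chemistry of burning methane (1 kilogram of methane yields about 2.75 kilograms of carbon dioxide):

```python
# Back-of-envelope sketch: 20-year CO2-equivalent per kilogram of gas delivered,
# as a function of how much escapes along the way. Burning 1 kg of methane
# (molar mass 16) yields 44/16 = 2.75 kg of CO2.
GWP_20YR = 86                  # short-term warming potential cited above
CO2_PER_KG_BURNED = 44 / 16    # kg of CO2 per kg of methane combusted

def co2_equivalent(kg_delivered, leak_rate):
    """CO2e from combustion plus leaks, assuming leak_rate is the fraction
    of produced gas lost before it reaches the power plant."""
    kg_leaked = kg_delivered * leak_rate / (1 - leak_rate)
    return kg_delivered * CO2_PER_KG_BURNED + kg_leaked * GWP_20YR

for rate in (0.00, 0.01, 0.03, 0.05):
    print(f"leak rate {rate:.0%}: {co2_equivalent(1, rate):.2f} kg CO2e")
# At roughly a 3 percent leak rate, the leaked methane alone matches the CO2
# from combustion, doubling the fuel's short-term climate footprint.
```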

When the Obama administration proposed rules that would make the oil and gas industry clamp down on methane emissions, it was a gift, not a punishment. Not only would people and the climate benefit; the natural gas industry would be able to sell itself as a clean fuel and a bridge to the future.

The Obama-era rules are similar to those passed in Colorado in 2014, with the industry’s support. Far from being onerous, they simply require companies to regularly look for and repair leaks and to replace faulty equipment. Some companies already do this on their own; the Obama rules would simply mandate this responsible behavior across the board. That’s why the Republican-controlled Congress ultimately decided not to kill the rules. That, however, did not discourage Trump.

Trump is not being “business-friendly” by ending the rules. Rather, he is once again indulging his own obsession with Obama and destroying his predecessor’s legacy, regardless of the cost to human health and the environment. Trump’s own EPA estimates that its rule rollback will result in the emission of an additional 484,000 tons of methane, volatile organic compounds and other hazardous pollutants over the next five years. Meanwhile, the death of the Interior Department’s methane rule will add another half-million tons of pollutants to the air. In the process, it will erode the pillars of the once-vaunted natural gas bridge.

Then again, maybe the time has come to let that bridge burn. We get 70 times more electricity from solar sources now than we did in 2008, and renewables hold 11 percent of the total share of power generation. Perhaps just as significant is a less-noticed fact: Electricity consumption in the U.S. has held steady for the last decade, even dropping during some years, despite a growing population, a burgeoning economy, harder-working air conditioners and more electric devices. That means we’re becoming more efficient and smarter about how we use energy. If we keep this up, we’ll be able to cross that fossil fuel chasm—no matter how many bridges Trump burns down.

Jonathan Thompson is a contributor to Writers on the Range, the opinion service of High Country News. He is the author of River of Lost Souls: The Science, Politics and Greed Behind the Gold King Mine Disaster.

Published in Community Voices

Gov. Jerry Brown made international news when he vowed to fight President Donald Trump’s attempts to cut America’s climate-change research and rescind the nation’s commitment to the Paris Agreement.

Brown’s commitment to fighting climate change seems real, and under his leadership, his state has engaged in numerous greenhouse-gas-reduction plans. But there are caveats to his commitment, including the continued growth in fossil fuel extraction in California, and the state’s near-explosive population growth—both of which drive emissions up, not down.

There’s another issue that California needs to address: methane emissions from hydropower, particularly at Hoover Dam, the source of a significant portion of Los Angeles’ electricity.

About 25 years ago, a small team of scientists in Brazil started measuring the methane produced at hydropower dams and reservoirs. Led by Philip Fearnside, the scientists found surprising results, indicating that hydropower dams and reservoirs in tropical countries like Brazil emit high levels of methane—sometimes as much as a coal-fired power plant. Fearnside referred to these hydropower producers as “methane factories.”

The studies have multiplied over the last two decades, and in 2006, the Intergovernmental Panel on Climate Change included calculations for measuring “Methane Emissions From Flooded Land” in making national greenhouse-gas inventories. Since 2006, study after study has confirmed high levels of methane emissions from dams and reservoirs, and when the Environmental Protection Agency measured methane emissions from a reservoir in the Midwestern United States in 2016, the emissions detected were as high as those measured in the Brazilian hydropower plants.

In September of last year, an international team of scientists synthesized dozens of studies around the globe and found that hydropower’s methane emissions have been dramatically under-measured. This analysis, published in Bioscience and funded by the Army Corps of Engineers, the EPA and U.S. National Science Foundation, made international news with its conclusion that the Intergovernmental Panel on Climate Change needed to revise its calculations and include hydropower’s significant emissions in its climate change scenarios.

Another study, published in September 2016 by a team of Swiss scientists, used previous measurements at dams and reservoirs around the world to create a model that estimated methane emissions from nearly 1,500 hydropower plants and other dams and reservoirs across the planet. The study’s conclusions further rocked the climate-change world: Climate-change emissions from Hoover Dam and Lake Mead on the Colorado River near Las Vegas were found to be about equal to those of coal-fired power plants that produced the same amount of electricity.

Why do dams and reservoirs produce emissions like methane? The answer is that when organic material such as vegetation, sediment, algae and other runoff decomposes underwater at a reservoir, methane is released. This is a natural process called “anaerobic decomposition,” but it is dramatically intensified in dam and reservoir systems that are not natural lakes. Take Hoover Dam and Lake Mead as an example. Lake Mead is enormous—about one-quarter the size of Rhode Island. The reservoir level fluctuates over the year, causing many square miles of its banks to periodically dry up, grow vegetation and then get flooded again each year.

Large amounts of sediment are also washed down the Colorado River every year. This sediment coats the bottom of the lake and also dries up along the miles of caked mud on the lake’s hot banks. Thus, Hoover Dam and Lake Mead work together to create a high-methane-producing hydropower system. Even though measurements and estimates of methane are very recent, as far back as 1948, the U.S. Geological Survey was examining what it then called “gas pits” in the mud flats of Lake Mead.

About 50 percent of Hoover Dam’s electricity is wired to the Los Angeles area. Yet no greenhouse gas emissions calculations—in Los Angeles or statewide in California—include Hoover Dam’s contribution. That’s like having a large coal-fired power plant burning in downtown Los Angeles whose climate change impact is completely ignored.

California has 1,400 dams and reservoirs. Most of them produce far less methane than Hoover Dam, but many of those dams’ emissions are neither estimated nor measured. It’s time for California to acknowledge its methane emissions from hydropower, measure them—and, finally, offset or stop them.

Gary Wockner is a contributor to Writers on the Range, the opinion service of High Country News, where this piece first appeared. He is the director of the Save the Colorado River Campaign and the author of River Warrior: Fighting to Protect the World’s Rivers.

Published in Community Voices

On the 10th floor of Xcel Energy’s downtown Denver office building, energy traders sit before banks of screens filled with flickering, colored digits, as they buy and sell electricity for the utility’s sprawling service areas. In one corner, a trader monitors the Midwest wholesale market, and in another, the Southwest Power Pool—an odd name, given that it actually covers the Great Plains, not the Southwest.

On a recent day, an electronic map showed North Dakota in blue; the price of the state’s wind power was near zero. On the other hand, southern Indiana was burnt orange, with the price of a kilowatt-hour near 8 cents. Five minutes later, Ohio turned pale green as the price dropped to 5 cents.

Meanwhile, on the other side of the room, the trader handling Colorado had no fancy, color-coded price map. When he needed to buy or sell, he had to get on the phone and call around to other utilities to find out what they had, at what prices. Then he had to fix the price, coordinate the dispatch of the electricity, and file the paperwork—all things being done automatically across the room by the Midcontinent Independent System Operator, or MISO, and the Southwest Power Pool, which covers all or parts of seven states.

There, in a nutshell, is the state of affairs when it comes to Western electricity markets. While 60 percent of the nation’s electricity is handled through computerized regional markets, the West is stuck in the 1980s.

Electricity sales in the West are Balkanized among 38 “balancing authorities,” or local markets.

All provisions for necessary plants and power, including backup reserves, must be made by the utilities in each local market, while the companies in the neighboring market do the same. Electrons don’t flow between them.

But in a bigger market, electricity—a perishable commodity that moves at the speed of light—can travel wherever there is demand. There is less need for redundant backup systems, as someone is always making electricity, and someone is always buying.

“If Iowa wants to go to 80 percent (wind), they can, because they belong to the Midwest ISO,” says Steve Berberich, chief executive officer of the California Independent System Operator (CAISO), an in-state wholesale market.

But the day of a Western electricity market, also known as a regional transmission organization (RTO), may be at hand. CAISO and Portland, Ore.-based PacifiCorp, which operates power plants in six Western states, are looking to form a regional market. Berberich says he hopes that market can be extended across the entire West.

On the eastern end of the region, seven utilities, including Xcel, have formed the Mountain West Transmission Group, which extends from Wyoming into New Mexico and Arizona. The group—a precursor to an RTO—is trying to develop a uniform transmission charge, or tariff, for the region. Currently, each utility has its own charge for moving electricity through its wires. Once it has developed a uniform tariff, it may join one of the nearby regional transmission organizations or create its own market.

Regional markets have a lot of moving parts. MISO operates a day-ahead market where wholesale power is sold from utility to utility for the coming day, as well as a real-time market to fill in for unexpected demand or outages. Electricity suppliers submit bids to MISO, which then fills orders for that power starting with the lowest price. The price at which all orders are filled is called the clearing price, calculated by algorithms and computers for the spot, or real-time, market every five minutes.
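Here is a minimal sketch of that bidding logic; the offers and the demand figure are invented, purely to show the mechanics:

```python
# Minimal sketch of how a clearing price is set in a wholesale market like MISO's:
# offers are stacked from cheapest to most expensive, and the price of the last
# offer needed to cover demand becomes the price everyone is paid.
# The numbers are hypothetical, purely for illustration.
bids = [                    # (supplier, megawatts offered, price in $/MWh)
    ("wind_farm",     400,  2),
    ("hydro",         300, 12),
    ("gas_combined",  500, 28),
    ("gas_peaker",    200, 65),
]

def clear_market(bids, demand_mw):
    """Fill demand from the cheapest offers up; return dispatch and clearing price."""
    dispatch, remaining, clearing_price = [], demand_mw, 0
    for supplier, mw, price in sorted(bids, key=lambda b: b[2]):
        if remaining <= 0:
            break
        take = min(mw, remaining)
        dispatch.append((supplier, take))
        clearing_price = price      # last accepted offer sets the price for all
        remaining -= take
    return dispatch, clearing_price

dispatch, price = clear_market(bids, demand_mw=1000)
print(dispatch)          # [('wind_farm', 400), ('hydro', 300), ('gas_combined', 300)]
print(price, "$/MWh")    # 28 $/MWh -- the marginal unit sets the clearing price
```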

In this bidding system, wind and solar, with their steadily declining prices, are becoming more attractive to utilities.

“Any time you can avoid a fuel burn, you’ve got an opportunity for savings,” says Stephen Beuning, Xcel’s director of market operations.

At the moment, however, wind power from Wyoming or solar electricity from California can’t easily move around the West. On one day, CAISO had to dump 485 megawatts of wind and 657 megawatts of solar, because there was no way to sell it to utilities outside its grid.

“We can’t get to the goal of 50 to 60 percent renewable energy by 2050 without an RTO,” says Zichella.

In theory, a West-wide RTO would have allowed California to sell that excess wind and solar to, say, Utah or Colorado, thus avoiding the need to burn natural gas there. Similarly, Colorado utilities could ship excess wind power to California to back up solar during times of peak demand.

Setting up an RTO isn’t easy, though.

“The software is a huge expense—and California has created it and is willing to share with the West,” says Nancy Kelly, a senior energy policy adviser with Western Resource Advocates, an environmental group.

California’s offer to share, however, is being met warily around the West by those who are concerned that while a Golden State-dominated system might be good for California, it may be less so for others.

CAISO is controlled by the California governor and Legislature. “That is going to have to change to be acceptable to the PacifiCorp states,” says Bryce Freeman, administrator of the Wyoming Office of Consumer Advocate. “Unless that is resolved, it’s a fool’s errand.” PacifiCorp operates in Oregon, Washington, California, Utah, Wyoming and Idaho.

In Utah, lawmakers are drafting a bill to give them veto power over joining the CAISO market. “We aren’t opposed,” says Jeffrey Barrett, deputy director of the Utah Governor’s Office of Energy Development. “We just want to make sure it is a good deal for Utah.”

The state has among the lowest electricity rates in the West—a competitive advantage it doesn’t want to lose, Barrett says.

Though it concedes that a regional grid could help renewables, the Sierra Club is opposed to the current CAISO expansion plan, because it would bring 24 coal-fired PacifiCorp units into the regional system.

“In bumping up the productivity of these coal plants, it will throw a lifeline to some, allowing them to operate for another 16 years,” said Travis Ritchie, an attorney with Sierra Club’s Beyond Coal campaign.

Still, economic forces and renewable-energy policies look to be pushing the West toward a regional market. A CAISO study released in July found the proposed RTO would lead to up to $1.5 billion in savings annually in California by 2030—equal to a 3 percent cut in electricity rates.

It would also lead to a reduction in toxic and greenhouse gas emissions across the West, according to the study, although there would be a slight bump up in the early years from the PacifiCorp coal-fired plants.

The analysis, however, didn’t look at benefits outside California. “A big question is: Will costs and benefits be equal across the system?” says Elta Kolo, an analyst with GTM Research, an energy consulting firm. “It will be crucial to get consumers on board.”

The West presents some unique challenges. The New England ISO covers six states, but is an area one-thirteenth the size of the West, a region with a mix of sparsely populated states and heavily urban ones, states with ambitious renewable energy standards, and those heavily tied to coal.

“They are different, but still similar in that they need electrons, they need reserve capacity, and they need to make money,” says Amanda Ormond, managing director of the Western Grid Group, which advocates for a more efficient grid to promote renewable energy.

“A Western market is almost certainly inevitable,” Ormond says. “Most of the utilities in this country and the rest of the world operate in organized markets, because it is more efficient. It is going to happen.”

Published in Environment

Maybe you’re sitting on the couch right now, reading this as you light up a joint. Maybe you’re in one of the states where what you’re doing is no longer a crime, so you’re feeling pretty good, because your leisure activity will no longer lure the police into your home.

Sorry to harsh your buzz, but that marijuana, legal or not, probably sucked up a lot of electricity during its cultivation. One study estimates that it takes as much energy to produce 18 pints of beer as it does just one joint (and that doesn’t factor in the energy used to make the three Sara Lee cheesecakes thawing in the fridge for when the munchies kick in). That “green” you’re smoking isn’t all that green after all.

With medicinal and/or recreational marijuana legal in most of the West, utilities and grid operators are a bit worried about the impacts these energy-hogs will bring to their grids, and excited about the profits they’ll bring to their bottom line. The issue is pressing enough that it got its own session—“The Straight Dope on Energy and the Marijuana Industry”—at the Nov. 11 annual meeting of the National Association of Regulatory Utility Commissioners in Austin, Texas.

Attendees learned that Xcel Energy, which serves most of urban Colorado, sells some 300 gigawatt hours of electricity to pot-growers per year, or enough to power some 35,000 homes. The U.S. marijuana-growing industry could soon buy as much as $11 billion per year in electricity.

These statistics are alarming, and will only get more so as legalization spreads. But legalization, if approached correctly, also opens doors of opportunity. The biggest guzzlers of electricity also hold the most potential for realizing gains via efficiency.

Back in 2011, a California energy and environmental systems analyst, Evan Mills, published a paper quantifying the carbon footprint of indoor cannabis production. That footprint, he discovered, was huge. His findings included:

• While the U.S. pharmaceutical sector uses $1 billion per year in energy, indoor cannabis cultivation uses $6 billion.

• Indoor cannabis production consumes 3 percent of California’s total electricity, 9 percent of its household electricity, and 1 percent of total U.S. electricity (equivalent to 2 million U.S. homes per year).

• U.S. cannabis production results in 15 million tons of greenhouse gas emissions per year, or the same as emitted by 3 million cars.

• Cannabis production uses eight times as much energy per square foot as other commercial buildings, and 18 times more than an average home.

Mills’ paper generated a flurry of media coverage, much of it sensationalist, which was then used to point out that pot-smokers are hypocritical (based on the incorrect assumption that pot-smokers are necessarily environmentalists). Some pundits even used the findings—along with some dubious math—to justify other carbon-intensive activities, such as mining Alberta’s tar sands.

But Mills wasn’t picking on pot, per se. He was focused only on indoor cultivation. And he made sure to point out that a lot of marijuana’s energy use is actually energy waste. Many growers, for example, use diesel generators to power their operations to avoid suspicious electric bills. They grow in places where there are no windows, without the benefit of sunlight, relying entirely on artificial lighting (which is extremely bright and energy intensive). When greenhouses are used, they tend to be of bad and inefficient design.

Most of this wastefulness occurs not because dope farmers are gluttonous slobs, but because they need to stay hidden in order to stay out of jail. So by simply legalizing and legitimizing the trade, some states have taken the first step in taking a bite out of cannabis’ energy footprint. For one thing, legalization allows farmers to move their crops outdoors, where it takes no more energy to grow a pot plant than it does a carrot or tomato (though yields and, some say, potency are far lower than growing pot indoors). It allows them to ditch the dirty, wasteful generators and hook up to the grid. (And to actually pay for the power they use: Electricity theft by pot producers is said to total as much as $100 million per year.) It also allows utilities and farmers to work together on maximizing efficiency.

The Northwest Power and Conservation Council, for example, has found that huge savings could be realized if farmers switched to efficient greenhouses and to LED lighting, and their yields would increase. In his paper, Mills suggests implementing energy-efficiency incentives as well as energy-conscious construction codes for grow operations. There’s also tremendous potential for using all the pot-growers on a single grid together as a demand-response resource.

Demand response works like this: Demand on the grid spikes, perhaps because everyone turns on their energy-sucking flat screen televisions to watch the football game all at once. The utility needs to meet that demand by putting more power into the grid. The conventional way of doing this is to fire up a power plant, usually natural gas-fired, which is expensive and polluting.

But in demand response, the new power needs are met by curtailing the power use of a bunch of customers by, say, telling their hot water heaters to shut down for an hour or so, via smart meters. This has the same effect as injecting more power into the grid to meet the increased demand. In other words, the consumers, collectively, become a sort of backup power plant.

A more rudimentary form of demand response is for the utilities to coordinate with the growers to shut down the power-sucking devices during peak load hours, such as when everyone else is cranking their air-conditioners, and turn them on during off-peak hours, like in the middle of the night. If the growers are on time-of-use electricity rates, that would be the most cost-effective way to go, anyway.
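To make the time-of-use arithmetic concrete, here is a minimal sketch with invented numbers; the rates and the grow-operation load are assumptions, not figures from the article or any utility’s tariff:

```python
# Simple sketch of the time-of-use logic described above, with made-up numbers:
# a grow operation running 1,000 kW of lighting 12 hours a day compares the cost
# of running on-peak versus shifting the same hours overnight.
LIGHT_LOAD_KW = 1000
HOURS_PER_DAY = 12
ON_PEAK_RATE  = 0.22   # hypothetical $/kWh, hot summer afternoons
OFF_PEAK_RATE = 0.07   # hypothetical $/kWh, middle of the night

def monthly_cost(rate_per_kwh, days=30):
    return LIGHT_LOAD_KW * HOURS_PER_DAY * days * rate_per_kwh

print(f"on-peak:  ${monthly_cost(ON_PEAK_RATE):,.0f} per month")   # $79,200
print(f"off-peak: ${monthly_cost(OFF_PEAK_RATE):,.0f} per month")  # $25,200
# Same kilowatt-hours, same plants under the lamps -- the only change is when
# the load shows up on the grid, which is exactly what demand response exploits.
```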

Unfortunately, many utilities are slow to seize the opportunities legalization presents. Xcel Energy’s representative at the Austin meeting said that the company has been wary of working with growers on efficiency, because it might look like they’re promoting drug production. And the Bonneville Power Administration, the massive federal utility in the Pacific Northwest, doesn’t allow any of its efficiency incentives to go to cannabis cultivators, because growing marijuana is still against federal law.

Even the best efficiency measures, however, won’t make ganja growing entirely green, as is clear from this anecdote from the Austin meeting, as reported by SNL Financial:

Driving that point home, John Morris, policy and regulatory affairs director for energy efficiency consultant CLEAResult, reported that one pot grower in the West is converting a 90,000-square-foot warehouse to produce the plant, and that despite installing energy efficient lighting and other devices, including a $2 million solar panel on the roof, that grower still expects to pay around $1 million a month for electricity.

Yes, you read that right: $1 million a month. For electricity.

Jonathan Thompson is a senior editor of High Country News, where this story originally appeared.

Published in Environment

The water-energy nexus spans the world of electricity generation and water movement, particularly in Western states. It takes water to produce steam for coal, natural gas and nuclear power plants, and they usually need water to cool them down. Huge amounts of electricity are needed to pump water across the desert; the Southern Nevada Water Authority is Nevada’s biggest user of electricity, and the Central Arizona Project relies heavily on the Navajo Generating Station to keep water moving through the canals.

Surely the most obvious link between water and energy, and between climate and electricity generation, though, is found at the West’s numerous hydroelectric generation stations, and here in California—deep in a nasty drought—we’re feeling that link in a painful way.

The relationship is pretty simple: More water in a reservoir or river equals more potential for generating electricity by releasing that water to turn turbines. All of California’s reservoirs are far below the average levels for this time of year. New Melones reservoir is sitting at just 17 percent of capacity; Shasta reservoir, one of the state’s biggest hydroelectric power plants, is only 49 percent full. Meanwhile, Lake Mead and Lake Powell, backed up behind the Southwest’s two biggest hydropower plants, are at critically low levels. This is impacting electricity generation, without a doubt, and doing so during the hottest months of summer, when the grid is already stressed.

We wanted to get a more concrete sense of exactly how the drought is affecting hydropower generation in California and beyond, so we dug into the data and crunched some of the numbers. The map below gives a rundown of the biggest hydropower generators in the Southwest, including all of California’s plants with a generating capacity of 200 megawatts or more, along with Glen Canyon Dam, Hoover Dam and Davis Dam on the Colorado River. Click on the icons, and you’ll see a graph of power generation from 2009 through 2014.

Here are the highlights of the number-crunch.

In California, drought has taken a direct toll on hydropower generation, as is quite evident in the map below, and the graph above. In 2014, California’s collective hydropower plants kicked out about 17 million megawatt hours of juice. Not bad, except that it was only about one-third of what they produced in 2011, an unusually wet year, and about half of an average year in pre-drought times.

The average home uses about 11,000 kilowatt hours of power—or 11 megawatt hours—each year. In other words, something like 2.8 million fewer homes were powered by hydroelectricity in 2014 as compared to 2011 in California. And things have only gotten worse since.
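Spelled out, the arithmetic behind that comparison looks like this (figures rounded as in the text, with the 2011 total implied by “about one-third”):

```python
# The arithmetic behind the homes comparison, using rounded figures from the text
# (the 2011 total is back-calculated from "about one-third"):
mwh_per_home = 11            # about 11,000 kilowatt hours per home per year
hydro_2014   = 17_000_000    # MWh from California hydro in 2014
hydro_2011   = 48_000_000    # MWh, roughly three times the 2014 output

homes_equivalent = (hydro_2011 - hydro_2014) / mwh_per_home
print(f"about {homes_equivalent / 1e6:.1f} million homes")   # about 2.8 million
```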

The good news is that burgeoning wind and solar in the state have stepped in and taken up some of the slack. The bad news is that more natural gas power has also been needed to fill the deficit.

Drought diminishes hydropower production from Colorado River dams as well, though not as directly as in California. Hoover Dam is a huge hydroelectric power plant, with a nameplate capacity as large as some of the biggest coal-fired power plants, and its “fuel” supply is running out as Lake Mead reaches historically low levels. So it’s no wonder that folks are concerned about its status as a power producer. So far, though, drought hasn’t had as severe an impact on hydropower production here or at Glen Canyon and Davis dams, above and below Hoover, respectively. That’s because the operators of these dams can’t withhold a bunch of turbine-turning water just because their reservoirs are running on empty. In fact, Hoover Dam’s turbines produced 33,000 more megawatt hours of power during the first quarter of this year, even as the reservoir approached record low levels, than during the first quarter of 2014. Yet dry times still do impact hydropower production, because the lower the reservoir, the less force the water has to turn the turbines. Lake Mead’s “dead pool”—the level at which it could no longer turn the turbines—was once 1,050 feet, about 25 feet below the reservoir’s current level (and a level that is not in the cards for at least a few more years). But new wide-head turbines have been installed over the last decade, lowering dead pool to about 950 feet and extending Hoover’s hydropower life for a while.
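The relationship behind that last point can be sketched with the standard hydropower formula, in which output scales with “head,” the height of the water surface above the turbines; the flow, head and efficiency figures below are illustrative, not Hoover’s actual numbers:

```python
# A minimal sketch of why a lower reservoir means less power, even with the same
# flow through the turbines. Figures here are illustrative only.
RHO, G, EFFICIENCY = 1000, 9.81, 0.90   # water density (kg/m^3), gravity, turbine efficiency

def hydro_megawatts(flow_m3_per_s, head_m):
    """P = efficiency * rho * g * Q * H, converted from watts to megawatts."""
    return EFFICIENCY * RHO * G * flow_m3_per_s * head_m / 1e6

flow = 900                                   # m^3/s, a hypothetical release rate
print(hydro_megawatts(flow, head_m=160))     # ~1271 MW with a full reservoir
print(hydro_megawatts(flow, head_m=130))     # ~1033 MW after the lake drops 30 m
# Same water released, roughly 20 percent less electricity.
```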

Hydroelectricity is an especially valuable form of power. Hydro is a good source of baseload power, meaning it can put a steady stream of juice into the grid around the clock. But it’s also valuable in that grid operators can crank it up or down relatively quickly, meaning it can be used to balance out variable power sources like wind and solar, or it can step in to account for a sudden surge of power demand, due to everyone turning on their air conditioners in the hottest time of the day, coupled with a drop in solar generation as the sun dips toward the horizon. It’s in this capacity that Hoover and Glen Canyon are huge assets to the southwestern grid.

If drought-induced hydropower loss is going to happen anywhere, though, California’s a good place for it. If any other state lost tens of millions of megawatt hours of power from one source, it would likely fill the gap with coal-fired power. But California is fast on its way to completely phasing out coal, and it’s also operating under some of the nation’s most robust renewable portfolio standards, so it can only get so dirty when looking for a stand-in for hydro.

Jonathan Thompson is a senior editor of High Country News, where this story first appeared.

Published in Environment

When the Hoover Dam was built in 1936, it was the largest concrete structure—and the largest hydropower plant—in the world, a massive plug in the Colorado River, as high as a 60-story building.

For nearly 80 years, the dam has been producing dependable, cheap electricity for millions of people in the Southwest, but as water levels in Lake Mead continue to drop, the future of “the greatest dam in the world” is more precarious than it ever has been, and utilities across the desert—including local power provider Southern California Edison—are taking notice.

Lake Mead, the 112-mile reservoir created by the dam, was recently projected to hit 1,074.73 feet above sea level, the lowest it has been since it was filled in 1937. Thanks to a 16-year drought and serious over-allocation, Lake Mead is now just 37 percent full. Although a “miracle May” of rain means the water level will rise again, the longer term prognosis is more worrisome: If water levels continue their downward trend, the amount of energy generated by the Hoover Dam will fall, leading to higher electricity costs for 29 million people in the desert Southwest.

That's because a shallower reservoir means less water pressure against the turbines, generating less electricity. A recent report by graduate students at the University of California, Santa Barbara, in conjunction with the Western Water Policy Program, examines the economic and physical impacts as Lake Mead’s elevation falls: With each 25-foot drop, total energy costs increase by roughly 100 percent, compared to a full reservoir. The costs paid by contractors for hydropower double at 1,075 feet, triple at 1,050 feet, and quadruple at 1,025 feet. At 895 feet, the turbines won’t run—a level they call “dead-pool.”

Dead pool is not imminent, and in the short term, less generation at Hoover won’t translate into soaring electrical bills, says Frank Wolak, an economics professor at Stanford. That’s because utilities buy “futures” contracts for energy, which guarantee a certain price for a period of time. It’s like buying a plane ticket in advance: The price is significantly less than one bought on the same day as a flight. In the case of Hoover, many of those contracts span up to 10 years and were negotiated before low water levels became a significant concern.

Still, Hoover’s power capacity has dropped nearly 25 percent since 2000, and the 53 hydropower facilities run by the U.S. Bureau of Reclamation across the West are producing 10 percent less power than a few years ago, despite rising demand. So when those futures contracts run out—and continued low water levels appear likely—bottom-barrel prices for hydropower will likely be a thing of the past.

That means that utilities currently relying on Hoover’s power, such as the Overton Power District No. 5, which serves 15,000 people in Nevada on the southern end of Lake Mead, are wary. Overton buys 20 percent of its power from the Hoover Dam, 5 percent from other hydro projects, and 75 percent on the spot market (where energy is traded on a day-by-day basis). The utility anticipates having to replace 5 percent of its hydropower with another, more-expensive energy source, says Mendis Cooper, Overton’s general manager. That switch won’t translate into sky-high energy bills, likely just a 1 to 2 percent increase. But if Lake Mead continues to fall, and shortages become routine, his customers could see more dramatic increases in their electricity bills.

“We’ve been having those discussions,” Cooper says, noting that the major topic is moving to more renewables, like solar, as well as improving efficiency.

Luckily, the West has ambitious renewable goals, says Wolak, which will likely make up more of the region’s energy mix and help mitigate the loss of hydropower in the future.

Still, renewables aren’t a panacea. Wind and solar are far more volatile and require backup power sources, such as gas-fired power plants. And though the prices for renewables have come down in recent years, they’re still no match for cheap, federally subsidized hydropower.

“They solve the resource issue,” Cooper says, “but not the price issue.”

Sarah Tory is an editorial fellow at High Country News, where this story first appeared.

Published in Environment

The spring of 2011 was wetter than usual in the Pacific Northwest. A huge snow year was followed by rain, and during the peak, runoff water was ripping through the hydroelectric turbines on Bonneville Power Administration’s dams.

Spring is also the windy season, and hundreds of new turbines in the region were pumping juice into the electrical grid. Even when substantial electricity exports to California were taken into account, the combined wind and hydropower plants were generating more carbon-free electricity than the region’s residents and businesses could consume.

But too much of a good thing is, well, too much. In order to keep the grid from being overloaded, the BPA forced the wind farms to shut down, bashing their bottom line. Controversy and lawsuits ensued: Both wind-farmers and salmon advocates would have preferred it if the BPA had spilled the water over the dams, rather than run it through the turbines.

Regardless of who’s right in this case, the whole brouhaha could have been avoided, and the dams and turbines could have continued to pump out power, had one piece of technology been introduced: a giant battery. Charge it up during times of oversupply, and draw from it during times of need.

If only it were that easy. Thanks to high costs and immature technology, large-scale energy storage remains rare on the North American grid.

In October, however, California’s utility regulators shook the battery world up: They required the state’s biggest utilities—Pacific Gas and Electric, Southern California Edison, and San Diego Gas and Electric—collectively to install 1,325 megawatts of energy storage by the end of 2024 (“where megawatt represents the peak power capacity of the storage resource in terms of the maximum discharge rate,” according to the California Public Utility Commission draft decision documents).

While that’s only about the same capacity as one large coal power plant, it will be a huge leap. Today’s biggest storage projects are only around 140 megawatts, and the largest battery project in California is a mere four megawatts.
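One caveat in comparing storage to power plants: a megawatt rating describes how fast a project can discharge, not how long it can keep going. Here is a rough sketch, assuming a four-hour discharge duration (that duration is an assumption for illustration, not something specified in the mandate above):

```python
# A megawatt rating says how fast storage can discharge; megawatt-hours say how
# long it can keep that up. The four-hour duration below is assumed for
# illustration, not drawn from the order described above.
mandate_mw     = 1325
duration_hours = 4
energy_mwh     = mandate_mw * duration_hours          # 5,300 MWh

avg_home_kw  = 1.25     # ~11,000 kWh per year spread over 8,760 hours
homes_served = mandate_mw * 1000 / avg_home_kw
print(f"{energy_mwh:,} MWh of storage; enough to cover ~{homes_served:,.0f} average")
print(f"homes for about {duration_hours} hours at full output")
```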

Electricity is a crazy product. Just about any other good can be manufactured, stored and then distributed to the consumer according to demand. But electricity is cranked out of turbines or solar panels and sent to consumers at the speed of light. At any given moment, production must be equal to demand, lest the whole grid go haywire and outages ensue—thus the Northwest wind shutdown of 2011. Solar panels only generate significant amounts of power for a limited time each day, and their peak output tends to occur a few hours earlier than peak demand. A big thunderhead blocking the sun’s rays from a solar array can cut the array’s output by as much as 80 percent in seconds. Wind, meanwhile, tends to blow mostly at night, when demand is low, and second-to-second fluctuations can cause a wind farm’s output to vary dramatically.

That means that for every added megawatt of wind or solar in a particular section of the grid, the operators of that grid have to have on hand a controllable, more-predictable power source—usually either hydroelectric or fossil fuel—to smooth the bobbles and back things up in case of dramatic drops in generation. But large-scale storage could also play the backup and smoothing role, thereby displacing fossil fuels in the grid. Indeed, without it, the hope of a fossil fuel-free grid is no more than a pipe dream.

At a recent gathering of the Rocky Mountain Association of Energy Engineers in Denver, the importance of storage was clear. Several people mentioned the significance of California’s new requirement. And the first ever Randy Udall scholarship, announced at the conference, went to University of Colorado at Boulder doctoral student Michelle Lim for her work on energy storage. Robert Welch, a prominent energy consultant, pointed out that energy storage isn’t a totally new idea: When a utility stockpiles coal, it’s storing energy.

Unfortunately, it’s not quite that easy with actual electricity. You can’t just pile it up in a warehouse. Having said that, there is no shortage of storage technologies—though many are untested or costly—from which the California utilities can choose to fulfill the mandate. Just a few examples:

Batteries/chemical storage: Near San Jose, Calif., the brand new Yerba Buena sodium sulfur battery system has a capacity of 4 megawatts, and will be able to keep a more constant supply of power flowing into Silicon Valley. But batteries are expensive, and take a while to charge, plus the charge doesn’t last a long time. They are the Achilles’ heel of electric vehicles, too.

Crowdsourcing storage: One electric car battery isn’t going to do much aside from powering the car for 100 miles, but 100,000 of them, all hooked into the grid at the same time, could provide large-scale backup. It’s a serious proposal, and it makes sense; a rough back-of-envelope sketch follows this list. The only problem is timing: How do you ensure that an adequate number of folks will have their cars plugged in when you need to draw from their batteries for backup?

Pumped hydro storage: A reservoir with hydropower is essentially a giant battery, though we can’t control the charging process. Put a reservoir on a hill; pump water up to it during times of high power production; and release the water to turn a turbine during high demand—and you’re getting somewhere. It’s not a bad idea, and there are several scattered across the U.S., but they require specific topographic conditions.

Thermal storage: In Gila Bend, Ariz., the Solana solar project reflects sunlight onto tubes, creating heat and steam, which turns turbines that generate electricity. In the process, the sun also heats up salt, which retains the heat, which is later released to generate steam and power when the sun’s not shining.

Compressed air: Not unlike pumped hydro-storage, this system uses excess power to pump air into underground chambers, where it is pressurized. It can later be released to turn turbines and generate electricity.

And there are many more, including flywheels; superconducting magnetic energy; and filling and emptying giant, hollow spheres that hang from offshore wind facilities in order to turn a turbine.
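As for the crowdsourced-battery idea above, here is the promised back-of-envelope sketch; the per-car charger output and the share of each battery offered to the grid are assumptions, not figures from any study:

```python
# Back-of-envelope sketch of the crowdsourced-battery idea, with assumed figures:
cars_plugged_in    = 100_000
kw_per_car         = 7        # assumed discharge rate through a home charger
usable_kwh_per_car = 10       # assumed slice of each battery offered to the grid

backup_mw  = cars_plugged_in * kw_per_car / 1000
backup_mwh = cars_plugged_in * usable_kwh_per_car / 1000
print(f"{backup_mw:,.0f} MW of discharge capacity, {backup_mwh:,.0f} MWh of energy")
# Roughly 700 MW for a bit over an hour, the output of a mid-sized power plant,
# but only if enough drivers happen to be plugged in when the grid needs it.
```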

None of these technologies are perfect, and some remain infeasible at a large scale. But a decade ago, the same could be said for large-scale wind and solar. As states started requiring utilities to add set percentages of renewable energy to their portfolios, innovation accelerated, and costs came down.

California’s new requirement has the power to do the same with energy storage. If other states follow, as is often the case, it will provide the push needed to bring those big batteries online—and finally loosen fossil fuels’ grip on our electrical grid.

Jonathan Thompson is a senior editor at High Country News, the site from which this was cross-posted. The author is solely responsible for the content.

Published in Environment

Minutes before 4 p.m. on a sizzling September day two years ago, right at the time when they were most needed, San Diego’s air conditioners suddenly died.

Thousands of television and computer screens also flickered into darkness. Stoplights stopped working; gas stations ceased pumping; and traffic slowed to a snarl. Trains ground to a halt, and planes idled on the runway. Wastewater treatment pumps shut down, spewing some 4 million gallons of raw sewage into the Pacific. Around 2.7 million “customers”—amounting to anywhere from 5 to 7 million people—lost their power, with some remaining in darkness for 12 hours or more.

As commuters extricated themselves from highway gridlock, and batteries faded away on millions of electronic devices, folks flocked to the handful of neighborhood bars that—thanks to generators—were able to keep their lights and refrigeration going. There, they could drink away the darkness and speculate as to what had caused this sudden plague of electrical impotence.

Many assumed it was terrorism—the San Onofre Nuclear Generating Station had been sabotaged, they said, or the North Koreans had set off an electromagnetic pulse that fried the grid, or maybe an Iranian cyber-attack had crippled the computers that keep the modern world humming. Others blamed solar flares for disrupting the cosmic electromagnetic field, or suggested that a more earthly storm had caused distant wind farms to go haywire.

Then again, perhaps a raven just landed on the wrong piece of equipment out in the desert and got fried, its death rattle reverberating through the transmission lines all the way to San Diego.

Their guesses weren’t stupid or outlandish; they all involved genuine threats to the power grid. But the biggest power outage to hit the Western Grid in a decade actually started hundreds of miles east, at a substation outside of Yuma, Ariz. And it began not with a bang, but with a misplaced checkmark that ultimately crashed Southern California’s electrical system.

That event was no freak occurrence. It could have happened anywhere, at any time, in the complex machine that generates, transports and delivers power to nearly every corner of the nation. Our proliferating air conditioners and gadgets have put new pressures on the grid, as the demand for electricity has grown twice as fast as the infrastructure needed to carry that power. Now we’re straining it even more by trying to get that power from cleaner, more-fickle sources.

Meanwhile, the electrical grid—the circulation system of the modern age—is still stuck back in the ’80s, like one of those guys from high school who still clings to his mullet haircut and his Dungeons and Dragons dice.

Compared with what’s to come, some say, San Diego’s blackout might seem in retrospect no more than an excuse for a candlelit block party.

 

A 6-foot-high, gleaming white cross stands alongside a mostly empty road near Tonopah, Ariz. At its foot is a tiny Nativity scene, and an angel on the end of a stick. The shrine’s purpose isn’t clear, but its location makes it seem like an altar dedicated to the patron saint of electricity.

Behind it, rising from the desert scrub, are the elongated containment domes of Palo Verde Nuclear Generating Station’s three reactors. And beyond them lie two huge fields of photovoltaic panels and three natural-gas power plants. There is probably no other five-square-mile patch on the planet with more electrical-generating capacity.

Yet just as critical is the network of transformers, switches and wires through which all that electricity flows. Congregated just north of Palo Verde, a dozen high voltage transmission lines slice the sky, crackling ominously as they link up in the giant, skeletal “switchyard.” The lines, gently curving toward earth between each giant tower, lead inward and outward in every direction, like spokes on a bicycle wheel, making the Palo Verde/Hassayampa switchyard the biggest power-trading hub and crossroads in the region—the electrical Union Station of the West.

Among these lines is the 500-kilovolt Hassayampa-North Gila, or H-NG, line. On Sept. 8, 2011, about 1,300 megawatts—almost half of all the electricity imported into Southern California that day, enough to power about 1.5 million homes—flowed westward through the H-NG line. All that juice was needed because it was so hot; the mercury hit 115 degrees in El Centro, and even sea breeze-cooled Oceanside reached 88. By mid-afternoon, air conditioners cranked across the Southwest, while irrigation pumps pushed water onto the half-wilting lettuce in California’s Imperial Valley.

On its way to San Diego, the H-NG line passes through the North Gila substation outside of Yuma. Just before 2 p.m., Arizona Public Service, the substation’s operator, sent a technician out to fix a capacitor bank, used to stabilize voltage in long-distance lines. As he worked—we’ll call him Kilo Watt, since the utility has kept his identity secret—he marked each completed step on a checklist to make sure that he didn’t miss anything. But then Watt, distracted, put a mark in the wrong place, causing him to skip one critical step: bypassing all that juice around the capacitor bank so that he could work on it safely.

At approximately 3:27 p.m., he cranked open the disconnect switch, an event that should have been uneventful. Instead, the 500 kilovolts still running through the line began to arc—the current leaping through the air, much the way the evil Emperor zapped Luke Skywalker in Return of the Jedi. Watt continued cranking the switch, hoping to manually break the arc. Instead, the writhing electrical serpent grew larger, and some 43 milliseconds later, the entire H-NG line “tripped,” like a home circuit breaker, and shut down.

The chain reaction, which would climax just 11 minutes later, had begun.

But the roots of the San Diego blackout are deeper than that Yuma substation. In order to really understand what happened, we need to travel back in time to the grid’s primordial days among the hard-rock mines of the Rocky Mountains.

 

During the spring of 1891, in a canyon in the mountains southwest of Telluride, Colo., icy water from the South Fork of the San Miguel River rushed through a funnel-like tube, crashing into and turning a Pelton waterwheel attached to a nearby 100-horsepower generator in the Ames hydropower plant. As the turbine spun, it generated 3,000 volts of alternating electrical current, which was then shipped by copper wire three miles to a huge motor in the Gold King Mill, perched on the side of a treeless slope far above.

Even as the motor roared to life, a battle raged over the future of what would become the electrical grid. On one side was direct current, or DC—the kind generated by batteries, lightning and static electricity—which Thomas Edison had used to light up a Manhattan neighborhood in 1882. On the other side of this so-called War of the Currents was AC, alternating current, embodied by Nikola Tesla, the eccentric Serbian-American genius who had worked for and later been spurned by Edison.

Edison, in a morbid fit of desperation, played the danger card. He used alternating current to publicly electrocute house pets, sheep, horses and, finally, a retired circus elephant named Topsy, which, to be fair, had already been sentenced to death for killing three of its trainers. Topsy’s demise was immortalized on film, and today, you can find a YouTube video of the smoking elephant in all its grainy, demented glory.

Yet even fear didn’t help Edison’s cause. After traveling at useful voltages for about a mile, DC petered out. AC, meanwhile, could be “stepped up” to high voltages in order to push it across long distances, then “stepped down” with transformers for use in home or industry. The Ames power plant, one of the first commercial industrial applications of AC, dealt a severe blow to DC, and was a seed of what today is a mostly AC grid.
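Why stepping up the voltage helps comes down to resistive losses in the wire, which scale with the square of the current; here is a minimal sketch with illustrative numbers (the resistance and power figures are invented, not Ames plant data):

```python
# Why "stepping up" the voltage mattered: for a given amount of power, line losses
# scale as the square of the current (P_loss = I^2 * R), and current falls as
# voltage rises (I = P / V). Resistance and power figures are illustrative.
LINE_RESISTANCE_OHMS = 5          # a few miles of 1890s copper wire, hypothetical
POWER_SENT_W         = 100_000    # 100 kW headed from the powerhouse to the mill

def loss_fraction(volts):
    current = POWER_SENT_W / volts
    return current ** 2 * LINE_RESISTANCE_OHMS / POWER_SENT_W

print(f"at    500 volts: {loss_fraction(500):.0%} of the power lost as heat")   # 200% -- hopeless
print(f"at  3,000 volts: {loss_fraction(3000):.1%} lost")                       # about 5.6%
print(f"at 30,000 volts: {loss_fraction(30000):.2%} lost")                      # about 0.06%
```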

The owners of the Ames plant strung new lines from the plant to more mines, then to town and beyond, becoming the Telluride Power Co., which would own and operate several generating stations and hundreds of miles of transmission lines. Similar systems, built by similar utility monopolies, grew up around the nation.

Until World War II, each utility’s grid was fairly self-contained, with fossil-fueled or hydroelectric power plants located close to the residents and industries that used their power. But in the middle of the 20th century, as long-distance transmission technology improved, the utilities oozed outward, building huge coal power plants in the interior West near the mines, which sent power hundreds of miles across mesa and canyon to Palm Springs, Los Angeles, San Diego, Phoenix. Meanwhile, each of the three distinct grids—the Western, Eastern and ERCOT, or Texas—became more internally interconnected to increase reliability.

The Western Grid’s 240,000 megawatts of generating capacity come from sources ranging from dams in British Columbia to coal-fired plants in northern Mexico, traveling on 120,000 miles of high-voltage transmission lines, plus countless miles of distribution lines, the smaller wires that deliver power to your home. As it expanded from one-town micro-grids to today’s weblike Leviathan, the grid grew in an organic fashion, with new components welded on to the old ones, like additions slapped on to trailers in the rural West. Hydropower from that same Ames plant now travels alongside coal- and solar-generated electrons in transmission lines built in the 1980s.

Operation and regulation of the grid is a similar mishmash. In the late 1970s and early 1980s, as the Bell telecommunications monopoly was dismantled, a similar effort was made to transform electricity from a service provided by monopoly utilities into a commodity traded on an open market. For the first time, nonutilities were able to build power plants, mostly natural gas-fired, and sell power to the utilities. In 1998, California dove into the open-market concept by opening the California Power Exchange. But unscrupulous operators gamed the system, with some producers creating false power shortages in order to up prices, and Enron engaging in its own crazy scheme of shipping power out of state, then back in, to dodge state price caps. That drove the utilities to the verge of collapse, caused “brownouts,” led to the recall of California Gov. Gray Davis and, in 2001, ended the power-exchange experiment.

Today, about 80 percent of California’s grid is run by the California Independent System Operator, a nonprofit entity that allows wholesale power producers access to the grid. It’s essentially still the open market, though purportedly less prone to gaming than the earlier exchange, and it follows the same model as in most of the Eastern and Texas grids. The rest of the West, though, is stuck somewhere in between the old model and the new, with monopolized utilities—a mixture of investor-owned, municipal and co-ops, each of which is regulated differently—still running the show.

Federal policy—or the lack thereof—hasn’t helped. The authors of the 2011 MIT report, The Future of the Electric Grid, bemoan the fact that in other industries such as natural gas, telecommunications and airlines, federal policy was reformed after the 1970s to reflect market realities. “In contrast,” they write, “despite dramatic changes in the electric power sector, federal policies established in the 1930s … still play a central role in that sector.” In other words, just as new pieces have been added onto the old grid, new policies have been piled on top of antiquated ones.

It sounds chaotic, and as the San Diego outage and others reveal, it often is. When the H-NG power line shut down back at Palo Verde, the electricity sought out the path of least resistance towards its destination, which in this case was a tangle of lines in the inland desert that weren’t equipped to handle such high voltages. Seconds after that arc had crackled over the Yuma substation, lines, transformers and other equipment from Mexico up into the Imperial Valley were pushed to their limits, and began to fail. Some physicists will tell you that this phenomenon is an inevitable consequence of a grid that has evolved to operate under principles of self-organized criticality, prone to the same sort of non-linear, cascading cataclysm as wildfires, avalanches or earthquakes.

But for the most part, this gargantuan contraption is so seamlessly reliable that most of the millions of people who use it forget it exists. A small army of technicians is dedicated to keeping it that way, perching in front of monitors in rarely seen control rooms around the country.

 

On the south end of Scottsdale, Ariz., a low-slung, mostly windowless brick building sits back from the street between a Spanish colonial apartment complex and an upscale mobile home park. No sign tells the curious passerby what might lie within, yet the gleaming razor wire atop the surrounding wall raises unsavory possibilities. A guard is there to stop the curious, and if she fails, the device that pops out of the pavement and rips your tires to shreds will definitely succeed.

This is the operations center for the Salt River Project, one of the nation’s largest municipal utilities. The executives and the clerks hang out at another building a couple of miles away in Tempe, but this is where the real grid action goes down. When potential crises strike, operators here do their best to keep them from spreading. (SRP wasn’t hit by the San Diego blackout, but since it operates the Palo Verde switchyard, it was peripherally involved.) Most of the time, though, technicians spend their time keeping power flowing to some 1 million customers in the Phoenix metro area from the utility’s power generators, which range from shares in coal plants as far away as northern Colorado to small hydroelectric facilities on Scottsdale’s canals.

You’d think the stress of keeping all those air conditioners running would wear on Mark Avery, SRP’s grid manager. But he’s fit, trim and looks no older than 50, with a full head of salt-and-pepper hair. When he tells me that he started his career as an operator trainee in 1974 at Navajo Generating Station in northern Arizona, I have to ask him to repeat himself.

Of all the baffling facts about the grid, perhaps the most mind-boggling, Avery tells me, is its constant need to be kept in balance. “The Western Grid is like a giant bucket,” he says, “with a bunch of spouts running in and out, and you have to keep the water level constant.” That is, the amount of electricity being fed into the Western Grid by thousands of generators must always be equal to the load—meaning the amount being used by its millions of customers. Lose the balance, and the frequency of the alternating current will drift away from the optimum 60 cycles per second, which could cause equipment to fail and result in outages. In the Western Grid, that balancing act is performed simultaneously by 38 different authorities; Avery and his colleagues oversee one of them.
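Avery’s bucket can be turned into a toy calculation using the simplified “swing” relation that grid engineers use for frequency. The 240,000-megawatt figure is the Western Grid capacity cited earlier; the inertia constant is an assumed value, purely for illustration:

```python
# A toy version of the balancing problem: if generation and load fall out of
# balance and nothing corrects it, the grid's frequency drifts away from 60 Hz.
# Uses the simplified aggregate swing relation df/dt = f0 * dP / (2 * H * S);
# the inertia constant H is an assumed figure, purely for illustration.
F_NOMINAL_HZ  = 60.0
SYSTEM_MW     = 240_000    # Western Grid generating capacity cited earlier
INERTIA_H_SEC = 4.0        # assumed system-wide inertia constant

def frequency_after(imbalance_mw, seconds):
    drift_per_second = F_NOMINAL_HZ * imbalance_mw / (2 * INERTIA_H_SEC * SYSTEM_MW)
    return F_NOMINAL_HZ - drift_per_second * seconds

# A 1,300 MW shortfall (about what the H-NG line was carrying), uncorrected for 10 seconds:
print(f"{frequency_after(1300, 10):.2f} Hz")   # 59.59 Hz, well outside normal tolerance
```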

Each day, using models based on weather forecasts and historical patterns, SRP’s marketing team draws up a demand forecast for the following day, and schedules generation from SRP’s own array of generators (or from neighboring utilities if it’s cheaper) to “follow” the demand curve. They also schedule plenty of extra backup power—usually from fast-firing natural gas or oil “peaking” turbines—to make up for forecast errors or to compensate for a downed power line or plant. The grid operators are then responsible for implementing the daily plan, and for tweaking it as it unfolds with hourly forecasts and scheduling. Over the course of the hour, they make up for energy imbalances—or deviations from the plan—by turning generation up or down. Minor, second-to-second bobbles are “regulated” automatically by software, typically by adjusting Hoover Dam’s hydroelectric turbines.
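For readers who want to see the moving parts, here is a minimal sketch, in Python, of the three layers of that routine: a day-ahead schedule, within-the-hour adjustments and automatic regulation. It is not SRP’s actual software; every number, name and threshold in it is invented purely for illustration.

def schedule_day_ahead(hourly_forecast_mw, reserve_mw=300):
    """Day-ahead plan: follow tomorrow's forecast hour by hour, plus a backup cushion."""
    return [load + reserve_mw for load in hourly_forecast_mw]

def hourly_adjustment(scheduled_mw, actual_load_mw):
    """Within the hour, operators ramp generation up or down to cover the gap."""
    return actual_load_mw - scheduled_mw

def regulation(frequency_hz, target_hz=60.0, gain_mw_per_hz=500.0):
    """Second-to-second bobbles: software nudges a fast unit (hydro, say) to pull
    the alternating current back toward 60 cycles per second."""
    return (target_hz - frequency_hz) * gain_mw_per_hz

# Tomorrow's forecast for three afternoon hours, in megawatts, becomes a schedule:
print(schedule_day_ahead([4800, 5200, 5500]))   # [5100, 5500, 5800]
# The 3 p.m. hour was scheduled at 5,200 MW, but a heat wave pushes load to 5,450 MW:
print(hourly_adjustment(5200, 5450))            # operators ramp plants up by 250 MW
print(regulation(59.98))                        # software adds about 10 MW to lift frequency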

Over time, this balancing act has become more and more challenging. Four decades ago, the greatest demand came from big industrial facilities like factories or mines that ran round-the-clock or on a set schedule. The generation sources were also steady and predictable, coming mostly from “baseload power,” meaning coal or nuclear.

In the 1980s, the demand side of the equation began to change radically. As manufacturing moved overseas and people poured into the region, residential and commercial customers—whose electricity demand curve has bigger daily ups and downs—took up a larger share of overall demand. The air-conditioning revolution arrived at the same time: Between 1980 and 2009, the percentage of Western homes with air conditioning shot up dramatically, so that now there are more than 18 million homes with power-gulping cooling systems on the Western grid. On a summer’s day in the desert Southwest, the overall electrical load at 5 p.m. can be twice what it was at 5 that morning, mostly due to the energy it takes to cool us all down; air conditioning alone can account for about 30 percent of total peak electricity demand in California or Arizona.

The new sources of power feeding into the grid are even less predictable. Solar and wind energy can swing up and down dramatically during a single hour. A massive dust storm or thunderheads moving in on a summer afternoon can cut production from a photovoltaic array by 80 or 90 percent in a matter of seconds. Wind-power swings are less violent, but can be huge: California’s collective turbine output can vary by 3,000 megawatts or more over the course of a day, and by 100 megawatts in an hour. The greater the percentage of solar and wind in the mix, then, the greater the potential for errors in the day- and hour-ahead scheduling, and the more potential for imbalances, instability and outages.

“It’s not the same kind of dispatchable, turn a lever, decide a day ahead what you’re going to run the next day with any kind of certainty system that we’re used to,” says Brian Parsons, transmission and wind integration group manager at the National Renewable Energy Laboratory in Golden, Colo. Utilities typically respond to that uncertainty by adding two megawatts of natural gas backup capacity for every three megawatts of added wind power, chalking up the expense of building and operating the reserve to wind’s “ancillary costs.”

For Mark Avery, the variability is virtually a non-issue, because only about 3 percent of SRP’s energy mix comes from solar and wind. But in California, where the state has required utilities to get 33 percent of their power from renewables by 2020, it’s been a significant source of hand-wringing, as officials scramble to make sure they have enough reserves to cover wind and solar’s variability.

Fossil-fuel pushers regularly warn that replacing their steady plants with fickle solar and wind will plunge us all into darkness. They point to Germany, which now gets more than 20 percent of its power from non-hydroelectric renewables, primarily solar and wind. That has pushed the transmission system to “the brink of capacity,” according to that grid’s federal overseer, and renewables-caused voltage swings have resulted in machine malfunctions at Hamburg factories.

But is the problem really with renewables, or with the grid and the way it is run?

 

In the spring of 2011, rivers in the Pacific Northwest swelled when an unusually ample snowpack melted, and the water backed up behind the big electricity-generating dams of the Columbia River and its tributaries.

The Bonneville Power Administration, which manages the dams and its own grid-balancing area, had to either run that extra water through the turbines, putting thousands of megawatts of additional power into the grid, or spill it over the dams without generating electricity. The decision seemed simple—produce the power, and sell it for a bundle, right? But it wasn’t, because of the limitations of the grid and the presence of native fish.

Even during normal water levels, the collective power plants and dams in BPA’s balancing area generate far more power than its customers can use. In late April of this year, for example, the dams and a growing number of wind farms together cranked out as much as 6,000 megawatts more than BPA’s customers could use, enough surplus to power some 6 million homes. So both the BPA and many of the wind farms have contracts to sell that power elsewhere, much of it going directly to California by way of the Pacific Intertie, an 850-mile-long, high-voltage DC “electricity superhighway” that runs from near The Dalles Dam on the Columbia River down to Los Angeles.

As the massive 2011 snowmelt began, power consumption everywhere was down, due to a combination of the recession and mild temperatures in the Northwest and California. The grid operators had to figure out how to curtail power production in order to maintain balance. Spilling the water over the dams, though, would raise the percentage of dissolved gases, such as nitrogen and oxygen, in the river downstream, which, in turn, could kill migrating endangered salmon with something called gas bubble trauma. So the power behemoth forced the wind farms, which rely on BPA’s transmission to get their product to market, to shut down so that it could keep its hydropower operation going full-tilt.

For nearly two months, 2,000 wind turbines sat idle, causing their owners to lose between $2 million and $5 million in potential revenue, even as the dams—not to mention coal plants in other parts of the West—continued to generate juice. The wind companies, claiming discrimination, sued the BPA and filed a formal complaint with the Federal Energy Regulatory Commission, or FERC. Save our Wild Salmon intervened on wind’s side, arguing that the BPA was using the salmon-saving argument without basis—the group believes that the benefits to fish from spilling water offset the harm from gases—to keep from having to unload its hydropower at “negative prices” (paying others to take the electricity, a not uncommon practice in electricity markets). FERC ruled in favor of wind and sent the BPA back to the drawing board. Last year, the BPA curtailed far less wind power and compensated wind companies for resulting losses to the tune of some $3 million (mere chump change for the BPA, whose total budget is around $4.4 billion).

Most observers agree that’s not a sustainable solution; the FERC commissioners noted in their ruling that an expansion and improvement of the grid, i.e., more transmission, could alleviate the pain of all the parties involved by opening up more pathways to market that surplus power. In so doing, the commissioners allied themselves with a growing group of environmentalists who want to change the grid so that it can absorb massive amounts of renewable energy and help combat climate change.

Enter the grid-oriented greens.

Amanda Ormond has been involved in energy issues for over two decades, including a seven-year stint as director of the Arizona State Energy Office, and a subsequent career as a consultant, working mostly with renewable energy companies. Today, she is considered one of the region’s foremost experts on solar power and is a member of the Western Grid Group, an independent organization made up largely of former utility regulators and state officials who are devoted to transforming the grid to increase the amount of renewables in our energy mix. Ormond has a wholesome look—long brown hair cut straight at the bangs, with freckles on her nose—that belies the intensity with which she thinks and talks about these issues.

To those who fret about the destabilizing effects of adding too much solar and wind to the grid, Ormond has a quick response: Cooperate, share and take advantage of the West’s geographical diversity to iron out those variable output curves. The concept is called geographical smoothing, and it’s been embraced by everyone from grid-oriented greens to scientists and engineers.

“The West is very diverse, and that’s a good thing,” says Ormond. “You want plants all over the place, because the wind’s always blowing somewhere.”

David Mooney, center director at the National Renewable Energy Laboratory, agrees. “The more geographically diverse your (wind and solar) systems are, the less variability,” he says. In other words, a dip in output by a wind farm in Tehachapi, Calif., can be offset by turbines in Wyoming; ditto for a solar farm in New Mexico, which might reach peak output two hours before a facility near San Diego. It’s also easier to predict fluctuations over a broader area: An MIT study found that when a geographic region’s diameter is increased, forecast errors are reduced by as much as half.
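A back-of-the-envelope simulation shows why the math works in diversity’s favor. The sketch below, written in Python with entirely made-up numbers, treats each plant’s hourly output as swinging independently; measured against total output, the combined swing shrinks as plants are scattered across more sites.

import random
random.seed(1)

def relative_variability(num_plants, hours=2000, mean_mw=100, swing_mw=40):
    """Standard deviation of combined output, as a share of average combined output.
    Each plant swings independently between (mean - swing) and (mean + swing) megawatts."""
    totals = []
    for _ in range(hours):
        total = sum(mean_mw + random.uniform(-swing_mw, swing_mw)
                    for _ in range(num_plants))
        totals.append(total)
    avg = sum(totals) / len(totals)
    variance = sum((t - avg) ** 2 for t in totals) / len(totals)
    return variance ** 0.5 / avg

for n in (1, 10, 100):
    print(n, "plants:", round(relative_variability(n) * 100, 1), "percent variability")
# One isolated plant varies by roughly a quarter of its output; a hundred
# scattered plants, taken together, vary by only a couple of percent.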

To help accomplish this smoothing, grid-oriented greens would like to see the West’s 38 balancing areas join together into an “energy imbalance market,” or EIM, that would allow them to share both renewables and the natural gas-fired plants that back them up. Electricity sales would take place on a five-minute schedule, rather than an hourly one, because that’s more in rhythm with the ups and downs of solar and wind. That would alleviate the need for each utility to build its own backup plants, and would therefore lower the cost of integrating renewables into the grid. “Say a utility has 50 generators,” says Ormond, “but there are 4,000 in the West. If you had access to all 4,000, it would be more efficient.”
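A toy comparison, again with invented figures, suggests why the five-minute rhythm matters: If solar output ramps down steadily over a single afternoon hour, a schedule locked in at the top of the hour leaves a large and growing gap for backup plants to fill, while a schedule refreshed every five minutes tracks the ramp closely.

def solar_output(minute):
    """Hypothetical solar output falling from 1,000 MW to 410 MW over one hour."""
    return 1000 - 10 * minute

hourly_schedule = solar_output(0)                          # locked in for 60 minutes
five_minute_schedule = {m: solar_output(m) for m in range(0, 60, 5)}

hourly_gap = sum(hourly_schedule - solar_output(m) for m in range(60)) / 60
five_minute_gap = sum(five_minute_schedule[m - m % 5] - solar_output(m)
                      for m in range(60)) / 60

print("average backup needed under hourly scheduling:  ", hourly_gap, "MW")
print("average backup needed under 5-minute scheduling:", five_minute_gap, "MW")
# Roughly 295 MW of backup on average versus about 20 MW, for the same ramp.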

Tom Acker, a professor of mechanical engineering at Northern Arizona University, points out that for renewable-energy producers, it makes no sense to restrict themselves to such a small area. He compares the utilities’ current approach—building up all their own natural gas reserve plants—to buying a big SUV for everyday use, even though you really need it only a few days out of the year. Under an EIM, a bunch of balancing utilities would be able to share their renewables and that SUV, not to mention the transmission lines. “The progressive way of thinking is to share. … That is absolutely crucial or we’ll never get renewable energy into the system.”

Indeed, an energy imbalance market, says Cameron Yourkowski, policy analyst at Renewable Northwest Project, a Portland-based advocacy group, would have allowed wind operators or the BPA to put all that surplus power up for bid during the spring of 2011.

EIMs are slowly taking hold in the West. Xcel Energy, which serves most of Colorado’s heavily populated Front Range, is pushing for an energy imbalance market to expand the pool of reserves—both fossil fuel and renewable—from which it can draw to back up its burgeoning quiver of wind power. And in February, the California Independent System Operator and PacifiCorp announced that they would create a real-time energy imbalance market by autumn of 2014.

But a well-connected market requires a well-connected grid, and that’s the catch: The current fossil-fuel-centric grid has left many of the windiest, sunniest places marooned, without a way to get solar and wind power to population centers. How much new transmission is actually needed remains uncertain.

“Local groups will say we can do this with rooftop solar,” and therefore minimal additional transmission, says Gary Graham, of the Boulder-based Western Resource Advocates. “It’s just not the case. You can’t get that many solar panels on roofs that fast in the West” to reach his group’s goal of 50 to 80 percent renewables by 2050. “You need utility scale.” And that, says Graham, could require up to 25,000 miles of new transmission in the West alone.

In today’s climate, that would be a herculean feat. Try stringing 600 miles of cable, held up by hundreds of 200-foot towers, and someone will try to stop you, guaranteed.

Transmission is notoriously difficult to build, because a single line can cross so many jurisdictions, from private land, where landowners only get a one-time payment for an easement that will last forever, to federal, tribal and state lands. While the feds can push a natural gas pipeline across multiple states using condemnation powers, they can’t do the same with transmission lines, thanks to a 1935 law. A provision in the 2005 Energy Policy Act established “national interest” transmission corridors—including one leading from Southern Arizona into the San Diego area to alleviate congestion there—through which the feds could, theoretically, have backstop condemnation authority if the states dawdled. However, courts sympathetic to environmental concerns have mostly hamstrung the law.

About two dozen major interstate transmission lines to enable renewables are in various stages of permitting in the West, but none are proceeding very quickly.

Colorado billionaire Phil Anschutz wants to build a huge wind farm near Rawlins, Wyo., and ship the power to California by way of Las Vegas on a 600-kilovolt, $3 billion direct-current line, the TransWest Express. Environmentalists aren’t thrilled about 1,000 turbines in sage grouse habitat. And at a recent pro-transmission conference in Denver, Bill Metcalf, of the Rocky Mountain Farmers Union, balked at the TransWest line because it is direct current, meaning few or no on- and off-ramps en route; other wind farms along the way won’t be able to connect to it, making it exclusively Anschutz’s electricity highway.

The proposed SunZia line from wind-rich central New Mexico to the Tucson area has hit opposition because it could cross through environmentally sensitive areas such as southern Arizona’s San Pedro River Valley. In California, a relatively short line, from the wind farms of Tehachapi to the Los Angeles grid, is on hold thanks to resistance from a well-to-do community in its path.

In 2010, FERC tried to push grid expansion by encouraging stronger regional planning efforts, and bringing in stakeholders at an earlier stage. Ormond is somewhat optimistic about how that’s working in the West.

“Instead of looking just at voltage flows and contingency plans,” she says, “they’ve been looking at: What do we need, holistically, going forward? Let’s look at all these possible futures and concentrate on building the paths that are most necessary.”

Stringing wire all over the landscape is not the answer, says Ormond, though some wire must be strung. Even more important is putting the tangle of wires we’ve already got to better use.

“If we do a combination of using the stuff we have better, doing some better sharing, adding new technology products like EIM ... then we’re not going to need tons and tons of transmission,” says Ormond. Operators have to make the grid “smarter” by outfitting it with better monitoring equipment and automating more of its operations: During the San Diego outage, controllers in each of the five affected balancing areas had to call each other on the phone to figure out what was going on. Transmission lines that are filled to capacity only during a few hot days in the summer should be opened to wind and solar for the rest of the year. Rooftop solar and other distributed generation sources should be beefed up with strong incentive programs. Viable energy storage would solve nearly all of the problems posed by intermittent renewables, but storage on a large enough scale is still years, maybe decades, away.

If the green gridders are a faction of the environmental movement, they are not fire-in-the-eyes activists; they’re more like technocratic problem solvers, working behind the scenes with environmental groups, utilities and state and federal regulators to make the grid less carbon-intensive. They’re making progress, though it’s slow.

Rather than opposing new transmission projects outright, as they might have once done, environmental groups such as The Wilderness Society and the Natural Resources Defense Council have taken a role in the FERC-pushed, stimulus-funded transmission-planning process and actively push projects that will bring more renewables to the grid so long as their aesthetic and environmental impacts are deemed acceptable.

“I think we are moving in the right direction ... toward a more regionally integrated system,” says Ormond. But she worries that progress might be impeded by deepening political partisanship; Ormond worked under two Republican governors, and Arizona’s current solar incentives were put in place by a GOP Arizona Corporation Commission. Today, however, Republicans routinely use renewables—especially news-making failures, such as Solyndra—as a whipping boy.

“Clean energy does not need to be a partisan issue. In fact, it’s really bad if it is,” she says. “Bottom line is: It’s not good for the country.”

 

As the spark that lit the San Diego blackout hurtled full-throttle across the electrical landscape, various lines, substations and generators tripped off-line, throwing the system out of balance. Grid operators tried to extinguish the flare-ups by cranking up peaking generators, but they weren’t quick enough. Meanwhile, the various collapses in the system forced virtually all of San Diego’s electricity onto one set of power lines, running from San Onofre Nuclear Generating Station southward into the city. At 3:38 p.m., the lines tripped, and San Onofre’s reactors shut down.

Milliseconds later, most of the San Diego region was without power.

The ensuing 12 hours of darkness cost the city and its businesses an estimated $100 million for everything from spoiled food to lost productivity to government overtime. Officials at San Diego Gas and Electric quickly used the failure to their political advantage, speculating that if the Sunrise Powerlink—a controversial power line bringing solar and wind power 120 miles from the Imperial Valley west to San Diego—had been in operation, it might have prevented the outage or helped facilitate a quicker recovery. (The Powerlink started carrying wind power from the also-controversial Ocotillo wind farm this January.)

It took regulators six months to sort through what had happened, and in spring 2012, they released a report detailing the to-the-millisecond timeline and blaming the outage on poor communication, bad procedures and sloppy planning. Grid-oriented greens were at least somewhat validated: Had better “real-time situational awareness,” or a smarter grid, been in place, the whole thing might have been avoided, according to the report. A good energy imbalance market might have given grid operators quicker access to backup, alleviating some of the pain.

One thing is certain: More blackouts will occur. California grid operators worry they could come this summer: San Onofre has been offline for repairs (unrelated to the 2011 outage) since January 2012, and in June, Southern California Edison announced it planned to decommission the plant.

But perhaps the biggest, most insidious threat is our warming climate, which is already attacking the grid on many a front, according to the report, Global Climate Change Impacts in the United States. Weather-related outages have increased tenfold in the last two decades, and it’s only bound to get worse. Hotter days mean bigger peak loads; higher loads and higher temperatures strain power lines, causing them to lose more of the electricity flowing through them and to sag into vegetation: The West’s biggest outage thus far put some 7.5 million people across seven states into the dark when, during a triple-digit heat wave, a line near Portland sagged into a filbert tree, sparking a cascading outage. And, of course, heat and drought exacerbate wildfires, which can take out major power lines as happened in 2007 in San Diego, as well as diminish hydroelectric capacity from reservoirs. Meanwhile, it’s our fossil-fueled electricity system that emits the largest share—some 40 percent—of greenhouse gases.

We may have reached the point at which adaptation is the best approach. While shopping malls across San Diego shut down entirely, and Hooters turned away customers, some bars fired up generators to keep the lights on, and the customers poured in. An Albertsons grocery store kept the coolers humming with a natural-gas-powered fuel cell and had a banner day. They had all effectively thumbed their noses at the 20th century’s finest engineering achievement and instead gone back in time to the days of the ultra-local Ames micro-grid. By doing so, they breezed right through what so many others experienced as a catastrophe.

On Sept. 8, 2011, Sasha Seyb, a freelance decorative artisan in her late 30s who has lived in downtown San Diego for several years, was driving home from work on the coast when the outage hit. She first noticed that streetlights were blinking, and then realized that neither her radio nor her cell phone worked. Her first reaction was to panic, thinking some sort of major catastrophe had hit. But after she got home, and the news circulated that it was merely a technical glitch way over in Arizona, she and her whole neighborhood simply breathed a sigh of relief.

“Everybody was outside; the kids were all eating ice cream; adults were drinking their beer and grilling steaks,” she says. “It was a giant block-party barbecue. It was a really neat vibe, a nice feeling.

“And that night, you could actually see the stars for once.”

Jonathan Thompson is a senior editor at High Country News, where this story originally was published.

Published in Environment

For more than a century, monopoly electric utilities have nurtured the West. They fed the mines and the mills, and now deliver the juice to our thirsty digital devices and air conditioners.

Now, it appears as if the offspring is offing its mother, as rooftop solar slowly strangles utilities.

While the green media has gleefully spread word of this apparent matricide, it was first spawned by a report right out of the utility industry itself, and then bolstered by a prominent utility executive, lending it credence. The concern from the industry is fairly straightforward: If customers produce their own energy, they won’t need to buy it from the utility, and revenues will drop. And if those consumers produce more energy than they use, they become competitors, lower the price of electricity and take another bite out of the utilities’ bottom line—until we just don’t need the utilities anymore at all.

The idea of this sort of rooftop revolution is as rousing and lovely as that of wiping out our industrialized food system, with backyard and rooftop gardens. But it’s also nearly as implausible for two reasons: scale and dependency.

If any utility should be under threat from rooftop solar, it would be the Phoenix area’s Arizona Public Service. The state is one of the best places in the world to generate solar power, and it has a strong net metering program that allows homes with distributed generation to recoup their costs and then some. APS boasts that 24,000 of its customers have taken advantage of the sun and the incentives. While that’s a hefty number, it represents only about 2 percent of the utility’s more than 1 million customers. That may put a tiny dent in APS’ $600 million-plus yearly profit, but it’s a long way from being an existential threat. Even the 150,000 solar rooftops here in California, with a maximum capacity of less than half of what the Palo Verde Nuclear Generating Station kicks out at any given moment, are a mere drop in the total energy bucket.

With the cost of solar panels continuing to drop, it is conceivable that 2 percent could become 20 percent. But that still won’t necessarily be the death knell for utilities, because distributed generation as we know it now is still desperately dependent on the grid, and the utilities that run and operate it. David Roberts, over at Grist, recently noted that “a home creating its own power basically unplugs itself from the grid. … The electricity that’s generated onsite on a solar home is used by that home or its immediate neighbors. It barely touches the utility’s transmission and distribution system.”

While Roberts’ explainer on this issue is otherwise excellent, this passage doesn’t quite cut it. Even figuratively, one could say that a home “unplugs” only during those very rare moments when it produces exactly as much power as it uses. That might happen for a few minutes during the day. For the remaining 86,000 seconds in the day, rooftop solar is very plugged in.

Solar generation typically reaches its peak around 1 p.m., right at a time when residential power use is relatively low, because air conditioners have yet to crank up too much, and the residents are at work. Power flows from house to grid, where it adds to the current that is flowing towards the “load,” or places that need it. That might be a neighbor, unless her house’s panels are also generating surplus power, in which case it could be the Walmart down the street or the factory in a neighboring town. Residential power use then swings upward as the afternoon progresses, peaking around 5 p.m., as folks get home from work, and air conditioners rev up. By this time, solar power is on the downswing, so the typical residence will use more power than rooftop solar generates. It’s payback time, when residences that generated all that surplus power in the middle of the day get it “back” from the grid (though now the juice is most likely coming from natural gas, hydropower, coal or nuclear plants). In essence, a transaction is taking place that allows the rooftop solar home to treat the grid like a big battery, storing up excess power and then releasing it when needed.
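For those who like to see the arithmetic, here is a simplified, hour-by-hour tally for a single hypothetical solar home on a summer day, sketched in Python. The kilowatt-hour figures are invented; the point is only to illustrate the “grid as battery” transaction described above.

# (hour of day, kWh generated by the rooftop panels, kWh used by the house)
day = [
    (9, 2.0, 1.0), (11, 3.5, 1.2), (13, 4.0, 1.5),                    # midday: surplus flows to the grid
    (15, 3.0, 2.5), (17, 1.5, 4.0), (19, 0.2, 3.5), (21, 0.0, 3.0),   # evening: the house draws power back
]

banked = 0.0   # net kWh the home has pushed onto the grid so far
for hour, generated, used in day:
    net = generated - used
    banked += net
    direction = "exports to grid  " if net > 0 else "imports from grid"
    print(f"{hour}:00  {direction}  {abs(net):.1f} kWh  (running balance {banked:+.1f} kWh)")

# By evening the running balance goes negative: the house is drawing back the power
# it banked at midday, and every one of those kilowatt-hours crossed the utility's
# wires in both directions.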

This transaction is critical for rooftop solar to make any sense (unless one is inclined to attune one’s energy use precisely to the cycle of the sun, or to put in a big enough battery bank to back up all that solar on site, but more on that later). But the transaction can’t take place without the grid. And in most parts of the West—California being the exception—the grid is run by monopoly utilities, and they’re the ones firing up the so-called peaking generators necessary to keep the power on when the sun dims and demand is at its highest. Indeed, the cost of building and running those “peakers”—which in many cases are essentially power-generating, natural gas-guzzling jet engines—is the big threat to utilities. Yet they become more and more necessary as more solar—be it rooftop or utility-scale—is put into the grid.

There are ways around this quandary. The obvious one is for all those folks with distributed generation to battery up and literally unplug from the grid. The other is to broaden the push for distributed generation beyond rooftop solar, to small-scale hydropower, geothermal, wind and even small natural-gas plants, so that the collective input from distributed generation can meet demand at all times of day, easing the dependence on the utilities (though not necessarily the dependence on the grid; decoupling from the grid will be a lot harder than cutting the utilities loose, for a number of reasons).

In the meantime, the utilities might want to consider the recent warnings as a wake-up call. Rather than go to battle with distributed generation—by trying to kill incentives or cut down net metering programs—they’d do well to adapt to it, even embrace it. This won’t be easy. It's a hugely complex issue, but it may be the only way out. Rooftop solar can be the utilities’ killer, or savior, depending on how the utilities handle things.

Cross-posted from High Country News. The author is solely responsible for the content.

Published in Community Voices