It came down to the final minutes before midnight last night for SB 722, the bill that would have written California’s 33%-by-2020 renewable energy goal into law. But as the bill’s author, State Senator Joe Simitian, put it, “The clock just ran out. It’s as simple and painful as that.”
Author Archives: Lauren Sommer
Lauren is a radio reporter covering environment, water, and energy for KQED Science. As part of her day job, she has scaled Sierra Nevada peaks, run from charging elephant seals, and desperately tried to get her sea legs - all in pursuit of good radio. Her work has appeared on Marketplace, Living on Earth, and NPR's Morning Edition and All Things Considered. You can find her on Twitter at @lesommer.
Last year, as part of a radio series on methane, I drove down to visit John Fiscalini, who was building a huge methane “digester” to convert his cows’ “byproducts” into clean energy, and reduce the carbon footprint of his sizable dairy farm and cheese factory outside Modesto. After millions of dollars in design and construction costs, Fiscalini was fed up with state air and water regulators, who he felt were pulling him in different directions. A year later, have things improved? Not so much, as Quest’s Lauren Sommer found out, when she returned to the San Joaquin Valley for an update. — Craig Miller
Three years ago, KQED’s QUEST visited a Central Valley dairy that was taking an innovative approach to its waste problem. Instead of collecting thousands of pounds of cow manure in open holding ponds, Joseph Gallo Farms uses it in a renewable energy technology known as a methane digester.
Energy storage is something we’ve come to take for granted in everyday life. Our cell phones, iPods, cars and computers all depend on batteries. But storing large amounts of energy for the electric grid is another matter entirely. It’s a technical challenge that has yet to be met–but will need to be for the coming age of renewable energy.
California’s grid is designed to deliver electricity on a real-time basis. Every four seconds, the grid operators at the California Independent System Operator (ISO) have to ensure that the energy supply meets the demand in the state, something that’s known as “balancing” the grid (you can see today’s electricity forecast on the ISO site). As a result, they coordinate the one piece of the system that they have control over: the power plants.
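To make the balancing idea concrete, here is a toy sketch of the dispatch problem the operators face each interval. Everything here is invented for illustration — the plant names, capacities, and costs are assumptions, not ISO data — but it shows the core logic: match supply to forecast demand, cheapest resources first.

```python
# Toy model of "balancing" the grid: each interval, assign output to
# plants (cheapest first) until supply equals forecast demand.
# All names and numbers below are invented for illustration.

plants = [  # (name, capacity in MW, marginal cost in $/MWh)
    ("hydro", 3000, 10),
    ("gas_combined_cycle", 8000, 40),
    ("gas_peaker", 2000, 120),
]

def dispatch(demand_mw):
    """Return the MW assigned to each plant, cheapest first."""
    schedule = {}
    remaining = demand_mw
    for name, capacity, _cost in sorted(plants, key=lambda p: p[2]):
        used = min(capacity, remaining)
        schedule[name] = used
        remaining -= used
    if remaining > 0:
        raise RuntimeError("demand exceeds available capacity")
    return schedule

# A morning ramp: as demand rises, the expensive peaker
# only comes online for the peak.
for demand in (9000, 11500, 12500):
    print(demand, dispatch(demand))
```

The sketch also hints at why intermittent wind and solar complicate the job: unlike the plants above, their output can't simply be ordered up to fill the gap.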
In a kind of cruel paradox, heat has always been the enemy of solar panels. At higher temperatures, photovoltaic cells become less efficient, which is problematic in an industry where efficiency is the name of the game. That heat also represents wasted energy.
Today, researchers at Stanford University announced that they may have helped solve that problem. Nick Melosh of Stanford’s Materials Science & Engineering department set out to make use of the wasted heat. He and his colleagues created a solar cell technology that uses both light and heat to generate electricity. It’s called “photon-enhanced thermionic emission” (or PETE for short). “This is the first time that a process has been reported that can use the heat and the photons together harmoniously,” says Melosh.
Amidst all the fretting over the development of solar and wind technology, it hasn’t been lost on some scientists that there are organisms on the planet that have already cracked the renewable energy code: plants.
Photosynthesis is a highly efficient way of converting sunlight to fuel. So why not try to copy them?
Transportation is the top source of greenhouse gas emissions in California. So in a state where car culture rules, what will it take to get us out of our cars?
That’s the goal behind SB 375, a bill passed in 2008 that links greenhouse gases to urban sprawl. Under this first-in-the-nation policy, the state’s 18 regional planning organizations must reduce the emissions coming from vehicles through land use and transportation planning. This week, the Air Resources Board is expected to release the draft emission reduction targets that the agencies must meet by 2020 and 2035.
While the chances of getting Californians out of their cars completely are slim, the idea is to reduce the number of miles traveled through more public transit, more “walkable” communities and denser development. (Learn more about that in this Quest story about transit villages).
According to a report released today, that development approach can have some dramatic benefits, considering how California is expected to grow. By 2050, some projections put the population at 60 million, adding seven million new households.
The planning firm Calthorpe Associates looked at those housing needs and ran a number of growth scenarios, in a study funded by the California Strategic Growth Council and California High Speed Rail Authority. They compared a business-as-usual approach of low-density suburbs (30% urban and compact growth) to a “growing smart” scenario with more urban in-fill and transit-oriented development (90% urban and compact growth). While that last scenario may sound like the land of endless condos, according to Peter Calthorpe, it would still be 53% single family homes. Calthorpe calls it “a shift back to what California used to build–bungalows.”
Here are some of the benefits they found for the scenario by 2050:
- Reduces the number of vehicle miles traveled by nearly 3.7 trillion
- Saves more than $194 billion in capital infrastructure costs
- Saves 19 million acre-feet of water
- Prevents the release of 70 million metric tons of carbon dioxide equivalent, or 25% less than business-as-usual
- Saves California households $6,400 per year in auto-related costs and utility bills
In-fill development can often cost more than low-density development, and this report doesn’t take housing prices into account. Indeed, costs may be one of the biggest challenges for SB 375, since both the state and cities are facing budget crises and a lull in the housing market.
Under the bill, state transportation funding will be prioritized for projects that meet the SB 375 goals. But according to Hasan Ikhrata, Executive Director of the Southern California Association of Governments (one of the regional organizations doing the planning), financial incentives will be key to reaching the goals. “I think the biggest challenge is to find incentives to help cities, because cities want to do this, but they don’t have the resources to do it without help,” he said.
The Quest/Climate Watch series “33×20: California’s Clean Power Countdown” continues on Monday, with the first of two parts on one company’s attempt to build one of the nation’s largest PV solar arrays in San Benito County.
With its ambitious 33%-by-2020 renewable energy goal, California will be looking for renewable megawatts from all corners of the state. While the state may hit 18-19% by the end of this year, reaching 33% will require roughly doubling renewable generation, since the state’s energy appetite will continue to grow in the meantime.
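The "doubling" arithmetic works out roughly like this. The demand-growth figure below is an assumption for the sketch, not an official forecast; the point is simply that 33% of a larger pie takes about twice today's renewable output.

```python
# Back-of-the-envelope: why 33% by 2020 implies roughly doubling
# renewable generation. Units are arbitrary; the 10% load growth
# is an illustrative assumption, not an official projection.

today_demand = 100.0          # e.g., index today's demand to 100
renewable_share_now = 0.18    # ~18%, per the figure cited above
demand_growth = 0.10          # assumed load growth by 2020

renewables_now = today_demand * renewable_share_now
renewables_2020 = today_demand * (1 + demand_growth) * 0.33

print(renewables_2020 / renewables_now)  # about 2x today's output
```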
So, where will the energy come from? According to the California Public Utilities Commission, wind and solar will have to carry much of the “load.” Check out the CPUC projections in the charts below.
Almost lost amid the Copenhagen media clutter was last week’s meeting of the American Geophysical Union in San Francisco. So this week we’re playing a little catch-up. Lauren Sommer has the second of three posts on things that caught our attention at AGU.
Carbon capture technology has largely focused on the most convenient emissions sources–namely the stacks at large power plants. But as Columbia University’s Allen Wright showed at the American Geophysical Union conference in San Francisco last week, there are other ways to do it.
Wright and colleagues demonstrated their “air capture” technology, in which carbon dioxide is absorbed straight from the air by something that looks a lot like a gadget for cleaning Venetian blinds. It’s a special plastic material with a sponge-like consistency. Once the CO2 is absorbed, the material is exposed to water or water vapor, which causes the CO2 to be released so it can be captured. Wright says it captures CO2 three to five times better than a leaf in full sunlight.
On a large scale, this technology might be built into “artificial trees” that could be stationed anywhere around the globe. The prototype, designed by Wright’s Global Research Technologies, doesn’t look much like a tree. It’s a shipping container with a circular, rotating basket on top where the air capture units are exposed to the air. After one rotation, the baskets would be brought “downstairs” where the carbon is captured. From there, the carbon could be geologically sequestered or even used to make beverages bubbly.
Of course, the main criticism of this approach is efficiency. Carbon dioxide is only about 0.04% of the atmosphere, which is why more concentrated sources like power plant stacks get more attention. Wright says capturing carbon from power generation will be important, “but capture at the stack isn’t enough. It won’t do what has to be done. Air capture has the advantage of being able to deal with emissions from anywhere on the planet from any source.”
Cars are one of the sources he’s talking about. The prototype unit is designed to capture a ton of carbon a day, which would neutralize the emissions from about 20 cars. The team hopes to get the cost of each carbon-capturing unit down to the price of a car, so that the cost of reducing a ton of carbon could one day be competitive with other technologies.
Still, to make an impact on global emissions, millions of these units would need to dot the landscape. And just as with renewable energy, NIMBY issues are a potential roadblock. But as is a common refrain these days, Wright says if we’re serious about cutting emissions, we’ll need every technology that shows promise.
Nearly lost amid the three-ring circus of Copenhagen coverage is the annual gathering in San Francisco of the American Geophysical Union. We’re doing our best to staff selected sessions there. Climate Watch contributor Lauren Sommer was there for some grim new research on groundwater in the Central Valley.
California’s Central Valley has lost nearly enough water in the past six years to fill Lake Mead, according to NASA scientists presenting at the American Geophysical Union Conference in San Francisco this week. Nearly two-thirds of that loss–20.3 cubic kilometers of water–is from groundwater depletion.
With the recent drought, groundwater has been an important water source for California’s Central Valley agriculture, but getting a picture of that water use hasn’t been easy. Water districts haven’t been required to report groundwater pumping in their areas. That’s something the recent Delta overhaul package of legislation now requires, but according to Jay Famiglietti of UC Irvine, the records to date aren’t very complete. Wells are sparse and the measurements have been sporadic.
The majority of the water loss since 2003 has been focused in the San Joaquin Basin at the southern end of the Central Valley, which is losing 3.5 cubic kilometers of water each year. The bulk of that loss is the result of groundwater depletion.
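For scale, those volumes convert into acre-feet, the unit California water managers typically use, as sketched below. The Lake Mead capacity figure is an assumed reference point for comparison, not a number from the NASA study.

```python
# Convert the reported groundwater losses from cubic kilometers
# into millions of acre-feet (MAF). Lake Mead's full capacity
# (~28.9 MAF) is an assumed comparison figure, not from the study.

M3_PER_KM3 = 1e9
M3_PER_ACRE_FOOT = 1233.48

def km3_to_maf(km3):
    """Cubic kilometers to millions of acre-feet."""
    return km3 * M3_PER_KM3 / M3_PER_ACRE_FOOT / 1e6

groundwater_loss = km3_to_maf(20.3)       # groundwater depletion alone
total_loss = km3_to_maf(20.3 / (2 / 3))   # if that is ~two-thirds of total
lake_mead_capacity_maf = 28.9             # assumed reference

print(round(groundwater_loss, 1), round(total_loss, 1))
```

The total comes out to roughly 25 million acre-feet, which squares with the "nearly enough to fill Lake Mead" comparison.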
Famiglietti says this is due to a “triple threat” in California. First came the drought, then decreased water allocation and more groundwater pumping. Finally, with less surface water, the groundwater aquifers have a reduced opportunity to recharge. Famiglietti says it’s clear that California is using groundwater at an unsustainable rate, which “poses significant threats to food production in US and the California economy.”
This large-scale picture of California’s groundwater comes from NASA’s Grace project. Twin satellites orbiting the Earth detect changes in the gravitational field caused by the movement of water. Those satellite measurements act like a “scale at the bottom of the ocean weighing how much water is in each of these spots,” according to NASA’s Michael Watkins. They also detect changes in snow, surface water and soil moisture.
The Grace project, though, is becoming a “senior citizen,” according to Watkins, and is reaching the end of its technological life. He says the quality of their water research, which has included other spots around the globe, speaks to the need for another generation of the project. Famiglietti says that though this data can’t replace ground measurements, he hopes it will be taken into account by state agencies faced with making tough choices about California’s aquifers.
Lauren Sommer’s two-part radio series on carbon capture in California airs this week on The California Report. You can also view her slide show at the end of this post.
The idea seems simple enough: In order to get energy, we burn carbon. In most cases, that carbon comes out of the ground in the form of natural gas or coal. So instead of releasing the resulting carbon dioxide emissions into the atmosphere, why not put it back into the ground?
Of course, carbon capture and storage/sequestration (CCS) is much more complicated than that. Nonetheless, it’s a strategy being pursued aggressively by both international leaders and US Energy Secretary Steven Chu, who would like to see it deployed within ten years.
There are obstacles on both the “capture” and “storage” sides of the equation. In terms of technology, however, “storage” is much further along, thanks to the oil and gas industry, which is already using CO2 in oil recovery. Injecting compressed CO2 into oil fields forces more oil to the surface in a process known as enhanced oil recovery. As many in the industry will remind you, they have three decades of experience doing this.
Keeping it underground is another matter. In the western US, the West Coast Regional Carbon Sequestration Partnership (WestCarb) is setting up a number of pilot projects to study how CO2 can be safely stored underground. As Technical Director Larry Myer explained to me, one of the primary goals is to simply work out the regulatory, siting, and liability issues.
As with any waste problem, choosing the site is the most important–and often most difficult–step. California’s Central Valley has plenty of underground saline aquifers and depleted oil and gas fields that could hold CO2. But the trick is finding a site where the geology can securely store it and where there’s little risk of groundwater contamination. On the plus side, scientists know that CO2 is slowly immobilized underground, which lessens the risk over time. But how long that takes is still under study.
As for the “capture” issue, there are three ways to separate CO2 from power plant emissions.
- In today’s Climate Watch story, I describe Oxyfuel technology, in which natural gas is burned in pure oxygen. Since the outputs are steam and carbon dioxide, the CO2 can be easily siphoned off. But that requires building new power plants from scratch.
- The second option seeks to deal with the carbon dioxide before the fuel is burned; a “pre-combustion” approach. Or for all you wonks out there: Integrated Gasification Combined Cycle (IGCC). The downside to this process is that it requires gobs of energy, which makes it expensive.
- Finally, there’s the “post-combustion” approach. That’s where the CO2 is “scrubbed” from flue gas after the fuel is burned. Existing plants can be retrofitted with this technology, but it also comes with a large energy penalty, just like IGCC.
A price on carbon, through either a cap-and-trade system or carbon tax, would change the economic case for CCS, but there are a lot of strikes against the technology. So why pursue it?
The argument goes like this: In order to achieve steep emissions cuts–say an 80% reduction worldwide by 2050–CCS may be an important tool (or stabilization wedge). The world will continue to use fossil fuels in the near term, and despite the enormous growth of renewable energy, it’s still a drop in the bucket. That’s why many believe that CCS is a crutch the world will need as it weans itself off fossil fuels.