Research Vessel Docked for Lack of Funds

The research vessel Point Lobos, docked at Moss Landing. Photo: Craig Miller

Update: Since this post first appeared, Paul Rogers has reported an update on the status of the Point Lobos in the San Jose Mercury News. His article also adds detail on the finances of MBARI and its primary funder.

The Monterey Bay Aquarium Research Institute (MBARI) has been forced to dock the workhorse of its research fleet, the R/V Point Lobos.

For years the vessel has ventured out three to five times a week to conduct short-term experiments in the deep canyons of Monterey Bay. The ship serves as a platform for the Institute’s remotely operated submarine, the ROV Ventana. The Point Lobos and its robo-sub have played critical roles in recent experiments to study the effects of ocean acidification, among other endeavors.

In the late 1980s, MBARI converted the 110-foot vessel from its original duty as an oilfield service boat in the Gulf of Mexico. Since then it’s completed more than 3,500 missions, with its seven-person crew and various teams of scientists.

The remotely operated sub Ventana, perched on the afterdeck of the Point Lobos. Photo: Craig Miller

Institute spokesman Kim Fulton-Bennett says that as of April 1, the Lobos is “mothballed” for the time being, its future uncertain. “It means we won’t have as frequent access to the ocean as we did,” Fulton-Bennett told me, as we stood on the dock at Moss Landing.

“By going out several times a week, we’ve got a database of observations that goes back 20 years.”

But the Wall Street woes of the past couple of years have taken their toll on the investment portfolios of many foundations, including MBARI’s primary funder, the David and Lucile Packard Foundation, which Fulton-Bennett says has reduced its funding for MBARI.

H-P co-founder David Packard launched MBARI in 1987 with the goal of applying advanced technology to marine research. The Institute relies on the Packard Foundation for 80% of its funding, typically $30-to-40 million per year, according to the MBARI annual report.

The Institute has two other vessels in its research fleet, a converted pilot boat called the Zephyr and the 117-foot Western Flyer, a twin-hulled, ultra-stable vessel that resembles a catamaran-style ferry. The Flyer is deployed on longer-duration missions in open sea, usually with project-specific funding.

The Point Lobos is featured in Lauren Sommer’s radio report/audio slide show for KQED’s Quest series.

Environmental Risks of “Geoengineering”

This week, as scientists meet in Monterey to discuss the potential for large-scale climate intervention strategies, we’re posting short discussions on some of the issues surrounding “geoengineering.”

Aside from the political and economic risks associated with geoengineering, which we explored in Monday’s radio segment on The California Report, critics warn that climate intervention strategies involve some serious potential environmental consequences as well.

In one 2008 study, scientists at the Lawrence Livermore National Lab found that one of the leading geoengineering ideas–blocking solar radiation by pumping sulfur aerosol into the stratosphere–may lead to decreased precipitation across the globe.

Climate scientist Phil Duffy, now of the education organization Climate Central and one of the authors of the 2008 study, says that the decrease in precipitation would follow a slowdown of the overall hydrologic cycle, caused by a decrease in evaporation. Blocking sunlight reduces evaporation, and since what comes down must first go up, less evaporation means less rain and snow. Since this geoengineering scheme is proposed as an emergency brake against effects of climate change like drought, that’s problematic news.
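Duffy’s reasoning can be sketched in a few lines of code: averaged globally and over the long run, precipitation must balance evaporation, so any cut in evaporation shows up as a cut in rain and snow. (The 5% figure below is an arbitrary illustrative assumption, not a number from the study.)

```python
# Toy sketch of the global water balance behind Duffy's argument:
# in the long-run global mean, precipitation ~= evaporation,
# so whatever suppresses evaporation suppresses precipitation too.
# The 5% reduction is an arbitrary illustrative assumption.

def steady_state_precipitation(evaporation):
    """Global-mean precipitation balances global-mean evaporation."""
    return evaporation

baseline_evap = 100.0               # arbitrary units
dimmed_evap = baseline_evap * 0.95  # assume blocked sunlight cuts evaporation 5%

precip_drop = (steady_state_precipitation(baseline_evap)
               - steady_state_precipitation(dimmed_evap))
# less evaporation -> proportionally less rain and snow in the global mean
```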

Stratospheric sulfur injection could also seriously damage the Earth’s ozone layer above the Arctic, another 2008 study found.  And opponents fear that it could lead to acid rain, which could exacerbate the growing problem of ocean acidification.

Ken Caldeira of the Carnegie Institution for Science says that computer modeling from his lab indicates that even if the strategy improved living conditions for 90% of the people on the planet, 10% would likely suffer negative environmental consequences, and, he says, it would be hard to predict where on the planet that 10% would be.

“We’ve come to the conclusion that there are no experiments that will tell you ahead of time what the regional effects will be,” said Caldeira.

Another high-profile strategy involves fertilizing the ocean with iron to encourage algae blooms for carbon sequestration. Algae absorb carbon dioxide as they grow, and the theory is that when they die, they’ll sink to the bottom of the ocean and take the CO2 with them. There is conflicting research about whether this could work as a long-term sequestration strategy, but a recent study suggests that regardless of whether it’s effective at sequestering CO2, fertilizing the oceans with iron could harm marine ecosystems. The research shows that increases in algae from the genus Pseudo-nitzschia can promote concentrations of domoic acid, a poison that can kill birds and marine mammals. Richard Black has more on the new findings at the BBC website.

For more on the potential risks of geoengineering, Alan Robock’s article “20 Reasons Why Geoengineering May Be a Bad Idea” appears in the May/June issue of the Bulletin of the Atomic Scientists.

Hot Topics in San Diego

NASA's "Dynamic Planet" exhibit at the San Diego Convention Center. Photo: Craig Miller

SAN DIEGO – The annual meeting of the American Association for the Advancement of Science (AAAS) draws thousands of scientists from virtually every discipline, from astrophysics to zoology. In climate science circles there was no lack of topics to choose from this year. Among them:

Geoengineering

Several sessions were devoted to the notion of fending off climate change by tinkering with earth systems. In technical sessions and news briefings, there was a range of opinion on display, from “Let’s try it” to “Let’s look at it” to “Don’t even think about it.” There seems to be general agreement that techniques like seeding the atmosphere with particulates could yield rapid results–but the idea is fraught with political controversy and legal pitfalls. Stanford’s Ken Caldeira likened the idea to a cancer patient who accepts the risks of chemotherapy in order to avoid worse consequences. Philosophy professor (and Caldeira’s former teacher) Martin Bunzl firmly rejected that analogy, saying that unlike cancer therapy, the risks are not well known and “You can’t just turn it off.” Bunzl directs the Climate and Social Policy Initiative at Rutgers University.

At Climate Watch, we’re preparing an explanatory radio feature on geoengineering, for broadcast in the coming weeks.

Oceans

The plight of the planet’s oceans was a focus of the conference, with numerous discussions of acidification, marine reserves and the newly implemented concept of “marine spatial planning,” an effort to map the oceans’ topography, biota and habitat, then translate that into a kind of zoning plan for human use (an approach specifically mandated by the Obama administration last year).

In October, researchers will formally conclude the Census of Marine Life, a 10-year collaboration among scientists in 80 countries, to “assess and explain the diversity, distribution and abundance of life in the ocean.” During a media briefing at AAAS, census Co-Chief Scientist Ron O’Dor estimated that the final tally would include 5,000 newly discovered species (“not counting the microbials”), from flying sea cucumbers to the “Rasta sponge,” which, according to O’Dor’s colleague, Shirley Pomponi, appears to sport dreadlocks and also “produces an anti-cancer compound.” O’Dor said one general conclusion from the census would be that while the ocean is large and resilient, “we can’t keep insulting the ocean forever.”

Science & Policy

In keeping with the meeting’s theme of “Bridging Science and Society,” and reflecting the current angst over credibility in science, there were overflow sessions with titles such as “A Wobbly Three-Legged Stool: Science, Politics and the Public.” While people spilled out the door of that room, hard-science lectures in adjacent rooms drew just a smattering of people. In an interview with Climate Watch, Brad Allenby, a professor of engineering and ethics at Arizona State University, lamented that “the climate change discussion has become so polarized, even among scientists, that it’s difficult to present the public with factual information that is credible.”

European Union exhibit at AAAS. Some attendees commented that the exhibit hall seemed sparse this year. Photo: Craig Miller

National Climate Service

NOAA chief Jane Lubchenco used the occasion of the conference to talk up her agency’s new National Climate Service, funded by legislation last year. The new branch will provide one-stop shopping for climate research and tools for policymakers, including those at the state and local level. Lubchenco says she hopes to have the new unit operational by October, when the federal fiscal year turns over.

Polar Bears and Sea Ice: Sorting it Out

A recent post I wrote to highlight a radio discussion of the current plight of polar bears drew a challenge from Russell Steele, one of our regular readers. Steele questioned some of the scientific conclusions underlying dire predictions for the bears.

To help sort some of this out, I asked for responses from two highly regarded scientists in the field. Here’s a response to the specific reader challenge from Mark Serreze, Director of the National Snow & Ice Data Center, in Boulder, CO:

It is unclear what Mr. Steele is trying to get at with reference to the seasonal cycles in sea ice extent from the AMSR-E data. The AMSR-E data, while valuable, only go back to 2002. Through combining SSM/I and SMMR satellite data with other information sources for earlier years, we have a decent record of Arctic sea ice extent going back to the early 1950s. The relevant issue is the long-term decline in end-of-summer (September) ice extent evident in this record, with the extreme September minima of recent years (represented in the short AMSR-E record) serving as exclamation points. The observed rate of September ice loss exceeds expectations from nearly all climate models.

I also turned to Waleed Abdalati. Now director of the Earth Sciences Observation Center at the University of Colorado, Abdalati is a veteran of the Cryospheric Sciences and Terrestrial Hydrology programs at NASA, and one of the most articulate people I’ve heard speak on the subject of polar ice. He offers the following:

I am not an expert on polar bears, but I do think it is safe to say that their primary habitat, the Arctic sea ice, is severely threatened. I, and most of my colleagues, believe we are well on our way to an ice-free Arctic in summer, any time between this decade and the next 40 years.

This is because of two things: 1) it will be decades before the ocean has finished its response to present-day greenhouse forcing, so the impacts of what we’ve done already have not been fully realized; and 2) the loss of sea ice is self-compounding: when it starts to shrink, exposing a darker, more heat-absorbing ocean underneath, the likelihood of its continued shrinking is greater (ice melts, exposes darker ocean, absorbs more heat, melts more ice, exposes darker ocean, and so on).

Of course, the flip side of this is that as ice starts to grow, it is more inclined to grow, but against the backdrop of increased warming, the former is far more likely than the latter. Finally, as thick multi-year ice disappears, it is replaced with thinner, younger ice that is more vulnerable to surface melt from the atmosphere, bottom melting from sea water, and being carried away to lower, warmer latitudes by ocean currents and wind.

So back to the polar bears: if their habitat disappears and they are unable to hunt seals, their main source of food, they seem to stand little or no chance of survival. I am not a wildlife biologist, but it’s hard for me to believe that they, as a population, can sustain themselves on land and with only a seasonally present ice cover. In some cases, the fact that they face more challenges on sea ice than in the past has driven them to forage inland, creating the illusion in some people’s minds that their populations are increasing, because there are more sightings on land. Who knows? Maybe they’ll evolve to hibernate in late summer, when there is no ice, and hunt the rest of the year.

There is an added effect that doesn’t get much attention. There was a fascinating study by a Canadian biologist (Ian Stirling) and a sea ice expert (Claire Parkinson) [Stirling, I., and C.L. Parkinson. 2006. Possible Effects of Climate Warming on Selected Populations of Polar Bears (Ursus maritimus) in the Canadian Arctic. Arctic 59(3): 261-275.], which suggested that the bears are also losing weight, approaching the weights at which they have historically not been able to bear cubs. So not only is the population threatened by starvation; the ability to replenish the population also seems diminished.

I don’t believe we can say anything with absolute certainty, so I myself would not make the statement that the polar bears are doomed–but I will say that the outlook for them, in my view, looks very, very bad.
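The self-compounding loop Abdalati describes (ice melts, exposes darker ocean, absorbs more heat, melts more ice) can be illustrated with a toy feedback model. The albedo and sensitivity numbers below are made-up assumptions chosen only to show the shape of the feedback, not measured values:

```python
# Toy illustration of the ice-albedo feedback:
# less ice -> darker surface -> more absorbed heat -> more melt.
# All numbers are arbitrary illustrative assumptions.

ALBEDO_ICE = 0.6    # fraction of sunlight reflected by ice (assumed)
ALBEDO_OCEAN = 0.1  # fraction reflected by open ocean (assumed)
SENSITIVITY = 0.05  # ice fraction lost per unit of absorbed heat (assumed)

def step(ice_fraction):
    """Advance the toy system one 'year'."""
    albedo = ice_fraction * ALBEDO_ICE + (1 - ice_fraction) * ALBEDO_OCEAN
    absorbed = 1.0 - albedo          # heat absorbed (arbitrary units)
    melt = SENSITIVITY * absorbed    # more absorbed heat -> more melt
    return max(0.0, ice_fraction - melt)

ice = 0.8
history = [ice]
for year in range(30):
    ice = step(ice)
    history.append(ice)

# Each step's loss grows as the ice shrinks -- the feedback at work.
losses = [a - b for a, b in zip(history, history[1:])]
```

In this sketch the yearly loss accelerates as open water expands, which is the qualitative point: the decline feeds on itself rather than proceeding at a steady rate.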

Author: Polar Bears Doomed No Matter What We Do

Photo: US Fish & Wildlife Service

Because our charter at Climate Watch is to examine climate change from the California perspective, you don’t see a lot here about melting ice caps and imperiled polar bears. But Michael Krasny’s interview with Richard Ellis on KQED’s Forum program is well worth an hour of your time.

Ellis is the author of On Thin Ice: The Changing World of the Polar Bear (Random House, 2009), and it’s fair to say that he managed to stun Krasny with a declaration that the species is “doomed,” no matter what we might try to do to save it at this point. Ellis says there is already too much warming in the pipeline (what scientists call “committed” warming) to reverse the disintegration of the bears’ Arctic habitat.

Polar bear populations have been a topic of persistent confusion, recently amplified in an op-ed piece written by former Alaska governor Sarah Palin for The Washington Post.

According to the advocacy group Polar Bears International, there is little room for doubt about the animal’s decline. The organization’s website breaks down the numbers, which point to a “scientifically documented decline in the best-studied population, Western Hudson Bay, and predictions of decline in the second best-studied population, the Southern Beaufort Sea.”

The PBI analysis goes on to explain that:

The Western Hudson Bay population has dropped by 22% since 1987. The Southern Beaufort Sea bears are showing the same signs of stress the Western Hudson Bay bears did before they crashed, including smaller adults and fewer yearling bears.

At the most recent meeting of the IUCN Polar Bear Specialist Group (Copenhagen, 2009), scientists reported that of the 19 sub-populations of polar bears, eight are declining, three are stable, one is increasing, and seven have insufficient data on which to base a decision. (The number of declining populations has increased from five at the group’s 2005 meeting.)

Regardless of whether you share the conclusions of Ellis and PBI about the future of the “poster child for global warming,” the Forum interview is a fascinating hour.

A Sea Change in Ocean Policy Promised

Photo: Reed Galin

A phalanx of high-level federal officials marched into San Francisco today to announce a major shift in the way the federal government oversees the oceans.

The top-level administrators from the White House and several agencies held a public meeting to launch efforts toward a first-ever National Ocean Policy, in which they say restoring a healthy ecosystem will be a top priority.

The newly formed Interagency Ocean Policy Task Force is led by Nancy Sutley, chair of the White House Council on Environmental Quality and one of President Obama’s top advisors on the environment. She arrived surrounded by representatives from the National Oceanic and Atmospheric Administration (NOAA), EPA, Navy, Coast Guard and the Department of the Interior (which, odd as it sounds, is responsible for vast tracts on the outer continental shelf).

Asked why we’re just getting around to a unified national ocean policy, Sutley said that “Too often the federal government sits in its stovepipes,” with each agency taking a narrow view. This effort is an attempt to break through traditional parochialism in favor of a more holistic approach to the challenges.

Task force member Jane Lubchenco, who heads NOAA, said that for the first time, policy makers are saying loudly that “healthy oceans matter.” And right now, she says, they’re not real healthy.

“At a global scale, I would say that oceans are in critical condition,” said Lubchenco. “Most people are unaware of how much disruption and depletion has occurred within the oceans. We’re seeing the symptoms of much of that. It’s time to get on with the solutions.”

The task force will address a growing array of concerns, from shrinking fisheries to higher acid levels in the ocean—many of which are likely related to climate change.

Lubchenco, who is also an Undersecretary of Commerce, told me that “Climate change is exacerbating many of the existing challenges for ocean uses. There’s very good evidence that climate change is already having very significant impacts on oceans.” Lubchenco also cited “the related problem of ocean acidification,” and reeled off a laundry list of climate impacts, including “loss of biological diversity, increasing transport of invasive species, nutrient pollution, habitat loss, and over-fishing.”

Lubchenco added: “That sum total of stresses on ocean ecosystems means that we need to be taking new approaches.” The most sweeping of those “new approaches” will be “ecosystem-based management,” a term used repeatedly in the Interim Report issued by the task force this month.

According to the report:

“The implementation of ecosystem-based management embodies a fundamental shift in how the United States manages these resources, and provides a foundation for how the remaining objectives would be implemented…It would provide the opportunity to ensure proactive and holistic approaches to balance the use and conservation of these valuable resources. This broad-based application of ecosystem-based management would provide a framework for the management of our resources, and allow for such benefits as helping to restore fish populations, control invasive species, support healthy coastal communities and ecosystems, restore sensitive species and habitats, protect human health, and rationally allow for emerging uses of the ocean, including new energy production.”

The task force will also be taking its own stab at some long-term solutions for the troubled Sacramento River Delta. The interim report is open for public comment until October 10.

How a Data-Gathering Ocean Robot was Born

Thayer Walker is a San Francisco-based freelance writer who first reported on development of the Wave Glider for The New York Times. His radio segment for Climate Watch was produced by Nathanael Johnson and is scheduled to air Monday, 8/31 on KQED’s The California Report.

Red Flash in the Sunset

By Thayer Walker

A Silicon Valley engineering firm called Liquid Robotics recently launched a new device called a Wave Glider off the California coast. It’s the latest, and perhaps the most ingenious, design in the growing network of “autonomous ocean samplers,” which is to say sea-going science robots. The glider is a sensor-carrying platform powered entirely by wave energy, and it has the potential to collect an enormous amount of data about the ocean, which in turn will give scientists a better understanding of climate change.

"Red Flash" at sea. Photo: Liquid Robotics

"Red Flash" at sea. Photo: Liquid Robotics

Jim Bellingham, Chief Technologist at the Monterey Bay Aquarium Research Institute, calls the device a “transformational development” in the field of ocean science and technology. But when the inventors came up with the idea, they weren’t trying to revolutionize marine studies; they were trying to eavesdrop on humpback whales.

In 2005, Joe Rizzi, the chairman of Liquid Robotics, wanted to listen to the song of humpback whales off the coast of his home in Puako, Hawaii. He anchored a hydrophone near the shore, but instead of picking up whale song he heard the sound of frying bacon. Snapping shrimp, small crustaceans that use their powerful claws to generate sound blasts that stun their prey, had drowned out the call of the whales.

When Rizzi moved the hydrophones into deeper water he captured clear whale sounds, but kept losing the moored devices to rough seas. “The difficulty,” says Rizzi, “was holding a hydrophone anchored in 600 feet of water during winter storms.” Rizzi realized that to keep a hydrophone stationary, he would need a powered device. “You’re not going to do that with a battery and a motor and a solar panel,” he explains. “The amount of power to hold station in a 60-mile-per-hour wind and 10-foot waves is thousands of watts. We recognized that it’s an energy problem.”

Rizzi presented the problem to Liquid Robotics CEO Roger Hine, who quickly came up with a design, and the Wave Glider was born.  “We weren’t sitting around thinking this thing would have a lot of scientific uses,” says Rizzi, “but as it turns out, there are a lot more uses for this device than just listening to whales.”

The Wave Glider's passive propulsion system harnesses the up-and-down motion of sea swells for locomotion. Diagram: Liquid Robotics

A Climate Reporter’s Candy Store

I’m spending the week in Boulder, CO, attending a series of lectures and discussions at the National Center for Atmospheric Research (NCAR). The center is a hub for climate modeling using some of the world’s most advanced computers–but scientists here are working on a dizzying array of projects, from “wind prospecting” models for siting utility-scale wind farms in Colorado, to tracking the ozone drift from California wildfires, to studying the relationship between weather and meningitis in Sub-Saharan Africa.

With the Flatiron Mountains as a backdrop, architect I. M. Pei used the Mesa Verde cliff dwellings as inspiration for the NCAR headquarters building, in Boulder. Photo: Craig Miller

While NCAR works closely with NOAA (which also has a major research center in town), it is not part of it. NCAR is funded by the National Science Foundation and managed by something called the University Corporation for Atmospheric Research (UCAR), a consortium of about 75 North American universities, as well as major institutions abroad.

About 400 scientists work under the NCAR umbrella, including Kevin Trenberth, a leading authority on the link between El Niño and global climate. Right before hopping a plane for Australia this week, Trenberth, head of NCAR’s Climate Analysis Section, reaffirmed what NOAA and others have been saying: that we may be in for a significant El Niño event this fall and winter.

“There are good signs below the surface of the ocean in the tropical Pacific that this is the real deal,” said Trenberth. He echoed some of the optimism expressed by many Californians that the result could be an overdue dousing after three years of accumulating drought conditions. “The odds are, if it’s a good El Niño,” said Trenberth, “that there is more likelihood of a southerly storm track that’ll bring a lot of weather systems into southern California in particular. It’s not always clear what happens in northern California but the odds are that there’s a much more active southern storm track right across the U.S. and in particular in California.”

The IBM Bluefire 76-teraflop computer, centerpiece of NCAR's supercomputing center. Photo: Craig Miller

The IBM Bluefire 76-teraflop computer, centerpiece of NCAR's supercomputing center. Photo: Craig Miller

NCAR scientists continue to refine their climate models, which have been downloaded by more than 10,000 scientists around the world. UCAR invests $20-to-$30 million every four years in its Computational & Information Systems Lab (CISL) to maintain its state-of-the-art status. CISL chief Rich Loft says it’s probably the most advanced supercomputing center devoted largely to climate analysis.

Even so, NCAR is busy building a bigger, faster one–but not here. The new supercomputer, which may be ready by 2012, will be sited near Cheyenne, Wyoming, mostly to take advantage of the cheap, abundant electric power in that area. Loft and NCAR Director Eric Barron both concede the paradox that the most advanced computer assault on global warming is itself a huge gobbler of electricity, much of which comes from coal-fired power plants. The Wyoming facility will suck down 4.5 megawatts of power. Barron says at least there’s a major wind farm “right next door.”

The center’s carbon footprint is probably also swollen slightly by its own air force. NCAR operates two aircraft packed with advanced instrumentation: a hulking C-130 Hercules and a sleek, high-altitude Gulfstream V. Sadly, no rides were offered this week.

Plan Moves Climate Adaptation to Front Burner

A one-fifth reduction in per capita water use by 2020 is among the goals outlined in a new state report on adapting to climate change.

Released by the California Natural Resources Agency (CNRA) as a “discussion draft,”  the 2009 California Climate Adaptation Strategy is being billed as the nation’s first comprehensive game plan for adaptation to climate change.

Photo: Reed Galin

Most of the state’s high-profile climate initiatives (and battles) have been about mitigation: how to reduce greenhouse gas emissions to slow down warming. This report swings the spotlight over to adaptation: what needs to be done to accommodate the climate change effects that are already “in the pipeline.”

While California’s centerpiece climate law was passed three years ago, this week’s CNRA report concedes that “adaptation is a relatively new concept in California policy.” The 161-page white paper comes in response to an executive order from the Governor last fall, calling for a statewide adaptation strategy.

The draft divides the strategy into seven “sectors”: public health; biodiversity and habitat; ocean and coastal resources; water; agriculture; forestry; and transportation and energy infrastructure.

Tony Brunello, Deputy Secretary for Climate Change and Energy at CNRA, says “This is the first report that really looks at how climate change is going to impact the state and what we need to do about it.”

But Brunello stopped short of conceding that mitigation is a lost cause. “You only have half a deck if you’re only focused on mitigation,” he said. “You need to focus on both mitigation and adaptation to truly be prepared.”

Some strategies attack both. Brunello points to water conservation measures, which save both water and energy (20% of the energy used in the state is deployed moving water around).

The plan is designed to work in concert with the California Air Resources Board’s implementation plan for AB-32, the state’s multifaceted attack on greenhouse gas emissions. CNRA says one of its goals is to “enhance” existing efforts, rather than create new programs and offices that need funding.

CNRA also promises to use the “best available science in identifying climate change risks and adaptation strategies.” Andrew Revkin has a useful overview of the mounting challenges to climate scientists, published this week in the New York Times.

One planned product from the adaptation plan is an interactive website devoted to climate adaptation, with maps and data to assist local planners. CNRA hopes to have that in place by early next year. The draft plan now enters a 45-day period for public comment.

NOAA Confirms El Niño

Warm water patterns in the Pacific during normal (upper) and El Nino (lower) years. The lower image is from 1995-96. Image from NASA

Scientists with the National Oceanic and Atmospheric Administration today confirmed what many had pretty much surmised: El Niño is back.

Officially the warm phase of the El Niño-Southern Oscillation (ENSO), the cyclical pattern of ocean conditions has broad implications for weather and the Pacific food chain.

According to the NOAA news release:

“NOAA expects this El Niño to continue developing during the next several months, with further strengthening possible. The event is expected to last through winter 2009-10.”

NOAA’s Climate Prediction Center suggested about a month ago that conditions were right for the return of El Niño.

More recently, the high incidence of underweight sea lion pups turning up along the California coast was taken by some as a harbinger of ENSO. During El Niño cycles, normal upwelling of deep, cold water slows down, essentially shutting down the “food elevator” for many species.

Of course, there can be an upside. According to NOAA:

“El Niño’s impacts depend on a variety of factors, such as intensity and extent of ocean warming, and the time of year. Contrary to popular belief, not all effects are negative. On the positive side, El Niño can help to suppress Atlantic hurricane activity. In the United States, it typically brings beneficial winter precipitation to the arid Southwest, less wintry weather across the North, and a reduced risk of Florida wildfires.”

Links to climate change are less clear. Some scientists have suggested that warming air and sea temperatures might bring about more and longer El Niño events.