FLOOD INSURANCE: More Information Needed on Subsidized Properties

Report

What GAO Found

The Biggert-Waters Flood Insurance Reform Act of 2012 (Biggert-Waters Act) immediately eliminated subsidies for about 438,000 National Flood Insurance Program (NFIP) policies, but subsidies on an estimated 715,000 policies across the nation remain. Depending on factors such as policyholder behavior, the number of subsidized policies will continue to decline over time. For example, as properties are sold and the Federal Emergency Management Agency (FEMA) resolves data limitations and defines key terms, more subsidies will be eliminated. GAO analysis found that remaining subsidized policies would cover properties in every state and territory where NFIP operates, with the highest numbers in Florida, Louisiana, and California. In comparing remaining subsidized and nonsubsidized policies, GAO found varying characteristics. For example, counties with the highest and lowest home values had a larger percentage of subsidized versus nonsubsidized policies.

Data constraints limit FEMA’s ability to estimate the aggregate cost of subsidies and establish rates reflecting actual flood risks on previously subsidized policies. FEMA does not have sufficient historical program data on the percentage of full-risk rates that subsidized policyholders have paid, and therefore cannot estimate the financial impact of subsidies on NFIP (the difference between subsidized and full-risk premium rates). Also, because not all policyholders are required to provide documentation about their flood risk, FEMA generally lacks the information needed to apply full-risk rates (as required by the Biggert-Waters Act) to previously subsidized policies. FEMA is encouraging these policyholders to voluntarily submit this documentation. Federal internal control standards state that agencies should identify and analyze risks associated with achieving program objectives and develop a plan for obtaining needed data. Without this documentation, the new rates may not accurately reflect a property’s full flood risk, and policyholders may be charged rates that are too high or too low relative to their risk of flooding.

Options from GAO’s previous and current work for reducing the financial impact of subsidies on NFIP include (1) adjusting the pace of subsidy elimination, (2) targeting assistance or subsidies based on financial need, or (3) increasing mitigation efforts, such as relocation or elevation, that reduce a property’s flood risk. However, these options have advantages and disadvantages. Moreover, the options are not mutually exclusive, and combining them could help offset some disadvantages.

Why GAO Did This Study

FEMA, which administers NFIP, estimated that in 2012 more than 1 million of its residential flood insurance policies–about 20 percent–were sold at subsidized rates; nearly all were located in high-risk flood areas. Because of their relatively high losses and lower premium rates, subsidized policies have been a financial burden on the program. Due to NFIP’s financial instability and operating and management challenges, GAO placed the program on its high-risk list in 2006. The Biggert-Waters Act eliminated subsidized rates on certain properties and mandated GAO to study the remaining subsidized properties. This report examines (1) the number, location, and characteristics of properties that continue to receive subsidized rates compared with full-risk rate properties; (2) the information needed to estimate the historic cost of subsidies and establish rates for previously subsidized policies that reflect the risk of flooding; and (3) options to reduce the financial impact of remaining subsidized policies. GAO analyzed NFIP data on types of policies, premiums, and claims and publicly available home value and household income data. GAO also interviewed representatives from FEMA, insurance industry associations, and floodplain managers.

What GAO Recommends

FEMA should develop and implement a plan to obtain flood risk information needed to determine full-risk rates for properties with previously subsidized rates. FEMA agreed with the recommendation.

For more information, contact Alicia Puente Cackley at (202) 512-8678 or cackleya@gao.gov.

Original Link

FLOOD INSURANCE: Implications of Changing Coverage Limits and Expanding Coverage

Report

What GAO Found

The National Flood Insurance Program (NFIP) currently has more than 5.5 million policyholders insured for about $1.3 trillion who pay about $3.5 billion in annual premiums, but less than half purchase maximum coverage–a possible indicator of how many might purchase additional coverage were it offered. However, from 2002 through 2012, the proportion of residential and commercial policies at maximum building coverage rose substantially–from 11 to 42 percent and from 21 to 36 percent, respectively. States along the Gulf and East Coasts have the most residential policyholders with maximum coverage. In addition, states with higher median home values generally have a higher percentage of policyholders purchasing coverage up to the limit. Industry stakeholders said that an unknown number of policyholders with higher-value properties choose to purchase additional, or excess, coverage above the NFIP limit through the private flood insurance market–a small and selective group of insurers.

Increasing coverage limits could increase the net revenue of the program and have varying effects on NFIP, the private insurance market, and consumers. Assuming that higher coverage limits had been in effect from 2002 through 2011, GAO’s analysis suggests that NFIP still would have suffered losses during years with catastrophic floods, such as 2004 and 2005, but would have experienced net increases in revenue in other years. Such increases could have offset future losses or helped avoid additional debt, but the overall financial impact and risk to the program would depend on the adequacy of the rates charged, which GAO has questioned in the past, and the number of policyholders opting for additional coverage. Regarding the private flood insurance market and consumers, higher NFIP coverage limits could decrease participating insurers’ overall risk exposure and provide more options to consumers, but might lessen participation of private insurers, as consumers might need to purchase less private insurance.
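
GAO’s counterfactual is essentially a year-by-year comparison of the additional premium that higher limits would have brought in against the additional claims they would have paid out. The sketch below illustrates that kind of calculation only in outline; the take-up share, premium rate, policy counts, and claims figures are hypothetical placeholders, not values from the report or from NFIP data.

# Illustrative sketch of a coverage-limit counterfactual: for each year, estimate the
# extra premium collected if higher limits had been offered and compare it with the
# extra claims those higher limits would have paid. All inputs are made-up placeholders.

ASSUMED_TAKE_UP = 0.25        # assumed share of policyholders buying the extra coverage
ASSUMED_RATE_PER_1000 = 5.0   # assumed premium per $1,000 of additional coverage

def extra_premium(policy_count, added_coverage):
    """Hypothetical additional annual premium from policies buying added coverage."""
    return policy_count * ASSUMED_TAKE_UP * (added_coverage / 1000.0) * ASSUMED_RATE_PER_1000

def net_revenue_by_year(yearly_data, added_coverage=100_000):
    """yearly_data maps year -> (policy_count, extra_claims_paid); returns year -> net revenue."""
    return {year: extra_premium(policies, added_coverage) - extra_claims
            for year, (policies, extra_claims) in yearly_data.items()}

if __name__ == "__main__":
    # Placeholder inputs: a catastrophic year with large extra claims vs. a quiet year.
    sample = {2004: (4_500_000, 3.0e9), 2006: (4_800_000, 0.2e9)}
    for year, net in sorted(net_revenue_by_year(sample).items()):
        print(year, f"net change in revenue: ${net / 1e9:+.2f} billion")

Under such a model, a year with large flood losses still shows a net loss while quieter years show a net gain, which is the pattern GAO describes; the actual result would hinge on the adequacy of the rates charged and the number of policyholders opting for additional coverage.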

Adding optional coverage to NFIP for business interruption and additional living expenses could result in less uninsured risk in the market, but could further weaken the financial stability of the program. Industry stakeholders told GAO that business interruption coverage is generally purchased only by larger companies, because its high cost puts it out of reach for small- and medium-sized companies. In addition, adding business interruption coverage to NFIP could be particularly challenging. For example, properly pricing the risk, underwriting policies, and processing claims can be complex, and NFIP officials have stated that they would have to hire additional in-house expertise to offer this coverage. Similarly, offering optional coverage for additional living expenses would have many of the same potential effects on NFIP, the private market, and consumers, although this coverage is generally less complex to administer.

Why GAO Did This Study

NFIP was created in 1968 and is the only federal flood insurance available. It may be the sole source of insurance to some residents of flood-prone areas. Mainly due to catastrophic losses in 2005, the program became indebted to the U.S. Treasury and has been unable to repay this debt. Because of NFIP’s financial instability and management challenges, GAO placed the program on its High-Risk List in 2006. The Biggert-Waters Flood Insurance Reform Act of 2012 introduced many changes to the program and mandates GAO to study the effects of increasing the maximum coverage limits ($250,000 for residential buildings and $500,000 for commercial buildings) and providing optional coverage for business interruption and additional living expenses. This report discusses (1) existing flood insurance coverage, (2) the potential effects of changing NFIP coverage limits, and (3) the potential effects of allowing NFIP to offer optional coverage for business interruption and additional living expenses. To address these objectives, GAO analyzed data from NFIP’s databases of policies and claims, reviewed prior reports, and interviewed brokers, insurers, and representatives from consumer advocacy and industry organizations.

What GAO Recommends

GAO continues to support previous recommendations to the Federal Emergency Management Agency (FEMA) that address the need to ensure that the methods and data used to set NFIP rates accurately reflect the risk of losses from flooding. FEMA agreed and has taken some steps to begin to implement them.

For more information, contact Alicia Puente Cackley, 202-512-8678, or cackleya@gao.gov.

Original link

Disaster Refugees – 2012

32.4 million people uprooted in both rich and poor countries
A new report from NRC’s Internal Displacement Monitoring Centre (IDMC) reveals that 32.4 million people were forced to flee their homes in 2012 by disasters such as floods, storms, and earthquakes.
Download:
» Full Report
 » Press release

While Asia and West and Central Africa bore the brunt, 1.3 million people were displaced in rich countries, with the USA particularly affected.

98% of all displacement in 2012 was related to climate- and weather-related events, with flood disasters in India and Nigeria accounting for 41% of global displacement in 2012. In India, monsoon floods displaced 6.9 million, and in Nigeria 6.1 million people were newly displaced. While over the past five years 81% of global displacement has occurred in Asia, in 2012 Africa had a record high for the region of 8.2 million people newly displaced, over four times more than in any of the previous four years.

Overwhelming Risk – Rethinking Flood Insurance in a World of Rising Seas

Union of Concerned Scientists

Our coasts face growing risks from sea level rise. Today’s flood insurance system encourages development that increases these risks — and taxpayers nationwide pay the price.

Sea level is rising and increasing the risk of destructive flooding events during powerful coastal storms. At the same time, increasing coastal development and a growing population are putting more people and more property in harm’s way.

This risky pattern of development is being reinforced by the taxpayer-subsidized National Flood Insurance Program, which sets artificially low insurance rates that do not reflect the true risks to coastal properties. When major disasters strike, taxpayers nationwide are left liable for billions of dollars in insurance claims and disaster relief.

We urgently need to reform our insurance system to more effectively manage and reduce these coastal risks — risks that are projected only to grow in a warming world.

Overwhelming Risk: Rethinking Flood Insurance in a World of Rising Seas – Executive Summary

Overwhelming Risk: Rethinking Flood Insurance in a World of Rising Seas – Full Report

Also see: Encroaching Tides

EPA Proposed Rules on Power Plant Carbon Dioxide

Power plant CO2 Proposed Regs

 

Original site: http://www2.epa.gov/carbon-pollution-standards

 

 

NOAA views on potential environmental risks of Louisiana Coastal Restoration Plan

NOAA views on potential environmental risks of Louisiana Coastal Restoration Plan

“NMFS supports efforts to ameliorate coastal wetland loss in Louisiana to maintain socio economic, storm protection, and ecological services these habitats provide. Most coastal restoration efforts can benefit nursery and foraging functions supportive of a wide variety of economically important marine fishery species. However, the proposed diversion may have adverse impacts to economically important estuarine/marine fisheries and their habitats. NMFS is concerned the MBSD could (1) displace marine fishery species from currently productive habitats to less supportive habitats, (2) reduce marine fishery productivity, (3) convert essential fish habitat (EFH) to areas no longer supportive of some federally managed marine fishery species or their prey items, (4) render wetlands impacted by diversions more susceptible to erosion from storms, (5) degrade water quality, and (6) cause socio-economic hardship to those involved in the commercial and recreational fishing industries. To allow for informed decision making, these issues should be thoroughly evaluated by methods acceptable to NMFS and the results incorporated into the planned EIS.”

Also see the comments on diversions in the NOAA RESTORE Act Science Program – Science Plan

Analysis of Datums and Elevations in USACE Projects: FINAL REPORT

Analysis of Datums and Elevations in USACE Projects: FINAL REPORT

The USACE assembled this team as a direct result of the catastrophic losses endured after Hurricane Katrina struck the Gulf Coast in 2005. The emphasis on vertical accuracy came about as a result of an extensive investigation into the causes of levee failures in and around New Orleans. According to a report written by USACE engineers entitled “Lessons Learned,” published on the Point of Beginning website (http://www.pobonline.com/Articles/Article_Rotation/BNP_GUID_9-5 2006_A_10000000000000360086), “…elevation values used in construction were based on geodetic datums, not the local mean sea level as was the intent. Elevation values were often from older epochs of the existing NGVD 29 geodetic datum [National Geodetic Vertical Datum of 1929] instead of the most current published values.” Design engineers assumed that the NGVD 29 datum was equivalent to mean sea level and used NGVD 29 values as such, resulting in 1- to 3-foot differences between the intended design and constructed elevations. Moreover, the team learned that “Construction projects were tied to/based on only one benchmark and often the datum epoch or date established was not included in construction documents.” This is particularly troublesome given that “subsidence across the region has caused published benchmark elevations to change by more than 2 feet in the past 50 years.”

In light of these findings, the USACE set out to assess the reference datum accuracy requirements that are currently in place, and to establish whether the USACE is able to perform reliable datum uncertainty analyses to ascertain the risks of project failure. To that end, the USACE contracted with the Conrad Blucher Institute for Surveying and Science at Texas A&M University-Corpus Christi to objectively answer these critical questions. Under the leadership of Dr. Gary Jeffress, Executive Director of the Blucher Institute, the Task 11 team, as it is referred to by the USACE, was assembled to collect, analyze, and ultimately present its findings regarding current datum accuracy requirements, and to determine to what extent the USACE is able to perform reliable uncertainty analyses to ascertain the risks of project failure.

Over the course of a year, the Task 11 Team produced a series of reports. The first report, Phase 1, provided an assessment of the reference datum accuracy requirements that are currently in place in the USACE, and what challenges and obstacles district staff encounter as they carry out needed projects for their Districts. This phase included a thorough literature review of current USACE policy manuals, circulars, and other documents, and an analysis based on numerous interviews with USACE staff from districts around the country. The Phase 1 report concluded that engineers and surveyors from the USACE generally understand the issues related to datum uncertainty and that they welcome reducing this uncertainty as well as establishing a uniform approach to a datum uncertainty analysis.

The report revealed that each District has its own agenda and timeline to achieve implementation, which is largely dependent upon the resources and the availability of personnel with the education and experience to undertake the work. Over the past decade, Districts have reduced personnel and funding for elevation analysis, and some Districts face significant shortages of the funding, personnel, and resources needed to implement datum uncertainty analyses. Next, the Task 11 team produced a report entitled “Uncertainty Model for Orthometric, Tidal, and Hydraulic Datums for Use in Risk Assessment Models.” This Phase 2 report offered a technical discussion of risk assessment, specifically regarding relevant orthometric and water level datums and datum conversion for use in protection-grade design, as well as a suggested approach to integrating vertical uncertainty into future USACE project risk assessments.

Findings in Phase 2 included an analysis of existing risk assessment guidelines within the USACE, as well as a statistical discussion of perceived risk versus actual risk that goes on to compare and quantify accuracy versus uncertainty. Each datum used by the USACE is analyzed for uncertainty and the accompanying risks, including terrestrial and water level datums, as well as datum conversions such as converting legacy NGVD 29 measurements to NAVD 88 elevations.
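
The core of such an analysis is propagating the uncertainty in a datum conversion (and in benchmark movement such as subsidence) into the elevation that matters for design. The sketch below is a generic Monte Carlo illustration of that idea under assumed, hypothetical offsets and standard deviations; it is not the Task 11 team’s model, and none of the numbers come from the USACE reports.

# Hedged illustration of datum/elevation uncertainty propagation: given a design crest
# elevation referenced to a legacy datum, estimate the probability that the as-built
# elevation relative to local mean sea level falls short of the intended elevation.
# All numeric values are hypothetical placeholders.
import random

def shortfall_probability(design_elev_ft, datum_offset_ft, datum_sigma_ft,
                          subsidence_ft, subsidence_sigma_ft,
                          required_elev_ft, trials=100_000):
    """Monte Carlo estimate of P(actual elevation above local MSL < required elevation)."""
    shortfalls = 0
    for _ in range(trials):
        # Uncertain conversion from the legacy datum to local mean sea level.
        offset = random.gauss(datum_offset_ft, datum_sigma_ft)
        # Uncertain benchmark movement (e.g., regional subsidence since publication).
        sinkage = random.gauss(subsidence_ft, subsidence_sigma_ft)
        if design_elev_ft - offset - sinkage < required_elev_ft:
            shortfalls += 1
    return shortfalls / trials

if __name__ == "__main__":
    # Placeholder scenario: 15 ft design crest, ~1.5 ft mean datum bias, ~1 ft subsidence.
    p = shortfall_probability(design_elev_ft=15.0,
                              datum_offset_ft=1.5, datum_sigma_ft=0.5,
                              subsidence_ft=1.0, subsidence_sigma_ft=0.5,
                              required_elev_ft=13.5)
    print(f"Estimated probability the crest sits below the required elevation: {p:.1%}")

Even this toy example shows why documenting the datum epoch matters: a biased conversion combined with undocumented subsidence can make a design that looks adequate on paper very likely to fall short in the field.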

The findings in Phase 2 also revealed that a very limited analysis of risks associated with converting legacy datums to modern datums has been conducted by the USACE, and that these initial studies reveal the complexities involved in the process, as well as a lack of historical data coverage of significant portions of the United States. Both Phase 1 and Phase 2 technical reports can be read in their entirety at: http://www.agc.army.mil/ndsp/index.html.

This report, the last in the series, is designed to clearly summarize the findings of this yearlong study, offer recommendations, and provide an overview of the findings suitable for non-technical readers.

Surface Temperature Reconstructions for the Holocene – The hockey stick debate

Surface Temperature Reconstructions for the Last 2,000 Years Committee on Surface Temperature Reconstructions for the Last 2,000 Years, National Research Council, ISBN: 0-309-66144-7, 160 pages, 7 x 10, (2006)

Because widespread, reliable instrumental records are available only for the last 150 years or so, scientists estimate climatic conditions in the more distant past by analyzing proxy evidence from sources such as tree rings, corals, ocean and lake sediments, cave deposits, ice cores, boreholes, glaciers, and documentary evidence. For example, records of Alpine glacier length, some of which are derived from paintings and other documentary sources, have been used to reconstruct the time series of surface temperature variations in south-central Europe for the last several centuries. Studying past climates can help us put the 20th century warming into a broader context, better understand the climate system, and improve projections of future climate.

Zhengyu Liu, Jiang Zhu, Yair Rosenthal, Xu Zhang, Bette L. Otto-Bliesner, Axel Timmermann, Robin S. Smith, Gerrit Lohmann, Weipeng Zheng, and Oliver Elison Timm The Holocene temperature conundrum PNAS 2014 ; published ahead of print August 11, 2014, doi:10.1073/pnas.1407229111

http://www.pnas.org/content/early/2014/08/07/1407229111.full.pdf+html – original link

A recent temperature reconstruction of global annual temperature shows Early Holocene warmth followed by a cooling trend through the Middle to Late Holocene [Marcott SA, et al., 2013, Science 339(6124):1198-1201]. This global cooling is puzzling because it is opposite from the expected and simulated global warming trend due to the retreating ice sheets and rising atmospheric greenhouse gases. Our critical reexamination of this contradiction between the reconstructed cooling and the simulated warming points to potentially significant biases in both the seasonality of the proxy reconstruction and the climate sensitivity of current climate models.

[This builds on the piece from Science last year that reconstructed the temperature record for the last 11.5K years of the Holocene. The physical record – tree rings, pollen, etc. – shows overall slight cooling, which would be consistent with the standstill in sea level. But the climate models show warming when they hindcast this period. If the physical record is correct, then the current warming is much more anomalous than previously thought, and the models might be underestimating future warming. The article does not rule out that there might be problems with the physical record, but points out that until the question is resolved, there is a significant question about the current models’ predictions of sensitivity to CO2.]

 

 

 

Starting in the late 1990s, scientists began combining proxy evidence from many different locations in an effort to estimate surface temperature changes averaged over broad geographic regions during the last few hundred to few thousand years. These large-scale surface temperature reconstructions have enabled researchers to estimate past temperature variations over the Northern Hemisphere or even the entire globe, often with time resolution as fine as decades or even individual years. This research, and especially the first of these reconstructions published in 1998 and 1999 by Michael Mann, Raymond Bradley, and Malcolm Hughes, attracted considerable attention because the authors concluded that the Northern Hemisphere was warmer during the late 20th century than at any other time during the past millennium. Controversy arose because many people interpreted this result as definitive evidence of anthropogenic causes of recent climate change, while others criticized the methodologies and data that were used.

In response to a request from Congress, this committee was assembled by the National Research Council to describe and assess the state of scientific efforts to reconstruct surface temperature records for the Earth over approximately the last 2,000 years and the implications of these efforts for our understanding of global climate change.

Warming of the Oceans and Implications for the (Re)insurance Industry

• There is new, robust evidence that the global oceans have warmed significantly. Given that energy from the ocean is the key driver of extreme events, ocean warming has effectively caused a shift towards a “new normal” for a number of insurance-relevant hazards. This shift is quasi-irreversible—even if greenhouse gas (GHG) emissions completely stop tomorrow, oceanic temperatures will continue to rise.
• In the non-stationary environment caused by ocean warming, traditional approaches, which are solely based on analysing historical data, increasingly fail to estimate today’s hazard probabilities. A paradigm shift from historic to predictive risk assessment methods is necessary (illustrated in the sketch below).
• Due to the limits of predictability and scientific understanding of extreme events in a non-stationary environment, today’s likelihood of extreme events is ambiguous. As a consequence, scenario-based approaches and tail risk modelling become an essential part of enterprise risk management.
• In some high-risk areas, ocean warming and climate change threaten the insurability of catastrophe risk more generally. To avoid market failure, the coupling of risk transfer and risk mitigation becomes essential.

Warming of the Oceans and Implications for the (Re)insurance Industry
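
One way to read the second bullet above is as a contrast between fitting a fixed distribution to the historical loss record and letting that distribution drift over time (or with a covariate such as ocean heat content). The sketch below illustrates the difference with a simple exceedance-frequency model; the synthetic counts, the linear trend, and the Poisson assumption are illustrative assumptions, not methods or figures from the Geneva Association report.

# Contrast of a stationary vs. a trend-adjusted (non-stationary) estimate of an annual
# exceedance probability from a historical event-count record. The synthetic counts and
# the linear trend model are illustrative assumptions only.
import math

# Hypothetical yearly counts of events exceeding a damage threshold (oldest to newest).
counts = [1, 0, 2, 1, 1, 2, 2, 3, 2, 3, 3, 4]
n = len(counts)

# Stationary view: a single long-run average rate is assumed to describe "today".
stationary_rate = sum(counts) / n

# Non-stationary view: fit a simple least-squares trend in the rate and evaluate it
# at the most recent year.
xs = list(range(n))
x_mean, y_mean = sum(xs) / n, stationary_rate
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, counts))
         / sum((x - x_mean) ** 2 for x in xs))
current_rate = max(y_mean + slope * ((n - 1) - x_mean), 0.0)

# Probability of at least one exceedance next year under a Poisson assumption.
p_stationary = 1 - math.exp(-stationary_rate)
p_current = 1 - math.exp(-current_rate)
print(f"Stationary estimate:     rate={stationary_rate:.2f}, P(>=1 event)={p_stationary:.2f}")
print(f"Trend-adjusted estimate: rate={current_rate:.2f}, P(>=1 event)={p_current:.2f}")

With a record that is trending upward, the stationary estimate understates today’s hazard; in a real application the trend (or its link to an ocean-warming covariate) would itself carry uncertainty, which is why the report emphasizes scenario-based approaches and tail risk modelling.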

Hurricane Sandy Rebuilding Task Force Releases Rebuilding Strategy

FOR IMMEDIATE RELEASE:
HUD No. 13-125
Monday, August 19, 2013
CONTACT:
Aaron Jacobs
Hurricane Sandy Rebuilding Task Force
Aaron.F.Jacobs@hud.gov

HURRICANE SANDY REBUILDING TASK FORCE RELEASES REBUILDING STRATEGY
Strategy will ensure families, small businesses and communities are stronger, more economically competitive and better able to withstand future storms, and serve as a model for communities across the country

WASHINGTON– President Obama’s Hurricane Sandy Rebuilding Task Force, chaired by Housing and Urban Development (HUD) Secretary Shaun Donovan, today released a rebuilding strategy to serve as a model for communities across the nation facing greater risks from extreme weather and to continue helping the Sandy-affected region rebuild.  The Rebuilding Strategy contains 69 policy recommendations, many of which have already been adopted, that will help homeowners stay in and repair their homes, strengthen small businesses and revitalize local economies and ensure entire communities are better able to withstand and recover from future storms.  Read more.

“I want to thank Secretary Donovan, the Hurricane Sandy Rebuilding Task Force, and the thousands of federal response and recovery personnel who have helped the region recover. We have cut red tape, piloted cutting edge programs and strengthened our partnership with state and local officials,” President Obama said. “While a great amount of work remains, we will stand with the region for as long as it takes to recover.”

“President Obama was clear that his Administration is committed to staying in the Sandy-impacted region until the work was done, and today marks a crucial step in that journey,” Secretary Donovan said. “This Rebuilding Strategy will protect families, small businesses and communities across the region, and the taxpayers’ investment in them, from the risks posed by sea level rise and more extreme weather events – risks that are made worse by the reality of a changing climate.”

Among the recommendations that will have the greatest impact on Federal funding is a process to prioritize all large-scale infrastructure projects and map the connections and interdependencies between them, as well as guidelines to ensure all of those projects are built to withstand the impacts of climate change. The Strategy also explores how to harden energy infrastructure to minimize power outages and fuel shortages – and ensure continuation of cellular service – in the event of future storms.

The goal of these and other recommendations in the Strategy is to:

• Align federal funding with local rebuilding visions.
• Cut red tape and get assistance to families, businesses, and communities efficiently and effectively, with maximum accountability.
• Coordinate the efforts of the Federal, State, and local governments, with a region-wide approach to rebuilding.
• Ensure the region is rebuilt in a way that makes it more resilient – that is, better able to withstand future storms and other risks posed by a changing climate.

In addition to the recommendations that are directly linked to Sandy Supplemental funding, the Rebuilding Strategy also includes additional policy recommendations that will have a significant impact on how the region rebuilds. Finally, in recognition of the increased risk the region and the nation face from extreme weather events, the Rebuilding Strategy includes recommendations that, if implemented, will improve our ability to withstand and recover effectively from future flood-related disasters across the country.

Several of the policies and principles developed by the Hurricane Sandy Rebuilding Task Force were also incorporated into President Obama’s Climate Action Plan, which laid out a series of responsible and common sense steps to prepare communities for the impacts of a changing climate, including the need for the Federal government to make investments based on the most up to date information about future risks.

As laid out in the Rebuilding Strategy, the Task Force has also taken steps to ensure the implementation of these recommendations, each of which will be carried out by a Federal Department or Agency or an existing interagency working group. Implementation will be tracked by a team which will also build on the Task Force Program Management Office’s work to track and release data on Federal spending from the Sandy supplemental funding bill.

Read the full Hurricane Sandy Rebuilding Strategy.

 

State of the Climate 2012 – NOAA

Blunden, J., and D. S. Arndt, Eds., 2013: State of the Climate in 2012. Bull. Amer. Meteor. Soc., 94 (8), S1–S238.

Briefing Slides

NOAA: 2012 was one of the 10 warmest years on record globally

The end of weak La Niña, unprecedented Arctic warmth influenced 2012 climate conditions

Worldwide, 2012 was among the 10 warmest years on record according to the 2012 State of the Climate report released online today by the American Meteorological Society (AMS). The peer-reviewed report, with scientists from NOAA’s National Climatic Data Center in Asheville, NC serving as lead editors, was compiled by 384 scientists from 52 countries. It provides a detailed update on global climate indicators, notable weather events, and other data collected by environmental monitoring stations and instruments on land, sea, ice, and sky.

“Many of the events that made 2012 such an interesting year are part of the long-term trends we see in a changing and varying climate—carbon levels are climbing, sea levels are rising, Arctic sea ice is melting, and our planet as a whole is becoming a warmer place,” said acting NOAA Administrator Kathryn D. Sullivan, Ph.D. “This annual report is well-researched, well-respected, and well-used; it is a superb example of the timely, actionable climate information that people need from NOAA to help prepare for extremes in our ever-changing environment.”

Conditions in the Arctic were a major story of 2012, with the region experiencing unprecedented change and breaking several records. Sea ice shrank to its smallest “summer minimum” extent since satellite records began 34 years ago. In addition, more than 97 percent of the Greenland ice sheet showed some form of melt during the summer, four times greater than the 1981–2010 average melt extent.

The report used dozens of climate indicators to track and identify changes and overall trends to the global climate system. These indicators include greenhouse gas concentrations, temperature of the lower and upper atmosphere, cloud cover, sea surface temperature, sea-level rise, ocean salinity, sea ice extent and snow cover. Each indicator includes thousands of measurements from multiple independent datasets.

  • Warm temperature trends continue near Earth’s surface: Four major independent datasets show 2012 was among the 10 warmest years on record, ranking either 8th or 9th, depending upon the dataset used. The United States and Argentina had their warmest year on record.
  • La Niña dissipates into neutral conditions:  A weak La Niña dissipated during spring 2012 and, for the first time in several years, neither El Niño nor La Niña, which can dominate regional weather and climate conditions around the globe, prevailed for the majority of the year.
  • The Arctic continues to warm; sea ice extent reaches record low: The Arctic continued to warm at about twice the rate compared with lower latitudes. Minimum Arctic sea ice extent in September and Northern Hemisphere snow cover extent in June each reached new record lows. Arctic sea ice minimum extent (1.32 million square miles, September 16) was the lowest of the satellite era. This is 18 percent lower than the previous record low extent of 1.61 million square miles that occurred in 2007 and 54 percent lower than the record high minimum ice extent of 2.90 million square miles that occurred in 1980. The temperature of permafrost, or permanently frozen land, reached record-high values in northernmost Alaska. A new melt extent record occurred July 11–12 on the Greenland ice sheet when 97 percent of the ice sheet showed some form of melt, four times greater than the average melt this time of year.
  • Antarctica sea ice extent reaches record high: The Antarctic maximum sea ice extent reached a record high of 7.51 million square miles on September 26. This is 0.5 percent higher than the previous record high extent of 7.47 million square miles that occurred in 2006 and seven percent higher than the record low maximum sea ice extent of 6.96 million square miles that occurred in 1986.
  • Sea surface temperatures increase: Four independent datasets indicate that the globally averaged sea surface temperature for 2012 was among the 11 warmest on record.  After a 30-year period from 1970 to 1999 of rising global sea surface temperatures, the period 2000–2012 exhibited little trend. Part of this difference is linked to the prevalence of La Niña-like conditions during the 21st century, which typically lead to lower global sea surface temperatures.
  • Ocean heat content remains near record levels: Heat content in the upper 2,300 feet, or a little less than one-half mile, of the ocean remained near record high levels in 2012. Overall increases from 2011 to 2012 occurred between depths of 2,300 to 6,600 feet and even in the deep ocean.
  • Sea level reaches record high: Following sharp decreases in global sea level in the first half of 2011 that were linked to the effects of La Niña, sea levels rebounded to reach record highs in 2012. Globally, sea level has been increasing at an average rate of 3.2 ± 0.4 mm per year over the past two decades.
  • Ocean salinity trends continue: Continuing a trend that began in 2004, oceans were saltier than average in areas of high evaporation, including the central tropical North Pacific, and fresher than average in areas of high precipitation, including the north central Indian Ocean, suggesting that precipitation is increasing in already rainy areas and evaporation is intensifying in drier locations.
  • Tropical cyclones near average: Global tropical cyclone activity during 2012 was near average, with a total of 84 storms, compared with the 1981–2010 average of 89. Similar to 2010 and 2011, the North Atlantic was the only hurricane basin that experienced above-normal activity.
  • Greenhouse gases climb: Major greenhouse gas concentrations, including carbon dioxide, methane, and nitrous oxide, continued to rise during 2012. Following a slight decline in manmade emissions associated with the global economic downturn, global CO2 emissions from fossil fuel combustion and cement production reached a record high in 2011 of 9.5 ± 0.5 petagrams (1,000,000,000,000,000 grams) of carbon, and a new record of 9.7 ± 0.5 petagrams of carbon is estimated for 2012. Atmospheric CO2 concentrations increased by 2.1 ppm in 2012, reaching a global average of 392.6 ppm for the year. In spring 2012, for the first time, the atmospheric CO2 concentration exceeded 400 ppm at several Arctic observational sites.
  • Cool temperature trends continue in Earth’s lower stratosphere: The average lower stratospheric temperature, about six to ten miles above the Earth’s surface, for 2012 was record to near-record cold, depending on the dataset. Increasing greenhouse gases and decline of stratospheric ozone tend to cool the stratosphere while warming the planet near-surface layers.

 

State of the Climate in 2012: Highlights

2012: Extreme Events
2012: Earth’s Surface Temperature
2012: Temperature of the Lower Stratosphere
2012: Humidity
2012: Snow in the Northern Hemisphere
2012: Global Sea Level
2012: Glaciers
2012: Sea Surface Temperature
2012: Ocean Heat Content
2012: Arctic Sea Ice