Latest Mississippi River Delta News: March 6, 2015

Carbon credits could generate $1.6 billion for Louisiana coastal restoration, study says
By Mark Schleifstein, The Times-Picayune. March 05, 2015
“Louisiana could earn up to $1.6 billion for coastal restoration projects over the next 50 years by selling credits for storing carbon in wetland plants and soils, according to a new study by New Orleans-based Tierra Resources, Entergy Corp. and the ClimateTrust.” (Read More)

Is BP responsible for the deaths of over 1,000 dolphins along the Gulf coast? (Audio)
By Laine Kaplan-Levenson, WWNO. March 05, 2015
“A report published last month found that an unusually high number of bottlenose dolphins have been dying all along the Gulf Coast since February 2010. This unusual mortality event, or UME, began two months before the 2010 BP oil spill, but groups including the National Oceanic and Atmospheric Administration say the spill is responsible for the continued die-off of this species.” (Read More)

Gulf of Mexico turns deadly for dolphins
By Rachel Nuwer, The New York Times. March 02, 2015
“Some injuries, including lung and adrenal lesions, observed in 2011 in live dolphins examined in Louisiana’s Barataria Bay were consistent with exposure to petroleum products.” (Read More)

lbourg

Mothers Out Front Inspires Plans to Go 100% Renewable in Cambridge, MA

Written by Ronnie Citron-Fink

Cambridge City Councillor Dennis Carlone poses with members of Mothers Out Front following their testimony in support of terminating the city’s contract with TransCanada. Photo: Dennis Carlone blog

Last month, the President’s veto blocking the Keystone XL oil pipeline wasn’t the only blow to TransCanada Corporation. A group of mothers and grandmothers from Mothers Out Front inspired the City of Cambridge, Massachusetts, to send the energy giant a strong message that parents demand clean energy. With the approval of an order to terminate the City’s relationship with TransCanada, Cambridge will look to identify a new supplier to provide up to 100% renewable energy for all municipal operations.

According to this EcoWatch post:

“The policy order came out of a series of discussions Carlone had with Mothers Out Front, a two-year-old Cambridge-based grassroots advocacy group describing themselves as “mothers, grandmothers, and other caregivers who can no longer be silent and still about the very real danger that climate change poses to our children’s and grandchildren’s future.” The group provided testimony at city council on behalf of the resolution.

“Our organization has a strategy for creating a clean energy future but we need your help,” said Beth Adams, the mother of two young boys, in her testimony. “We are working on the ground to get individuals to weatherize their homes, conserve their energy and to switch to clean electricity. My family has made the switch, along with 100 other people in Cambridge including councillor Carlone. I am here tonight to ask for more bold climate action and leadership from the city of Cambridge to help us ensure a livable climate for our children and for future generations.”

This policy order was the product of a series of discussions that Councillor Carlone had with members of Mothers Out Front, a promising new advocacy group that formed in Cambridge and is now expanding to other municipalities and states. Additional information about the order was posted to Councillor Carlone’s blog last month; click here for that report. Also posted above is a video featuring powerful public testimony and council discussion on the matter.

Moms Clean Air Force applauds its partner, Mothers Out Front, for its hard work and dedication to preserving a safe future for children. Thank you, Mothers Out Front, for reminding our legislators and the power industry to never, ever underestimate the power of moms!


Ronnie Citron-Fink

At The Brink: Ocean Tipping Points

By Rod Fujita

Healthy Coral in the Gardens of the Queen, Cuba. Photo: Noel Lopez Fernandez

Coral reefs seem delicate, but when they are healthy they can take a lot of abuse.  I’ve seen corals recover from severe hurricanes and even volcanic eruptions. But coral reefs can also transition suddenly from colorful, vibrant ecosystems to mere shadows of themselves.  Decades of scientific investigation have shed a lot of light on this, and in a recent publication, my colleagues and I summarize a lot of the data that have been collected on Caribbean coral reefs to identify where these dangerous “tipping points” are.  This work is part of the Ocean Tipping Points project, a collaboration between several institutions aimed at finding tipping points in all kinds of marine ecosystems so that managers can implement measures that will keep these ecosystems well away from the brink.

A brief story

In 1987, I lived on a lighthouse straddling a healthy coral reef for several months, trying to understand how it worked.  The reef was alive with dozens of fish species and brightly colored corals.  I could see (and hear) hundreds of parrotfish, surgeonfish, and aggressive little damselfish mowing down algae all over the reef.  The grunts were doing their daily commute between their nutrient-rich feeding grounds in nearby seagrass meadows and their shelters within the reef.  I got to know the resident barracudas, stealthily waiting for a chance to snap up a meal.  From atop the lighthouse, I could watch sharks move in and out of the reef, along with the occasional eagle ray or dolphin.

My reef had lots of biodiversity, and enough fish in each “functional group” (e.g., grazers and predators) to carry out their ecological roles.  Most of it was covered with living coral; seaweed was scarce.  These are the characteristics of a healthy coral reef, one that can produce many different kinds of ecosystem goods and services like sustainable fisheries and dive tourism.

A couple of decades later, it’s obvious that coral reefs are not faring very well.  Climate change has caused mass bleaching around the world, in which corals lose their colorful microscopic partners and starve.  Raw sewage is routinely dumped onto reefs, which are very sensitive to the nutrients in the sewage – in this case, too much of a good thing, as the extra nutrients stimulate seaweed growth and allow the seaweed to overgrow the corals.  Sometimes these changes are gradual, but often they are quite sudden and dramatic, when reefs cross what we call an ocean tipping point.

What are Tipping Points?

A Tipping Point can cause healthy coral to degrade abruptly. Photo Credits: Left, Kendra Karr. Right, blog.soleilorganique.com

Tipping points occur when small shifts in human pressures or environmental conditions bring about large, sometimes abrupt changes in a system. Such tipping points are ubiquitous – they can be found in human society, physical systems, ecosystems, and even in the planet’s climate system.
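
To make the idea concrete, below is a minimal toy model of a tipping point (an illustrative sketch only, not a model from our paper or the OTP project). A state variable x, standing in for reef condition, responds to a slowly changing pressure r, such as nutrient load; because the system is bistable over a range of pressures, a small additional increase in r eventually causes an abrupt jump to the degraded state, and easing the pressure back does not immediately reverse it.

```python
# A minimal toy "tipping point" model (illustrative only, not from the paper).
# The state x (think reef condition) evolves as dx/dt = r + x - x**3, where r
# is a slowly changing pressure such as nutrient load. For a range of r the
# system is bistable, so a small extra push eventually causes an abrupt shift.

def settle(r, x, dt=0.01, steps=20_000):
    """Integrate dx/dt = r + x - x**3 with forward Euler until the state settles."""
    for _ in range(steps):
        x += dt * (r + x - x ** 3)
    return x

x = -1.0  # start on the "healthy" branch
print("increasing pressure:")
for i in range(13):                       # r sweeps from -0.6 up to +0.6
    r = -0.6 + 0.1 * i
    x = settle(r, x)                      # track the settled state as r creeps up
    print(f"  r = {r:+.1f} -> x = {x:+.2f}")   # abrupt jump near r ~ +0.4

print("decreasing pressure:")
for i in range(13):                       # now sweep r back down to -0.6
    r = 0.6 - 0.1 * i
    x = settle(r, x)
    print(f"  r = {r:+.1f} -> x = {x:+.2f}")   # the return jump happens near r ~ -0.4,
                                               # not +0.4: hysteresis makes recovery hard
```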

Because it is often very difficult for ecosystems to recover after a tipping point is crossed, the Ocean Tipping Points (OTP) project was formed to characterize tipping points in ocean ecosystems and provide ocean resource managers with practical guidance to help avoid abrupt changes. EDF is a partner of OTP, as are the National Center for Ecological Analysis and Synthesis (NCEAS), the Bren School for Environmental Science and Management, the Center for Ocean Solutions, and the National Oceanic and Atmospheric Administration (NOAA).

Healthy ocean ecosystems have checks and balances—grazers control seaweed growth, predators control grazer populations, etc. – and redundancy and complementarity (different species do similar things but in slightly different ways).  These attributes help them stay away from tipping points.

However, if key pieces of an ecosystem are removed through harvesting or damaged by, for example, pollution, the ecosystem becomes less resilient, and even relatively small impacts can push the system past the tipping point, resulting in ecological collapse and the loss of valuable ecosystem goods and services.

Tipping points have been well documented in coral reefs, but they occur in many other ocean ecosystems too.  For example, luxuriant kelp forests that support marine mammals and a myriad of other species provide us with various ecosystem services like seafood, alginate (a gelling agent extracted from kelp), recreation, and shelter for the coastline from waves. However, these habitats can become barren very rapidly when they reach a tipping point. This happened in the 1800s, when fur hunting decimated the sea otter population. With fewer sea otters to consume urchins, urchins became overabundant and overgrazed the kelp.

Fortunately, science is providing insights into the factors that make ocean ecosystems more capable of resisting these kinds of changes, and more able to bounce back when they are damaged.

New Science, New Hope

Our new paper synthesizes hundreds of data points collected over many years from over 20 countries in the Caribbean Sea.  We found that, however counterintuitive it may seem, coral cover is not the best indicator of coral reef health. Once coral cover has visibly declined, the data suggest that many other changes may have already occurred that are likely to make recovery very difficult.  Fish population status seems to be an earlier and more useful indicator of coral reef status.

We were also able to quantify tipping points for coral reefs in the Caribbean, based on these data. Coral reef managers can simply compare fish density in their coral reef to densities associated with these tipping points to assess the risk of ecosystem collapse.  They can also use the tipping points to guide management aimed at keeping the system in a “safe operating space,” where enough fish are left in the system to maintain the reef in a healthy state while providing “pretty good yields” – not the maximum amount of yield possible, but enough to sustain a fishery while hedging against the risk of collapse.  While tipping points should be determined for each reef, the generalized tipping points we found in our study can provide useful guidance for risk assessment and precautionary management.
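
A rough sketch of how such a risk screen might look is below; the threshold densities and survey values are hypothetical placeholders, not the tipping points reported in our study, and a real assessment would use reef-specific values.

```python
# Sketch of a tipping-point risk screen. The thresholds below are HYPOTHETICAL
# placeholders, not the densities reported in the paper; real values are
# reef-specific.
COLLAPSE_THRESHOLD = 300.0   # fish biomass (kg/ha) below which collapse risk is high (made up)
SAFE_THRESHOLD = 600.0       # biomass above which the reef is in a "safe operating space" (made up)

def reef_risk(biomass_kg_per_ha: float) -> str:
    """Compare surveyed fish biomass against tipping-point thresholds."""
    if biomass_kg_per_ha < COLLAPSE_THRESHOLD:
        return "high risk: at or past the tipping point"
    if biomass_kg_per_ha < SAFE_THRESHOLD:
        return "caution: approaching the tipping point"
    return "safe operating space"

for survey in (180.0, 450.0, 750.0):      # example survey values, also hypothetical
    print(f"{survey:6.1f} kg/ha -> {reef_risk(survey)}")
```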

It is easy to despair when thinking about coral reefs.  They are sensitive to many different kinds of threats, and they are subject to all of them: we are losing them very quickly.  But the data show that many of the coral reefs in the Caribbean with marine reserves have large fish populations, healthy corals, and many other indicators of good health.  The Cuban reefs that we have been fortunate enough to visit are absolutely spectacular, and remind me of my early days diving on Caribbean reefs that were covered with living coral and swarming with fish and urchins doing their jobs keeping the reefs healthy.  These healthy reefs are the hope of the Caribbean – they will provide the seeds of recovery, if we manage to reduce threat levels and keep reefs within their safe operating spaces, away from tipping points.

Rod Fujita

A Little-Known Federal Rule Brings Invisible Pollution Into Focus

By Peter Zalzal

Legal fellow Jess Portmess also contributed to this post.

Unlike an oil spill, most greenhouse gas emissions are invisible to the naked eye. Though we can’t see them, this pollution represents a daily threat to our environment and communities, and it is important to understand the extent of this pollution and where it comes from.

This is why in 2010 the Environmental Protection Agency (EPA) finalized a rule requiring facilities in the oil and gas industry to report yearly emissions from their operations.

The Rule is part of a larger greenhouse gas measurement, reporting, and disclosure program called for by Congress and signed into law by President George W. Bush. By coincidence, the rule is known as Subpart W.

The emissions data required by the Rule helps communities near oil and natural gas development better understand pollution sources, and gives companies better ways to identify opportunities to reduce emissions.

As these policies have gotten stronger under the Obama administration, industry has continued to fight them in federal court.

Shedding light on methane

One of the greenhouse gases covered by Subpart W is methane, which a substantial body of research shows is leaking at a significant rate from oil and natural gas infrastructure. Methane is a powerful air pollutant, over 80 times more potent than carbon dioxide in the first 20 years after it is emitted. And, as the primary component of natural gas, methane emissions represent a waste of a valuable energy resource.
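
As a back-of-the-envelope illustration of that potency figure, the sketch below converts a hypothetical annual methane leak into carbon dioxide-equivalent terms using a 20-year global warming potential of about 84, consistent with the “over 80 times” figure above; the leak volume is invented for illustration.

```python
# Back-of-the-envelope CO2-equivalent arithmetic for a HYPOTHETICAL methane leak,
# using a 20-year global warming potential of ~84 (the post says "over 80 times").
GWP20_METHANE = 84           # 20-year global warming potential of methane
leak_tonnes_ch4 = 1_000      # hypothetical annual leak from one facility, in tonnes

co2e_tonnes = leak_tonnes_ch4 * GWP20_METHANE
print(f"{leak_tonnes_ch4:,} t CH4/yr is roughly {co2e_tonnes:,} t CO2e/yr on a 20-year basis")
```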

In January, the Obama administration announced that several federal agencies will take actions aimed at reducing harmful methane emissions. As part of those actions, this summer, EPA will propose standards including direct regulation of methane from the oil and gas sector—an action that is urgently needed to begin reducing this pollution.

Strengthening reporting

The reporting requirements have led to a better understanding of methane emissions from the oil and gas sector, though there are important opportunities to continue to strengthen the program to provide communities and stakeholders with additional transparency, and deepen understanding of emissions from these sources. In fact, the administration has recognized the importance of improving data and transparency as part of Subpart W, both in its 2014 Strategy to Reduce Methane Emissions and in its recently announced goal to reduce methane emissions.

That’s why EPA has recently moved forward with two actions to strengthen greenhouse gas emissions reporting:

First, in December 2014, EPA proposed to strengthen requirements for emissions reporting by requiring reporting from sources previously not covered by the rule.

Deepening understanding of emissions from these sources (which include completions at oil wells, emissions from gathering and boosting systems, and transmission pipeline blowdowns between compressor stations) is especially critical given the increasing growth of emissions in so-called “tight-oil” formations like the Bakken and Eagle Ford shale and the already significant (and growing) national network of gathering and boosting infrastructure.

In fact, a recent study led by Colorado State University and funded in part by EDF found that gathering and boosting facilities have significant methane leaks, which were especially high when compared to other sources that, unlike gathering and boosting, were required to undertake comprehensive leak detection and repair.

EDF, along with colleagues in other organizations, submitted comments last week on the proposed rule urging the adoption of these improvements and recommending additional ways to strengthen reporting from these sources.

Last fall, EPA also finalized a rule that will increase the quality and uniformity of the reported oil and natural gas emissions data.

Under additional changes finalized in November 2014, companies (with limited exceptions) can no longer use non-standardized and unreliable measurement methods (known as best available monitoring methods or BAMM) when recording and reporting emissions from their operations. The change is an important step forward, which will allow for more rigorous, comparable emissions data and improve the transparency of the reporting program.

Industry pushes back on the recently-finalized rule

The American Petroleum Institute (API) has filed a legal challenge to EPA’s rule removing BAMM in the U.S. Court of Appeals in Washington, D.C. At the same time that API claims the recently announced clean air measures are not needed to reduce oil and gas sector methane emissions, it is suing in court to prevent a transparent understanding of those emissions.

This isn’t the first time API has taken legal action to block public transparency in the oil and natural gas sector—this is the third in a series of such lawsuits API has filed seeking to impede the meaningful assessment and disclosure of emissions data.

Last week EDF, along with our colleagues in other organizations, filed a motion to intervene in the lawsuit to defend EPA’s strengthened standards and support the public’s fundamental right-to-know about harmful methane emissions from the oil and natural gas sector.

Fostering the adoption of monitoring technologies

To build on the emissions information and transparency created by Subpart W reporting, EPA should also take additional actions to deploy proven technologies that can directly and transparently measure and quantify leaks, as provided for in the administration’s January announcement. ICF International has identified equipment leaks as the largest source of emissions, both system-wide and because of “super emitters” that account for substantial additional emissions. Directly monitoring these leaks is critical to further improve understanding of emissions from these sources, promote accountability, and enhance transparency.

Rigorous, transparent data is the foundation for protecting public health and the environment from harmful emissions. EPA has taken two critically important steps to strengthen emissions reporting from the oil and gas sector, and we urge the agency to build on these two recent improvements, and continue to ensure the public has full, timely, and reliable information about the scope and sources of oil and gas emissions.

Photo credit: Earthworks

Peter Zalzal

Clean Energy Industry is Not Yet Mature – and that’s a Good Thing

By Peter Sopher

Last year, global investment in clean, renewable sources of energy grew by a better-than-expected 16 percent to $310 billion, according to Bloomberg New Energy Finance (BNEF). Industry watchers applauded the strong showing, but the numbers imply more than just robust growth. A careful analysis leads us to two additional illuminating conclusions about the industry’s current level of development and its future.

 

  1. The clean energy industry is in a development phase

In 2013, China’s gross domestic product (GDP) grew 8.5 percent, with investment comprising 47 percent of GDP. By contrast, GDP in the United States expanded 1.9 percent, with investment comprising 16.8 percent. As a developing country, China’s growth rate is significantly higher, and a telling characteristic for developing countries is that investment makes up a relatively large percentage of GDP.

This pattern doesn’t just hold true for countries; we also see a similar dynamic when looking at industries. According to BNEF, the oil & gas (O&G) industry spent $913 billion on capital expenditures, or capex, last year, while the market capitalization, or market cap, for the top ten companies in the NYSE Arca Oil & Gas Index stood at $1.63 trillion. By contrast, the market cap for the top ten companies in the Wilder Hill New Energy Global Index was much smaller at $164 billion. The Wilder Hill New Energy Global Index comprises 107 companies from around the world that cover a broad spectrum of clean energy technologies.

While O&G capex dwarfed last year’s clean energy investment of $310 billion by about three times, the market cap for the top ten largest O&G companies was about ten times larger than for the top ten clean energy companies. Seen in a different light, clean energy investment was about twice as large as the market cap of the ten largest clean energy companies. By contrast, O&G capex was half as large as the market cap of the ten largest O&G companies.
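
The ratios described in the last two paragraphs follow directly from the cited figures. A quick sketch of the arithmetic (all figures in billions of dollars, as reported above):

```python
# Reproducing the comparisons above from the cited figures (billions of USD).
og_capex           = 913     # oil & gas capital expenditures last year (BNEF)
clean_investment   = 310     # global clean energy investment last year (BNEF)
og_top10_mktcap    = 1630    # market cap, top ten of the NYSE Arca Oil & Gas Index
clean_top10_mktcap = 164     # market cap, top ten of the Wilder Hill New Energy Global Index

print(f"O&G capex vs. clean energy investment: {og_capex / clean_investment:.1f}x (about 3x)")
print(f"O&G top-10 cap vs. clean top-10 cap:   {og_top10_mktcap / clean_top10_mktcap:.1f}x (about 10x)")
print(f"clean investment vs. clean top-10 cap: {clean_investment / clean_top10_mktcap:.1f}x (about 2x)")
print(f"O&G capex vs. O&G top-10 cap:          {og_capex / og_top10_mktcap:.2f}x (about half)")
```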

The clean energy industry’s high investment relative to market capitalization – as compared to the oil and gas industry – indicates the clean energy industry is in a development phase and, thus, more likely than a developed industry to undergo dynamic growth given favorable conditions.

  2. Clean energy investors are optimistic about future technological improvement

A measure for investors’ optimism for future technological improvement is an industry’s level of venture capital (VC) investment. Total investment by VC firms in the U.S. hit $65 billion in 2012-2013, according to Ernst & Young.

During this same span, investment by VC and private equity in clean energy came to $6.7 billion, according to BNEF, more than fifteen times the level of VC investment in the oil and gas industry, which was only $383 million in the same time frame. VC investment in solar and wind energy alone stood at $1.9 billion, five times O&G VC investment.

VC investment points to a rosy future for technology development in the clean energy industry as compared with the O&G industry. From 2012-13:

  • VC investment in clean energy in the U.S. accounted for 10 percent of total U.S. VC funding, while VC investment in solar and wind energy alone came to 2.9 percent of the total. Contrast this with a paltry 0.6 percent for the entire O&G industry.
  • The unadjusted value for VC investment in clean energy in the U.S. was 4 percent of the current market cap of the top ten companies in the Wilder Hill New Energy Global Index. By comparison, the figure for the O&G industry was 0.02 percent of the current market cap of the top ten companies in the NYSE Arca O&G Index.

High levels of venture capital investment in the clean energy industry – relative to the fossil fuel industry – indicate clean energy investors’ optimism for future innovation in clean energy technologies.
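
The percentages in the bullets above can be reproduced from the cited dollar figures; a short sketch of that arithmetic (figures in millions of dollars):

```python
# Reproducing the percentages above from the cited 2012-13 figures (millions of USD).
total_us_vc     = 65_000     # total U.S. VC investment (Ernst & Young)
clean_vc_pe     = 6_700      # VC/private equity investment in clean energy (BNEF)
solar_wind_vc   = 1_900      # VC investment in solar and wind alone
oil_gas_vc      = 383        # VC investment in oil & gas

clean_top10_cap = 164_000    # market cap, top ten of the Wilder Hill New Energy Global Index
og_top10_cap    = 1_630_000  # market cap, top ten of the NYSE Arca Oil & Gas Index

print(f"clean energy share of U.S. VC:  {100 * clean_vc_pe / total_us_vc:.1f}%")      # ~10%
print(f"solar + wind share of U.S. VC:  {100 * solar_wind_vc / total_us_vc:.1f}%")    # ~2.9%
print(f"oil & gas share of U.S. VC:     {100 * oil_gas_vc / total_us_vc:.1f}%")       # ~0.6%
print(f"clean VC vs. clean top-10 cap:  {100 * clean_vc_pe / clean_top10_cap:.1f}%")  # ~4%
print(f"O&G VC vs. O&G top-10 cap:      {100 * oil_gas_vc / og_top10_cap:.2f}%")      # ~0.02%
```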

The fossil fuel industry, on the other hand, is seeing much lower VC investment and as a result, is likely to see fewer new innovative technologies in the near future. Although it's much larger, the fossil fuel industry is no longer experiencing the dynamic growth we're currently seeing in clean energy.

Solar and wind energy, as well as other clean energy technologies, are far from exhausting their potential. The clean energy industry is clearly still in its development phase and has a long way to go before it's a mature industry. And that's a good thing.

Peter Sopher

Understanding effects of chemical dispersants on marine wildlife is critical to whale population

By Matthew Phillips, NWF

During and after the 2010 BP oil spill, clean-up crews relied heavily on chemical dispersants to break up oil slicks in the Gulf of Mexico. In total, crews used more than 2 million gallons of dispersants, namely Corexit 9500 and 9527, applying them directly to the head of the leaking well and over the surface waters of the Gulf. Dispersants break down oil into small droplets that easily mix with water and, in theory, biodegrade quickly. The intention is to reduce the amount of oil in an area, dispersing it throughout the water column. While debate continues over the efficacy of dispersants in cleaning up spills, their use continues to rise, despite little data on their suspected toxicity. For this reason, scientists have begun looking into the effects of these powerful chemicals on marine wildlife.

Chemical dispersants being sprayed into the Gulf following the BP oil spill.

In a recent study out of the University of Southern Maine, “Chemical dispersants used in the Gulf of Mexico oil crisis are cytotoxic and genotoxic to sperm whale skin cells,” researchers tested the effects of the chemical dispersants Corexit 9500 and Corexit 9527 on sperm whale skin cells. A small population of around 1,600 sperm whales resides in the Gulf. Since these whales inhabit part of the area inundated with oil after the BP spill, there is a high chance some of them came into contact with oil, and with the dispersants. With so few whales, the population is highly susceptible to disturbance: any chaotic or harmful event threatens its overall vitality. In addition, the most recent IUCN Red List of Threatened Species classifies sperm whales as Vulnerable, meaning they are at risk of extinction. Understanding the effects of chemical dispersants on sperm whales is therefore critical for ensuring the population’s health and stability.

To begin the process, researchers grew skin cells from samples obtained from Gulf whales before the spill. They applied varying concentrations of the two dispersants to the cultured cells, and measured the effects for one day. Since chemicals can be harmful in different ways, researchers studied the dispersants’ toxicity to the cells (called cytotoxicity) and to chromosomes (called genotoxicity). They found both dispersants to be poisonous to the cells, but only Corexit 9527 to be toxic to the cells’ genetic material.
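
For context on how cytotoxicity results from this kind of exposure experiment are commonly summarized, the sketch below interpolates an EC50 (the concentration at which cell viability falls to half) from a dose-response curve. The concentrations and viability values are invented for illustration and are not results from the study.

```python
# HYPOTHETICAL dose-response data for one dispersant; not results from the study.
doses     = [0.0, 1.0, 5.0, 10.0, 50.0, 100.0]    # dispersant concentration (ppm)
viability = [1.00, 0.95, 0.80, 0.60, 0.30, 0.10]  # fraction of cells surviving after 24 hours

def ec50(doses, viability, target=0.5):
    """Linearly interpolate the dose at which viability falls to `target`."""
    pairs = list(zip(doses, viability))
    for (d0, v0), (d1, v1) in zip(pairs, pairs[1:]):
        if v0 >= target >= v1:                    # viability crosses the target here
            return d0 + (v0 - target) * (d1 - d0) / (v0 - v1)
    return None  # no crossing within the tested range

print(f"approximate EC50: {ec50(doses, viability):.1f} ppm (hypothetical data)")
```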

Cytotoxicity and genotoxicity have different implications. A chemical that is cytotoxic (poisonous only at the cellular level) may cause fatal or non-fatal problems for individual organisms, such as skin lesions or respiratory complications. A chemical that is genotoxic (one that disrupts the functioning of genes) can leave an imprint on the next generation, causing problems in mating, reproduction, or calf development. A sperm whale exposed to Corexit 9527 may be unable to reproduce; if she can, she may have malformed or non-reproductive offspring.

Genotoxicity can have lingering detrimental effects on the population as a whole, endangering its future. While it would be valuable to know which dispersant is more toxic, researchers caution it is difficult to compare them because the effects are very different. Corexit 9500 was slightly more cytotoxic, but Corexit 9527 was significantly more genotoxic. Ultimately, the choice of which to use may depend on which outcome is more, or perhaps less, desirable.

There is no way to determine how many whales were exposed to dispersants, nor the degree of exposure. But a 2014 study citing widespread health problems among Gulf marine mammals, including complications in fetus development, gives cause for concern. The population’s size, the toxicity of chemical dispersants, and reports of toxin-related health problems make clear that Gulf sperm whales are at risk. Researchers will continue monitoring the health of the population, and only time will illuminate the full effects. Until we know more, we’re left wondering: if dispersants harm wildlife, how useful are they?

lbourg

To begin the process, researchers grew skin cells from samples obtained from Gulf whales before the spill. They applied varying concentrations of the two dispersants to the cultured cells, and measured the effects for one day. Since chemicals can be harmful in different ways, researchers studied the dispersants’ toxicity to the cells (called cytotoxicity) and to chromosomes (called genotoxicity). They found both dispersants to be poisonous to the cells, but only Corexit 9527 to be toxic to the cells’ genetic material.

Cytotoxicity and genotoxicity have different implications. A chemical that is cytotoxic –poisonous only at the cellular level—may cause fatal or non-fatal issues for individual organisms, such as skin lesions or respiratory complications. A chemical that is genotoxic, that disrupts the functions of genes, can leave an imprint on the next generation. It may cause problems in mating, reproduction, or calf development. A sperm whale exposed to Corexit 9527 may be unable to reproduce. If she can reproduce, she may have mal-formed or non-reproductive offspring.

Genotoxicity can have lingering detrimental effects on the population as a whole, endangering its future. While it would be valuable to know which dispersant is more toxic, researchers caution it is difficult to compare them because the effects are very different. Corexit 9500 was slightly more cytotoxic, but Corexit 9527 was significantly more genotoxic. Ultimately, the choice of which to use may depend on which outcome is more, or perhaps less, desirable.

There is no way to determine how many whales were exposed to dispersants, nor the degree of exposure. But a 2014 study citing widespread health problems among Gulf marine mammals, including complications in fetus development, gives cause for concern. The population’s size, the toxicity of chemical dispersants, and reports of toxin-related health problems make clear that Gulf sperm whales are at risk. Researchers will continue monitoring the health of the population, and only time will illuminate the full effects. Until we know more, we’re left wondering: if dispersants harm wildlife, how useful are they?

lbourg

Latest Mississippi River Delta News: March 5, 2015

9 years 2 months ago

The Deepwater Horizon Catastrophe 5 Years On
By Brett Garling, National Geographic. March 05, 2015
“But the world witnessed Deepwater Horizon. Millions of gallons of oil flooded the Gulf of Mexico every day — for 87 days. The biggest accidental oil spill ever. Five years later the effects of the Deepwater Horizon blowout still endure.” (Read More)

$1 billion cost estimate prompts Louisiana to rethink coastal project
By John Snell, WVUE-FOX8. March 04, 2015
“Without widespread use of diversions in the state's coastal tool box, scientists have warned Louisiana would end up with a much smaller footprint of wetlands, maintained at a significantly higher cost.” (Read More)

Despite land loss, Native American community clings to life along the Mississippi River
By John Snell, WVUE-FOX8. March 04, 2015
“Much of the land that once protected this place is gone, dissected by oil and gas canals, chewed away by salt water, and cut off from the Mississippi River.” (Read More)
 

lbourg

EPA Relaunches Safer Choice Product Labeling Program

9 years 2 months ago

By EDF Staff

By Jennifer McPartland, Ph.D., Health Scientist

Today, EPA’s Design for the Environment (DfE) Safer Choice program (formerly the safer product labeling program) unveiled its newly redesigned family of three product labels. The voluntary Safer Choice program seeks to recognize, and bring consumer awareness to, products whose chemical ingredients are among the safest within their functional class (e.g., solvents).

Today’s milestone is the result of a public process led by the EPA DfE program to solicit feedback on a new label that better communicates the goals and purpose of the program. More than a year, 1,700 comments, and six consumer focus groups later, the new labels will soon arrive on a store shelf near you.

The purpose of the EPA DfE program is to drive inherently safer chemicals and products to the marketplace. DfE accomplishes this work through two major activities: the DfE alternatives assessment partnerships and the Safer Choice program. This post and the new labels pertain to the latter.

To reiterate, the Safer Choice program (and the alternatives assessment program, for that matter) is not a regulatory program, but an entirely voluntary opportunity for chemical companies and product manufacturers to gain recognition for their leadership in chemical safety. For a product to be recognized under Safer Choice, each ingredient in the product must pass criteria identifying the chemicals that present the least hazard within that ingredient’s functional class (e.g., surfactant, colorant, solvent).

In addition, a product must meet other requirements for ingredient disclosure, packaging, and performance.  Details on the chemical and product criteria are available on the Safer Choice Standard and Criteria website.

Chemicals that have been found to meet the Safer Choice chemical criteria are listed by functional class on Safer Choice’s safer chemical ingredient list. Products that have been awarded the label can be found on the Safer Choice product website. To date, over 2,500 products have received the Safer Choice label and approximately 650 chemicals are listed. Most of the Safer Choice-recognized products to date are household and industrial cleaners, but the program intends to expand into the personal care product space.

As mentioned earlier, the Safer Choice program is rolling out a family of three new labels:

  • the primary Safer Choice label;
  • the Safer Choice label for institutional and industrial products; and
  • the Safer Choice fragrance-free label.

Identifying fragrance-free products is of particular importance for individuals who have fragrance allergies or sensitization concerns. Indeed, many product manufacturers have recently taken steps to disclose more information about the fragrances in the products they sell.

So why has EPA decided to refresh its label?  According to EPA, the label redesign is intended to accomplish the following goals:

  • Better convey the scientific rigor of EPA's product evaluation and the benefits to people and the environment with a label that is easier to display on products, materials, and in digital media;
  • Increase buyers' recognition of products bearing EPA's Safer Product Label; and
  • Encourage innovation and development of safer chemicals and chemical-based products.

Both the chemical industry and the EPA Inspector General have criticized how clearly the prior label communicated its scope and meaning to consumers. It’s fair to say that putting “Design for the Environment” on a label doesn’t really convey that the program is focused specifically on reducing chemical hazard. The new label and its tagline, “Safer Choice, Meets U.S. EPA Safer Product Standards,” better communicate the chemical focus of the program.

Many shoppers seem unfamiliar with the DfE program, especially compared with programs like EPA Energy Star; the hope is that the new label, along with recent commitments to Safer Choice from major retailers like Walmart and Wegman’s, will increase consumer awareness of the program. Walmart’s sustainable chemistry policy, which EDF helped develop, includes a commitment to strive to formulate and label its Walmart brand products under the Safer Choice program. Wegman’s has already made significant strides in getting several of its Wegman’s-brand products recognized by Safer Choice.

We certainly hope that the new labels will inspire more businesses to pursue recognition by the program, whether through chemical and product innovation or through the kinds of retailer leadership we’ve seen from Walmart and Wegman’s. The Safer Choice program provides an important and valuable opportunity to drive inherently safer chemicals to the market and to reward, with the credibility and backing of the federal government, those businesses that devote R&D to doing so.

EDF Staff