After rising for nearly two decades, carbon dioxide emissions from United States energy use began to fall sharply and unexpectedly in 2007.
For years, experts have attributed this decrease to the drop in energy demand during the economic recession that began late that year, and to the surge in cheap natural gas that displaced coal in our energy mix during this period. But they overlooked another key change that drove the drop in emissions nearly as much: the rapid rise in renewable energy production.
By 2013, our country’s annual carbon dioxide emissions had decreased by 11 percent – a decline not witnessed since the 1979 oil crisis. Our research shows that the growth of renewable energy sources accounted for 31 percent of that 640-million-metric-ton drop.
The impact from renewables is just below the 34-percent contribution that the switch from petroleum and coal to natural gas made to the emissions decline – a fact that until now has gone largely unrecognized.
Between 2007 and 2013, wind-generated electricity grew almost five-fold to 168 terawatt-hours (TWh), enough to power 15 million average American homes. Utility-scale solar grew to 8.7 TWh. At the same time, bioenergy production, which includes biofuels used in the transportation sector, grew 39 percent to the equivalent of 1,400 TWh.
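For readers who want to see how these percentages translate into tonnage, here is a minimal back-of-envelope sketch in Python. It assumes only the figures quoted above (the 640-million-metric-ton total drop, the 31 and 34 percent shares, and wind's five-fold growth to 168 TWh); it is an illustration of the arithmetic, not part of the underlying study.

```python
# Back-of-envelope check of the figures quoted in the article.
# Assumptions: 640 Mt total CO2 drop (2007-2013), 31% renewables share,
# 34% gas-switch share, wind at 168 TWh after five-fold growth.

total_drop_mt = 640  # million metric tons of CO2, 2007-2013

renewables_share = 0.31
gas_switch_share = 0.34

renewables_mt = total_drop_mt * renewables_share  # roughly 198 Mt avoided
gas_switch_mt = total_drop_mt * gas_switch_share  # roughly 218 Mt avoided

print(f"Renewables: ~{renewables_mt:.0f} Mt CO2 of the decline")
print(f"Coal/oil-to-gas switch: ~{gas_switch_mt:.0f} Mt CO2 of the decline")

# A five-fold rise to 168 TWh implies wind started near 34 TWh in 2007.
wind_2013_twh = 168
wind_2007_twh = wind_2013_twh / 5
print(f"Implied 2007 wind generation: ~{wind_2007_twh:.0f} TWh")
```

The two shares work out to roughly 198 and 218 million metric tons respectively, which is why the article describes the renewables contribution as "just below" that of the gas switch.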
These positive trends have since continued, in particular for solar and wind energy.
Renewables are the bigger story here
The switch from coal to natural gas certainly contributed to the drop in emissions, but there is an important caveat. The climate benefits of this trend are undermined by methane leakage along the natural gas supply chain, the extent of which is likely underestimated in national greenhouse gas emission inventories.
It’s become clear that incentives to support the expansion of renewable capacity successfully helped reduce carbon dioxide emissions between 2007 and 2013 – and that decreasing costs for renewable energy offer some hope for continued progress under this administration.