A Brief History of Carbon Offsets

Carbon offsets have a relatively short history; the concept first gained widespread attention in the late 1990s and early 2000s. The idea behind carbon offsets is to create a market-based approach to addressing climate change by allowing individuals and organizations to invest in projects that reduce emissions or remove carbon dioxide from the atmosphere.

The first carbon offset project dates to 1989, when the U.S. power company AES Corporation funded an agroforestry project in Guatemala to compensate for the emissions of a new coal-fired power plant. Early efforts like this were voluntary, one-off arrangements rather than part of any formal program.

However, it wasn’t until the early 2000s that carbon offsets gained widespread attention as a tool for addressing climate change. The Kyoto Protocol, adopted in 1997, created formal offsetting mechanisms, most notably the Clean Development Mechanism, that allowed countries to meet part of their emissions reduction targets by funding emission-reduction projects elsewhere.

Since then, carbon offset programs have grown in popularity, with a wide range of organizations and individuals purchasing offsets to compensate for emissions they cannot yet eliminate. Today, there are many types of carbon offset programs, including those focused on renewable energy, reforestation, and energy efficiency.

While carbon offsets have been criticized because some projects fail to deliver emission reductions that are additional, permanent, and verifiable, they remain an important tool for individuals and organizations looking to take action on climate change. As the world continues to work toward a more sustainable future, carbon offsets are likely to keep playing a role in cutting emissions and supporting sustainable development.

Take the NetZero pledge

To determine the amount of carbon offsets required for you to go NetZero, you can use average U.S. carbon emission levels, or you can estimate your carbon footprint more precisely with our calculator.
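For a rough, back-of-the-envelope version of the "average U.S. emissions" approach, the short Python sketch below multiplies household size by an assumed per-person average of about 16 metric tons of CO2-equivalent per year, a commonly cited approximate U.S. figure that does not come from this article; the function name and default value are illustrative assumptions, and the calculator mentioned above will give a more precise estimate.

# Minimal sketch of the "average U.S. emissions" estimate.
# ASSUMPTION: ~16 tonnes CO2e per person per year is an approximate,
# commonly cited U.S. average, not a figure taken from this article.
AVG_US_TONNES_CO2E_PER_PERSON = 16.0

def offsets_needed(people: int, tonnes_per_person: float = AVG_US_TONNES_CO2E_PER_PERSON) -> float:
    """Return the tonnes of CO2e a household would need to offset per year."""
    return people * tonnes_per_person

if __name__ == "__main__":
    # Example: a household of three, using the assumed U.S. average.
    print(f"Offsets needed: {offsets_needed(3):.1f} tonnes CO2e per year")

The result is simply the number of tonnes of offsets to purchase each year to reach net zero under those assumptions; replacing the average with a calculator-based footprint gives a tighter number.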