🌍 Experimenting with Climate Action

The Problem

We believe that data for growth companies has two fundamental elements: analytics and experimentation. Analytics is there to help you understand your users, your marketing, and your product. And experimentation is there to help you understand how changes to your product and marketing affect business outcomes, usually through controlled tests.

Typically, when we talk to clients about experimentation, they have a north-star metric in mind that they would like to improve, and more often than not it is something related to either revenue or user engagement.

But for Climaider the challenge was different. Their mission is to help individuals reduce their carbon emissions, either through carbon offsetting or through challenges that users undertake to change their lifestyle in an emission-reducing way, for example by buying second-hand clothing or pledging not to fly.

They wanted our help to construct a sequence of experiments that would identify how changes to their product and their growth activities impacted the total CO2 reduction of their user base.

The Solution

Experimentation typically follows a four-step process:

  1. Ideation: what could be improved, what could we change?
  2. Prioritization: which initiatives should we do first?
  3. Execution: how do we try this in a controlled way? (see the sketch after this list)
  4. Analysis: what was the impact?
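
To make the execution step concrete: a controlled test typically means deterministically splitting users into a control group and a variant group, so each user always sees the same experience, and logging which group they were in for later analysis. Below is a minimal sketch in TypeScript, with hypothetical experiment and event names rather than Climaider's actual code:

```typescript
import { createHash } from "crypto";

type Variant = "control" | "variant";

// Hash the user ID together with the experiment name so the split is
// stable per user and independent across experiments.
function assignVariant(userId: string, experiment: string): Variant {
  const digest = createHash("sha256").update(`${experiment}:${userId}`).digest();
  // The first byte of the hash gives a stable ~50/50 split
  return digest[0] < 128 ? "control" : "variant";
}

// Record which variant the user saw, so funnels can later be compared per group.
// In a real app this would go to an analytics SDK rather than the console.
function trackExposure(userId: string, experiment: string, variant: Variant): void {
  console.log({ event: "experiment_exposure", userId, experiment, variant });
}

const variant = assignVariant("user-123", "explainer_video_before_plan");
trackExposure("user-123", "explainer_video_before_plan", variant);
```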

Below are a few highlights from running that process with the Climaider team.

We used two primary methods for ideation: delving into analytics data to find insights and opportunities, and applying concepts we'd seen succeed in other verticals. With these approaches, we came up with some pretty interesting experiment ideas, including:

  1. auto-targeted acquisition channels: we noticed that traffic from the alternative search engines DuckDuckGo (privacy-focused) and Ecosia (climate-focused) had a great conversion rate, so we experimented with running ads on those platforms
  2. video persuasion: we noticed a steep drop-off just before users selected an offsetting plan, so we tried showing an explainer video featuring the founder on the screen before that step
  3. new challenges: we baked a feature into the app to let users "bookmark" challenges for the future, and included a long list of challenges we hadn't built yet. This list now forms the backlog for the product team to work on.
  4. challenge dosing: how many climate challenges should we let a user start at once?
  5. onboarding priorities: should we focus on getting users to complete challenges, or getting their permission to send notifications?

We used an adapted version of the RICE prioritization framework to score our experiment ideas, ran controlled A/B tests on their React app, and analyzed the results with Amplitude funnels. This has enabled the Climaider team to see how their changes are impacting conversion rates and, ultimately, CO2 reduction.
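
For readers unfamiliar with RICE, the standard calculation is Reach × Impact × Confidence ÷ Effort, and ideas are ranked by score. Our adapted version differs in the details, but a minimal sketch in TypeScript looks roughly like this (the idea names and numbers are purely illustrative, not Climaider's actual estimates):

```typescript
// Illustrative RICE scoring for experiment ideas (hypothetical values).
interface ExperimentIdea {
  name: string;
  reach: number;      // users affected per quarter
  impact: number;     // estimated effect size (0.25 = minimal ... 3 = massive)
  confidence: number; // 0..1, how sure we are about reach and impact
  effort: number;     // person-weeks to design, build and analyze
}

// Classic RICE: (Reach * Impact * Confidence) / Effort
const riceScore = (idea: ExperimentIdea): number =>
  (idea.reach * idea.impact * idea.confidence) / idea.effort;

const ideas: ExperimentIdea[] = [
  { name: "explainer video before plan selection", reach: 4000, impact: 2, confidence: 0.8, effort: 1 },
  { name: "ads on DuckDuckGo and Ecosia", reach: 1500, impact: 1, confidence: 0.5, effort: 2 },
];

// Highest score first: the next experiments to run
ideas
  .sort((a, b) => riceScore(b) - riceScore(a))
  .forEach((idea) => console.log(`${idea.name}: ${riceScore(idea).toFixed(0)}`));
```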

We've still got lots of experiment ideas we want to run on the Climaider product, and are delighted that our work has such a meaningful impact.