How American Express Boosted Survey Response Rates Up To 3x
Do you wonder how to increase survey response rates? I recently had a great conversation with Luis Angel-Lalanne, vice president of customer listening at American Express. Luis and the team at Amex made incredible progress, resulting in 2.5–3x higher response rates! In some functions in certain countries, response rates even increased by 8x.
Luis generously agreed to share the details behind this success. Here is the transcript of my conversation with Luis.
Can you please give us an overview of the changes that led to such a high increase in response rates?
Luis: Certainly. We made four key changes:
- New email language, look, and feel
- Embedded the first question in the email invite
- Shortened the survey to the bare minimum, keeping only questions whose answers we could not determine from operational data
- Refreshed suppressions we applied before sending the survey invite
This resulted in response rates that increased 2.5x to 3x. When you add in the changes to the suppressions, total survey volume more than tripled. The overall response rate we achieved exceeded our goal. This revolutionary increase certainly captured people’s attention!
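(An aside for readers: embedding the first question in the email invite typically means making each rating in the email a link that carries the chosen answer into the survey URL, so clicking a score both records the answer and opens the survey at question two. Below is a minimal Python sketch of that pattern; the survey URL, parameter names, and five-point scale are illustrative assumptions, not details of Amex's implementation.)

```python
from urllib.parse import urlencode

# Hypothetical survey landing page; real URLs and parameter names would differ.
SURVEY_URL = "https://survey.example.com/csat"


def build_invite_html(respondent_id: str) -> str:
    """Render an email body whose rating links pre-answer question 1.

    Clicking a score opens the survey with that answer already in the
    query string, so the respondent starts on question 2.
    """
    links = []
    for score in range(1, 6):  # illustrative 1-5 satisfaction scale
        params = urlencode({"rid": respondent_id, "q1": score})
        links.append(f'<a href="{SURVEY_URL}?{params}">{score}</a>')
    return (
        "<p>How satisfied were you with your recent call?</p>"
        "<p>" + " &nbsp; ".join(links) + "</p>"
    )


print(build_invite_html("abc123"))
```

Because the first answer is captured by the click itself, respondents who open the survey have already started it, which is generally why embedded first questions tend to lift response rates.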
How did you go about making those changes?
Luis: First, we wanted to understand the opportunity. We spoke with our CFM [customer feedback management] platform partner to understand what a best-in-class transactional survey response rate was. We saw that we had an opportunity to improve.
Next, we decided that nothing was off limits other than our CX [customer experience] beacon metric. We took a hard look at the questions we asked in the survey and whether we could get the same information from internal data. We have data that tells us things like why the customer called and whether they went online first, so those questions were ripe for removal. That said, we did debate this. Giving up questions that gave us the customer's point of view was not easy, and our internal data did not always match what we heard from the customer. What eventually helped us overcome any dissent was that our customers expect us to know them and why they called. My team owned the decision to remove these questions based on our desire to have the survey deliver an exceptional customer experience and not ask questions that the customer would expect we [already] had the answer to.
So with our goal in mind, we devised a suite of tests: three different email invites, two different versions of the survey, two different versions of the customer satisfaction question, and, at the end, we had time to test a couple of different email subject lines. We tested these across some of our largest studies in the countries with the most survey volume. We set up a testing plan to rotate to a new email invite every two weeks. With real-time reporting, we were able to get results immediately and build confidence that our goals were totally achievable.
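(If you want to run a similar test rotation yourself, a simple way to judge whether one invite variant genuinely beats another is a two-proportion z-test on response rates. The sketch below uses only the Python standard library; the send and response counts are made-up examples, not Amex's figures.)

```python
from statistics import NormalDist


def response_rate_lift(sent_a, responded_a, sent_b, responded_b):
    """Compare response rates of two email-invite variants with a
    two-proportion z-test (normal approximation)."""
    p_a = responded_a / sent_a
    p_b = responded_b / sent_b
    pooled = (responded_a + responded_b) / (sent_a + sent_b)
    se = (pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return p_a, p_b, z, p_value


# Illustrative numbers only: control invite vs. redesigned invite.
p_a, p_b, z, p = response_rate_lift(sent_a=10_000, responded_a=600,
                                    sent_b=10_000, responded_b=1_700)
print(f"control {p_a:.1%} vs test {p_b:.1%}, z={z:.1f}, p={p:.3g}")
```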
You tested a lot, and that's great advice for any team. But still, change is hard. How did you bring people within Amex along?
Luis: We brought our key partners in on the goal and the planned changes right from the start. Before we had run a single test, we started branding this as "VoC [voice of the customer] Revolution."
When we shared the plans with our partners, we made sure we had examples of the three email invites we planned to test. This made the audience part of the process, as they all picked their favorites and started hypothesizing about why theirs would win.
Once we started testing, we sent out weekly updates with the most recent results and the upcoming tests planned. This was important, because the test plan covered a couple of months and we didn't want our partners to forget the excitement of the transformation.
Once we selected the winning combination, we then worked with our operational partners, HR, and the communications group to ensure that we rolled out this significant change as carefully as possible.
And one key concern whenever you change a survey is losing the ability to trend the data. How did you overcome that challenge?
Luis: Yes. That was the final piece of the roll-out puzzle. We needed to proactively communicate that this dramatic increase in survey volume would impact the scores and the trends. The theory that inspiring more people to respond draws in more of the neutrals and passively satisfied held true for us. We were transparent in showing how the math works when more neutral and passively satisfied customers respond. While disrupting the trends of key metrics is never easy, we were able to demonstrate that it was unquestionably the right thing to do.
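(To make that math concrete, here is a small worked example with made-up numbers, not Amex's data: if the new invite triples response volume and the additional respondents skew neutral or passively satisfied, a top-box style score drops even though no individual customer became any less happy.)

```python
def top_box_pct(promoters, passives, detractors):
    """Share of respondents giving the top satisfaction rating."""
    total = promoters + passives + detractors
    return promoters / total


# Illustrative numbers only: the original survey drew the most engaged
# (often most delighted or most upset) customers.
before = top_box_pct(promoters=500, passives=200, detractors=300)

# Tripling volume mostly adds neutral / passively satisfied respondents.
after = top_box_pct(promoters=900, passives=1_500, detractors=600)

print(f"before: {before:.0%}, after: {after:.0%}")  # 50% -> 30%
```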
Impressive! So what’s next on your agenda when it comes to VoC at Amex?
Luis: After driving such a significant (and disruptive) change, we are not looking to drive another change of this magnitude anytime soon. We're now looking to optimize what we have: Can we tweak the language to inspire more customers to leave us commentary? Can we slightly tweak the survey for different key journeys? So while the revolution may be over for now, we will always be testing and evolving the survey. The team's revolutionary energy is now focused on using NLP [natural language processing] to model customer satisfaction for all calls. This work is in its early stages but promising!
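(For readers curious about the general idea, the sketch below trains a toy text classifier, TF-IDF features plus logistic regression via scikit-learn, to score call transcripts for likely satisfaction. The data, features, and model choice are my assumptions for illustration; the interview does not describe Amex's actual NLP approach.)

```python
# A minimal, generic sketch of modeling satisfaction from call transcripts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: transcripts from calls that did get a survey,
# labeled with whether the customer reported being satisfied.
transcripts = [
    "thanks so much, that fixed my billing question right away",
    "i was transferred three times and nobody could help me",
]
satisfied = [1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(transcripts, satisfied)

# Score a call that never received a survey response.
print(model.predict_proba(["the agent resolved my issue quickly"])[0][1])
```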
Read more about CX survey best practices in my report, “Design Better CX Surveys With This Checklist.”
Thank you so much, Luis, and good luck!