Experimentation in the movies sometimes gets a bad rap: mad scientists blowing up labs, aliens arriving to probe unsuspecting humans, accidental AI monsters. It's easy to come away picturing experimenters as cold-hearted, calculating, and removed from reality. Real-world experimentation is usually far more mundane, but the stereotypes linger, and that's unfortunate. The primary question behind experimentation (if you're not a mad scientist) is simple: does this thing work the way I think it does? Does this feature deliver the results or benefits it's supposed to? If not, why? That makes experimentation an extremely powerful tool for designing products that work and are actually good for customers.
At TrueAccord we believe that experimentation is an integral part of designing a product that fulfills our mission to “reinvent the debt collections space by delivering great customer experiences that empower consumers to regain control of their financial health and help them to better manage their financial future.” Whenever possible, we launch experiments, not outright features. This strategy has three essential benefits:
Tests whether our instincts are right and our models are functional
Allows us to gain valuable insights into who our customers are and what they need
Mitigates potential negative effects
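Launching an experiment rather than an outright feature starts with assigning each customer to a variant. As a minimal sketch (the function and experiment names here are hypothetical, not TrueAccord's actual system), one common approach is to hash the customer ID together with the experiment name, so assignments stay stable across sessions while remaining independent across experiments:

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a user into a variant.

    Hashing user_id together with the experiment name keeps each user's
    assignment stable across sessions while decorrelating buckets
    between different experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same bucket for a given experiment.
assert assign_variant("user-42", "email_template_v2") == \
       assign_variant("user-42", "email_template_v2")
```

Because the assignment is deterministic, no per-user state needs to be stored, and rerunning the assignment at email-send time or page-render time always agrees with earlier decisions.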
Test Our Instincts: How do you ensure your team is actually moving the product forward, investing energy only in features and experiences that create an effective and positive debt collection experience? Experimentation. The TrueAccord team is full of clever people with clever ideas, but we know it's important not to found our product on untested hunches. By testing our instincts before taking another step in the same direction, we make sure we invest energy where it matters, and we wait to develop our knowledge base before proceeding in directions we clearly do not yet understand.
Customer Insights: Understanding why your product works is often more important than understanding whether it works. The real benefits of an experimentation infrastructure are its ability to provide diversified, descriptive data and the emphasis it places on stopping to take a look. At TrueAccord we know it's essential to understand whether we're looking at the problem the right way and, if not, what we've missed: do we understand our customers' needs?
We launched a new, “better” email format, rolled out as a variation across a spread of existing email content. After a three-month run, we found that it was indeed performing significantly better on both average open and click rates. This was surprising: we hadn't changed anything that should have affected opens.
[Chart: new base template content saw an open rate increase of ~10%. First email: new base template; second email: control.]
Upon further investigation, we realized that the new format had unintentionally changed the email preview text: instead of displaying the start of our email content, it consistently showed a formally worded disclaimer. We then launched another experiment to confirm our findings were correct.
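Before acting on a metric shift like the one above, it's worth checking that the difference isn't just noise. A standard way to do that for open rates is a two-proportion z-test; the sketch below uses only the standard library, and the counts in the usage example are hypothetical, not our actual experiment data:

```python
from math import sqrt, erf

def two_proportion_z(opens_a: int, sends_a: int,
                     opens_b: int, sends_b: int):
    """Two-sided z-test for a difference in open rates between two variants."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    # Pooled rate under the null hypothesis that both variants are equal.
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: treatment opened 2,200 of 10,000 sends vs
# control's 2,000 of 10,000 (a ~10% relative lift in open rate).
z, p = two_proportion_z(2200, 10_000, 2000, 10_000)
```

At these (made-up) volumes the lift clears the conventional p < 0.05 bar; at much smaller send counts, the same relative lift might not.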
Mitigate Negative Effects: It's easy in any industry to be blinded by simple outcome metrics, especially in debt collection, where the end objective is repayment. At TrueAccord we would consider it a failure if our product worked, but worked for the wrong reasons: if our collections system converted but didn't provide a good experience for the consumer. Experimentation is our first line of defense against treading down that path.
After researching existing accounts, we realized there was a need for more self-service tools in payment plan management. We developed a new payment plan account page and rolled out an experiment that automatically redirected some customers to this page whenever they visited the website while their plan was active.
We found that the redirect did decrease payment plan breakage and increase liquidation, but because our system was set up to detect other types of impact, we also discovered that it increased outreach to our engagement team in the “Website Help” category. Consumers were confused about why they were not landing on the pages they expected when navigating to our website. We had the right idea, but our implementation was not ideal for the consumer.
[Chart: experiment vs. control, % of inbound engagement-team communications by category (total number of inbound communications was approximately the same).]
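Catching this kind of side effect means monitoring more than the headline metric. One simple guardrail, sketched below with hypothetical function names and made-up category counts (not our actual data), is to compare the share of inbound communications per category between experiment and control and flag any category whose share shifts materially:

```python
from collections import Counter

def category_shift(control: Counter, experiment: Counter,
                   threshold: float = 0.05) -> dict:
    """Flag categories whose share of inbound communications moved by more
    than `threshold` (absolute share) between control and experiment."""
    n_control = sum(control.values())
    n_experiment = sum(experiment.values())
    flags = {}
    for category in set(control) | set(experiment):
        share_c = control.get(category, 0) / n_control
        share_e = experiment.get(category, 0) / n_experiment
        if abs(share_e - share_c) > threshold:
            flags[category] = (share_c, share_e)
    return flags

# Hypothetical counts by category; totals kept roughly equal, as in the
# experiment described above.
control = Counter({"Website Help": 40, "Payment Question": 120, "Dispute": 40})
experiment = Counter({"Website Help": 80, "Payment Question": 85, "Dispute": 35})
flags = category_shift(control, experiment)
```

A shift flagged this way is only a signal to investigate, not proof of harm; in practice you'd follow up with a significance test and a qualitative read of the actual inbound messages.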