In this post, we look at the problem of customer returns in the electronics industry, using return rates as a proxy for customer satisfaction. We explore solving this problem with a data-driven marketing automation approach that increases customer satisfaction and, with it, reduces return rates.
Jon Buys a Fitness Band
It’s January, and Jon’s New Year’s resolutions include fitness goals. Jon buys a fitness band that is more than just a fitness band — it’s a wristband with a fitness tracker, a heart rate monitor, a downloadable app, and a feature Jon only vaguely understands called Bluetooth.
Jon knows he’s not the most tech-savvy guy in the world, but he also knows he’s certainly not the least. He scrutinizes the instructions, downloads the app, and makes an account with the company that makes the app. After a few frustrating misfires and some emails to customer support, Jon manages to link the fitness band to the app on his phone. He spends the next week trying to use the fitness band to track his workouts. Jon can see how many steps he’s taken, but he’s having trouble toggling between the step counter and the heart rate monitor. He knows that somewhere on the app his totals should be registered and easy to visualize, but he can’t find them. Jon isn’t sure how to input his fitness goals, and his fitness band keeps giving him notifications about going to bed hours earlier than he’s normally used to.
A week after buying his fitness band, Jon takes it off and puts it back in its box. He’s relieved.
A week after that, Jon seals up the fitness band, throws on a return shipping label, and sends it back to the retailer.
Unfortunately for electronics retailers, the Jon scenario is all too common.
In 2016, US online retailers reported return rates between 20% and 40%. Research by Accenture estimates that 68% of consumer electronics returns are labeled as NTF (no trouble found), and another 27% are due to buyer’s remorse, which can stem from reasons similar to NTF, including a subpar consumer experience and a lack of consumer education. Taken together, these figures suggest that 95% of consumer electronics returns are for reasons other than product defects.
According to a study by the National Retail Federation, 49% of retailers now offer free return shipping, and high rates of returns are putting an increased financial burden on electronics retailers that are already operating on razor-thin margins. Unfortunately for retailers, a free return policy is quickly becoming a “must have” to maintain customer satisfaction and ensure repeat business in competitive online marketplaces. So, the only way to decrease costs is to increase customer satisfaction and reduce rates of return.
Jon needs to enjoy his fitness band enough to keep it.
To create a data-driven, automated method that ensures an enjoyable purchase, there are two big ideas:
- Ensure that the customer purchases the right product configuration upfront
- Reach out proactively with personalized messaging when you have reason to believe the customer is likely to return the product

In either case, predictive data and personalized notification technology that understands customer behavior holds the key.
There are three aspects to a data-driven, automated method:
- Data Collection
- Predictive Modeling
- Consumer Messaging
The data warehousing system needs to collect the customer’s behavior prior to and after the purchase. If you’re an automated online retailer like Amazon, instrumenting the commerce website and app exhaustively goes a long way in aiding this type of collection. If you’re a Comcast or DIRECTV attempting to reduce service cancellations, your customer “behavior” is typically split between the website, the phone, and the DVR. Either way, classifying (and continuously re-classifying) a customer base in real time into happy vs. troubled requires collection of behavioral data. The less comprehensive the collection, the noisier and staler the data. Garbage in, garbage out.
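To make the idea of behavioral collection concrete, here is a minimal sketch of what a single captured event might look like. The schema, field names, and values are assumptions for illustration, not Intempt’s actual event format:

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

# Hypothetical minimal schema for one behavioral event, covering the
# sources mentioned above: website, app, device, support.
@dataclass
class BehaviorEvent:
    user_key: str        # email or device ID that identifies the user
    source: str          # "web", "app", "device", "support", ...
    event_type: str      # e.g. "page_view", "sync", "ticket_opened"
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    properties: dict = field(default_factory=dict)

def to_wire(event: BehaviorEvent) -> str:
    """Serialize an event for shipping to a collection endpoint."""
    return json.dumps(asdict(event))

event = BehaviorEvent(
    user_key="jon@example.com",
    source="device",
    event_type="heart_rate_sync",
    properties={"bpm_avg": 72},
)
print(to_wire(event))
```

The `user_key` field is what later allows events from different sources to be joined into one customer profile.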
With our work in this area, Intempt has successfully reduced return rates for connected devices that collect data and can communicate with one another and the mothership. Here is a quick graphic that describes the way data may be generated and collected:
Capturing profile and behavior data from device, apps and auxiliary systems
In this particular case, the consumer purchased the device online, so we were able to track and collect their clickstream data prior to purchase, as well as understand their previous purchase history and satisfaction via prior NPS surveys. Because the device was connected to the internet via a Bluetooth tether, it could send data back periodically, letting us understand the consumer’s behavior pattern with the device post purchase.
Post-purchase data collection allowed us to construct a fuller picture of the consumer’s propensity to purchase the device prior to the transaction, and of their interaction and engagement with the device within the first one, three, and seven days after activation. Access to both the deviceID and the email address of the user (entered during purchase and re-entered during activation, along with registration info like phone number, date of birth, and gender) via a companion mobile app provided a way to stitch the consumer and device profiles together.
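The stitching step is essentially a join on the shared email address. The sketch below shows the idea with made-up records; the field names and values are illustrative assumptions:

```python
# Toy device registrations (deviceID + email captured at activation)
# and consumer profiles (keyed by email, from purchase and NPS history).
device_registrations = [
    {"device_id": "FB-1001", "email": "jon@example.com"},
    {"device_id": "FB-1002", "email": "ana@example.com"},
]
consumer_profiles = {
    "jon@example.com": {"nps": 6, "prior_purchases": 3},
    "ana@example.com": {"nps": 9, "prior_purchases": 1},
}

def stitch(registrations, profiles):
    """Join device and consumer records on email; key result by device_id."""
    unified = {}
    for reg in registrations:
        profile = profiles.get(reg["email"], {})
        unified[reg["device_id"]] = {**reg, **profile}
    return unified

unified = stitch(device_registrations, consumer_profiles)
print(unified["FB-1001"])  # one combined device + consumer record
```

In production this join would run continuously as new registrations arrive, so the unified profile stays current.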
Having this information continuously collected and enriched in our systems allows us to perform the predictive data modeling and targeted messaging aspects.
Predictive modeling helps ensure customers are satisfied with their purchases and decreases customer returns in a data-driven, automated manner.
Before we began solving the customer returns problem with data-driven marketing automation, we knew (via customer RMA reasons) that the main reasons for customer returns fall into one of three categories:
- Unwanted gift (1/2 of all returns)
- Hardware issue (1/3 of all returns) – The ideal outcome here is an exchange, not a return. Usually faulty units have battery or screen issues that get resolved in subsequent device builds.
- Software issue (1/5 of all returns) – These are rarely caused by an actual defect, but by inadequate onboarding and training – a significant focus of optimization.

Our returns analysis focused on evaluating customer behavior at the beginning of the user’s time with the product — how the device is used over the first three days, the initial week, and the first month — to determine when to engage with the customer to shift the retention curve away from a return and into continued usage.
We developed a model to predict the likelihood of a return. It allows us to figure out whom to reach out to and when, as well as to measure the impact of that outreach to close the loop.
Returned vs Retained Cohort Event Comparison; data illustrative, not actual
After observing a user’s behavior over the first seven days, we built a classification model that predicts, with 80–90% accuracy, whether an individual is likely to request a replacement or a refund for their device.
We did this by performing a simple cohort analysis to get a feel for what retained vs. returned users were doing on the device. We stack-ranked their events from highest to lowest number of occurrences. Our database keys captured events by email address, so we were able to look across devices, apps, and support tickets.
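The stack-ranking step can be sketched in a few lines. The event logs below are toy data with hypothetical event names, purely to show the mechanics:

```python
from collections import Counter

# Toy event logs for the two cohorts; event names are assumptions.
retained_events = ["sync", "sync", "goal_set", "sync", "app_open", "app_open"]
returned_events = ["support_ticket", "app_open", "support_ticket", "sync"]

def stack_rank(events):
    """Rank event types from highest to lowest number of occurrences."""
    return Counter(events).most_common()

print("retained:", stack_rank(retained_events))
print("returned:", stack_rank(returned_events))
```

Comparing the two rankings side by side is what surfaces candidate predictor variables, for example heavy support-ticket activity among returners.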
Random Forest Model
Confusion Matrix for RF Classification Model; data illustrative, not actual
We then fed the variables above into a random forest model to predict the dependent variable: whether the user would return the device. We used 70% of the sample to train the model and 30% to test it.
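A minimal sketch of that modeling step, assuming scikit-learn and a synthetic stand-in for the behavioral features (the feature set and labeling rule below are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic features per user: [syncs, app_opens, support_tickets].
# For the toy labels, users with more tickets than syncs are returns (1).
X = rng.integers(0, 20, size=(500, 3))
y = (X[:, 2] > X[:, 0]).astype(int)

# 70% of the sample to train, 30% to test, as described above.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"hold-out accuracy: {accuracy:.2f}")
```

The held-out 30% is what produces the confusion matrix and accuracy figures referenced in this section.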
We’re able to predict with a high degree of accuracy, within n days of device activation (n = 3, 7, or 30 days), whether a user has a high likelihood of return.
So now that we can predict whether a consumer will make a return, how do we change that behavior? Messaging strategy and execution are the focus of consumer messaging.
Personalized Customer Notifications
Personalized customer notifications help retailers stay in touch with their customers and support and educate them.
Once a user was classified (3 days post registration) as at-risk, we sent them a series of real-time notifications at the 7, 14, 21, and 30 day marks to get them to change behavior (and re-classify as not at-risk). Within the at-risk group, users were split into control and treatment groups: 90% were treated with notifications, while the remaining 10% were at-risk but received no real-time notification.
This allows us to measure lift: the difference between the percentage of at-risk, notified users who did not return the device and the percentage of at-risk, un-notified users who did not return the device. With lift, we could concretely measure the effect of the real-time notification campaign.
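The lift calculation itself is simple arithmetic. The counts below are made up solely to show the computation:

```python
# Hypothetical counts for the two at-risk groups described above.
treated_at_risk = 900      # at-risk users who received notifications
treated_kept = 540         # of those, users who did not return the device
control_at_risk = 100      # at-risk holdout, no notifications
control_kept = 40          # of those, users who did not return the device

def retention_rate(kept: int, total: int) -> float:
    """Share of a group that kept (did not return) the device."""
    return kept / total

# Lift = treated retention minus control retention.
lift = retention_rate(treated_kept, treated_at_risk) - retention_rate(
    control_kept, control_at_risk
)
print(f"lift: {lift:.0%}")  # prints "lift: 20%" for these counts
```

A positive lift means the notifications caused some at-risk users to keep the device who otherwise would have returned it; the 10% holdout is what makes that attribution possible.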
R+7, 14, 21, 30 day notification campaign strategy for an at-risk user cohort. ~35% of users who were targeted with messages offering support and education were responsive, and a majority of that ~35% did not initiate a return.
So what does this mean for Jon?
Using the techniques outlined in this post, Jon would have been flagged as an at-risk user as a result of his initial pattern of interactions with the device in the first three days of owning his fitness band. Jon would then have been contacted with targeted, personalized notifications offering additional support and education on the features of his new purchase during the critical first weeks of use. Our work indicates that, as a result of this type of targeted messaging, Jon is likely to have converted to a satisfied customer.
Doing this type of collection, modeling, and messaging for a single variable (at-risk vs. not-at-risk cohort) is conceivable to do manually, if a company has access to an in-house data scientist and marketer combo. In reality, users are in a multitude of different states with your products, and each state calls for its own particular notifications that users will actually respond to.
Identifying these micro-cohorts of current user behavior, and running a variety of campaigns simultaneously to move users predictably into more valuable cohorts, requires a software tool that works at scale, from collecting data to predictive modeling to delivering notifications.
After years of trial and analysis working in this area, Intempt Technologies has built that tool.