Best practices for predictive analytics and consumer scoring

Advances in AI and predictive analytics are enabling companies to use consumer scores to automate business decisions and predict things like risk and fraud. But concerns over fairness mean companies need to make those scores transparent to consumers.

Regardless of industry, it’s imperative for businesses to tread carefully with their consumer scoring implementations.

Here are some best practices companies can follow to future-proof their efforts and stay out of the headlines:

  • Run a thorough risk analysis. Pam Dixon, founder and executive director at the World Privacy Forum, posed this hypothetical when speaking with Insider Intelligence: “Pretend that all of your customers learn that you’re using this score. What do you need to do in order to make sure that if that happened, you could truly defend your purchase of the score, your analysis of the safety and efficacy of the score, your use qualifications and guidelines for the score, and your recourse for consumers who were harmed by the use of that score?”
  • Verify data quality and provenance. What data is being used to feed the scoring analytics? Is it authentically permissioned?
  • Make sure models and AI are explainable. Explainability is especially necessary in highly regulated industries such as financial services, healthcare, and telecoms, and forthcoming AI regulation is likely to require it for all algorithmic decision-making (see the explainability sketch after this list).
  • Be as transparent as possible. What is the range of possible scores? How are the numbers interpreted? How long is the score kept and how frequently is it updated? Companies should also ask vendors if they can inform consumers that they’re using the scoring tool.
  • Evaluate scores for fairness. Even if scores aren’t based on sensitive data, their analytic outcomes can serve as proxies for discrimination. Evaluate scores for disparate impact on protected classes based on age, race, gender, income, etc. (see the disparate-impact check after this list).
  • Use the right score for the job. Credit scores have been used for applications beyond finance, such as tenant screening, hiring, and even the SAT adversity score. That has proven contentious with consumers and is potentially subject to regulation.
  • Give consumers access and the ability to contest or correct. Can consumers request their scores? Can they contest an algorithmic decision or review the data points that went into a scoring model?
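To make explainability concrete, here is a minimal sketch of surfacing per-feature contributions for a single scoring decision. It assumes a simple logistic-regression score trained on synthetic data with hypothetical feature names; production scoring models and their explanation tooling will differ.

```python
# Minimal explainability sketch: a linear scoring model whose per-feature
# contributions (coefficient * value) can be shown for one decision.
# Feature names and data are hypothetical, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
features = ["payment_history", "tenure_months", "recent_inquiries"]  # hypothetical
X = rng.normal(size=(500, 3))
y = (X @ np.array([1.5, 0.8, -1.2]) + rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

def explain(applicant):
    """Per-feature contribution to this applicant's score, largest first."""
    contributions = model.coef_[0] * applicant
    return sorted(zip(features, contributions), key=lambda kv: -abs(kv[1]))

# Explain one decision in terms a consumer or regulator could review.
for name, contribution in explain(X[0]):
    print(f"{name}: {contribution:+.2f}")
print("approval probability:", round(model.predict_proba(X[0].reshape(1, -1))[0, 1], 2))
```

For a linear model this coefficient-times-value breakdown is a faithful explanation; more complex models typically need dedicated techniques such as SHAP or LIME.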
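Likewise, a basic disparate-impact check can be run before a score is deployed. The sketch below applies the four-fifths rule of thumb to hypothetical group labels and an assumed approval cutoff; a real evaluation would use permissioned demographic data and the actual decision logic.

```python
# Minimal disparate-impact check using the four-fifths rule of thumb.
# Group labels, score range, and approval cutoff are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
scores = rng.uniform(300, 850, size=1000)               # hypothetical score range
group = rng.choice(["group_a", "group_b"], size=1000)   # hypothetical group labels
approved = scores >= 650                                 # hypothetical approval cutoff

# Approval rate per group, then the ratio of the lowest to the highest rate.
rates = {g: approved[group == g].mean() for g in ("group_a", "group_b")}
ratio = min(rates.values()) / max(rates.values())

print("approval rates:", {g: round(float(r), 3) for g, r in rates.items()})
print("disparate impact ratio:", round(float(ratio), 3))
if ratio < 0.8:  # the four-fifths rule of thumb
    print("Potential disparate impact: review the score before deployment.")
```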

"Behind the Numbers" Podcast