Model Distillation for Revenue Optimization: Interpretable Personalized Pricing
Abstract
Data-driven pricing strategies are becoming increasingly common, where customers are offered a personalized price based on features that are predictive of their valuation of a product. It is desirable for this pricing policy to be simple and interpretable, so it can be verified, checked for fairness, and easily implemented. However, efforts to incorporate machine learning into a pricing framework often lead to complex pricing policies that are not interpretable, resulting in slow adoption in practice. We present a novel, customized, prescriptive tree-based algorithm that distills knowledge from a complex black-box machine learning algorithm, segments customers with similar valuations, and prescribes prices so as to maximize revenue while maintaining interpretability. We quantify the regret of a resulting policy and demonstrate its efficacy in applications with both synthetic and real-world datasets.
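The sketch below illustrates the general distillation workflow the abstract describes, not the paper's customized tree algorithm: a black-box "teacher" model estimates purchase probability as a function of customer features and price, the teacher's revenue-maximizing price is computed for each customer over a candidate grid, and a shallow decision tree is then fit to those prescribed prices as a simple, interpretable pricing policy. The synthetic data, price grid, and model choices (gradient boosting teacher, standard regression tree student) are assumptions for illustration; the authors' method uses its own revenue-based tree construction.

```python
# Illustrative sketch (assumptions noted above), not the paper's exact algorithm:
# distill a black-box demand model into a shallow price-prescribing tree.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)

# Synthetic purchase data: features X, historically offered price, purchase indicator.
n = 5000
X = rng.normal(size=(n, 3))                              # customer features
valuation = 10 + 3 * X[:, 0] + rng.normal(scale=1.0, size=n)
p_hist = rng.uniform(5, 15, size=n)                      # historically offered prices
y = (valuation >= p_hist).astype(int)                    # buy if valuation >= price

# Step 1: black-box teacher estimating purchase probability given (features, price).
teacher = GradientBoostingClassifier().fit(np.column_stack([X, p_hist]), y)

# Step 2: teacher's revenue-maximizing price for each customer over a candidate grid.
price_grid = np.linspace(5, 15, 41)
expected_revenue = np.stack([
    p * teacher.predict_proba(np.column_stack([X, np.full(n, p)]))[:, 1]
    for p in price_grid
], axis=1)                                               # shape (n, len(price_grid))
teacher_price = price_grid[expected_revenue.argmax(axis=1)]

# Step 3: distill into a shallow tree mapping features -> prescribed price,
# yielding an interpretable segmentation of customers into price buckets.
student = DecisionTreeRegressor(max_depth=3).fit(X, teacher_price)
print(export_text(student, feature_names=["x0", "x1", "x2"]))
```

Printing the fitted tree shows a handful of feature-threshold rules, each leaf prescribing one price, which is the kind of simple, auditable policy the abstract argues for.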
Cite
Text
Biggs et al. "Model Distillation for Revenue Optimization: Interpretable Personalized Pricing." International Conference on Machine Learning, 2021.
Markdown
[Biggs et al. "Model Distillation for Revenue Optimization: Interpretable Personalized Pricing." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/biggs2021icml-model/)
BibTeX
@inproceedings{biggs2021icml-model,
title = {{Model Distillation for Revenue Optimization: Interpretable Personalized Pricing}},
author = {Biggs, Max and Sun, Wei and Ettl, Markus},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {946--956},
volume = {139},
url = {https://mlanthology.org/icml/2021/biggs2021icml-model/}
}