---
title: "Model Distillation for Revenue Optimization: Interpretable Personalized Pricing"
abstract: Data-driven pricing strategies are becoming increasingly common, where customers are offered a personalized price based on features that are predictive of their valuation of a product. It is desirable for this pricing policy to be simple and interpretable, so it can be verified, checked for fairness, and easily implemented. However, efforts to incorporate machine learning into a pricing framework often lead to complex pricing policies that are not interpretable, resulting in slow adoption in practice. We present a novel, customized, prescriptive tree-based algorithm that distills knowledge from a complex black-box machine learning algorithm, segments customers with similar valuations, and prescribes prices so as to maximize revenue while maintaining interpretability. We quantify the regret of the resulting policy and demonstrate its efficacy in applications with both synthetic and real-world datasets.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: biggs21a
month: 0
tex_title: "Model Distillation for Revenue Optimization: Interpretable Personalized Pricing"
firstpage: 946
lastpage: 956
page: 946-956
order: 946
cycles: false
bibtex_author: Biggs, Max and Sun, Wei and Ettl, Markus
author:
date: 2021-07-01
address:
container-title: Proceedings of the 38th International Conference on Machine Learning
volume: 139
genre: inproceedings
issued:
extras:
---
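As a rough illustration of the distillation idea described in the abstract, the sketch below fits a flexible "teacher" model of purchase probability as a function of customer features and price, uses it to score a grid of candidate prices, and then fits a shallow "student" decision tree whose leaves act as interpretable customer segments, each assigned a single price. This is a minimal, assumption-laden sketch on synthetic data: the simple classification-based student here is not the paper's customized prescriptive tree, which chooses splits to maximize revenue directly, and all names and model choices are illustrative.

```python
# Minimal sketch of distillation for interpretable pricing (illustrative only;
# not the authors' algorithm). Teacher: black-box purchase-probability model.
# Student: shallow tree mapping customer features to one price per segment.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic historical data: customer features X, offered prices p, purchases y.
n, d = 5000, 5
X = rng.normal(size=(n, d))
p = rng.uniform(5.0, 15.0, size=n)
valuation = 10.0 + 2.0 * X[:, 0] + rng.normal(scale=1.0, size=n)
y = (valuation >= p).astype(int)

# Teacher: black-box estimate of purchase probability given (features, price).
teacher = GradientBoostingClassifier().fit(np.column_stack([X, p]), y)

# For each customer, find the grid price the teacher predicts will yield the
# highest expected revenue (price * predicted purchase probability).
price_grid = np.linspace(5.0, 15.0, 21)
expected_revenue = np.stack(
    [
        price * teacher.predict_proba(np.column_stack([X, np.full(n, price)]))[:, 1]
        for price in price_grid
    ],
    axis=1,
)
best_price_idx = expected_revenue.argmax(axis=1)

# Student: shallow tree distilled on the teacher's recommended prices; its
# leaves are interpretable customer segments, each mapped to a single price.
student = DecisionTreeClassifier(max_depth=3).fit(X, best_price_idx)
prescribed_prices = price_grid[student.predict(X)]
```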