“Stochastic Bandits for Sticky Recommendations” — Seminar by Prof Theja Tulabandhula (25th of June 2019)

Published by Joeran Beel

We are delighted to announce a seminar by Prof Theja Tulabandhula from the University of Illinois Chicago. The seminar will start at 10:00 am on Tuesday, 25 June 2019, in the Small Conference Room in the O’Reilly Institute at Trinity College Dublin.

Theja is visiting our lab and will present his latest research, “Stochastic Bandits for Sticky Recommendations”.

Abstract: We consider sequential decision problems related to making recommendations. A platform needs to show good and timely recommendations to its users to engage them and increase revenues, while not knowing their behavioral patterns a priori. We consider two behavioral effects that modulate relevance: (a) the users have a latent propensity to act on a recommendation based on its position in a sequence of recommended items, and (b) the users have a latent propensity to act on a recommendation only if it has been shown to them repeatedly. In both settings, the platform has to simultaneously learn the quality of its recommendations and the corresponding user behavior, while exploiting the information it knows so far. We develop new bandit algorithms with regret guarantees when considering both these effects, and validate their performance with experiments.

For more details refer to https://arxiv.org/abs/1901.07734 and https://arxiv.org/abs/1811.09026.
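The core tension the abstract describes, learning the quality of recommendations while exploiting what is already known, is the classic exploration–exploitation trade-off in stochastic bandits. As a minimal illustration (this is the standard UCB1 policy, not the sticky-recommendation algorithms from the papers above, and the arm names and click rates are made up), each "arm" below is a candidate recommendation and the reward is a simulated click:

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """Run the classic UCB1 policy; pull(arm) returns a reward in [0, 1]."""
    counts = [0] * n_arms   # times each arm has been pulled
    sums = [0.0] * n_arms   # cumulative reward per arm
    # Initialise by pulling each arm once.
    for arm in range(n_arms):
        sums[arm] += pull(arm)
        counts[arm] += 1
    for t in range(n_arms, horizon):
        # Choose the arm maximising empirical mean + exploration bonus.
        arm = max(range(n_arms),
                  key=lambda a: sums[a] / counts[a]
                  + math.sqrt(2 * math.log(t + 1) / counts[a]))
        sums[arm] += pull(arm)
        counts[arm] += 1
    return counts

# Hypothetical example: three recommendations with unknown click rates.
random.seed(0)
rates = [0.2, 0.5, 0.8]
counts = ucb1(lambda a: 1.0 if random.random() < rates[a] else 0.0,
              n_arms=3, horizon=2000)
```

Over time the policy concentrates its pulls on the highest-rate arm while still occasionally sampling the others. The papers linked above extend this setting so that the reward also depends on an item's position in the recommended sequence and on how often it has been shown before.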

Prof Theja Tulabandhula from the University of Illinois Chicago

Before joining the University of Illinois Chicago, Theja was a Research Scientist in the Machine Learning and Statistics group at Xerox Research from 2014 to 2016. He received a PhD in Electrical Engineering and Computer Science from MIT and a dual degree in Electrical Engineering from IIT Kharagpur, where he was awarded the Prime Minister’s Gold Medal. His research interests include designing new machine learning and optimization methods for sequential decision making, with a focus on applications in retail and transportation.
