Flores, Claudio C.; Medeiros, Marcelo C. - 2020
Bandit problems are pervasive in various fields of research and are also present in several practical applications … one of the most popular bandit solutions, the original ε-greedy heuristic, to high-dimensional contexts. Moreover, we … range of exploration of new actions. We find reasonable bounds for the cumulative regret of a decaying ε-greedy heuristic …
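The abstract refers to a decaying ε-greedy heuristic. As a minimal sketch only (the paper's exact decay schedule and high-dimensional context model are not given in this excerpt), the following assumes a simple stochastic multi-armed bandit with Bernoulli rewards and an exploration rate decaying as ε₀/t:

```python
import random

def decaying_epsilon_greedy(arm_means, horizon, epsilon0=1.0, seed=0):
    """Simulate a decaying epsilon-greedy policy on a Bernoulli bandit.

    arm_means: true mean reward of each arm (unknown to the policy;
    used here only to simulate rewards). The exploration probability
    decays as epsilon0 / t, a common schedule; the paper's actual
    schedule and setting may differ. Returns the per-arm reward
    estimates and the cumulative reward.
    """
    rng = random.Random(seed)
    n_arms = len(arm_means)
    counts = [0] * n_arms          # number of pulls per arm
    estimates = [0.0] * n_arms     # running mean reward per arm
    total_reward = 0.0
    for t in range(1, horizon + 1):
        eps = min(1.0, epsilon0 / t)   # decaying exploration rate
        if rng.random() < eps:
            arm = rng.randrange(n_arms)                           # explore
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])  # exploit
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        # incremental update of the running mean for the pulled arm
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total_reward += reward
    return estimates, total_reward
```

The decaying schedule trades early exploration for later exploitation; the cumulative regret bounds mentioned in the abstract concern how the gap between this policy's reward and the best arm's reward grows with the horizon.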