Generalized Maximum Entropy for Supervised Classification
Abstract
The maximum entropy principle advocates evaluating events' probabilities using a distribution that maximizes
entropy among those satisfying certain expectation constraints. This principle can be generalized to arbitrary decision
problems, where it corresponds to minimax approaches. This
paper establishes a framework for supervised classification based
on the generalized maximum entropy principle that leads to
minimax risk classifiers (MRCs). We develop learning techniques
that determine MRCs for general entropy functions and provide
performance guarantees by means of convex optimization. In
addition, we describe the relationship between the presented techniques
and existing classification methods, and quantify MRCs' performance in comparison with the proposed bounds and conventional
methods.
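
As context for the abstract's first sentence, the classical maximum entropy problem and its minimax reading can be sketched in standard notation; the feasible set $\mathcal{U}$, feature functions $f_i$, and targets $c_i$ below are generic placeholders rather than the paper's notation, and the identity shown is the well-known log-loss case that the paper generalizes.

% Classical maximum entropy: maximize Shannon entropy over distributions
% on a finite set X that match prescribed expectations.
\begin{align*}
  &\max_{\mathrm{p}\in\Delta(\mathcal{X})}\ -\sum_{x\in\mathcal{X}} \mathrm{p}(x)\log \mathrm{p}(x)
  \quad \text{s.t.}\quad \mathbb{E}_{\mathrm{p}}[f_i(x)] = c_i,\ \ i=1,\ldots,m,\\[2pt]
  &\text{with feasible set }\ \mathcal{U}=\bigl\{\mathrm{p}\in\Delta(\mathcal{X}):\ \mathbb{E}_{\mathrm{p}}[f_i(x)]=c_i,\ i=1,\ldots,m\bigr\}.
\end{align*}
% Minimax reading (log-loss): the maximum entropy distribution in U is also the
% minimizer of the worst-case expected log-loss over U,
\[
  \min_{\mathrm{q}\in\Delta(\mathcal{X})}\ \max_{\mathrm{p}\in\mathcal{U}}\ \mathbb{E}_{\mathrm{p}}\!\left[-\log \mathrm{q}(x)\right].
\]

This equivalence is the correspondence between maximum entropy and minimax approaches that the abstract refers to; replacing the log-loss with a general classification loss is what leads to the minimax risk classifiers studied in the paper.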