Speaker
Description
Standard models for categorical and ordinal data, such as log-linear models, association models, and logistic regression models for binary or ordinal responses, as well as the Mallows model for rank data, are revisited and defined through statistical information-theoretic properties in terms of the Kullback–Leibler (KL) divergence. Subsequently, replacing the KL divergence by the φ-divergence, a family of divergences that includes the KL as a special case, these models are generalized to flexible families of models. The suggested models are discussed in terms of their properties, estimation, and fit. Finally, their potential is illustrated through characteristic examples.
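For orientation, the divergences named above and in the key-words can be written as follows; this is the standard Csiszár and Cressie–Read formulation, added here as background rather than notation taken from the talk itself. For discrete distributions p and q and a convex function φ with φ(1) = 0, the φ-divergence is

\[
  D_{\varphi}(p, q) \;=\; \sum_{i} q_i \,\varphi\!\left(\frac{p_i}{q_i}\right),
\]

which reduces to the KL divergence for φ(u) = u log u:

\[
  D_{\mathrm{KL}}(p, q) \;=\; \sum_{i} p_i \log\frac{p_i}{q_i}.
\]

The Cressie–Read power divergence family, indexed by a real parameter λ, recovers the KL divergence in the limit λ → 0:

\[
  I^{\lambda}(p, q) \;=\; \frac{1}{\lambda(\lambda+1)}
  \sum_{i} p_i \left[\left(\frac{p_i}{q_i}\right)^{\lambda} - 1\right].
\]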
Key-words: Cressie–Read power divergence, distance-based probability models, maximum likelihood estimation