Date of Award

1-1-2012

Language

English

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

College/School/Department

Department of Mathematics and Statistics

Content Description

1 online resource (iv, 37 pages) : illustrations (some color)

Dissertation/Thesis Chair

Carlos C Rodríguez

Committee Members

Ariel Caticha, Kevin Knuth, Malcolm Sherman

Keywords

Antidata, Bayesian Inference, Entropic Prior, Bayesian statistical decision theory, Entropy, Mathematical statistics

Subject Categories

Statistics and Probability

Abstract

Carlos C. Rodríguez [6] [7] has published a family of priors, named the Entropic Priors by John Skilling [4], that decay as a function of Kullback-Leibler divergence in order to encode a blend of proximity to a prior estimate with uniformity over the hypothesis space of probability distributions. The asymmetry of the KL divergence gives rise to a continuum of Entropic Priors. Here we examine only the extreme cases, the 0- and 1-entropic priors: the former coincides with the standard and convenient conjugate priors, while the latter is the unique optimizer of a simple notion of ignorance. The original contributions of this paper are:
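For orientation, the following is a minimal schematic sketch (not quoted from the dissertation) of a prior of the kind the abstract describes, one that decays with the Kullback-Leibler divergence from a prior estimate θ0; the scale parameter α and the Fisher-metric volume factor are assumptions of this sketch rather than details taken from the abstract.

% Assumed schematic form of a prior decaying with KL divergence from a
% prior estimate \theta_0; the scale \alpha and the Jeffreys volume factor
% \sqrt{\det g(\theta)} are assumptions of this sketch.
\[
  \pi_\alpha(\theta \mid \theta_0)
    \;\propto\;
  \exp\!\bigl\{-\,\alpha\, I(\theta : \theta_0)\bigr\}\,
  \sqrt{\det g(\theta)},
  \qquad
  I(\theta : \theta_0)
    = \int f(x \mid \theta)\,
      \log \frac{f(x \mid \theta)}{f(x \mid \theta_0)}\, dx .
\]
% Reversing the arguments of the divergence, I(\theta_0 : \theta), gives the
% other extreme of the asymmetry mentioned in the abstract.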
