Penalization, bias reduction, and default priors in logistic and related categorical and survival regressions

Stat Med. 2015 Oct 15;34(23):3133-43. doi: 10.1002/sim.6537. Epub 2015 May 26.

Abstract

Penalization is a very general method of stabilizing or regularizing estimates, and it has both frequentist and Bayesian rationales. We consider some questions that arise when evaluating alternative penalties for logistic regression and related models. The most widely programmed penalty appears to be the Firth small-sample bias-reduction method (albeit with small differences among implementations and the results they provide), which corresponds to using the log density of the Jeffreys invariant prior distribution as a penalty function. The latter representation raises serious contextual objections to the Firth reduction, objections that also apply to alternative penalties based on t-distributions (including Cauchy priors). Taking simplicity of implementation and interpretation as our chief criteria, we propose that the log-F(1,1) prior provides a better default penalty than other proposals. Penalization based on more general log-F priors is trivial to implement and facilitates mean-squared-error reduction and sensitivity analyses of penalty strength by varying the number of prior degrees of freedom. We caution, however, against penalization of intercepts, which are unduly sensitive to covariate coding and design idiosyncrasies.
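As an illustration of how log-F penalization can be implemented with ordinary logistic-regression software, the sketch below imposes a log-F(m, m) penalty on selected slopes by data augmentation: for each penalized coefficient, a pseudo-record is appended with that covariate set to 1 (all other covariates and the intercept set to 0) carrying m/2 successes and m/2 failures, whose binomial log-likelihood contribution, (m/2)β − m·log(1 + e^β), equals the log-F(m, m) log-prior kernel. The function name `logf_penalized_logit`, the use of Python/statsmodels, and the simulated data are illustrative assumptions for this sketch, not material from the paper; the intercept is deliberately left unpenalized, in line with the caution above.

```python
import numpy as np
import statsmodels.api as sm

def logf_penalized_logit(X, y, m=1.0, penalize=None):
    """Logistic regression with log-F(m, m) penalties on selected slopes,
    implemented by data augmentation (a sketch; names and API choices here
    are assumptions, not the paper's code).

    X: (n, p) design matrix WITHOUT an intercept column; y: 0/1 outcomes.
    penalize: indices of columns of X to penalize (default: all slopes).
    The intercept column added below is never penalized."""
    n, p = X.shape
    if penalize is None:
        penalize = range(p)

    # Real data: intercept column plus X; outcomes as (successes, failures).
    exog = np.column_stack([np.ones(n), X])
    endog = np.column_stack([y, 1 - y]).astype(float)

    # Pseudo-records encoding the log-F(m, m) prior for each penalized slope:
    # the penalized covariate is 1, everything else (including intercept) is 0,
    # with m/2 successes and m/2 failures (m "prior trials").
    for j in penalize:
        row = np.zeros(p + 1)
        row[j + 1] = 1.0
        exog = np.vstack([exog, row])
        endog = np.vstack([endog, [m / 2.0, m / 2.0]])

    # Standard maximum-likelihood fit of the augmented data yields the
    # penalized (posterior-mode) estimates.
    model = sm.GLM(endog, exog, family=sm.families.Binomial())
    return model.fit()

# Usage: m = 1 gives the log-F(1,1) default penalty; increasing m strengthens
# the shrinkage toward zero, which supports sensitivity analyses of penalty
# strength by varying the prior degrees of freedom.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = (rng.uniform(size=50) < 1 / (1 + np.exp(-(0.5 + X[:, 0])))).astype(int)
fit = logf_penalized_logit(X, y, m=1.0)
print(fit.params)  # [intercept, beta1, beta2]; slopes are shrunk toward 0
```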

Keywords: Bayes estimators; Firth bias reduction; Jeffreys prior; bias correction; logistic regression; maximum likelihood; penalized likelihood; regularization; shrinkage; sparse data; stabilization.

MeSH terms

  • Bayes Theorem
  • Bias*
  • Case-Control Studies*
  • Computer Simulation
  • Data Interpretation, Statistical*
  • Humans
  • Infant, Newborn
  • Likelihood Functions
  • Logistic Models*
  • Perinatal Death / prevention & control
  • Risk Factors