Bounds and Applications of Concentration of Measure in Fair Machine Learning and Data Science

Cyrus Cousins

Abstract

I introduce novel concentration-of-measure bounds for the supremum deviation, several notions of variance, and a family of game-theoretic welfare functions. I then apply these bounds to analyze machine learning and data science problems, with deep theoretical consequences and impactful practical ramifications. Parts I and II assume an independent and identically distributed (i.i.d.) setting, in which we observe many statistically independent occurrences, and Part III extends some of my methods and themes to non-i.i.d. mean estimation problems. Naturally, some conclusions are weaker in these relaxed settings, but I find that many of the same data-dependent and variance-sensitive themes apply, and I give efficient algorithms for real-world problems where the i.i.d. assumption is prohibitive.
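
For context, the central quantity of Part I is the supremum deviation. The following is an illustrative classical statement for a family $\mathcal{F}$ of functions with range $[0,1]$, given here for intuition only; the empirically centralized bounds of Part I sharpen inequalities of this form. With probability at least $1 - \delta$ over an i.i.d. sample $x_1, \dots, x_m \sim \mathcal{D}$,

\[
  \mathrm{SD}(\mathcal{F}) \doteq \sup_{f \in \mathcal{F}} \biggl\lvert \frac{1}{m} \sum_{i=1}^{m} f(x_i) - \operatorname*{\mathbb{E}}_{x \sim \mathcal{D}}[f(x)] \biggr\rvert
  \leq 2\hat{\mathfrak{R}}_m(\mathcal{F}) + 3\sqrt{\frac{\ln(4/\delta)}{2m}} \enspace ,
\]

where $\hat{\mathfrak{R}}_m(\mathcal{F})$ denotes the empirical Rademacher average of $\mathcal{F}$ on the sample.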

This thesis is split into three parts, each representing a significant advancement in its respective field: the first in statistical learning theory, with an emphasis on algorithms and generalization guarantees that make efficient use of training data; the second in the axiomatic underpinnings, practice, and statistical elements of fair machine learning; and the third in showing that themes and bounds from sampling problems in standard i.i.d. settings translate well into non-i.i.d. settings. Special attention is paid to keeping each part independently readable; however, the reader will better appreciate the thematic connections, applications, and technical synergy when the parts are considered as a whole. A concrete example of the welfare functions of Part II appears below.
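
To make the game-theoretic welfare functions of Part II concrete, a standard family of this type (shown here for illustration; Part II develops the axiomatic treatment) is the power mean over per-group utilities $u_1, \dots, u_g \geq 0$:

\[
  \mathrm{W}_p(u_1, \dots, u_g) \doteq \biggl( \frac{1}{g} \sum_{i=1}^{g} u_i^{\,p} \biggr)^{1/p} \enspace ,
\]

where $p = 1$ recovers the utilitarian mean, $p \to 0$ the Nash (geometric-mean) welfare, and $p \to -\infty$ the egalitarian minimum.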

Concretely, I note that the methods of Parts I and III are mutually compatible, opening the door to strong uniform convergence bounds in non-i.i.d. settings; furthermore, the fair-learning setting of Part II benefits from the statistical bounds of Part I, and similar analysis is possible in non-i.i.d. settings. Indeed, in a connected and interdependent world, practical fair systems may need to consider the intricacies of non-i.i.d. learning. Ultimately, philosophically grounded and statistically rigorous fair-learning systems, operating on real-world data with all the messy dependence structures that entails, are not only of immense academic interest, but also a necessary step in uniting the theory and practice of machine learning with its societal impact.
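
As a flavor of what is possible without independence (an illustrative known result, stated up to an absolute constant $c$, rather than a bound from this thesis), Hoeffding-type concentration survives for Markov-chain samples, with the effective sample size discounted by the mixing time $t_{\mathrm{mix}}$: for $f$ with range $[0,1]$ and a stationary ergodic chain $X_1, X_2, \dots$ with stationary distribution $\pi$,

\[
  \Pr\biggl( \biggl\lvert \frac{1}{n} \sum_{t=1}^{n} f(X_t) - \operatorname*{\mathbb{E}}_{x \sim \pi}[f(x)] \biggr\rvert \geq \varepsilon \biggr)
  \leq 2 \exp\biggl( -\frac{c\, n\, \varepsilon^2}{t_{\mathrm{mix}}} \biggr) \enspace .
\]

Bounds of this form (cf. Paulin, 2015) motivate the variance-sensitive, data-dependent analysis that Part III carries into non-i.i.d. mean estimation.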

Keywords

Statistical Learning Theory ♦ Fair Machine Learning ♣ Markov Chain Monte Carlo ♥ Statistical Data Science ♠ Probabilistic Analysis

  1. Read my dissertation.
  2. Read my thesis proposal.


Overview

[Figure: Three Parts of a Whole]

News

  • My dissertation was awarded the 2021 Joukowsky Outstanding Dissertation Prize in the physical sciences.
  • On March 25, 2021, I successfully defended my thesis.
  • On November 6, 2020, I passed my thesis proposal presentation.

Thesis Publications

  • C. Cousins, M. Riondato. Sharp uniform convergence bounds through empirical centralization, NeurIPS 2020.
  • A. Mazzetto, C. Cousins, D. Sam, S. Bach, E. Upfal. Adversarial Multi Class Learning under Weak Supervision with Performance Guarantees, ICML 2021.
  • C. Cousins. An axiomatic theory of provably-fair welfare-centric machine learning, NeurIPS 2021.

The remaining chapters are awaiting publication, and more information will be available soon. In the meantime, the entire thesis is publicly available.