Cyrus Cousins

Adversarial Multi Class Learning under Weak Supervision with Performance Guarantees

Alessio Mazzetto, Cyrus Cousins, Dylan Sam, Stephen Bach, and Eli Upfal

Abstract

We develop a rigorous approach for using a set of arbitrarily correlated weak supervision sources in order to solve a multiclass classification task when only a very small set of labeled data is available. Our learning algorithm provably converges to a model that has minimum empirical risk with respect to an adversarial choice over feasible labelings for a set of unlabeled data, where the feasibility of a labeling is computed through constraints defined by rigorously estimated statistics of the weak supervision sources. We show theoretical guarantees for this approach that depend on the information provided by the weak supervision sources. Notably, this method does not require the weak supervision sources to have the same labeling space as the multiclass classification task. We demonstrate the effectiveness of our approach with experiments on various image classification tasks.
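The adversary described above selects, among all labelings consistent with the estimated weak-source statistics, the one that maximizes the model's empirical risk. Because expected 0-1 risk is linear in a soft labeling, this inner maximization can be posed as a linear program. Below is an illustrative sketch of that idea, not the authors' implementation: all names, the synthetic data, and the accuracy intervals are hypothetical, and constraints stand in for the rigorously estimated source statistics in the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, k = 30, 3              # unlabeled examples, number of classes
m = 2                     # weak supervision sources

# Hypothetical inputs: the model's predicted class distributions and
# each weak source's hard vote on every unlabeled example.
P = rng.dirichlet(np.ones(k), size=n)          # model predictions, n x k
votes = rng.integers(0, k, size=(m, n))        # votes[j, i] in {0..k-1}

# Accuracy intervals per source (illustrative numbers standing in for
# estimates from a small labeled sample plus a confidence width).
acc_lo = np.array([0.50, 0.45])
acc_hi = np.array([0.85, 0.80])

# Decision variable: a soft labeling Q (n*k entries, row-major).
# The adversary maximizes (1/n) sum_i sum_y Q[i,y] * (1 - P[i,y]),
# which is linear in Q; negate it for linprog's minimizer.
c = -((1.0 - P).ravel()) / n

# Equality constraints: each row of Q is a probability distribution.
A_eq = np.zeros((n, n * k))
for i in range(n):
    A_eq[i, i * k:(i + 1) * k] = 1.0
b_eq = np.ones(n)

# Inequality constraints: each source's accuracy under Q,
# (1/n) sum_i Q[i, votes[j, i]], stays inside its interval.
A_ub, b_ub = [], []
for j in range(m):
    row = np.zeros(n * k)
    for i in range(n):
        row[i * k + votes[j, i]] = 1.0 / n
    A_ub.append(row);  b_ub.append(acc_hi[j])    # accuracy <= upper bound
    A_ub.append(-row); b_ub.append(-acc_lo[j])   # accuracy >= lower bound

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=A_eq, b_eq=b_eq, bounds=(0.0, 1.0))
worst_case_risk = -res.fun
print(f"worst-case empirical risk: {worst_case_risk:.3f}")
```

In the full algorithm this worst-case risk would serve as the objective the learner minimizes over models, alternating with the adversary's labeling choice; the sketch fixes the model to show only the adversary's linear program.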

Keywords

adversarial learning ♦ semi-supervised learning ♣ weak supervision ♥ computational learning theory ♠ statistical learning theory

Read the full paper



Materials

Slides · Video
