
Chapter

Title

Regret Bounds for Multilabel Classification in Sparse Label Regimes

Authors

[1] Institute of Computing Science, Faculty of Computing and Telecommunications, Poznań University of Technology (Politechnika Poznańska) | [P] employee

Scientific discipline (Law 2.0)

[2.3] Information and communication technology

Year of publication

2022

Chapter type

chapter in monograph / paper

Publication language

English

Abstract

Multi-label classification (MLC) has wide practical importance, but the theoretical understanding of its statistical properties is still limited. As an attempt to fill this gap, we thoroughly study upper and lower regret bounds for two canonical MLC performance measures, Hamming loss and Precision@κ. We consider two different statistical and algorithmic settings: a non-parametric setting tackled by plug-in classifiers à la k-nearest neighbors, and a parametric one tackled by empirical risk minimization operating on surrogate loss functions. For both, we analyze the interplay between a natural MLC variant of the low-noise assumption, widely studied in binary classification, and label sparsity, the latter being a natural property of large-scale MLC problems. We show that these conditions are crucial for improving the bounds, but the way they interact is not obvious, and it also differs across the two settings.
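
For context, the two performance measures named in the abstract have standard definitions in the multi-label literature. The sketch below states them in plain NumPy; the function names and the 0.5 threshold in the usage example are illustrative choices, not taken from the chapter.

import numpy as np

def hamming_loss(y_true, y_pred):
    # Fraction of (example, label) positions where the binary prediction
    # disagrees with the ground truth, averaged over all positions.
    return np.mean(y_true != y_pred)

def precision_at_k(y_true, scores, k):
    # For each example, take the k labels with the highest scores and
    # compute the fraction of them that are relevant; average over examples.
    top_k = np.argsort(-scores, axis=1)[:, :k]
    hits = np.take_along_axis(y_true, top_k, axis=1)
    return hits.mean()

# Usage (toy data):
y_true = np.array([[1, 0, 1, 0], [0, 1, 0, 0]])
scores = np.array([[0.9, 0.2, 0.4, 0.1], [0.3, 0.8, 0.6, 0.2]])
y_pred = (scores > 0.5).astype(int)
print(hamming_loss(y_true, y_pred))       # 0.25
print(precision_at_k(y_true, scores, 2))  # 0.75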

URL

https://proceedings.neurips.cc/paper_files/paper/2022/file/240d297094fc76d1e7aa27b01f221b00-Paper-Conference.pdf

Book

Advances in Neural Information Processing Systems 35 (NeurIPS 2022)

Presented at

36th Conference on Neural Information Processing Systems (NeurIPS 2022), 29.11.2022 - 01.12.2022, New Orleans, United States

Open Access Mode

publisher's website

Open Access Text Version

final published version

Date of Open Access to the publication

at the time of publication

Ministry points / chapter

5

Ministry points / conference (CORE)

200