Generalization Bounds for Deep Transfer Learning Using Majority Predictor Accuracy

Cuong N. Nguyen, Lam Si Tung Ho, Vu Dinh, Tal Hassner, Cuong V. Nguyen

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution - peer-review

Abstract

We analyze new generalization bounds for deep learning models trained by transfer learning from a source to a target task. Our bounds utilize a quantity called the majority predictor accuracy, which can be computed efficiently from data. We show that our theory is useful in practice since it implies that the majority predictor accuracy can be used as a transferability measure, a fact that is also validated by our experiments.
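The paper gives the formal definition of the majority predictor accuracy; as a rough illustration only, the sketch below assumes it is the empirical accuracy of the predictor that maps each source (pseudo-)label to the target label most frequently co-occurring with it. The function name majority_predictor_accuracy and the paired label-array input format are assumptions made for this example, not the authors' code.

```python
import numpy as np

def majority_predictor_accuracy(source_labels, target_labels):
    """Hypothetical helper: accuracy of the predictor that outputs, for each
    source label value, the most frequent target label observed with it."""
    source_labels = np.asarray(source_labels)
    target_labels = np.asarray(target_labels)
    correct = 0
    for z in np.unique(source_labels):
        # Target labels of all examples whose source label equals z.
        targets_given_z = target_labels[source_labels == z]
        # The majority predictor outputs the most common target label for z,
        # so it is correct on exactly that many of these examples.
        correct += np.bincount(targets_given_z).max()
    return correct / len(target_labels)

# Toy example: labels from a source model vs. ground-truth target labels.
z = [0, 0, 0, 1, 1, 2, 2, 2]
y = [0, 0, 1, 1, 1, 2, 0, 2]
print(majority_predictor_accuracy(z, y))  # 0.75
```

Under this reading, the quantity needs only counting over label pairs, which is consistent with the abstract's claim that it can be computed efficiently from data.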

Original language: English
Title of host publication: Proceedings of 2022 International Symposium on Information Theory and Its Applications, ISITA 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 139-143
Number of pages: 5
ISBN (Electronic): 9784885523410
State: Published - 2022
Externally published: Yes
Event: 17th International Symposium on Information Theory and Its Applications, ISITA 2022 - Tsukuba, Ibaraki, Japan
Duration: 17 Oct 2022 - 19 Oct 2022

Publication series

Name: Proceedings of 2022 International Symposium on Information Theory and Its Applications, ISITA 2022

Conference

Conference: 17th International Symposium on Information Theory and Its Applications, ISITA 2022
Country/Territory: Japan
City: Tsukuba, Ibaraki
Period: 17/10/22 - 19/10/22

Bibliographical note

Publisher Copyright:
© 2022 IEICE.
