Paper 12033-123

Contrastive learning meets transfer learning: a case study in medical image analysis

In person: 23 February 2022 • 5:30 PM - 7:00 PM PST

Abstract

Annotated medical images are typically far scarcer than labeled natural images, since annotation requires domain expertise and data sharing is limited by privacy constraints. Recent advances in transfer and contrastive learning offer effective, complementary ways to tackle this scarcity. However, state-of-the-art transfer learning (e.g., Big Transfer (BiT)) and contrastive learning (e.g., Simple Siamese Contrastive Learning (SimSiam)) approaches have so far been investigated independently, without exploiting their complementary nature. It would be appealing to accelerate contrastive learning with transfer learning, given that slow convergence is a critical limitation of modern contrastive learning approaches. In this paper, we investigate the feasibility of combining BiT with SimSiam. The results suggest that BiT pretrained models accelerate the convergence of SimSiam, and that the combined model outperforms both of its counterparts.
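As background for the approach sketched in the abstract, SimSiam trains two augmented views of an image through a shared encoder and minimizes a symmetric negative cosine similarity between a predictor output and a stop-gradiented projection. The minimal sketch below computes that loss on plain Python lists; in the setting the paper describes, the shared encoder would be initialized from BiT pretrained weights rather than trained from scratch. Function names here (`neg_cosine`, `simsiam_loss`) are illustrative, not from the paper.

```python
import math

def neg_cosine(p, z):
    # SimSiam's D(p, z): negative cosine similarity between the
    # predictor output p and the projection z. In a real training
    # loop, z is detached (stop-gradient); with plain lists there is
    # no autograd, so this only computes the scalar value.
    dot = sum(a * b for a, b in zip(p, z))
    norm_p = math.sqrt(sum(a * a for a in p))
    norm_z = math.sqrt(sum(b * b for b in z))
    return -dot / (norm_p * norm_z)

def simsiam_loss(p1, z1, p2, z2):
    # Symmetric loss over the two augmented views:
    #   L = D(p1, z2) / 2 + D(p2, z1) / 2
    return 0.5 * neg_cosine(p1, z2) + 0.5 * neg_cosine(p2, z1)
```

When the two views' predictor outputs and projections align perfectly, the loss reaches its minimum of -1; initializing the encoder from BiT weights is what the paper reports as speeding up this convergence.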

Presenter

Vanderbilt Univ. (United States)
Undergraduate student at Vanderbilt University.
Presenter/Author
Vanderbilt Univ. (United States)
Author
Aadarsh Jha
Vanderbilt Univ. (United States)
Author
Vanderbilt Univ. (United States)