Penalizing small errors using an adaptive logarithmic loss

Document Type

Conference Proceeding

Publication Title

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Abstract

Loss functions are error metrics that quantify the difference between a prediction and its corresponding ground truth. Fundamentally, they define a functional landscape for traversal by gradient descent. Although numerous loss functions have been proposed to handle various machine learning problems, little attention has been given to enhancing these functions to better traverse the loss landscape. In this paper, we simultaneously and significantly mitigate two prominent problems in medical image segmentation: i) class imbalance between foreground and background pixels, and ii) poor loss function convergence. To this end, we propose an Adaptive Logarithmic Loss (ALL) function. We compare this loss function with the existing state of the art on the ISIC 2018 dataset, the nuclei segmentation dataset, and the DRIVE retinal vessel segmentation dataset. We evaluate our methodology on benchmark metrics and demonstrate state-of-the-art performance. More generally, we show that our system can be used as a framework for better training of deep neural networks.
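
The abstract names the Adaptive Logarithmic Loss (ALL) but does not give its functional form. As a minimal illustrative sketch only, the PyTorch snippet below shows one way a logarithmic rescaling of a soft Dice loss could penalize small errors more strongly; the function name, the threshold, and the 1/ln(2) continuity constant are assumptions made for this example, not the paper's definition.

```python
import math

import torch


def adaptive_log_dice_loss(pred, target, threshold=0.3, smooth=1.0):
    """Soft Dice loss with a logarithmic rescaling of small losses.

    `pred` and `target` are (N, ...) tensors of foreground probabilities
    and binary masks; `threshold` and the 1/ln(2) constant are assumed
    values for illustration, not taken from the paper.
    """
    p = pred.reshape(pred.size(0), -1)
    t = target.reshape(target.size(0), -1)
    inter = (p * t).sum(dim=1)
    dice = (2.0 * inter + smooth) / (p.sum(dim=1) + t.sum(dim=1) + smooth)
    dl = 1.0 - dice  # per-sample soft Dice loss in [0, 1]

    # Below `threshold`, switch to a log-scaled branch. Its gradient at
    # dl == 0 is 1/ln(2) ~ 1.44, so small residual errors keep a strong
    # gradient instead of flattening out; the scaling also makes the two
    # branches meet exactly at dl == threshold.
    gamma = 1.0 / math.log(2.0)
    log_branch = gamma * threshold * torch.log1p(dl / threshold)
    return torch.where(dl < threshold, log_branch, dl).mean()
```

For binary segmentation this would be called on probabilities, e.g. loss = adaptive_log_dice_loss(torch.sigmoid(logits), masks); the threshold controls where the ordinary linear Dice regime hands over to the log-scaled one.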

First Page

368

Last Page

375

DOI

10.1007/978-3-030-68763-2_28

Publication Date

2-21-2021

Keywords

Class imbalance, FocusNet, Loss functions, Semantic segmentation, U-Net

Comments

IR Deposit conditions:

  • OA version (pathway a)
  • Accepted version: 12-month embargo
  • Must link to published article
  • Set statement to accompany deposit
