Locality Sensitive Teaching

Document Type

Conference Proceeding

Publication Title

Advances in Neural Information Processing Systems

Abstract

The emergence of the Internet-of-Things (IoT) sheds light on applying machine teaching (MT) algorithms for online personalized education on home devices. This direction became even more promising during the COVID-19 pandemic, when in-person education was infeasible. However, iterative machine teaching (IMT), one of the most influential and practical MT paradigms, is prohibitively expensive on IoT devices because its algorithms are inefficient and unscalable. IMT is a paradigm in which a teacher feeds examples iteratively and intelligently based on the learner's status. In each iteration, current IMT algorithms greedily traverse the whole training set to find an example for the learner, which is computationally expensive in practice. We propose a novel teaching framework, Locality Sensitive Teaching (LST), based on locality sensitive sampling, to overcome these challenges. LST has provable near-constant time complexity, which is exponentially better than the existing baseline. With up to 425.12× speedups and 99.76% energy savings over IMT, LST is the first algorithm that enables energy- and time-efficient machine teaching on IoT devices. Owing to LST's substantial efficiency and scalability, it is readily applicable in real-world education scenarios. © 2021 Neural Information Processing Systems Foundation. All rights reserved.
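The abstract's central contrast — a greedy teacher that scans the whole training set each iteration versus a locality-sensitive sampler with near-constant lookup — can be sketched as follows. This is not the authors' implementation; it is a minimal illustration on a linear least-squares learner, where names such as `greedy_select` and `lsh_select` and the SimHash-style table are illustrative assumptions standing in for LST's actual sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 8, 2000
X = rng.standard_normal((n, d))           # pool of teaching examples
w_star = rng.standard_normal(d)           # target model the teacher wants taught
y = X @ w_star                            # consistent labels
eta = 0.05                                # learner's step size

def grad(w, x, yi):
    # gradient of 0.5 * (w.x - y)^2 with respect to w
    return (w @ x - yi) * x

def greedy_select(w):
    # IMT-style greedy teacher: scan ALL n examples and pick the one whose
    # single gradient step moves the learner closest to w_star -> O(nd) per iteration
    steps = w - eta * (X @ w - y)[:, None] * X   # all candidate next learner states
    return int(np.argmin(np.linalg.norm(steps - w_star, axis=1)))

# SimHash-style LSH table over the example pool, built once offline
K = 10                                    # number of hash bits
H = rng.standard_normal((d, K))           # random hyperplanes
def simhash(v):
    return tuple((v @ H > 0).astype(int))

buckets = {}
for i, x in enumerate(X):
    buckets.setdefault(simhash(x), []).append(i)

def lsh_select(w):
    # LST-style teacher: hash a query direction and sample a colliding example
    # -> near-constant time per iteration instead of a full scan
    q = w - w_star                        # direction the learner should move along
    cand = buckets.get(simhash(q), [])
    if not cand:                          # empty bucket: fall back to a random example
        return int(rng.integers(n))
    return int(rng.choice(cand))

def teach(select, iters=300):
    # simulate the learner taking one gradient step per teacher-chosen example
    w = np.zeros(d)
    for _ in range(iters):
        i = select(w)
        w = w - eta * grad(w, X[i], y[i])
    return np.linalg.norm(w - w_star)     # residual distance to the target model

err_greedy = teach(greedy_select)
err_lsh = teach(lsh_select)
print(f"greedy teacher residual: {err_greedy:.4f}")
print(f"LSH teacher residual:    {err_lsh:.4f}")
```

The design point the sketch illustrates: both teachers drive the learner toward `w_star`, but the greedy teacher pays an O(n) scan per iteration while the hash-based teacher does a single bucket lookup, which is the efficiency gap the paper's speedup and energy numbers quantify.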

First Page

18049

Last Page

18062

Publication Date

12-6-2021

Keywords

Energy conservation, Iterative methods, Learning systems, Constant time complexity, Energy savings, Home devices, Locality sensitive sampling, Teaching algorithms, Teaching paradigm, Training sets

Comments

IR Deposit conditions: not specified

Proceedings available in NeurIPS 2021
