Automatic Schelling Point Detection from Meshes

Publication Title

IEEE Transactions on Visualization and Computer Graphics


Abstract

Mesh Schelling points describe how humans focus on specific regions of a 3D object. They have many important applications in computer graphics and provide valuable information for perceptual psychology studies. However, detecting mesh Schelling points is time-consuming and expensive, since existing techniques are mostly based on participant observation studies. To overcome these limitations, we propose to employ deep learning techniques to detect mesh Schelling points automatically, without participant observation studies. Specifically, we utilize mesh convolution and pooling operations to extract informative features from mesh objects, and then predict the 3D heat map of Schelling points in an end-to-end manner. We propose a Deep Schelling Network (DS-Net) to automatically detect Schelling points, including a multi-scale fusion component and a novel region-specific loss function that improves heat-map regression. To the best of our knowledge, DS-Net is the first deep neural network for detecting Schelling points from 3D meshes. We evaluate DS-Net on a mesh Schelling point dataset obtained from participant observation studies. The experimental results demonstrate that DS-Net detects mesh Schelling points effectively and outperforms various state-of-the-art mesh saliency methods and deep learning models, both qualitatively and quantitatively.
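The abstract describes regressing a per-vertex heat map and weighting the loss by region. The paper's actual DS-Net loss is not given here, so the sketch below is only an illustration of the general idea: a per-vertex squared error that is up-weighted on "hot" regions of the target heat map. The `hot_weight` and `threshold` parameters are hypothetical.

```python
import numpy as np

def region_weighted_mse(pred, target, hot_weight=5.0, threshold=0.5):
    """Illustrative region-specific loss for heat-map regression.

    pred, target: per-vertex heat values in [0, 1].
    Vertices whose target heat exceeds `threshold` (assumed "Schelling
    regions") are weighted by `hot_weight`; the rest get weight 1.
    Both parameters are assumptions, not values from the paper.
    """
    weights = np.where(target > threshold, hot_weight, 1.0)
    return float(np.mean(weights * (pred - target) ** 2))

# Toy per-vertex heat maps for a 4-vertex mesh.
target = np.array([0.9, 0.1, 0.0, 0.8])
pred = np.array([0.7, 0.2, 0.1, 0.6])
loss = region_weighted_mse(pred, target)
```

Such a weighting focuses the regression on the sparse high-heat regions that would otherwise be dominated by the many near-zero background vertices.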


Keywords

Deep learning, Deep Neural Network, Feature extraction, Geometric Deep Learning, Heat Map Regression, Heating systems, Image edge detection, Mesh Schelling Points, Point cloud compression, Shape, Three-dimensional displays


IR Deposit conditions:

  • OA version (pathway a)
  • Accepted version: No embargo
  • When accepted for publication, set statement to accompany deposit (see policy)
  • Must link to publisher version with DOI
  • Publisher copyright and source must be acknowledged