Journal of Graphics ›› 2025, Vol. 46 ›› Issue (4): 826-836.DOI: 10.11996/JG.j.2095-302X.2025040826

• Computer Graphics and Virtual Reality •

Multi-dimensional implicit neural compression of time-varying volume data based on damping activation functions

GUO Rong, JIAO Chenyue, GAO Xin, DENG Jiakang, TIAN Xiaoxian, BI Chongke

  1. Department of Intelligence and Computing, Tianjin University, Tianjin 300354, China
  • Received: 2024-10-22 Revised: 2024-12-27 Online: 2025-08-30 Published: 2025-08-11
  • Contact: BI Chongke
  • About author:

    GUO Rong (2001-), master student. Her main research interest covers scientific visualization. E-mail: guorong_sx_sd@hotmail.com

  • Supported by:
    National Natural Science Foundation of China (62172294)

Abstract:

Time-varying volume data from large-scale numerical simulations holds significant value in scientific research, but its vast size poses severe challenges to I/O bandwidth and disk storage, hindering the efficiency of data visualization and analysis. Traditional lossy compression techniques tend to lose important features at high compression ratios. While deep learning models have shown good performance in compressing volumetric data, they typically require access to the entire dataset before compression, which limits their applicability. Implicit neural representations (INRs) have emerged as a powerful tool for compressing large-scale volumetric data due to their versatility. However, existing methods rely primarily on a single large multilayer perceptron (MLP) to encode the global data, resulting in slow training and inference and difficulty in accurately capturing high-frequency information. To address these issues, a uniform-partitioning implicit neural representation method based on a damping function was proposed for efficient compression of large-scale time-varying volume data. First, a uniform partitioning strategy was applied, in which multiple MLPs fit local data separately. Equalizing partition sizes balanced the workload across the MLPs, significantly improving the efficiency of both training and inference. Second, a damping function was introduced as the activation function to overcome spectral bias, enabling the capture of high-frequency information without extra positional encoding. Finally, adjacent-cell information replication was used to eliminate the boundary artifacts introduced by partitioning. Experimental results showed that this method achieved higher compression efficiency and finer detail preservation across several time-varying volume datasets, demonstrating superior compression performance.
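The partitioned-INR idea above can be sketched in a few lines: one small coordinate network per uniform block, with an amplitude-damped sinusoidal activation in place of a plain sine. The abstract does not give the exact form of the damping function, so the `sin(omega*x) * exp(-beta*x^2)` activation, the layer sizes, and the SIREN-style initialization below are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def damping_activation(x, omega=30.0, beta=0.1):
    """Hypothetical damped sine: oscillatory (captures high frequencies,
    countering spectral bias) but with amplitude decaying away from 0."""
    return np.sin(omega * x) * np.exp(-beta * x * x)

class TinyINR:
    """Minimal MLP mapping normalized (t, x, y, z) coordinates to a scalar
    voxel value. In a uniform-partitioning scheme, one such network is
    trained per equally sized block, so the networks parallelize evenly."""

    def __init__(self, in_dim=4, hidden=64, layers=3, seed=0):
        rng = np.random.default_rng(seed)
        dims = [in_dim] + [hidden] * layers + [1]
        # Uniform init scaled by fan-in keeps pre-activations in the
        # sine's useful range (a common choice for sinusoidal MLPs).
        self.weights = [
            rng.uniform(-np.sqrt(6.0 / d_in), np.sqrt(6.0 / d_in),
                        size=(d_in, d_out))
            for d_in, d_out in zip(dims[:-1], dims[1:])
        ]
        self.biases = [np.zeros(d_out) for d_out in dims[1:]]

    def __call__(self, coords):
        h = coords
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            h = damping_activation(h @ W + b)
        # Linear output layer: one scalar value per input coordinate.
        return h @ self.weights[-1] + self.biases[-1]

# One network per partition; coordinates are normalized to [-1, 1]
# within each block before being fed to that block's MLP.
net = TinyINR()
coords = np.random.default_rng(1).uniform(-1.0, 1.0, size=(8, 4))
values = net(coords)  # shape (8, 1)
```

Compression then amounts to storing each block's network weights instead of its voxels; decompression is a forward pass over the block's coordinate grid, and replicating a one-cell margin of neighboring data into each block's training set (as the abstract describes) suppresses seams at block boundaries.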

Key words: implicit neural representation, volumetric data compression, damping function, scientific visualization, spectral bias
