Search Results

  • Article
    Zeng X, Guo Y, Li L, Liu Y.
    Comput Biol Med. 2024 Sep;179:108914.
    BACKGROUND: When multiple tasks are learned consecutively, parameters fitted to earlier tasks may be overwritten by new data, so the model learns the new task while forgetting the old ones; this is known as catastrophic forgetting. Moreover, continual learning has no mature solution for image denoising tasks.
    METHODS: To address the catastrophic forgetting caused by learning multiple denoising tasks, we propose a Triplet Neural-networks Collaboration-continuity DeNoising (TNCDN) model, in which three networks update one another cooperatively. Knowledge from two denoising networks that maintain continual-learning capability is transferred to the main denoising network, which thereby acquires new knowledge while consolidating old knowledge. A co-training mechanism is designed in which the main denoising network updates the other two networks with different thresholds, preserving both memory-reinforcement and knowledge-extension capabilities (a minimal code sketch of this mechanism follows the abstract).
    RESULTS: The experimental results show that our method effectively alleviates catastrophic forgetting. On the GS, CT, and ADNI datasets, compared with ANCL, the TNCDN(PromptIR) method reduced the average degree of forgetting by 2.38 (39%) on the PSNR metric and by 1.63 (55%) on RMSE.
    CONCLUSION: This study addresses the problem of catastrophic forgetting caused by learning multiple denoising tasks. Although the experimental results are promising, extending the base denoising model to more datasets and tasks would broaden its applicability. Nevertheless, this study is a starting point that can provide a reference and support for the further development of continual-learning image denoising.
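The co-training mechanism described in METHODS lends itself to a short illustration. The sketch below is not the authors' TNCDN implementation; it is a minimal interpretation under stated assumptions: a toy residual CNN stands in for the denoising backbone (the reported variant builds on PromptIR), knowledge transfer is modeled as MSE distillation from the two auxiliary networks, and the "different thresholds" are modeled as two illustrative gates (THRESH_A, THRESH_B) on EMA write-backs from the main network. All names and hyperparameters here are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallDenoiser(nn.Module):
    """Toy residual CNN denoiser (placeholder for the paper's backbone)."""
    def __init__(self, ch=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 1, 3, padding=1),
        )

    def forward(self, x):
        return x - self.body(x)  # predict and subtract the noise residual

main_net = SmallDenoiser()                       # main denoising network
mem_a, mem_b = SmallDenoiser(), SmallDenoiser()  # the two continual-learning networks
opt = torch.optim.Adam(main_net.parameters(), lr=1e-4)

DISTILL_W = 0.5                  # assumed distillation weight
THRESH_A, THRESH_B = 0.01, 0.05  # assumed, deliberately different thresholds

@torch.no_grad()
def maybe_update(target, source, improvement, threshold, ema=0.99):
    """EMA write-back from the main network, gated by a per-network threshold."""
    if improvement > threshold:
        for pt, ps in zip(target.parameters(), source.parameters()):
            pt.mul_(ema).add_(ps, alpha=1.0 - ema)

# Tiny synthetic batch so the sketch runs standalone.
clean = torch.rand(4, 1, 32, 32)
noisy = clean + 0.1 * torch.randn_like(clean)
loader = [(noisy, clean)]

prev_loss = float("inf")
for noisy_b, clean_b in loader:
    out = main_net(noisy_b)
    task_loss = F.mse_loss(out, clean_b)
    # Transfer knowledge from both auxiliary networks into the main network.
    with torch.no_grad():
        ta, tb = mem_a(noisy_b), mem_b(noisy_b)
    distill = F.mse_loss(out, ta) + F.mse_loss(out, tb)
    loss = task_loss + DISTILL_W * distill
    opt.zero_grad()
    loss.backward()
    opt.step()

    improvement = prev_loss - task_loss.item()
    prev_loss = task_loss.item()
    # Write back with different thresholds: a permissive gate (knowledge
    # extension) and a conservative gate (memory reinforcement).
    maybe_update(mem_a, main_net, improvement, THRESH_A)
    maybe_update(mem_b, main_net, improvement, THRESH_B)
```

The two thresholds are the design point: the readily updated network tracks the current task, while the conservatively updated one changes slowly and so retains earlier tasks, which is one plausible reading of how the abstract's memory-reinforcement and knowledge-extension roles could be separated.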