Certified Continual Learning for Neural Network Regression
On the one hand, there has been considerable progress on neural network verification in recent years, which makes certifying neural networks a possibility. On the other hand, neural networks in practice are often re-trained over time to cope with new data distributions or to solve different tasks (a.k.a. continual learning). Once re-trained, the verified correctness of the neural network is likely broken, particularly in the presence of the phenomenon known as catastrophic forgetting. In this work, we propose an approach called certified continual learning, which improves existing continual learning methods by preserving, for as long as possible, the established correctness properties of a verified network. Our approach is evaluated with multiple neural networks and on two different continual learning methods. The results show that our approach is efficient, and that the trained models preserve their certified correctness and often maintain high utility.
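The abstract does not spell out how verified properties are carried through re-training, so the sketch below is only an illustration of the general idea, not the paper's method: a regression network is fine-tuned on new-task data while a soft penalty discourages outputs from leaving a previously certified input-output bound. Everything here (certified_boxes, property_penalty, the weight lam, and the synthetic data) is a hypothetical placeholder.

```python
# Hedged sketch: continual fine-tuning with a soft penalty that discourages
# violating a previously certified output bound. Illustrative only; not the
# mechanism described in the paper.
import torch
import torch.nn as nn

torch.manual_seed(0)

# A small regression network assumed to have been verified beforehand.
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))

# Hypothetical certified property: for inputs in this box, the output was
# verified to stay within [out_lo, out_hi].
certified_boxes = [
    (torch.zeros(4), torch.ones(4), -1.0, 1.0),  # (lower, upper, out_lo, out_hi)
]

def property_penalty(model, boxes, n_samples=64):
    """Soft surrogate for the certified property: sample points from each
    verified input box and penalize outputs outside the certified range."""
    penalty = torch.tensor(0.0)
    for lo, hi, out_lo, out_hi in boxes:
        x = lo + (hi - lo) * torch.rand(n_samples, lo.numel())
        y = model(x).squeeze(-1)
        penalty = penalty + torch.relu(y - out_hi).mean() + torch.relu(out_lo - y).mean()
    return penalty

# Synthetic new-task regression data, standing in for the new distribution.
x_new = torch.randn(256, 4)
y_new = x_new.sum(dim=1, keepdim=True)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()
lam = 10.0  # assumed weight on the property-preservation term

for step in range(200):
    opt.zero_grad()
    loss = mse(model(x_new), y_new) + lam * property_penalty(model, certified_boxes)
    loss.backward()
    opt.step()
```

In practice such a penalty only encourages the property to hold; the certified guarantee would still need to be re-checked with a verifier after training, which is the kind of gap the paper's approach is meant to address.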
Wed 18 Sep
15:30 - 17:10 | Testing and Repairing Neural Networks (Technical Papers) at EI 9 Hlawka
Chair(s): Mike Papadakis (University of Luxembourg)

15:30 (20m) Talk | Interoperability in Deep Learning: A User Survey and Failure Analysis of ONNX Model Converters
Technical Papers. Purvish Jajal (Purdue University), Wenxin Jiang (Purdue University), Arav Tewari (Purdue University), Erik Kocinare (Purdue University), Joseph Woo (Purdue University), Anusha Sarraf (Purdue University), Yung-Hsiang Lu (Purdue University), George K. Thiruvathukal (Loyola University Chicago), James C. Davis (Purdue University). DOI, Pre-print

15:50 (20m) Talk | Interpretability Based Neural Network Repair
Technical Papers. Zuohui Chen (Zhejiang University of Technology; Binjiang Institute of Artificial Intelligence), Jun Zhou (Zhejiang University of Technology; Binjiang Institute of Artificial Intelligence), Youcheng Sun (University of Manchester), Jingyi Wang (Zhejiang University), Qi Xuan (Zhejiang University of Technology; Binjiang Institute of Artificial Intelligence), Xiaoniu Yang (Zhejiang University of Technology; National Key Laboratory of Electromagnetic Space Security). DOI

16:10 (20m) Talk | See the Forest, not Trees: Unveiling and Escaping the Pitfalls of Error-Triggering Inputs in Neural Network Testing
Technical Papers. Yuanyuan Yuan (Hong Kong University of Science and Technology), Shuai Wang (Hong Kong University of Science and Technology), Zhendong Su (ETH Zurich). DOI

16:30 (20m) Talk | Isolation-Based Debugging for Neural Networks
Technical Papers. Jialuo Chen (Zhejiang University), Jingyi Wang (Zhejiang University), Youcheng Sun (University of Manchester), Peng Cheng (Zhejiang University), Jiming Chen (Zhejiang University; Hangzhou Dianzi University). DOI

16:50 (20m) Talk | Certified Continual Learning for Neural Network Regression
Technical Papers. DOI