Efficient DNN-Powered Software with Fair Sparse Models
With the emergence of the Software 3.0 era, there is a growing trend of compressing and integrating large models into software systems, with significant societal implications.
Regrettably, in numerous instances, model compression techniques degrade the fairness of these models and thus the ethical behavior of DNN-powered software.
One of the most notable examples is the Lottery Ticket Hypothesis~(LTH), a prevailing model pruning approach.
This paper demonstrates that the fairness issue of LTH-based pruning arises from both its subnetwork selection and its training procedure, highlighting the inadequacy of existing remedies.
To address this, we propose Ballot, a novel pruning framework that employs conflict-detection-based subnetwork selection to find accurate and fair subnetworks, coupled with a refined training process to attain a high-performance model, thereby improving the fairness of DNN-powered software.
In our evaluation on five popular datasets and three widely used models, Ballot improves the fairness of pruning by 38.00%, 33.91%, 17.96%, and 35.82% over the state-of-the-art baselines Magnitude Pruning, Standard LTH, SafeCompress, and FairScratch, respectively.
Our code is available at \url{https://anonymous.4open.science/r/Ballot-506E}.
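To make the abstract's subnetwork-selection idea concrete, here is a minimal, hedged sketch in NumPy: it pairs standard magnitude-based mask selection (as used in LTH-style pruning) with a hypothetical gradient-conflict test that flags a candidate subnetwork when the task-loss and fairness-loss gradients point in opposing directions on the kept weights. The functions `magnitude_mask`, `gradient_conflict`, and `select_subnetwork`, the cosine-similarity criterion, and all thresholds are illustrative assumptions, not the actual Ballot implementation.

```python
# Minimal sketch of LTH-style magnitude pruning with a fairness-conflict
# check on the candidate subnetwork. This is NOT the authors' Ballot
# implementation: the conflict test (cosine similarity between a task-loss
# gradient and a fairness-loss gradient on the kept weights), the thresholds,
# and all function names are illustrative assumptions.
import numpy as np


def magnitude_mask(weights, sparsity):
    """Keep the largest-|w| fraction (1 - sparsity) of weights."""
    flat = np.abs(weights).ravel()
    k = int(flat.size * (1.0 - sparsity))
    if k == 0:
        return np.zeros(weights.shape, dtype=bool)
    threshold = np.partition(flat, -k)[-k]
    return np.abs(weights) >= threshold


def gradient_conflict(task_grad, fairness_grad, mask):
    """Cosine similarity between the two gradients restricted to kept weights.

    A strongly negative value means that training the selected subnetwork for
    accuracy pushes against the fairness objective (a "conflict").
    """
    g_task = task_grad[mask]
    g_fair = fairness_grad[mask]
    denom = np.linalg.norm(g_task) * np.linalg.norm(g_fair) + 1e-12
    return float(g_task @ g_fair / denom)


def select_subnetwork(weights, task_grad, fairness_grad,
                      sparsity=0.8, conflict_threshold=-0.1, step=0.05):
    """Relax the sparsity target until the kept subnetwork is conflict-free."""
    mask = magnitude_mask(weights, sparsity)
    while (gradient_conflict(task_grad, fairness_grad, mask) < conflict_threshold
           and sparsity - step > 0.0):
        sparsity -= step                      # prune less aggressively and retry
        mask = magnitude_mask(weights, sparsity)
    return mask, sparsity


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(256, 128))
    g_task = rng.normal(size=w.shape)
    # Synthetic fairness gradient that partially opposes the task gradient.
    g_fair = -0.5 * g_task + rng.normal(scale=0.5, size=w.shape)
    mask, used_sparsity = select_subnetwork(w, g_task, g_fair)
    print(f"kept {mask.mean():.1%} of weights at sparsity {used_sparsity:.2f}")
```

The relaxation loop here simply lowers the sparsity target whenever a conflict persists; Ballot's actual conflict-detection-based selection and refined training procedure are described in the paper.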
Session: Fairness and Safety of Neural Networks (Technical Papers), Wed 18 Sep, 13:30 - 14:50, room EI 7. Chair(s): Jingyi Wang (Zhejiang University).

13:30 (20m) Talk: NeuFair: Neural Network Fairness Repair with Dropout. Vishnu Asutosh Dasu (Pennsylvania State University), Ashish Kumar (Pennsylvania State University), Saeid Tizpaz-Niari (University of Texas at El Paso), Gang (Gary) Tan (Pennsylvania State University).

13:50 (20m) Talk: A Large-Scale Empirical Study on Improving the Fairness of Image Classification Models. Junjie Yang (College of Intelligence and Computing, Tianjin University), Jiajun Jiang (Tianjin University), Zeyu Sun (Institute of Software at Chinese Academy of Sciences), Junjie Chen (Tianjin University).

14:10 (20m) Talk: Efficient DNN-Powered Software with Fair Sparse Models. Xuanqi Gao (Xi'an Jiaotong University), Weipeng Jiang (Xi'an Jiaotong University), Juan Zhai (University of Massachusetts at Amherst), Shiqing Ma (University of Massachusetts at Amherst), Xiaoyu Zhang (Xi'an Jiaotong University), Chao Shen (Xi'an Jiaotong University).

14:30 (20m) Talk: Synthesizing Boxes Preconditions for Deep Neural Networks. Zengyu Liu (National University of Defense Technology), Liqian Chen (National University of Defense Technology), Wanwei Liu (National University of Defense Technology), Ji Wang (National University of Defense Technology).