ISSTA 2024
Mon 16 - Fri 20 September 2024, Vienna, Austria
co-located with ISSTA/ECOOP 2024
Wed 18 Sep 2024 11:10 - 11:30 at EI 7 - LLMs for Code Generation Chair(s): Chao Peng

The rise of pre-trained code models has significantly enhanced coding tasks such as code completion and powers tools like GitHub Copilot. However, the substantial size of these models, especially the largest ones, makes fine-tuning them for specific downstream tasks prohibitively expensive. Retrieval-based methods have emerged as a promising alternative, augmenting model predictions without any fine-tuning.

Despite their potential, these methods are often designed heuristically, leaving open critical questions: what information should be stored or retrieved, and how should that information be interpolated into the model's predictions?

To tackle this challenge, we first perform a theoretical analysis of the fine-tuning process, highlighting the importance of delta logits as a catalyst for improving model predictions. Building on this insight, we develop a novel retrieval-based method, FT2Ra, which aims to mimic genuine fine-tuning. While FT2Ra is built on a retrieval-based mechanism, it uniquely incorporates a learning rate and multi-epoch retrieval, mirroring the fine-tuning paradigm.
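For intuition only, the sketch below illustrates one way such delta-logit interpolation could work in Python with NumPy. It is not the authors' implementation: the datastore layout, the cosine-similarity retrieval, and all names (`keys`, `deltas`, `lr`, `rounds`) are illustrative assumptions based on the abstract.

```python
import numpy as np

# Minimal sketch of the delta-logit idea described in the abstract; NOT the
# authors' code. "Delta logits" are taken to be the logit shift that
# fine-tuning on a stored context would have produced.

def retrieve_delta_logits(keys, deltas, query, k=8):
    """Mean delta logits of the k stored contexts most similar to the query."""
    sims = keys @ query / (
        np.linalg.norm(keys, axis=1) * np.linalg.norm(query) + 1e-8
    )
    nearest = np.argsort(-sims)[:k]          # indices of the k nearest keys
    return deltas[nearest].mean(axis=0)      # (vocab_size,) averaged shift

def augmented_next_token(base_logits, query, keys, deltas, lr=0.5, rounds=3):
    """Shift the frozen model's logits by retrieved delta logits, scaled by a
    learning-rate-like factor. The loop stands in for the multi-epoch
    retrieval mentioned in the abstract; the real method presumably updates
    retrieval state between rounds, which this sketch omits."""
    logits = np.asarray(base_logits, dtype=float).copy()
    for _ in range(rounds):
        logits += lr * retrieve_delta_logits(keys, deltas, query)
    return int(np.argmax(logits))            # predicted next-token id
```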

We conducted a comprehensive evaluation of FT2Ra on both token-level and line-level code completion. Our findings demonstrate the effectiveness of FT2Ra compared to state-of-the-art methods and its potential to approximate genuine fine-tuning.

In token-level completion, the relatively easier task, FT2Ra achieves a 4.29% accuracy improvement over the best baseline method on UniXcoder. In the more challenging line-level completion task, we observe a more than twofold increase in Exact Match (EM) performance, underscoring the benefit of our theoretical analysis. Notably, even without any actual fine-tuning, FT2Ra is competitive with models that have been genuinely fine-tuned.
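As a reminder of the metric, line-level Exact Match counts predictions that reproduce the reference line verbatim. A minimal sketch follows; the whitespace normalization is an assumption, since the paper's exact rules are not given here.

```python
def exact_match(predictions, references):
    """Fraction of predicted lines identical to the reference line.
    Stripping surrounding whitespace is an assumed normalization."""
    assert len(predictions) == len(references)
    hits = sum(p.strip() == r.strip() for p, r in zip(predictions, references))
    return hits / len(references)

# Example: one of two predictions matches exactly -> EM = 0.5
print(exact_match(["return x + 1", "y = 0"], ["return x + 1", "y = 1"]))
```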

Wed 18 Sep

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

10:30 - 11:50
LLMs for Code Generation (Technical Papers) at EI 7
Chair(s): Chao Peng ByteDance
10:30
20m
Talk
AI Coders Are among Us: Rethinking Programming Language Grammar towards Efficient Code Generation (ACM SIGSOFT Distinguished Paper Award)
Technical Papers
Zhensu Sun Singapore Management University, Xiaoning Du Monash University, Zhou Yang Singapore Management University, Li Li Beihang University, David Lo Singapore Management University
DOI Pre-print
10:50
20m
Talk
When to Stop? Towards Efficient Code Generation in LLMs with Excess Token Prevention (ACM SIGSOFT Distinguished Paper Award)
Technical Papers
Lianghong Guo Sun Yat-sen University, Yanlin Wang Sun Yat-sen University, Ensheng Shi Xi’an Jiaotong University, Wanjun Zhong Sun Yat-sen University, Hongyu Zhang Chongqing University, Jiachi Chen Sun Yat-sen University, Ruikai Zhang Huawei Cloud Computing Technologies, Yuchi Ma Huawei Cloud Computing Technologies, Zibin Zheng Sun Yat-sen University
DOI
11:10
20m
Talk
FT2Ra: A Fine-Tuning-Inspired Approach to Retrieval-Augmented Code Completion
Technical Papers
Qi Guo Tianjin University, Xiaohong Li Tianjin University, Xiaofei Xie Singapore Management University, Shangqing Liu Nanyang Technological University, Ze Tang Nanjing University, Ruitao Feng Singapore Management University, Junjie Wang Tianjin University, Jidong Ge Nanjing University, Lei Bu Nanjing University
DOI
11:30
20m
Talk
Calico: Automated Knowledge Calibration and Diagnosis for Elevating AI Mastery in Code Tasks
Technical Papers
Yuxin Qiu University of California at Riverside, Jie Hu University of California at Riverside, Qian Zhang University of California at Riverside, Heng Yin University of California at Riverside
DOI

Information for Participants
Wed 18 Sep 2024 10:30 - 11:50 at EI 7 - LLMs for Code Generation Chair(s): Chao Peng
Info for room EI 7:

Map: https://tuw-maps.tuwien.ac.at/?q=CDEG13

Room tech: https://raumkatalog.tiss.tuwien.ac.at/room/15417