Every Sample Matters: Leveraging Mixture-of-Experts and High-Quality Data for Efficient and Accurate Code LLM
Abstract
Ling-Coder-Lite leverages a Mixture-of-Experts architecture for efficient code generation, achieving performance on par with state-of-the-art models while requiring fewer deployment resources.
Recent advancements in code large language models (LLMs) have demonstrated remarkable capabilities in code generation and understanding. However, building a code LLM that combines comprehensive performance with high efficiency remains challenging. The open-source community has released many attempts to break this trade-off, such as the Qwen Coder series and the DeepSeek Coder series. This paper introduces yet another attempt in this area, namely Ling-Coder-Lite. We leverage an efficient Mixture-of-Experts (MoE) architecture along with a set of high-quality data curation methods (especially those based on program analytics) to build an efficient yet powerful code LLM. Ling-Coder-Lite exhibits on-par performance on 12 representative coding benchmarks compared to state-of-the-art models of similar size, such as Qwen2.5-Coder-7B and DeepSeek-Coder-V2-Lite, while offering competitive latency and throughput. In practice, we achieve a 50% reduction in deployment resources compared to a similar-sized dense model without performance loss. To facilitate further research and development in this area, we open-source our models as well as a substantial portion of the high-quality data used in the annealing and post-training stages. The models and data can be accessed at https://huggingface.co/inclusionAI/Ling-Coder-lite.
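To make the MoE idea concrete, the sketch below shows a generic top-k routed expert feed-forward layer: only a few experts are activated per token, which is what keeps inference cost low relative to a dense model of the same total parameter count. This is an illustrative sketch of the general technique, not the Ling-Coder-Lite implementation; all layer names and sizes here are assumptions.

```python
# Minimal sketch of a top-k routed Mixture-of-Experts feed-forward layer.
# Illustrative only: sizes, activation, and routing details are assumptions,
# not the architecture used by Ling-Coder-Lite.
import torch
import torch.nn as nn


class TopKMoELayer(nn.Module):
    def __init__(self, d_model=1024, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                       # x: (num_tokens, d_model)
        gate_logits = self.router(x)            # (num_tokens, num_experts)
        weights, indices = gate_logits.softmax(dim=-1).topk(self.top_k, dim=-1)
        # Renormalize gate weights over the selected experts only.
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            token_idx, slot = (indices == e).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue  # no token routed to this expert in this batch
            out[token_idx] += weights[token_idx, slot].unsqueeze(-1) * expert(x[token_idx])
        return out
```

Because each token passes through only `top_k` of the `num_experts` expert MLPs, the per-token compute stays close to that of a much smaller dense feed-forward layer, which is the efficiency argument the abstract refers to.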
Community
The open-sourced models & datasets:
Ling-Coder-Lite: https://huggingface.co/inclusionAI/Ling-Coder-lite
Ling-Coder-Lite-base: https://huggingface.co/inclusionAI/Ling-Coder-lite-base
Ling-Coder-SyntheticQA: https://huggingface.co/datasets/inclusionAI/Ling-Coder-SyntheticQA
Ling-Coder-SFT: https://huggingface.co/datasets/inclusionAI/Ling-Coder-SFT
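A minimal sketch of how these artifacts can be loaded with the standard Hugging Face libraries is shown below. It assumes the model repository ships a chat template and custom model code (hence `trust_remote_code=True`) and that the SFT dataset exposes a `train` split; the generation settings are illustrative, not the authors' recommended configuration.

```python
# Sketch: load the open-sourced model and SFT data from the Hugging Face Hub.
# Assumptions: chat template available, custom code in the repo, "train" split.
from transformers import AutoModelForCausalLM, AutoTokenizer
from datasets import load_dataset

model_id = "inclusionAI/Ling-Coder-lite"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto", trust_remote_code=True
)

# Build a chat-formatted prompt and generate a completion.
messages = [{"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))

# Inspect a sample from the released SFT dataset.
sft = load_dataset("inclusionAI/Ling-Coder-SFT", split="train")
print(sft[0])
```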