---
language:
- en
library_name: pytorch
pipeline_tag: text-generation
tags:
- code
- deepseek
- gguf
- f32
- f16
- humaneval
license: mit
---
# Model Card for wavecoder-ds-6.7b-GGUF
WaveCoder 🌊 is a series of large language models (LLMs) for the coding domain.
Apologies for the incomplete model details; the upstream GitHub repo doesn't exist yet, and I'm currently working on quantizing all the models.
## Model Details
### Model Description
WaveCoder 🌊 is a series of large language models (LLMs) for the coding domain, designed to solve code-related problems through instruction-following learning. Its training dataset was generated from a subset of code-search-net data with an LLM-based generator-discriminator framework proposed by the authors, and covers four general code-related tasks: code generation, code summarization, code translation, and code repair.
- Developed by: Zhaojian Yu, Xin Zhang, Ning Shang, Yangyu Huang, Can Xu, Yishujie Zhao, Wenxiang Hu, Qiufeng Yin
- Model type: Large Language Model
- Language(s) (NLP): English
- License: DeepSeek License (Model)
### Model Sources
- Repository: [More Information Needed]
- Paper [optional]: [More Information Needed]
- Demo [optional]: [More Information Needed]
## Uses
Coding tasks such as code generation, summarization, translation, and repair.
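Below is a minimal sketch of running one of the GGUF files from this repository locally with the llama-cpp-python bindings. The file name `wavecoder-ds-6.7b.f16.gguf`, the prompt, and the sampling settings are assumptions for illustration, not part of this repository; point `model_path` at whichever `.gguf` file you downloaded.

```python
# Minimal sketch: run a GGUF build of WaveCoder locally via llama-cpp-python.
# The model file name below is hypothetical; substitute the .gguf file you
# actually downloaded from this repository.
from llama_cpp import Llama

llm = Llama(
    model_path="wavecoder-ds-6.7b.f16.gguf",  # hypothetical file name
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to GPU if available; set 0 for CPU-only
)

prompt = "Write a Python function that checks whether a string is a palindrome."
out = llm(prompt, max_tokens=256, temperature=0.2)
print(out["choices"][0]["text"])
```

The same GGUF file should also load in other llama.cpp-based runtimes.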