nextai-team committed b8c5f06 (verified) · Parent: 44d69fa

Update README.md
Files changed (1): README.md (+7 -7)

README.md CHANGED
@@ -18,7 +18,7 @@ tags:
 
 Model Details
 
- Model Name: Moe-4x7b-QA-Code-Inst Publisher: nextai-team Model Type: Question Answering & Code Generation Architecture: Mixture of Experts (MoE) Model Size: 4x7 billion parameters
+ Model Name: Moe-4x7b-reason-code-qa Publisher: nextai-team Model Type: Question Answering & Code Generation Architecture: Mixture of Experts (MoE) Model Size: 4x7 billion parameters
 
 Overview
 
@@ -34,20 +34,20 @@ Model Architecture employs a Mixture of Experts (MoE) architecture, which allow
 
 Training Data The model has been trained on a diverse and extensive corpus comprising technical documentation, open-source code repositories, Stack Overflow questions and answers, and other programming-related texts. Special attention has been given to ensure a wide range of programming languages and frameworks are represented in the training data to enhance the model's versatility.
 
- Performance Moe-4x7b-QA-Code-Inst demonstrates significant improvements in accuracy and relevance over its predecessor, particularly in complex coding scenarios and detailed technical queries. Benchmarks and performance metrics can be provided upon request.
+ Performance Moe-4x7b-reason-code-qa demonstrates significant improvements in accuracy and relevance over its predecessor, particularly in complex coding scenarios and detailed technical queries. Benchmarks and performance metrics can be provided upon request.
 
 Limitations and Biases
 
- While Moe-4x7b-QA-Code-Inst represents a leap forward in AI-assisted coding and technical Q&A, it is not without limitations. The model may exhibit biases present in its training data, and its performance can vary based on the specificity and context of the input queries. Users are encouraged to critically assess the model's output and consider it as one of several tools in the decision-making process.
+ While Moe-4x7b-reason-code-qa represents a leap forward in AI-assisted coding and technical Q&A, it is not without limitations. The model may exhibit biases present in its training data, and its performance can vary based on the specificity and context of the input queries. Users are encouraged to critically assess the model's output and consider it as one of several tools in the decision-making process.
 
 Ethical Considerations
 
- We are committed to ethical AI development and urge users to employ Moe-4x7b-QA-Code-Inst responsibly. This includes but is not limited to avoiding the generation of harmful or unsafe code, respecting copyright and intellectual property rights, and being mindful of privacy concerns when inputting sensitive information into the model.
+ We are committed to ethical AI development and urge users to employ Moe-4x7b-reason-code-qa responsibly. This includes but is not limited to avoiding the generation of harmful or unsafe code, respecting copyright and intellectual property rights, and being mindful of privacy concerns when inputting sensitive information into the model.
 
 Usage Instructions
 
- For detailed instructions on how to integrate and utilize Moe-4x7b-QA-Code-Inst in your projects, please refer to our GitHub repository and Hugging Face documentation.
+ For detailed instructions on how to integrate and utilize Moe-4x7b-reason-code-qa in your projects, please refer to our GitHub repository and Hugging Face documentation.
 
- Citation If you use Moe-4x7b-QA-Code-Inst in your research or application, please cite it as follows:
+ Citation If you use Moe-4x7b-reason-code-qa in your research or application, please cite it as follows:
 
- @misc{nextai2024moe4x7b, title={Moe-4x7b-QA-Code-Inst: Enhancing Question Answering and Code Generation with Mixture of Experts}, author={NextAI Team}, year={2024}, publisher={Hugging Face} }
+ @misc{nextai2024moe4x7b, title={Moe-4x7b-reason-code-qa: Enhancing Question Answering and Code Generation with Mixture of Experts}, author={NextAI Team}, year={2024}, publisher={Hugging Face} }
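The Usage Instructions in the updated card defer to the GitHub repository and the Hugging Face documentation for integration details. As a quick orientation, below is a minimal sketch of loading the renamed model with the Hugging Face Transformers library. The repository id `nextai-team/Moe-4x7b-reason-code-qa` is inferred from the publisher and model name shown on this card, and the prompt is illustrative only; check the model page for the exact id and any recommended prompt or chat template before relying on this.

```python
# Minimal sketch (not the official usage snippet): load the model with
# Hugging Face Transformers and generate a response to a coding question.
# The repository id below is inferred from the card; verify it on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nextai-team/Moe-4x7b-reason-code-qa"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # shard across available devices (requires `accelerate`)
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 4x7B mixture-of-experts checkpoint is large; in practice it usually needs multiple GPUs, automatic device mapping, or a quantized variant to run, so treat the snippet as a starting point rather than a tested recipe.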