zR committed
Commit 37d58a3
1 Parent(s): 9e62f54
Files changed (6)
  1. .mdl +0 -0
  2. .msc +0 -0
  3. .mv +0 -1
  4. LICENSE +70 -0
  5. README.md +61 -5
  6. README_zh.md +41 -0
.mdl DELETED
Binary file (47 Bytes)
 
.msc DELETED
Binary file (1.1 kB)
 
.mv DELETED
@@ -1 +0,0 @@
- Revision:master,CreatedAt:1720144586
 
 
LICENSE CHANGED
@@ -0,0 +1,70 @@
+ The CodeGeeX4 License
+
+ 1. Definitions
+
+ "Licensor" means the CodeGeeX Team that distributes its Software.
+ "Software" means the CodeGeeX4 model parameters made available under this license.
+
+ 2. License Grant
+
+ Under the terms and conditions of this license, the Licensor hereby grants you a non-exclusive, worldwide, non-transferable, non-sublicensable, revocable, royalty-free copyright license.
+ This license allows you to use all open-source models in this repository free of charge for academic research. Users who wish to use the models for commercial purposes must complete registration [here](https://open.bigmodel.cn/mla/form). Registered users may use the models for commercial activities free of charge, provided that they comply with all terms and conditions of this license.
+ The above copyright notice and this license notice shall be included in all copies or substantial portions of the Software.
+ If you distribute or make available THUDM / Zhipu AI materials relating to the CodeGeeX4 open-source models (or any derivative works thereof), or products or services that use any such materials (including all open-source models of the CodeGeeX4 series), you shall:
+
+ (A) provide a copy of this Agreement with any such THUDM / Zhipu AI materials;
+ (B) prominently display "Built with CodeGeeX4" on the relevant website, user interface, blog post, about page, or product documentation.
+ If you use materials from THUDM / Zhipu AI's CodeGeeX4 open-source models to create, train, fine-tune, or otherwise improve an AI model that is distributed or made available, you shall also add "CodeGeeX4" at the beginning of the name of any such AI model.
+
+ 3. Restrictions
+
+ You shall not use, copy, modify, merge, publish, distribute, reproduce, or create derivative works of all or part of the Software for any military or illegal purposes.
+ You shall not use the Software to engage in any conduct that endangers national security or national unity, harms the public interest or public order and good morals, or infringes upon the rights and interests of others, including trade secrets, intellectual property, reputation, portrait rights, and property rights.
+ You shall comply with the applicable laws, regulations, policies, and ethical standards of the place of use.
+
+ 4. Disclaimer
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+ 5. Limitation of Liability
+
+ EXCEPT TO THE EXTENT PROHIBITED BY APPLICABLE LAW, IN NO EVENT AND UNDER NO LEGAL THEORY, WHETHER BASED IN TORT, NEGLIGENCE, CONTRACT, LIABILITY, OR OTHERWISE, WILL ANY LICENSOR BE LIABLE TO YOU FOR ANY DIRECT, INDIRECT, SPECIAL, INCIDENTAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES, OR ANY OTHER COMMERCIAL LOSSES, EVEN IF THE LICENSOR HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+
+ 6. Dispute Resolution
+
+ This license shall be governed by and construed in accordance with the laws of the People's Republic of China. Any dispute arising from or in connection with this license shall be submitted to the Haidian District People's Court in Beijing.
+ Note that the license may be updated to a more comprehensive version. For any questions regarding the license or copyright, please contact us at license@zhipuai.cn.
README.md CHANGED
@@ -1,5 +1,61 @@
- ---
- license: other
- license_name: codegeex4
- license_link: LICENSE
- ---
+ ---
+ license: other
+ license_name: glm-4
+ license_link: https://huggingface.co/THUDM/codegeex4-all-9b/LICENSE
+ language:
+ - zh
+ - en
+ tags:
+ - glm
+ - codegeex
+ - thudm
+ inference: false
+ pipeline_tag: text-generation
+ ---
+
+ # CodeGeeX4: Open Multilingual Code Generation Model
+
+ [中文](./README_zh.md)
+
+ We introduce CodeGeeX4-ALL-9B, the open-source version of the latest CodeGeeX4 model series. It is a multilingual code generation model continually trained on [GLM-4-9B](https://github.com/THUDM/GLM-4), with significantly enhanced code generation capabilities. A single CodeGeeX4-ALL-9B model supports comprehensive functions such as code completion and generation, code interpretation, web search, function calling, and repository-level code Q&A, covering various scenarios of software development. CodeGeeX4-ALL-9B has achieved highly competitive performance on public benchmarks such as [BigCodeBench](https://huggingface.co/datasets/bigcode/bigcodebench) and [NaturalCodeBench](https://github.com/THUDM/NaturalCodeBench). It is currently the most powerful code generation model with fewer than 10 billion parameters, even surpassing much larger general-purpose models, and it achieves the best balance between inference speed and model performance.
+
+ ## Get Started
+
+ Use `4.39.0<=transformers<=4.40.2` to quickly launch [codegeex4-all-9b](https://huggingface.co/THUDM/codegeex4-all-9b):
+
+ ```python
+ import torch
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ # Load the tokenizer and model, placing the model on GPU if available
+ device = "cuda" if torch.cuda.is_available() else "cpu"
+ tokenizer = AutoTokenizer.from_pretrained("THUDM/codegeex4-all-9b", trust_remote_code=True)
+ model = AutoModelForCausalLM.from_pretrained(
+     "THUDM/codegeex4-all-9b",
+     torch_dtype=torch.bfloat16,
+     low_cpu_mem_usage=True,
+     trust_remote_code=True
+ ).to(device).eval()
+
+ # Build a chat-formatted prompt, generate, and decode only the newly generated tokens
+ inputs = tokenizer.apply_chat_template([{"role": "user", "content": "write a quick sort"}], add_generation_prompt=True, tokenize=True, return_tensors="pt", return_dict=True).to(device)
+ with torch.no_grad():
+     outputs = model.generate(**inputs)
+ outputs = outputs[:, inputs['input_ids'].shape[1]:]
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
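+
+ The call above relies on the model's default generation settings. To bound the output length or enable sampling, the standard `transformers` generation arguments apply; the snippet below is only an illustrative sketch, and the parameter values are assumptions rather than official recommendations:
+
+ ```python
+ # Reuses the tokenizer, model, and inputs prepared above
+ with torch.no_grad():
+     outputs = model.generate(
+         **inputs,
+         max_new_tokens=512,  # upper bound on newly generated tokens; adjust per task
+         do_sample=True,      # sample instead of greedy decoding
+         temperature=0.2,     # illustrative value, not an official recommendation
+         top_p=0.95,
+     )
+ outputs = outputs[:, inputs['input_ids'].shape[1]:]
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```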
+
+
+ ## License
+
+ The model weights are licensed under the [Model License](LICENSE).
+
+
+ ## Citation
+
+ If you find our work helpful, please feel free to cite the following paper:
+
+ ```
+ @inproceedings{zheng2023codegeex,
+   title={CodeGeeX: A Pre-Trained Model for Code Generation with Multilingual Evaluations on HumanEval-X},
+   author={Qinkai Zheng and Xiao Xia and Xu Zou and Yuxiao Dong and Shan Wang and Yufei Xue and Zihan Wang and Lei Shen and Andi Wang and Yang Li and Teng Su and Zhilin Yang and Jie Tang},
+   booktitle={KDD},
+   year={2023}
+ }
+ ```
README_zh.md ADDED
@@ -0,0 +1,41 @@
+ # CodeGeeX4: Open Multilingual Code Generation Model
+
+ [CodeGeeX4 GitHub](https://github.com/THUDM/CodeGeeX4)
+
+ We introduce CodeGeeX4-ALL-9B, the open-source version of the latest CodeGeeX4 model series. It is a multilingual code generation model continually trained on [GLM-4-9B](https://github.com/THUDM/GLM-4), with significantly enhanced code generation capabilities. A single CodeGeeX4-ALL-9B model supports code completion and generation, code interpretation, web search, function calling, repository-level code Q&A, and more, covering various scenarios of software development. CodeGeeX4-ALL-9B has achieved highly competitive performance on public benchmarks such as [BigCodeBench](https://huggingface.co/datasets/bigcode/bigcodebench) and [NaturalCodeBench](https://github.com/THUDM/NaturalCodeBench). It is currently the most powerful code generation model with fewer than 10 billion parameters, even surpassing much larger general-purpose models, and it achieves the best balance between inference speed and model performance.
+
+
+ ## Get Started
+
+ Please use `4.39.0<=transformers<=4.40.2` for deployment:
+
+ ```python
+ import torch
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ # Load the tokenizer and model, placing the model on GPU if available
+ device = "cuda" if torch.cuda.is_available() else "cpu"
+ tokenizer = AutoTokenizer.from_pretrained("THUDM/codegeex4-all-9b", trust_remote_code=True)
+ model = AutoModelForCausalLM.from_pretrained(
+     "THUDM/codegeex4-all-9b",
+     torch_dtype=torch.bfloat16,
+     low_cpu_mem_usage=True,
+     trust_remote_code=True
+ ).to(device).eval()
+
+ # Build a chat-formatted prompt, generate, and decode only the newly generated tokens
+ inputs = tokenizer.apply_chat_template([{"role": "user", "content": "write a quick sort"}], add_generation_prompt=True, tokenize=True, return_tensors="pt", return_dict=True).to(device)
+ with torch.no_grad():
+     outputs = model.generate(**inputs)
+ outputs = outputs[:, inputs['input_ids'].shape[1]:]
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
+
+ ## Citation
+
+ If you find our work helpful, please feel free to cite the following paper:
+
+ ```
+ @inproceedings{zheng2023codegeex,
+   title={CodeGeeX: A Pre-Trained Model for Code Generation with Multilingual Evaluations on HumanEval-X},
+   author={Qinkai Zheng and Xiao Xia and Xu Zou and Yuxiao Dong and Shan Wang and Yufei Xue and Zihan Wang and Lei Shen and Andi Wang and Yang Li and Teng Su and Zhilin Yang and Jie Tang},
+   booktitle={KDD},
+   year={2023}
+ }
+ ```