Update README.md
README.md CHANGED
@@ -1,13 +1,42 @@
 ---
 language:
-
-
+- en
+- zh
 library_name: transformers
 license: mit
 pipeline_tag: text-generation
+base_model:
+- zai-org/GLM-5
+tags:
+- unsloth
+- glm_moe_dsa
+---
+<div>
+<p style="margin-bottom: 0; margin-top: 0;">
+<h1 style="margin-top: 0rem;">See how to run GLM-5 locally - <a href="https://docs.unsloth.ai/models/glm-5">Read our Guide!</a></h1>
+</p>
+<p style="margin-top: 0;margin-bottom: 0;">
+<em><a href="https://docs.unsloth.ai/basics/unsloth-dynamic-v2.0-gguf">Unsloth Dynamic 2.0</a> achieves superior accuracy & outperforms other leading quants.</em>
+</p>
+<div style="margin-top: 0;display: flex; gap: 5px; align-items: center; ">
+<a href="https://github.com/unslothai/unsloth/">
+<img src="https://github.com/unslothai/unsloth/raw/main/images/unsloth%20new%20logo.png" width="133">
+</a>
+<a href="https://discord.gg/unsloth">
+<img src="https://github.com/unslothai/unsloth/raw/main/images/Discord%20button.png" width="173">
+</a>
+<a href="https://docs.unsloth.ai/models/glm-5">
+<img src="https://raw.githubusercontent.com/unslothai/unsloth/refs/heads/main/images/documentation%20green%20button.png" width="143">
+</a>
+</div>
+</div>
+
+To run, you must install llama.cpp [PR 19460](https://github.com/ggml-org/llama.cpp/pull/19460).<br>
+You can follow instructions in our [guide here](https://unsloth.ai/docs/models/glm-5#run-glm-5-tutorials).
+
 ---
 
-# GLM-5
+# GLM-5
 
 <div align="center">
 <img src=https://raw.githubusercontent.com/zai-org/GLM-5/refs/heads/main/resources/logo.svg width="15%"/>
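The note added in the hunk above says the GGUF needs a llama.cpp build that includes PR 19460. Purely as a loose sketch (the build steps mirror llama.cpp's generic CMake instructions, and the model path, context size, and GPU-layer count are placeholder assumptions rather than values from this README), a local run could look something like:

```bash
# Clone llama.cpp and check out the GLM-5 support PR mentioned in the README (requires the GitHub CLI)
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
gh pr checkout 19460

# Standard CMake build
cmake -B build
cmake --build build --config Release -j

# Serve a GLM-5 GGUF file (path and settings are illustrative placeholders)
./build/bin/llama-server \
  --model /path/to/GLM-5-GGUF/glm-5-quant.gguf \
  --ctx-size 16384 \
  --n-gpu-layers 99 \
  --jinja
```

The Unsloth guide linked in the diff covers the actual quant files and recommended settings.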
@@ -136,7 +165,7 @@ vLLM, SGLang, and xLLM all support local deployment of GLM-5. A simple deploymen
 --mem-fraction-static 0.85 \
 --served-model-name glm-5-fp8
 ```
-
+
 Check the [sglang cookbook](https://cookbook.sglang.io/autoregressive/GLM/GLM-5) for more details.
 
 + xLLM and other Ascend NPU
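For context on the second hunk: the SGLang launch snippet it touches registers the model under the name glm-5-fp8, so once the server is up it can be queried over SGLang's OpenAI-compatible API. A minimal check, assuming the server runs locally on SGLang's default port 30000 (adjust if the launch command passes --port):

```bash
# Send a chat completion request to the served model (name must match --served-model-name)
curl http://localhost:30000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "glm-5-fp8",
        "messages": [{"role": "user", "content": "Give me a one-sentence summary of GLM-5."}],
        "max_tokens": 128
      }'
```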