Update README.md
README.md (CHANGED)
@@ -9,7 +9,7 @@ base_model:
 <div align="center">
 
 <div align="center" style="display: flex; justify-content: center; align-items: center;">
-<img src="
+<img src="stepfun.svg" width="25" style="margin-right: 10px;"/>
 <h1 style="margin: 0; border-bottom: none;">Step-3.5-Flash</h1>
 </div>
 
@@ -38,7 +38,7 @@ base_model:
 
 Step 3.5 Flash delivers performance parity with leading closed-source systems while remaining open and efficient.
 
-![]
+
 
 Performance of Step 3.5 Flash measured across **Reasoning**, **Coding**, and **Agency**. Open-source models (left) are sorted by their total parameter count, while top-tier proprietary models are shown on the right. xbench-DeepSearch scores are sourced from [official publications](https://xbench.org/agi/aisearch) for consistency. The shadowed bars represent the enhanced performance of Step 3.5 Flash using [Parallel Thinking](https://arxiv.org/pdf/2601.05593).
 
@@ -219,6 +219,8 @@ pip install "sglang[all] @ git+https://github.com/sgl-project/sglang.git"
 ```
 
 2. Launch the server.
+
+```bash
 - For bf16 model
 SGLANG_ENABLE_SPEC_V2=1
 python3 -m sglang.launch_server \
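
The launch command in the hunk above is cut off at `python3 -m sglang.launch_server \`. For readers following along, a minimal sketch of what a complete invocation could look like is shown below; the model path `stepfun-ai/Step-3.5-Flash`, the tensor-parallel size, the host, and the port are illustrative assumptions, not values taken from this README.

```bash
# Sketch only: the model path, --tp, --host, and --port values below are assumptions.
# SGLANG_ENABLE_SPEC_V2=1 comes from the diff above; the remaining flags are
# standard sglang.launch_server options.
SGLANG_ENABLE_SPEC_V2=1 \
python3 -m sglang.launch_server \
  --model-path stepfun-ai/Step-3.5-Flash \
  --tp 8 \
  --host 0.0.0.0 \
  --port 30000 \
  --trust-remote-code
```
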
@@ -657,7 +659,7 @@ python3 -m sglang.launch_server \
 - When using `vLLM` and `SGLang`, thinking mode is enabled by default when sending requests.
 - Both support tool calling. Please use OpenAI-style tool description format for calls.
 
-## Citation
+<!-- ## Citation
 
 If you find our work useful in your research, please consider citing the following paper:
 
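
The notes in the hunk above say tool calling should use the OpenAI-style tool description format. As a hedged illustration, a request against the server's OpenAI-compatible endpoint could look like the following; the port, the model name, and the `get_weather` tool are hypothetical, and only the structure of the `tools` array is the point.

```bash
# Hypothetical example: the endpoint port, model name, and get_weather tool
# are assumptions; the "tools" entry shows the OpenAI-style description format.
curl -s http://localhost:30000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "stepfun-ai/Step-3.5-Flash",
    "messages": [
      {"role": "user", "content": "What is the weather like in Shanghai today?"}
    ],
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "get_weather",
          "description": "Look up the current weather for a city",
          "parameters": {
            "type": "object",
            "properties": {
              "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
          }
        }
      }
    ]
  }'
```

Since the README states that thinking mode is enabled by default, no additional request field should be needed to trigger it, though this is inferred from the note above rather than spelled out.
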
@@ -671,8 +673,8 @@ If you find our work useful in your research, please consider citing the following paper:
 primaryClass={cs.CL},
 url={https://arxiv.org/abs/xxxxx},
 }
-```
+``` -->
 
 ## 📄 License
 
-This project is open-sourced under the [Apache 2.0 License](https://www.google.com/search?q=LICENSE).
+This project is open-sourced under the [Apache 2.0 License](https://www.google.com/search?q=LICENSE).