Add `library_name: transformers` to model card metadata
This PR adds the `library_name: transformers` metadata tag to the model card.
This tag is crucial for Hugging Face Hub functionality, as it enables the automated "How to use" widget on the model page, providing users with a readily available code snippet to get started with the model.
Evidence for `transformers` compatibility is strong:
- The `config.json` file specifies `"architectures": ["OPTForCausalLM"]` and `"model_type": "opt"`, both standard for `transformers` models.
- The `tokenizer_config.json` explicitly sets `"tokenizer_class": "GPT2Tokenizer"`.
- The `generation_config.json` includes `"transformers_version": "4.42.0.dev0"`.
This ensures that the automated code snippet will work as expected.
The model card content remains unchanged.
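As a quick sanity check, the updated front matter can be scanned for the new tag programmatically. A minimal standard-library sketch (the metadata block below is reproduced from this PR; the flat parser is a simplification that only handles top-level `key: value` lines, not full YAML):

```python
# Verify that the updated model-card front matter carries the
# `library_name: transformers` tag. The block is copied from this PR.
front_matter = """\
base_model:
- facebook/opt-1.3b
datasets:
- databricks/databricks-dolly-15k
language:
- en
license: apache-2.0
metrics:
- rouge
pipeline_tag: text-generation
library_name: transformers
"""

tags = {}
for line in front_matter.splitlines():
    # Skip list items; keep only top-level `key: value` pairs.
    if ":" in line and not line.startswith("-"):
        key, _, value = line.partition(":")
        tags[key.strip()] = value.strip()

print(tags["library_name"])  # transformers
print(tags["pipeline_tag"])  # text-generation
```

In practice the Hub validates this metadata itself; the snippet only illustrates which keys the widget logic keys off.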
README.md (changed)

```diff
@@ -1,15 +1,17 @@
 ---
-
+base_model:
+- facebook/opt-1.3b
 datasets:
 - databricks/databricks-dolly-15k
 language:
 - en
+license: apache-2.0
 metrics:
 - rouge
-base_model:
-- facebook/opt-1.3b
 pipeline_tag: text-generation
+library_name: transformers
 ---
+
 # MiniLLM-OPT-1.3B
 
 [paper](https://arxiv.org/abs/2306.08543) | [code](https://github.com/microsoft/LMOps/tree/main/minillm)
```