Instructions to use Fastweb/FastwebMIIA-7B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Fastweb/FastwebMIIA-7B with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Fastweb/FastwebMIIA-7B")

messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Fastweb/FastwebMIIA-7B")
model = AutoModelForCausalLM.from_pretrained("Fastweb/FastwebMIIA-7B")

messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```

- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use Fastweb/FastwebMIIA-7B with vLLM:
Install from pip and serve the model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "Fastweb/FastwebMIIA-7B"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Fastweb/FastwebMIIA-7B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```

Use Docker
```shell
docker model run hf.co/Fastweb/FastwebMIIA-7B
```
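The curl call above can also be made from Python. The following is a minimal sketch using only the standard library, assuming the vLLM server started in the previous step is listening at its default address (`http://localhost:8000`); the helper names `build_chat_request` and `send` are just for illustration:

```python
import json
import urllib.request

# Default address of `vllm serve` (see the command above)
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for the local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send(req: urllib.request.Request) -> str:
    """POST the request and return the assistant reply (server must be running)."""
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With the server up, `send(build_chat_request("Fastweb/FastwebMIIA-7B", "What is the capital of France?"))` returns the model's reply as a string.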
- SGLang
How to use Fastweb/FastwebMIIA-7B with SGLang:
Install from pip and serve the model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "Fastweb/FastwebMIIA-7B" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Fastweb/FastwebMIIA-7B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```

Use Docker images
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "Fastweb/FastwebMIIA-7B" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Fastweb/FastwebMIIA-7B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```

- Docker Model Runner
How to use Fastweb/FastwebMIIA-7B with Docker Model Runner:
```shell
docker model run hf.co/Fastweb/FastwebMIIA-7B
```
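The vLLM and SGLang servers above expose the same OpenAI-compatible chat-completions API (on ports 8000 and 30000 respectively), so one small client covers either backend. The following sketch assumes the chosen server is already running; the `ENDPOINTS` mapping and the `ask`/`extract_reply` helpers are illustrative names, with the ports taken from the serve commands above:

```python
import json
import urllib.request

# Ports taken from the `vllm serve` and `sglang.launch_server` commands above
ENDPOINTS = {
    "vllm": "http://localhost:8000/v1/chat/completions",
    "sglang": "http://localhost:30000/v1/chat/completions",
}

def extract_reply(completion: dict) -> str:
    """Pull the assistant text out of an OpenAI-style chat completion response."""
    return completion["choices"][0]["message"]["content"]

def ask(backend: str, question: str) -> str:
    """Send one user turn to the chosen backend and return the model's reply."""
    req = urllib.request.Request(
        ENDPOINTS[backend],
        data=json.dumps({
            "model": "Fastweb/FastwebMIIA-7B",
            "messages": [{"role": "user", "content": question}],
        }).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(json.load(resp))
```

For example, `ask("sglang", "What is the capital of France?")` queries the SGLang server, and switching the first argument to `"vllm"` reuses the same code against vLLM.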
Acknowledge the license to access the repository
This repository is publicly accessible, but you have to accept the conditions to access its files and content.
By downloading, accessing, or using the model as specified, you fully accept the FastwebMIIA Non-Commercial License, the Acceptable Use Policy (AUP), and the other attached documents. If you do not agree to the terms and conditions in this license and the related documents, you must not download or use the model and should delete any copies you may already have.
This license grants use of FastwebMIIA exclusively for:
- personal, scientific, or academic research activities, whether theoretical or applied, that are non-professional and purely informative in purpose; and
- use under your own authority and responsibility in the course of professional activities, limited to your business and organizational activities that have no commercial purpose.
You are prohibited from using the model for commercial purposes. Specifically, the Licensee cannot:
- Integrate the model into services, products, platforms, systems, or applications if the integration is intended for resale, distribution, or making it available to third parties for a fee;
- Transfer, distribute, sell, rent, sublicense, or make the model or any part of it available to third parties, whether for a fee or free of charge, without written permission from the Licensor;
- Use the model to develop or train other LLMs, regardless of the purpose or intended use.
By completing the form you accept Fastweb’s Privacy Policy.
Pursuant to and for the purposes of Art. 1341, paragraphs 1 and 2, of the Italian Civil Code, the Licensee, in addition to accepting the “Condizioni Generali di Contratto di licenza d’uso per scopi non commerciali del modello FastwebMIIA” (General Terms and Conditions of the license agreement for non-commercial use of the FastwebMIIA model), also specifically approves the following onerous clauses of that agreement: Art. 6 – Obligations of the Licensee and limits of use; Art. 8 – Warranties and exclusions; Art. 9 – Limitation of liability and indemnification; Art. 10 – Intellectual property; Art. 12 – Force majeure; Art. 16 – Applicable law and jurisdiction; Art. 17 – Final provisions.