Instructions to use ehsanaghaei/SecureBERT_Plus with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ehsanaghaei/SecureBERT_Plus with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="ehsanaghaei/SecureBERT_Plus")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("ehsanaghaei/SecureBERT_Plus")
model = AutoModelForMaskedLM.from_pretrained("ehsanaghaei/SecureBERT_Plus")
```
- Inference
- Notebooks
- Google Colab
- Kaggle
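To show the pipeline above in action, here is a minimal sketch of filling a masked token in a cybersecurity sentence. The example sentence is an illustration, not from the model card; since SecureBERT_Plus is RoBERTa-based, the mask token is assumed to be `<mask>`.

```python
from transformers import pipeline

# Build the fill-mask pipeline for SecureBERT_Plus
pipe = pipeline("fill-mask", model="ehsanaghaei/SecureBERT_Plus")

# Illustrative input: the model predicts candidates for the <mask> token
results = pipe("The attacker exploited a <mask> vulnerability in the web server.")

# Each result carries the predicted token and its score
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

The pipeline returns the top candidate tokens ranked by probability, which is a quick way to sanity-check that the model's vocabulary fits your cybersecurity text.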
Use this model for NER
Is it possible to use this model for Entity Recognition? And if yes, could you please share how to do it?
Thanks
Check: https://huggingface.co/CyberPeace-Institute/SecureBERT-NER
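A minimal sketch of running that NER model with the Transformers `token-classification` pipeline; the input sentence is a made-up example, and the entity labels depend on how CyberPeace-Institute/SecureBERT-NER was fine-tuned.

```python
from transformers import pipeline

# Token-classification pipeline for the fine-tuned NER checkpoint;
# aggregation_strategy="simple" merges subword tokens into whole entities
ner = pipeline(
    "token-classification",
    model="CyberPeace-Institute/SecureBERT-NER",
    aggregation_strategy="simple",
)

# Illustrative cybersecurity sentence
entities = ner("APT29 used Cobalt Strike to target government networks.")

# Each entry has the entity group, the matched text span, and a score
for e in entities:
    print(e["entity_group"], e["word"], round(e["score"], 3))
```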
Thanks
SecureBERT 2.0 will soon be announced by Cisco's AI Defense team. It has been trained on a dataset 13 times larger, supports both code and cybersecurity text, and is powered by an advanced transformer architecture. The release will include multiple fine-tuned models for tasks such as semantic similarity search (bi-encoder and cross-encoder), named entity recognition, and code vulnerability detection. The new model outperforms state-of-the-art models across a variety of tasks.