Query Complexity Classifier

This model classifies user queries based on their complexity level so they can be routed to an appropriate Large Language Model (LLM).

The model predicts three classes:

  • Simple
  • Medium
  • Complex

It can be used as a pre-routing layer in AI systems where different LLMs handle different levels of query complexity.


Model

Base Model: DistilBERT
Task: Text Classification (3 classes)
Model size: 67M parameters (F32)


Download and Use

You can download and load the model directly from Hugging Face using the transformers library.

Install dependencies

pip install transformers torch

Load the model

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_name = "Shaheer001/Query-Complexity-Classifier"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

Run inference

text = "Explain how Kubernetes architecture works."

# Tokenize the input query
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Run a forward pass without tracking gradients (inference only)
with torch.no_grad():
    outputs = model(**inputs)

# Pick the class with the highest logit
prediction = torch.argmax(outputs.logits, dim=1).item()

labels = ["Simple", "Medium", "Complex"]

print("Predicted Complexity:", labels[prediction])
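The argmax alone discards how confident the model is. As a hedged sketch, the logits can be converted into class probabilities with a softmax; the tensor below is a stand-in for `outputs.logits` from the snippet above, so the exact numbers are illustrative:

```python
import torch

# Dummy logits standing in for outputs.logits (shape: [batch, 3])
logits = torch.tensor([[-1.2, 0.3, 2.1]])

# Softmax turns raw logits into probabilities that sum to 1
probs = torch.softmax(logits, dim=1)

labels = ["Simple", "Medium", "Complex"]
prediction = probs.argmax(dim=1).item()

print("Predicted:", labels[prediction])
print("Confidence:", round(probs[0, prediction].item(), 3))
```

The confidence value can be used, for example, to fall back to a larger model when the classifier is uncertain.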

Example

Input:

Explain Kubernetes architecture

Output:

Complex

Use Case

This model can be used to build LLM routing systems where queries are automatically sent to different language models depending on their complexity.

Example workflow:

User Query → Complexity Classifier → LLM Router → Selected LLM
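The workflow above can be sketched as a minimal router. The complexity-to-model mapping and the backend model names here are illustrative assumptions, not part of this model; any classifier callable and any set of backend LLMs would fit:

```python
# Hypothetical mapping from predicted complexity to a backend LLM;
# the model names are placeholders for whatever models you deploy
ROUTES = {
    "Simple": "small-llm",
    "Medium": "mid-llm",
    "Complex": "large-llm",
}

def route(query: str, classify) -> str:
    """Return the backend model chosen for this query.

    `classify` is any callable mapping a query string to one of the
    three complexity labels, e.g. the classifier loaded above.
    """
    label = classify(query)
    return ROUTES[label]

# Stub classifier used only for demonstration
chosen = route("Explain Kubernetes architecture", lambda q: "Complex")
print(chosen)
```

In a real system, `classify` would wrap the tokenizer-and-model inference shown earlier, and the returned name would select the LLM client to call.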
