# mlx-community/Kimi-K2.5-3bit

This model (`mlx-community/Kimi-K2.5-3bit`) is a 3-bit quantization of the 1T-parameter `moonshotai/Kimi-K2.5`, converted to MLX format using mlx-lm version 0.30.6.

## Usage

Please see the model card of the original model for example code. Note that this quantization is for text-only usage.

Remember to enable remote code execution (`trust_remote_code`) when loading the tokenizer.
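As a minimal sketch, loading follows the standard mlx-lm pattern (assuming `mlx-lm` is installed via `pip install mlx-lm` on Apple silicon; passing `trust_remote_code` through `tokenizer_config` is an assumption based on how mlx-lm forwards tokenizer options to Hugging Face):

```python
from mlx_lm import load, generate

# Load the quantized weights and tokenizer; trust_remote_code is needed
# because the original repository ships custom tokenizer code.
model, tokenizer = load(
    "mlx-community/Kimi-K2.5-3bit",
    tokenizer_config={"trust_remote_code": True},
)

prompt = "Summarize what 3-bit quantization trades off."

# Apply the chat template if the tokenizer defines one.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```

Note that the full model is roughly 1T parameters even at 3 bits, so this requires a machine with very large unified memory.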
