llama-cpp-python — Free-Tier Friendly Wheel
This Space provides a prebuilt llama-cpp-python wheel designed to work
reliably on Hugging Face Free tier Spaces.
No compilation. No system packages. No build failures.
If your Space crashes during pip install llama-cpp-python, this wheel is the fix.
Optimized for Hugging Face Free Tier
Hugging Face Free tier Spaces are:
- CPU-only
- Limited in memory
- Not suitable for native compilation
This wheel is built ahead of time so it can be installed instantly without triggering CMake, compilers, or BLAS detection.
What this wheel gives you
- ✅ Works on HF Free tier CPU Spaces
- ✅ Linux (ubuntu-22.04 compatible)
- ✅ Python 3.10
- ✅ OpenBLAS enabled (GGML_BLAS=ON)
- ✅ No system dependencies required
- ✅ No build step during Space startup
- ✅ Fast, reliable pip install
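Whether a wheel will install on a given Space is determined by the tags in its filename (Python version, ABI, and platform). A small sketch of how to read those tags, assuming the standard wheel naming convention; the filename and version number below are hypothetical examples, not the actual wheel name:

```python
import sys

def wheel_tags(wheel_name: str):
    """Return (python_tag, abi_tag, platform_tag) from a wheel filename.
    Wheel names follow: name-version(-build)?-pytag-abitag-plattag.whl"""
    stem = wheel_name.removesuffix(".whl")
    py_tag, abi_tag, plat_tag = stem.split("-")[-3:]
    return py_tag, abi_tag, plat_tag

def matches_current_python(wheel_name: str) -> bool:
    """True if the wheel's Python tag matches the running interpreter."""
    py_tag, _, _ = wheel_tags(wheel_name)
    return py_tag == f"cp{sys.version_info.major}{sys.version_info.minor}"

# Hypothetical filename; substitute the wheel you actually downloaded.
name = "llama_cpp_python-0.3.2-cp310-cp310-linux_x86_64.whl"
print(wheel_tags(name))  # ('cp310', 'cp310', 'linux_x86_64')
```

Because Free-tier Spaces run Python 3.10 on Linux, only a `cp310` Linux wheel like the one above will install there.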
How to use in a Space (Free tier)
- Download the wheel from the GitHub repository
- Upload it to your Space
- Install it in your Space startup:
pip install llama_cpp_python-*.whl
That’s it — your Space will start without build errors.
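If you want the Space to fail fast with a readable message when the wheel is missing, a small startup guard in app.py can help. A sketch, assuming `llama_cpp` as the import name provided by the llama-cpp-python package (the app-framework comment is a placeholder):

```python
import importlib.util
import sys

def ensure_llama_cpp() -> bool:
    """Return True if llama_cpp is importable; print an install hint otherwise."""
    if importlib.util.find_spec("llama_cpp") is None:
        print(
            "llama_cpp not found. Install the prebuilt wheel first:\n"
            "  pip install llama_cpp_python-*.whl",
            file=sys.stderr,
        )
        return False
    return True

if __name__ == "__main__":
    ready = ensure_llama_cpp()  # gate your Gradio/FastAPI startup on this
```

This turns a confusing mid-startup import crash into a one-line diagnostic in the Space logs.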
Build details
This wheel was built using:
- abetlen/llama-cpp-python (recursive submodules)
- OpenBLAS (GGML_VENDOR=OpenBLAS)
- scikit-build-core
- ninja
- python -m build --wheel --no-isolation
Build environment:
- OS: Ubuntu 22.04
- Python: 3.10
Why not build from source on HF?
On Free tier Spaces, building from source often fails due to:
- Missing compilers
- Missing BLAS libraries
- Memory limits
- Build timeouts

This prebuilt wheel avoids all of those issues.
Notes
- CPU-only (no CUDA)
- Intended for inference workloads
- Not an official upstream release
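Once the wheel is installed, inference follows the standard llama-cpp-python API. A minimal sketch, assuming you have uploaded a GGUF model file to the Space; `model.gguf` is a placeholder path, and the `n_ctx`/`n_threads` values are illustrative, not required settings:

```python
def build_prompt(question: str) -> str:
    """Simple Q/A prompt template used by answer() below."""
    return f"Q: {question}\nA:"

def answer(question: str, model_path: str = "model.gguf") -> str:
    # Deferred import so this sketch can be read without the wheel installed.
    from llama_cpp import Llama

    # Small context and thread count to fit the Free-tier CPU allotment.
    llm = Llama(model_path=model_path, n_ctx=2048, n_threads=2)
    out = llm(build_prompt(question), max_tokens=64)
    return out["choices"][0]["text"]
```

Keeping the model load inside a function also avoids paying its memory cost at import time.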
Credits
All credit goes to the maintainers of llama-cpp-python and llama.cpp. This Space exists solely to make Free tier usage painless.