Instructions for using unity/inference-engine-whisper-tiny with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- unity-sentis
How to use unity/inference-engine-whisper-tiny with unity-sentis:

```csharp
string modelName = "[Your model name here].sentis";
Model model = ModelLoader.Load(Application.streamingAssetsPath + "/" + modelName);
IWorker engine = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);
// Please see the provided C# file for more details
```
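Once the worker is created, running the model follows the usual Sentis 1.x execute/read-back pattern. The sketch below is illustrative only, assuming the Sentis 1.x API used above: the file name, input tensor shape, and output handling are placeholders, not the model's actual interface (see the provided C# file for the real pipeline).

```csharp
using Unity.Sentis;
using UnityEngine;

public class RunModelSketch : MonoBehaviour
{
    IWorker engine;

    void Start()
    {
        // Load the .sentis model from StreamingAssets (file name is a placeholder)
        Model model = ModelLoader.Load(Application.streamingAssetsPath + "/whisper-tiny.sentis");
        engine = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);

        // Dummy input: the real shape and contents depend on the model's inputs
        // (Whisper encoders typically take a log-mel spectrogram)
        using TensorFloat input = TensorFloat.Zeros(new TensorShape(1, 80, 3000));
        engine.Execute(input);

        // Read the result back from the GPU before indexing into it
        TensorFloat output = engine.PeekOutput() as TensorFloat;
        output.MakeReadable();
        Debug.Log(output.shape);
    }

    void OnDestroy()
    {
        // Workers hold GPU resources and must be disposed explicitly
        engine?.Dispose();
    }
}
```

Note that newer versions of the package (Inference Engine 2.x) renamed parts of this API, so the `IWorker`/`WorkerFactory` calls above apply to Sentis 1.x only.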
- Notebooks
- Google Colab
- Kaggle
- Discussions
- Is run whisper broken? (#15, opened 9 months ago by jalnyn · 2 comments)
- Fix wrong tensor shape leading to words duplication (#11, opened over 1 year ago by dmitry-yudakov)
- Cannot import .onnx files into Unity (#10, opened over 1 year ago by JuanJe89 · 2 reactions · 3 comments)
- sentis-whisper-tiny won't work to begin with (#9, opened over 1 year ago by FlatHIppo · 1 comment)
- Assets\Scripts\RunWhisper.cs(52,5): error CS0122: 'Ops' is inaccessible due to its protection level (#8, opened over 1 year ago by humbertomartinezg · 1 comment)
- Any insight on why Sentis versions <=1.6.0-pre.1 wouldn't work (#7, opened over 1 year ago by CrisBlu · 1 comment)
- encoding problem? (#5, opened almost 2 years ago by joonyle)
- OpenGL ES compatibility issue (#4, opened about 2 years ago by deleted)
- Performance problem with Sentis Whisper Tiny (#3, opened about 2 years ago by deleted · 6 comments)
- Meta Quest 2 Android / SnapDragon XR2 Model Loader / inference performance (#2, opened about 2 years ago by smuscroft79 · 1 reaction · 3 comments)
- Could you please explain how is the original whisper tiny model exported to sentis format? (#1, opened about 2 years ago by deleted · 3 reactions · 6 comments)