Unload model #431
-
Hi! How do I unload a faster-whisper model from VRAM? I need to use another model (NER) after faster-whisper finishes.
Answered by guillaumekln, Aug 22, 2023
Replies: 2 comments
-
You could delete the model object and then force the Python garbage collector to run. Try this:

```python
import gc
from faster_whisper import WhisperModel

model = WhisperModel("large-v2")

del model
gc.collect()
```
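For the follow-up NER step mentioned in the question, a minimal sketch of the full sequence might look like the following. The audio path and the NER setup (here a Hugging Face `transformers` pipeline with the `dslim/bert-base-NER` checkpoint) are illustrative assumptions, not part of the answer above; swap in whichever NER library and model you actually use.

```python
import gc
from faster_whisper import WhisperModel

# 1. Transcribe with faster-whisper ("audio.wav" is a placeholder path).
model = WhisperModel("large-v2", device="cuda")
segments, info = model.transcribe("audio.wav")
# Consume the segment generator before releasing the model.
text = " ".join(segment.text for segment in segments)

# 2. Delete the model and force garbage collection to free VRAM
#    before loading the next model.
del model
gc.collect()

# 3. Load the NER model (illustrative choice of library and checkpoint).
from transformers import pipeline
ner = pipeline("ner", model="dslim/bert-base-NER", device=0)
entities = ner(text)
```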
Answer selected by nixsrv
-
Thanks, that works.