How to inference with multiple GPUs? #1723
I tried using
Answered by gaotongxiao, Feb 14, 2023
Currently not - a workaround is to use the multi-GPU test and set the `--save-preds` flag, which allows you to access the prediction results later.
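A minimal sketch of the suggested workaround, assuming MMOCR's standard `tools/dist_test.sh` launcher; the config name, checkpoint path, and GPU count below are placeholders for illustration, not values from this thread:

```shell
# Hypothetical paths for illustration -- substitute your own
# config file, trained checkpoint, and number of GPUs.
CONFIG=configs/textrecog/crnn/my_config.py   # assumed config path
CHECKPOINT=work_dirs/crnn/epoch_5.pth        # assumed checkpoint path
GPUS=4

# Run the multi-GPU test and dump predictions to disk with
# --save-preds, so they can be loaded and reused afterwards.
bash tools/dist_test.sh "$CONFIG" "$CHECKPOINT" "$GPUS" --save-preds
```

This runs distributed evaluation rather than true multi-GPU inference, but the saved prediction files give you the same end result: model outputs for every sample in the dataset.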