Hi @universewill, your input is really large! Although there are some tricks (see this issue) we can try to make the network accept larger images, I'm sorry your input would still be too large.
Sorry again that I haven't provided a script supporting large-image inference. If you are willing to rewrite the data loading & test function, here is the general idea of how to achieve it:
First, set a threshold based on your GPU memory cap, and iteratively crop your input into patches (with a padding/overlap region) so that every patch fits under the threshold.
Send the patch sequence through inference, and stitch the outputs back together iteratively in the same manner.
In this way, you should be able to test on large input images. (Just pay attention to the overlap areas and blend them naturally.)
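The crop / infer / stitch loop above could look roughly like this. A minimal NumPy sketch, not from this repo: `tiled_inference` and its parameters are my own names, `infer_fn` is a placeholder for the network's forward pass, and the linear-ramp weighting is just one simple way to blend the overlap areas.

```python
import numpy as np

def tiled_inference(image, infer_fn, patch=256, overlap=32):
    """Run infer_fn on overlapping patches of `image` and blend the results.

    image:    (H, W, C) array with H, W >= patch.
    infer_fn: maps a (patch, patch, C) array to an array of the same shape
              (stand-in for the network's forward pass on one crop).
    """
    h, w = image.shape[:2]
    step = patch - overlap

    def starts(length):
        # Tile start positions; the last tile is clamped to reach the edge.
        s = list(range(0, length - patch + 1, step))
        if s[-1] != length - patch:
            s.append(length - patch)
        return s

    # Strictly positive 1-D ramp (1/overlap .. 1) on both ends, so adjacent
    # tiles fade into each other and no pixel ever gets zero total weight.
    ramp = np.ones(patch)
    if overlap > 0:
        ramp[:overlap] = np.arange(1, overlap + 1) / overlap
        ramp[-overlap:] = ramp[:overlap][::-1]
    wmap = np.outer(ramp, ramp)[..., None]  # (patch, patch, 1) weight map

    out = np.zeros(image.shape, dtype=np.float64)
    weight = np.zeros((h, w, 1), dtype=np.float64)
    for y in starts(h):
        for x in starts(w):
            tile = image[y:y + patch, x:x + patch]
            out[y:y + patch, x:x + patch] += infer_fn(tile) * wmap
            weight[y:y + patch, x:x + patch] += wmap
    # Weighted average: seams between patches blend naturally.
    return out / weight
```

With a real model you would pick `patch` so that one crop fits in GPU memory, and wrap `infer_fn` around the network's forward pass (moving the tile to the GPU and back). Passing an identity `infer_fn` is a handy sanity check that the stitching reproduces the input exactly.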
Hope this solves your problem!
Is there any way to solve cuda out of memory problem when input image is large?
My input is about 1240x650. How can I get around this problem?