Big Raster Prediction #829
Replies: 1 comment 4 replies
Can you paste me some code so I can consider what is happening here? It's not obvious to me how any code we changed would alter this behavior. If you give me a tile that used to throw the error, I will install versions going backwards until I find the commit, but my initial sense is that this is a dependency issue rather than a deepforest source change, since to my knowledge we never check whether tiles fit in memory. At the same time, I'm not sure I ever want predict_tile to throw an out-of-memory error; that feels like a bug in and of itself, so let's also try to address that root error if we can reproduce it. The entire point of predict_tile is to cut tiles into manageable pieces, so you shouldn't need to do anything else. Maybe you are not using the patch_size argument properly, or there is something we can do to make it easier to use (for example, specifying it as a percentage of the tile size). Let's investigate together.
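For reference, here is a minimal sketch of the intended predict_tile workflow, assuming the release-model setup from the deepforest docs at the time of this thread; the patch_size and patch_overlap values are illustrative, not recommendations:

```python
# Minimal sketch of the intended predict_tile workflow (values are illustrative).
from deepforest import main

model = main.deepforest()
model.use_release()  # load the prebuilt release model

# predict_tile cuts the raster into patch_size x patch_size windows with a
# patch_overlap fraction of overlap, runs the model on each window, and
# mosaics the results back into a single set of boxes.
boxes = model.predict_tile(
    raster_path="path/to/large_raster.tif",
    patch_size=400,      # pixels per side of each window
    patch_overlap=0.25,  # fraction of overlap between neighboring windows
)
print(boxes.head())
```

The point is that patch_size controls how much of the tile is in memory per forward pass, so a smaller value should keep each window manageable even on a very large raster.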
I am trying to run inference (using predict_tile) on rasters of different sizes that sometimes cannot fit in the available RAM. Until the last release, predict_tile threw an exception when this happened; in those cases I split the raster into N windows and ran inference on each of them (see the sketch below). After the latest release this exception is no longer thrown and the process is simply killed. Do you have a suggestion for a workaround or solution for this issue?
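The splitting workaround is not part of deepforest itself; roughly, it looks like the following sketch using rasterio windows. The number of strips, the temporary-file handling, and the coordinate shift are my assumptions about the approach, not code from the thread:

```python
# Hypothetical sketch of the windowed workaround: split the raster into N
# horizontal strips, write each strip to a temporary GeoTIFF, and run
# predict_tile on each strip separately so only one strip is in RAM at a time.
import os
import tempfile

import pandas as pd
import rasterio
from rasterio.windows import Window
from deepforest import main

model = main.deepforest()
model.use_release()

raster_path = "path/to/large_raster.tif"
n_windows = 4  # number of strips; chosen so each strip fits in RAM
results = []

with rasterio.open(raster_path) as src:
    strip_height = src.height // n_windows
    for i in range(n_windows):
        # The last strip absorbs any remainder rows.
        height = strip_height if i < n_windows - 1 else src.height - i * strip_height
        window = Window(0, i * strip_height, src.width, height)
        data = src.read(window=window)

        profile = src.profile.copy()
        profile.update(height=height, width=src.width,
                       transform=src.window_transform(window))

        # Write the strip to a temporary file and predict on it.
        with tempfile.NamedTemporaryFile(suffix=".tif", delete=False) as tmp:
            strip_path = tmp.name
        with rasterio.open(strip_path, "w", **profile) as dst:
            dst.write(data)

        boxes = model.predict_tile(raster_path=strip_path, patch_size=400)
        if boxes is not None:
            # Shift box rows back into full-raster pixel coordinates.
            boxes["ymin"] += i * strip_height
            boxes["ymax"] += i * strip_height
            results.append(boxes)
        os.remove(strip_path)

predictions = pd.concat(results, ignore_index=True) if results else None
```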