Hi @DekuLiuTesla, thanks for your fantastic work!
I ran into a question while preparing the dataset following your instructions:
I noticed that the MatrixCity dataset (link) is organized into 10 blocks, with each block containing several images. However, in the ./matrix_city_aerial/sparse/0/train/image.bin file from the COLMAP results you provided (link), all the images are combined into a single file and are indexed sequentially from 1 to 5621.
Could you please clarify the correspondence between the images in your COLMAP results (indices 1-5621) and the original dataset blocks? Specifically, how can we map each image in the COLMAP results back to its original block and file in MatrixCity?
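For reference, this is how I listed the registered images; a minimal sketch, assuming pycolmap is installed and using the model path from above. COLMAP's binary model stores the filename each image was registered under, so if those names still encode the block, the mapping can be read off directly:

```python
import pycolmap

# Path taken from this issue; adjust to wherever the provided sparse model lives.
rec = pycolmap.Reconstruction("./matrix_city_aerial/sparse/0/train")

# Each entry keeps the filename it was registered under at COLMAP time.
for image_id in sorted(rec.images):
    print(image_id, rec.images[image_id].name)
```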
Thank you again for your help and your contributions to the community!
Hi, @YaoXingbo. The provided link is the COLMAP reconstruction for the whole scene, i.e., the combination of the 10 officially divided blocks. I'm afraid there is no direct way to map the COLMAP results back to the original blocks and files in MatrixCity. A more practical approach is to regenerate the COLMAP results with our script.
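If you do regenerate COLMAP yourself, one way to keep the block mapping is to encode the block name into each filename before feature matching, so it survives into images.bin. A minimal sketch, assuming a one-folder-per-block layout with .png renders; the paths and layout here are assumptions based on this issue, not our actual script:

```python
import shutil
from pathlib import Path

src_root = Path("MatrixCity/aerial/train")   # assumed: one subfolder per block
dst = Path("colmap_workspace/images")
dst.mkdir(parents=True, exist_ok=True)

for block_dir in sorted(p for p in src_root.iterdir() if p.is_dir()):
    for img in sorted(block_dir.glob("*.png")):
        # e.g. block_1/0001.png -> block_1__0001.png, so the block name
        # is recoverable from image.name in the resulting reconstruction.
        shutil.copy(img, dst / f"{block_dir.name}__{img.name}")
```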