This repository is the official implementation of the ICCVW'23 paper *APNet: Urban-level Scene Segmentation of Aerial Images and Point Clouds*.
To run our code, first install the dependencies with:

```shell
conda env create -f environment.yaml
```
Note: the code is still being cleaned up; the full release will follow.
Then run the following command:

```shell
sh run_files/train_eval.sh
```
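Putting the steps above together, a typical session looks like the sketch below. The environment name `apnet` is an assumption; use whatever name the `name:` field in `environment.yaml` actually defines.

```shell
# Create the conda environment from the provided spec (run once).
conda env create -f environment.yaml

# Activate it. "apnet" is an assumed name -- check the "name:"
# field at the top of environment.yaml for the real one.
conda activate apnet

# Launch training and evaluation via the provided script.
sh run_files/train_eval.sh
```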
If you use this repo, please cite:
```bibtex
@inproceedings{wei2023apnet,
  author    = {Weijie Wei and Martin R. Oswald and Fatemeh Karimi Nejadasl and Theo Gevers},
  title     = {{APNet: Urban-level Scene Segmentation of Aerial Images and Point Clouds}},
  booktitle = {{Proceedings of the IEEE International Conference on Computer Vision Workshops (ICCVW)}},
  year      = {2023}
}
```
Our code is heavily inspired by the following projects:
- RandLA-Net: https://github.com/QingyongHu/RandLA-Net
- RandLA-Net-pytorch: https://github.com/tsunghan-wu/RandLA-Net-pytorch
- HRNet: https://github.com/HRNet/HRNet-Semantic-Segmentation
- KPConv: https://github.com/HuguesTHOMAS/KPConv-PyTorch
- KPRNet: https://github.com/DeyvidKochanov-TomTom/kprnet
- SensatUrban-BEV-Seg3D: https://github.com/zouzhenhong98/SensatUrban-BEV-Seg3D
We thank the authors for their contributions.