This repository contains the code for "LaPred: Lane-Aware Prediction of Multi-Modal Future Trajectories of Dynamic Agents" (CVPR 2021) by ByeoungDo Kim, Seong Hyeon Park, Seokhwan Lee, Elbek Khoshimjonov, Dongsuk Kum, Junsoo Kim, Jeong Soo Kim, and Jun Won Choi. If you use this code, please cite:
```
@InProceedings{Kim_2021_CVPR,
    author    = {Kim, ByeoungDo and Park, Seong Hyeon and Lee, Seokhwan and Khoshimjonov, Elbek and Kum, Dongsuk and Kim, Junsoo and Kim, Jeong Soo and Choi, Jun Won},
    title     = {LaPred: Lane-Aware Prediction of Multi-Modal Future Trajectories of Dynamic Agents},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2021},
    pages     = {14636-14645}
}
```
- Download the nuScenes dataset.
- Run the script below to extract preprocessed samples.
- Provide the path to the downloaded data via the `--path` (`-p`) option (default: `./nuscenes/dataset`).

```
python dataset_preprocess.py -p [dataset-path]
```
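The option handling described above can be sketched with `argparse`. This is a hypothetical illustration, not the actual contents of `dataset_preprocess.py`; only the `--path`/`-p` flag and its default come from the documentation above.

```python
import argparse

# Hypothetical sketch of how dataset_preprocess.py's --path option might be
# declared; the default value matches the one documented above.
parser = argparse.ArgumentParser(description="Extract preprocessed nuScenes samples")
parser.add_argument("--path", "-p", default="./nuscenes/dataset",
                    help="root directory of the downloaded nuScenes data")

args = parser.parse_args([])  # no CLI arguments: falls back to the default
print(args.path)  # → ./nuscenes/dataset
```

Passing `-p /some/other/dir` on the command line overrides the default, as in the command above.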
- To train the LaPred model, run `run.py`:

```
python run.py -m Lapred_original
```
- After training, you can evaluate the model with the `--eval` (`-e`) option:

```
python run.py -m Lapred_original -e
```
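The two `run.py` invocations above differ only in the evaluation flag. A minimal `argparse` sketch of that interface follows; the long option names `--model` and `--eval` are assumptions (only `-m`, `-e`, and the model name `Lapred_original` appear in the commands above), and this is not the actual contents of `run.py`.

```python
import argparse

# Hypothetical sketch of run.py's documented options: -m selects the model
# configuration, and -e switches from training to evaluation mode.
parser = argparse.ArgumentParser(description="Train or evaluate LaPred")
parser.add_argument("--model", "-m", default="Lapred_original",
                    help="name of the model configuration to use")
parser.add_argument("--eval", "-e", action="store_true",
                    help="evaluate a trained model instead of training")

train_args = parser.parse_args(["-m", "Lapred_original"])       # training run
eval_args = parser.parse_args(["-m", "Lapred_original", "-e"])  # evaluation run
print(train_args.eval, eval_args.eval)  # → False True
```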