Siyuan Li*,1,2, Zedong Wang*,1, Zicheng Liu1,2, Di Wu1,2, Cheng Tan1,2, Jiangbin Zheng1,2, Yufei Huang1,2, Stan Z. Li†,1
Similar to natural language models, pre-trained genome language models have been proposed to capture the underlying intricacies within genomes via unsupervised sequence modeling. They have become essential tools for researchers and practitioners in biology. However, the hand-crafted tokenization policies used in these models may not encode the most discriminative patterns from the limited vocabulary of genomic data. In this paper, we introduce VQDNA, a general-purpose framework that renovates genome tokenization from the perspective of genome vocabulary learning. By leveraging vector-quantized codebooks as a learnable vocabulary, VQDNA can adaptively tokenize genomes into pattern-aware embeddings in an end-to-end manner. To further push its limits, we propose Hierarchical Residual Quantization (HRQ), where codebooks of varying scales are designed in a hierarchy to enrich the genome vocabulary in a coarse-to-fine manner. Extensive experiments on 32 genome datasets demonstrate VQDNA's superiority and favorable parameter efficiency compared to existing genome language models. Notably, empirical analysis of SARS-CoV-2 mutations reveals the fine-grained pattern awareness and biological significance of the learned HRQ vocabulary, highlighting its untapped potential for broader applications in genomics.
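To give a rough idea of the coarse-to-fine quantization described above, below is a minimal, illustrative sketch of hierarchical residual vector quantization in PyTorch. All class names, shapes, and hyperparameters here are assumptions for exposition only and do not reflect the repository's actual API or the paper's exact formulation; see the released code for the real implementation.

```python
# Illustrative sketch of hierarchical residual quantization (HRQ-style):
# each level's codebook quantizes the residual left by the coarser levels.
# Names, dimensions, and losses are hypothetical placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HRQSketch(nn.Module):
    """Quantize encoder features with a coarse-to-fine stack of codebooks."""

    def __init__(self, dim=128, codebook_sizes=(64, 256)):
        super().__init__()
        # One codebook per level; later (finer) levels refine the residual
        # left by earlier (coarser) levels.
        self.codebooks = nn.ModuleList(
            nn.Embedding(k, dim) for k in codebook_sizes
        )

    def forward(self, z):
        # z: (batch, seq_len, dim) continuous features from a genome encoder.
        residual = z
        quantized = torch.zeros_like(z)
        indices = []
        for codebook in self.codebooks:
            # Squared-distance nearest-neighbor lookup against this codebook.
            dist = (residual.pow(2).sum(-1, keepdim=True)
                    - 2 * residual @ codebook.weight.t()
                    + codebook.weight.pow(2).sum(-1))      # (B, L, K)
            idx = dist.argmin(dim=-1)                       # (B, L)
            selected = codebook(idx)                        # (B, L, dim)
            quantized = quantized + selected
            residual = residual - selected
            indices.append(idx)
        # Straight-through estimator so gradients flow back to the encoder.
        quantized = z + (quantized - z).detach()
        # Commitment-style loss pulling encoder outputs toward the codes.
        commit_loss = F.mse_loss(z, quantized.detach())
        return quantized, indices, commit_loss


if __name__ == "__main__":
    feats = torch.randn(2, 512, 128)   # dummy features for a 512-bp window
    hrq = HRQSketch()
    q, ids, loss = hrq(feats)
    print(q.shape, [i.shape for i in ids], loss.item())
```

The coarse codebook captures broad sequence patterns while finer codebooks encode the remaining residual detail, which is the intuition behind enriching the genome vocabulary in a coarse-to-fine manner.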
We will update the implementation of VQDNA after finishing our new genomic foundation model projects that build on VQDNA techniques. Please watch this repository for the latest release! Feel free to open issues if you have any questions about the experiments or implementation details.
If you find this repository helpful, please consider citing:
@inproceedings{icml2024vqdna,
  title={VQDNA: Unleashing the Power of Vector Quantization for Multi-Species Genomic Sequence Modeling},
  author={Siyuan Li and Zedong Wang and Zicheng Liu and Di Wu and Cheng Tan and Jiangbin Zheng and Yufei Huang and Stan Z. Li},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2024}
}