Since creating and continuously updating this Awesome-Attention-Heads repository, we have realized it is time to systematically summarize the existing work and give researchers a clear picture of the current research landscape. We have therefore released our survey, "Attention Heads of Large Language Models: A Survey". The LaTeX source of the survey is also available.
🔗 Related links:
arXiv Page: https://arxiv.org/abs/2409.03752
Hugging Face: https://huggingface.co/papers/2409.03752