
How to train only last few layers using FastLanguageModel #1320

Open
gneeraj97 opened this issue Nov 22, 2024 · 2 comments

Comments

@gneeraj97

I am trying to fine-tune a model using FastLanguageModel, and I only want to train the last few layers. When I pass the target module/layer, it throws an error saying that only "accepted_modules" are allowed. The only accepted modules are: ["q_proj", "k_proj", "v_proj", "o_proj", "gate_proj", "up_proj", "down_proj"].

Is there any way I can train only the last few layers using Unsloth?
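(Not part of the original thread: one common workaround, if plain PEFT is an option, is the `layers_to_transform` argument of `peft.LoraConfig`, which restricts the LoRA adapters to specific decoder-layer indices; check the behavior against your installed PEFT/Unsloth versions. The helper below is a hypothetical sketch that just computes the index list for the last N layers.)

```python
# Hypothetical helper: build the list of layer indices that PEFT's
# LoraConfig `layers_to_transform` expects, restricting LoRA to the
# last `n` decoder layers of a model with `num_layers` layers total.
def last_n_layer_indices(num_layers: int, n: int) -> list[int]:
    """Return the indices of the last n of num_layers layers."""
    if n > num_layers:
        raise ValueError("n cannot exceed num_layers")
    return list(range(num_layers - n, num_layers))


# For a 32-layer model, adapt only the final 4 layers:
print(last_n_layer_indices(32, 4))  # [28, 29, 30, 31]
```

With PEFT directly, the result could then be passed as `LoraConfig(..., layers_to_transform=last_n_layer_indices(32, 4))`; whether `FastLanguageModel.get_peft_model` forwards this argument is version-dependent, so treat it as an assumption to verify.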

@danielhanchen
Contributor

@gneeraj97
Author

When I do that, the unsloth library throws these warnings:

Not an error, but Unsloth cannot patch MLP layers with our manual autograd engine since either LoRA adapters are not enabled or a bias term (like in Qwen) is used.
Not an error, but Unsloth cannot patch Attention layers with our manual autograd engine since either LoRA adapters are not enabled or a bias term (like in Qwen) is used.
Not an error, but Unsloth cannot patch O projection layer with our manual autograd engine since either LoRA adapters are not enabled or a bias term (like in Qwen) is used.
Unsloth 2024.7 patched 32 layers with 4 QKV layers, 4 O layers and 4 MLP layers.

Is it okay to go ahead and fine-tune the model? What effect would these warnings have on training speed, model performance, etc.?
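(Not part of the original thread: before training, it can help to verify which parameters are actually trainable, to confirm only the intended layers carry LoRA adapters. With a real model you would iterate `model.named_parameters()` and inspect each tensor's `requires_grad`; the snippet below is a sketch using a small mock list in place of that iterator, with hypothetical parameter names.)

```python
# Hypothetical sanity check: count trainable vs. total parameters.
# Each tuple stands in for (name, numel, requires_grad) as you would
# derive them from model.named_parameters() on a real model.
mock_named_parameters = [
    ("model.layers.28.self_attn.q_proj.lora_A.weight", 1024, True),
    ("model.layers.28.self_attn.q_proj.weight", 4096, False),
    ("model.layers.31.mlp.up_proj.lora_B.weight", 2048, True),
]

trainable = sum(n for _, n, grad in mock_named_parameters if grad)
total = sum(n for _, n, _ in mock_named_parameters)
print(f"trainable: {trainable} / {total}")  # trainable: 3072 / 7168
```

PEFT-wrapped models also expose `print_trainable_parameters()` for the same purpose; the "cannot patch" warnings above generally mean Unsloth falls back to slower, unpatched kernels for those layers rather than breaking training, but that interpretation should be confirmed for your version.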
