
AttributeError: module 'google.protobuf.descriptor' has no attribute '_internal_create_key' #1302

Open
Sherlock-shy opened this issue Nov 18, 2024 · 2 comments

Comments

@Sherlock-shy

Hello, I was using Unsloth and trying to save the final model locally with the line:

model.save_pretrained_gguf("dir", tokenizer, quantization_method = "q4_k_m")

and got the following error:

Unsloth: Merging 4bit and LoRA weights to 16bit...
Unsloth: Will use up to 25.65 out of 44.1 RAM for saving.
100%|██████████| 32/32 [00:36<00:00, 1.13s/it]
Unsloth: Saving tokenizer... Done.
Unsloth: Saving model... This might take 5 minutes for Llama-7b...
Done.
==((====))==  Unsloth: Conversion from QLoRA to GGUF information
   \\   /|    [0] Installing llama.cpp will take 3 minutes.
O^O/ \_/ \    [1] Converting HF to GGUF 16bits will take 3 minutes.
\        /    [2] Converting GGUF 16bits to ['q4_k_m'] will take 10 minutes each.
 "-____-"     In total, you will have to wait at least 16 minutes.

Unsloth: [0] Installing llama.cpp. This will take 3 minutes...
Unsloth: [1] Converting model at dir into bf16 GGUF format.
The output location will be /mnt/c/Users/H0/Desktop/Coding/Python/unsloth/dir/unsloth.BF16.gguf
This will take 3 minutes...

AttributeError Traceback (most recent call last)
Cell In[18], line 1
----> 1 model.save_pretrained_gguf("dir", tokenizer, quantization_method = "q4_k_m")

File ~/miniconda3/envs/unsloth_env/lib/python3.10/site-packages/unsloth/save.py:1683, in unsloth_save_pretrained_gguf(self, save_directory, tokenizer, quantization_method, first_conversion, push_to_hub, token, private, is_main_process, state_dict, save_function, max_shard_size, safe_serialization, variant, save_peft_format, tags, temporary_location, maximum_memory_usage)
1680 is_sentencepiece_model = check_if_sentencepiece_model(self)
1682 # Save to GGUF
-> 1683 all_file_locations, want_full_precision = save_to_gguf(
1684 model_type, model_dtype, is_sentencepiece_model,
1685 new_save_directory, quantization_method, first_conversion, makefile,
1686 )
1688 # Save Ollama modelfile
1689 modelfile = create_ollama_modelfile(tokenizer, all_file_locations[0])

File ~/miniconda3/envs/unsloth_env/lib/python3.10/site-packages/unsloth/save.py:1093, in save_to_gguf(model_type, model_dtype, is_sentencepiece, model_directory, quantization_method, first_conversion, _run_installer)
1091 vocab_type = "spm,hfft,bpe"
1092 # Fix Sentencepiece model as well!
-> 1093 fix_sentencepiece_gguf(model_directory)
1094 else:
1095 vocab_type = "bpe"

File ~/miniconda3/envs/unsloth_env/lib/python3.10/site-packages/unsloth/tokenizer_utils.py:404, in fix_sentencepiece_gguf(saved_location)
398 """
399 Fixes sentencepiece tokenizers which did not extend the vocabulary with
...
127 serialized_end=1347,
128 )
129 _sym_db.RegisterEnumDescriptor(_TRAINERSPEC_MODELTYPE)

AttributeError: module 'google.protobuf.descriptor' has no attribute '_internal_create_key'

I have tried installing and uninstalling protobuf and protoc, making sure they are the same version, without any conflict issues.
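
In case it helps to narrow things down, a minimal check like the one below (just a diagnostic sketch, not Unsloth-specific) prints which protobuf build Python is actually importing and whether it has the attribute the traceback complains about:

```python
# Diagnostic sketch: confirm which protobuf runtime is imported and whether it
# exposes the attribute the traceback says is missing.
import google.protobuf
from google.protobuf import descriptor

print(google.protobuf.__version__)   # version of the runtime actually imported
print(google.protobuf.__file__)      # location it was imported from
print(hasattr(descriptor, "_internal_create_key"))  # False suggests an old/stale runtime
```

If the printed location is not inside the active conda environment, an older system-wide protobuf may be shadowing the pinned 3.19.6.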

Environment:

accelerate==1.1.1
aiohappyeyeballs==2.4.3
aiohttp==3.10.10
aiosignal==1.3.1
asttokens @ file:///home/conda/feedstock_root/build_artifacts/asttokens_1698341106958/work
async-timeout==4.0.3
attrs==24.2.0
bitsandbytes==0.44.1
Brotli @ file:///croot/brotli-split_1714483155106/work
certifi @ file:///home/conda/feedstock_root/build_artifacts/certifi_1725278078093/work/certifi
charset-normalizer @ file:///croot/charset-normalizer_1721748349566/work
comm @ file:///home/conda/feedstock_root/build_artifacts/comm_1710320294760/work
datasets==3.1.0
debugpy @ file:///croot/debugpy_1690905042057/work
decorator @ file:///home/conda/feedstock_root/build_artifacts/decorator_1641555617451/work
dill==0.3.8
docstring_parser==0.16
et-xmlfile==1.1.0
exceptiongroup @ file:///home/conda/feedstock_root/build_artifacts/exceptiongroup_1720869315914/work
executing @ file:///home/conda/feedstock_root/build_artifacts/executing_1725214404607/work
filelock @ file:///croot/filelock_1700591183607/work
frozenlist==1.5.0
fsspec==2024.9.0
gmpy2 @ file:///tmp/build/80754af9/gmpy2_1645455533097/work
hf_transfer==0.1.8
huggingface-hub==0.26.2
idna @ file:///croot/idna_1714398848350/work
importlib_metadata @ file:///home/conda/feedstock_root/build_artifacts/importlib-metadata_1726082825846/work
ipykernel @ file:///home/conda/feedstock_root/build_artifacts/ipykernel_1719845459717/work
ipython @ file:///home/conda/feedstock_root/build_artifacts/ipython_1729866374957/work
jedi @ file:///home/conda/feedstock_root/build_artifacts/jedi_1696326070614/work
Jinja2 @ file:///croot/jinja2_1730902924303/work
joblib==1.4.2
jupyter_client @ file:///home/conda/feedstock_root/build_artifacts/jupyter_client_1726610684920/work
jupyter_core @ file:///home/conda/feedstock_root/build_artifacts/jupyter_core_1727163409502/work
markdown-it-py==3.0.0
MarkupSafe @ file:///croot/markupsafe_1704205993651/work
matplotlib-inline @ file:///home/conda/feedstock_root/build_artifacts/matplotlib-inline_1713250518406/work
mdurl==0.1.2
mkl-service==2.4.0
mkl_fft @ file:///io/mkl313/mkl_fft_1730824109137/work
mkl_random @ file:///io/mkl313/mkl_random_1730823916628/work
mpmath @ file:///croot/mpmath_1690848262763/work
multidict==6.1.0
multiprocess==0.70.16
nest_asyncio @ file:///home/conda/feedstock_root/build_artifacts/nest-asyncio_1705850609492/work
networkx @ file:///croot/networkx_1720002482208/work
numpy @ file:///croot/numpy_and_numpy_base_1708638617955/work/dist/numpy-1.26.4-cp310-cp310-linux_x86_64.whl#sha256=d8cd837ed43e87f77e6efaa08e8de927ca030a1c9c5d04624432d6fb9a74a5ee
openpyxl @ file:///croot/openpyxl_1721752957391/work
packaging @ file:///home/conda/feedstock_root/build_artifacts/packaging_1718189413536/work
pandas==2.2.3
parso @ file:///home/conda/feedstock_root/build_artifacts/parso_1712320355065/work
peft==0.13.2
pexpect @ file:///home/conda/feedstock_root/build_artifacts/pexpect_1706113125309/work
pickleshare @ file:///home/conda/feedstock_root/build_artifacts/pickleshare_1602536217715/work
pillow @ file:///croot/pillow_1721059439630/work
platformdirs @ file:///home/conda/feedstock_root/build_artifacts/platformdirs_1726613481435/work
prompt_toolkit @ file:///home/conda/feedstock_root/build_artifacts/prompt-toolkit_1727341649933/work
propcache==0.2.0
protobuf==3.19.6
psutil @ file:///opt/conda/conda-bld/psutil_1656431268089/work
ptyprocess @ file:///home/conda/feedstock_root/build_artifacts/ptyprocess_1609419310487/work/dist/ptyprocess-0.7.0-py2.py3-none-any.whl
pure_eval @ file:///home/conda/feedstock_root/build_artifacts/pure_eval_1721585709575/work
pyarrow==18.0.0
Pygments @ file:///home/conda/feedstock_root/build_artifacts/pygments_1714846767233/work
PySocks @ file:///home/builder/ci_310/pysocks_1640793678128/work
python-dateutil @ file:///home/conda/feedstock_root/build_artifacts/python-dateutil_1709299778482/work
pytz==2024.2
PyYAML @ file:///croot/pyyaml_1728657952215/work
pyzmq @ file:///croot/pyzmq_1705605076900/work
regex==2024.11.6
requests @ file:///croot/requests_1730999120400/work
rich==13.9.4
safetensors==0.4.5
scikit-learn==1.5.2
scipy==1.14.1
sentencepiece==0.2.0
shtab==1.7.1
six @ file:///home/conda/feedstock_root/build_artifacts/six_1620240208055/work
stack-data @ file:///home/conda/feedstock_root/build_artifacts/stack_data_1669632077133/work
sympy==1.13.1
threadpoolctl==3.5.0
tokenizers==0.20.3
torch==2.5.1
torchaudio==2.5.1
torchvision==0.20.1
tornado @ file:///croot/tornado_1718740109488/work
tqdm==4.67.0
traitlets @ file:///home/conda/feedstock_root/build_artifacts/traitlets_1713535121073/work
transformers==4.46.2
triton==3.1.0
trl==0.12.0
typing_extensions @ file:///croot/typing_extensions_1715268824938/work
tyro==0.8.14
tzdata==2024.2
unsloth==2024.11.7
unsloth_zoo==2024.11.4
urllib3 @ file:///croot/urllib3_1727769808118/work
wcwidth @ file:///home/conda/feedstock_root/build_artifacts/wcwidth_1704731205417/work
xformers==0.0.28.post3
xxhash==3.5.0
yarl==1.17.1
zipp @ file:///home/conda/feedstock_root/build_artifacts/zipp_1726248574750/work

I have no idea what is wrong anymore.

@Erland366
Copy link
Contributor

Uh oh, can you try upgrading your protobuf library with `pip install -U protobuf`?
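
For example (the exact version pip resolves will depend on what else is pinned in the environment):

```bash
# Upgrade protobuf, then check the version that is actually importable.
pip install -U protobuf
python -c "import google.protobuf; print(google.protobuf.__version__)"
```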

@Sherlock-shy
Author

I tried, but it conflicts with unsloth-zoo, which requires protobuf<4.0.0.
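
If a newer 3.x release turns out to be enough, a pin like the one below would stay inside that constraint (a sketch only; the thread has not confirmed which version fixes the error):

```bash
# Stay within unsloth-zoo's protobuf<4.0.0 requirement while moving off 3.19.6
# (untested suggestion; 3.20.x is simply the newest 3.x series).
pip install "protobuf>=3.20,<4.0.0"
```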
