
Issue with Loading a SLIM Model from a local directory #939

Open
chaitanya-gvs opened this issue Jul 26, 2024 · 0 comments
chaitanya-gvs commented Jul 26, 2024

I am experiencing an issue when attempting to load the slim-sentiment-tool model using the load_model method in the ModelCatalog class. The method raises a ModelNotFoundException, indicating that it cannot identify the model card for the selected model.

Steps to Reproduce

  1. Download the model files into the directory /models/llmware/slim-sentiment-tool using the snippet below:
    from llmware.models import pull_model_from_hf, ModelCatalog
    model_card = ModelCatalog().lookup_model_card("slim-sentiment-tool")
    pull_model_from_hf(model_card, "/models/llmware/slim-sentiment-tool")
  2. Attempt to load the model using the following code:
    from llmware.models import ModelCatalog
    
    def analyse_sentiment(text):
        # point load_model at the local directory where the files were placed in step 1
        slim_model = ModelCatalog().load_model("/models/llmware/slim-sentiment-tool")
        response = slim_model.function_call(text, get_logits=True)
        analysis = ModelCatalog().logit_analysis(response, slim_model.model_card, slim_model.hf_tokenizer_name)
        llm_response = response['llm_response']
        confidence_score = float(analysis['confidence_score'])
        return llm_response, confidence_score
    
    text = "I am happy"
    llm_response, confidence_score = analyse_sentiment(text)
  3. Observe the following error:
    ModelNotFoundException: '/models/llmware/slim-sentiment-tool' could not be located
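
For comparison, the documented usage I have seen loads the model by its catalog name rather than by a filesystem path. A minimal sketch (assuming the default catalog and that llmware is allowed to pull the weights into its own model repo):

    from llmware.models import ModelCatalog

    # Loading by catalog name lets llmware resolve the model card itself;
    # passing a directory path appears to skip that lookup.
    slim_model = ModelCatalog().load_model("slim-sentiment-tool")
    response = slim_model.function_call("I am happy", get_logits=True)
    print(response["llm_response"])

So my question is specifically about pointing load_model at a previously downloaded local directory.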
    

Expected Behavior

The load_model method should correctly identify and load the model card for the slim-sentiment-tool model.

Actual Behavior

The method raises a ModelNotFoundException, indicating that it cannot identify the model card for the selected model.

Environment

  • llmware version: 0.3.3
  • Python version: 3.11

Request for Assistance

Could you please help identify why the load_model method cannot find the model card when given this local path, and suggest any changes needed to fix it?
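
One thing I was unsure about (this is an assumption on my part): whether load_model only resolves models that live inside llmware's own configured model repo. If LLMWareConfig exposes get_model_repo_path() in this version, comparing that path with the download directory above might narrow it down:

    from llmware.configs import LLMWareConfig

    # Assumption: load_model searches this repo path when resolving a model;
    # if /models/llmware/slim-sentiment-tool sits outside it, that could
    # explain the ModelNotFoundException.
    print(LLMWareConfig().get_model_repo_path())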

Thank you!
