ask_llm does not initialize LLM #190

Closed
haesleinhuepf opened this issue Aug 11, 2024 · 4 comments · Fixed by #196
Comments

@haesleinhuepf (Owner)

When ask_llm is called first, before anything else, the underlying LLM infrastructure is not initialized:

from bia_bob import ask_llm
ask_llm("Give...

Error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[2], line 1
----> 1 code = ask_llm("Give me a prolog example program involving birds, humans and lizards. Respond with the prolog code only.")
      2 code

File [C:\structure\code\bia-bob\src\bia_bob\_utilities.py:7](file:///C:/structure/code/bia-bob/src/bia_bob/_utilities.py#line=6), in ask_llm(prompt, image, chat_history)
      5 """Ask the language model a simple question and return the response."""
      6 from ._machinery import Context
----> 7 return generate_response(chat_history=chat_history,
      8                   image=image,
      9                   model=Context.model,
     10                   system_prompt="",
     11                   user_prompt=prompt,
     12                   vision_system_prompt="")

File [C:\structure\code\bia-bob\src\bia_bob\_utilities.py:83](file:///C:/structure/code/bia-bob/src/bia_bob/_utilities.py#line=82), in generate_response(chat_history, image, model, system_prompt, user_prompt, vision_system_prompt)
     78 if Context.endpoint is not None:
     79     full_response = generate_response_from_openai(model, system_prompt, user_prompt, chat_history, image,
     80                                                   base_url=Context.endpoint, api_key=Context.api_key,
     81                                                   vision_model=Context.vision_model,
     82                                                   vision_system_prompt=vision_system_prompt)
---> 83 elif "gpt-" in model:
     84     full_response = generate_response_from_openai(model, system_prompt, user_prompt, chat_history, image,
     85                                                   vision_model=Context.vision_model,
     86                                                   vision_system_prompt=vision_system_prompt)
     87 elif model == "gemini" or model == "gemini-pro" or model == "gemini-pro-vision":

TypeError: argument of type 'NoneType' is not iterable
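The TypeError at line 83 comes from the membership test `"gpt-" in model`: the `in` operator on a None value raises exactly this error. A minimal standalone reproduction of that failure mode:

```python
# Reproduce the failure mode: a membership test on None raises TypeError,
# which is what happens when Context.model is still None at this point.
model = None  # stands in for the uninitialized Context.model

try:
    "gpt-" in model
except TypeError as e:
    print(type(e).__name__, "-", e)
```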
@haesleinhuepf (Owner, Author)

git-bob think about this

git-bob (Contributor)

This message was generated by git-bob (version: 0.2.3, model: claude-3-5-sonnet-20240620), an experimental AI-based assistant. It can make mistakes and has limitations. Check its messages carefully.

The issue appears to be caused by Context.model being None when ask_llm is called. This happens because the LLM infrastructure is not initialized before the first call to ask_llm.

To fix this, we should modify the ask_llm function in _utilities.py to initialize the assistant if it hasn't been done already. Here's the proposed change:

def ask_llm(prompt, image=None, chat_history=[]):
    """Ask the language model a simple question and return the response."""
    from ._machinery import Context, init_assistant
    if Context.model is None:
        init_assistant()
    return generate_response(chat_history=chat_history,
                      image=image,
                      model=Context.model,
                      system_prompt="",
                      user_prompt=prompt,
                      vision_system_prompt="")

This modification ensures that the LLM infrastructure is initialized before attempting to generate a response, resolving the NoneType error.
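The fix is an instance of lazy initialization: check the shared state on first use and set it up if missing. A minimal self-contained sketch of the pattern, using simplified stand-in `Context` and `init_assistant` definitions (hypothetical placeholders, not the actual bia_bob internals):

```python
# Sketch of the lazy-initialization pattern proposed above.
# Context and init_assistant here are simplified stand-ins, not bia_bob code.
class Context:
    model = None  # shared state, unset until initialization

def init_assistant(model="example-model"):
    # In bia_bob this would also configure endpoints, API keys, etc.
    Context.model = model

def ask_llm(prompt):
    if Context.model is None:  # initialize on first use
        init_assistant()
    # Safe now: Context.model is guaranteed to be a string here.
    return f"[{Context.model}] response to: {prompt}"

print(ask_llm("hello"))
```

The key design point is that the None check makes ask_llm safe to call in any order relative to the rest of the API, at the cost of one extra comparison per call.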

@haesleinhuepf (Owner, Author)

Awesome! git-bob implement exactly this!

