ask_llm does not initialize LLM #190
git-bob think about this
This message was generated by git-bob (version: 0.2.3, model: claude-3-5-sonnet-20240620), an experimental AI-based assistant. It can make mistakes and has limitations. Check its messages carefully.

The issue appears to be caused by `Context.model` not being initialized when `ask_llm` is called first. To fix this, we should modify the `ask_llm` function:

```python
def ask_llm(prompt, image=None, chat_history=[]):
    """Ask the language model a simple question and return the response."""
    from ._machinery import Context, init_assistant
    # initialize the LLM infrastructure if it has not been set up yet
    if Context.model is None:
        init_assistant()
    return generate_response(chat_history=chat_history,
                             image=image,
                             model=Context.model,
                             system_prompt="",
                             user_prompt=prompt,
                             vision_system_prompt="")
```

This modification ensures that the LLM infrastructure is initialized before attempting to generate a response, resolving the error.
Awesome! git-bob implement exactly this!
When calling `ask_llm` first, the underlying infrastructure is not initialized.
Error: