
Using llmware models with Ollama Modelfile #1039

Open
mallahyari opened this issue Oct 8, 2024 · 7 comments

@mallahyari

Hi there,
I am trying to use SLIM models from llmware, such as 'slim-sql-tool', with Ollama, but I need to create a prompt template in the Modelfile and I was wondering what it should look like. Your HF config.json says:
"prompt_format": "<human>: {table_schema} \n {question} \n<bot>:"

Would you mind clarifying what the exact TEMPLATE for the Modelfile would be?
One example would make it clear for the other models too.

Thanks,
M.

@limcheekin

limcheekin commented Oct 30, 2024

@mallahyari I am a user of Ollama too; you may refer to my solution for LocalAI at https://github.com/limcheekin/llmware/blob/main/examples/Models/using-localai-slim-models.py.

This is not an optimal solution, but it is the workable solution I have found at the moment.

I would appreciate it if you could share a better solution here for Ollama.

Hopefully we will hear from the llmware team soon on this matter.

@doberst
Contributor

doberst commented Oct 30, 2024

@limcheekin & @mallahyari - thanks for raising this - the general prompt format for a SLIM model looks like this:

full_prompt = "<human>: " + {{context}} + "\n" + "<{}> {} </{}>".format(function, params, function) + "\n<bot>:"

context = the core text passage (or, in the case of a SQL model, the SQL table schema)

function (e.g., "classify") is provided in the model configs for the model - both in the llmware/model_configs.py model card and in the Hugging Face repo config.json file.

params vary by model: some models do not require them (or treat them as optional), while others treat the params as the 'input' into the function, e.g., in the case of the SQL model this is the query, and in the case of a slim-extract model it would be the extraction key (e.g., "revenue growth").

For the SQL model in particular, you can simplify the prompt even further, as suggested above:

prompt = "<human>: " + {{sql_table_schema}} + "\n" + {{natural_language_query}} + "\n" + "<bot>:"
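For concreteness, here is a small Python sketch of both variants (the schema, question, and passage strings below are made-up examples, not from any real dataset):

```python
# Sketch of the two SLIM prompt variants described above, using the
# <human>/<bot> wrapper from the SLIM prompt format.

def slim_prompt(context: str, function: str, params: str) -> str:
    """General SLIM prompt: context, then the function call, then the bot turn."""
    return "<human>: " + context + "\n<{0}> {1} </{0}>\n<bot>:".format(function, params)

def slim_sql_prompt(table_schema: str, question: str) -> str:
    """Simplified slim-sql prompt: schema, then the natural language query."""
    return "<human>: " + table_schema + "\n" + question + "\n<bot>:"

# Made-up examples for illustration only.
schema = "CREATE TABLE customers (id INTEGER, name TEXT, annual_revenue REAL);"
print(slim_sql_prompt(schema, "Which customer has the highest annual revenue?"))
print(slim_prompt("The product was terrible and shipping was slow.",
                  "classify", "sentiment"))
```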

Hope this answers the question - welcome more dialog and discussion on this topic - we would be delighted to collaborate on better packaging of the SLIM prompt templates into Ollama ModelFiles - let me know! :)
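As a starting point, an (untested) Ollama Modelfile along these lines might look like the sketch below - the file path and stop parameters are assumptions, and the caller would pack the table schema and the natural language question into a single prompt string:

```
FROM ./slim-sql.gguf

# Untested sketch: wraps Ollama's prompt variable in the SLIM
# human/bot format described above. The caller sends
# schema + "\n" + question as the prompt.
TEMPLATE """<human>: {{ .Prompt }}
<bot>:"""

PARAMETER stop "<human>:"
PARAMETER stop "<bot>:"
```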

@mallahyari
Author

@doberst @limcheekin Thank you for the responses. Let me clarify my problem. I am trying to use llmware SLIM models in Ollama (without using the llmware library) for a project. After I download the GGUF model from HF, I need to create a "Modelfile" to be able to add it to my ollama list. In the Modelfile I have something like this:

FROM /Downloads/slim-sql.gguf

TEMPLATE """
<human>: {{ .Prompt }} <bot>:
"""

This TEMPLATE must be the correct chat prompt template; otherwise, the model generations will be meaningless. I tried different things based on your HF config, but nothing worked, including the above template format. I was wondering what the correct TEMPLATE is?

Thank you,
M.

@limcheekin

limcheekin commented Oct 31, 2024

@mallahyari I get you. That's exactly my use case: I'm using the SLIM models in LocalAI without using the llmware library. I'm currently defining a LocalAI model YAML file for each SLIM model, for example slim-sentiment-tool.yaml. A minimal definition along those lines is sketched below.
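For anyone following along, here is an untested sketch of such a LocalAI model YAML; the file name, stop words, and template are assumptions based on the SLIM prompt format discussed above, not a copy of the linked file:

```yaml
name: slim-sentiment-tool
parameters:
  model: slim-sentiment-tool.gguf
  temperature: 0.0
template:
  # Wrap the caller's input in the SLIM human/bot format, with the
  # sentiment classification function call baked into the template.
  completion: |
    <human>: {{.Input}}
    <classify> sentiment </classify>
    <bot>:
stopwords:
  - "<human>:"
```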

@doberst Thanks for your support. I updated the using-localai-slim-models.py example to show how to use the SLIM models in LocalAI with the llmware library; I'd appreciate your review and feedback if there's a better way. I have one question: how do I enable function_call support for OpenChatModel? I'm currently hitting the error AttributeError: 'OpenChatModel' object has no attribute 'function_call'.

@mallahyari
Author

@limcheekin Thank you for the code sample. I am going to try it out and see whether this format works with Ollama or not.

@doberst It would be great if there were some info/guide on how to use llmware SLIM models with Ollama, especially the prompt template format.

@limcheekin

@mallahyari Any update? Do you need any help?

@mallahyari
Author

@limcheekin I haven't been able to try it yet due to being super busy! When I do, I will let you know. :)
