No local interfaces? #1440
Unanswered
ZeChArtiahSaher asked this question in Q&A
-
I hope someone here will take up the gauntlet and add a connection to this awesome software, LM Studio. That way you won't have to keep paying OpenAI. This is how it looks in their GUI: load a model, start the server, and run the example in your terminal, choosing between streaming and non-streaming mode by setting the "stream" field. The example begins with `curl http://localhost:1234/v1/chat/completions`.
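For reference, a minimal sketch of what that request looks like in full. The endpoint, default port, and the "stream" field come from LM Studio's OpenAI-compatible local server; the message contents and sampling parameters below are placeholders, not LM Studio's exact GUI snippet.

```sh
# Minimal sketch of a chat completion request against LM Studio's local
# OpenAI-compatible server (default port 1234). The JSON body is a
# placeholder example, not LM Studio's exact GUI snippet.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      { "role": "system", "content": "You are a helpful assistant." },
      { "role": "user", "content": "Hello!" }
    ],
    "temperature": 0.7,
    "stream": false
  }'
```

Set `"stream": true` to receive the response as server-sent chunks instead of a single JSON object.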
-
I understand there may be concerns about compliance or the volatility of open source, but have you seen how well the latest local models perform? Why not offer options for interfacing with something like the LM Studio API, or any open-source API for locally run LLMs? Since these servers speak the OpenAI wire format, this could be as simple as a configurable base URL, as sketched below.

Also, why not offer some form of search that doesn't require querying paid APIs?
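To illustrate, a minimal sketch assuming a configurable base URL (the `OPENAI_BASE_URL` variable and its default here are hypothetical, not an existing setting in this project): because LM Studio and similar local servers expose OpenAI-compatible endpoints, supporting them mostly means letting the user choose where the request is sent.

```sh
# Minimal sketch, assuming a hypothetical configurable base URL:
# the same OpenAI-style request works against a local server simply
# by pointing it somewhere other than api.openai.com.
BASE_URL="${OPENAI_BASE_URL:-http://localhost:1234/v1}"  # LM Studio's default port

# Local servers typically need no Authorization header; against
# api.openai.com you would add: -H "Authorization: Bearer $OPENAI_API_KEY"
curl "$BASE_URL/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{ "role": "user", "content": "Say hello." }],
    "stream": false
  }'
```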