Tools like Guidance help by constraining the model during text generation, not by improving the prompt itself.
A new paper called "Efficient Guided Generation for Large Language Models" does the same but with a cheaper runtime. One can provide a regex that guides the model during text generation. This could help with syntax-heavy tasks, e.g. NER token-id lists (`[0, 1, 2, 3]`) or inline tagging (`Alex B-PER is O going O to O Los B-LOC Angeles I-LOC`).
The algorithm requires access to the decoding loop itself (i.e. the token logits at each step), so it would only work with self-hosted models.
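For illustration, here is a minimal sketch of the general idea (not the paper's algorithm): at each decoding step, mask out every token whose addition would make the output stop being a viable prefix of a string matching the regex. This naive version scans the whole vocabulary per step; the paper's contribution is to precompute a token-level finite-state index so the allowed-token lookup becomes cheap. The model name and pattern below are placeholders.

```python
# Naive regex-guided decoding sketch (assumes the third-party `regex` module,
# which supports partial matching, plus transformers + torch).
import regex
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # placeholder; any causal LM with logit access works
PATTERN = regex.compile(r"\[(\d+)(, \d+)*\]")  # e.g. NER token-id lists like [0, 1, 2, 3]

tok = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

def viable(text: str) -> bool:
    """True if `text` matches the pattern or could still be extended into a match."""
    return PATTERN.fullmatch(text, partial=True) is not None

def guided_generate(prompt: str, max_new_tokens: int = 32) -> str:
    ids = tok(prompt, return_tensors="pt").input_ids
    generated = ""
    for _ in range(max_new_tokens):
        logits = model(ids).logits[0, -1]
        # Naive vocabulary scan: forbid every token that would break the pattern.
        mask = torch.full_like(logits, float("-inf"))
        for token_id in range(len(tok)):
            if viable(generated + tok.decode([token_id])):
                mask[token_id] = 0.0
        next_id = int(torch.argmax(logits + mask))
        generated += tok.decode([next_id])
        ids = torch.cat([ids, torch.tensor([[next_id]])], dim=-1)
        if PATTERN.fullmatch(generated):  # stop once the pattern is complete
            break
    return generated

print(guided_generate("Output the NER label ids as a list: "))
```

This also shows why self-hosting is required: the constraint is applied to the raw logits at every step, which hosted APIs generally don't expose.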