Replies: 1 comment
Nice! For HuggingChat it's a trade-off between adding a hidden system prompt that lists features & guardrails, or letting users set it manually in their settings so they have full control. That means by default the model is not aware that LaTeX is supported 😅 But you can always edit the instructions indeed!
chat-ui supports KaTeX/LaTeX rendering - but for this to work you need to tell the LLM that the feature is available. The following addition to the system prompt seems to work well:
Katex/Latex is supported: Use $_CONTENT_$ for inline, $$_CONTENT_$$ for display. Use "$" instead of "\(" and "$$" instead of "\[".
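If you want this baked into chat-ui itself rather than pasted into each Assistant, one place to put it is the model's default system prompt in `.env.local`. This is only a sketch under assumptions - the model name, endpoint, and surrounding fields here are placeholders, so check chat-ui's own example `.env` for your setup; the relevant part is appending the snippet to the `preprompt`:

```env
# .env.local - sketch only; model name and endpoint are placeholders
MODELS=`[
  {
    "name": "meta-llama/Meta-Llama-3.1-70B-Instruct",
    "preprompt": "You are a helpful AI Assistant\nKatex/Latex is supported: Use $_CONTENT_$ for inline, $$_CONTENT_$$ for display. Use \"$\" instead of \"\\(\" and \"$$\" instead of \"\\[\".",
    "endpoints": [{ "type": "openai", "baseURL": "http://localhost:8000/v1" }]
  }
]`
```

Assistant prompts and the per-user system prompt in settings work the same way - the snippet just needs to end up somewhere in the system prompt the model actually receives.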
I've tested a few variants of this against Sonnet 3.5, GPT-4o, GPT-4o mini, and Llama 3.1 70B. Both sentences are helpful (the "instead of" sentence was added after testing with Sonnet 3.5).
Screenshots for demonstration are below (these are from Llama 3.1 70B). The prompt used for testing was:
provide a comprehensive explanation of the Partial Derivative function.
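To make the delimiter point concrete (a hypothetical fragment, not a transcript of the screenshots below): without the hint, models tend to emit `\( ... \)` and `\[ ... \]` delimiters, which is exactly what the "instead of" sentence steers them away from; with it, the same equation comes back dollar-delimited and renders, along the lines of:

```latex
% With the snippet, a display equation arrives dollar-delimited and KaTeX renders it:
$$
\frac{\partial f}{\partial x}(x_0, y_0) = \lim_{h \to 0} \frac{f(x_0 + h, y_0) - f(x_0, y_0)}{h}
$$
% Without it, the same content typically arrives as \[ ... \] and shows up as raw markup.
```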
Thought it worth making a note for people setting up chat-ui or Assistant prompts where this knowledge might be helpful.
For testing, the base system prompt was:
You are a helpful AI Assistant
Then the same prompt was run with the above snippet added.
Without: [screenshot]
With: [screenshot]