
AzureAI: Change default max_tokens for Llama models to 2048 (4096 cur… #2686

Triggered via push on November 28, 2024, 14:02
Status: Success
Total duration: 20s

Workflow file: log_viewer.yml (on: push)
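
For context, here is a minimal sketch of what a push-triggered workflow such as log_viewer.yml could look like. Only the filename and the `on: push` trigger come from this run page; the workflow name, job, runner image, and steps below are assumptions for illustration, not the repository's actual file.

```yaml
# Hypothetical sketch of a push-triggered GitHub Actions workflow.
# Only the filename (log_viewer.yml) and the `on: push` trigger are
# taken from the run summary; everything else is assumed.
name: log_viewer

on: push

jobs:
  log-viewer:
    # Runner image and steps are placeholders.
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run log viewer checks
        run: echo "placeholder step"
```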