AzureAI: Change default max_tokens for Llama models to 2048 (4096 currently yields an error w/ Llama 3.1) #2685
Triggered via pull request November 28, 2024 13:59
Status Success
Total duration 20s
log_viewer.yml

on: pull_request
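The change described in the PR title can be sketched roughly as follows. This is a minimal, hypothetical illustration (the function name `default_max_tokens` and the substring check are assumptions, not the project's actual code): Llama-family deployments on Azure AI get a 2048-token default, since the previous 4096 default reportedly errors with Llama 3.1.

```python
def default_max_tokens(model_name: str) -> int:
    """Pick a default max_tokens value per model family (hypothetical sketch)."""
    # Assumption: model names containing "llama" identify Llama-family deployments.
    if "llama" in model_name.lower():
        return 2048  # 4096 currently yields an error with Llama 3.1 on Azure AI
    return 4096  # other models keep the prior default


print(default_max_tokens("Meta-Llama-3.1-8B-Instruct"))  # → 2048
print(default_max_tokens("gpt-4o"))  # → 4096
```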