AzureAI: Change default max_tokens for Llama models to 2048 (4096 currently yields an error w/ Llama 3.1) #2684

Triggered via pull request on November 28, 2024 at 13:58
Status: Success
Total duration: 24s
Workflow: log_viewer.yml (on: pull_request)