alpha is not needed for ADOPT (it is calculated internally using lr)

scap3yvt authored Nov 21, 2024
1 parent 8a85afe commit 7ed6649
Showing 1 changed file with 0 additions and 1 deletion.

GANDLF/optimizers/thirdparty/adopt.py (0 additions, 1 deletion)

@@ -514,7 +514,6 @@ def adopt_wrapper(parameters: dict) -> torch.optim.Optimizer:
         lr=parameters.get("learning_rate", 1e-3),
         betas=parameters.get("betas", (0.9, 0.999, 0.9999)),
         eps=parameters.get("eps", 1e-8),
-        alpha=parameters.get("alpha", 5.0),
         weight_decay=parameters.get("weight_decay", 0.0),
         decoupled=parameters["optimizer"].get("decoupled", False),
         foreach=parameters.get("foreach", None),
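
For context, a minimal sketch of how the wrapper call reads after this commit. Assumptions not confirmed by the diff: the wrapped class is named ADOPT (as in the upstream ADOPT reference implementation vendored into this file) and takes the model's parameters as its first argument; the model and the "model_parameters" key are purely illustrative. Any leftover "alpha" key in the config is now simply ignored, since the commit message says the step size is calculated internally from lr.

    import torch

    # Assumed import path/class name based on the changed file, not shown in the diff.
    from GANDLF.optimizers.thirdparty.adopt import ADOPT

    model = torch.nn.Linear(10, 2)  # hypothetical model for illustration

    parameters = {
        "model_parameters": model.parameters(),  # hypothetical key, for this sketch only
        "learning_rate": 1e-3,
        "optimizer": {"type": "adopt"},  # "decoupled" falls back to False below
        # "alpha": 5.0,  # no longer read after this commit; derived internally from lr
    }

    # Keyword arguments copied verbatim from the post-change hunk above.
    optimizer = ADOPT(
        parameters["model_parameters"],
        lr=parameters.get("learning_rate", 1e-3),
        betas=parameters.get("betas", (0.9, 0.999, 0.9999)),
        eps=parameters.get("eps", 1e-8),
        weight_decay=parameters.get("weight_decay", 0.0),
        decoupled=parameters["optimizer"].get("decoupled", False),
        foreach=parameters.get("foreach", None),
    )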
