Hello, I was wondering how to use the existing optimizers for inner optimization problems that are not in least-squares form, such as general nonlinear or convex problems. Is it possible to integrate with torch optimizers such as Adam, or is it necessary to implement a Theseus-specific optimizer? I attempted to solve a convex problem with the existing optimizers designed for least squares, and the downstream performance was not ideal, which led me to suspect the optimizer has something to do with it.
Replies: 1 comment 1 reply
Hi @bc0428. We have one optimizer that doesn't require the problem to be in the form of least squares, namely `th.DCEM`, which implements the differentiable cross-entropy method (paper). To keep the objective from encoding a squared error, you also need to change the default cost aggregation by passing a custom error metric to the objective; you can use the implementation of the default sum of squares as a guide. Let me know if you have additional questions.