Home
Welcome to the Barycenter method wiki!
This repository implements the method described in https://arxiv.org/abs/1801.10533. It is a direct optimization method that estimates a function's optimum from pointwise evaluations alone, without requiring the function's gradient.
There are two versions of the method: batch and recursive. In the batch version, given a collection of investigation points, the estimate of the optimum is the weighted average of their coordinates, where each weight is the exponential of the negative of the respective function value multiplied by a constant (a method hyperparameter). In the recursive version, a Gaussian noise vector is added to the previous estimate to obtain the current reading position.
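The batch rule above can be sketched in a few lines. This is a minimal illustration, not the repository's implementation: the function, the hyperparameter value `nu`, and the point distribution are all assumptions chosen for the example.

```python
import numpy as np

def batch_barycenter(points, f, nu=1.0):
    """Batch barycenter estimate (sketch; `nu` is the method hyperparameter).

    Each point is weighted by exp(-nu * f(point)); the weighted average
    of the coordinates estimates the minimizer among the given points.
    """
    values = np.array([f(p) for p in points])
    weights = np.exp(-nu * values)
    return weights @ points / weights.sum()

# Toy usage: a convex quadratic whose minimizer is the origin.
rng = np.random.default_rng(0)
points = rng.uniform(-10.0, 10.0, size=(500, 2))
estimate = batch_barycenter(points, lambda x: x @ x, nu=2.0)
print(estimate)  # close to [0, 0]
```

Points far from the minimizer receive exponentially small weight, so the average concentrates on the best samples.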
Intuitively, the batch estimate stays within the convex hull of the provided points. To compensate for this drawback, the method also has its recursive version, which requires an arbitrary initial point and introduces a "curiosity factor": the barycenter found in the previous iteration is incremented by a Gaussian variable with known mean and standard deviation. One can "help" the curiosity factor by choosing, as the mean of the Gaussian variable, a value proportional to the previous barycenter step. Given the required constants, the method has an asymptotic behaviour resembling classical gradient methods such as Newton-Raphson, conjugate gradients, or gradient descent.
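A minimal sketch of the recursive version as described above. The names `nu` and `sigma`, the zero-mean curiosity noise, and the running-average update are the author of this page's reading of the description, not code taken from the repository.

```python
import numpy as np

def recursive_barycenter(f, x0, nu=1.0, sigma=0.5, n_iter=200, seed=0):
    """Recursive barycenter (sketch; parameter names are assumptions).

    Each query point is the previous estimate plus Gaussian "curiosity"
    noise; the estimate is the running weighted average of the queries.
    """
    rng = np.random.default_rng(seed)
    xhat = np.asarray(x0, dtype=float)  # current barycenter estimate
    mass = 0.0                          # accumulated weight
    for _ in range(n_iter):
        # Curiosity: probe around the current estimate.
        x = xhat + rng.normal(0.0, sigma, size=xhat.shape)
        w = np.exp(-nu * f(x))
        mass += w
        xhat += (w / mass) * (x - xhat)  # running weighted average
    return xhat

est = recursive_barycenter(lambda x: (x - 3) @ (x - 3), x0=np.zeros(2), nu=2.0)
print(est)  # drifts toward the minimizer [3, 3]
```

Because better probes get exponentially larger weight, the estimate can escape the region of the initial point, unlike the batch version.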
For the sake of completeness, consider a noisy objective function. It may then be necessary to explore the neighborhood of the found solution to confirm optimality and discard noise effects. For this, a relaxation factor is implemented.
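The wiki does not spell out how the relaxation factor enters the update. One plausible realization (an assumption here, loosely inspired by the recursive formulation in the referenced article) is a forgetting factor `lam` on the accumulated weight, so old evaluations are discounted and the estimate keeps averaging over fresh, noisy readings:

```python
import numpy as np

def recursive_barycenter_relaxed(f, x0, nu=1.0, sigma=0.5, lam=0.95,
                                 n_iter=300, seed=0):
    """Recursive barycenter with a relaxation (forgetting) factor `lam`.

    Sketch only: the symbol and exact placement of the relaxation factor
    are assumptions, not taken verbatim from the repository.
    """
    rng = np.random.default_rng(seed)
    xhat = np.asarray(x0, dtype=float)
    mass = 0.0
    for _ in range(n_iter):
        x = xhat + rng.normal(0.0, sigma, size=xhat.shape)
        w = np.exp(-nu * f(x))
        mass = lam * mass + w        # lam < 1 discounts old evaluations
        xhat += (w / mass) * (x - xhat)
    return xhat

# Noisy convex objective: the relaxation keeps averaging out the noise.
noise_rng = np.random.default_rng(1)
noisy = lambda x: (x - 1) @ (x - 1) + 0.1 * noise_rng.normal()
est = recursive_barycenter_relaxed(noisy, np.zeros(2), nu=2.0)
print(est)
```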
An application of the method to machine learning models is available (in Portuguese) in [1].
This tutorial is for Unix users. The first step is to clone the repository onto the desired machine. To do so, go to the folder where you want to clone it and, in a terminal, type:
git clone [email protected]:brunolnetto/baryopt.git
Open Matlab and add the files to its path by typing the following commands. They allow the files to be accessed from anywhere within the Matlab environment.
addpath('$BARYOPT_PATH')
addpath(genpath('$BARYOPT_PATH'))
savepath
The string '$BARYOPT_PATH' stands for the path of the cloned repository.
So far, the author implements the optimization of a convex function here. The hyperparameters described in the section "Overview" were chosen iteratively: take care to avoid overflow of the solution caused by the exponential; moreover, depending on the number of iterations the method requires, the chosen values may not suffice to obtain the desired precision.
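The overflow caution above has a standard remedy, shown here as a sketch (the helper name `stable_weights` is not from the repository): since the barycenter is unchanged when all weights are scaled by a common factor, one can shift the function values before exponentiating.

```python
import numpy as np

def stable_weights(values, nu):
    """Weights exp(-nu * f) computed in a numerically safe way.

    Subtracting the minimum before exponentiating leaves the barycenter
    unchanged (weights matter only up to a common factor) but pins the
    largest exponent at zero, avoiding overflow and total underflow.
    """
    v = nu * (np.asarray(values) - np.min(values))
    return np.exp(-v)

big = np.array([1e4, 1e4 + 1.0, 1e4 + 2.0])
print(stable_weights(big, nu=5.0))   # finite: [1.0, e**-5, e**-10]
print(np.exp(-5.0 * big))            # naive version underflows to all zeros
```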
Finally, the script plots the contour of the function and each step the method takes to reach a neighborhood of the optimum point.
First, you must have Python installed on your computer. For that, take a look at https://www.python.org/downloads/.
The script $BARYOPT_PATH/python/examples/main.py implements both the batch and recursive versions of the method. As investigation points, the author chooses a uniform distribution of points over [-10, 10] x [-10, 10]. For instance, consider only the method originally proposed in the article mentioned in the section "Overview", i.e. 'normal'. Run it with:
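For readers without the repository at hand, the described setup can be mirrored standalone. This sketch does not reproduce main.py itself; the objective function and the hyperparameter value are assumptions, only the uniform sampling over [-10, 10] x [-10, 10] comes from the text.

```python
import numpy as np

# Investigation points drawn uniformly over [-10, 10] x [-10, 10].
rng = np.random.default_rng(42)
points = rng.uniform(-10.0, 10.0, size=(1000, 2))

f = lambda x: x @ x                   # convex test objective (assumption)
nu = 2.0                              # method hyperparameter (assumption)
weights = np.exp(-nu * np.array([f(p) for p in points]))
estimate = weights @ points / weights.sum()
print(estimate)                       # near the true minimizer [0, 0]
```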
python main.py
[1] Pellicer, Lucas Francisco Amaral Orosco, and Felipe Miguel Pait. "BarySearch: Algoritmo de tuning de Modelos de Machine Learning com o Método do Baricentro." Anais do XIV Brazilian e-Science Workshop. SBC, 2020.