- What for?
  - A Hypervolume Newton Method for solving continuous multi-objective optimization problems (MOPs),
  - enabled by the analytical computation of the hypervolume Hessian matrix.
- Why?
  - The Newton method exhibits local quadratic convergence under mild conditions on the objective functions.
  - Perhaps you'd like to refine the final outcome of some direct optimizer with the Hypervolume Newton method.
- When to use it? When the objective function is at least twice continuously differentiable.
Specifically, you will find the following major functionalities:
- `hvd.HypervolumeDerivatives`: the analytical computation of the HV Hessian matrix, specifically Alg. 2 described in [DEW22].
- `hvd.HVN`: the Hypervolume Newton Method for (constrained) multi-objective optimization problems, described in [WED+22].
A PyPI package will be available soon.
For now, please install the latest version from the master branch:
```bash
git clone https://github.com/wangronin/HypervolumeDerivatives.git
cd HypervolumeDerivatives && python setup.py install --user
```
Consider an objective function that maps an n-dimensional decision vector to m objective values, all of which are to be minimized. The example below first computes the HV Hessian with respect to the objective points, then with respect to the decision points, and finally runs the Hypervolume Newton method:
```python
import numpy as np
from hvd import HypervolumeDerivatives
from hvd.newton import HVN
# Compute the HV Hessian w.r.t. the objective points
ref = np.array([9, 10, 12])
hvh = HypervolumeDerivatives(n_var=3, n_obj=3, ref=ref, minimization=True)
out = hvh.compute_derivatives(X=np.array([[5, 3, 7], [2, 1, 10]]), compute_hessian=True)
# Centers of the three quadratic objectives (points in the decision space)
c1 = np.array([1.5, 0, np.sqrt(3) / 3])
c2 = np.array([1.5, 0.5, -np.sqrt(3) / 6])
c3 = np.array([1.5, -0.5, -np.sqrt(3) / 6])
ref = np.array([24, 24, 24])
# Define the objective function and its derivatives
def MOP1(x):
    x = np.array(x)
    return np.array(
        [
            np.sum((x - c1) ** 2),
            np.sum((x - c2) ** 2),
            np.sum((x - c3) ** 2),
        ]
    )

def MOP1_Jacobian(x):
    x = np.array(x)
    return np.array(
        [
            2 * (x - c1),
            2 * (x - c2),
            2 * (x - c3),
        ]
    )
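# Each objective is a squared Euclidean distance to a center, so its Hessian is the constant matrix 2 * I.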
def MOP1_Hessian(x):
    return np.array([2 * np.eye(3), 2 * np.eye(3), 2 * np.eye(3)])
# Compute the HV Hessian w.r.t. the decision points
hvh = HypervolumeDerivatives(
    n_var=3, n_obj=3, ref=ref, func=MOP1, jac=MOP1_Jacobian, hessian=MOP1_Hessian
)
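# Sample 20 decision vectors as random convex combinations of c1, c2, and c3;
# for this problem the Pareto set is the convex hull of the three centers.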
w = np.random.rand(20, 3)
w /= np.sum(w, axis=1).reshape(-1, 1)
X = w @ np.vstack([c1, c2, c3])
out = hvh.compute(X)
# Hypervolume Newton Method
max_iters = 10
mu = 20
ref = np.array([20, 20, 20])
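# Initial population: mu random convex combinations of the three centers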
w = np.abs(np.random.rand(mu, 3))
w /= np.sum(w, axis=1).reshape(-1, 1)
x0 = w @ np.vstack([c1, c2, c3])
opt = HVN(
    n_var=3,
    n_obj=3,
    ref=ref,
    func=MOP1,
    jac=MOP1_Jacobian,
    hessian=MOP1_Hessian,
    X0=x0,
    xl=np.full(3, -2),
    xu=np.full(3, 2),
    max_iters=max_iters,
    verbose=True,
)
X, Y, stop = opt.run()
```

## Hypervolume (HV) Indicator of a point set

The hypervolume indicator (HV) of a set of points is the m-dimensional Lebesgue measure of the space that is jointly dominated by the set of objective function vectors and bounded from above by a reference point.
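In symbols, using the standard definition (not specific to this package), for a finite set of objective vectors Y in R^m and a reference point r:

$$
\operatorname{HV}(Y; r) = \lambda_m\left(\bigcup_{y \in Y} [y, r]\right), \qquad [y, r] = \{ z \in \mathbb{R}^m : y \preceq z \preceq r \},
$$

where $\lambda_m$ denotes the m-dimensional Lebesgue measure and $\preceq$ is the componentwise order used for minimization.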
We show an example of the 3D hypervolume indicator and the geometrical meaning of its partial derivatives below. In this chart, three objective functions are to be minimized and three objective points are depicted; the first-order partial derivative of HV with respect to a coordinate of one of these points corresponds (up to sign) to the area of the facet of the dominated region that moves when this coordinate changes.
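To make the definition concrete, here is a minimal, self-contained sketch (a hypothetical helper, not part of the `hvd` API) that computes the 2D hypervolume of a small point set by summing the rectangles of the jointly dominated region, assuming minimization:

```python
import numpy as np

def hv_2d(points, ref):
    """Area jointly dominated by `points` and bounded by `ref` (minimization)."""
    # Keep only points strictly better than the reference point, sorted by f1.
    pts = sorted((p for p in points if np.all(np.asarray(p) < ref)), key=lambda p: p[0])
    area, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:  # skipped otherwise: dominated by an already-swept point
            area += (ref[0] - f1) * (prev_f2 - f2)  # new rectangle of dominated area
            prev_f2 = f2
    return area

# Two points and the reference point (9, 10): the dominated region is L-shaped,
# with area 7 * 4 + 4 * 3 = 40.
print(hv_2d([(5.0, 3.0), (2.0, 6.0)], np.array([9.0, 10.0])))  # -> 40.0
```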
Also, we include, in the folder `mathematica/`, several cases of the hypervolume indicator Hessian computed symbolically using Mathematica.
- [WED+22] Wang, H.; Emmerich, M. T. M.; Deutz, A.; Hernández, V. A. S.; Schütze, O.: The Hypervolume Newton Method for Constrained Multi-objective Optimization Problems. Preprints 2022, 2022110103.
- [DEW22] Deutz, A.; Emmerich, M. T. M.; Wang, H.: The Hypervolume Indicator Hessian Matrix: Analytical Expression, Computational Time Complexity, and Sparsity. arXiv, 2022.

