Optimization and profile calculation of ODE models using second order adjoint sensitivity analysis


1 Citation

Paul Stapor, Fabian Fröhlich, and Jan Hasenauer, Optimization and profile calculation of ODE models using second order adjoint sensitivity analysis, 2018, Bioinformatics, Volume 34, Issue 13, Pages i151–i159

2 Summary

UNDER REVISION!

This paper introduces second-order adjoint sensitivity analysis for parameter estimation in ordinary differential equation (ODE) models.

  • Computing the Hessian via forward sensitivity analysis scales linearly with the number of state variables and quadratically with the number of parameters -> suitable only for low-dimensional problems.
  • Second-order adjoint sensitivity analysis for the computation of Hessians and a hybrid optimization- and integration-based approach for profile likelihood computation are introduced in this paper (see the sketch after this list).
  • Second-order adjoint sensitivity analysis scales linearly with the number of parameters and state variables -> well suited for large-scale ODE models.
  • It is shown that the hybrid profile computation method is more than 2-fold faster than the best competitor.
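To make the scaling claims concrete, here is a minimal sketch of the adjoint construction for a generic ODE model; the notation is ours, and the continuous-data objective is a simplification of the discrete-measurement case treated in the paper. For $\dot{x} = f(x, \theta)$, $x(0) = x_0(\theta)$, with objective $J(\theta) = \int_0^T g(x, \theta)\, dt$, the adjoint state $p(t)$ solves the backward ODE

  \dot{p} = -\left(\frac{\partial f}{\partial x}\right)^{\!\top} p - \left(\frac{\partial g}{\partial x}\right)^{\!\top}, \qquad p(T) = 0,

and a single backward solve yields the full gradient

  \nabla_\theta J = \int_0^T \left(\frac{\partial g}{\partial \theta} + p^\top \frac{\partial f}{\partial \theta}\right)^{\!\top} dt + \left(\frac{\partial x_0}{\partial \theta}\right)^{\!\top} p(0).

Differentiating the state, adjoint, and gradient equations along a direction $v \in \mathbb{R}^{n_\theta}$ gives a second-order adjoint system whose solution is the Hessian-vector product $\nabla_\theta^2 J \, v$; assembling the full Hessian from $n_\theta$ such products costs $n_\theta$ solves of systems of size $O(n_x)$, i.e. it scales linearly in both $n_\theta$ and $n_x$.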

3 Introduction

Many parameter estimation and uncertainty analysis methods, such as profile likelihood calculation, bootstrapping, and MCMC sampling, use the Hessian of the objective function: to determine search directions in optimization, to update the vector field in integration-based profile calculation, or to construct tailored proposal distributions for MCMC sampling.

In high-dimensional ODE models, evaluating gradients and Hessians using finite differences or forward sensitivity analysis is computationally demanding.
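As a rough cost comparison, here is our summary of the standard complexity arguments, counting ODE solves for a system with $n_x$ states and $n_\theta$ parameters (not a table from the paper):

  • Finite differences: roughly $n_\theta + 1$ forward solves for the gradient and $O(n_\theta^2)$ for the Hessian, with accuracy limited by the step size.
  • Forward sensitivity analysis: one solve of an augmented system with $n_x(1 + n_\theta)$ states for the gradient, and $n_x(1 + n_\theta + n_\theta(n_\theta + 1)/2)$ states for the Hessian.
  • First-order adjoint sensitivity analysis: one forward and one backward solve for the gradient, independent of $n_\theta$.
  • Second-order adjoint sensitivity analysis: $n_\theta$ forward/backward solve pairs for the full Hessian, i.e. linear rather than quadratic in $n_\theta$.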

This paper provides the formulation of second-order adjoint sensitivity analysis to speed up the calculation of the gradient and Hessian of the objective function. A hybrid approach for the calculation of profile likelihoods is also introduced, which combines the ideas of the two currently existing approaches and exploits the Hessian. Detailed comparisons of optimization and profile likelihood calculation are provided for the proposed approaches and state-of-the-art methods, based on published models of biological processes.

4 Material and methods

UNDER REVISION!

In this study, we solve the optimization problems using multi-start local optimization, an approach which has been shown to perform well in systems and computational biology (Raue et al., 2013b). Initial points for local optimizations are drawn randomly from the parameter domain X, to which optimization is restricted (box-constraints), and the results of these optimizations are sorted by their final objective function value.
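A minimal sketch of this multi-start scheme in Python with scipy (standing in for the MATLAB optimizers used in the paper; the quadratic neg_log_likelihood is a placeholder for the actual ODE-based objective):

  import numpy as np
  from scipy.optimize import minimize

  def neg_log_likelihood(theta):
      # Placeholder objective; in the paper this is the negative
      # log-likelihood of an ODE model, evaluated by numerical integration.
      return np.sum((theta - 1.0) ** 2)

  lower, upper = np.full(5, -3.0), np.full(5, 3.0)  # box constraints defining X
  rng = np.random.default_rng(0)

  results = []
  for _ in range(20):                               # number of starts
      theta0 = rng.uniform(lower, upper)            # random initial point in X
      res = minimize(neg_log_likelihood, theta0, method="L-BFGS-B",
                     bounds=list(zip(lower, upper)))
      results.append(res)

  results.sort(key=lambda r: r.fun)                 # sort by final objective value
  best = results[0]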

Local optimization is carried out using either least-squares algorithms, such as Gauss–Newton-type methods combined with trust-region (TR) algorithms, or constrained optimization algorithms that work directly with the objective function.

Convergence of these methods can usually be improved if the computed derivatives are accurate (Raue et al., 2013b). Common least-squares algorithms such as the MATLAB function lsqnonlin only use first-order derivatives of the residuals, whereas constrained optimization algorithms like the MATLAB function fmincon exploit first- and second-order derivatives of the objective function.
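To illustrate the distinction, here is a hedged Python analogue using scipy in place of lsqnonlin/fmincon, with a toy linear residual model standing in for the ODE fit: least_squares consumes only the residual Jacobian, while minimize with a trust-region method can exploit an explicit gradient and Hessian, such as one obtained via second-order adjoints.

  import numpy as np
  from scipy.optimize import least_squares, minimize

  # Toy linear residuals r(theta) = A @ theta - y standing in for the
  # model-data mismatch of an ODE model.
  A = np.array([[2.0, 0.0], [1.0, 3.0], [0.0, 1.0]])
  y = np.array([1.0, 2.0, 0.5])

  def residuals(theta):
      return A @ theta - y

  # Gauss-Newton-type trust-region solver: first-order residual
  # derivatives only (analogue of lsqnonlin).
  res_ls = least_squares(residuals, np.zeros(2), jac=lambda theta: A)

  # Objective 0.5 * ||r||^2 with explicit gradient and Hessian
  # (analogue of fmincon exploiting second-order information).
  objective = lambda theta: 0.5 * residuals(theta) @ residuals(theta)
  gradient  = lambda theta: A.T @ residuals(theta)
  hessian   = lambda theta: A.T @ A

  res_tc = minimize(objective, np.zeros(2), method="trust-constr",
                    jac=gradient, hess=hessian)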

In general, there are two approaches for calculating profile likelihoods:


1. The optimization-based approach: one parameter is fixed and the remaining parameters are optimized. In each step, the fixed parameter is changed by a small amount and the remaining parameters are re-optimized; repeating this procedure yields the full profile. Since usually all profiles have to be computed, this approach is computationally demanding (see the sketch after this list).

2. The integration-based approach: the profile path is computed by integrating a system of differential equations whose vector field is derived from the optimality conditions and involves the Hessian of the objective function, avoiding repeated local optimizations.
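A minimal sketch of the optimization-based scheme in Python (our illustration, reusing the placeholder objective from above; idx is the index of the profiled parameter and only one scan direction is shown):

  import numpy as np
  from scipy.optimize import minimize

  def neg_log_likelihood(theta):
      return np.sum((theta - 1.0) ** 2)        # placeholder objective

  def optimization_based_profile(theta_hat, idx, step=0.1, n_steps=10):
      theta = np.asarray(theta_hat, dtype=float).copy()
      free = [i for i in range(theta.size) if i != idx]
      points = []
      for k in range(1, n_steps + 1):
          fixed = theta_hat[idx] + k * step    # shift the fixed parameter

          def restricted(z):                   # objective over free parameters
              full = theta.copy()
              full[idx] = fixed
              full[free] = z
              return neg_log_likelihood(full)

          res = minimize(restricted, theta[free], method="BFGS")
          theta[free], theta[idx] = res.x, fixed
          points.append((fixed, res.fun))      # (parameter value, profile value)
      return points

  profile = optimization_based_profile(np.ones(3), idx=0)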