Optimization and profile calculation of ODE models using second order adjoint sensitivity analysis


1 Citation

Paul Stapor, Fabian Fröhlich, and Jan Hasenauer, Optimization and profile calculation of ODE models using second order adjoint sensitivity analysis, 2018, Bioinformatics, Volume 34, Issue 13, Pages i151–i159

2 Summary

This paper introduces second-order adjoint sensitivity analysis for parameter estimation in ordinary differential equation (ODE) models.

  • With forward sensitivity analysis, the computational complexity of Hessian calculation scales linearly with the number of state variables but quadratically with the number of parameters -> only suitable for low-dimensional problems.
  • The paper introduces second-order adjoint sensitivity analysis for the computation of Hessians, and a hybrid optimization- and integration-based approach for profile likelihood calculation.
  • Second-order adjoint sensitivity analysis scales linearly with both the number of parameters and the number of state variables -> suitable for large-scale ODE models.
  • The hybrid profile calculation method is shown to be more than 2-fold faster than the best competing approach.
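The scaling claims above can be made concrete with a back-of-envelope count of ODE solves per Hessian evaluation. The counts below are illustrative assumptions about the two strategies, not the paper's exact cost model:

```python
def hessian_solves_forward(n_theta):
    # Second-order forward sensitivities roughly augment one forward solve with
    # n_theta first-order and n_theta * (n_theta + 1) / 2 second-order
    # sensitivity trajectories -> quadratic growth in the parameter count.
    return 1 + n_theta + n_theta * (n_theta + 1) // 2

def hessian_solves_adjoint(n_theta):
    # Second-order adjoints roughly need one forward solve plus on the order of
    # n_theta backward solves -> linear growth in the parameter count.
    return 1 + 1 + n_theta

# For a model with 100 parameters, the gap is already large:
print(hessian_solves_forward(100), hessian_solves_adjoint(100))
```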

3 Introduction

Many parameter estimation and uncertainty analysis methods, such as profile likelihood calculation, bootstrapping, and sampling, use the Hessian of the objective function: to determine search directions in optimization, to update the vector field in integration-based profile calculation, or to construct tailored proposal distributions for MCMC sampling.
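As a minimal illustration of how the Hessian determines a search direction (the objective and values here are hypothetical, not from the paper), a Newton-type optimizer solves H p = -g for the step p:

```python
import numpy as np

# Quadratic test objective f(x) = 0.5 * x^T A x - b^T x; its Hessian is A.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    # Gradient of the quadratic objective.
    return A @ x - b

def hess(x):
    # Hessian is constant for a quadratic objective.
    return A

x = np.zeros(2)
p = np.linalg.solve(hess(x), -grad(x))  # Newton search direction: H p = -g
x = x + p                               # one Newton step minimizes a quadratic exactly
```

For a quadratic objective, a single Newton step lands on the minimizer, which is why Hessian information accelerates local convergence.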

For high-dimensional ODE models, evaluating gradients and Hessians with finite differences or forward sensitivity analysis is computationally demanding.

This paper provides the formulation of second-order adjoint sensitivity analysis to speed up the calculation of the gradient and Hessian of the objective function.
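The core idea of adjoint sensitivity analysis can be sketched for the first-order (gradient) case on a toy scalar model. The model, values, and function names below are hypothetical illustrations, not the paper's implementation; a central finite difference serves as a correctness check:

```python
from scipy.integrate import solve_ivp

# Toy scalar model: y' = -theta * y, y(0) = y0, with objective
# J(theta) = 0.5 * (y(T) - d)^2 for a single data point d at time T.
theta, y0, T, d = 0.8, 2.0, 1.0, 0.5

def solve_forward(th):
    # Forward ODE solve with dense output so the adjoint pass can query y(t).
    return solve_ivp(lambda t, y: -th * y, (0.0, T), [y0],
                     dense_output=True, rtol=1e-10, atol=1e-12)

def adjoint_gradient(th):
    fwd = solve_forward(th)
    yT = fwd.y[0, -1]
    lam_T = yT - d  # adjoint terminal condition at t = T: dJ/dy(T)
    # Backward adjoint system: lambda' = -(df/dy) * lambda = th * lambda,
    # with a quadrature g' = lambda * df/dtheta = lambda * (-y(t)).
    def back_rhs(t, z):
        lam, _ = z
        return [th * lam, lam * (-fwd.sol(t)[0])]
    back = solve_ivp(back_rhs, (T, 0.0), [lam_T, 0.0],
                     rtol=1e-10, atol=1e-12)
    # Integrating from T down to 0 accumulates the negative of the integral.
    return -back.y[1, -1]

def fd_gradient(th, h=1e-6):
    # Central finite differences: two extra forward solves per parameter.
    def J(t_):
        yT = solve_forward(t_).y[0, -1]
        return 0.5 * (yT - d) ** 2
    return (J(th + h) - J(th - h)) / (2.0 * h)
```

The key property is that one backward solve yields the full gradient regardless of the number of parameters; the second-order extension of the paper applies the same principle to Hessian(-vector) computations.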