Evaluation of Derivative-Free Optimizers for Parameter Estimation in Systems Biology

 
Latest revision as of 15:01, 25 February 2020

=== Citation ===

Evaluation of Derivative-Free Optimizers for Parameter Estimation in Systems Biology, Y Schälte, P Stapor, J Hasenauer, IFAC-PapersOnLine 51 (19), 98-101.

=== Summary ===

Different parameter estimation settings necessitate different optimization techniques. The paper therefore compares several local and global optimization methods on classic optimization test problems as well as on 8 ODE models. The focus lies on the performance of ''derivative-free optimization'' (DFO) methods and on whether they are a valuable alternative to gradient-based methods.
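To make the distinction concrete, the sketch below runs a derivative-free method (Nelder-Mead, which uses only function values) and a gradient-based method (BFGS) on the Rosenbrock function. This is purely illustrative: the Rosenbrock function and scipy are stand-ins chosen here, not the paper's benchmark setup.

```python
import numpy as np
from scipy.optimize import minimize, rosen

# Rosenbrock function as a stand-in test problem; its minimum is at [1, 1].
x0 = np.array([-1.2, 1.0])

# Derivative-free: Nelder-Mead simplex, uses only function evaluations.
res_dfo = minimize(rosen, x0, method="Nelder-Mead",
                   options={"xatol": 1e-10, "fatol": 1e-10, "maxiter": 5000})

# Gradient-based: BFGS; scipy approximates the gradient internally here.
res_grad = minimize(rosen, x0, method="BFGS")

print(res_dfo.x, res_dfo.nfev)   # both should land near [1, 1]
print(res_grad.x, res_grad.nfev)
```

Comparing `nfev` between the two runs already hints at the trade-off studied in the paper: DFO methods avoid gradient computations but typically need more function evaluations.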

=== Study outcomes ===

This section focuses exclusively on the outcomes obtained for the ODE models.

==== ODE models are complicated ====

The results on the classic test problems are not representative of the behavior of the optimization routines on ODE models.

==== Performance of DFOs ====

Gradient-based methods outperform DFO methods on the realistic ODE models in terms of converged runs. However, in cases where gradient-based methods failed, the particle swarm method ''PSWARM'' and the evolutionary method ''CMAES'' were reasonable alternatives; see Fig. 2 in the original publication.
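For readers unfamiliar with particle swarm methods, the following is a minimal global-best particle swarm in numpy. It is a generic textbook sketch, not the PSWARM implementation benchmarked in the paper; the helper name `particle_swarm` and the coefficient values are choices made here for illustration.

```python
import numpy as np

def particle_swarm(f, lb, ub, n_particles=30, n_iter=200, seed=0):
    """Minimal global-best particle swarm (illustration only)."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    x = rng.uniform(lb, ub, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                               # velocities
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        # pull each particle toward its own best and the swarm's best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lb, ub)
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Sphere function: global minimum 0 at the origin.
x_opt, f_opt = particle_swarm(lambda p: np.sum(p**2),
                              lb=np.array([-5.0, -5.0]),
                              ub=np.array([5.0, 5.0]))
```

Note that the method needs only function values and box bounds, which is exactly why such methods remain applicable when gradients are unavailable or unreliable.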

==== Finite differences vs. sensitivity analysis ====

In the gradient-based approach, gradients computed by finite differences led to inferior convergence performance compared to gradients from forward/adjoint sensitivity analysis.
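The difference between the two gradient computations can be sketched on a toy model (chosen here for illustration, not taken from the paper): for dx/dt = -θx with x(0) = 1, the forward sensitivity s = ∂x/∂θ obeys the augmented ODE ds/dt = -θs - x with s(0) = 0, so one integration yields both state and gradient, while finite differences need extra perturbed simulations whose accuracy is limited by the step size and solver tolerances.

```python
import numpy as np
from scipy.integrate import solve_ivp

theta = 0.5

def rhs(t, y, th):
    # state x and its forward sensitivity s = dx/dtheta, integrated jointly
    x, s = y
    return [-th * x, -th * s - x]

sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], args=(theta,),
                rtol=1e-10, atol=1e-12, t_eval=[5.0])
s_forward = sol.y[1, -1]

# Central finite differences: two extra simulations of the plain model.
h = 1e-5
def x_end(th):
    return solve_ivp(lambda t, y: -th * y, (0.0, 5.0), [1.0],
                     rtol=1e-10, atol=1e-12, t_eval=[5.0]).y[0, -1]
s_fd = (x_end(theta + h) - x_end(theta - h)) / (2 * h)

s_exact = -5.0 * np.exp(-theta * 5.0)  # analytic dx/dtheta at t = 5
```

The forward sensitivity matches the analytic value essentially to solver tolerance, whereas the finite-difference estimate carries additional error from the perturbation step, a gap that grows in stiff, high-dimensional models.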

=== Study design and evidence level ===

* The optimizers were tested on 466 classic test problems and 8 ODE models.
* The study advised special care in order to compare local and global optimization techniques adequately.
* The study protocol was varied (relative to the classic test problems) to account for the different needs of comparing optimizers in the ODE model setting.
* The biological models were tested using the parameter estimation toolbox '''PESTO'''.
* The gradient-based optimization was performed with the MATLAB function ''fmincon'', with gradients computed via finite differences and via forward/adjoint sensitivity analysis.
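The "converged runs" metric used above presupposes a multi-start strategy: a local optimizer is launched from many random start points, and a run counts as converged if it reaches the best objective value found, up to a tolerance. A minimal sketch of that bookkeeping, with scipy's `minimize` standing in for MATLAB's ''fmincon'' (the start-point count and tolerance are choices made here, not the paper's settings):

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

rng = np.random.default_rng(1)
lb, ub = np.array([-2.0, -2.0]), np.array([2.0, 2.0])
starts = rng.uniform(lb, ub, size=(20, 2))  # 20 random start points

# One gradient-based local optimization per start point.
results = [minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B",
                    bounds=list(zip(lb, ub))) for x0 in starts]

f_best = min(r.fun for r in results)
# A run "converges" if it reaches the best found value up to a tolerance.
n_converged = sum(r.fun < f_best + 1e-6 for r in results)
```

Counting converged runs, rather than only the single best value, is what makes the comparison between optimizer classes in the study robust to individual lucky runs.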