Evaluation of Derivative-Free Optimizers for Parameter Estimation in Systems Biology

1 Citation

Y. Schälte, P. Stapor, J. Hasenauer: Evaluation of Derivative-Free Optimizers for Parameter Estimation in Systems Biology. IFAC-PapersOnLine 51(19), 98–101, 2018.

2 Summary

Different parameter estimation settings necessitate different optimization techniques. In this paper, several local and global optimization techniques were therefore compared on classic optimization test problems as well as on 8 ODE models. The article focuses in particular on the performance of derivative-free optimizers (DFO) and on whether they are a valuable alternative to gradient-based methods.
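To make the distinction concrete, the following Python sketch (not taken from the paper) contrasts a gradient-based local optimizer with a derivative-free simplex search on the Rosenbrock function, a classic test function of the kind used in such benchmark collections; the scipy library, the starting point and all settings are assumptions for illustration only.

 # Illustrative comparison (not from the paper): gradient-based vs.
 # derivative-free local optimization on the Rosenbrock test function.
 import numpy as np
 from scipy.optimize import minimize, rosen, rosen_der
 
 x0 = np.array([-1.2, 1.0, -1.2, 1.0, -1.2])    # arbitrary starting point
 
 # Gradient-based local optimization with an analytic gradient.
 res_grad = minimize(rosen, x0, jac=rosen_der, method="BFGS")
 
 # Derivative-free local optimization (Nelder-Mead simplex, no gradients).
 res_dfo = minimize(rosen, x0, method="Nelder-Mead",
                    options={"maxfev": 20000, "xatol": 1e-9, "fatol": 1e-9})
 
 print("gradient-based :", res_grad.fun, "evaluations:", res_grad.nfev)
 print("derivative-free:", res_dfo.fun, "evaluations:", res_dfo.nfev)

On a smooth, low-dimensional problem like this, both runs typically reach the optimum, but the derivative-free run usually needs far more function evaluations, which hints at why the choice of optimizer class matters for expensive objectives.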

3 Study outcomes

This section focuses solely on the outcomes obtained for the ODE models.

3.1 ODE models are complicated

The results obtained on the classic test problems are not representative of the behavior of the optimization routines on the ODE models.
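To illustrate the structural difference, the following hypothetical Python sketch shows what an ODE-based estimation objective looks like: every evaluation requires a numerical ODE solve, which introduces solver tolerances and possible integration failures that closed-form classic test functions do not have. The model, data and all names are illustrative assumptions, not taken from the paper.

 # Hypothetical ODE-based least-squares objective: each evaluation
 # runs a numerical ODE solver, unlike a closed-form test function.
 import numpy as np
 from scipy.integrate import solve_ivp
 
 t_obs = np.linspace(0.0, 10.0, 21)                 # observation times
 y_obs = 1.5 * np.exp(-0.5 * t_obs)                 # synthetic data (A=1.5, k=0.5)
 
 def objective(theta):
     A, k = theta                                   # initial amount and decay rate
     sol = solve_ivp(lambda t, x: -k * x, (0.0, 10.0), [A],
                     t_eval=t_obs, rtol=1e-8, atol=1e-10)
     if not sol.success:                            # poor parameter guesses can make
         return np.inf                              # the integration fail altogether
     residuals = sol.y[0] - y_obs
     return 0.5 * np.sum(residuals ** 2)            # least-squares objective
 
 print(objective(np.array([2.0, 2.0])))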

3.2 Performance of DFOs

Gradient-based methods outperformed the DFOs on the realistic ODE models in terms of the number of converged runs. However, in cases where the gradient-based methods failed, the particle swarm method PSWARM and the evolutionary method CMA-ES were reasonable alternatives (see Fig. 2 in the original publication).
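As a sketch of how such an alternative could be run, the following code applies CMA-ES to the toy ODE objective from the previous sketch, here via the Python package pycma; the package choice, bounds and settings are assumptions for illustration and do not reproduce the MATLAB-based setup used in the paper.

 # Illustrative CMA-ES run (pycma package assumed) on the toy ODE fit.
 import numpy as np
 import cma
 from scipy.integrate import solve_ivp
 
 t_obs = np.linspace(0.0, 10.0, 21)
 y_obs = 1.5 * np.exp(-0.5 * t_obs)                 # synthetic data (A=1.5, k=0.5)
 
 def objective(theta):
     A, k = theta
     sol = solve_ivp(lambda t, x: -k * x, (0.0, 10.0), [A],
                     t_eval=t_obs, rtol=1e-8, atol=1e-10)
     return 0.5 * np.sum((sol.y[0] - y_obs) ** 2)
 
 # CMA-ES: starting point, initial step size, box constraints on (A, k).
 es = cma.CMAEvolutionStrategy([2.0, 2.0], 0.5,
                               {"bounds": [[0.01, 0.01], [10.0, 10.0]],
                                "maxfevals": 2000, "verbose": -9})
 es.optimize(objective)
 print(es.result.xbest, es.result.fbest)            # expected: roughly (1.5, 0.5)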

4 Study design and evidence level

  • The optimizers were tested on 466 classic test problems and 8 ODE models.
  • The study advised taking special care to ensure an adequate comparison of local and global optimization techniques.
  • The study protocol was varied, relative to the classic test problems, to account for the different requirements of comparing optimizers in the ODE model setting.
  • The biological models were tested using the parameter estimation toolbox PESTO.
  • The gradient-based optimization was performed with the function fmincon, with sensitivities calculated via finite differences and via forward/adjoint sensitivity analysis; a minimal forward-sensitivity sketch follows this list.
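The following Python sketch illustrates the forward-sensitivity idea on a hypothetical one-state model and does not reproduce the PESTO/fmincon setup of the paper: the ODE is augmented with the sensitivity equations dS/dt = (df/dx)·S + df/dtheta, and the objective gradient is assembled from the residuals and the sensitivities at the observation times.

 # Hypothetical forward-sensitivity gradient for a one-state model
 # x' = -k * x, x(0) = A, parameters theta = (A, k); not the paper's setup.
 import numpy as np
 from scipy.integrate import solve_ivp
 from scipy.optimize import minimize
 
 t_obs = np.linspace(0.0, 10.0, 21)
 y_obs = 1.5 * np.exp(-0.5 * t_obs)                 # synthetic data (A=1.5, k=0.5)
 
 def augmented_rhs(t, z, A, k):
     x, s_A, s_k = z                                # state and sensitivities dx/dA, dx/dk
     return [-k * x,                                # dx/dt    = f(x, theta)
             -k * s_A,                              # ds_A/dt  = (df/dx) s_A + df/dA
             -k * s_k - x]                          # ds_k/dt  = (df/dx) s_k + df/dk
 
 def objective_and_gradient(theta):
     A, k = theta
     z0 = [A, 1.0, 0.0]                             # x(0)=A, dx(0)/dA=1, dx(0)/dk=0
     sol = solve_ivp(augmented_rhs, (0.0, 10.0), z0, t_eval=t_obs,
                     args=(A, k), rtol=1e-10, atol=1e-12)
     x, s_A, s_k = sol.y
     r = x - y_obs                                  # residuals at observation times
     return 0.5 * np.sum(r ** 2), np.array([np.sum(r * s_A), np.sum(r * s_k)])
 
 res = minimize(objective_and_gradient, x0=np.array([2.0, 2.0]), jac=True,
                method="L-BFGS-B", bounds=[(0.01, 10.0), (0.01, 10.0)])
 print(res.x)                                       # expected: roughly (1.5, 0.5)

Adjoint sensitivity analysis computes the same gradient by solving a backward (adjoint) ODE instead, which scales better when the number of parameters is large.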

5 Further comments and aspects
