Performance of objective functions and optimization procedures for parameter estimation in system biology models

=== Citation ===
Andrea Degasperi, Dirk Fey & Boris N. Kholodenko: "Performance of objective functions and optimisation procedures for parameter estimation in system biology models", Systems Biology and Applications, volume 3, article number 20 (2017).

=== Summary ===
In systems biology, relative data are a common occurrence. In ODE-based models, this is addressed either by introducing scaling parameters or by data-driven normalization, both of which bring data and simulations onto the same scale. The article shows that data-driven normalization improves optimization performance and does not aggravate non-identifiability problems compared to a scaling-factor approach. Furthermore, the article reports that hybrid optimization methods, which combine stochastic global and deterministic local search, outperform purely deterministic local gradient-based strategies.
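To make the two approaches concrete, the objective functions can be sketched as follows (our notation, not the authors'; for concreteness the normalization is taken to be division by the maximum over the measured time points):

<math>
\chi^2_{\mathrm{scal}}(\theta, s) = \sum_{k}\sum_{i}\left(\tilde{y}_{k,i} - s_k\, y_k(t_i,\theta)\right)^2,
\qquad
\chi^2_{\mathrm{norm}}(\theta) = \sum_{k}\sum_{i}\left(\frac{\tilde{y}_{k,i}}{\max_j \tilde{y}_{k,j}} - \frac{y_k(t_i,\theta)}{\max_j y_k(t_j,\theta)}\right)^2,
</math>

where <math>\tilde{y}_{k,i}</math> are the relative data for observable <math>k</math> and <math>y_k(t_i,\theta)</math> the corresponding model simulations. In the first formulation every observable adds a free scaling parameter <math>s_k</math> that is only constrained through the product <math>s_k\, y_k</math>, whereas the second formulation introduces no extra parameters.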
=== Study outcomes ===
 
==== Identifiability ====
Employing data-driven normalization instead of scaling factors improved the identifiability of dynamic parameters; the article provides a computational example demonstrating how this comes about.

==== Convergence Speed ====
As visualized in Fig. 4 and Fig. 5 of the original publication, convergence speed was consistently improved by data-driven normalization compared to scaling factors. Combining data-driven normalization with the hybrid optimization algorithm GLSDC gave the best performance, especially for the problems with many parameters.
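As an illustration of what "hybrid" means here, the following sketch combines a stochastic global sampling phase with a deterministic local refinement phase on a toy problem, using SciPy's least_squares for the local step. It only illustrates the general idea; it is not the authors' GLSDC algorithm, and the toy model, bounds and sample sizes are made up for the example.

<syntaxhighlight lang="python">
# Minimal, illustrative hybrid global + local search on a toy least-squares
# problem. NOT the GLSDC implementation from the paper; model, bounds and
# sample sizes are arbitrary choices for demonstration.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy model y(t, theta) = theta[0] * exp(-theta[1] * t) with synthetic noisy data
t = np.linspace(0.0, 5.0, 20)
theta_true = np.array([2.0, 0.8])
data = theta_true[0] * np.exp(-theta_true[1] * t) + 0.05 * rng.normal(size=t.size)

def residuals(theta):
    return theta[0] * np.exp(-theta[1] * t) - data

# Global (stochastic) phase: sample candidate parameter vectors in a box and
# keep the best ones -- a stand-in for the genetic/global part of a hybrid method.
lb, ub = np.array([0.1, 0.1]), np.array([10.0, 10.0])
candidates = rng.uniform(lb, ub, size=(50, 2))
costs = [0.5 * np.sum(residuals(c) ** 2) for c in candidates]
starts = candidates[np.argsort(costs)[:5]]

# Local (deterministic) phase: gradient-based least-squares refinement of the
# best candidates.
fits = [least_squares(residuals, x0, bounds=(lb, ub)) for x0 in starts]
best = min(fits, key=lambda fit: fit.cost)
print("estimated parameters:", best.x)
</syntaxhighlight>

A purely local, gradient-based strategy such as LevMar corresponds to running only the refinement step from a single starting point.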
=== Study design and evidence level ===
 
==== General aspects ====
* The provided claims are tested on 3 parameter estimation problems with a varying number of parameters.
* The 3 main algorithms tested were GLSDC, LevMar SE and LevMar FD, each combined with both scaling factors and data-driven normalization. Each configuration was tested in 96 runs.
* Although the previously best-performing method, LSQNONLIN with sensitivity equations as found in [[Lessons Learned from Quantitative Dynamical Modeling in Systems Biology]], is mentioned, the comparison with GLSDC was restricted to the authors' own implementation of that algorithm.
* The study used least squares instead of a likelihood as objective function, omitting error model fits (the two objective functions are written out in the sketch after this list).
 
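For reference, the distinction in the last bullet can be written out as follows (our notation, not taken from the paper): plain least squares versus a Gaussian negative log-likelihood in which the measurement uncertainties <math>\sigma_i</math> of an error model are estimated together with the dynamic parameters.

<math>
\chi^2_{\mathrm{LS}}(\theta) = \sum_i \left(\tilde{y}_i - y_i(\theta)\right)^2,
\qquad
-2\log L(\theta,\sigma) = \sum_i \left[\frac{\left(\tilde{y}_i - y_i(\theta)\right)^2}{\sigma_i^2} + \log\left(2\pi\sigma_i^2\right)\right].
</math>

Least squares corresponds to the special case of a known, constant <math>\sigma</math>; omitting error model fits means the <math>\sigma_i</math> are not estimated from the data.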
==== Design for Outcome O1 ====
 
* The outcome was generated for ...
 
* Configuration parameters were chosen ...
 
* ...
 
==== Design for Outcome O2 ====
 
* The outcome was generated for ...
 
* Configuration parameters were chosen ...
 
* ...
 
 
...
 
 
==== Design for Outcome O ====
 
* The outcome was generated for ...
 
* Configuration parameters were chosen ...
 
* ...
 
  
 
=== Further comments and aspects ===
 
* In addition to the performance advantages of not using scaling factors, it is also stated that the amount of overfitting is reduced.
* The notion of practical identifiability deviates from other literature, see for example [https://doi.org/10.1093/bioinformatics/btp358 Structural and practical identifiability analysis of partially observed dynamical models by exploiting the profile likelihood].
* The objective function values in Fig. 4 and Fig. 5 are not entirely obvious to interpret, since stochastic algorithms and multi-start algorithms are analyzed.

=== References ===
The list of cited or related literature is placed here.
 
