Hierarchical optimization for the efficient parametrization of ODE models

Latest revision as of 14:11, 26 February 2020

1 Citation

C Loos, S Krause, J Hasenauer (2018) Hierarchical optimization for the efficient parametrization of ODE models. Bioinformatics, Volume 34, Issue 24, Pages 4266–4273. https://doi.org/10.1093/bioinformatics/bty514

2 Summary

In ODE-based modeling in systems biology, often only relative data are available, and their measurement errors are typically unknown. A common approach to this setting is to introduce scaling and noise parameters (see Lessons Learned from Quantitative Dynamical Modeling in Systems Biology). Since additional parameters can degrade the performance of parameter optimization, this paper introduced a hierarchical approach that separates the fitting of these nuisance parameters from the fitting of the dynamic parameters in every optimization step. The hierarchical approach was compared to the standard approach of fitting all parameters simultaneously, in terms of optimizer convergence and computational efficiency, for three systems biology models.
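The hierarchical idea described above can be sketched as follows. This is an illustrative Python sketch, not the paper's code (PESTO itself is a MATLAB toolbox): a toy exponential-decay model stands in for an ODE solution, and in each evaluation of the outer objective over the dynamic parameter, the scaling parameter is eliminated analytically via its closed-form least-squares optimum (the Gaussian-noise case).

```python
import numpy as np
from scipy.optimize import minimize

# Toy dynamics x(t; k) = exp(-k * t), standing in for an ODE solution.
t = np.linspace(0, 5, 20)
true_k, true_scale = 0.7, 3.0
rng = np.random.default_rng(0)
data = true_scale * np.exp(-true_k * t) + 0.05 * rng.standard_normal(t.size)

def simulate(k):
    """Model output for dynamic parameter k (would be an ODE solve)."""
    return np.exp(-k * t)

def hierarchical_objective(theta):
    """Outer problem over the dynamic parameter k only.

    The scaling parameter s is not a free optimization variable: for
    Gaussian noise its optimum given k is the closed-form least-squares
    solution s* = <y, h> / <h, h> (inner problem, solved analytically).
    """
    h = simulate(theta[0])
    s_opt = np.dot(data, h) / np.dot(h, h)  # analytical inner step
    residual = data - s_opt * h
    return np.sum(residual ** 2)

# The outer optimizer only ever sees the dynamic parameter.
result = minimize(hierarchical_objective, x0=[0.2], bounds=[(1e-3, 10.0)])
```

In the standard approach, `k` and `s` would both be free variables of the outer optimizer; here the search space is reduced to the dynamic parameter alone, which is the source of the convergence and runtime gains reported below.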

3 Study outcomes

3.1 Best Fit

Standard and hierarchical multi-start optimization found the same globally optimal objective function value for both proposed noise models (Gaussian and Laplace), except for one model with Laplace noise, for which the two approaches did not yield the same model trajectories.

3.2 Convergence of Optimizer

Hierarchical optimization improved the fraction of converged local optimization runs from 18.4% to 29.3%, as presented in Fig. 4C of the original publication.

3.3 Computational Efficiency

For a fixed computational budget, the reduction in computation time per converged fit led to 5.06 times more optimization runs reaching the best objective function value, as visualized in Fig. 4D-E.

4 Study design and evidence level

  • The outcomes of the study were generated with the MATLAB toolbox PESTO. In this framework, multi-start local optimization using the function fmincon.m was employed.
  • For error and scaling parameters, the optimal estimates in each step were derived analytically for Gaussian and Laplace noise. These analytical results were used where possible in this study.
  • The study used data from three previously calibrated models. Application in more realistic modeling settings was not performed.
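The analytical estimates mentioned in the second point can be made concrete for the Gaussian case. The following sketch (an illustrative reconstruction under standard least-squares assumptions, not the paper's derivation) computes the closed-form optima for a scaling parameter s and noise variance σ² given a fixed model output h, and cross-checks the scaling optimum against a one-dimensional numerical search:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
h = np.exp(-0.5 * np.linspace(0, 4, 15))          # fixed simulated model output
y = 2.5 * h + 0.1 * rng.standard_normal(h.size)   # relative data

# Closed-form optima of the Gaussian negative log-likelihood:
#   s*  = <y, h> / <h, h>
#   σ²* = mean of squared residuals at s*
s_star = np.dot(y, h) / np.dot(h, h)
sigma2_star = np.mean((y - s_star * h) ** 2)

# Numerical cross-check of the scaling optimum via a bounded 1-D search.
num = minimize_scalar(lambda s: np.sum((y - s * h) ** 2),
                      bounds=(0.0, 10.0), method="bounded")
```

Because these optima are available in closed form, the inner step of the hierarchical scheme costs essentially nothing compared to a numerical sub-optimization, which is why the analytical results were preferred wherever applicable.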

5 Further comments and aspects

  • As an alternative to introducing additional parameters to handle relative data, relative data changes could be evaluated directly: Performance of objective functions and optimization procedures for parameter estimation in system biology models