Hierarchical optimization for the efficient parametrization of ODE models

__NUMBEREDHEADINGS__
== General Information ==

=== Citation ===

C Loos, S Krause, J Hasenauer (2018) [https://doi.org/10.1093/bioinformatics/bty514 Hierarchical optimization for the efficient parametrization of ODE models]. Bioinformatics, Volume 34, Issue 24, Pages 4266–4273.
=== Summary ===

In ODE-based modeling in systems biology, often only relative data are available, and the measurement errors are typically unknown. A common approach to deal with this setting is the introduction of scaling and noise parameters, see [[Lessons Learned from Quantitative Dynamical Modeling in Systems Biology]]. Since introducing additional parameters can decrease the performance of parameter optimization, this paper introduced a hierarchical approach that separates the fitting of the ''nuisance'' parameters from the ''dynamic'' parameters in every step of the optimization. This approach was compared to the standard approach of fitting all parameters simultaneously in terms of optimizer convergence and computational efficiency for three systems biology models.
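The core idea can be sketched as follows: for every trial of the dynamic parameters, the inner problem over the nuisance parameters is solved in closed form, so the outer optimizer never searches over them. The snippet below is an illustrative sketch for a toy exponential-decay model with a single scaling and Gaussian noise; the model, data, and all names are hypothetical and not the paper's code.

```python
import numpy as np

def hierarchical_nll(theta, t, y):
    """Negative log-likelihood over the dynamic parameters only.

    For each trial of theta, the scaling s and noise variance sigma^2
    (the nuisance parameters) are set to their analytical optima, so
    the outer optimizer never has to search over them.
    Assumes y_i = s * h_i(theta) + Gaussian noise.
    """
    h = np.exp(-theta[0] * t)          # toy model output; a real model
                                       # would solve an ODE system here
    s = np.sum(y * h) / np.sum(h * h)  # optimal scaling (least squares)
    res = y - s * h
    sigma2 = np.mean(res ** 2)         # optimal noise variance (MLE)
    n = y.size
    # Gaussian NLL with sigma2 plugged in: n/2 * (log(2*pi*sigma2) + 1)
    return 0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

# relative data simulated with decay rate 0.5 and scaling 3 (made up)
t = np.linspace(0, 5, 50)
rng = np.random.default_rng(1)
y = 3.0 * np.exp(-0.5 * t) + rng.normal(0.0, 0.05, t.size)

# the objective is lower near the true dynamic parameter
print(hierarchical_nll(np.array([0.5]), t, y))
print(hierarchical_nll(np.array([2.0]), t, y))
```

Because the scaling and the noise variance drop out analytically, the outer optimizer works in a lower-dimensional space, which is the source of the performance gains reported below.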
 
=== Study outcomes ===

==== Best Fit ====

Standard and hierarchical multi-start optimization both find the same globally optimal objective function value for both proposed error models (Gaussian and Laplace noise), except for one model with Laplace noise, for which the two approaches did not produce the same model trajectories.
  
==== Convergence of Optimizer ====

Hierarchical optimization improved the fraction of converged local optimization runs from 18.4% to 29.3%, as presented in Fig. 4C of the original publication.
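In such benchmarks, a local run is typically counted as converged when its final objective value lies within a small tolerance of the best value found across all starts; the exact criterion used in the paper may differ. A minimal sketch of this bookkeeping, with made-up final objective values:

```python
import numpy as np

def convergence_fraction(final_values, tol=1e-3):
    """Fraction of local runs whose final objective value lies within
    `tol` of the best value found across all starts."""
    final_values = np.asarray(final_values, dtype=float)
    best = final_values.min()
    return float(np.mean(final_values <= best + tol))

# made-up final objective values from 10 local starts
finals = [12.3, 10.0, 10.0003, 45.1, 10.0001, 23.9, 10.0, 31.5, 10.0002, 19.7]
print(convergence_fraction(finals))  # 0.5: five starts reached the best value
```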
 
==== Computational Efficiency ====

For a fixed computational budget, it was shown that the reduction in computation time per converged fit leads to 5.06 times more optimization runs reaching the best objective function value, visualized in Fig. 4D-E of the original publication.
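The accounting behind such a factor can be sketched as follows: with a fixed budget, the number of successful runs scales with the convergence rate divided by the time per run. In the sketch below only the convergence rates (18.4% vs. 29.3%) are taken from the paper; the budget and per-run times are purely hypothetical, so the resulting factor is illustrative rather than the published 5.06.

```python
def successful_runs(budget, time_per_run, conv_rate):
    """Expected number of converged runs within a fixed budget."""
    return (budget / time_per_run) * conv_rate

# convergence rates from the paper (18.4% vs. 29.3%); the budget and
# per-run times are made up, NOT the paper's measurements
standard     = successful_runs(1000.0, 10.0, 0.184)
hierarchical = successful_runs(1000.0,  3.2, 0.293)
print(hierarchical / standard)  # ~4.98 with these made-up timings
```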
  
 
=== Study design and evidence level ===

* In order to generate the outcomes of the study, the '''MATLAB''' toolbox '''PESTO''' was used. In this framework, multi-start local optimization using the function ''fmincon.m'' was employed.
 
* For error and scaling parameters, optimal estimates in each step were derived analytically for Gaussian and Laplace noise. These analytical results were used where possible in this study.
  
* The study used data of three models which were already calibrated. Application in more realistic modeling settings has not been performed.
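The analytical inner solutions mentioned above can be illustrated for the noise parameters with the model output held fixed: for Gaussian noise the maximum-likelihood variance is the mean squared residual, and for Laplace noise the maximum-likelihood scale is the mean absolute residual. The sketch below uses made-up numbers and is not the paper's code; the paper additionally derives the optimal scaling parameters.

```python
import numpy as np

def noise_estimates(y, model_output):
    """Closed-form maximum-likelihood noise parameters for a fixed
    model output: Gaussian variance and Laplace scale."""
    res = np.asarray(y, dtype=float) - np.asarray(model_output, dtype=float)
    sigma2_hat = np.mean(res ** 2)   # Gaussian: variance MLE
    b_hat = np.mean(np.abs(res))     # Laplace: scale MLE
    return sigma2_hat, b_hat

# made-up data and (fixed) model output
y_obs = np.array([1.2, 0.8, 1.1, 0.9])
h_fix = np.array([1.0, 1.0, 1.0, 1.0])
print(noise_estimates(y_obs, h_fix))  # approximately (0.025, 0.15)
```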
 
=== Further comments and aspects ===

=== References ===

* As an alternative to using additional parameters to deal with relative data, relative data changes could be evaluated directly: [[Performance of objective functions and optimization procedures for parameter estimation in system biology models]]
 

Latest revision as of 14:11, 26 February 2020
