# Difference between revisions of "Fast derivatives of likelihood functionals for ODE based models using adjoint-state method"


## Revision as of 13:27, 25 February 2020


### 1 Citation

Melicher, V., Haber, T. & Vanroose, W. [Fast derivatives of likelihood functionals for ODE based models using adjoint-state method](https://doi.org/10.1007/s00180-017-0765-8). Comput Stat 32, 1621–1643 (2017).

### 2 Summary

In this paper, the adjoint-state method (ASM) for computing the gradient and the Hessian of likelihood functionals for time-series data modelled by ordinary differential equations (ODEs) is derived and analyzed. Discrete data and the continuous model are interfaced at the level of the likelihood functional, using the concept of point-wise distributions.

This alternative approach is compared to the sensitivity-equations (SE) approach and to finite differences.
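To make the idea concrete, here is a minimal, hedged sketch (not the paper's implementation) of a continuous adjoint gradient for a diagonal linear ODE model with a least-squares (Gaussian) likelihood, checked against central finite differences. The model dx<sub>i</sub>/dt = p<sub>i</sub>x<sub>i</sub> is solved exactly, and the adjoint equation is integrated exactly backwards with residual jumps at the measurement times; the time window, noise level, and all names are illustrative assumptions.

```python
import numpy as np

def forward(p, ts):
    # Diagonal linear ODE dx_i/dt = p_i * x_i with x(0) = 1, solved
    # exactly: x_i(t) = exp(p_i * t).  Shape: (len(ts), len(p)).
    return np.exp(np.outer(ts, p))

def loss(p, ts, ys):
    # Least-squares misfit J(p) = 1/2 * sum_k ||x(t_k; p) - y_k||^2,
    # i.e. a Gaussian negative log-likelihood up to constants.
    return 0.5 * np.sum((forward(p, ts) - ys) ** 2)

def adjoint_gradient(p, ts, ys):
    # Continuous adjoint: lam' = -p * lam backwards in time, lam(T) = 0,
    # with a jump of the residual x(t_k) - y_k at each measurement time.
    # Then dJ/dp_i = integral of lam_i * x_i over [0, T]; for this model
    # lam_i * x_i is piecewise constant, so the integral is exact.
    xs = forward(p, ts)
    res = xs - ys
    lam = np.zeros_like(p)
    grad = np.zeros_like(p)
    for k in range(len(ts) - 1, -1, -1):
        lam = lam + res[k]                        # residual jump at t_k
        t_left = ts[k - 1] if k > 0 else 0.0
        grad += lam * xs[k] * (ts[k] - t_left)    # exact piecewise integral
        lam = lam * np.exp(p * (ts[k] - t_left))  # propagate back to t_left
    return grad

# Illustrative setup: random stable diagonal model with -0.1 > p_i > -1.1.
rng = np.random.default_rng(0)
d = 5
p = rng.uniform(-1.1, -0.1, d)
ts = np.linspace(1.0, 11.0, 11)        # 11 equidistant times (assumed window)
ys = forward(p, ts) + 0.01 * rng.standard_normal((len(ts), d))

g_adj = adjoint_gradient(p, ts, ys)

# Check against central finite differences.
eps = 1e-6
g_fd = np.array([
    (loss(p + eps * np.eye(d)[i], ts, ys)
     - loss(p - eps * np.eye(d)[i], ts, ys)) / (2 * eps)
    for i in range(d)
])
print(np.max(np.abs(g_adj - g_fd)))    # small finite-difference mismatch
```

Note that one backward adjoint solve yields the whole gradient, whereas finite differences (and forward sensitivities) require work growing with the number of parameters, which is the scaling behind Outcome O1 below.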

### 3 Study outcomes

The paper's results concerning method comparison and benchmarking:

#### 3.1 Outcome O1

When using ASM to compute the gradient of linear (and diagonal) ODE models, the speedup grows linearly with the number of states, which here equals the number of parameters. Hence, the higher the dimensionality of the problem, the more beneficial ASM is.

Outcome O1 is presented as Figure 1 in the original publication.

#### 3.2 Outcome O2

When using ASM to compute the gradient of linear (and diagonal) ODE models, the acceleration of ASM relative to the sensitivity equations (SE) declines exponentially with the number of data points. Hence, the more observations, the less beneficial ASM is. In the extreme case of many data points, ASM and SE show no significant difference.

Outcome O2 is presented as Figure 2 in the original publication.

#### 3.3 Outcome On

...

Outcome On is presented as Figure X in the original publication.

#### 3.4 Further outcomes

If intended, you can add further outcomes here.

### 4 Study design and evidence level

#### 4.1 General aspects

- They use the CVODES solver from the SUNDIALS suite.
- As a modelling toolbox, they use *DiffMEM*, which originated in mixed-effects modelling. It is a C library with R and Python interfaces.
- Accuracies (tolerances) of the ODE solver are reported and claimed to be tight enough to have no effect on the results.

#### 4.2 Design for Outcome O1

- ODE models with a linear and diagonal RHS (so the number of states equals the number of parameters) are randomly generated with -0.1 > p<sub>i</sub> > -1.1.
- The dimensionality of the problem is varied between 2 and 122, over 13 different dimensions.
- Synthetic data: 11 equidistant measurements between 0 and 100.
- 100 repetitions to estimate variance of the performance.
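The setup above could be generated roughly as follows; this is a hedged sketch (the exact exponential solution and the noise level are assumptions, not the paper's code):

```python
import numpy as np

def make_problem(dim, rng):
    # Random stable diagonal model with -0.1 > p_i > -1.1, observed at
    # 11 equidistant times between 0 and 100 (noise level is assumed).
    p = rng.uniform(-1.1, -0.1, dim)
    ts = np.linspace(0.0, 100.0, 11)
    xs = np.exp(np.outer(ts, p))        # exact solution with x(0) = 1
    ys = xs + 0.01 * rng.standard_normal(xs.shape)
    return p, ts, ys

rng = np.random.default_rng(0)
dims = np.linspace(2, 122, 13).astype(int)   # 13 dimensions from 2 to 122
problems = [make_problem(d, rng) for d in dims]
```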

#### 4.3 Design for Outcome O2

- ODE models with a linear and diagonal RHS (so the number of states equals the number of parameters) are randomly generated with -0.1 > p<sub>i</sub> > -1.1.
- The dimensionality of the problem is fixed at 50.
- Synthetic data: the number of equidistant measurement times between 0 and 100 is varied between 2 and 122 in 12 steps.

- 100 repetitions to estimate variance of the performance.

#### 4.4 Design for Outcome O

- The outcome was generated for ...
- Configuration parameters were chosen ...
- ...

### 5 Further comments and aspects

- Quite a lot of calculus is provided, even in the abstract.
- The problem formulation is rather abstract, i.e. it is not specifically tailored to systems-biology modelling problems.
- "For models with a high-number of parameters and a small number of measurement times, the ASM is a clear winner."
- First, the implementation of the SE approach is so efficient that it renders the finite-difference approximation practically obsolete, given SE's superior accuracy. Second, the efficiency of ASM depends on the number of measurement times, which is not the case for the SE approach.

### 6 References

The list of cited or related literature is placed here.