Dynamic Sequentially Thresholded Least Squares (DSTLS) | Poster

news
math
Author

Štěpán Zapadlo

Published

October 8, 2023

Introductory words

This article was presented as a poster (which you can find here) at Julia & Optimization Days 2023 in Paris; more information, along with the entire schedule, can be found here.

With the abundance of data available nowadays, it is only natural to want to extract some (potentially useful) information from it. Moreover, the observed systems can often be thought of as being governed by some unknown differential equation, which we may then strive to learn.

While a myriad of tools arose precisely for this job, one of the most popular is the Sparse Identification of Nonlinear Dynamics (SINDy) by Brunton and Kutz (2019), which solves an \(l\)-dimensional regularized linear regression described by the optimization problem \[ \min_{\vi \Xi} \frac 1 2 \norm{\dvi X - \vi \Theta(\vi X) \vi \Xi}^2_2 \] subject to some sparsity-inducing constraint. Although it can be solved using various optimization techniques, we propose Dynamic Sequentially Thresholded Least Squares (DSTLS), a modification of Sequentially Thresholded Least Squares (STLS).
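For reference, the plain STLS iteration can be sketched as follows (a minimal NumPy illustration with names chosen here for exposition; it is not the implementation behind the poster):

```python
import numpy as np

def stls(Theta, dX, tau, n_iter=10):
    """Sequentially Thresholded Least Squares with a fixed threshold tau.

    Theta : (m, p) library of candidate functions evaluated on the data
    dX    : (m,)   estimated derivative of one state variable
    """
    xi, *_ = np.linalg.lstsq(Theta, dX, rcond=None)   # initial least-squares fit
    for _ in range(n_iter):
        small = np.abs(xi) < tau                      # terms below the fixed threshold
        xi[small] = 0.0
        if (~small).any():                            # refit on the remaining terms only
            xi[~small], *_ = np.linalg.lstsq(Theta[:, ~small], dX, rcond=None)
    return xi
```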

Definition 1 (DSTLS Optimization Problem) For the \(k\)-th variable of our system, we can estimate the terms governing its derivative by solving the DSTLS optimization problem \[ \begin{gathered} \min_{\vi \xi_k} \frac 1 2 \norm{\dvi X_{\cdot, k} - \Theta(\vi X)\vi \xi_k}^2_2 \\ \constraint \forall i \in \set{1, \dots, p}: \quad \absval{\xi_{k_i}} \geq \tau \cdot \max \absval{\vi \xi_k}. \end{gathered} \]

In Definition 1, the meaning of the symbols is as follows:

- \(\dvi X_{\cdot, k}\) is the (estimated) derivative of the \(k\)-th state variable,
- \(\Theta(\vi X)\) is the library of candidate functions evaluated on the data \(\vi X\),
- \(\vi \xi_k\) is the corresponding column of the coefficient matrix \(\vi \Xi\),
- \(p\) is the number of candidate functions in the library,
- \(\tau\) is the (relative) sparsification threshold.

In other words, every coefficient retained in the model must be at least a fraction \(\tau\) of the largest coefficient in magnitude.
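Compared with the STLS sketch above, DSTLS changes essentially a single line: the threshold is rescaled by the largest coefficient in magnitude at every iteration, so \(\tau\) is interpreted relative to the current estimate (again only a NumPy sketch under the same assumptions as before):

```python
import numpy as np

def dstls(Theta, dX, tau, n_iter=10):
    """Dynamic STLS: tau is relative to the largest coefficient in magnitude."""
    xi, *_ = np.linalg.lstsq(Theta, dX, rcond=None)    # initial least-squares fit
    for _ in range(n_iter):
        small = np.abs(xi) < tau * np.max(np.abs(xi))  # dynamic, per-variable threshold
        xi[small] = 0.0
        if (~small).any():                             # refit on the remaining terms only
            xi[~small], *_ = np.linalg.lstsq(Theta[:, ~small], dX, rcond=None)
    return xi
```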

Such a modification is motivated by the FitzHugh-Nagumo (FHN) model of a neuron \[\begin{align*} \dot V &= V - \frac {V^3} 3 - W + i_e, \\ \dot W &= a \cdot (bV - cW + d), \end{align*}\] with the following parameters and initial conditions \[ \begin{gathered} a = 0.08, b = 1, c = 0.8, d = 0.7, i_e = 0.8, \\ V(0) = 3.3, W(0) = -2. \end{gathered} \]
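To generate synthetic data from this system, one can integrate it numerically; the sketch below uses SciPy, where the time span and sampling density are arbitrary choices made here for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp

# FitzHugh-Nagumo model with the parameters and initial conditions from the text
a, b, c, d, i_e = 0.08, 1.0, 0.8, 0.7, 0.8

def fhn(t, u):
    V, W = u
    return [V - V**3 / 3 - W + i_e, a * (b * V - c * W + d)]

t_eval = np.linspace(0.0, 100.0, 1000)                 # illustrative time grid
sol = solve_ivp(fhn, (0.0, 100.0), [3.3, -2.0], t_eval=t_eval)
X = sol.y.T                                            # (m, 2) data matrix, columns V and W
```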

The disparity in the magnitudes of the parameters between the two equations makes the ordinary STLS method sensitive to the value of the threshold \(\tau\). If \(\tau\) is chosen too large, only the constant zero solution is discovered for \(\dot W\); if it is chosen small enough to keep the terms of \(\dot W\), unnecessary terms are identified for \(\dot V\). Scaling the threshold by the largest absolute value of the estimated parameters aims to address this issue.
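To see the issue concretely (an illustrative calculation with the parameter values above, not a result from the poster): the true coefficients of \(\dot W\) are \(ab = 0.08\), \(ac = 0.064\) and \(ad = 0.056\), while the coefficients of \(\dot V\) are of order one. A fixed threshold of, say, \(\tau = 0.1\) therefore zeroes out every term of \(\dot W\), and lowering \(\tau\) below \(0.056\) keeps them at the cost of retaining spurious terms of comparable size in \(\dot V\). With the dynamic threshold, the cutoff for \(\dot W\) becomes \(\tau \cdot \max \absval{\vi \xi_k} \approx 0.1 \cdot 0.08 = 0.008\), which retains the true terms, while for \(\dot V\) it remains roughly \(0.1 \cdot 1 = 0.1\) and still removes small spurious terms.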

To illustrate the proposed method, let us assume the derivatives are unknown and the data are corrupted by additive white Gaussian noise (AWGN) with variance equal to 5 percent of the data's variance. The derivative is estimated with total-variation-regularized numerical differentiation, which is computed directly on the raw noisy data and smooths the data in the process. Finally, polynomials up to fourth order were used as the candidate library. The DSTLS optimizer correctly selects the appropriate candidate functions, unlike STLS.
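Continuing the sketches above, the setup can be mimicked roughly as follows. Note that the total-variation-regularized differentiation is replaced here by a plain finite difference, and the monomial basis and \(\tau = 0.1\) are illustrative choices, so this is only a stand-in for the actual experiment:

```python
rng = np.random.default_rng(0)

# additive white Gaussian noise with variance equal to 5 percent of each variable's variance
X_noisy = X + rng.normal(0.0, np.sqrt(0.05 * X.var(axis=0)), X.shape)

def poly_library(X, order=4):
    """Monomial library V^i * W^j with i + j <= order (a plain basis assumed here)."""
    V, W = X[:, 0], X[:, 1]
    cols = [np.ones_like(V)]
    for total in range(1, order + 1):
        for j in range(total + 1):
            cols.append(V ** (total - j) * W ** j)
    return np.column_stack(cols)

Theta = poly_library(X_noisy)

# stand-in for the total-variation-regularized derivative: plain finite differences
dX = np.gradient(X_noisy, sol.t, axis=0)

xi_V = dstls(Theta, dX[:, 0], tau=0.1)
xi_W = dstls(Theta, dX[:, 1], tau=0.1)
```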

The model identified by the ordinary STLS contains a number of spurious terms:

\[\begin{align*} \dot{V} =&\hphantom{+} 0.85 + 0.81 \cdot V + 0.27 \cdot W \cdot V ^2 + 0.28 \cdot V \cdot W ^2 \\ &- 1.02 \cdot W - 0.16 \cdot V ^2 - 0.22 \cdot V ^3 - 0.44 \cdot V \cdot W \\ &- 0.09 \cdot V ^2 \cdot W ^2,\\ \dot{W} =&\hphantom{+} 0.08 \cdot V \end{align*}\]

The model identified by DSTLS, on the other hand, matches the structure of the FHN system:

\[\begin{align*} \dot{V} =&\hphantom{+} 0.71 + 0.95 \cdot V - 0.32 \cdot V ^3 - 0.91 \cdot W ,\\ \dot{W} =&\hphantom{+} 0.05 + 0.08 \cdot V - 0.06 \cdot W \end{align*}\]


More comparisons and use cases are presented in the poster (and in a work-in-progress article). If I have time, I will add them here as well :).

All in all, I would like to thank the organizers of Julia & Optimization Days for such a great event, along with the participants who made for interesting discussions. Last but not least, a big thank you goes to my supervisor, assoc. prof. Lenka Přibylová, who arranged most of the bureaucratic side of things.

Presenting the poster…


References

Brunton, Steven L., and J. Nathan Kutz. 2019. Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control. Cambridge University Press. https://doi.org/10.1017/9781108380690.