Numerically Robust Fixed-Point Smoothing Without State Augmentation

Authors: Nicholas Krämer

TMLR 2025

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The experiments serve two purposes. To start with, they investigate whether the proposed Cholesky-based implementation (Algorithm 3) of the fixed-point smoother recursion (Algorithm 1) holds its promises about memory, runtime, and numerical robustness.
Researcher Affiliation | Academia | Nicholas Krämer EMAIL Technical University of Denmark, Kongens Lyngby, Denmark
Pseudocode | Yes | Algorithm 1 (Fixed-point smoother): To compute the solution to the fixed-point smoothing problem, assemble p(x_K | y_1:K) with a Kalman filter and evaluate p(x_0 | x_K, y_1:K) as follows. (To simplify the index-related notation in this algorithm, read y_1:-1 = y_1:0 = ∅.) ... Algorithm 2 (Covariance-based implementation of Equation 12). ... Algorithm 3 (Cholesky-based implementation of Equation 12).
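The filtering pass that Algorithm 1 builds on is a standard Kalman predict/update recursion. The following is a minimal illustrative sketch in NumPy; the function name and variable names are ours, and the paper's actual implementation is Cholesky-based and written in JAX, so this covariance-based form only mirrors the structure of the recursion, not the paper's numerics:

```python
import numpy as np

def kalman_step(m, C, A, Q, H, R, y):
    """One predict/update step of a covariance-based Kalman filter.

    m, C: previous filtering mean/covariance; A, Q: linear transition
    model; H, R: linear observation model; y: current observation.
    (Illustrative sketch; not the paper's code.)
    """
    # Predict: push the estimate through the linear dynamics.
    m_pred = A @ m
    C_pred = A @ C @ A.T + Q
    # Update: condition the prediction on the observation y.
    S = H @ C_pred @ H.T + R             # innovation covariance
    K = C_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    m_new = m_pred + K @ (y - H @ m_pred)
    C_new = C_pred - K @ S @ K.T
    return m_new, C_new

# Example: with near-noiseless scalar observations, one step pulls the
# mean almost exactly onto the observed value.
m, C = kalman_step(np.zeros(1), np.eye(1), np.eye(1), np.zeros((1, 1)),
                   np.eye(1), 1e-12 * np.eye(1), np.array([1.0]))
```

Iterating this step over y_1, ..., y_K yields the terminal filtering distribution p(x_K | y_1:K) that Algorithm 1 starts from.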
Open Source Code | Yes | Code: https://github.com/pnkraemer/code-numerically-robust-fixedpoint-smoother
Open Datasets | Yes | Problem setup: We solve a boundary value problem based on an ordinary differential equation. More specifically, we solve the 15th in the collection of test problems by Mazzia and Cash (2015) (Figure 3).
Dataset Splits | No | The paper describes generating synthetic data for experiments (e.g., "randomly populate all system matrices in the state-space model with independent samples from N(0, 1/K^2)" or "sample an initial condition"). It does not use or specify traditional training/test/validation splits for any pre-existing dataset.
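The quoted synthetic-data setup can be sketched as follows. The function name, the return layout, and the diagonal noise covariances are our own illustrative assumptions; the source only specifies that the system matrices are filled with independent N(0, 1/K^2) samples:

```python
import numpy as np

def random_system_matrices(K, D, d, seed=0):
    # Populate the transition and observation matrices with independent
    # N(0, 1/K^2) samples, as the quoted setup describes. The diagonal
    # noise covariances below are an assumption for illustration only.
    rng = np.random.default_rng(seed)
    std = 1.0 / K                          # standard deviation, variance 1/K^2
    A = rng.normal(0.0, std, size=(D, D))  # latent transition matrix
    H = rng.normal(0.0, std, size=(d, D))  # observation matrix
    Q = std**2 * np.eye(D)                 # process noise (assumption)
    R = std**2 * np.eye(d)                 # observation noise (assumption)
    return A, H, Q, R

# Hidden state twice the size of the observation, as in Experiment I.
A, H, Q, R = random_system_matrices(K=1_000, D=4, d=2)
```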
Hardware Specification | No | All experiments run on the CPU of a consumer-grade laptop and finish within a few minutes.
Software Dependencies | No | Our JAX implementation (Bradbury et al., 2018) of Kalman filters, Rauch-Tung-Striebel smoothers, and fixed-point smoothers is at https://github.com/pnkraemer/code-numerically-robust-fixedpoint-smoother. The paper mentions JAX but does not specify its version number, nor does it list versions for any other key software libraries or dependencies.
Experiment Setup | Yes | For Experiment I, we choose K = 1,000, vary d, and set the size of the hidden state to D = 2d... For Experiment II, we choose a twice-integrated Wiener process prior and discretise it on K equispaced points in [-1, 1]... We choose the initial mean m_0|0 = (1, 0, 0) and initial covariance C_0|0 = diag(0, 1, 1)... For the case study, we define the Wiener velocity model on K = 10 equispaced points (Δt = 1/10)... We implement the fixed-point smoother recursion in Cholesky-based arithmetic and run expectation maximisation for three iterations. We initialise the mean guess by sampling all entries independently from a centred normal distribution with a variance of 100.
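The Wiener velocity model used in the case study has a standard closed-form discretisation (a textbook result, not taken from the paper's code); a sketch, where the diffusion intensity q and all names are our own choices:

```python
import numpy as np

def discretise_wiener_velocity(dt, q=1.0):
    """Exact discretisation of the Wiener velocity model on a grid with
    step size dt. State: (position, velocity); q scales the process noise.
    (Illustrative sketch under our own notation.)"""
    A = np.array([[1.0, dt],
                  [0.0, 1.0]])              # transition matrix
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])     # process-noise covariance
    return A, Q

# K = 10 equispaced points with step size 1/10, as in the case study.
A, Q = discretise_wiener_velocity(dt=1 / 10)
```

The same construction with one more integration yields the twice-integrated Wiener process prior used in Experiment II (a 3x3 transition and noise covariance).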