22-24 September 2021
Alpen-Adria-Universität Klagenfurt
Europe/Vienna timezone

A discrepancy-type stopping rule for conjugate gradients under white noise

22 Sep 2021, 14:40
30m
HS 1 (Alpen-Adria-Universität Klagenfurt)

Speaker

Markus Reiß (Humboldt-Universität zu Berlin)

Description

We consider a linear inverse problem of the form $y=Ax+\epsilon \dot W$, where the action of the operator (matrix) $A$ on the unknown $x$ is corrupted by white noise (a standard Gaussian vector) $\dot W$ of level $\epsilon>0$. We study the candidate solutions $\hat x_m$ given by the $m$-th iterate of conjugate gradients applied to the normal equations (CGNE). Refining Nemirovskii's trick, we provide explicit error bounds for the best (oracle) iterate along the iteration path. This yields optimal estimation rates over polynomial source conditions.
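
As background (a standard characterization of CGNE, not specific to this talk): the $m$-th iterate minimizes the residual over a data-dependent Krylov space, $\hat x_m=\arg\min_{x\in\mathcal K_m}\|y-Ax\|^2$ with $\mathcal K_m=\operatorname{span}\{A^*y,(A^*A)A^*y,\dots,(A^*A)^{m-1}A^*y\}$. Since $\mathcal K_m$ itself depends on the data $y$, the estimators $\hat x_m$ are nonlinear in $y$, in contrast to linear spectral regularization methods.
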
In a second step we identify monotonic proxies for the bias (approximation error) and the variance (stochastic error) of the nonlinear estimators $\hat x_m$ and develop a residual-based stopping rule for a data-driven choice $\hat m$ of the number of iterations. This yields a stochastic version of the discrepancy principle. Using tools from concentration of measure and extending deterministic ideas by Hanke, we provide an oracle-type inequality for the prediction error $E[\|A(\hat x_{\hat m}-x)\|^2]$ (non-trivial under white noise), which gives rate optimality up to a dimensionality effect. Finally, we also provide partial results for the estimation error $E[\|\hat x_{\hat m}-x\|^2]$ and discuss the challenges posed by the statistical noise.
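
As an illustration, here is a minimal numerical sketch of CGNE with a simple discrepancy-type stop on a synthetic diagonal problem. The operator, the decay rates, the helper name cgne_path and the safety factor tau are illustrative assumptions; the rule shown (stop once the residual falls below $\tau\epsilon\sqrt{D}$, the scale of the expected noise norm, since $E\|\epsilon\dot W\|^2=\epsilon^2 D$) is only a plain stand-in for the refined rule analyzed in the talk.

import numpy as np

# Minimal sketch: CGNE (conjugate gradients on the normal equations)
# with a discrepancy-type stopping rule under white noise.
# Operator, decay rates and the factor tau are illustrative assumptions.

rng = np.random.default_rng(0)
D = 500                                   # dimension of data and parameter

# Mildly ill-posed diagonal operator: polynomially decaying singular values.
sigma = (1.0 + np.arange(D)) ** -0.5
A = np.diag(sigma)

x_true = (1.0 + np.arange(D)) ** -1.0     # smooth truth (polynomial source condition)
eps = 1e-2                                # noise level
y = A @ x_true + eps * rng.standard_normal(D)   # y = A x + eps * W

def cgne_path(A, y, m_max):
    """CG applied to A^T A x = A^T y; returns the iterates x_0, ..., x_{m_max}."""
    x = np.zeros(A.shape[1])
    r = y.copy()                          # data-space residual y - A x
    s = A.T @ r
    d = s.copy()
    gamma = s @ s
    path = [x.copy()]
    for _ in range(m_max):
        q = A @ d
        alpha = gamma / (q @ q)
        x = x + alpha * d
        r = r - alpha * q
        s = A.T @ r
        gamma_new = s @ s
        d = s + (gamma_new / gamma) * d
        gamma = gamma_new
        path.append(x.copy())
    return path

path = cgne_path(A, y, m_max=50)
residuals = [np.linalg.norm(y - A @ x) for x in path]

# Discrepancy-type stop: first m with ||y - A x_m|| <= tau * eps * sqrt(D);
# tau > 1 is a safety factor.
tau = 1.2
m_hat = next((m for m, res in enumerate(residuals) if res <= tau * eps * np.sqrt(D)),
             len(path) - 1)               # fall back to the last iterate

errors = [np.linalg.norm(x - x_true) for x in path]
m_oracle = int(np.argmin(errors))
print(f"stopped at m_hat = {m_hat}, estimation error {errors[m_hat]:.4f}")
print(f"oracle iterate m = {m_oracle}, estimation error {errors[m_oracle]:.4f}")

Comparing $\hat m$ with the oracle iterate on such toy data gives a rough feel for the oracle-type inequality, i.e. how close a data-driven residual-based stop can come to the best iterate along the path.
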

Primary author

Markus Reiß (Humboldt-Universität zu Berlin)
