Description
Variable exponent Lebesgue spaces $\ell^{(p_n)}$ have recently been shown to provide an appropriate functional framework for enforcing pixel-adaptive regularisation in signal and image processing applications, in combination with gradient descent (GD) or proximal GD strategies. Compared to standard Hilbert or Euclidean settings, however, applying these algorithms in the Banach setting of $\ell^{(p_n)}$ is not straightforward, since the underlying norm lacks a closed-form expression and is not separable. We propose using the associated modular function, which is separable, in place of the norm to define GD-based algorithms in $\ell^{(p_n)}$, and we consider a stochastic GD variant to reduce the per-iteration cost of the resulting iterative schemes. The methods are applied to linear inverse problems arising in real-world image reconstruction.
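As a rough illustration of the idea, below is a minimal Python sketch of one plausible modular-based (stochastic) GD update for the least-squares data term $f(x)=\tfrac{1}{2}\|Ax-b\|^2$. It assumes the separable modular $\rho(x)=\sum_n |x_n|^{p_n}/p_n$ and a mirror-descent-style step through $\nabla\rho$ and the gradient of its convex conjugate; the function names, step size, and toy problem are illustrative assumptions, not the speaker's exact algorithm.

```python
# Sketch of a modular-based GD step in ell^{(p_n)} (illustrative, not the
# speaker's exact method), assuming the update
#   x^{k+1} = grad(rho*)( grad(rho)(x^k) - alpha * grad(f)(x^k) ),
# where rho(x) = sum_n |x_n|^{p_n} / p_n is the separable modular.
import numpy as np

def grad_modular(x, p):
    """Gradient of the separable modular rho(x) = sum_n |x_n|^{p_n}/p_n."""
    return np.sign(x) * np.abs(x) ** (p - 1.0)

def grad_modular_conjugate(y, p):
    """Gradient of the conjugate modular; q_n is the Hoelder conjugate of p_n."""
    q = p / (p - 1.0)
    return np.sign(y) * np.abs(y) ** (q - 1.0)

def modular_gd(A, b, p, alpha=1e-4, iters=2000, batch=None, rng=None):
    """Modular-based (stochastic) GD for f(x) = 0.5 * ||A x - b||^2.
    If `batch` is given, the gradient is estimated from a random subset
    of the rows of A (the stochastic variant)."""
    rng = rng or np.random.default_rng(0)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        if batch is None:                      # full gradient
            g = A.T @ (A @ x - b)
        else:                                  # unbiased mini-batch estimate
            idx = rng.choice(m, size=batch, replace=False)
            g = (m / batch) * A[idx].T @ (A[idx] @ x - b[idx])
        # step in the dual variable, then map back via the conjugate modular
        x = grad_modular_conjugate(grad_modular(x, p) - alpha * g, p)
    return x

# Toy usage: a random linear inverse problem with pixel-adaptive exponents.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 100))
x_true = rng.standard_normal(100)
b = A @ x_true
p = 1.2 + 0.6 * rng.random(100)               # exponents p_n in (1, 2)
x_hat = modular_gd(A, b, p, batch=32, rng=rng)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Note the design point the abstract emphasises: because the modular is a coordinate-wise sum, both maps above act componentwise, so each update costs no more than a standard Euclidean GD step, whereas the $\ell^{(p_n)}$ norm itself admits no such closed-form, separable evaluation.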