
Adaptive minimax-optimal multivariate deconvolution of anisotropic distribution functions

Catia Scricciolo
2026-01-01

Abstract

We consider the classical multivariate convolution model of nonparametric statistics, \(\mathbf{Y} = \mathbf{X} + \boldsymbol{\varepsilon}\), where \(\mathbf{X}\) and \(\boldsymbol{\varepsilon}\) are independent \(d\)-dimensional random vectors with \(d \geq 2\). The goal is to recover the distribution function of \(\mathbf{X}\) from independent, noise-contaminated replicates of \(\mathbf{Y}\). The distribution of the noise \(\boldsymbol{\varepsilon}\) is assumed to be known, ordinary smooth and anisotropic. We extend to the multivariate setting a recently proposed minimum \(L^p\)-distance estimator (\(p \geq 1\)) of the distribution function in the univariate convolution model, based on integrated kernel density estimation. We analyze its performance under the \(L^1\)-risk over anisotropic Nikol'skii classes: we derive a non-asymptotic upper bound on the \(L^1\)-risk together with the convergence rates it implies over these function spaces, and we complement these results with a matching asymptotic lower bound, thereby establishing the minimax optimality of the estimator. Furthermore, we propose a fully data-driven, rate-adaptive procedure that automatically selects the bandwidth vector across the full Nikol'skii regularity scale of the mixing distribution: it requires no prior knowledge of the oracle bandwidth, yet the resulting distribution function estimator attains the optimal convergence rate. Finally, we discuss a potential extension to the setting of an unknown noise density when an auxiliary sample from the noise distribution is available.
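
To fix ideas, the following is a minimal univariate sketch, not the paper's construction, of the kind of estimator the abstract refers to: a Fourier-inversion deconvolution kernel density estimate that is integrated on a grid to produce a distribution function estimate. The Laplace (ordinary smooth) noise law, the specific kernel with compactly supported Fourier transform, and the helper name deconv_cdf_estimate are illustrative assumptions.

```python
import numpy as np

def deconv_cdf_estimate(Y, h, x_grid, noise_scale=1.0, n_t=2001):
    """Integrated deconvolution kernel estimate of the CDF of X from Y = X + eps,
    assuming eps ~ Laplace(0, noise_scale) with known characteristic function."""
    t = np.linspace(-1.0 / h, 1.0 / h, n_t)                 # Fourier support of the kernel is [-1/h, 1/h]
    dt = t[1] - t[0]
    phi_emp = np.mean(np.exp(1j * np.outer(t, Y)), axis=1)  # empirical characteristic function of Y
    phi_eps = 1.0 / (1.0 + (noise_scale * t) ** 2)          # Laplace noise characteristic function (ordinary smooth)
    u = h * t
    phi_K = np.where(np.abs(u) <= 1.0, (1.0 - u ** 2) ** 3, 0.0)  # kernel with compactly supported Fourier transform
    integrand = phi_K * phi_emp / phi_eps                   # regularized deconvolution in the Fourier domain
    # inverse Fourier transform on the x-grid yields the deconvolution density estimate
    dens = np.real(np.exp(-1j * np.outer(x_grid, t)) @ integrand) * dt / (2.0 * np.pi)
    dens = np.clip(dens, 0.0, None)                         # crude fix of small negative oscillations
    cdf = np.cumsum(dens) * (x_grid[1] - x_grid[0])         # integrate the density estimate
    return np.clip(cdf, 0.0, 1.0)

# Toy usage: X ~ N(0,1), eps ~ Laplace(0,1), n = 2000 noisy observations Y = X + eps.
rng = np.random.default_rng(0)
X = rng.normal(size=2000)
eps = rng.laplace(scale=1.0, size=2000)
x_grid = np.linspace(-5.0, 5.0, 401)
F_hat = deconv_cdf_estimate(X + eps, h=0.4, x_grid=x_grid, noise_scale=1.0)
```

In the multivariate, anisotropic setting studied in the paper, a scalar bandwidth h would be replaced by a bandwidth vector, one component per coordinate, whose data-driven selection is the subject of the adaptive procedure.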
Keywords: Adaptive estimation; Anisotropic Nikol'skii classes; Minimax optimality; Multivariate distribution function deconvolution; Ordinary smooth errors

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1182413