2017, Articolo in rivista, ENG
L. Lenarduzzi and R. Schaback
One of the basic principles of Approximation Theory is that the quality of approximations increases with the smoothness of the function to be approximated. Functions that are smooth in certain subdomains will have good approximations in those subdomains, and these {\em sub-approximations} can possibly be calculated efficiently in parallel, as long as the subdomains do not overlap. This paper proposes a class of algorithms that first calculate sub-approximations on non-overlapping subdomains, then extend the subdomains as much as possible, and finally produce a global solution on the given domain by letting the subdomains fill the whole domain.
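A rough one-dimensional illustration of the blending idea described in this abstract, not the paper's algorithm: two sub-approximants are fitted on non-overlapping subdomains, the subdomains are then extended so that together they cover the whole interval, and the global approximant blends the two with a smooth partition-of-unity weight on the overlap. The split point, the overlap width and the use of low-degree polynomial fits are illustrative assumptions.

```python
import numpy as np

def blended_fit(x, y, split=0.0, overlap=0.2, deg=3):
    # fit separate approximants on the two (extended) subdomains
    left  = np.polyfit(x[x <= split + overlap], y[x <= split + overlap], deg)
    right = np.polyfit(x[x >= split - overlap], y[x >= split - overlap], deg)
    def s(t):
        # smooth weight: 1 to the left of the overlap, 0 to the right of it
        w = np.clip((split + overlap - t) / (2 * overlap), 0.0, 1.0)
        w = 3 * w**2 - 2 * w**3
        return w * np.polyval(left, t) + (1 - w) * np.polyval(right, t)
    return s

# toy data: smooth on each half, with a kink at 0
x = np.linspace(-1, 1, 80)
y = np.where(x < 0, np.sin(3 * x), 1.5 * x)
print(blended_fit(x, y)(np.array([-0.5, 0.0, 0.5])))
```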
2017, Articolo in rivista, ENG
L. Lenarduzzi and M.Pepe
Glaciers' changes are an important indicator of global warming, and the interpretation of satellite images offers a good source of information for such monitoring. It is important to select areas of interest for the detection of glacier bodies and, in particular, to capture also the part covered with debris, which is an open issue in remote sensing of glaciers. We use enhancing features with a physical meaning and provide a coarse segmentation of the image of the Lys glacier, belonging to the Monte Rosa district in the Alps. The image data correspond to electromagnetic sensing of sunlight as reflected in a band of the visible red spectral range, TM3, and in two infrared bands, TM4 and TM5, as acquired by the Thematic Mapper (TM) sensor aboard the Landsat satellite on clear sunny days.
2015, Articolo in rivista, ENG
L. Lenarduzzi
We consider a map of curvature of the cornea that presents a central singularity. The application is that of compressing the information of the map so that it can later be recovered faithfully from the selected information, avoiding blurring effects.
2015, Articolo in rivista, ENG
M. Bozzini, L. Lenarduzzi, M. Rossini, and R. Schaback
Within kernel-based interpolation and its many applications, the handling of the scaling or the shape parameter is a well-documented but unsolved problem. We consider native spaces whose kernels allow us to change the kernel scale of a $d$-variate interpolation problem locally, depending on the requirements of the application. The trick is to define a scale function $c$ on the domain $\Omega \subset \mathbb{R}^d$ to transform an interpolation problem from data locations $x_j$ in $\mathbb{R}^d$ to data locations $(x_j, c(x_j))$ and to use a fixed-scale kernel on $\mathbb{R}^{d+1}$ for interpolation there. The $(d+1)$-variate solution is then evaluated at $(x, c(x))$ for $x \in \mathbb{R}^d$ to give a $d$-variate interpolant with a varying scale. A large number of examples show how this can be done in practice to get results that are better than the fixed-scale technique, with respect to both condition number and error. The background theory coincides with fixed-scale interpolation on the submanifold of $\mathbb{R}^{d+1}$ given by the points $(x, c(x))$ of the graph of the scale function $c$.
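A minimal sketch of the scale-lifting trick summarized above: data sites $x_j$ in $\mathbb{R}^d$ are mapped to $(x_j, c(x_j))$ in $\mathbb{R}^{d+1}$, a fixed-scale kernel interpolant is built there, and the $d$-variate interpolant is read off along the graph of $c$. The Gaussian kernel, the shape parameter and the particular scale function used here are illustrative stand-ins, not choices taken from the paper.

```python
import numpy as np

def gauss_kernel(A, B, eps=1.0):
    """Gaussian kernel matrix between point sets A (m x D) and B (n x D)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2)

def lifted_interpolant(X, f, c, eps=1.0):
    """Return x -> s(x) interpolating f at X with a varying scale.

    X : (n, d) data sites, f : (n,) data values, c : callable scale function.
    """
    Xl = np.hstack([X, c(X)[:, None]])        # lift sites to R^{d+1}
    coef = np.linalg.solve(gauss_kernel(Xl, Xl, eps), f)
    def s(Y):
        Yl = np.hstack([Y, c(Y)[:, None]])    # evaluate along the graph of c
        return gauss_kernel(Yl, Xl, eps) @ coef
    return s

# toy usage: a 1-d problem whose scale shrinks near x = 0
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-1, 1, 40))[:, None]
f = np.tanh(20 * X[:, 0])
c = lambda P: 0.1 + 0.5 * np.abs(P[:, 0])     # hypothetical scale function
s = lifted_interpolant(X, f, c, eps=4.0)
print(s(np.array([[0.0], [0.5]])))
```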
2014, Articolo in rivista, ENG
M. Bozzini and L. Lenarduzzi
When the data are unevenly distributed and the behaviour of a function changes abruptly, the approximant can present undue oscillations. We present an algorithm to identify a domain decomposition such that, on each subdomain, the behaviour of the function is sufficiently homogeneous to calculate separate approximants and blend them together.
2013, Contributo in atti di convegno, ENG
L. Lenarduzzi
MAMERN'13, Granada, 22-25/4/2013
2012, Presentazione, ENG
L. Lenarduzzi
Eighth International Conference on Mathematical Methods for Curves and Surfaces, Oslo
2012, Abstract in atti di convegno, ENG
L. Lenarduzzi
Congresso Nazionale SIMAI 2012
2011, Contributo in atti di convegno
M. Bozzini and L. Lenarduzzi
4th International Conference on Approximation Methods and Numerical Modelling in Environment and Natural Resources
2010, Contributo in atti di convegno, ENG
M. Bozzini and L. Lenarduzzi
In this paper our concern is the recovery of a highly regular function from a discrete set $X$ of data with arbitrary distribution. We consider the case of a nonstationary multiquadric interpolant that presents numerical breakdown. We therefore propose a global least squares multiquadric approximant with a center set $T$ of maximal size, obtained by a new thinning technique. The new thinning scheme removes local bad conditioning in order to obtain a well-conditioned least squares matrix. The choice of working on local subsets of the data set $X$ provides an effective solution. Some numerical examples are given to validate the proposal.
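A hedged sketch of the general idea only: centers lying too close to an already kept center are discarded by a simple separation-distance thinning, and a least squares multiquadric fit on the thinned center set $T$ is computed against all data in $X$. The separation threshold, the greedy order and the shape parameter are illustrative assumptions, not the thinning scheme of the paper.

```python
import numpy as np

def thin(X, sep):
    """Greedily keep points of X that are at least `sep` apart."""
    kept = []
    for p in X:
        if all(np.linalg.norm(p - q) >= sep for q in kept):
            kept.append(p)
    return np.array(kept)

def mq_ls_fit(X, f, T, c=0.5):
    """Least squares multiquadric approximant with centers T."""
    A = np.sqrt(((X[:, None, :] - T[None, :, :]) ** 2).sum(-1) + c * c)
    coef, *_ = np.linalg.lstsq(A, f, rcond=None)
    def s(Y):
        B = np.sqrt(((Y[:, None, :] - T[None, :, :]) ** 2).sum(-1) + c * c)
        return B @ coef
    return s

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (300, 2))
f = np.exp(-8 * ((X[:, 0] - 0.5) ** 2 + (X[:, 1] - 0.5) ** 2))
T = thin(X, sep=0.08)                      # thinned center set
s = mq_ls_fit(X, f, T)
print(len(T), s(np.array([[0.5, 0.5]])))
```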
2010, Articolo in rivista, ENG
M. Bozzini, L. Lenarduzzi, and M. Rossini
The aim of the paper is to provide a computationally effective way to construct stable bases on general non-degenerate lattices. In particular, we define new stable bases on hexagonal lattices and we give some numerical examples which show their usefulness in applications.
2010, Articolo in rivista, ENG
M. Bozzini, L. Lenarduzzi, and M. Rossini
The aim of this paper is to provide a fast method, with a good quality of reproduction, to recover functions from very large and irregularly scattered samples of noisy data, which may present outliers. To the given sample of size $N$, we associate a uniform grid and, around each grid point, we condense the local information given by the noisy data by a suitable estimator. The recovery is then performed by a stable interpolation based on isotropic polyharmonic B-splines. Due to the good approximation rate, we need only $M < N$ degrees of freedom to recover the phenomenon faithfully.
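A minimal sketch of the two-step idea, under stated assumptions: the noisy scattered samples are condensed onto a coarse uniform grid by a robust local estimator (here the median, an illustrative choice), and the condensed grid values are then interpolated by a thin-plate spline, which stands in for the isotropic polyharmonic B-spline basis used in the paper. The grid size and the neighbourhood radius are assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def condense_and_fit(X, f, m=20, radius=0.08):
    """X: (N, 2) sites in [0,1]^2, f: (N,) noisy values; returns an interpolant."""
    g = (np.arange(m) + 0.5) / m
    G = np.array([(u, v) for u in g for v in g])    # uniform grid nodes
    vals, keep = [], []
    for k, p in enumerate(G):
        near = np.linalg.norm(X - p, axis=1) < radius
        if near.any():
            vals.append(np.median(f[near]))         # robust local estimate
            keep.append(k)
    return RBFInterpolator(G[keep], np.array(vals), kernel="thin_plate_spline")

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, (20000, 2))
f = np.sin(4 * X[:, 0]) * X[:, 1] + 0.05 * rng.standard_normal(20000)
f[::500] += 5.0                                     # a few outliers
s = condense_and_fit(X, f)
print(s(np.array([[0.3, 0.7]])))
```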
2009, Curatela di monografia/trattato scientifico
Lenarduzzi L.
2009, Articolo in rivista, ENG
Bozzini M., Lenarduzzi L.
We propose an adaptive local procedure, which uses the modified Shepard's method with local polyharmonic interpolants. The aim is to reconstruct, in a faithful way, a function known through a large and highly irregularly distributed sample. Such a problem is generally related to the recovery of geophysical surfaces, where the sample is measured according to the behaviour of the surface. The adaptive local procedure is used to calculate, by an efficient algorithm, an interpolating polyharmonic function when a very large sample is assigned. When we consider a sample of size $N<10^4$, we propose an approximating polyharmonic function obtained by combining adaptively a global interpolant, relevant to a subset of the data, with local adaptive interpolants. The goodness of the approximating functions in the two cases is shown by real examples.
DOI: 10.1685/CSC09260
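A hedged sketch of a Shepard-type partition-of-unity combination of local interpolants, in the spirit of the procedure described in the abstract above: each evaluation point receives compactly supported inverse-distance weights relative to a set of nodes, and local thin-plate spline interpolants on the nearest data (standing in for the paper's polyharmonic interpolants) are blended with those weights. The node layout, neighbourhood size, support radius and weight exponent are illustrative assumptions, not the paper's adaptive choices.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def shepard_polyharmonic(X, f, nodes, k=40, R=0.5, mu=2):
    # one local interpolant per node, built on its k nearest data points
    locs = []
    for z in nodes:
        idx = np.argsort(np.linalg.norm(X - z, axis=1))[:k]
        locs.append(RBFInterpolator(X[idx], f[idx], kernel="thin_plate_spline"))
    def s(Y):
        D = np.linalg.norm(Y[:, None, :] - nodes[None, :, :], axis=-1)
        # Franke-Little style weights with support radius R
        W = np.maximum(1 - D / R, 0.0) ** mu / np.maximum(D, 1e-12) ** mu
        W /= W.sum(axis=1, keepdims=True)
        vals = np.stack([loc(Y) for loc in locs], axis=1)   # (m, n_nodes)
        return (W * vals).sum(axis=1)
    return s

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (5000, 2))
f = np.cos(5 * X[:, 0]) + X[:, 1] ** 2
nodes = np.array([(u, v) for u in np.linspace(0.1, 0.9, 5)
                          for v in np.linspace(0.1, 0.9, 5)])
s = shepard_polyharmonic(X, f, nodes)
print(s(np.array([[0.4, 0.6]])))
```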
2008, Presentazione
Bozzini M., Lenarduzzi L., Rossini M.
Mathematical Methods for Curves and Surfaces, Tonsberg, Norway
2008, Presentazione
Bozzini M., Lenarduzzi L.
Tenth Int. Conf. Zaragoza-Pau in Applied Mathematics and Statistics, Jaca, Spain
2008, Presentazione
Bozzini M., Lenarduzzi L.
Convegno SIMAI, Roma
2007, Poster
Pepe M., Rampini A., Carrara P., Lenarduzzi L.
Workshop on Applied Remote Sensing in Mountain Regions, Bolzano
2007, Presentazione
Bozzini M., Lenarduzzi L.
Fourth International Conference on Multivariate Approximation, Cancun, Mexico
2007, Presentazione
Bozzini M., Lenarduzzi L., Rossini M.
MAIA 07: International Conference on Multivariate Approximation, Alesund, Norway