2019
DOI: 10.14393/rbcv71n2-47697

Two aspects on L1-norm adjustment of leveling networks

Abstract: L1-norm adjustment corresponds to the minimization of the sum of weighted absolute residuals. Unlike Least Squares, it is a robust estimator, i.e., insensitive to outliers. In geodetic networks, the main application of the L1-norm refers to the identification of outliers. There is no general analytical expression for its solution. Linear programming is the usual strategy, but it demands decorrelated observations. In the context of Least Squares, it is well known that the application of Cholesky factorization decorrelates…
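To make the abstract's two points concrete, here is a minimal Python sketch (not from the paper; the matrices and numbers are invented for illustration) of how a Cholesky factor of the observation covariance decorrelates the observations, which is the prerequisite for the linear-programming solution of the L1-norm problem mentioned above.

import numpy as np

# Toy functional model l = A x + v: 3 height-difference observations, 2 unknown heights.
# All values are invented for illustration.
A = np.array([[ 1.0,  0.0],
              [-1.0,  1.0],
              [ 0.0, -1.0]])
l = np.array([10.02, 5.01, -15.00])

# Assumed covariance matrix of the observations (correlated), in m^2.
Sigma = 1e-6 * np.array([[4.0, 1.0, 0.0],
                         [1.0, 3.0, 0.5],
                         [0.0, 0.5, 2.0]])

# Cholesky factorization Sigma = L L^T. Premultiplying the model by L^{-1}
# gives transformed observations with identity covariance (decorrelated,
# unit weight), as the linear-programming formulation of L1-norm requires.
L = np.linalg.cholesky(Sigma)
A_bar = np.linalg.solve(L, A)
l_bar = np.linalg.solve(L, l)

# The L1-norm adjustment then minimizes sum(|l_bar - A_bar @ x|), i.e. the sum
# of (now equally weighted) absolute residuals, instead of the Least Squares
# objective (l - A x)^T Sigma^{-1} (l - A x).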

Cited by 6 publications (6 citation statements)
References 13 publications

“…For all networks, the standard deviation of the observations was given by σ_i = 1.0 mm × √K_i, where K_i (in km) is the length of the respective leveling line. In the ascending order of the observation index, the lengths (in km) of each leveling line were as follows: for network A, [42, 38, 27, 22, 23, 33]; for network B, [37, 28, 33, 26, 40, 32, 39, 29, 34, 41]; and for network C, [30, 34, 25, 37, 28, 38, 29, 35, 31, 26, 33, 36, 27, 32, 24]. Therefore, for example, σ_i of the 4th observation of network A (which is also the lowest σ_i of all networks) is σ_4(Network A) = 1.0 mm × √22 ≈ 4.69 mm.…”
Section: Experiments and Results (mentioning)
confidence: 99%
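For readers who want to reproduce the quoted values, a minimal Python check of the σ_i = 1.0 mm × √K_i rule for the line lengths listed in the quotation could look like this (the loop layout is ours, not the cited authors'):

import math

# Leveling-line lengths (km) per network, as listed in the quotation above.
lengths_km = {
    "A": [42, 38, 27, 22, 23, 33],
    "B": [37, 28, 33, 26, 40, 32, 39, 29, 34, 41],
    "C": [30, 34, 25, 37, 28, 38, 29, 35, 31, 26, 33, 36, 27, 32, 24],
}

for network, lengths in lengths_km.items():
    sigmas_mm = [1.0 * math.sqrt(k) for k in lengths]   # sigma_i = 1.0 mm * sqrt(K_i)
    print(network, [f"{s:.2f}" for s in sigmas_mm])

# 4th observation of network A: 1.0 * sqrt(22) ≈ 4.69 mm, as stated in the quotation.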
“…This work focuses on the solution by the simplex method [28] of linear programming, the most widely used approach for solving the minimum L1-norm problem [23]. In geodetic networks, it has already been applied by many authors (see, e.g., [19, 20, 29–31]).…”
Section: Introduction (mentioning)
confidence: 99%
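As a hedged sketch of the linear-programming formulation referred to in this quotation (the function name and the use of SciPy's HiGHS dual-simplex solver are our choices, not the authors' Matlab implementation): each residual is split into two non-negative variables, so the sum of absolute residuals becomes a linear objective.

import numpy as np
from scipy.optimize import linprog

def l1_adjustment(A, l):
    """Minimum L1-norm adjustment: minimize sum(|v|) with v = l - A @ x.

    Assumes decorrelated, equally weighted observations (e.g. after a
    Cholesky transformation of the original, possibly correlated ones).
    """
    n, m = A.shape
    # Decision variables: [x (m values, free), u (n, >= 0), w (n, >= 0)],
    # with v = u - w, so that |v_i| = u_i + w_i at the optimum.
    c = np.concatenate([np.zeros(m), np.ones(n), np.ones(n)])
    A_eq = np.hstack([A, np.eye(n), -np.eye(n)])       # A x + u - w = l
    bounds = [(None, None)] * m + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=l, bounds=bounds, method="highs-ds")
    x_hat = res.x[:m]
    v_hat = l - A @ x_hat
    return x_hat, v_hat

Fed with decorrelated quantities such as A_bar and l_bar from the earlier sketch, this returns the L1 estimate and its residuals; observations with large absolute residuals are the natural candidates for outlier identification, which the abstract names as the main geodetic application of the L1-norm.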
“…Taking this perspective into account, it is important to stress that, depending on the nature or behavior of the random errors, the Least Squares Method (MMQ) may or may not be the best unbiased estimator (KLEIN, 2012). Alternative approaches to Least Squares adjustment of leveling networks can be found, for example, in Suraci, Oliveira and Klein (2019) or Suraci and Oliveira (2020).…”
Section: Adjustment Residuals (unclassified)
“…In the present work we use the leveling network presented in Suraci et al. (2019), Figure 2, and personal programming code developed under Matlab R2019b (Matlab, 2019) to examine the performance of the different approaches outlined above for outlier detection. As we can see, the situation becomes increasingly intractable as the number of outliers approaches half the number of degrees of freedom of the network.…”
Section: Examples (mentioning)
confidence: 99%
“…The example network presented in Suraci et al. (2019) was simulated with zero error in its measurements; therefore, the least squares adjustment yields only null residuals for all its observations. To more closely resemble a real network, we must consider the accidental errors that are inherent to any measuring process, so we add to each … It is worth recalling here that the data snooping test is based on the assumption of a possible single outlier only, although it is frequently used with the unjustified hope of a high efficiency for the multiple outlier case.…”
Section: Only Accidental Errors (mentioning)
confidence: 99%
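As a rough, generic illustration of the procedure this quotation describes (perturbing error-free observations with accidental errors and then screening the least-squares residuals), the sketch below uses our own names and a textbook Baarda w-test rather than the cited authors' code: it adds normal noise with given standard deviations and returns the normalized residuals used by data snooping.

import numpy as np

def data_snooping_w(A, l_true, sigmas, seed=0):
    """Add accidental errors to error-free observations and compute the
    Baarda w-statistics of a least-squares adjustment (single-outlier test)."""
    sigmas = np.asarray(sigmas, dtype=float)
    rng = np.random.default_rng(seed)
    l = l_true + rng.normal(0.0, sigmas)      # "accidental" (random) errors only

    Q_l = np.diag(sigmas ** 2)                # observation covariance (uncorrelated)
    P = np.diag(1.0 / sigmas ** 2)            # weight matrix

    N = A.T @ P @ A                           # normal-equation matrix
    x_hat = np.linalg.solve(N, A.T @ P @ l)
    v = l - A @ x_hat                         # least-squares residuals

    Q_v = Q_l - A @ np.linalg.solve(N, A.T)   # cofactor matrix of the residuals
    w = v / np.sqrt(np.diag(Q_v))             # normalized residuals (w-test)
    return x_hat, v, w

Each |w_i| is then compared with a critical value from the standard normal distribution; as the quotation warns, this test is derived under the assumption of at most one outlier.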