2023
DOI: 10.1109/ojsp.2023.3250104

Delaunay-Triangulation-Based Learning With Hessian Total-Variation Regularization

Abstract: Regression is one of the core problems tackled in supervised learning. Neural networks with rectified linear units generate continuous and piecewise-linear (CPWL) mappings and are the state-of-the-art approach for solving regression problems. In this paper, we propose an alternative method that leverages the expressivity of CPWL functions. In contrast to deep neural networks, our CPWL parameterization guarantees stability and is interpretable. Our approach relies on the partitioning of the domain of the CPWL f…
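To illustrate the general idea behind the abstract — a CPWL function defined through a Delaunay partition of the domain — here is a minimal sketch using SciPy. This is only an illustration of Delaunay-based piecewise-linear interpolation, not the paper's method, which additionally learns the values at the sites under Hessian total-variation regularization; the random sites and the target function are hypothetical.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(0)
pts = rng.random((20, 2))               # scattered sites in the 2-D domain
vals = np.sin(pts[:, 0]) + pts[:, 1]    # values attached to the sites

tri = Delaunay(pts)                     # partition the domain into simplices
f = LinearNDInterpolator(tri, vals)     # affine on each simplex -> CPWL mapping

# The resulting f is continuous, exact at the sites, and linear in between.
assert np.allclose(f(pts), vals)
```

In the paper's setting, `vals` would be the learnable parameters of the model, so the CPWL function is linear in its parameters, which is what makes the parameterization stable and interpretable.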

Cited by 5 publications (1 citation statement)
References 42 publications
“…This is a variant of the classical second-order total variation ([5]). It has been inspired by [6, 14–17] and used in [10, 18]; c) in the critical case d = 2 we consider as linear forward operator the evaluation functional at certain points x_1, …”
Section: Introduction
confidence: 99%