2020
DOI: 10.7900/jot.2019jul21.2266

Spectral dissection of finite rank perturbations of normal operators

Abstract: Finite rank perturbations T=N+K of a bounded normal operator N acting on a separable Hilbert space are studied by means of a natural functional model of T; in turn, the functional model relies solely on a perturbation matrix/characteristic function previously defined by the second author. Function-theoretic features of this perturbation matrix encode in closed form the spectral behavior of T. Under mild geometric conditions on the spectral measure of N and some smoothness constraints on K we show that the o…
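As a sketch of the setup described in the abstract (the notation below is assumed for illustration, not quoted from the paper):

```latex
% Assumed notation: \mathcal{H} a separable Hilbert space,
% N a bounded normal operator, K a finite rank perturbation.
T = N + K, \qquad N^{*}N = NN^{*} \ \text{on } \mathcal{H}, \qquad
\operatorname{rank} K = m < \infty,
\qquad K = \sum_{j=1}^{m} \langle \,\cdot\,, u_{j} \rangle\, v_{j}
\quad \text{for some } u_{j}, v_{j} \in \mathcal{H}.
```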

Cited by 3 publications (1 citation statement)
References 31 publications
“…In a more general setting, rank-one perturbations of normal operators have been extensively studied for decades (see the recent papers [4][5][6][7] and the references therein). Recently, in [31], the authors have provided conditions under which the spectrum of T can be dissected along a curve, yielding a decomposition of T as a direct sum of two operators with localized spectra, together with sufficient conditions ensuring the existence of invariant subspaces for T.…”
Section: Introduction
confidence: 99%
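For comparison, when a curve Γ separates the spectrum of T without meeting it, the classical Riesz–Dunford projection already yields such a direct-sum decomposition; the following is standard material sketched in assumed notation, not taken from [31], whose contribution concerns conditions allowing the dissection in the more delicate perturbative setting:

```latex
% Riesz-Dunford projection along a closed curve \Gamma with
% \Gamma \cap \sigma(T) = \emptyset (standard construction):
P_{\Gamma} = \frac{1}{2\pi i} \oint_{\Gamma} (zI - T)^{-1}\, dz ,
\qquad
T \cong T|_{\operatorname{ran} P_{\Gamma}} \oplus T|_{\ker P_{\Gamma}} ,
\qquad
\sigma\big(T|_{\operatorname{ran} P_{\Gamma}}\big) \subseteq \operatorname{int} \Gamma .
```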