2022
DOI: 10.48550/arxiv.2202.07773
Preprint
The efficacy and generalizability of conditional GANs for posterior inference in physics-based inverse problems

Cited by 2 publications (5 citation statements) | References 0 publications
“…2.3) from the elasticity images produced by the generator, while the generator is trained to fool the critic by producing elasticity images that resemble the ground-truth elasticity images. The loss function, L(d, g), is described in detail by Ray et al. [47] and shown in Eq. 3, where d(x, y) is the output from the critic; g(z, y) is the output from the generator; (x, y) represents an elasticity/phase-difference image pair from the true distribution, p_xy; z represents the latent vector from the latent space, p_z; GP is the gradient penalty term used to enforce Lipschitz continuity on the critic; and E denotes the expectation value.…”
Section: Methods
confidence: 99%
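The loss described above is the conditional Wasserstein-GAN objective with gradient penalty. The sketch below illustrates its structure only: `d` and `g` are toy closed-form stand-ins (assumptions, not the paper's trained networks), the images are flattened toy vectors, and the input gradient of the critic is approximated by finite differences rather than autodiff.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (assumptions): the real critic and generator are deep
# conditional networks; simple closed-form maps are used here so the
# loss terms E[d(g(z,y), y)] - E[d(x, y)] + lambda*GP can be evaluated.
def d(x, y):          # critic score for an elasticity/phase-difference pair
    return (x * y).sum(axis=-1)

def g(z, y):          # generator: latent z + condition y -> "elasticity image"
    return z + 0.5 * y

def gradient_penalty(x_real, x_fake, y, lam=10.0, eps=1e-4):
    # Interpolate between real and generated samples, then penalize the
    # deviation of the critic's input-gradient norm from 1 (Lipschitz).
    a = rng.uniform(size=(x_real.shape[0], 1))
    x_hat = a * x_real + (1 - a) * x_fake
    # Finite-difference gradient of d with respect to x_hat, per sample.
    grads = np.stack([
        (d(x_hat + eps * np.eye(x_hat.shape[1])[j], y) - d(x_hat, y)) / eps
        for j in range(x_hat.shape[1])
    ], axis=-1)
    norms = np.linalg.norm(grads, axis=-1)
    return lam * np.mean((norms - 1.0) ** 2)

def wgan_gp_loss(x, y, z, lam=10.0):
    # Critic loss: generated samples scored high minus real samples
    # scored high, plus the gradient penalty; the generator minimizes
    # the opposite of the first term.
    x_fake = g(z, y)
    return d(x_fake, y).mean() - d(x, y).mean() + gradient_penalty(x, x_fake, y, lam)

x = rng.normal(size=(8, 4))   # "true" elasticity images (flattened)
y = rng.normal(size=(8, 4))   # conditioning phase-difference images
z = rng.normal(size=(8, 4))   # latent vectors drawn from p_z
loss = float(wgan_gp_loss(x, y, z))
```

In practice the critic and generator are updated alternately, with the penalty keeping the critic approximately 1-Lipschitz so its output approximates a Wasserstein distance.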
“…As shown in Figure 1a, the generator consists of a contracting path (in which the image is progressively downsampled to enable feature extraction at different scales) followed by an expanding path (in which the image is progressively upscaled to achieve the desired output dimensions), with skip connections that transfer learned features from the contracting path to the expanding path. Conditional instance normalization is used to inject the latent vector at each scale of the generator, as outlined by Ray et al. [47].…”
Section: Architecture of the cGAN
confidence: 99%
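Conditional instance normalization, as referenced above, normalizes each feature map per sample and then re-scales and re-shifts it with parameters derived from the latent vector. The sketch below shows the mechanism under stated assumptions: the projection matrices `w_gamma` and `w_beta` stand in for learned weights, and shapes follow the usual (N, C, H, W) convention; it is not the paper's actual layer.

```python
import numpy as np

def conditional_instance_norm(feat, z, w_gamma, w_beta, eps=1e-5):
    """Instance-normalize feat (N, C, H, W) per sample and channel,
    then scale/shift with per-channel parameters predicted from the
    latent vector z (N, D) via linear projections w_gamma, w_beta (D, C).
    The projections are assumed learned; random ones are used below."""
    mean = feat.mean(axis=(2, 3), keepdims=True)
    var = feat.var(axis=(2, 3), keepdims=True)
    normed = (feat - mean) / np.sqrt(var + eps)       # per-instance whitening
    gamma = (z @ w_gamma)[:, :, None, None]           # per-channel scale from z
    beta = (z @ w_beta)[:, :, None, None]             # per-channel shift from z
    return gamma * normed + beta

rng = np.random.default_rng(0)
feat = rng.normal(size=(2, 3, 8, 8))                  # feature maps at one scale
z = rng.normal(size=(2, 5))                           # latent vectors
out = conditional_instance_norm(
    feat, z, rng.normal(size=(5, 3)), rng.normal(size=(5, 3))
)
```

Applying such a layer at every scale of the generator lets the latent vector modulate features throughout the network, which is what makes sampling different z values produce different posterior samples for the same conditioning image.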