2022
DOI: 10.48550/arxiv.2202.03532
Preprint
MINER: Multiscale Implicit Neural Representations

Abstract: (abstract figure residue; recoverable fragments: Scale 0 (finest): 22 min, 4.4M params; 22 min, 17M params)

Cited by 5 publications (4 citation statements)
References 12 publications
“…[5,82] utilize meta-learning techniques [20,49] to initialize network parameters, thereby improving training speed. Some methods [7,10,55,70,80,97] design scene representations that support efficient training. [21,48,74,83] augment the approximation ability of networks by designing encoding techniques.…”
Section: Related Work
confidence: 99%
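The "encoding techniques" mentioned in the snippet above are typically Fourier-feature positional encodings, which lift coordinates into sinusoids of increasing frequency so an MLP can fit high-frequency signals. A minimal sketch of that idea (the function name and frequency schedule are illustrative, not taken from the cited works):

```python
import numpy as np

def positional_encoding(x, num_freqs=6):
    """Map coordinates in [-1, 1] to a Fourier-feature embedding:
    sin and cos at geometrically spaced frequencies (illustrative
    sketch of the encoding idea, not a specific paper's scheme)."""
    x = np.asarray(x, dtype=np.float64)[..., None]   # (..., 1)
    freqs = 2.0 ** np.arange(num_freqs) * np.pi      # (F,) frequencies
    angles = x * freqs                               # (..., F)
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

emb = positional_encoding(np.linspace(-1, 1, 4), num_freqs=6)
print(emb.shape)  # (4, 12): 6 sin + 6 cos features per coordinate
```

The embedding dimension is `2 * num_freqs` per input coordinate; higher `num_freqs` lets the downstream network resolve finer detail.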
“…[Fathony et al. 2020] propose a band-limited network that obtains a multi-scale representation by restricting the frequency magnitude of the basis functions. Recently, [Saragadam et al. 2022] adopt the Laplacian pyramid to extract multi-scale coefficients for multiple neural networks. Unlike our work, that work overfits each input object with an individual representation for efficient storage and rendering.…”
Section: Related Work
confidence: 99%
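The Laplacian-pyramid decomposition referenced above splits a signal into per-scale band-pass residuals plus a coarse base, so each scale can be handled by its own small network. A self-contained sketch of such a decomposition (2x2 average pooling and nearest-neighbour upsampling are simplifying assumptions, not the paper's exact filters):

```python
import numpy as np

def downsample(img):
    """Halve resolution via 2x2 average pooling (assumes even sides)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(img):
    """Double resolution via nearest-neighbour repetition."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels=3):
    """Decompose into band-pass residuals plus a coarse base; each band
    holds the detail lost when moving to the next coarser scale."""
    bands = []
    for _ in range(levels):
        low = downsample(img)
        bands.append(img - upsample(low))  # detail at this scale
        img = low
    bands.append(img)                      # coarsest base
    return bands

def reconstruct(bands):
    """Invert the pyramid: upsample the base and add back each band."""
    img = bands[-1]
    for band in reversed(bands[:-1]):
        img = upsample(img) + band
    return img

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 16))
pyr = laplacian_pyramid(x, levels=3)
print(np.allclose(reconstruct(pyr), x))  # True: decomposition is exact
```

Because each residual is computed against the same upsampling used in reconstruction, the decomposition is exactly invertible regardless of the filters chosen.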
“…ACORN [26] and CoordX [20] aim to reduce the number of queries to coordinate-based models with different approaches: ACORN [26] adopts a hierarchical decomposition of the multi-scale coordinates, while CoordX [20] designs a split MLP architecture to exploit the locality between input coordinate points. The subsequent MINER [42] improves on ACORN via a cross-scale similarity prior.…”
Section: Optimization of INRs
confidence: 99%
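The cross-scale strategy noted above can be understood as coarse-to-fine residual fitting: solve the coarsest scale first, upsample the estimate, and at each finer scale fit only what remains. A toy sketch of that loop, where the per-scale "model" is simply the residual itself (the helpers and threshold are illustrative assumptions, not MINER's per-block MLPs):

```python
import numpy as np

def downsample(img):
    """2x2 average pooling (assumes even sides)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(img):
    """Nearest-neighbour 2x upsampling."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def coarse_to_fine_fit(target, levels=3, tol=1e-8):
    """Fit coarse-to-fine: start from the coarsest copy of the target,
    then at each finer scale fit only the residual against the upsampled
    estimate. Scales whose residual is already tiny are skipped, which
    is the intuition behind cross-scale similarity priors."""
    pyramid = [target]
    for _ in range(levels):
        pyramid.append(downsample(pyramid[-1]))
    estimate = pyramid[-1]          # coarsest scale, fitted directly
    skipped = 0
    for fine in reversed(pyramid[:-1]):
        estimate = upsample(estimate)
        residual = fine - estimate
        if np.abs(residual).max() < tol:
            skipped += 1            # nothing left to fit at this scale
        else:
            estimate = estimate + residual
    return estimate, skipped

rng = np.random.default_rng(1)
sig = rng.standard_normal((16, 16))
rec, skipped = coarse_to_fine_fit(sig)
print(np.allclose(rec, sig))  # True
```

In a real multiscale INR the residual at each scale would be fit by a network, and smooth regions whose residual falls under the threshold would be pruned from training, which is where the speedups come from.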