2017
DOI: 10.3103/s073527271705003x
Wireless sensor networks based on modular arithmetic

Cited by 14 publications (5 citation statements)
References 8 publications
“…For modular operations such as addition, subtraction, multiplication, and exact division, the computations with residues are independent of each other, which provides parallel, carry-free, and high-speed computer arithmetic. Thanks to these properties, RNS is now applied in many resource-intensive applications, such as blockchain [3], neural networks [4]-[6], cryptography [7]-[11], digital signal processing [12]-[14], wireless sensor networks [15], and digital image processing [16].…”
Section: Introduction (mentioning)
confidence: 99%
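The channel-wise, carry-free arithmetic that the statement above attributes to RNS can be sketched in a few lines. This is a minimal illustration, not the cited paper's construction; the moduli (7, 11, 13) are an arbitrary pairwise-coprime choice made here for the example.

```python
# Minimal Residue Number System (RNS) sketch: each operation acts on the
# residue channels independently, so there is no carry between channels.
from math import prod

MODULI = (7, 11, 13)        # pairwise coprime -> dynamic range M = 7*11*13 = 1001
M = prod(MODULI)

def to_rns(x):
    """Forward conversion: integer -> tuple of residues."""
    return tuple(x % m for m in MODULI)

def rns_add(a, b):
    """Addition is performed per channel, with no inter-channel carry."""
    return tuple((ai + bi) % m for ai, bi, m in zip(a, b, MODULI))

def rns_mul(a, b):
    """Multiplication is likewise channel-wise."""
    return tuple((ai * bi) % m for ai, bi, m in zip(a, b, MODULI))

def from_rns(res):
    """Reverse conversion via the Chinese Remainder Theorem."""
    x = 0
    for ri, m in zip(res, MODULI):
        Mi = M // m
        x += ri * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m): modular inverse of Mi mod m
    return x % M

a, b = 123, 456
assert from_rns(rns_add(to_rns(a), to_rns(b))) == (a + b) % M
assert from_rns(rns_mul(to_rns(a), to_rns(b))) == (a * b) % M
```

Because the channels never interact, the three residue computations could run on three independent adders or multipliers in parallel, which is the speed advantage the quoted introduction refers to.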
“…Here, B_d is the set of bi-terms extracted from the service document d ∈ D, and each bi-term b ∈ B_d contains two unordered words (w_i, w_j). Following [26], BTM models the generation of bi-terms with a latent topic structure. The extended service features are used as the training corpus B.…”
Section: BTM Topic Model (mentioning)
confidence: 99%
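The bi-term extraction step described above is easy to make concrete. A sketch, assuming a document is already tokenized into a word list (the function name and example words are illustrative, not from the cited paper):

```python
# Sketch of BTM-style bi-term extraction: every unordered pair of distinct
# words co-occurring in a document becomes one bi-term.
from itertools import combinations

def extract_biterms(doc_words):
    """Return the unordered word pairs (bi-terms) of one document.
    frozenset makes (w_i, w_j) and (w_j, w_i) the same bi-term."""
    return [frozenset(pair) for pair in combinations(doc_words, 2)
            if pair[0] != pair[1]]

biterms = extract_biterms(["visit", "apple", "store"])
# yields the 3 unordered pairs {visit,apple}, {visit,store}, {apple,store}
```

Pooling these pairs over all documents gives the training corpus B that the quoted passage refers to.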
“…The major knowledge representation learning methods include the neural network model [26][27][28], matrix decomposition [29], and translation models [30,31]. TransR [32] is a translation model for knowledge graphs. Rather than embedding heterogeneous entities and relations within the same vector space, it represents entities in distinct semantic spaces bridged by relation-specific matrices.…”
Section: Knowledge Graph Embedding (mentioning)
confidence: 99%
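The relation-specific projection that distinguishes TransR can be written out directly. A minimal sketch of the standard TransR score, with randomly initialized embeddings and illustrative dimensions (not the cited paper's trained model):

```python
# TransR scoring sketch: project entity embeddings into the relation space
# with a relation-specific matrix M_r, then measure the translation error.
import numpy as np

rng = np.random.default_rng(0)
d_e, d_r = 4, 3                      # entity-space and relation-space dims (illustrative)

h = rng.normal(size=d_e)             # head entity embedding (entity space)
t = rng.normal(size=d_e)             # tail entity embedding (entity space)
r = rng.normal(size=d_r)             # relation vector (relation space)
M_r = rng.normal(size=(d_r, d_e))    # relation-specific projection matrix

def transr_score(h, r, t, M_r):
    """||M_r h + r - M_r t||: lower means the triple (h, r, t) is more plausible."""
    return np.linalg.norm(M_r @ h + r - M_r @ t)

score = transr_score(h, r, t, M_r)
```

Because M_r differs per relation, the same entity can land in different positions in each relation's semantic space, which is the "bridged by relation-specific matrices" idea in the quoted passage.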