2019 IEEE 30th Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC) 2019
DOI: 10.1109/pimrc.2019.8904244

Trading Off Computation with Transmission in Status Update Systems

Abstract: This paper is motivated by emerging edge computing applications in which generated data are preprocessed at the source and then transmitted to an edge server. In such a scenario, there is typically a tradeoff between the amount of preprocessing and the amount of data to be transmitted. We model such a system by considering two non-preemptive queues in tandem whose service times are independent over time, but where the mean transmission service time depends on the computation service time. The first que…
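The tandem-queue model described in the abstract can be explored with a small discrete-event simulation. The sketch below is illustrative only: the specific coupling between the two stages (a fixed time "budget" split between the mean computation time and the mean transmission time) and all parameter names are assumptions for demonstration, not the paper's actual model. It computes the time-average Age of Information (AoI) for two FCFS queues in tandem with Poisson arrivals and exponential services, using the standard area-under-the-age-curve calculation.

```python
import random

def simulate_aoi(lam=0.5, mu_c=2.0, budget=1.0, n=200_000, seed=1):
    """Average AoI for two FCFS M/M/1 queues in tandem.

    Hypothetical coupling (an assumption, not the paper's exact model):
    more computation leaves less data to send, modeled here as a fixed
    time budget: mean_transmission = budget - mean_computation.
    """
    random.seed(seed)
    mean_c = 1.0 / mu_c            # mean computation (stage-1) service time
    mean_t = budget - mean_c       # mean transmission (stage-2) service time
    assert mean_t > 0, "computation must leave time for transmission"

    a = d1 = d2 = 0.0              # arrival, stage-1 and stage-2 departure times
    prev_a = prev_d2 = first_d2 = None
    area = 0.0                     # area under the age curve
    for _ in range(n):
        a += random.expovariate(lam)                      # Poisson arrivals
        d1 = max(a, d1) + random.expovariate(1.0 / mean_c)  # computation queue
        d2 = max(d1, d2) + random.expovariate(1.0 / mean_t) # transmission queue
        if prev_d2 is None:
            first_d2 = d2
        else:
            # Between deliveries, age rises linearly from (prev_d2 - prev_a);
            # the trapezoid area is dt * age_at_reset + dt^2 / 2.
            dt = d2 - prev_d2
            area += dt * (prev_d2 - prev_a) + 0.5 * dt * dt
        prev_a, prev_d2 = a, d2
    return area / (d2 - first_d2)
```

Sweeping `mu_c` while holding `budget` fixed exposes the computation-transmission tradeoff the abstract refers to: faster computation shortens the first queue but lengthens the mean transmission time, and vice versa.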

Cited by 25 publications (14 citation statements)
References 37 publications
“…Thus, a decision has to be made on whether to drop the newly arriving updates or switch to them via preemption. Recently, in the context of computing, AoI analysis has been carried out through various tandem queuing models in [24]-[27], and through a task-specific age metric in [28]. The notion of sending timely measurements to the cloud has been discussed in the context of gaming in [29].…”
Section: Introduction
confidence: 99%
“…We use the equivalent queue model approach in [6]-[8] to analyze the system. This model yields an AoI expression identical to that of the actual system and simplifies the analysis.…”
Section: Equivalent Queue Model
confidence: 99%
“…server system for improving average AoI. In [6], [7], waiting is used as a mechanism to regulate the traffic, while in [8] tandem computation-transmission operations and queue management are combined. We also refer the reader to [9]-[24] for other work closely related to this paper.…”
Section: Introduction
confidence: 99%
“…[28] considered AoI minimization in two data processing scenarios: the complicated initial feature extraction and classification in computer vision, and the optimization of the sampling and updating processes for a physical process sampled by an Internet of Things device. The authors in [29] considered a general analysis with packet management for average AoI and average peak AoI with a computation server and a transmission queue. Ref.…”
Section: Introduction
confidence: 99%