2022
DOI: 10.1007/978-3-030-94437-7_1
A YCSB Workload for Benchmarking Hotspot Object Behaviour in NoSQL Databases

Abstract: Many contemporary applications have to deal with unexpected spikes or unforeseen peaks in demand for specific data objects, so-called hotspot objects. For example, in social networks, specific media items can go viral quickly and unexpectedly, and properly provisioning for such behavior is therefore not trivial. NoSQL databases are specifically designed for enhanced scalability, high availability, and elasticity to deal with increasing data volumes. Although existing performance benchmarking systems such as the Y…

Cited by 2 publications (1 citation statement)
References 24 publications
“…Unfortunately, current database benchmarks like the Star Schema Benchmark (SSB) [22], the TPC-DS Benchmark [23], and TPC-H [24] are inadequate for making these comparisons because the workloads depicted in these benchmarks do not accurately reflect how database queries are produced by user activities in tools like Tableau [25] or Spotfire [20]. Some research benchmarks database systems using interactive workloads [26], [27], [28], [29], [30], while other work uses tools like the Yahoo Cloud Serving Benchmark (YCSB) [31], [32], [33] and HammerDB [34]. In both approaches, a predefined static set of operations is performed as the workload; for example, a given workload can perform 1000 operations consisting of 950 reads and 50 updates.…”
Section: Introduction
confidence: 99%
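The static operation mix described in the citation above maps directly onto a YCSB workload definition. The following is a minimal sketch using YCSB's standard CoreWorkload property names; the counts and proportions are illustrative values matching the 950-read/50-update example, not settings from the paper itself:

```properties
# Minimal YCSB CoreWorkload configuration (illustrative values).
workload=site.ycsb.workloads.CoreWorkload
recordcount=1000
operationcount=1000

# 950 reads and 50 updates out of 1000 operations.
readproportion=0.95
updateproportion=0.05
scanproportion=0
insertproportion=0

# A Zipfian request distribution skews accesses toward a small set
# of popular keys, roughly approximating hotspot-object behavior.
requestdistribution=zipfian
```

Such a workload file is passed to the YCSB client at load and run time; because the operation mix is fixed up front, it cannot by itself model sudden, unforeseen demand spikes, which is the gap the benchmarked paper addresses.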