Abstract: Not Only SQL (NoSQL) databases are becoming increasingly popular and offer interesting strengths such as scalability and flexibility. In this paper, we investigate the use of NoSQL systems for implementing OLAP (On-Line Analytical Processing) systems. More precisely, we are interested in instantiating OLAP systems (from the conceptual level to the logical level) and instantiating an aggregation lattice (optimization). We define a set of rules to map star schemas into two NoSQL models: column-oriented and document-oriented. The experimental part is carried out using the reference TPC benchmark. Our experiments show that our rules can effectively instantiate such systems (star schema and lattice). We also analyze the differences between the two NoSQL systems considered. In our experiments, HBase (column-oriented) turns out to be faster than MongoDB (document-oriented) in terms of loading time.
Abstract. NoSQL (Not Only SQL) systems are becoming popular due to known advantages such as horizontal scalability and elasticity. In this paper, we study the implementation of data warehouses with document-oriented NoSQL systems. We propose mapping rules that transform the multidimensional data model to logical document-oriented models. We consider three different logical models and we use them to instantiate data warehouses. We focus on data loading, model-to-model conversion and OLAP cuboid computation.
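To make the idea of mapping a multidimensional (star) schema to a document-oriented logical model concrete, here is a minimal sketch of one plausible mapping rule: embedding each referenced dimension record inside its fact document, MongoDB-style. The function name, field names, and the embedding strategy are illustrative assumptions, not the papers' actual rules.

```python
# Hedged sketch (assumption): one possible star-schema-to-document mapping,
# where each fact row becomes a single document with its dimension records
# embedded rather than joined. Names and structure are illustrative only.

def to_nested_document(fact_row, dimensions):
    """Build one document per fact row, embedding referenced dimensions.

    fact_row:   dict of measures plus foreign keys ending in "_id"
    dimensions: {dim_name: {dim_id: dim_record_dict}}
    """
    # Keep the measures, drop the foreign keys.
    doc = {k: v for k, v in fact_row.items() if not k.endswith("_id")}
    # Replace each foreign key with the full embedded dimension record.
    for dim_name, dim_table in dimensions.items():
        doc[dim_name] = dim_table[fact_row[dim_name + "_id"]]
    return doc

# Toy star schema: a sales fact with two dimensions.
dimensions = {
    "customer": {1: {"name": "Alice", "city": "Paris"}},
    "date": {20240101: {"year": 2024, "month": 1}},
}
fact = {"customer_id": 1, "date_id": 20240101, "amount": 42.0}

print(to_nested_document(fact, dimensions))
```

This embedded variant avoids joins at query time at the cost of redundancy; a flat or referenced variant would be equally valid logical models, which is why the papers compare several of them.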
People taking part in argumentative debates through collective annotations face a cognitively demanding task when trying to estimate the group's global opinion. In order to reduce this effort, we propose in this paper to model such debates prior to evaluating their "social validation." Computing the degree of global confirmation (or refutation) enables the identification of consensual (or controversial) debates. Readers as well as prominent information systems may thus benefit from this information. The accuracy of the social validation measure was tested through an online study conducted with 121 participants. We compared their human perception of consensus in argumentative debates with the results of the three proposed social validation algorithms. Their efficiency in synthesizing opinions was demonstrated by the fact that they achieved an accuracy of up to 84%.

Introduction and Motivations

Annotating paper documents is a common activity practiced since the early Middle Ages (Fraenkel & Klein, 1999). Field studies show that readers still make extensive use of annotations today (Marshall, 1998; Wolfe & Neuwirth, 2001). Although seemingly insignificant, annotations actually serve key purposes such as "active reading," by supporting critical thinking while reading (Adler & van Doren, 1972); learning, by facilitating document appropriation; and proofreading, among many others. With the widespread adoption of digital documents both in the workplace and at home, people have felt frustrated at not being able to annotate them (Sellen & Harper, 2003, p. 96). This need led both research labs and companies to design a plethora of annotation systems, mostly targeting Web documents, since the 1990s (Wolfe, 2002). At first, these systems implemented the usual paper-based annotation functions, mainly for personal use. Then, taking advantage of modern computer storage and networking capabilities, annotation systems provided novel features for collective use.
In particular, they enabled annotation sharing through dedicated servers, so that users could view and access annotations in context, i.e., within the annotated document. Moreover, subsequent readers may later reply to any annotation, or to any reply in turn, thus forming a debate anchored to the discussed passage. Such a debate is also called a "discussion thread" (e.g., Figure 1). As an asynchronous way of communicating, collective annotations (i.e., annotations along with their discussion threads) are useful at two levels:

• From the readers' point of view, collective annotations enable them to discuss document passages in context. This is an advantage over Internet bulletin boards and forums, where one needs to explain the context of a post to be understood. Moreover, feedback about a given annotated document is directly accessible to readers, whereas it would otherwise be scattered over multiple sources. Besides being useful for readers, collective annotations also benefit authors, who can improve their documents by taking the associated remarks into account.

• For syste...
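To illustrate the social-validation measure described above, here is a minimal sketch of one naive way to score a discussion thread. The paper evaluates three specific algorithms that are not reproduced here; this simple confirmation-minus-refutation ratio is an illustrative assumption, not the authors' method.

```python
# Hedged sketch (assumption): a toy social-validation score for an
# annotation's discussion thread. This is NOT one of the paper's three
# algorithms, only an illustration of the confirm/refute idea.

def social_validation(replies):
    """replies: list of opinions, +1 (confirm) or -1 (refute).

    Returns a score in [-1, 1]:
      near +1 -> consensual confirmation,
      near -1 -> consensual refutation,
      near  0 -> controversial debate.
    """
    if not replies:
        return 0.0  # no replies: no social signal either way
    return sum(replies) / len(replies)

thread = [+1, +1, -1, +1]  # three confirmations, one refutation
print(social_validation(thread))  # -> 0.5
```

A real measure would likely also weigh reply depth and argument structure within the thread, which a flat average ignores.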