Abstract: Power consumption can make or break the success of mobile applications. This is especially true for applications requiring constant access to sensor readings, as sensors tend to consume considerable amounts of energy. Much attention has been devoted to reducing the power consumption of hardware sensors, from both a hardware and a software perspective. However, mobile phones also enable applications to gather software artifacts through so-called soft sensors, e.g., calendar, contacts, browsing history, etc. Soft…
“…Furthermore, the Google Activity Recognition Transition API can detect a user's specific activity type constants (i.e., IN_VEHICLE, ON_BICYCLE, RUNNING, STILL, WALKING) to identify when a user starts or stops a specific activity [109]. Therefore, through the mobile sensors and Google Activity Recognition APIs mentioned above, physical activity state information is currently collected and utilized in many existing studies (e.g., [34,16,103,104]).…”
Section: Notification API Framework (mentioning; confidence: 99%)
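As an illustration of the transition detection described in the quote above, the following Kotlin sketch registers for ENTER and EXIT transitions of the listed activity type constants via the Activity Recognition Transition API (play-services-location). The TransitionReceiver class and an already-granted ACTIVITY_RECOGNITION permission are assumptions made for this sketch, not details taken from the cited studies.

import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import com.google.android.gms.location.ActivityRecognition
import com.google.android.gms.location.ActivityTransition
import com.google.android.gms.location.ActivityTransitionRequest
import com.google.android.gms.location.DetectedActivity

// Requires the ACTIVITY_RECOGNITION runtime permission (assumed granted here).
fun requestTransitionUpdates(context: Context) {
    // Build ENTER/EXIT transitions for the activity types named in the quote.
    val transitions = listOf(
        DetectedActivity.IN_VEHICLE,
        DetectedActivity.ON_BICYCLE,
        DetectedActivity.RUNNING,
        DetectedActivity.STILL,
        DetectedActivity.WALKING
    ).flatMap { type ->
        listOf(
            ActivityTransition.Builder()
                .setActivityType(type)
                .setActivityTransition(ActivityTransition.ACTIVITY_TRANSITION_ENTER)
                .build(),
            ActivityTransition.Builder()
                .setActivityType(type)
                .setActivityTransition(ActivityTransition.ACTIVITY_TRANSITION_EXIT)
                .build()
        )
    }

    // Deliver transition events to an app-defined receiver (hypothetical class).
    val pendingIntent = PendingIntent.getBroadcast(
        context, 0,
        Intent(context, TransitionReceiver::class.java),
        PendingIntent.FLAG_UPDATE_CURRENT or PendingIntent.FLAG_MUTABLE
    )

    ActivityRecognition.getClient(context)
        .requestActivityTransitionUpdates(ActivityTransitionRequest(transitions), pendingIntent)
        .addOnSuccessListener { /* transitions registered */ }
        .addOnFailureListener { /* likely missing ACTIVITY_RECOGNITION permission */ }
}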
“…Location APIs (e.g., LocationManager, LocationProvider) and the Google Location Services API are representative API frameworks mainly used to collect location context information in many existing studies (e.g., [20,17,101,110,102,37,103,104,23,16,15,27,68]). In terms of efficiency and accuracy, the Google Location Services API is superior to the platform Location APIs.…”
Section: Types of Touch Interaction, Ref. Num (mentioning; confidence: 99%)
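A minimal Kotlin sketch of the Google Location Services route mentioned in the quote, using FusedLocationProviderClient; the update interval, priority, and println logging are illustrative assumptions, and the runtime permission check is omitted.

import android.annotation.SuppressLint
import android.content.Context
import android.os.Looper
import com.google.android.gms.location.LocationCallback
import com.google.android.gms.location.LocationRequest
import com.google.android.gms.location.LocationResult
import com.google.android.gms.location.LocationServices
import com.google.android.gms.location.Priority

@SuppressLint("MissingPermission")  // ACCESS_FINE/COARSE_LOCATION assumed granted
fun startLocationUpdates(context: Context) {
    val client = LocationServices.getFusedLocationProviderClient(context)

    // Balanced-power request roughly every 60 s (illustrative values).
    val request = LocationRequest.Builder(
        Priority.PRIORITY_BALANCED_POWER_ACCURACY, 60_000L
    ).build()

    val callback = object : LocationCallback() {
        override fun onLocationResult(result: LocationResult) {
            result.lastLocation?.let { loc ->
                // Hand (latitude, longitude, accuracy) to the study's data logger.
                println("lat=${loc.latitude} lon=${loc.longitude} acc=${loc.accuracy}")
            }
        }
    }

    client.requestLocationUpdates(request, callback, Looper.getMainLooper())
}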
“…Among the 45 studies using interaction data, 18 studies use terms for the type of touch interaction (i.e., click, long click, typing, scroll, pinch, swipe) to describe the touch interaction used in References [17,21,37,87,88,103,104,15,70,69,61,68,67,139,73,74,77,80]. The other 16 studies use the names of the AccessibilityEvent constants related to touch interaction (TYPE_VIEW_CLICKED, TYPE_VIEW_LONG_CLICKED, TYPE_VIEW_TEXT_CHANGED, TYPE_VIEW_SCROLLED) to describe the touch interaction used in References [18,30,28,66,49,132,57,58,75,84,86,79,82,83,81,78].…”
Section: Types of Touch Interaction, Ref. Num (mentioning)
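To show how those AccessibilityEvent type constants are typically consumed, here is a hedged Kotlin sketch of an AccessibilityService that maps them to the touch-interaction terms used in the first group of studies. The service name, log tag, and the required manifest and accessibility-service configuration (omitted here) are assumptions for illustration.

import android.accessibilityservice.AccessibilityService
import android.util.Log
import android.view.accessibility.AccessibilityEvent

class InteractionLoggerService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent) {
        val label = when (event.eventType) {
            AccessibilityEvent.TYPE_VIEW_CLICKED      -> "click"
            AccessibilityEvent.TYPE_VIEW_LONG_CLICKED -> "long click"
            AccessibilityEvent.TYPE_VIEW_TEXT_CHANGED -> "typing"
            AccessibilityEvent.TYPE_VIEW_SCROLLED     -> "scroll"
            else -> return  // ignore event types not related to touch interaction
        }
        // Interaction type, target package, and event time, as collected via the AS API.
        Log.d("InteractionLogger", "$label on ${event.packageName} at ${event.eventTime}")
    }

    override fun onInterrupt() { /* nothing to clean up in this sketch */ }
}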
Recently, there has been an increase in industrial and academic research on data-driven analytics with smartphones, based on the collection of app usage patterns and surrounding context data. The Android mobile operating system provides the Usage Statistics API (US API) and the Accessibility Service API (AS API) as representative APIs for passively collecting app usage data. These APIs are used for various research purposes because they can collect app usage patterns (e.g., app status, usage time, app name, user interaction state, and smartphone use state) and fine-grained data (e.g., user interface elements and hierarchy, and user interaction type, target, and time) for each application. In addition, other sensing APIs help collect the user's surrounding context (location, network, ambient environment) and device state data alongside the AS/US API. In this review, we provide insights into the types of mobile usage and sensor data that can be collected for each research purpose by considering Android built-in APIs and sensors (the AS/US API and other sensing APIs). Moreover, we classify the research purposes of the surveyed papers into four categories and 17 sub-categories, and create a three-layer hierarchical structure for data classification. We present important trends in the usage of Android's built-in APIs and sensors, including the AS/US API, and the types of data collected using the presented APIs, and discuss the utilization of mobile usage and sensor data in future research.
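As a concrete illustration of the Usage Statistics API (US API) mentioned in the abstract, the sketch below queries daily per-app foreground usage with UsageStatsManager. The 24-hour window and the returned map shape are assumptions for this sketch, and the PACKAGE_USAGE_STATS special permission must already have been granted by the user in Settings.

import android.app.usage.UsageStatsManager
import android.content.Context

// Returns package name -> total foreground time (ms) over the last 24 hours.
fun dailyAppUsage(context: Context): Map<String, Long> {
    val usm = context.getSystemService(Context.USAGE_STATS_SERVICE) as UsageStatsManager
    val end = System.currentTimeMillis()
    val start = end - 24 * 60 * 60 * 1000L  // illustrative 24-hour window

    // One UsageStats entry per package: app identity and foreground usage time.
    return usm.queryUsageStats(UsageStatsManager.INTERVAL_DAILY, start, end)
        .associate { it.packageName to it.totalTimeInForeground }
}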
“…But other responsive use cases or datasets are imaginable, e.g., speech, activity, face, object, or gesture recognition [20]. To ensure repeatability across different benchmark runs, the input data consisting of location values is fixed and identical, i.e., we ignore the tracking of sensor data that is not relevant for this paper, but we refer to our previous work for measuring sensor tracking [26]. For our benchmark purposes, we created six datasets varying in data size (50 kB, 100 kB, 200 kB, 300 kB, 400 kB, 500 kB) in advance to measure their impact.…”
Abstract. Smartphones have become increasingly popular in recent years due to their small form factors. However, such mobile systems are resource-constrained in terms of computational power, storage, and battery life. Offloading resource-intensive tasks (aka mobile cloud computing) to distant data centers (e.g., cloud computing) or closely located ones (e.g., cloudlets) overcomes these issues. In particular, cloudlets provide computational power with low latency for responsive applications due to their proximity to mobile users. However, a large-scale deployment of range-restricted cloudlets is still an open challenge. In this paper, we propose a novel concept for a large-scale deployment of cloudlets by upgrading wireless home routers. Besides the router's native purpose of routing data packets through the network, it can now offer computing resources with low latency and high bandwidth without additional hardware. To prove our concept, we conducted comprehensive benchmark tests against existing concepts. As a result, the feasibility of this concept is shown, providing a promising way to deploy cloudlets at large scale in existing infrastructures.