Tiny intermittently powered computers can monitor objects in hard-to-reach places, maintenance-free, for decades by leaving batteries behind and surviving off energy harvested from the environment, avoiding the cost of replacing and disposing of billions or trillions of dead batteries. However, creating programs for these sensors is difficult. Energy harvesting is inconsistent, energy storage is scarce, and batteryless sensors can lose power at any point in time, causing volatile memory, execution progress, and time to reset. In response to these disruptions, developers must write unwieldy programs that attempt to protect against failures, instead of focusing on sensing goals, defining tasks, and generating useful data in a timely manner. To address these shortcomings, we have designed Mayfly, a language and runtime for timely execution of sensing tasks on tiny, intermittently-powered, energy-harvesting sensing devices. Mayfly is a coordination language and runtime built on top of Embedded-C that combines intermittent execution fragments to form coherent sensing schedules, maintaining forward progress, data consistency, data freshness, and data utility across multiple power failures. Mayfly makes the passing of time explicit, binding data to the time it was gathered and keeping track of data and time through power failures. We evaluated Mayfly against state-of-the-art systems, conducted a user study, and implemented multiple real-world applications across application domains in inventory tracking and wearables.
CCS Concepts: • Computer systems organization → Embedded systems; Architectures; • Human-centered computing → Ubiquitous and mobile computing systems and tools; • Software and its engineering → Context specific languages
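To give a rough feel for the timeliness idea described in the abstract above, the minimal sketch below shows one way a sensor reading could be bound to the time it was gathered, persisted across power failures, and discarded once stale. This is an illustrative sketch only, not Mayfly's actual API; the names (timed_sample_t, sample_is_fresh, report_task) and the persistence attribute are assumptions for illustration.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical illustration: a sensor sample bound to the time it was
 * gathered, kept in non-volatile (FRAM-style) memory so that both the
 * value and its timestamp survive a power failure. */
typedef struct {
    uint16_t value;        /* raw sensor reading                   */
    uint32_t taken_at_ms;  /* time the reading was gathered (ms)   */
    bool     valid;        /* set once the sample has been written */
} timed_sample_t;

/* Placed in non-volatile memory by the linker; the attribute is
 * toolchain-specific and shown only to indicate persistence. */
__attribute__((section(".persistent")))
static timed_sample_t last_sample;

/* A sample is only useful if it is fresh enough for the consumer task. */
static bool sample_is_fresh(const timed_sample_t *s,
                            uint32_t now_ms,
                            uint32_t max_age_ms)
{
    return s->valid && (now_ms - s->taken_at_ms) <= max_age_ms;
}

/* Sketch of a consumer task that runs after reboot from a power failure:
 * stale data is dropped rather than propagated. */
void report_task(uint32_t now_ms)
{
    if (sample_is_fresh(&last_sample, now_ms, 5000u)) {
        /* transmit_sample(&last_sample);  e.g., send over the radio */
    } else {
        last_sample.valid = false;         /* expired: gather a new one */
    }
}
```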
Battery-free sensing promises to revolutionize the scientific and industrial communities by enabling long-term, maintenance-free deployments in tough-to-reach places. However, developing applications for these intermittently-powered, batteryless devices is notoriously demanding. Each device's performance is closely tied to its environment at runtime, and developers are often unable to predict how their system will behave upon deployment. In this paper we present an instruction-level simulator, based on MSPsim, for intermittently-powered devices that can accurately emulate real-world energy-harvesting conditions, taking into account power models of common hardware peripherals such as a radio and an accelerometer. These harvester conditions are represented by I-V surfaces recorded by the Ekho hardware emulator. We have provided this simulator as an open-source tool for the benefit of the community.
CCS Concepts: • Computer systems organization → Architectures; Embedded and cyber-physical systems; • Computing methodologies → Modeling and simulation; • Hardware → Power and energy
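For intuition about how recorded I-V data could drive such a simulation, the sketch below interpolates harvester current at the present storage-capacitor voltage, subtracts the modeled peripheral load, and updates the capacitor state each time step. This is a minimal sketch under assumed interfaces, not the simulator's actual code; iv_curve_t, harvester_current, and step_capacitor are hypothetical names.

```c
#include <stddef.h>

/* Hypothetical slice of an I-V surface: harvested current as a function
 * of harvester terminal voltage under one environmental condition. */
typedef struct {
    const double *voltages;  /* sampled terminal voltages [V], increasing */
    const double *currents;  /* harvested current at each voltage [A]     */
    size_t        n;         /* number of (V, I) samples                  */
} iv_curve_t;

/* Linear interpolation of harvester current at the capacitor voltage. */
static double harvester_current(const iv_curve_t *c, double v)
{
    if (v <= c->voltages[0])        return c->currents[0];
    if (v >= c->voltages[c->n - 1]) return c->currents[c->n - 1];
    for (size_t i = 1; i < c->n; i++) {
        if (v <= c->voltages[i]) {
            double t = (v - c->voltages[i - 1]) /
                       (c->voltages[i] - c->voltages[i - 1]);
            return c->currents[i - 1] +
                   t * (c->currents[i] - c->currents[i - 1]);
        }
    }
    return c->currents[c->n - 1];
}

/* One simulation step: net current (harvested minus load, e.g. MCU plus
 * radio or accelerometer draw) charges or discharges the capacitor. */
static double step_capacitor(const iv_curve_t *c, double v_cap,
                             double i_load, double cap_farads, double dt)
{
    double i_net = harvester_current(c, v_cap) - i_load;
    return v_cap + (i_net / cap_farads) * dt;   /* dV = I*dt / C */
}
```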
Voice-activated personal assistants (VAPAs) are becoming smaller, cheaper, and more accurate, such that they are now prevalent in homes (e.g., Amazon Echo, Sonos One) and on mobile devices (e.g., Google Assistant, Apple Siri) around the world. VAPAs offer considerable potential to individuals who are blind, offering efficiencies over gesture-based input on touchscreen devices. However, research is just beginning to reveal the ways in which these technologies are used by people who are blind. In the first of two studies, we interviewed 14 blind adults with experience of home and/or mobile-based VAPAs, surfacing myriad accessibility, usability, and privacy issues for this community. A second study analyzing podcast content from 28 episodes relating to blind interactions with VAPAs was then undertaken to validate and extend findings from the first study. In addition to verifying prior findings, we learned that blind users wanted to leverage VAPAs for more productivity-oriented tasks and increased efficiency over other interaction modalities. We conclude that (1) VAPAs need to support a greater variety of AI personas, each specializing in a specific type of task; (2) VAPAs need to maintain continuity of voice interaction for both usability and accessibility; and (3) blind VAPA users, and especially blind technology podcasters, are expert voice interface users who should be incorporated into design processes from the beginning. We argue that when the blind lead the sighted through voice interface design, both blind and sighted users can benefit.