SciPy is an open-source scientific computing library for the Python programming language. Since its initial release in 2001, SciPy has become a de facto standard for leveraging scientific algorithms in Python, with over 600 unique code contributors, thousands of dependent packages, over 100,000 dependent repositories and millions of downloads per year. In this work, we provide an overview of the capabilities and development practices of SciPy 1.0 and highlight some recent technical developments.
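To illustrate the kind of scientific routines the abstract refers to, here is a minimal sketch using two well-known SciPy subpackages, numerical integration and scalar optimization. The specific function and interval are arbitrary examples chosen for this sketch, not taken from the paper.

```python
import numpy as np
from scipy import integrate, optimize

# Numerically integrate the standard normal density over [-1, 1];
# the result should be close to the familiar ~0.6827 (one-sigma mass).
pdf = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
mass, abs_err = integrate.quad(pdf, -1.0, 1.0)

# Minimize a smooth scalar function; the minimum of (x - 2)^2 + 1 is at x = 2.
res = optimize.minimize_scalar(lambda x: (x - 2.0) ** 2 + 1.0)

print(f"one-sigma mass ≈ {mass:.4f}, argmin ≈ {res.x:.4f}")
```

Both calls follow the same pattern throughout the library: a plain Python callable goes in, a result object (or value plus error estimate) comes out.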
The computer has come a long way from its initial role as a scientific tool in the research lab. We live in a world where a host of computer systems, distributed throughout our physical and information environments, are increasingly implicated in our everyday actions. Computer technologies impact all aspects of our lives, and our relationship with the digital has fundamentally altered as computers have moved out of the workplace and away from the desktop. Networked computers, tablets, phones, and personal devices are now commonplace, as are an increasingly diverse set of digital devices built into the world around us. Data and information are generated at unprecedented speeds and volumes from an increasingly diverse range of sources and via ever more sensor types, then combined in unforeseen ways, limited only by human imagination. People's activities and collaborations are becoming ever more dependent upon and intertwined with this ubiquitous information substrate.

As these trends continue apace, it is becoming apparent that many endeavors involve the symbiotic interleaving of humans and computers. Moreover, the emergence of these close-knit partnerships is inducing profound change. The ability of computer systems to sense and respond to our ongoing activities in the real world is transforming our daily lives and shaping the emergence of a new digital society for the 21st century. More specifically, rather than issuing instructions to passive machines that wait until they are asked before doing anything, we are now starting to work in tandem with highly interconnected computational components that act autonomously and intelligently (a.k.a. agents). This shift is needed to cope with the volume, variety, and pace of the information and services that are available. It is simply infeasible to expect individuals to be aware of the full range of potentially relevant possibilities and be able to pull them together manually.
Computers need to do more to proactively guide users' interactions based on their preferences and constraints. In so doing, greater attention must be given to the balance of control between people and machines. In many situations, humans are in charge and agents predominantly act in a supporting role, providing advice and suggesting options. In other cases, however, agents are in control and humans play the supporting role (for example, automatic parking systems on cars and algorithmic trading systems).

Key insights:
- HACs are a new class of socio-technical systems in which humans and smart software (agents) engage in flexible relationships in order to achieve both their individual and collective goals. Sometimes the humans take the lead, sometimes the computer does, and this relationship can vary dynamically.
- There are major scientific challenges that must be addressed in developing systems that interact with and motivate humans to work alongside agents in large, dynamic, and uncertain environments where privacy and ethical concerns may arise.
- Key research challenges in HACs include achieving flexible ...
Songbirds provide an excellent model system for understanding sensorimotor learning. Many analyses of learning require annotating song, but songbirds produce more songs than can be annotated by hand. Existing methods for automating annotation are challenged by variable song, like that of Bengalese finches. For particularly complex song like that of canaries, no methods exist, limiting the questions researchers can investigate. We developed an artificial neural network, TweetyNet, that automates annotation. First we benchmark the network on open datasets of Bengalese finch song, showing that TweetyNet achieves significantly lower error than a similar method, using less training data, and maintains low error across multiple days of song. We then show TweetyNet performs similarly on canary song. This accuracy allowed fully-automated analyses of datasets an order of magnitude larger than previous studies, improved the precision of statistical models of syntax, and revealed novel details of syntax in a new canary strain. Hence TweetyNet enables automated annotation and analysis of Bengalese finch and canary song that was formerly manual.
Songbirds provide a powerful model system for studying sensory-motor learning. However, many analyses of birdsong require time-consuming, manual annotation of its elements, called syllables. Automated methods for annotation have been proposed, but these methods assume that audio can be cleanly segmented into syllables, or they require carefully tuning multiple statistical models. Here we present TweetyNet: a single neural network model that learns how to segment spectrograms of birdsong into annotated syllables. We show that TweetyNet mitigates limitations of methods that rely on segmented audio. We also show that TweetyNet performs well across multiple individuals from two species of songbirds, Bengalese finches and canaries. Lastly, we demonstrate that using TweetyNet we can accurately annotate very large datasets containing multiple days of song, and that these predicted annotations replicate key findings from behavioral studies. In addition, we provide open-source software to assist other researchers, and a large dataset of annotated canary song that can serve as a benchmark. We conclude that TweetyNet makes it possible to address a wide range of new questions about birdsong.
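The abstract describes a network that assigns an annotation label to each time bin of a spectrogram; turning that per-frame output into annotated syllables then reduces to collapsing runs of identical labels into segments. The sketch below shows only that illustrative post-processing step under stated assumptions (integer frame labels, a fixed time-bin duration, and `0` meaning unlabeled/silence); the function name and parameters are hypothetical, not TweetyNet's actual API.

```python
import numpy as np

def frames_to_segments(frame_labels, timebin_dur, unlabeled=0):
    """Collapse per-frame labels into (onset_s, offset_s, label) tuples.

    Assumes one integer label per spectrogram time bin, with `unlabeled`
    marking silence/background frames that should not become syllables.
    """
    labels = np.asarray(frame_labels)
    # Indices where the label changes mark segment boundaries.
    changes = np.flatnonzero(np.diff(labels)) + 1
    starts = np.concatenate(([0], changes))
    stops = np.concatenate((changes, [len(labels)]))
    segments = []
    for start, stop in zip(starts, stops):
        label = int(labels[start])
        if label != unlabeled:  # skip silence between syllables
            segments.append((start * timebin_dur, stop * timebin_dur, label))
    return segments

# Toy sequence with 10 ms bins: silence, syllable 1, silence, syllable 2.
print(frames_to_segments([0, 0, 1, 1, 1, 0, 2, 2], timebin_dur=0.01))
```

Here the toy input yields two segments, one for each syllable, with onsets and offsets in seconds derived from the bin duration.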
The thalamostriatal system is a major network in the mammalian brain, originating principally from the intralaminar nuclei of thalamus. Its functions remain unclear, but a subset of these projections provides a pathway through which the cerebellum communicates with the basal ganglia. Both the cerebellum and basal ganglia play crucial roles in motor control. Although songbirds have yielded key insights into the neural basis of vocal learning, it is unknown whether a thalamostriatal system exists in the songbird brain. Thalamic nucleus DLM is an important part of the song system, the network of nuclei required for learning and producing song. DLM receives output from song system basal ganglia nucleus Area X and sits within dorsal thalamus, the proposed avian homolog of the mammalian intralaminar nuclei that also receives projections from the cerebellar nuclei. Using a viral vector that specifically labels presynaptic axon segments, we show in Bengalese finches that dorsal thalamus projects to Area X, the basal ganglia nucleus of the song system, and to surrounding medial striatum. To identify the sources of thalamic input to Area X, we map DLM and cerebellar-recipient dorsal thalamus (DT). Surprisingly, we find both DLM and dorsal anterior DT adjacent to DLM project to Area X. In contrast, the ventral medial subregion of DT projects to medial striatum outside Area X. Our results suggest the basal ganglia in the song system, like the mammalian basal ganglia, integrate feedback from the thalamic region to which they project as well as thalamic regions that receive cerebellar output.