We propose Tapestry, a novel approach to pooled testing with application to COVID-19 testing using quantitative Reverse Transcription Polymerase Chain Reaction (RT-PCR), which can shorten testing time and conserve reagents and testing kits. Tapestry combines ideas from compressed sensing and combinatorial group testing with a novel noise model for RT-PCR that is used to generate synthetic data. Unlike Boolean group testing algorithms, the input is a quantitative readout from each test and the output is a list of viral loads for each sample, relative to the pool with the highest viral load. While other pooling techniques require a second confirmatory assay, Tapestry obtains individual sample-level results in a single round of testing, at clinically acceptable false positive or false negative rates. We also propose designs for pooling matrices that facilitate good prediction of the infected samples while remaining practically viable. When testing n samples of which k ≪ n are infected, our method needs only O(k log n) tests with high probability when using random binary pooling matrices. However, we also use deterministic binary pooling matrices based on the combinatorial design ideas of Kirkman Triple Systems, to balance good reconstruction properties against matrix sparsity for ease of pooling. With these matrices, a lower bound on the number of tests needed to satisfy a sufficient condition for guaranteed recovery is k√n. In practice, we have observed that fewer tests are needed with such matrices than with random pooling matrices. This makes Tapestry capable of very large savings at low prevalence rates, while remaining viable even at prevalence rates as high as 9.5%. Empirically, we find that single-round Tapestry pooling improves over two-round Dorfman pooling by almost a factor of 2 in the number of tests required. We describe how to combine combinatorial group testing and compressed sensing algorithmic ideas to create a new kind of algorithm that is very effective in deconvoluting pooled tests. We validate Tapestry in simulations and in wet lab experiments with oligomers in quantitative RT-PCR assays. An accompanying Android application, Byom Smart Testing, makes the Tapestry protocol straightforward to implement in testing centres and is available for free download. Lastly, we describe use-case scenarios for deployment.
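To make the pooled-testing setup concrete, the following is a minimal sketch, not the authors' actual Tapestry decoder: it simulates quantitative pooled measurements y = Ax with a random binary pooling matrix A and a sparse vector x of viral loads, then recovers x with a generic non-negative lasso. The pool membership probability, noise level, regularization weight, and detection threshold are illustrative assumptions; Tapestry itself combines combinatorial group-testing filtering with compressed-sensing reconstruction.

```python
# Sketch of compressed-sensing-style pooled testing (illustrative assumptions only).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

n, m, k = 100, 30, 3                              # samples, pooled tests, infected samples
A = (rng.random((m, n)) < 0.1).astype(float)      # random binary pooling matrix

x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(1.0, 10.0, k)   # viral loads

# Quantitative readouts with multiplicative noise (a stand-in for RT-PCR noise).
y = (A @ x_true) * np.exp(rng.normal(0.0, 0.05, m))

# Sparse, non-negative recovery of viral loads from the pooled readouts.
model = Lasso(alpha=0.01, positive=True, fit_intercept=False, max_iter=10000)
model.fit(A, y)
x_hat = model.coef_

print("true infected     :", np.flatnonzero(x_true))
print("recovered infected:", np.flatnonzero(x_hat > 0.1))
```

With k ≪ n, far fewer pooled tests (m) than samples (n) suffice to identify the infected samples, which is the source of the savings the abstract describes.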
While recent benchmarks have spurred a great deal of new work on improving the generalization of pretrained multilingual language models on multilingual tasks, techniques to improve performance on code-switched natural language understanding tasks have been far less explored. In this work, we propose bilingual intermediate pretraining as a reliable technique for deriving large and consistent performance gains from code-switched text on three different NLP tasks: Natural Language Inference (NLI), Question Answering (QA), and Sentiment Analysis (SA). We show consistent performance gains on four different code-switched language pairs (Hindi-English, Spanish-English, Tamil-English, and Malayalam-English) for SA, and on Hindi-English for NLI and QA. We also present a code-switched masked language modeling (MLM) pretraining technique that consistently benefits SA compared to standard MLM pretraining using real code-switched text.
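As a rough illustration of what intermediate MLM pretraining on code-switched text looks like in practice, here is a minimal sketch using HuggingFace Transformers; it is not the paper's exact setup. The base checkpoint, the file name "codeswitched.txt", and all hyperparameters are assumptions for illustration, and task fine-tuning (NLI/QA/SA) would follow as a separate step.

```python
# Sketch of intermediate MLM pretraining on code-switched text (assumed setup).
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

checkpoint = "bert-base-multilingual-cased"        # assumed multilingual backbone
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# One code-switched (e.g. Hindi-English) sentence per line.
raw = load_dataset("text", data_files={"train": "codeswitched.txt"})
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"],
)

collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="mlm-codeswitched",
                           per_device_train_batch_size=16,
                           num_train_epochs=1),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
model.save_pretrained("mlm-codeswitched")   # then fine-tune this checkpoint on the target task
```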