Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP), 2021
DOI: 10.18653/v1/2021.acl-long.288

CDRNN: Discovering Complex Dynamics in Human Language Processing

Abstract: The human mind is a dynamical system, yet many analysis techniques used to study it are limited in their ability to capture the complex dynamics that may characterize mental processes. This study proposes the continuous-time deconvolutional regressive neural network (CDRNN), a deep neural extension of continuous-time deconvolutional regression (CDR; Shain and Schuler, 2021) that jointly captures time-varying, non-linear, and delayed influences of predictors (e.g. word surprisal) on the response (e.g. reading time) …
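As a rough illustration of the idea summarized in the abstract (a minimal sketch, not the authors' implementation), the snippet below shows how a small neural impulse response function might be summed in continuous time over irregularly spaced word events; the network irf_net, the single predictor surprisal, and all shapes are hypothetical.

    import torch
    import torch.nn as nn

    # Hypothetical neural impulse response function (IRF): maps a time offset and a
    # predictor value (e.g. word surprisal) to a contribution to the response.
    irf_net = nn.Sequential(
        nn.Linear(2, 16),   # input: [time offset, surprisal]
        nn.Tanh(),
        nn.Linear(16, 1),   # output: additive contribution to the response
    )

    def predict_response(event_times, surprisal, query_time):
        """Continuous-time deconvolution: sum the IRF over all preceding events."""
        offsets = query_time - event_times                          # delays in seconds
        mask = offsets >= 0                                         # only past events matter
        feats = torch.stack([offsets[mask], surprisal[mask]], -1)   # one row per past event
        return irf_net(feats).sum()                                 # sum of IRF evaluations

    # Toy usage: three word onsets, response queried at t = 2.0 s.
    event_times = torch.tensor([0.0, 0.5, 1.2])
    surprisal = torch.tensor([3.1, 7.4, 2.2])
    print(predict_response(event_times, surprisal, torch.tensor(2.0)))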

Citations: cited by 11 publications (15 citation statements)
References: 69 publications
“…Our approach—the continuous-time deconvolutional regressive neural network (CDR-NN)—uses deep learning to relax the key simplifying assumptions above (discrete time, linearity, stationarity, and homoscedasticity) in order to estimate, visualize, and test properties of the response structure of a complex process from data. Our study expands significantly upon an earlier proposal of the CDR-NN approach (Shain, 2021, see SI A for detailed comparison). We evaluate CDR-NNs on a range of synthetic data, as well as on publicly available behavioral and neural data from studies of human language processing.…”
Section: Introduction (supporting)
confidence: 68%
“…Formally, the parameters s for the response distribution 𝓕 are computed as the sum of (i) the temporal convolution of X′ with G_1, …, G_N and (ii) a learned bias vector (intercept) s_0, where each transposed row x′_n⊤, 1 ≤ n ≤ N, of X′ is vertically concatenated with a bias. This bias, which we have called rate in prior work (Shain, 2021; Shain & Schuler, 2018, 2021), serves to capture general effects of stimulus timing, or, equivalently, the baseline response of the system to a stimulus, without regard to stimulus properties. Rate can therefore be regarded as a kind of “deconvolutional intercept”, i.e., a baseline response that is added to any stimulus-specific responses.…”
Section: The CDR-NN Model (mentioning)
confidence: 99%
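One way to render the description quoted above as an explicit formula (the notation here is assumed for illustration, not copied from the paper: s_0 is the intercept, t_n the timestamp of event n, and x'_n its predictor vector, with an appended 1 serving as the "rate" bias):

    % Response-distribution parameters at prediction time t:
    % intercept plus the sum of IRF evaluations over the N preceding events.
    s(t) \;=\; s_0 \;+\; \sum_{n=1}^{N} G_n\!\left(t - t_n;\; \begin{bmatrix} x'_n \\ 1 \end{bmatrix}\right)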
“…More flexible finite impulse response models (FIR; Dale, 1999; Glover, 1999) require fitting a large number of parameters and are computationally brittle. Thus, instead of sticking with linearized approaches, some researchers have suggested modeling the HRF shape explicitly within a family of nonlinear functions motivated by physiological data (see, for instance, Lindquist & Wager, 2007; Shain et al., 2020; Shain, 2021). By using a constrained space of nonlinear mappings (rather than an unconstrained space of linear mappings, as in FIR), one can estimate the veridical shape of the HRFs using a relatively small number of parameters.…”
Section: Incorporate Measurement-Related Considerations (mentioning)
confidence: 99%
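For context on the "constrained space of nonlinear mappings" mentioned in the quote above, a familiar example is a double-gamma hemodynamic response function with a handful of free parameters; the sketch below is illustrative only (parameter names and defaults are assumptions, not values from the cited work).

    import numpy as np
    from scipy.stats import gamma

    def double_gamma_hrf(t, peak_delay=6.0, undershoot_delay=16.0, undershoot_ratio=1/6):
        """Canonical-style double-gamma HRF: a positive peak minus a scaled undershoot."""
        peak = gamma.pdf(t, peak_delay)              # early positive BOLD response
        undershoot = gamma.pdf(t, undershoot_delay)  # late post-stimulus undershoot
        hrf = peak - undershoot_ratio * undershoot
        return hrf / np.abs(hrf).max()               # normalize to unit peak amplitude

    t = np.arange(0.0, 32.0, 0.1)   # 32 s window at 0.1 s resolution
    hrf = double_gamma_hrf(t)       # smooth HRF estimate from only three parameters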
“…To this end, one approach, called Monte Carlo dropout (MCD), has seen increasing use within the domain of neuroimaging classification [13]. MCD has been used in a variety of studies, including those focused on cortex parcellation [32], dynamics estimation [33], [34], and classification of autism spectrum disorder [35] and Parkinson’s disease [36]. A more recently developed alternative to MCD that has seen comparatively little use in the domain of neuroimaging classification is Monte Carlo batch normalization (MCBN) [14].…”
Section: Introduction (mentioning)
confidence: 99%
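As background on Monte Carlo dropout as characterized in the quote above (a generic sketch, not the pipeline of any cited study): dropout is left active at test time and predictions are averaged over repeated stochastic forward passes, with the spread across passes used as an uncertainty estimate. The architecture and sample count below are assumptions.

    import torch
    import torch.nn as nn

    # Hypothetical classifier with a dropout layer (architecture is illustrative only).
    model = nn.Sequential(
        nn.Linear(100, 64), nn.ReLU(), nn.Dropout(p=0.5),
        nn.Linear(64, 2),
    )

    def mc_dropout_predict(model, x, n_samples=50):
        """Average class probabilities over stochastic forward passes with dropout active."""
        model.train()   # keep dropout stochastic at inference time
        with torch.no_grad():
            probs = torch.stack(
                [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
            )
        return probs.mean(dim=0), probs.std(dim=0)  # predictive mean and spread (uncertainty)

    x = torch.randn(8, 100)                          # batch of 8 hypothetical feature vectors
    mean_probs, uncertainty = mc_dropout_predict(model, x)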