2021
DOI: 10.1108/ijcs-03-2021-0011

Behavioral data assists decisions: exploring the mental representation of digital-self

Abstract: Purpose The behavioral decision-making of the digital-self is one of the important research topics in the network of crowd intelligence. The factors and mechanisms that affect decision-making have attracted the attention of many researchers. Among the factors that influence decision-making, the mind of the digital-self plays an important role. Exploring the influence mechanism of the digital-self's mind on decision-making is helpful for understanding the behaviors of the crowd intelligence network and improving the transactio…

Cited by 5 publications (3 citation statements)
References 26 publications
“…Finally, to compare BERT with other models available in the HuggingFace Transformers library (RQ3), we experiment with two recent Transformer-based architectures: (1) DeBERTa [15], a model that improves BERT with a disentangled attention mechanism where each word is encoded using two vectors (a vector for content and a vector for position); (2) ALBERT [27], a model that improves BERT via separating the size of the hidden state of the vocabulary embedding from t...…”
[Fragment of a dataset table from the citing paper accompanies this quote: ML-1M* [14]: 18, 13 (72%), 3 (17%), 2 (11%); Yelp [2]: 10, 6 (60%), 4 (40%), 0 (0%); Steam: truncated.]
Section: Models (mentioning)
confidence: 99%
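The quoted passage contrasts DeBERTa's disentangled attention (a content vector plus a position vector per token) with ALBERT's factorized vocabulary embedding. As a minimal sketch of how such off-the-shelf variants can be loaded side by side from the HuggingFace Transformers library, assuming the public microsoft/deberta-base and albert-base-v2 checkpoints rather than whatever checkpoints the citing study actually used:

```python
# Minimal sketch: load DeBERTa and ALBERT from HuggingFace Transformers and
# compare their sizes. Checkpoint names are assumptions for illustration only.
from transformers import AutoModel, AutoTokenizer

for name in ("microsoft/deberta-base", "albert-base-v2"):
    tokenizer = AutoTokenizer.from_pretrained(name)   # matching subword tokenizer
    model = AutoModel.from_pretrained(name)           # pretrained encoder weights
    n_params = sum(p.numel() for p in model.parameters())
    # ALBERT's factorized vocabulary embedding (plus layer sharing) keeps its
    # parameter count small; DeBERTa adds relative-position projections for
    # its disentangled attention on top of a BERT-sized encoder.
    print(f"{name}: {n_params / 1e6:.1f}M parameters")
```

The parameter counts make the architectural difference visible: ALBERT base is far smaller than DeBERTa base, even though both follow the BERT encoder recipe.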
“…How the structure of interaction networks shapes people's behavior has long been a fascinating topic. Behaviors [66,67], minds [68], and tendencies such as smoking [69][70][71], drinking [70], role conflict [72], and obesity [71] have all been found to spread to individuals through their personal social networks. Therefore, many studies have proposed that effective interventions on individual behavior must fully account for the influence of neighbors in the social network [73,74].…”
Section: Effect Of Network Structure On Human Behavior (mentioning)
confidence: 99%
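The takeaway of this statement is that interventions should account for neighbor influence in the social network. A purely illustrative sketch, not any of the cited studies' models, of one simple proxy for that influence: the fraction of a node's neighbors who exhibit a given behavior, computed with networkx over a toy graph with hypothetical labels:

```python
# Illustrative only: peer exposure as the share of neighbors showing a behavior.
import networkx as nx

def peer_exposure(graph: nx.Graph, behavior: dict) -> dict:
    """behavior maps node -> 0/1 (e.g., smoker or not); returns node -> exposure."""
    exposure = {}
    for node in graph.nodes:
        neighbors = list(graph.neighbors(node))
        if not neighbors:
            exposure[node] = 0.0          # isolated node: no peer exposure
            continue
        exposure[node] = sum(behavior.get(n, 0) for n in neighbors) / len(neighbors)
    return exposure

# Toy usage: a small friendship graph with hypothetical binary labels.
g = nx.karate_club_graph()
smokes = {n: int(n % 3 == 0) for n in g.nodes}   # made-up attribute for the demo
print(sorted(peer_exposure(g, smokes).items())[:5])
```

Nodes with high exposure would be the natural targets of the network-aware interventions the cited work argues for.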
“…Early researchers tried to utilize traditional machine learning methods, i.e., support vector machine (SVM) [11]-[14], naive Bayes (NB) [15], [16], random forest (RF) [17], k-nearest neighbor (KNN) [18], logistic regression (LR) [19], [20] and decision tree (DT) [21], for MDD recognition. Among these methods, feature extraction is crucial for improving the accuracy of MDD recognition.…”
Section: Introduction (mentioning)
confidence: 99%
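The quote lists six traditional classifiers applied to MDD recognition and notes that feature extraction drives their accuracy. A minimal illustrative sketch, not the cited pipelines, that compares those classifiers in scikit-learn on a synthetic stand-in for a pre-extracted feature matrix:

```python
# Illustrative comparison of the classifiers named above; the synthetic data
# stands in for whatever features a real MDD pipeline would extract.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

models = {
    "SVM": SVC(),
    "NB": GaussianNB(),
    "RF": RandomForestClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "LR": LogisticRegression(max_iter=1000),
    "DT": DecisionTreeClassifier(random_state=0),
}
for name, clf in models.items():
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validated accuracy
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```

With identical inputs, the spread of scores across these models illustrates the quote's point that the quality of the extracted features, not the choice of classical classifier, tends to dominate recognition accuracy.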