2013
DOI: 10.4028/www.scientific.net/amr.639-640.530
The Spatial Analysis of Lateral Bending on 40-m Prestressed Concrete T-Beam

Abstract: Due to prestressing and other factors, lateral bending can occur in post-tensioned concrete T-beams during tensioning. In this paper, a simulation model is established for the edge beam of the widely used 40-m T-beam. Based on the stress and displacement of the main girder under different orthogonal forces, the effects of lateral bending on the main girder are analyzed. Finally, some guidelines are suggested.
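As a back-of-envelope illustration of the effect the abstract describes (this is not the paper's finite-element model), a prestress force acting with a small lateral eccentricity produces a roughly constant lateral moment along the girder, and the resulting midspan lateral deflection follows the standard beam formula. All numerical values below except the 40-m span are assumed for illustration.

```python
# Illustrative only: lateral deflection of a simply supported girder when
# the prestress force P acts with a small lateral eccentricity e.
# A constant moment M = P*e gives midspan deflection delta = M*L^2 / (8*E*I).
P = 5.0e6      # prestress force, N (assumed)
e = 0.01       # lateral eccentricity of the tendon, m (assumed)
L = 40.0       # span, m (from the paper title)
E = 3.45e10    # concrete elastic modulus, Pa (assumed C50-class value)
I = 0.15       # weak-axis (lateral) second moment of area, m^4 (assumed)

M = P * e                       # constant lateral bending moment, N*m
delta = M * L**2 / (8 * E * I)  # midspan lateral deflection, m
print(round(delta * 1000, 2), "mm")
```

Even a centimeter-scale eccentricity of a meganewton-scale prestress force produces a millimeter-scale lateral deflection, which is why the paper examines the sensitivity of the main girder to such forces.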

Cited by 6 publications (6 citation statements)
References 0 publications
“…In this work, we tried more than one ML algorithm, such as K-nearest neighbor regression (KNR), random forest regression (RFR), support vector regression (SVR), gradient boosting regression (GBR), extreme gradient boosting regression (XGBR), and a kind of composited algorithms produced by TPOT (tree-based pipeline optimization tool) . For each algorithm, we tested various pairs of training and testing data with different separated ratios to gain ideal regression models.…”
Section: Computational Details
confidence: 99%
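The model-comparison loop this excerpt describes can be sketched with scikit-learn alone (a hedged illustration, not the cited authors' code: the XGBoost and TPOT steps they mention are third-party packages and are omitted here; the data and split ratios are assumptions).

```python
# Sketch: try several regressors over different train/test split ratios
# and keep the best-scoring (model, ratio) pair, as the excerpt describes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                      # synthetic features (assumed)
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=200)

models = {
    "KNR": KNeighborsRegressor(),
    "RFR": RandomForestRegressor(random_state=0),
    "SVR": SVR(),
    "GBR": GradientBoostingRegressor(random_state=0),
}

results = {}
for ratio in (0.2, 0.3, 0.4):                      # different test-set fractions
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=ratio, random_state=0)
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        results[(name, ratio)] = model.score(X_te, y_te)  # held-out R^2

best = max(results, key=results.get)
print(best, round(results[best], 3))
```

Scanning over split ratios as well as algorithms, as the excerpt suggests, guards against a single lucky partition dominating the model choice.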
“…Supervised classification ML was trained and established by extreme gradient boosting (XGB) algorithms and LR from the scikit-learn package . The Pearson correlation coefficient ( p ) and feature importance ( F i ) were applied to determine the performance of training features.…”
Section: Computational Methods
confidence: 99%
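The screening step in this excerpt, a gradient-boosted classifier plus logistic regression, with Pearson correlation and feature importance used to judge the training features, can be sketched as follows. This is an illustration, not the cited work's pipeline: the cited study uses xgboost's XGB classifier, for which scikit-learn's GradientBoostingClassifier is substituted here, and the data are synthetic.

```python
# Sketch: fit a boosted-tree classifier and logistic regression, then
# inspect per-feature Pearson correlation (p) and feature importance (F_i).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))                       # synthetic features (assumed)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)       # label driven by features 0 and 1

# Pearson correlation of each feature with the label
pearson = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])

gb = GradientBoostingClassifier(random_state=0).fit(X, y)
lr = LogisticRegression().fit(X, y)

importance = gb.feature_importances_  # tree-based F_i; sums to 1
print(pearson.round(2), importance.round(2))
```

Both diagnostics should agree that the informative features dominate, which is the performance check the excerpt describes.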
“…For the Fisher’s exact test, features were selected at five different p -value cutoffs ranging from 0.01 to 0.05 with an interval of 0.01. For the AUC-ROC, features were selected at five different cutoffs ranging from 0.52 to 0.60 with an interval 0.02 using the “ROCR” and “pROC” packages. , The “xgboost” and “Random Forest” packages were applied to retrieve feature importance scores, respectively. Features ranked with Gain or Gini scores were picked at 10 intervals from the top 10 to top 50.…”
Section: Methods
confidence: 99%
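The threshold-based feature screening this excerpt describes, keeping features whose Fisher's exact test p-value or AUC-ROC clears a cutoff, can be sketched in Python. This is a hedged stand-in, not the cited study's code: that work uses the R packages "ROCR", "pROC", "xgboost", and "Random Forest", for which scipy and scikit-learn equivalents are substituted, and the data are synthetic.

```python
# Sketch: screen binary features against a binary label using Fisher's
# exact test p-values at several cutoffs, plus per-feature AUC-ROC.
import numpy as np
from scipy.stats import fisher_exact
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
y = rng.integers(0, 2, size=200)
X = rng.integers(0, 2, size=(200, 6))   # synthetic binary features (assumed)
X[:, 0] = y                             # one perfectly informative feature

def fisher_p(feature, label):
    # 2x2 contingency table: binary feature vs. binary label
    table = [[np.sum((feature == a) & (label == b)) for b in (0, 1)]
             for a in (0, 1)]
    return fisher_exact(table)[1]

pvals = np.array([fisher_p(X[:, j], y) for j in range(X.shape[1])])
aucs = np.array([roc_auc_score(y, X[:, j]) for j in range(X.shape[1])])

# p-value cutoffs from 0.01 to 0.05 at 0.01 intervals, as in the excerpt
for cutoff in (0.01, 0.02, 0.03, 0.04, 0.05):
    selected = np.where(pvals < cutoff)[0]
    print(cutoff, selected)
```

In the cited workflow the surviving features are then re-ranked by Gain or Gini importance scores; here the point is just the sweep over p-value cutoffs.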