2003
DOI: 10.9746/sicetr1965.39.789
Localization of Steel Ball in Impact Perforation Images by Neural Network

Cited by 4 publications (3 citation statements)
References 2 publications
“…The multilayered neural network model to distinguish a steel ball in the impact perforation image has been studied [3] [4]. However, it is difficult to prepare the appropriate training data set, because of the variance of the fragment shapes.…”
Section: Introduction (mentioning)
confidence: 99%
“…We propose the use of neural network to classify them in the image with degradation by fragments of the specimen. We show the image classification of the images degraded by the scattered material fragments accurately by using the neural network (Ogawa et al, 2003). The successive images of the impact perforation test with actual composite materials; polycarbonate (PC), polymethyl methacrylate (PMMA) and alumina (Al 2 O 3 ) specimens are shown in Fig.2.…”
Section: Impact Perforation Test (mentioning)
confidence: 99%
“…We have already proposed to apply the multilayer neural network for the recognition of the steel ball in the impact perforation image. We have used the three-layered feed-forward neural network which estimated the steel ball location in the impact perforation image [3]. However, the miss-recognition of the steel ball was often seen because of the influence on the scattered fragments of the specimen.…”
(mentioning)
confidence: 99%
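The statement above describes a three-layered feed-forward network that estimates the steel ball's location in an impact perforation image. A minimal sketch of that idea is shown below; the layer sizes, sigmoid activations, and the choice of regressing a normalized (x, y) center are illustrative assumptions, not the cited paper's exact architecture.

```python
# Sketch: a three-layer feed-forward network mapping a flattened grayscale
# frame to a normalized (x, y) ball-center estimate. All sizes are assumed.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ThreeLayerNet:
    def __init__(self, n_in, n_hidden, n_out=2):
        # Small random initial weights; a sigmoid output layer keeps the
        # predicted (x, y) coordinates in [0, 1] (fractions of image size).
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        h = sigmoid(x @ self.W1 + self.b1)      # hidden layer
        return sigmoid(h @ self.W2 + self.b2)   # normalized (x, y) estimate

# A 16x16 toy frame standing in for one impact-perforation image.
frame = rng.random((16, 16))
net = ThreeLayerNet(n_in=16 * 16, n_hidden=32)
xy = net.forward(frame.ravel())
print(xy.shape)
```

In practice such a network would be trained on labeled frames (ball position marked by hand); the cited work notes that fragment-shape variance makes assembling such a training set difficult.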