2012
DOI: 10.1145/2109205.2109208

Crawling Ajax-Based Web Applications through Dynamic Analysis of User Interface State Changes

Abstract: Using JavaScript and dynamic DOM manipulation on the client-side of web applications is becoming a widespread approach for achieving rich interactivity and responsiveness in modern web applications. At the same time, such techniques, collectively known as Ajax, shatter the metaphor of web 'pages' with unique URLs, on which traditional web crawlers are based. This paper describes a novel technique for crawling Ajax-based applications through automatic dynamic analysis of user interface state changes in web brow…
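
The approach sketched in the abstract amounts to driving a real browser, firing DOM events on candidate clickable elements, and comparing the resulting DOM states to infer a state-flow graph. Below is a minimal illustration of that idea using Selenium WebDriver in Python; the CSS selector for candidate elements, the hash-based state equivalence, the page-reload backtracking, and the one-level exploration depth are simplifying assumptions for the sketch, not the paper's (Crawljax's) actual algorithm.

```python
# Sketch: crawl an Ajax UI by firing events and diffing DOM states.
# Assumes Selenium WebDriver and a matching chromedriver are installed.
import hashlib

from selenium import webdriver
from selenium.webdriver.common.by import By

START_URL = "https://example.com/app"  # hypothetical Ajax application


def dom_state(driver) -> str:
    """Identify a UI state by a hash of the rendered DOM."""
    return hashlib.sha1(driver.page_source.encode("utf-8")).hexdigest()


def crawl_one_level(start_url: str):
    driver = webdriver.Chrome()
    edges = []  # (from_state, clicked_element, to_state)
    try:
        driver.get(start_url)
        base_state = dom_state(driver)
        # Candidate clickables; a real crawler infers these dynamically.
        n_candidates = len(driver.find_elements(By.CSS_SELECTOR, "a, button"))

        for i in range(n_candidates):
            # Reload to restore the base state before each event
            # (Crawljax instead backtracks along its state-flow graph).
            driver.get(start_url)
            candidates = driver.find_elements(By.CSS_SELECTOR, "a, button")
            if i >= len(candidates):
                break
            label = candidates[i].text or candidates[i].get_attribute("outerHTML")[:40]
            try:
                candidates[i].click()
            except Exception:
                continue  # element not clickable in this state
            new_state = dom_state(driver)
            if new_state != base_state:
                edges.append((base_state[:8], label, new_state[:8]))
        return edges
    finally:
        driver.quit()


if __name__ == "__main__":
    for edge in crawl_one_level(START_URL):
        print(edge)
```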

Cited by 236 publications (209 citation statements). References 31 publications.
“…Other related research focuses on the efficient generation of test data, recognition of test interfaces, and how to determine whether the actual output accords with expectations [8,18]. Popular black-box testing tools for Web application security testing include IBM AppScan [29]…”
Section: Detection Methods Based On Software Testing
confidence: 99%
“…First, users interacting with Web applications in more flexible and dynamic ways [7,8] also provides attackers with many more hidden avenues for illegal operations and attacks. Second, browser scripts set up many obstacles for attack detection.…”
Section: Introduction
confidence: 99%
“…Tools such as Artemis [3], Kudzu [28], and CrawlJax [19] perform automated testing, or crawling, typically aiming to achieve high code coverage, with various heuristics to guide the exploration, and not specifically targeting errors related to nondeterminism. As an example, running Artemis on the five web applications with known bugs mentioned in Section 5.5, with a time budget of one hour per web application, detected none of the bugs.…”
Section: Algorithm Domain Comparison
confidence: 99%
“…The JSBench tool [25] uses this strategy to synthesize standalone web benchmarks. Derived inputs may improve the results of state-exploration tools such as Crawljax [17] by providing real, captured input traces.…”
Section: Supporting Behavior Reproduction and Dissemination
confidence: 99%