2019
DOI: 10.1007/978-3-030-29962-0_28
Fingerprint Surface-Based Detection of Web Bot Detectors

Abstract: Web bots are used to automate client interactions with websites, which facilitates large-scale web measurements. However, websites may employ web bot detection. When they do, their response to a bot may differ from responses to regular browsers. The discrimination can result in deviating content, restriction of resources or even the exclusion of a bot from a website. This places strict restrictions upon studies: the more bot detection takes place, the more results must be manually verified to confirm the bot's…
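The fingerprint surface referred to in the title consists of browser properties whose values differ between automated and regular browsers. As a minimal, hedged sketch of what such a surface-based probe could look like (not the specific detection mechanisms the paper catalogues), the TypeScript snippet below checks a few widely known deviations. Only navigator.webdriver is a standardized automation marker; the remaining heuristics are illustrative assumptions.

```typescript
// Illustrative sketch of a fingerprint-surface probe a website might run in the page.
// This is an assumed example, not the detectors studied in the paper.

interface BotSignal {
  name: string;
  triggered: boolean;
}

function collectBotSignals(): BotSignal[] {
  const signals: BotSignal[] = [];

  // WebDriver-controlled browsers expose navigator.webdriver === true (WebDriver spec).
  signals.push({ name: "navigator.webdriver", triggered: navigator.webdriver === true });

  // Headless Chrome has historically advertised itself in the user-agent string.
  signals.push({ name: "headless user agent", triggered: /HeadlessChrome/.test(navigator.userAgent) });

  // An empty plugin or language list is a common (weak) heuristic for automation.
  signals.push({ name: "no plugins", triggered: navigator.plugins.length === 0 });
  signals.push({ name: "no languages", triggered: navigator.languages.length === 0 });

  return signals;
}

// A site could treat any triggered signal as grounds for serving a deviating response.
const looksAutomated = collectBotSignals().some((s) => s.triggered);
```

Any one triggered signal is enough for a site to respond differently to the client, which is exactly the deviating behaviour the abstract warns can bias large-scale measurements.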

Cited by 23 publications (27 citation statements) · References 18 publications
“…Aside from our bot mitigation techniques, we do not interact with the visited websites in any way, limitations of this approach are discussed in Section 6. While a website might detect our crawler, it is not detected by current mechanisms seen in the wild, as presented by Jonker et al [24].…”
Section: Measurement Framework
confidence: 77%
“…[EN16]); secondly, such deviations will (also) affect logging in (cf. [JKV19]). Thus, for this project we require an automated way to use a regular (full) browser.…”
Section: A Base HTTP Platform
confidence: 99%
“…), which makes their detection more challenging. Even though such web bots, in their vanilla configurations, contain fingerprints that can reveal their bot nature [23], additional configurations can be applied to avoid detection [22,23]. Moreover, web bots can be designed to use the regular browsers of a machine (instead of using an automated browsing software), which makes fingerprint-based detection even harder [2].…”
Section: Background and Related Work
confidence: 99%
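The excerpt above notes that a vanilla automation setup leaks detectable fingerprints, while additional configuration can suppress them. A hedged sketch of one such configuration step is shown below, using Puppeteer to hide the standard WebDriver marker before any page script runs; this is a commonly discussed mitigation offered as an assumption, not the specific configuration evaluated in references [22,23].

```typescript
// Sketch of a bot-side mitigation: override navigator.webdriver before site scripts run.
// Assumed, illustrative configuration using Puppeteer; not the cited works' method.
import puppeteer from "puppeteer";

async function launchMaskedBrowser() {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Runs in the page context before the site's own scripts, hiding the
  // standardized automation marker that fingerprint-based detectors probe.
  await page.evaluateOnNewDocument(() => {
    Object.defineProperty(navigator, "webdriver", { get: () => undefined });
  });

  return { browser, page };
}
```

Hiding this single property only narrows the fingerprint surface; other deviations, such as missing plugins or headless-specific rendering behaviour, may still reveal the bot.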
“…Such web bots can crawl web servers in a humanlike manner to collect information making them harder to detect. Additionally, we can assume that the malicious web bots exhibit a fingerprint that is indistinguishable from that of a browser as in the opposite case, such bots could be deterministically detected using advanced fingerprinting techniques [5,22,23,31]. This is a logical assumption, as the respective Indicators of Compromise have a low pain threshold (i.e., they require low effort to be changed) [8].…”
Section: Threat Model
confidence: 99%