2015
DOI: 10.1515/popets-2015-0018

An Automated Approach for Complementing Ad Blockers’ Blacklists

Abstract: Privacy in the Web has become a major concern resulting in the popular use of various tools for blocking tracking services. Most of these tools rely on manually maintained blacklists, which need to be kept up-to-date to protect Web users’ privacy efficiently. It is challenging to keep pace with today’s quickly evolving advertisement and analytics landscape. In order to support blacklist maintainers with this task, we identify a set of Web traffic features for identifying privacy-intrusive services. Based on the…

Cited by 45 publications (40 citation statements)
References 36 publications
“…If perceptual ad-blocking is only used passively (e.g., to aid in the maintenance of filter lists, by logging potential ads that filter lists miss), the ad-blocker's adversaries still have incentive to attack to delay the detection of new ads. This stringent threat model above also applies to ML-based adblockers that use URL and DOM features [14,36,43], which have not been evaluated against adaptive white-box attacks.…”
Section: Strategic Advantage of Adversaries and Lack of Defenses
confidence: 99%
“…Some proposals automatically build tracking blocking lists using specific keys in URL that correspond to user identifying data [52], machine learning on DOM structure [68], Javascript [53], [55] or user browsing behavior [69]. The same machine-learning-based approach was also applied to adblocking list building using network traffic features [51].…”
Section: B. Automated Blocking List Building
confidence: 99%
“…Several works [59], [68], [51], [52], [53], [55] automatically build blocking lists. We did not evaluate them because the produced lists are not publicly available.…”
Section: B. Limitations
confidence: 99%
“…Such manipulation forces filter list authors to update filter list rules very frequently, making the laborious process even more challenging. To address this problem, researchers [31] proposed a method based on network traffic analysis (e.g., identify ad-serving domains) for updating HTTP filter list rules automatically. This method, however, does not address HTML manipulation by publishers (like recently done by Facebook [13]).…”
Section: ||doubleclick.com^$third-party aol.com###banner_div
confidence: 99%