2010 IEEE 2nd Symposium on Web Society
DOI: 10.1109/sws.2010.5607487
An intelligent offline filtering agent for website analysis and content rating

Abstract: The unregulated nature of the web means that anyone can make content available online, some of which could be harmful to children and unsuspecting adults. Content filtering aims to block undesirable material from reaching the end user. Most existing software content filters make use of an access control list, which involves some form of manual search, gathering, and classification of undesirable web sites so that the software filter can block access to those URLs. In this paper, we describe an Of…
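
As a rough, hypothetical illustration of the access-control-list approach the abstract describes, the sketch below checks a requested URL against a manually maintained blocklist of hosts. The host names and the helper function are invented for illustration; they are not taken from the paper.

    from urllib.parse import urlparse

    # Hypothetical, manually curated access control list of undesirable hosts.
    BLOCKED_HOSTS = {
        "bad-site.example.com",
        "undesirable.example.org",
    }

    def is_blocked(url):
        """Return True if the URL's host is a blocked host or one of its subdomains."""
        host = urlparse(url).hostname or ""
        return any(host == h or host.endswith("." + h) for h in BLOCKED_HOSTS)

    # Example: a filtering proxy would consult this check before fetching a page.
    print(is_blocked("http://bad-site.example.com/page"))   # True
    print(is_blocked("http://news.example.net/"))           # False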

Cited by 5 publications (4 citation statements)
References 2 publications

“…The authors of (Fong et al, 2010) state that content filtering aims to keep undesirable content from getting to the user. Currently available software content filters typically use an access control list, which requires human search, collection, and classification of unwanted websites before the software filter may prevent access to these URLs.…”
Section: Figure 2 Text Classification Process
confidence: 99%
“…Fong et al [10] have developed an architecture for an intelligent offline content filtering agent. The Offline Filtering Agent includes two major modules such as WebCrawler and WebParser.…”
Section: Related Work
confidence: 99%
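
The WebCrawler/WebParser split mentioned in that citation might be organized roughly as below. The class names echo the citation, but the method signatures and the simple term-frequency rating are assumptions for illustration, not the authors' implementation.

    import urllib.request
    from html.parser import HTMLParser

    class WebCrawler:
        """Fetches a page's raw HTML so it can be analysed offline."""
        def fetch(self, url):
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.read().decode("utf-8", errors="replace")

    class WebParser(HTMLParser):
        """Strips markup and collects the visible text for content rating."""
        def __init__(self):
            super().__init__()
            self.text_parts = []
        def handle_data(self, data):
            self.text_parts.append(data)
        def extract_text(self, html):
            self.feed(html)
            return " ".join(self.text_parts)

    def rate_content(text, undesirable_terms):
        """Hypothetical rating: fraction of words that appear in an undesirable-term list."""
        words = text.lower().split()
        return sum(w in undesirable_terms for w in words) / len(words) if words else 0.0
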
“…To address the problem of web content filtering, several strategies have been used. Some use a packet-filtering approach, which operates on IP addresses; but an IP address identifies a particular host, and that host can serve more than one site, some of them acceptable, so blocking the IP also blocks the acceptable sites [3]. The access control list of IPs is likewise generated manually, which requires great human effort. Others use white/black lists of resources, classifying sites as white or black respectively; such classification is performed by rating agencies, where manual collection and classification is required [4].…”
Section: Introduction
confidence: 99%
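
The collateral-blocking problem described above (one IP address serving several sites) can be made concrete with a small sketch that groups hostnames by their resolved IP. The host names are placeholders; the point is only that an IP-level block affects every name resolving to that address.

    import socket
    from collections import defaultdict

    def group_hosts_by_ip(hosts):
        """Group hostnames by resolved IP to show what an IP-level block would catch."""
        by_ip = defaultdict(list)
        for host in hosts:
            try:
                ip = socket.gethostbyname(host)
            except socket.gaierror:
                continue  # skip names that do not resolve
            by_ip[ip].append(host)
        return by_ip

    # Placeholder host names; on shared hosting, acceptable and undesirable sites
    # can resolve to the same IP, so blocking that IP blocks all of them.
    for ip, names in group_hosts_by_ip(["acceptable.example.com", "undesirable.example.com"]).items():
        print(ip, "->", ", ".join(names))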