Proceedings of the Twelfth International Conference on World Wide Web - WWW '03 2003
DOI: 10.1145/775189.775192
Adaptive on-line page importance computation

Cited by 52 publications (82 citation statements)
References 0 publications
“…PageRank and its variations are currently being used by major search engines. [1,15,16] describe various ways to improve PageRank computation. [2] provides a theoretical justification for the Hub and Authority metric and proposes a mechanism to combine link and text analysis for page ranking.…”
Section: Related Work
confidence: 99%
“…Quality change during measurement: In our theoretical derivations, we assumed that the quality remains constant during measurement. This assumption is reasonable when we can measure the derivative instantaneously, but when it is measured over a time period, it is possible that the quality may change during the time.…”
Section: Measuring Quality From Web Snapshots
confidence: 99%
“…Abiteboul et al [3] designed a crawling strategy based on an algorithm called OPIC (On-line Page Importance Computation). In OPIC, each page is given an initial sum of "cash" which is distributed equally among the pages it points to.…”
Section: Web Crawling Ordering
confidence: 99%
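The citation above summarizes OPIC's core idea: every page starts with an equal sum of "cash", and visiting a page moves its cash into an accumulated history while distributing it equally among the pages it links to. A minimal sketch of that idea follows; the graph, the round-robin visit order, and the dangling-page handling are illustrative assumptions, not details taken from the paper itself.

```python
from collections import deque

def opic(graph, iterations=1000):
    """Minimal sketch of On-line Page Importance Computation (OPIC).

    `graph` maps each page to the list of pages it links to. Every page
    starts with an equal share of cash; visiting a page moves its cash
    into its history and splits it equally among its out-links. The
    round-robin visit order below is one simple fair crawling policy.
    """
    pages = list(graph)
    n = len(pages)
    cash = {p: 1.0 / n for p in pages}    # each page starts with equal cash
    history = {p: 0.0 for p in pages}     # total cash ever held by the page
    queue = deque(pages)                  # round-robin crawl order (assumed)

    for _ in range(iterations):
        page = queue.popleft()
        c = cash[page]
        history[page] += c                # record the cash before spending it
        cash[page] = 0.0
        links = graph[page] or pages      # dangling page: spread cash to all
        share = c / len(links)
        for target in links:
            cash[target] += share        # distribute equally among out-links
        queue.append(page)

    # Importance estimate: each page's share of all accumulated history.
    total = sum(history.values())
    return {p: history[p] / total for p in pages}
```

On a toy graph such as `{"a": ["b"], "b": ["c"], "c": ["a", "b"]}`, the page with more in-links (`b`) accumulates more cash over time, matching the weighted backlink-count intuition mentioned in the citations below.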
“…OPIC This strategy is based on OPIC [3], which can be seen as a weighted backlink-count strategy. All pages start with the same amount of "cash".…”
Section: Strategies With No Extra Information
confidence: 99%
“…This decision is guided by the minimization of a cost function which prioritizes XML pages. Some parameters of this cost function are, for instance, the importance of the page [2], the estimated page frequency and the crawler bandwidth.…”
Section: The Sample Of the XML Web
confidence: 99%