Wikipedia, rich in entities and events, is an invaluable resource for various
knowledge harvesting, extraction, and mining tasks. Numerous knowledge bases,
such as DBpedia and YAGO, are built by extracting entity- and event-centric
knowledge from it. Online news, on the other hand, is an authoritative and
rich source of emerging entities, events, and facts relating to existing
entities. In this work, we study the creation of entities in Wikipedia with
respect to news by analyzing how entity- and event-based information flows
from news to Wikipedia.
We analyze the lag of Wikipedia (based on the revision history of the English
Wikipedia) against 20 years of \emph{The New York Times} corpus (NYT). We
model and analyze the lag of entities and events, measured as the difference
between their first appearance in Wikipedia and their first appearance in
NYT. In our extensive experimental analysis, we find, first, that almost
20\% of the external references in entity pages are news articles,
underlining the importance of news to Wikipedia. Second, we observe that the
entity-based lag follows a normal distribution with a high standard
deviation, whereas the lag for news-based events is typically very low.
Finally, we find that events are responsible for the creation of emergent
entities, with as many as 12\% of the entities mentioned in an event page
being created after the creation of the event page itself.