WWW2007 Paper Details
Track:
Search
Paper Title:
Random Web Crawls
Authors:
  • Toufik Bennouas (Criteo R&D)
  • Fabien de Montgolfier (Université Paris 7)
Abstract:
This paper proposes a random Web crawl model. A Web crawl is a (biased and partial) image of the Web. This paper deals with the hyperlink structure, i.e. a Web crawl is a graph whose vertices are the pages and whose edges are the hypertextual links. Of course a Web crawl has a very particular structure; we recall some known results about it. We then propose a model generating similar structures. Our model simply simulates a crawling process, i.e. it builds and crawls the graph at the same time. The generated graphs have many of the known properties of Web crawls. Our model is simpler than most random Web graph models, yet captures the same properties. Note that it models the crawling process rather than the page-writing process of Web graph models.
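The core idea of building and crawling the graph at the same time can be sketched as follows. This is a minimal illustrative simulation, not the paper's exact model: the parameters `p_new` (probability that a link points to a brand-new page) and `max_out` (maximum out-degree per page) are assumptions chosen for the sketch.

```python
import random
from collections import deque

def simulate_crawl(steps=1000, p_new=0.5, max_out=5, seed=42):
    """Toy random-crawl simulation: the graph is built and crawled
    simultaneously.  p_new and max_out are illustrative assumptions,
    not parameters taken from the paper."""
    rng = random.Random(seed)
    edges = []             # hyperlinks as (source, target) pairs
    frontier = deque([0])  # pages discovered but not yet crawled
    discovered = 1         # page ids are 0 .. discovered-1
    crawled = set()
    for _ in range(steps):
        if not frontier:   # crawl dies out if no page is left to visit
            break
        page = frontier.popleft()
        crawled.add(page)
        for _ in range(rng.randint(1, max_out)):
            if rng.random() < p_new:
                target = discovered        # link to a brand-new page
                discovered += 1
                frontier.append(target)    # new page joins the frontier
            else:
                target = rng.randrange(discovered)  # link to a known page
            edges.append((page, target))
    return edges, crawled, discovered

edges, crawled, n = simulate_crawl()
```

Because pages are created only when a link first reaches them, the resulting graph mirrors a crawl's structure (every page except the seed is reachable from the seed) rather than the output of a page-writing model.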
Slot:
Alberta, Wednesday, May 9, 2007, 3:30pm to 5:00pm.
Full-text:
PDF version