Wikipedia workload analysis for decentralized hosting

Guido Urdaneta*, Guillaume Pierre, Maarten van Steen

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

283 Citations (Scopus)

Abstract

We study an access trace containing a sample of Wikipedia's traffic over a 107-day period, with the aim of identifying appropriate replication and distribution strategies for a fully decentralized hosting environment. We perform a global analysis of the whole trace and a detailed analysis of the requests directed to the English edition of Wikipedia. In our study, we classify client requests and examine aspects such as the number of read and save operations, significant load variations, and requests for nonexistent pages. We also review proposed decentralized wiki architectures and discuss how they would handle Wikipedia's workload. We conclude that decentralized architectures must focus on techniques that handle read operations efficiently while maintaining consistency, and must cope with issues typical of decentralized systems such as churn, unbalanced loads, and malicious participating nodes.
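
To illustrate the kind of request classification the abstract describes, the sketch below separates page reads, save operations, and other traffic in an access trace. It is a minimal illustration, not the authors' tooling: the whitespace-separated "timestamp method url" trace format, the file name trace.log, and the URL heuristics (page views under /wiki/, saves as POSTs to index.php with action=submit) are all assumptions made here for demonstration, not the paper's exact rules.

```python
# Minimal sketch (hypothetical trace format and heuristics, not the
# authors' classification rules): count reads, saves, and other requests.
from urllib.parse import urlparse, parse_qs
from collections import Counter

def classify(method: str, url: str) -> str:
    parsed = urlparse(url)
    query = parse_qs(parsed.query)
    # Treat an edit-form submission POSTed to index.php as a save.
    if (method == "POST" and parsed.path.endswith("index.php")
            and query.get("action") == ["submit"]):
        return "save"
    # Treat a plain page view under /wiki/ as a read.
    if parsed.path.startswith("/wiki/"):
        return "read"
    return "other"  # images, search, API calls, etc.

counts = Counter()
with open("trace.log") as trace:  # hypothetical trace file
    for line in trace:
        timestamp, method, url = line.split(maxsplit=3)[:3]
        counts[classify(method, url)] += 1
print(counts)  # e.g. Counter({'read': ..., 'other': ..., 'save': ...})
```

A real analysis at Wikipedia scale would stream and sample the trace rather than process every line, but the classification step has this shape.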

Original language: English
Pages (from-to): 1830-1845
Number of pages: 16
Journal: Computer Networks
Volume: 53
Issue number: 11
DOIs
Publication status: Published - 28 Jul 2009
Externally published: Yes

Keywords

  • Decentralized hosting
  • P2P
  • Wikipedia
  • Workload analysis
