This dataset was prepared for an ongoing study on user reputation and content quality in Wikipedia at the University of California, Irvine.
This research is conducted primarily by Sara Javanmardi under the supervision of Prof. Lopes.
We processed the Wikipedia dump, enwiki-20100130-pages-meta-history.xml.7z, to extract the inserts and deletes made by each user. The dump contains all English Wikipedia articles up to January 2010. You can access this dataset both online and offline:
We have prepared an XML interface that allows you to extract the events that occurred on a page. To extract the events of an article in English Wikipedia, you pass three parameters to the API and receive the results in XML format.
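As a rough illustration of how such a call might look, the sketch below builds a request URL and parses an XML response. Note that the endpoint URL, the three parameter names (title, start, end), and the XML element names are all assumptions made for this example; the actual values are defined by the API itself.

```python
# Hypothetical sketch only: the endpoint URL, the parameter names
# ("title", "start", "end"), and the XML tag/attribute names below are
# assumptions, not the API's documented interface.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

API_ENDPOINT = "http://example.org/wiki-events/api"  # placeholder URL


def build_query(params: dict) -> str:
    """Build the request URL from the three API parameters."""
    return API_ENDPOINT + "?" + urlencode(params)


# Construct a request for one article (parameter names are assumed).
url = build_query({"title": "Alan Turing",
                   "start": "2001-01-15",
                   "end": "2010-01-30"})

# Parse a sample response of the general shape described above:
# a list of insert/delete events attributed to users.
sample = """<events page="Alan Turing">
  <event type="insert" user="ExampleUser" revision="101"/>
  <event type="delete" user="AnotherUser" revision="102"/>
</events>"""

root = ET.fromstring(sample)
for event in root.findall("event"):
    # Each event records its kind and the user who made it.
    print(event.get("type"), event.get("user"), event.get("revision"))
```

In practice you would fetch the URL (e.g. with urllib.request) and feed the response body to the same parsing code; the sample string here simply stands in for that response.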