|As of 2018, Wikipedia is the sixth most-visited website in the world with more than 35 million users across the globe / Photo by: pixinoo via Shutterstock|
As of 2018, Wikipedia is the sixth most-visited website in the world, with more than 35 million users across the globe. Since the site launched in January 2001, its contributors have created nearly six million articles, with over 582 added every day.
Developed by internet entrepreneur Jimmy Wales and philosopher Larry Sanger, Wikipedia began as an offshoot of Nupedia, an earlier internet encyclopedia. Nupedia was designed to be a free encyclopedia written solely by experts, but that model produced fewer than 24 articles per year, which is why the pair created a version that anyone could contribute to without editorial oversight. Today the site grows at a rate of over 1.8 edits per second, covering a vast range of topics.
“My initial idea was that the wiki would be set up as part of Nupedia; it was to be a way for the public to develop a stream of content that could be fed into the Nupedia process,” Sanger said in a memoir. While some experts advise against relying on Wikipedia as a source because of its open policy that anyone can edit or write in it, a 2014 study found that the website's information was 99.7% accurate when compared with textbooks.
However, things have not gone entirely smoothly for Wikipedia. According to Time, an American weekly news magazine and news website published in New York City, a 2013 article in the MIT Technology Review reported that the volunteer workforce that built Wikipedia had shrunk by more than a third since 2007 and was still shrinking. That workforce is tasked with defending the website against vandalism, hoaxes, and manipulation; since anyone can contribute, misinformation and slander can slip in.
Using AI to Stop Abusive Comments
Wikipedia has been struggling to retain contributors and editors, the volunteers who make sure every piece of information uploaded to the site is accurate and truthful. Reports show that the number of contributors and editors fell by 40% over an eight-year period. Thus, the Wikimedia Foundation, the nonprofit that supports Wikipedia, decided to use artificial intelligence to address the issue.
Last year, the Wikimedia Foundation launched a research project called Detox in collaboration with Jigsaw, a technology incubator created by Google. According to Forbes, a global media company focusing on business, investing, technology, entrepreneurship, leadership, and lifestyle, the project is part of Jigsaw’s initiative to build open-source AI tools that combat harassment on social media platforms and web forums. Detox uses machine learning to flag comments that are likely to be personal attacks.
The machine learning algorithms were trained on 100,000 toxic comments from Wikipedia Talk pages. These comments were labeled by a 4,000-person team of human reviewers, with every comment rated by ten different people. The algorithms then reviewed over 63 million English Wikipedia comments posted over a 14-year period to find patterns in the abusive ones. The resulting study, “Ex Machina: Personal Attacks Seen at Scale,” found that more than 80% of abusive comments came from users who had left fewer than five such comments, that 34% of all comments left on Wikipedia were made by anonymous users, and that nearly 10% of all attacks were made by just 34 users.
With machine learning algorithms, the Wikipedia team can figure out the best way to combat the abusive comments that have been contributing to the community’s toxicity.
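To make the idea concrete, here is a minimal sketch of how a comment-flagging classifier of this kind can work: a bag-of-words Naive Bayes model trained on labeled comments. This is purely illustrative; the actual Detox models, features, and training data are far more sophisticated, and every comment and label below is invented for the example.

```python
# Illustrative sketch: a tiny Naive Bayes classifier that flags comments
# as abusive or acceptable. Not the actual Detox implementation.
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

class CommentFlagger:
    def __init__(self):
        self.word_counts = {"abusive": Counter(), "ok": Counter()}
        self.class_totals = {"abusive": 0, "ok": 0}

    def train(self, comments):
        # comments: list of (text, label) pairs, label "abusive" or "ok"
        for text, label in comments:
            for word in tokenize(text):
                self.word_counts[label][word] += 1
                self.class_totals[label] += 1

    def score(self, text, label):
        # Log-likelihood of the text under one class,
        # with add-one (Laplace) smoothing for unseen words.
        vocab = set(self.word_counts["abusive"]) | set(self.word_counts["ok"])
        total = self.class_totals[label]
        s = 0.0
        for word in tokenize(text):
            count = self.word_counts[label][word]
            s += math.log((count + 1) / (total + len(vocab)))
        return s

    def is_abusive(self, text):
        return self.score(text, "abusive") > self.score(text, "ok")

# Toy training data, invented for the sketch. The real project used
# 100,000 human-labeled Talk-page comments.
flagger = CommentFlagger()
flagger.train([
    ("you are a stupid idiot", "abusive"),
    ("idiot troll go away", "abusive"),
    ("thanks for the helpful edit", "ok"),
    ("great source and nice work", "ok"),
])
print(flagger.is_abusive("you idiot"))  # True with this toy data
```

At scale, the same idea generalizes: human reviewers supply labels, the model learns which word patterns predict abuse, and new comments can then be scored automatically and surfaced for moderators.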
While Wikipedia is a human-driven site, thousands of volunteers across the world are assisted by hundreds of artificial intelligence-powered software tools or “bots” that keep the encyclopedia running. Since the work of Wikipedia editors is so demanding, bots are extremely helpful in keeping everything running smoothly. They can perform various editorial and administrative tasks that are tedious, repetitive, and time-consuming, but vital.
Bots can also organize and catalog entries, delete vandalism and foul language, and handle the reams of behind-the-scenes work that keep the encyclopedia running smoothly and efficiently. These functions make sure Wikipedia continues to appear neat and uniform. A recent study conducted by researchers at Stevens Institute of Technology in Hoboken, New Jersey analyzed all 1,601 of Wikipedia's bots to better understand how bots are developed and used, including in commercial applications.
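A simplified sketch of the kind of cleanup pass a vandalism-fighting bot might run is shown below. Real Wikipedia bots such as ClueBot NG use far richer heuristics and machine learning, and the patterns and function names here are invented for the example.

```python
# Illustrative sketch: a "fixer"-style cleanup pass that scans wiki text
# for crude vandalism markers and strips them. Not a real Wikipedia bot.
import re

# Invented examples of patterns a cleanup bot might look for.
VANDALISM_PATTERNS = [
    re.compile(r"(.)\1{9,}"),                        # character floods, e.g. "zzzzzzzzzz"
    re.compile(r"\b[A-Z]{2,}(?: [A-Z]{2,}){4,}\b"),  # long all-caps shouting runs
]

def flag_vandalism(text):
    """Return (start, end, matched_text) spans that look like vandalism."""
    hits = []
    for pattern in VANDALISM_PATTERNS:
        for match in pattern.finditer(text):
            hits.append((match.start(), match.end(), match.group()))
    return hits

def clean(text):
    """Remove flagged spans; a real bot would revert to the last good revision."""
    for pattern in VANDALISM_PATTERNS:
        text = pattern.sub("", text)
    return text

print(clean("Paris is the capital of France.zzzzzzzzzzzz"))
# prints "Paris is the capital of France."
```

In practice a bot would run checks like these against each new edit via Wikipedia's API and either flag it for human review or revert it outright, which is why such tools can handle the bulk of routine patrolling.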
"AI is changing the way that we produce knowledge, and Wikipedia is the perfect place to study that. In the future, we'll all be working alongside AI technologies, and this kind of research will help us shape and mold bots into more effective tools,” study author Jeffrey Nickerson, a professor in the School of Business, said.
According to Tech Xplore, an online site that covers the latest engineering, electronics, and technology advances, Wikipedia's bots fulfill nine core roles. Together they account for about 10% of all activity on the site, and up to 88% of activity on some sub-sections, such as the site's Wikidata platform. The roles include "advisors," which suggest new activities and provide helpful tips; "protectors," which police bad behavior; "connectors," which link pages and resources together; and "fixers," which repair broken content or erase vandalism, among others.
While it is still uncertain how bots will shape Wikipedia in the long run, the fact remains that they have been helping the site give accurate and truthful information to people across the world. "By studying Wikipedia, we can prepare for the future, and learn to build AI tools that improve both our productivity and the quality of our work,” Nickerson said.
|While Wikipedia is a human-driven site, thousands of volunteers across the world are assisted by hundreds of artificial intelligence-powered software tools or “bots” that keep the encyclopedia running / Photo by: Jinning Li via Shutterstock|