In Defense of Wikipedia

Hello, my friends. If you’re reading this right now, I can safely assume two things about you: (1) you went to school and, at some point, learned to read, and (2) you have access to the internet.  (My mind powers are stunning, I know.) If these two assumptions are true, I can also infer another fact: you’ve probably used Wikipedia before.

Just in case some of you have lived under a rock since the 90s, Wikipedia is a free, openly collaborative online encyclopedia. It’s one of the last beautiful places on the Internet without ads, as the service is funded by the non-profit Wikimedia Foundation. Wikipedia isn’t structured like a traditional encyclopedia or online publication: virtually anyone — with or without a registered account — can write or edit articles. Because theoretically anyone anywhere can write about anything, millions of Wikipedia articles have been written in over 300 languages, attracting 2 billion unique devices per month to the site.

Though effective at spreading knowledge quickly and economically, Wikipedia’s open-editing format has been contentious with academics since the site’s founding in 2001. The idea behind their distrust is simple: if anyone can edit, then anyone can add false, misleading or biased information. Over the years, Wikipedia has introduced safeguards to prevent article vandalism — that is, crude or offensive words, advertising or spam, purposefully biased reporting, or the deletion of certain pages — such as restricting the editing of articles covering controversial or political topics. Of course, for every troll edit, there are a number of benevolent editors who usually fix mistakes and inaccuracies within minutes. Despite their efforts, however, numerous false or misleading pieces of information have persisted on Wikipedia for years. Knowing this, Wikipedia does not claim to be a reliable academic source, and teachers have admonished students not to even think about citing Wikipedia in their school reports.

And yet… we all continue to use it on the down-low. (We just don’t cite it.) In fact, my main source for writing this article was — drum roll please — Wikipedia. Despite all the criticism the site gets from our professors, most of the information I’ve gleaned from Wikipedia has been pretty helpful and, to my knowledge, accurate.

So here’s a crazy idea: what if academics have been judging Wikipedia too harshly?

Hear me out: in a 2004 interview, a British librarian named Philip Bradley stated, “the main problem [with Wikipedia] is [its] lack of authority. With printed publications, the publishers have to ensure that their data are reliable, as their livelihood depends on it. But with something like this, all that goes out the window.” At the time, when much of academia was still making the transition online, a statement like this made sense; Wikipedia’s lack of a centralized editing authority does allow potentially misleading or incorrect information to propagate. And as I mentioned earlier, the false-information problem has plagued Wikipedia since its founding.

But here’s the kicker: in 2021, it’s hard to find reliable data anywhere. In the world of fake news and extreme media bias, educational and news sources alike are plagued with misleading statistics and inflammatory language meant to confirm the ideological slant of the organization and its readers. Sure, their facts may be right (the writers’ livelihoods do depend on it, after all), but those with editing power get to pick and choose the facts that best suit their point of view. Because of this, two reputable organizations reporting on the same issue with the same set of facts may have completely different information in their respective articles, leading their audiences to vastly different conclusions. Having a limited set of writers and editors who think similarly leads to one-sided takes on important issues, which can have dangerous effects on readers — effects that we’ve unfortunately seen over the last year or so.

Wikipedia’s openness is designed to prevent this very phenomenon. From day one, Wikipedia’s non-negotiable principle for editors has been a neutral point of view. Of course, it’s hard to enforce this principle on every editor and in every article out there. But there is strength in numbers: as I mentioned earlier, for every troll or bad editor, there is (usually) a large number of benevolent editors who rush to fix their mistakes or biases. Plus, because virtually anyone can edit anything, the editors of any given article likely hold many different ideological slants, which may encourage them to correct the biases of an editor they disagree with and make the reporting more neutral.

See, in a traditional publication or news source, editors may be less inclined to notice biased reporting or sketchy information from someone they agree with. The open nature of Wikipedia prevents this from happening. No similarity, no groupthink — just pure idea meritocracy. May the most correct editor win.

So, in an ironic turn of events, the least reputable academic source may have become one of the more reliable sources of information in our unprecedented day and age. A source of information is only as good as its writers. When your writers include the entire world, well… your site is sure to be something. And I’d argue that 2 billion unique devices per month is certainly something. In a world where people are continually searching for more accurate information, I hope that more and more people see open collaboration as a strength, not a weakness.