TL;DR: No.

The case against Wikipedia

Last week Elon Musk posted a number of tweets denouncing Wikipedia, the free online encyclopaedia that has been a mainstay of the 21st century Internet.

Screenshot of a Twitter thread started by Jimmy Wales, first tweet reading: 'Fast moving claims and counter claims, and @elonmusk has removed all the core features that made it even remotely possible to tell real journalists from fakes.'. A response from Elon Musk reads 'Please fix wokipedia'.
Tweet Source

The attack came in response to Wikipedia co-founder Jimmy Wales’ complaints that the hamstringing of X/Twitter’s blue tick verification system has made trusting information on the Israel-Palestine conflict on Twitter particularly difficult. Elon’s erudite response: “Please fix Wokipedia” (I personally prefer Wookieepedia). This was followed up by a number of other critical tweets: complaints of biased editors, an assertion that the entire “legacy” news industry exists solely to fabricate authoritative false information for Wikipedia (???), and an offer of $1bn for the site to change its name to Dickipedia (you’d think he’d have learnt after the previous Twitter board called his bluff on his inflated purchase offer).

Screenshot of a tweet by Elon Musk stating: 'Have you ever wondered why the Wikimedia Foundation wants so much money? It certainly isn’t needed to operate Wikipedia. You can literally fit a copy of the entire text on your phone. So, what’s the money for? Inquiring minds want to know …'. A community note responds: 'The Wikimedia Foundation is a charitable non-profit, providing free access to Wikipedia. While a text & English-only copy of Wikipedia is about 51GB, adding all media + supported languages brings it to 428TB. In 2022 Wikimedia had $154M in revenue with $145M in expenses.'
Tweet Source
Screenshot of a tweet by Elon Musk stating: 'Many orgs & people pay to have their wiki page say nice things about them! The reason many legacy “news” publications still exist, even though no one reads them, is simply to provide fake “verified sources” for Dickipedia. You can tell I don’t do so, because my wiki page is such a nightmare' Screenshot of a tweet by Elon Musk stating: 'Wikipedia is inherently hierarchical and therefore subject to the biases of higher ranking editors, independent of their merits. @CommunityNotes requires people with historically different points of view, based on how they have rated and written Notes, to agree in order for Notes to be shown to the public. All code and data is open source, so you can recreate the outcome yourself. Crucially, even I, as the controlling shareholder of the company, cannot change the outcome of a Note. This is an extremely fundamental difference.' Screenshot of a tweet by Elon Musk stating: 'I will give them a billion dollars if they change their name to Dickipedia'.
Tweet Source 1, 2, 3.

In the midst of this, wider scrutiny of how the Wikimedia Foundation—the charity that runs Wikipedia and its sister projects—appeals for and spends its money has increased. Elon’s flurry of tweets along this line of questioning was backed up by various supporters analysing the foundation’s donation figures, each declaring that one of the most popular websites in the world—plus the operations of its parent organisation—could and should be run much more cheaply.

The encyclopaedic view

All of this seems part of a larger trend in the alt right anti-intellectual movement. In addition to donation transparency and payroll issues, there have been protests over how the article for “Recession” was supposedly edited to preserve the reputation of Joe Biden’s presidency, and outcry over editor discussions to delete the fledgling article for the “Twitter Files”.

These complaints go back many years, and have occasionally sparked attempts to form alternatives to Wikipedia that are more exclusively in line with the views and interests of particular communities (with varying degrees of extremity), including but not limited to the Christian fundamentalist Conservapedia, the alt right Infogalactic, or the blockchain-based IQ.Wiki (formerly Everipedia). Many of these attempts have failed, and those that are still around are but a shadow of Wikipedia’s success.

There are certainly valid criticisms to be made of the Wikimedia Foundation’s donation transparency. However, this posturing on Twitter with misplaced confidence in accounting serves only as a bad faith pretext by which to belittle Wikipedia’s entire mission. Debating which of the Wikimedia Foundation’s expenses constitute appropriate spending or value for money is beside the point here; indeed, much more worthwhile discussions on this occur on Wikipedia itself. Moreover, the release of the financial documents is not the scandalous result of some clever sleuthing or shocking hack. The documents were published (and will continue to be) by the Wikimedia Foundation after Wikipedians themselves campaigned for greater donation transparency. It is the product of a functioning relationship between the editing community and Wikipedia’s stewards. With that out of the way, what of Musk’s underlying concerns with Wikipedia: are they valid? Inquiring minds want to know…

Here I will detail how the way Wikipedia operates, together with academic Wiki research, shows that Elon Musk may have a point, but not in the way he might hope or expect.

Wikipedia does not represent Musk’s views

People regularly misunderstand Wikipedia’s “Neutral Point Of View” (NPOV) policy as an expectation that an article should give equal weight to both sides of an issue. This is incorrect. NPOV dictates that articles should proportionately represent the significant views from reliable sources on a topic. This means that perspectives from flat earthers, climate change deniers, race realists, etc. are broadly not able to flourish unchecked (unlike on certain other social media platforms…), since authoritative sources do not support these positions. Simply put, some of Musk’s views are not backed up by authoritative sources, and so do not receive the coverage he expects on Wikipedia. A Wikipedia article is also not the place to argue over the truth of a matter, but to present information as represented in other sources. For editorial discussions, each article has an associated “Talk” page. Scientific and historical publishing, journalism, and even the chatter of social media are the venues for hashing out debate.

Since Wikipedia is a continual work in progress, it of course does not always live up to this ideal. If Musk or his fans do find that an article is not neutrally presented then they would be welcome to participate and make a contribution to the digital commons, rather than decry Wikipedia as some exclusionary club. In these situations, provided the information is well-sourced, the guidance is to rewrite or add to the article, rather than completely remove the content (or indeed leave it in place). Indeed, it has been found that diverse editor teams produce higher quality articles [1]. This process can sometimes be hard work, but you don’t just get what you want without effort to build consensus.

Wikipedia is biased

Given Wikipedia’s reliance on particular sources and editorial processes, it is perhaps not surprising that this could lead to more systemic issues of bias. What if what Wikipedia deems as authoritative sources do not exist, or misrepresent an issue? Elon and the gang have identified another key issue with Wikipedia.

Countless studies have explored the biases present in the English Wikipedia, and across all of its language versions [2–6]. These consistently find that traditionally marginalised groups (women, LGBTQ people, communities from the global south, etc.) are underrepresented on Wikipedia as a whole. Beyond the content itself, the editor demographics skew strongly towards white, western men, with western editors overrepresented across different language Wikipedias [6]. This perhaps partly explains the aforementioned content biases. Different parts, or language editions, of Wikipedia can still present different perspectives on a given topic. This is perhaps most clear when there is a lack of diversity in contributing editors. For example, when a Wikipedia language is tied to a particular national identity, content on Wikipedia can be more akin to digital collective memory [7]. Consider coverage on the Japanese Wikipedia of Japan’s actions in World War 2 or the atomic bombings, and how it might diverge from the English or Chinese Wikipedias. Regarding ideological bias specifically: whilst it is difficult to study at scale, some studies have found that the English Wikipedia is slightly left-leaning, but that this weakens over time as articles receive more edits and editors become more experienced [8, 9].

There is regular debate over the extent to which Wikipedia should reflect the biases present in the communities it represents, its written sources, and the Internet at large. However, the fact remains that Wikipedia at best reflects, and at worst amplifies and exacerbates, the biases present in the world.

Established Wikipedia editors control the content

More established editors do have more power on Wikipedia. Musk’s frustrations in some ways echo those of many new users who attempt to edit the site: one can make good faith edits and be met with unexpected resistance, rewrites, or reverts. Certain active editors may also patrol their favourite articles and topics, polishing any contributions made by others and reverting any changes that they deem unworthy.

Whilst Wikipedia is the encyclopaedia that “anyone can edit”, there is still an editorial hierarchy—from unregistered editors, to “Extended Confirmed” users (active for at least 30 days with at least 500 edits), up to Administrators and Bureaucrats, who receive special privileges such as the ability to delete articles or block users. None of these roles are paid, and all are decided by simple activity rules or the volunteer editing community, not the Wikimedia Foundation. If there are serious disputes between editors over the content of a given article, these escalate up the community governance structure, where other editors will weigh in and attempt to reach a consensus. This is a fundamental part of the miraculous success of Wikipedia, where experience and expertise are valued but not revered. That being said, it can still be a confusing process for new editors who, if facing off against an experienced editor, can be met with unfamiliar policy acronyms and counter-measures. Hell hath no fury like a Wikipedia editor scorned.

xkcd comic 'Duty Calls'. Stick drawing of person at a computer having a conversation with someone offscreen asking 'Are you coming to bed?', 'I can't. This is important.', 'What?', 'Someone is WRONG on the Internet.'
xkcd: Duty Calls

What all of this often tedious, occasionally explosive procedure means is that Wikipedia is not some “free speech” free-for-all, where acolytes may purchase some empty social status symbol granting elevated privileges in the digital public sphere (too on-the-nose?). The process of cumulative consensus building is not perfect, but it has developed to be robust to vandalism and coordinated interference. Whilst the rules of engagement can be abused, it is a far cry from the helpless situation Musk and his allies claim to find themselves in.

Wikipedia administration has been taken over

OK, but what if the entire system is rigged? That beyond individual cases of editors acting in bad faith, the rot runs so deep that the community hierarchy has been fatally compromised by bad actors?

Such levels of conspiratorial coordination would be a challenge in any organisation, let alone in a distributed social system on the scale of the English Wikipedia. However, this scenario did play out on the Croatian Wikipedia. In 2021, after years of complaints from regular Croatian editors, the Wikimedia Foundation launched the Croatian Wikipedia Disinformation Assessment. The Foundation found that the volunteer administrator team, the editors with the most power on the site, was actively pushing far right nationalist content. The Croatian editing community removed the offending administrators and, with the support of the report, is seeking to rebuild the administrative hierarchy. A similar intervention was made on the Arabic Wikipedia in 2022, when the Wikimedia Foundation banned seven administrators for conflict-of-interest editing. Whistleblowers later claimed the banned administrators were directly linked to the Saudi Government and partly responsible for the arrests and imprisonment of dissenting editors Ziyad al-Sofiani and Osama Khalid.

Wikipedia is absolutely not immune to attacks by bad actors or misinformation. These examples are particularly serious incidents, and demonstrate the tightrope the Wikimedia Foundation must walk. However, the idea that a liberal cabal is pulling the strings in an already biased system is fanciful. Given Wikipedia’s history on the matter, it is yet another example of the far right accusing those on the left of what they themselves have already done or intend to do.

Full marks for Musk?

So yes, Elon Musk is right about Wikipedia. It does not represent his views; it is biased; established editors control the content; and at times the entire community hierarchy has even been infiltrated.

However, as discussed, his beliefs are not represented as they are unsubstantiated, Wikipedia’s biases run counter to Musk’s expectations, hierarchical control is a necessary part of the consensus system, and attempts at administrative takeover have only been made by the far right.

The victim complex displayed by Musk is striking, if unsurprising. Wikipedia as a functioning, even flourishing, online social system after all these years stands in stark contrast to what Musk’s Twitter is fast becoming, all for a lot less than $44bn.

The future of Wikipedia

Despite Musk’s misguided misgivings, we have nevertheless seen the genuine issues that do exist with Wikipedia. These broadly boil down to knowledge gaps and editorial control. Knowledge gaps matter because Wikipedia is often a first port of call when researching a subject, and Wikipedia coverage in and of itself nowadays conveys a level of legitimacy and notability. Moreover, Wikipedia is a fount of information for many other websites and services, playing an important role in powering the knowledge graphs used by Google, Meta, and Amazon Alexa, as well as forming a key part of the texts used to train Large Language Models such as ChatGPT. What happens on Wikipedia echoes across many other platforms. Likewise, editor abuse of power or process can be an issue. We do not want this important online resource to be controlled by the decisions of a select few. We have already seen the consequences of this with the far more culturally maligned big social media platforms—Twitter of course included.

The Wikimedia Foundation is conscious of these issues and is already running numerous initiatives to address them. This includes developing schemes to improve the experience and retention of new editors. Money also goes towards facilitating events such as edit-a-thons around the world. The foundation rightfully stays out of interfering with the editorial process itself (the vast majority of which remains positive and productive), except for keeping a watchful eye on the most extreme cases, as discussed above.

All of this, plus continued advocacy for open access and open knowledge, wide-ranging support for internal and external research, management of crucial sister projects like Wikidata and Wikimedia Commons, and the promotion of educational initiatives, is practised by the foundation. It all contributes to what makes Wikipedia the “last best place on the Internet”, and to a society that embodies its values and can support it going forward.

Wikipedia is far more than just a collection of text on a server somewhere. If you would like to help contribute to the movement, then you are of course welcome to donate. More importantly though, I would recommend starting to edit. Participating in the process of building the digital commons is a rewarding experience, and your contributions can help make humanity’s greatest reference work (and all its dependents) even better.

References

[1] Shi, F., Teplitskiy, M., Duede, E., & Evans, J. A. (2019). The wisdom of polarized crowds. Nature human behaviour, 3(4), 329-336.

[2] Hill, B. M., & Shaw, A. (2013). The Wikipedia gender gap revisited: Characterizing survey response bias with propensity score estimation. PloS one, 8(6), e65782.

[3] Tripodi, F. (2023). Ms. Categorized: Gender, notability, and inequality on Wikipedia. New Media & Society, 25(7), 1687-1707.

[4] Ribé, M. M., Kaltenbrunner, A., & Keefer, J. M. (2021). Bridging LGBT+ Content Gaps Across Wikipedia Language Editions. The International Journal of Information, Diversity, & Inclusion, 5(4), 90-131.

[5] Dittus, M., & Graham, M. (2019). Mapping Wikipedia’s geolinguistic contours. Digital Culture & Society, 5(1), 147-164.

[6] Graham, M., Straumann, R. K., & Hogan, B. (2015). Digital divisions of labor and informational magnetism: Mapping participation in Wikipedia. Annals of the Association of American Geographers, 105(6), 1158-1178.

[7] Yasseri, T., Gildersleve, P., & David, L. (2022). Chapter 9 - Collective memory in the digital age. In S. M. O’Mara (Ed.), Progress in brain research (Vol. 274, p. 203-226). Elsevier.

[8] Greenstein, S., & Zhu, F. (2012). Is Wikipedia Biased?. American Economic Review, 102(3), 343-348.

[9] Greenstein, S., Gu, Y., & Zhu, F. (2016). Ideological segregation among online collaborators: Evidence from Wikipedians. NBER Working Paper No. 22744. National Bureau of Economic Research.