
The Wikimedia Foundation has called on artificial intelligence developers to provide attribution and financial support, warning that generative AI cannot exist without continually updated human-created knowledge.

Without such knowledge, AI systems will fall into model collapse, a degradation in quality that occurs when models are trained on AI-generated output rather than fresh human-created data, the foundation said.

Wikipedia is one of the highest-quality datasets in the world for training AI, and when AI developers try to omit it, the resulting answers are significantly less accurate, less diverse and less verifiable, the foundation said.

The foundation is calling on AI developers to take two straightforward actions: attribution and financial support.

Wikipedia’s strength lies in its volunteer editor communities, numbering in the hundreds of thousands, who continually improve the site’s information, the foundation said.

Current generative AI tools may be able to synthesise or summarise existing knowledge, but they cannot engage in the process of discussion, debate and consensus that Wikipedia’s volunteer editors undertake every day, the foundation said. They are not able to discover items buried in an archive, nor can they take a photo of an event or an underdocumented place to help improve that knowledge.

Human-centred approach

The Wikimedia projects are available in over 300 languages, often written by native speakers, providing a multilingual corpus that supports the development of inclusive, culturally aware AI models. This human-centred approach to knowledge creation provides high-quality and reliable information that, through regular editorial collaboration and disagreements, leads to more neutral and comprehensive articles.

Wikipedia also excels at transparency. Everyone sees the exact same information on Wikipedia, with no algorithms tracking behaviour or serving content to deliver profits. When readers access information, accompanying citations point to reliable sources where they can verify who originally reported it. Wikipedia’s processes and the actions taken by its volunteers can be inspected by anybody, as they are publicly logged on the website.

By contrast, generative AI systems can hallucinate in response to questions, presenting plausible-sounding but false information as factual, the foundation said.

The foundation recognises AI’s potential to help achieve its mission to make reliable information more accessible to more people. However, this needs to be done the Wikipedia way, meaning supporting humans in creating and sharing knowledge, and not replacing them, the foundation said.

A large amount of Wikipedia volunteer time is spent on mundane tasks like flagging vandalism. This can divert attention from more intricate tasks like content creation and reviewing edits. The foundation’s AI strategy for editors, released earlier this year, focuses on ways to give those editors more time to do crucial encyclopedic work.

Supporting human contributors

In all cases, volunteers create and enforce guidelines for responsible use of AI tools across Wikipedia, ensuring that they are being used to best support human contributors, the foundation said.

Attribution means that generative AI gives credit to the human contributions it uses to create its outputs. This sustains a virtuous cycle: credit encourages the human contributions that create the training data these new technologies rely on, the foundation said.

For people to trust information shared on the internet, platforms should make it clear where that information is sourced from and highlight opportunities to visit and participate in those sources, the foundation said. With fewer visits to Wikipedia, fewer volunteers may join to grow and enrich its content, and fewer individual donors may support this work.

Financial support means that AI developers should access Wikipedia’s content through the Wikimedia Enterprise platform. Developed by the Wikimedia Foundation, this opt-in paid product allows companies to use Wikipedia content at scale and sustainably, without severely taxing Wikipedia’s servers, whilst also enabling them to support the foundation’s non-profit mission.

“Through proper attribution of information sources and better financial support for AI’s technological impacts on Wikipedia, AI developers can secure both their own long-term futures and Wikipedia’s,” the foundation said.

Wikipedia will celebrate its 25th birthday on 15 January 2026. The foundation said it is optimistic that Wikipedia will still be here for another 25 years, ensuring the internet provides free, accurate, human knowledge for generations to come.
