The Wikimedia Foundation has unveiled a new artificial intelligence strategy that emphasizes the central role of human volunteers in Wikipedia’s knowledge ecosystem. According to Chris Albon and Leila Zia, the organization is committed to using AI to enhance rather than replace the work of Wikipedia’s editors.
In their post, Albon and Zia emphasized that the community of volunteers behind Wikipedia represents “the most important and unique element of Wikipedia’s success.” The new strategy focuses on using AI to remove technical barriers and automate tedious tasks, allowing editors to concentrate on areas that require human judgment and collaboration.
The Foundation’s approach targets four key areas where AI can support Wikipedia’s volunteers:
- assisting moderators with AI-powered workflows to maintain knowledge integrity,
- improving information discoverability to save editors time,
- automating translation of common topics to help share local perspectives,
- and scaling the onboarding process for new volunteers through guided mentorship.
This strategy will direct the organization’s AI work from July 2025 through June 2028, according to the detailed strategy brief published on Meta-Wiki. The Foundation plans to review the strategy annually to adapt to the rapidly evolving AI landscape.
The document reveals that Wikimedia considered multiple approaches before settling on this human-centered strategy. One option was to maintain the status quo with minimal AI investment, which the Foundation determined would leave editors vulnerable to burnout as AI-generated content becomes easier to produce while verification remains labor-intensive. Another option—investing in AI for direct knowledge generation—was rejected because it would undermine the volunteer community that forms Wikipedia’s core.
Instead, the Foundation is making a “significant and targeted investment” in AI tools specifically designed to support human editors. The strategy prioritizes content integrity over content generation, acknowledging that Wikipedia can only add new content at a rate that existing editors can effectively moderate.
The approach aligns with Wikimedia’s long-standing values, including transparency, privacy, and human rights. The Foundation commits to developing only open-source AI and will prioritize using open-source or “open-weight” models, though it acknowledges that resource limitations prevent it from building its own foundation models from scratch.
“With this new AI strategy, we are making a promise and a commitment to the world we serve and the volunteers who have made—continue to make—Wikipedia the largest encyclopedia that humanity has ever known,” Albon and Zia wrote in the announcement.
via TechCrunch