Submission to the Online Safety (Basic Online Safety Expectations) Amendment 2023 from Wikimedia Australia and the Wikimedia Foundation



Submitted to the Department of Infrastructure, Transport, Regional Development, Communications and the Arts on 16 February 2024.

Information about the consultation process, along with all the public submissions, is available here.

Contacts:

Wikimedia Australia

Bennettswood VIC 3125 Australia

contact@wikimedia.org.au

wikimedia.org.au

Wikimedia Foundation

1 Montgomery St., Suite 1600, San Francisco, CA 94104, U.S.A.

globaladvocacyteam@wikimedia.org

wikimediafoundation.org

Contributors:

Amanda Lawrence, Belinda Spry, and Bunty Avieson of Wikimedia Australia.

Jan Gerlach, Ellen Magallanes, and Rachel Judhistari of the Wikimedia Foundation.

This submission is public.

A PDF version is available here.

About Wikimedia Australia and the Wikimedia Foundation

Wikimedia Australia (WMAU) is the Australian chapter of the international Wikimedia movement. WMAU is an independent, not-for-profit organisation and registered charity. WMAU supports its members, the broader community and partner organisations in Australia to contribute to Wikipedia, Wikidata and other Wikimedia projects through events, training and partnerships.

The Wikimedia Foundation (WMF) is the not-for-profit that hosts Wikipedia and other free knowledge projects that serve the public interest. The mission of WMF is to empower and engage people around the world to collect and develop educational content under a free license or in the public domain, and to disseminate it effectively and globally. Wikipedia and other Wikimedia projects are largely built and governed by volunteers. They serve as vital spaces for people to share and access information that impacts their lives and shapes their decisions. Wikipedia is available in more than 300 languages. Its 61 million articles are created by a global community of volunteer editors, who determine the online encyclopedia’s editorial policies and guidelines that ensure content is neutral and based on reliable, published sources. In line with its Terms of Use, the Wikimedia Foundation does not play an editorial role.

Australians rely on Wikipedia for accurate information on all manner of topics. In January 2024 there were 283 million visits to Wikipedia from Australia, and Australians are active contributors and editors, ranking fifth in the world. Wikipedia’s status as one of the most visited websites in Australia reflects the value this collaborative knowledge project brings to the country. It is the only not-for-profit platform among the top 20 most visited sites globally.

Our interest through this submission

Governments worldwide are drafting and revising policies to address online safety challenges and to hold technology platforms accountable for harmful content spread on their websites. While WMF and WMAU support legislation to increase online safety, many of these bills carry significant consequences for community-driven online projects. They target the operating models of big social media platforms, which use algorithms for commercial benefit and monetise user data, yet they often fail to sufficiently protect user privacy or to foster a diversity of platforms online.

WMAU and WMF welcome the amendment of the Basic Online Safety Expectations (BOSE), which aims to reduce online harm and make the internet a safe experience for all, and are making this joint submission as part of the amendment process. In November 2021, WMF submitted recommendations on the BOSE Determination 2021 that highlighted the need to protect decentralised, community-based decision-making as part of a diverse internet ecosystem; to defend privacy by maintaining pseudonymous contributions; and to create exceptions for not-for-profit public interest projects like Wikipedia.

The current BOSE amendment is an opportunity for the government to protect and foster a diverse information ecosystem that supports not-for-profit public interest projects and community-based decision-making while also protecting privacy and child rights.

Our concerns with the bill focus on four key areas, discussed below:

  1. Protecting public interest projects
  2. Privacy and anonymity are cornerstones of a vibrant internet
  3. Safeguarding decentralised and community-led content moderation
  4. Protecting child rights through effective internal mechanisms and community participation

1. Protecting public interest projects

The proposed amendments to the Basic Online Safety Expectations (BOSE) aim to tackle the negative impacts of algorithm-driven social media and commercial platforms. However, WMF and WMAU are concerned that not-for-profit public interest projects such as Wikipedia, which operates on a decentralised, volunteer-based model, could be unduly affected by this wide regulatory net.

Wikipedia is a public interest project that no one owns. It is funded by donations and carries no advertising. Its content does not portray a particular point of view or support a certain position.

Wikipedia's content moderation is governed by numerous policies developed by the site's volunteer contributors. Editorial decisions and content policies are made by self-organising groups of users through publicly documented community consensus, which WMF does not control. Most contributions and uploads that do not meet Wikipedia's reliability and neutrality standards are addressed by volunteer editors within five minutes. This system allows Wikipedia to present verifiable information to readers across the world and enables WMF to run a global website with a staff of fewer than 700.

The BOSE amendment is an opportunity to differentiate between various types and sizes of platforms based on their position in the information ecosystem. Not-for-profit public interest projects that depend on community decision-making should not be subject to the same regulations as those imposed on large for-profit technology companies with professional moderators and automated content moderation systems. Regulations for online spaces should strike a balance that allows full participation by everyone. Rigorous, one-size-fits-all content moderation requirements can disproportionately impact non-profit public interest projects.

2. Privacy and anonymity are cornerstones of a vibrant internet

WMF and WMAU are committed to actively promoting wide and equitable participation in online knowledge creation. Both organisations recognise that storing personally identifiable information can imperil the privacy of Wikipedia editors and can discourage some individuals from contributing. This is especially likely for those who wish to edit sensitive content or contribute to topics for which they might be harassed or abused, online or offline.

For these reasons, WMF, as the host of the Wikimedia projects, collects very little information about readers and contributors, and facilitates and encourages pseudonymous contributions. WMF does not track people across the internet, display ads, or sell user data. Anyone can read and browse Wikipedia to their heart’s content without creating an account. This is different from most social media platforms, where users must create an account loaded with personal data to access the basic content, functions and utility of the platform.

WMF and WMAU argue that the BOSE amendment should prioritise privacy protections by allowing pseudonymity and encryption. These protections are essential to maintaining a culture of open participation because they create a place of safety. Furthermore, the ability to access Wikipedia privately allows vulnerable populations to participate in and benefit from open knowledge sharing, including by connecting with others or securely accessing information on health and other sensitive topics.

While anonymous accounts can be misused, Wikipedia’s Sockpuppetry policy is enforced by volunteers to address abusive anonymous conduct without compromising protections of users’ rights. The policy embodies the Wikimedia movement’s belief that online safety should not force us to relinquish privacy and security.

WMF and WMAU recommend removing the age assurance requirements in paragraph 12(2)(a) of the proposed BOSE amendment, given our commitment to privacy and freedom of speech. Rather than legislate untested age assurance technology, policymakers should rely on effective internal mechanisms to protect children, such as those in place on Wikipedia. Community-governed online models should be given the latitude to continue protecting children's privacy online without identifying them, especially where such measures are efficient and successful.

Expectations to implement age assurance mechanisms cannot be the same for all kinds of internet platforms. Public interest projects that do not operate on an ad-based business model, and therefore carry less systemic risk of exposing readers to illegal or harmful content, should not be expected to implement the same centrally controlled safety measures as commercial platforms that drive revenue through algorithmic engagement of users.

3. Safeguarding decentralised and community-led content moderation

WMF’s mission is to empower and engage people around the world to collect and develop educational content under a free licence or in the public domain and to disseminate it globally. To ensure the encyclopaedic nature and quality of the content hosted by Wikimedia projects, editors have developed and enforced content policies that define what kind of content can live on the websites and how it is presented and discussed. These policies and enforcement mechanisms allow Wikipedia to be an open project where people come to learn more about the world in which they live. The openness of the project, and the fact that anyone can contribute to it, is what makes it possible for people to help ensure the quality of content and to prevent illegal or harmful content from appearing on Wikimedia’s websites.

WMF and WMAU have concerns about the 2023 BOSE amendment discussion paper's emphasis on the role of proactive detection technologies, such as keyword detection. Over-reliance on automated tools and artificial intelligence to proactively detect and remove content can often undermine decentralised community content moderation. This issue is compounded by the lack of safeguards to protect encrypted communications that people rely on for safety and security.

4. Protecting child rights through effective internal mechanisms and community participation

Protecting child safety, both for readers and editors, is a top priority for WMF and WMAU. In January 2024, WMF published its first Child Rights Impact Assessment (CRIA) report, which highlighted impacts, risks, and opportunities posed to children who access and participate in the Wikimedia projects.

As the WMF’s CRIA report highlighted, Wikimedia’s model does not present the same risks as for-profit platforms whose business models aim to maximise advertising revenue by targeting users with highly engaging, but often unreliable or unsuitable, content. The report also underscores that existing community-led processes for identifying and removing the rare appearance of child sexual abuse material (CSAM) are effective.

Furthermore, in such community-governed models, the implementation of child protection measures is most appropriately led by the community. For the Wikimedia projects, this means that volunteer admins are in a strong position to incorporate child rights considerations into project policies and guidelines, and to advance efforts to identify and manage interactions that may be harmful to children (such as bullying, exposure to sexually exploitative images, or exposure to hate speech) on topics or pages frequently accessed by minors.

Conclusion

As Australia seeks to ensure the safety of its citizens, including children, through the online safety codes, WMAU and WMF urge care in ensuring that the resulting industry standards avoid unintended consequences and preserve freedom of expression and privacy rights for all users. The BOSE amendment is an opportunity for the government to protect and foster a diverse information ecosystem which supports not-for-profit public interest projects and community-based decision-making while also protecting privacy and child rights. Given the vast scope of the standards and the wide range of actors in the technology ecosystem, regulators should:

  • articulate additional, clear exceptions for encyclopaedic public interest projects that rely on community decision-making.
  • prioritise privacy protections by allowing pseudonymity and encryption.
  • remove the age assurance requirements, and support effective community-led governance mechanisms to protect children.
  • enable community-led, decentralised content moderation to address harmful content.

We encourage the Minister for Communications and the eSafety Commissioner to consider these recommendations as they revise and update the codes for registration.

WMF and WMAU welcome the opportunity to discuss this submission and look forward to further engagement with the Australian Minister for Communications and the eSafety Commissioner on regulation that makes the internet safer for all and protects public interest projects.
