The power of open source in guiding regulations
As the EU debates the AI Act, lessons from open-source software can inform the regulatory approach to open ML systems.
The AI Act, set to be a global precedent, aims to address the risks associated with AI while encouraging the development of cutting-edge technology. One of the key aspects of this Act is its support for open-source, non-profit, and academic research and development in the AI ecosystem. Such support helps ensure the development of safe, transparent, and accountable AI systems that benefit all EU citizens.
Drawing from the success of open-source software development, policymakers can craft regulations that encourage open AI development while safeguarding user interests. By providing exemptions and proportional requirements for open ML systems, the EU can foster innovation and competition in the AI market while maintaining a thriving open-source ecosystem.
Representing both commercial and non-profit stakeholders, several organisations – including GitHub, Hugging Face, EleutherAI, and Creative Commons – have banded together to release a policy paper calling on EU policymakers to protect open-source innovation.
The organisations have five proposals:
- Define AI components clearly: Clear definitions of AI components will help stakeholders understand their roles and responsibilities, facilitating collaboration and innovation in the open ecosystem.
- Clarify that collaborative development of open-source AI components is exempt from AI Act requirements: To encourage open-source development, the Act should clarify that contributors to public repositories are not subject to the same regulatory requirements as commercial entities.
- Support the AI Office’s coordination with the open-source ecosystem: The Act should encourage inclusive governance and collaboration between the AI Office and open-source developers to foster transparency and knowledge exchange.
- Ensure a practical and effective R&D exception: Allow limited real-world testing in different conditions, combining aspects of the Council’s approach and the Parliament’s Article 2(5d), to facilitate research and development without compromising safety and accountability.
- Set proportional requirements for “foundation models”: Differentiate between various uses and development modalities of foundation models, including open-source approaches, to ensure fair treatment and promote competition.
Open-source AI development offers several advantages, including transparency, inclusivity, and modularity. It allows stakeholders to collaborate and build on each other’s work, leading to more robust and diverse AI models. For instance, the EleutherAI community has become a leading open-source ML lab, releasing pre-trained models and code libraries that have enabled foundational research and reduced the barriers to developing large AI models.
Similarly, the BigScience project, which brought together over 1200 multidisciplinary researchers, highlights the importance of facilitating direct access to AI components across institutions and disciplines.
Such open collaborations have democratised access to large AI models, enabling researchers to fine-tune and adapt them to various languages and specific tasks—ultimately contributing to a more diverse and representative AI landscape.
Open research and development also promote transparency and accountability in AI systems. For example, LAION – a non-profit research organisation – released openCLIP models, which have been instrumental in identifying and addressing biases in AI applications. Open access to training data and model components has enabled researchers and the public to scrutinise the inner workings of AI systems and challenge misleading or erroneous claims.
The AI Act’s success depends on striking a balance between regulation and support for the open AI ecosystem. While openness and transparency are essential, regulation must also mitigate risks, ensure standards, and establish clear liability for AI systems’ potential harms.
As the EU sets the stage for regulating AI, embracing open source and open science will be critical to ensure that AI technology benefits all citizens.
By implementing the recommendations put forward by these organisations, the AI Act can foster an environment of collaboration, transparency, and innovation, positioning Europe as a leader in the responsible development and deployment of AI technologies.