The EU’s AI Act at One Year: Continuing to push for open-source AI and transparency

Saturday, August 2, marked the first anniversary of the entry into force of the EU AI Act, the EU’s contested landmark legislation establishing rules for AI sold and deployed on its internal market. With a staggered timeline for when different rules take effect, Mozilla continues its work to ensure that the law’s implementation is a success.

Beginning last week, the AI Act imposes new obligations on the developers of so-called “general-purpose AI models” (GPAI), that is, large AI models like OpenAI’s GPT, Google’s Gemini, or xAI’s Grok models (often also referred to as “foundation models”). Mozilla has long advocated for such rules to be included in the AI Act to ensure that large AI labs must play their part in making the technology they develop safer and more transparent, and that due diligence obligations are not entirely passed down the value chain to smaller developers and deployers. The rules include transparency and disclosure mandates as well as obligations relating to GPAI developers’ safety and security practices.

To mark the occasion, Mozilla, in partnership with Hugging Face and the Linux Foundation, published a guide for open-source AI developers aiming to help them navigate these rules. Among other questions, the guide explains what exactly constitutes a GPAI model or a “GPAI model with systemic risk”, what obligations developers need to meet, and when they might benefit from the AI Act’s exemptions for open-source AI. It builds on and synthesizes the recently adopted Code of Practice for GPAI developers as well as the European Commission’s newly published GPAI guidelines. The guide also includes an interactive flowchart meant to help developers on their AI Act “user journey”. This builds on Mozilla’s long-standing work advocating for better conditions for open-source AI development, including our advocacy to ensure that open-source developers receive proportionate treatment under the AI Act.

In addition, in late July, the European Commission published a template for the “sufficiently detailed summary” that GPAI developers are now mandated to publish about the data used to train their AI models. While the template falls short of expectations in many respects, it does in part mirror recommendations made by Mozilla, building on our work in partnership with Open Future over the past year.

With additional rules taking effect over the course of the coming years and the European Commission building up its capacity to enforce them, work on the AI Act is not over — it is entering a new phase. Amid discussions of regulatory simplification, a potential revision of the AI Act in the context of the EU’s omnibus, and calls to “stop the clock” on enforcing the AI Act, Mozilla will continue its work to help make the AI Act’s implementation a success. This is grounded in our conviction that good regulation and innovation aren’t inherently contradictory, but rather complementary when it comes to steering innovation in a direction that is beneficial to all.