In his recent blog post, “Licenses are Not Proxies for Openness in AI Models,” Mike Weinberg argues that the openness of AI models cannot be reduced to their licensing alone.
Weinberg references his response to the consultation by the US National Telecommunications and Information Administration on “open foundational models”:
[the response] focused on a comparatively narrow issue: whether or not it makes sense to use licenses as an easy way to test for openness in the context of AI models. I argued that it does not, at least not right now.
Weinberg suggests that licensing can function as a proxy for openness when the thing being licensed is discrete, and when the licenses used are mature – that is, broadly recognized as meeting a standard of openness. Neither condition is currently met in the space of AI development.
In the European Union, the newly adopted AI Act does exactly what Weinberg warns against: it defines open-source AI through the proxy of licensing, as “software and data, including models, released under a free and open-source licence.” As a result, it fails to consider other factors crucial to securing meaningful openness in the space of AI — such as standards for data transparency.
In our work, we also advocate for a more nuanced understanding of sharing as more than just a matter of open licensing. We have been referring to the concept of the commons as one that allows sharing to be defined in terms broader than licensing alone.
Currently, there is little agreement among advocates of open licensing, practitioners, and collections’ stewards on this issue. Weinberg writes that
any definition of open should require a more complex analysis than simply looking at a license.
Weinberg frames his arguments in terms of the complexity and maturity of a given field of open, and argues that in the more complex or less mature spaces, openness is more than just licensing. Our research shows that the need for more robust definitions of openness emerges across all the fields of open.
According to Weinberg, more complex forms of openness are more commonly seen in hardware than in software. For him, AI models are “much more like open hardware than open software.” The response is worth reading in full for its argument on lessons learned from establishing a standard of openness for hardware — Mike Weinberg is both a leading legal expert on and an advocate of open hardware.